I wrote a program that is supposed to find the standard deviation of a set, but something really strange happens in a for loop that I have in my main function.
int main()
{
    int i=1;
    int NUMBER_OF_ELEMENTS;
    double data[NUMBER_OF_ELEMENTS];
    cout << "Please enter the number of elements followed by each individual data element:" << endl;
    cin >> NUMBER_OF_ELEMENTS;
    for(i=1;i<=NUMBER_OF_ELEMENTS;i++)
    {
        cout << "Data element " << i << ": ";
        cin >> data[i-1];
        cout << endl;
    }
    cout << "The standard deviation is: " << std_dev(data, NUMBER_OF_ELEMENTS);
    return EXIT_SUCCESS;
}
No matter what value I enter for NUMBER_OF_ELEMENTS, the loop counter i goes to 2, and then on the next pass through the loop it jumps to some wacky number (e.g. 1432104).
Can anyone pick out what would cause something strange like this?
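The likely culprit: `double data[NUMBER_OF_ELEMENTS];` is declared while `NUMBER_OF_ELEMENTS` is still uninitialized, so the array gets a garbage size and each `cin >> data[i-1]` can write out of bounds, clobbering the loop counter `i`. A minimal sketch of the fix is below: read the count first, then size the storage (here with `std::vector`, since sizing a plain array with a runtime value is not standard C++). The `std_dev` function shown is a hypothetical stand-in for the one the original program calls, assumed to be a population standard deviation.

```cpp
#include <cmath>
#include <iostream>
#include <sstream>
#include <vector>
using namespace std;

// Read the element count FIRST, and only then size the storage.
// In the original, data[] was sized by an uninitialized variable, so the
// input loop wrote past the end of the array and corrupted i.
vector<double> read_data(istream& in)
{
    int numberOfElements = 0;
    in >> numberOfElements;               // count known before allocation
    vector<double> data(numberOfElements);
    for (int i = 0; i < numberOfElements; i++)
        in >> data[i];                    // now an in-bounds write
    return data;
}

// Hypothetical stand-in for the std_dev() the asker defines elsewhere:
// population standard deviation sqrt(sum((x - mean)^2) / n).
double std_dev(const vector<double>& data)
{
    double mean = 0.0;
    for (double x : data) mean += x;
    mean /= data.size();
    double sq = 0.0;
    for (double x : data) sq += (x - mean) * (x - mean);
    return sqrt(sq / data.size());
}
```

With this ordering the loop counter can no longer be overwritten, because every write lands inside storage that was sized after the count was read; an `istringstream` can stand in for `cin` when exercising `read_data`.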