- #1
Hypochondriac
- 35
- 0
ok so perhaps not a calculation query as such, but a coursework question nonetheless.
In my investigation I'm using a Newton meter to fire projectiles, and I need to know the initial velocity. To work it out I need the time the projectile takes to travel up the runway, as it were.
Now, I have the Newton meter in front of me, and it isn't as simple as timing it, because it is VERY fast.
So perhaps I should assume the time (it's relatively constant at different parts of the Newton meter, because as you increase the distance the projectile travels, you also increase the average force on it).
So what should I assume it to be? 0.1 s? 0.01 s? Although I can't tell the difference between 0.1 s and 0.01 s by eye, it changes my initial velocity value tenfold.
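To see why the choice of time matters so much, here is a minimal sketch of the sensitivity. The force, mass, and runway length below are hypothetical placeholder numbers, not values from the actual experiment: with v = (F/m)·t starting from rest, the computed velocity scales directly with the assumed time. The last lines also show the work-energy route, v = √(2·F_avg·d/m), which sidesteps the time measurement entirely by using the runway distance instead.

```python
import math

# Hypothetical placeholder values -- substitute the real measurements:
F_avg = 5.0   # N, average force read off the Newton meter (assumed)
m = 0.05      # kg, projectile mass (assumed)
d = 0.2       # m, distance travelled along the runway (assumed)

# v = a * t with a = F/m: the result scales linearly with the assumed time,
# so a factor-of-10 guess in t gives a factor-of-10 change in v.
for t in (0.1, 0.01):
    a = F_avg / m          # acceleration from F = ma
    v = a * t              # launch velocity, starting from rest
    print(f"t = {t:5.2f} s  ->  v = {v:5.1f} m/s")

# Work-energy alternative: F_avg * d = (1/2) * m * v^2, no timing needed.
v_energy = math.sqrt(2 * F_avg * d / m)
print(f"work-energy estimate: v = {v_energy:.2f} m/s")
```

The energy version only needs quantities that are easy to measure at rest (average force and distance), which is why it is usually preferred over guessing a contact time.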