A hypothesis about the speed of a falling body might be based on the observation that the velocity of a falling object seems to increase the farther it has fallen. Model the hypothesis "The speed of a falling object is proportional to the distance it has fallen" as a differential equation initial value problem. By analyzing the predictions of your model, explain why this "law of gravity" could not be correct.
The Attempt at a Solution
So, obviously, v(t)=v0+k*d(t), where k is the proportionality constant.
d(t) = ((v0 + v(t))/2)*t
But plugging d(t) into the first equation and solving for v yields v = v0*(2 + k*t)/(2 - k*t), which is always 0 if v0 = 0, and the denominator forces a restriction since it vanishes at t = 2/k. Also, that's not a differential equation and I'm not sure how to get one. I tried setting dv/dt = a and solving that way, but it yields a nonsensical equation for v(t).
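For what it's worth, the algebra in the attempt can be checked symbolically. This is just a sketch of that substitution (v = v0 + k*d with the constant-acceleration average-velocity formula d = (v0 + v)/2 * t), not a claim that the setup itself is right; note the average-velocity formula assumes constant acceleration, which this hypothesis doesn't guarantee:

```python
from sympy import symbols, solve, simplify

t, k, v0, v = symbols('t k v0 v', positive=True)

# Attempted model: v = v0 + k*d, with d substituted from the
# constant-acceleration formula d = (v0 + v)/2 * t.
eq = v - (v0 + k * ((v0 + v) / 2) * t)

# Solve the (linear-in-v) equation for v.
sol = solve(eq, v)[0]

# The solution is equivalent to v0*(k*t + 2)/(2 - k*t):
# identically 0 when v0 = 0, and singular at t = 2/k.
print(simplify(sol))
```

This confirms the closed-form result above, but it also shows why the approach stalls: eliminating d this way produces an algebraic relation, not a differential equation.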
So my question is, how do I set up the model?