How can the error on the gravitational acceleration (g) be determined from a set of measurements of time (t) and distance (d)? The distances are measured exactly, and the times with an accuracy of 0.01 sec.

I applied least squares to t = √(2/g) * √d and obtained (σ of √g), but I was unable to deduce (σ of g) from it. I also tried error propagation, namely (σ of t)^2 = [(∂t/∂g) * (σ of g)]^2, which gives (σ of t)^2 = (d/(2g^3)) * (σ of g)^2, yet what value should d take here? Please advise!
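For concreteness, here is a minimal Python sketch of the approach I have in mind (the arrays d_data and t_data are hypothetical placeholders standing in for my actual measurements). The fit treats a = √(2/g) as the parameter of the linear model t = a * √d; the last two lines are the propagation step I am unsure about:

    import numpy as np

    # Hypothetical measurements -- replace with the real data.
    d_data = np.array([0.5, 1.0, 1.5, 2.0, 2.5])       # distances in m (assumed exact)
    t_data = np.array([0.32, 0.45, 0.55, 0.64, 0.71])  # times in s
    sigma_t = 0.01                                      # timing accuracy in s

    # Model: t = a * sqrt(d), where a = sqrt(2/g).
    x = np.sqrt(d_data)

    # Least squares for a one-parameter linear model through the origin
    # (all t_i share the same sigma_t, so the weights are uniform):
    #   a = sum(x*t) / sum(x^2),  sigma_a = sigma_t / sqrt(sum(x^2))
    a = np.sum(x * t_data) / np.sum(x**2)
    sigma_a = sigma_t / np.sqrt(np.sum(x**2))

    # Back out g and propagate the uncertainty:
    #   g = 2 / a^2,  so  sigma_g = |dg/da| * sigma_a = (4 / a^3) * sigma_a
    g = 2.0 / a**2
    sigma_g = (4.0 / a**3) * sigma_a

    print(f"g = {g:.3f} +/- {sigma_g:.3f} m/s^2")

Note that propagating through the fit parameter this way sidesteps the question of which d to plug in, since the fit already combines all the data points; whether this is the correct way to go from (σ of √g) to (σ of g) is exactly what I am asking.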