1. The problem statement, all variables and given/known data

How can I calculate the relative error of the laser beam waist and the angle of divergence from the following experimental data? I measured the increase in the laser beam diameter as the distance from the laser increases. With this data I plotted a graph of beam diameter² against distance² from the laser.

2. Relevant equations

half angle of divergence = 2λ/(π·s₀)

s² = s₀² + Φ²z²

s₀: beam waist
z: distance of beam from source
Φ: angle of divergence

3. The attempt at a solution

I fitted a LINEST trendline in Excel to the graph of beam diameter² vs. distance², which gives y = 4.7E-7x + 3.4E-1. I also got absolute uncertainties in both of these values: 5.5E-9 for the slope (4.7E-7) and 6.24E-3 for the intercept (3.4E-1).

I know that from the trendline the intercept 3.4E-1 = (beam waist)². Taking √(3.4E-1) gives the beam waist = 5.8E-1 mm. Converting this beam waist to the 1/e² definition gives beam waist = 8.2E-1 mm.

Using the equation half angle of divergence = 2λ/(π·beam waist) = 4.8E-4 rad, and multiplying by 2, the full angle of divergence = 9.6E-4 rad.

The beam waist = 8.2E-1 mm and angle of divergence = 9.6E-4 rad are consistent with the manufacturer's specs, but I'm not sure how to calculate the relative uncertainty of each. Any guidance would be much appreciated.
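For context, here is how the standard propagation-of-uncertainty rules (the relative error of x² is twice that of x, so the relative error of √c is half that of c; multiplying by a constant leaves relative error unchanged) would apply to the fit numbers above, sketched in Python. The wavelength λ = 632.8 nm (HeNe) is an assumption on my part, since it isn't stated in the post:

```python
import math

# Fit results from LINEST: y = m*x + c, with y = diameter^2, x = distance^2
m, dm = 4.7e-7, 5.5e-9    # slope and its absolute uncertainty
c, dc = 3.4e-1, 6.24e-3   # intercept (mm^2) and its absolute uncertainty

lam = 632.8e-6            # ASSUMED HeNe wavelength, in mm (not given in the post)

# Beam waist from the intercept: s0 = sqrt(c).
# Since s0^2 = c, the relative error halves: ds0/s0 = (1/2)*(dc/c).
s0 = math.sqrt(c)          # ~0.58 mm
rel_s0 = 0.5 * dc / c      # ~0.92 %

# The 1/e^2 waist is a fixed factor sqrt(2) larger (as in the post),
# so its relative error is the same as that of s0.
w0 = s0 * math.sqrt(2)     # ~0.82 mm

# Half angle from theta = 2*lam/(pi*w0): theta is proportional to 1/w0,
# so its relative error also equals that of the waist.
theta_half = 2 * lam / (math.pi * w0)   # ~4.9e-4 rad
rel_theta = rel_s0

print(f"1/e^2 waist: {w0:.2e} mm, relative uncertainty {rel_s0:.2%}")
print(f"half divergence: {theta_half:.2e} rad, relative uncertainty {rel_theta:.2%}")
```

This reproduces the post's values (0.82 mm waist, 4.9E-4 rad half angle) and attaches a relative uncertainty of roughly 0.9% to each, driven entirely by the intercept uncertainty from LINEST.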