I'm writing a physics practical report on the refractive index of glass. We performed an experiment in which we measured the angle of incidence and the angle of refraction of light entering a glass block.

We plot sin(ϴa) against sin(ϴg) with the y-intercept fixed at 0, to satisfy Snell's law:

na·sin(ϴa) = ng·sin(ϴg), so sin(ϴa) = (ng/na)·sin(ϴg),

where ng is the refractive index of glass and na is the refractive index of air. This has the same form as y = mx, so (taking na ≈ 1) the line's gradient is equal to the refractive index of glass. We also have to calculate the error, using the minimum and maximum gradient method.

However, my data would be better fit by a trend line whose y-intercept is not 0, and when I use that trend line the gradient is 1.1, which obviously can't be the refractive index of glass. My problem is that when I add maximum and minimum gradient trend lines to find the error, both of their gradients are less than the gradient I already have. Is there any other way to find the uncertainty in a graph?
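For context, here is a minimal sketch (with made-up sin values, not my real data) of one alternative to the min/max gradient method: a least-squares fit of y = mx through the origin, with the standard error of the gradient computed from the scatter of the residuals.

```python
import numpy as np

def fit_through_origin(x, y):
    """Least-squares fit of y = m*x (intercept forced to 0).

    Returns the gradient m and its standard error, estimated
    from the residual scatter about the fitted line.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # Closed-form slope for a line through the origin
    m = np.sum(x * y) / np.sum(x * x)
    # Residuals about the fitted line
    r = y - m * x
    # Standard error of the slope (n - 1 degrees of freedom,
    # since only one parameter is fitted)
    sigma_m = np.sqrt(np.sum(r**2) / ((len(x) - 1) * np.sum(x * x)))
    return m, sigma_m

# Hypothetical data: sin(angle in glass) vs sin(angle in air),
# constructed to lie exactly on a line of gradient 1.5
sin_glass = np.array([0.1, 0.2, 0.3, 0.4, 0.5])
sin_air = 1.5 * sin_glass

m, sigma_m = fit_through_origin(sin_glass, sin_air)
print(f"n_glass = {m:.3f} +/- {sigma_m:.3f}")
```

With real, noisy measurements the residuals are non-zero and sigma_m gives an uncertainty on the gradient directly, without drawing extreme trend lines by hand.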