Hi everyone. I have a plot of some data points with error bars on the y axis. A piece of software I'm using gives me the best-fit gradient and a "standard error", but it doesn't take the size of the error bars into account. I assume it just looks at how well the line fits the data points to compute the standard error on the gradient. But shouldn't the size of the error bars on the data points also be taken into consideration? Thank you.
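To illustrate what I mean, here is a minimal sketch (with made-up data and uncertainties, since I can't share the real ones) comparing an unweighted fit, whose standard error comes only from the scatter of the points, with a weighted fit that passes the error bars in via `sigma` and `absolute_sigma=True` in `scipy.optimize.curve_fit`:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical data: x, y, and per-point y uncertainties (the error bars)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
y_err = np.array([0.5, 0.2, 0.8, 0.3, 0.6])

def line(x, m, c):
    return m * x + c

# Unweighted fit: the standard error on the gradient is estimated purely
# from how well the points scatter about the best-fit line; the error
# bars are ignored entirely.
p_unw, cov_unw = curve_fit(line, x, y)
se_unw = np.sqrt(cov_unw[0, 0])

# Weighted fit: sigma=y_err weights each point by its uncertainty, and
# absolute_sigma=True makes the returned covariance (and hence the
# standard error on the gradient) reflect the absolute size of the bars.
p_w, cov_w = curve_fit(line, x, y, sigma=y_err, absolute_sigma=True)
se_w = np.sqrt(cov_w[0, 0])

print(f"unweighted gradient: {p_unw[0]:.3f} +/- {se_unw:.3f}")
print(f"weighted gradient:   {p_w[0]:.3f} +/- {se_w:.3f}")
```

The two standard errors generally differ, which is exactly what's confusing me: which one should I quote for the gradient?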