I couldn't find a Linear Algebra section in the Homework forum. I think you can do this without calculus (I hope, but it's been a long day, so you never know).

1. The problem statement

I am asked to prove that the shortest distance from a point P0(x0, y0) to the line ax + by + c = 0 is

D(P0, L) = |a*x0 + b*y0 + c| / sqrt(a^2 + b^2)

2. Relevant equations

D(P0, L); the standard inner product; projection of a vector onto a subspace; orthogonal decomposition of a vector.

3. The attempt at a solution

Seeing as it's a linear algebra class, I figured I should probably use linear algebra methods. (Work attached, summary below.) However, I'm still not sure how to deal with the constant term in the line equation.

I tried neglecting the c and taking the subspace W: ax + by = 0, i.e. W = span{(-b, a)}. I then took the vector x = (x0, y0) and formed

y = x - proj_W x,

so that ||y|| is my D(P0, L). However, I ran into an issue solving for ||y||. I can separate out ||y - x||, which gives rise to an inequality, but then I'm left with a nasty ||x|| ||w|| term inside the absolute value when I try to put all the terms over sqrt(a^2 + b^2). I'd like that term to equal c, but that's wishful thinking.

Any help is greatly appreciated.
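For what it's worth, here is a quick numerical sanity check I ran (not a proof, and the line coefficients and point below are just arbitrary values I made up): the formula I'm asked to prove agrees with the distance you get by projecting onto the line's direction vector (-b, a).

import numpy as np

# Arbitrary made-up values: line 3x - 4y + 7 = 0 and point P0 = (2, 5).
a, b, c = 3.0, -4.0, 7.0
x0, y0 = 2.0, 5.0
P0 = np.array([x0, y0])

# The formula I'm trying to prove.
formula_dist = abs(a * x0 + b * y0 + c) / np.hypot(a, b)

# Independent check: pick any point Q on the line, project P0 - Q onto the
# line's direction vector (-b, a), and measure the distance to that closest point.
Q = np.array([-c / a, 0.0]) if a != 0 else np.array([0.0, -c / b])
d = np.array([-b, a])
t = np.dot(P0 - Q, d) / np.dot(d, d)
closest = Q + t * d
proj_dist = np.linalg.norm(P0 - closest)

print(formula_dist, proj_dist)  # both come out to about 1.4 for these values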