# If V is a subspace of Rn, prove that the orthogonal complement of V is also a subspace

This is Exercise 3.3 from *Advanced Calculus of Several Variables* by C.H. Edwards Jr.:

If $V$ is a subspace of $\Re^{n}$, prove that $V^{\bot}$ is also a subspace.

As usual, this is not homework. I am just a struggling hobbyist trying to better myself on my own time.

The only progress I've been able to make towards a formal proof is to unpack the definition of a subspace: a set of objects closed under two operations, $o_{1} + o_{2}$ and $a \cdot o$, where $o$, $o_{1}$, and $o_{2}$ are any objects in the set and $a$ is a real number.

So what I want to show is that for any two objects $o_{1}, o_{2} \in V^{\bot}$, their sum $o_{1} + o_{2} \in V^{\bot}$, and that $a \cdot o \in V^{\bot}$ for any $o \in V^{\bot}$ and any real $a$.

But, I think this is the sticking point: what does it take to formally establish that some vector $o$ is in $V^{\bot}$?

Dick
Homework Helper


Show that the dot product of o with any vector in V is zero.
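As a quick illustration of this hint, here is a numeric sanity check (not a proof), using a made-up example in $\Re^{3}$ with the usual dot product: take $V = \mathrm{span}\{(1,0,0), (0,1,0)\}$, so $V^{\bot}$ is the $z$-axis, and check that sums and scalar multiples of vectors orthogonal to $V$ stay orthogonal to $V$.

```python
# Numeric sanity check (not a proof): V = span{(1,0,0), (0,1,0)} in R^3,
# so V-perp is the z-axis.  Vectors in V-perp should stay in V-perp under
# addition and scalar multiplication.

def dot(x, y):
    """Usual inner (dot) product on R^n."""
    return sum(a * b for a, b in zip(x, y))

V_basis = [(1, 0, 0), (0, 1, 0)]   # spanning set for V

def in_V_perp(x):
    # x is orthogonal to V iff it is orthogonal to every spanning vector of V
    return all(dot(x, v) == 0 for v in V_basis)

u = (0, 0, 2)                              # in V-perp
w = (0, 0, -5)                             # in V-perp
u_plus_w = tuple(a + b for a, b in zip(u, w))
ku = tuple(3 * a for a in u)

assert in_V_perp(u) and in_V_perp(w)
assert in_V_perp(u_plus_w)   # closed under addition
assert in_V_perp(ku)         # closed under scalar multiplication
print("closure checks passed")
```

Of course a check on one example proves nothing; the general argument has to come from the properties of the inner product itself.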

At this point in the book, he's barely touched on the dot product, which he calls the "usual inner product". He's only been talking about inner products at a higher level of generality: any binary operation with the three properties of positivity, symmetry (commutativity), and linearity. If I proved anything using the dot product specifically, it would reduce the generality that he's clearly taking pains to establish, and I'm worried that any such proof would get left behind in the subsequent chapters.

Dick
Homework Helper


What definition were you given for the orthogonal complement? It would depend on which inner product you are using, right?


Given a subspace $V$ of $\Re^{n}$, denote by $V^{\bot}$ the set of all vectors in $\Re^{n}$ that are orthogonal to every vector in $V$.

Orthogonality is defined like this:

A set of nonzero vectors $v_{1}, v_{2}, \ldots$ in $V$ is said to be an orthogonal set if $<v_{i}, v_{j}> = 0$ whenever $i \neq j$.

But your question has led me further down the path by leading me back to the definition of orthogonality. I have to prove that $<v, o> = 0$ for every $v \in V$. It's essentially what you said, except that instead of using the dot product, I should prove it using only the properties of the generalized inner product $< , >$. I'll work towards that goal now. Thank you for the hint :).

HallsofIvy
Homework Helper

In order to show that a set of vectors is a subspace, you need only prove that the sum of two vectors in the set is also in the set and that a scalar times a vector in the set is also in the set.

If $u$ and $v$ are in the orthogonal complement of $V$, then $<u, a> = 0$ and $<v, a> = 0$ for every vector $a$ in $V$. So what is true of $<u + v, a>$? What is true of $<ku, a>$ for $k$ any scalar?

Thanks Ivy. I solved this one quickly once I referred back to the definition of orthogonality. It follows directly from property 3 of inner products (linearity): $<ax + by, z> = a<x, z> + b<y, z>$.

Thus, using your variables, $<u + v, a> = <u, a> + <v, a> = 0 + 0 = 0$, proving that $u + v$ is orthogonal to $a$.

Also by linearity, $<ku, a> = k<u, a> = k \cdot 0 = 0$.
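Putting the two calculations together, one way the whole argument might be written up (a sketch using only the inner-product axioms) is:

```latex
\textbf{Claim.} If $V$ is a subspace of $\Re^{n}$, then
$V^{\bot} = \{\, o \in \Re^{n} : \langle o, v \rangle = 0
\text{ for all } v \in V \,\}$ is a subspace.

\textbf{Proof sketch.} Let $u, w \in V^{\bot}$, let $k \in \Re$, and let
$v \in V$ be arbitrary. By linearity of the inner product,
\[
  \langle u + w, v \rangle = \langle u, v \rangle + \langle w, v \rangle
                           = 0 + 0 = 0,
\qquad
  \langle k u, v \rangle = k \langle u, v \rangle = k \cdot 0 = 0.
\]
Since $v \in V$ was arbitrary, $u + w \in V^{\bot}$ and
$k u \in V^{\bot}$, so $V^{\bot}$ is closed under both operations.
$\blacksquare$
```

Nothing here uses the dot product specifically, so the argument survives at the level of generality the book is working at.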
