1. The problem statement, all variables and given/known data

Let B be an ordered orthonormal basis for a k-dimensional subspace V of ℝ^n. Prove that for all v1, v2 ∈ V, v1·v2 = [v1]_B · [v2]_B, where the first dot product takes place in ℝ^n and the second takes place in ℝ^k.

2. Relevant equations

3. The attempt at a solution

Let B = (b1, ..., bk) and express v1 and v2 as linear combinations of the vectors in B:

v1 = a1b1 + a2b2 + ... + akbk
v2 = c1b1 + c2b2 + ... + ckbk

(The expansions should be in the basis vectors b1, ..., bk, and the second set of coefficients is renamed c1, ..., ck so it doesn't clash with the basis vectors; then [v1]_B = (a1, ..., ak) and [v2]_B = (c1, ..., ck).)

I am confused as to where to go from here.
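One way to continue (a sketch, assuming the expansions and coefficient names above): substitute both expansions into v1·v2, expand using bilinearity of the dot product, and then apply orthonormality of B, i.e. bi·bj = 1 when i = j and 0 otherwise. In LaTeX:

\begin{align*}
v_1 \cdot v_2
  &= \Big(\sum_{i=1}^{k} a_i b_i\Big) \cdot \Big(\sum_{j=1}^{k} c_j b_j\Big)
     && \text{expansions in the basis } B \\
  &= \sum_{i=1}^{k} \sum_{j=1}^{k} a_i c_j \,(b_i \cdot b_j)
     && \text{bilinearity of the dot product} \\
  &= \sum_{i=1}^{k} a_i c_i
     && b_i \cdot b_j = \delta_{ij} \text{ (orthonormality)} \\
  &= (a_1, \dots, a_k) \cdot (c_1, \dots, c_k)
     = [v_1]_B \cdot [v_2]_B.
\end{align*}

The double sum collapses to a single sum precisely because B is orthonormal; only the i = j terms survive, and the remaining sum is the ℝ^k dot product of the coordinate vectors.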