Hello, I'm reading Lang's *Introduction to Linear Algebra* and I've noticed that he proves some theorems by squaring an equation,

working out the algebra, and then taking the square root.

I'm trying to get used to writing proofs for analysis, and

I'd like to know whether squaring is considered an adequate method of proof.

A quick example:

||xA|| = |x| ||A||

1/ ||xA||² = {√[(xA) · (xA)]}² = (xA) · (xA)

(Using the definition ||A|| = √(A · A) = √(a_1² + a_2² + ... + a_n²).)

2/ (xA) · (xA) = (xa_1, xa_2, ..., xa_n) · (xa_1, xa_2, ..., xa_n)

(Using the definition of vector A : (a_1, a_2, ..., a_n) )

3/ x²a_1² + x²a_2² + ... + x²a_n² = x²(a_1² + a_2² + ... + a_n²)

(Using the Dot Product property for components)

4/ x²(A · A) = |x|² ||A||²

(Rewriting the squared components in 3/ as A · A,

writing x² as |x|² to account for ± values of x, and A · A as ||A||²,

since that is just the definition of ||A|| squared.)

5/ Take the square root of both sides, and voilà!
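As a quick numerical sanity check of the identity above (not a proof, of course — just a way to catch algebra slips), here is a sketch using NumPy with an arbitrarily chosen vector and scalar:

```python
# Numeric spot-check of ||xA|| = |x| ||A|| for one example vector/scalar.
import numpy as np

A = np.array([3.0, -4.0, 12.0])  # arbitrary example vector
x = -2.5                         # arbitrary (negative, to exercise the |x|)

lhs = np.linalg.norm(x * A)       # ||xA||
rhs = abs(x) * np.linalg.norm(A)  # |x| ||A||
print(np.isclose(lhs, rhs))       # True
```

Here ||A|| = √(9 + 16 + 144) = 13, so both sides come out to 2.5 · 13 = 32.5.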

Here he set out to prove the theorem by squaring one side,

working out the algebra, and arriving at (the square of) the other side, i.e. proving the equality.

I'm just concerned as to whether this would constitute a rigorous proof

in an analysis book, as the proofs in the ones I've read

(albeit when I was lacking the mathematical maturity I have now, which is still in its infancy!)

would seemingly come out of nowhere :p

Another question concerns the following orthogonality criterion:

||A + B|| = ||A - B|| iff A · B = 0

1/ ||A + B||² = ||A - B||² <==> {√[(A + B) · (A + B)]}² = {√[(A - B) · (A - B)]}²

2/ (A + B) · (A + B) = (A - B) · (A - B) <==> A · A + 2(A · B) + B · B = A · A - 2(A · B) + B · B

3/ 2(A · B) = -2(A · B)

4/ 4(A · B) = 0

5/ A · B = 0

If A · B = 0 then the above holds, but if A · B ≠ 0,

how are you supposed to show that ||A + B|| ≠ ||A - B||?

I'm thinking you're supposed to find a contradiction:

you assume ||A + B|| = ||A - B|| and carry out the proof as I did above,

then, when you reach the end at step 5/,

you find A · B = 0, but you know that A and B are not orthogonal,

so the assumption cannot be true.

Is that considered a proof or just a small exercise?
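To see the two directions of the "iff" concretely, here is a small numeric sketch (with hypothetical example vectors I picked so that one pair is orthogonal and one is not):

```python
# Check ||A + B|| = ||A - B|| against A · B = 0 for two example pairs.
import numpy as np

def norms_equal(A, B):
    """True when ||A + B|| and ||A - B|| agree (up to float tolerance)."""
    return bool(np.isclose(np.linalg.norm(A + B), np.linalg.norm(A - B)))

A      = np.array([1.0, 2.0, 0.0])
B_orth = np.array([-2.0, 1.0, 5.0])  # A · B_orth = -2 + 2 + 0 = 0
B_not  = np.array([1.0, 1.0, 1.0])   # A · B_not  =  1 + 2 + 0 = 3

print(norms_equal(A, B_orth))  # True  (orthogonal pair)
print(norms_equal(A, B_not))   # False (non-orthogonal pair)
```

For the non-orthogonal pair: ||A + B||² = 14 while ||A - B||² = 2, so the norms really do differ when A · B ≠ 0, consistent with every step 1/–5/ being reversible.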

Note: I'm supposed to be proving all of this using just the following four properties of the dot product,

as this is a way to achieve generality:

1/ A · B = B · A

2/ A · (B + C) = A · B + A · C

3/ (xA) · B = x(A · B)

4/ A · A > 0 iff A ≠ 0
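For what it's worth, the expansion used in step 2/ of the orthogonality argument follows from properties 1/ and 2/ alone; a sketch of the derivation in LaTeX:

```latex
\begin{align*}
\|A+B\|^2 &= (A+B)\cdot(A+B) \\
          &= (A+B)\cdot A + (A+B)\cdot B
             && \text{by distributivity (2/), via commutativity (1/)} \\
          &= A\cdot A + B\cdot A + A\cdot B + B\cdot B
             && \text{by (1/) and (2/) again} \\
          &= A\cdot A + 2(A\cdot B) + B\cdot B
             && \text{by (1/)}
\end{align*}
```

The expansion of ||A - B||² is identical with B replaced by -B, which flips the sign of the cross term via property 3/.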

So, basically, to sum up, I'm asking about:

a) Squaring an equation as a method of proof — is it considered adequate, especially in analysis courses, or is it just a trivial technique used in college algebra at most (and intro linear algebra!)?

b) In the orthogonality part above: what happens when the vectors A and B are *not* orthogonal but you're given the equation ||A + B|| = ||A - B|| and asked to test whether the equality holds? Is the contradiction achieved solely by arriving at A · B = 0, which, together with our knowledge that A · B ≠ 0, tells us that the equation can only be true when A and B are orthogonal?

**Physics Forums - The Fusion of Science and Community**


# Linear Algebra Proof Question?
