# Cauchy Schwarz equality implies parallel

In summary, in an inner product space over complex numbers, if the inner product of two vectors is equal to the product of their norms, then one vector is a scalar multiple of the other. This can be proven by considering the orthogonal projection of one vector onto the subspace spanned by the other, and showing that the difference between the two vectors is zero. This can also be shown using the Cauchy-Schwarz inequality and the properties of the inner product.
Bipolarity
I'm learning about Support Vector Machines and would like to recap on some basic linear algebra. More specifically, I'm trying to prove the following, which I'm pretty sure is true:
Let ##v_1## and ##v_2## be two vectors in an inner product space over ##\mathbb{C}##.
Suppose that ##\langle v_1 , v_2 \rangle = \|v_1\| \cdot \|v_2\|##, i.e. the special case of Cauchy-Schwarz when it is an equality. Then prove that ##v_1## is a scalar multiple of ##v_2##, assuming neither vector is ##0##.

I've tried using the triangle inequality and some other random stuff to no avail. I believe there's some algebra trick involved, could someone help me out? I really want to prove this and get on with my machine learning.

Thanks!

BiP

How is ##\langle v_1, v_2 \rangle## defined?

Proving this should not require the definition of the inner product, only the properties.

What's the difference? Which properties do you mean?

Conjugate symmetry, linearity in the first argument, and positive-definiteness.

Looks to me like another version of the cosine formula, applied to ##v_1+v_2##.

By definition, $\langle v_1, v_2 \rangle = \| v_1 \| \cdot \| v_2 \| \cdot \cos(\theta)$ where $\theta$ is the angle between vectors $v_1$ and $v_2$. If you also additionally know that $\langle v_1, v_2 \rangle = \| v_1 \| \cdot \| v_2 \|$, then the angle between the two vectors must either be 0 or 180 degrees. So they are parallel; hence one is a scalar multiple of the other.

That's the definition? It would be true in a real inner product space, but this one is over ℂ.

rs1n
zinq said:
That's the definition? It would be true in a real inner product space, but this one is over ℂ.

You are absolutely right! My eyes failed me, somehow.

Bipolarity said:
Suppose that ##\langle v_1 , v_2 \rangle = \|v_1\| \cdot \|v_2\|##, i.e. the special case of Cauchy Schwarz when it is an equality. Then prove that ##v_1## is a scalar multiple of ##v_2##, assuming neither vector is ##0##.

One way to do it is to consider the vector ##u = v_2 - \frac{\langle v_1, v_2 \rangle}{\langle v_1, v_1 \rangle} v_1##

Look at ##\langle u, u \rangle## and show that it's zero when you have C-S equality. This also leads to a proof of the C-S inequality.
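If it helps to see this hint numerically, here is a quick NumPy sketch. The helper `inner` is my own, chosen to match the thread's convention (linear in the first argument, conjugate-linear in the second), and the equality case is produced by taking ##v_2## to be a positive real multiple of ##v_1##:

```python
import numpy as np

rng = np.random.default_rng(0)

def inner(x, y):
    # Inner product linear in the first argument, conjugate-linear
    # in the second, matching the convention used in this thread.
    return np.sum(x * np.conj(y))

def norm(x):
    return np.sqrt(inner(x, x).real)

v1 = rng.standard_normal(4) + 1j * rng.standard_normal(4)
v2 = 2.5 * v1  # positive real multiple, so C-S equality holds

# Equality case: <v1, v2> equals ||v1|| * ||v2||
assert np.isclose(inner(v1, v2), norm(v1) * norm(v2))

# u = v2 - (<v1,v2>/<v1,v1>) v1 ; under equality, <u,u> vanishes
u = v2 - (inner(v1, v2) / inner(v1, v1)) * v1
assert abs(inner(u, u)) < 1e-9
```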

To get back to the problem, though... over the complex numbers, the inner product is presumably a Hermitian inner product. So

##\begin{align*}
\| u + v \|^2 & = \langle u + v, u+v \rangle = \langle u,u \rangle + \langle u,v \rangle + \langle v,u \rangle + \langle v, v \rangle\\
& = \langle u,u \rangle + \langle u,v \rangle + \overline{\langle u,v \rangle} + \langle v, v \rangle \\
& = \langle u,u \rangle + 2 \mathrm{Re}(\langle u,v \rangle) + \langle v, v \rangle\\
& = \| u\|^2 + 2 \mathrm{Re}(\langle u,v \rangle) + \| v\|^2
\end{align*}##

Similarly,

## 0 \le \| u + \lambda v \|^2 = \| u\|^2 + 2 \mathrm{Re}(\overline{\lambda} \langle u,v \rangle) + |\lambda|^2 \| v\|^2##

Let $$\lambda = -\frac{\langle u, v\rangle }{\|v \|^2}$$ and the right hand side (above) simplifies to $$\| u\|^2 - \frac{|\langle u, v\rangle|^2}{\|v\|^2} \ge 0,$$ which is the C.S. inequality. Equality occurs precisely when ##\| u + \lambda v \| = 0##, i.e. when ##u = -\lambda v## is a scalar multiple of ##v##.
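This choice of ##\lambda## can be checked numerically on random complex vectors; a small NumPy sketch (the `inner` and `norm2` helpers are my own, matching conjugate-linearity in the second argument):

```python
import numpy as np

rng = np.random.default_rng(1)

def inner(x, y):
    # Linear in the first argument, conjugate-linear in the second.
    return np.sum(x * np.conj(y))

def norm2(x):
    # Squared norm, real by positive-definiteness.
    return inner(x, x).real

u = rng.standard_normal(5) + 1j * rng.standard_normal(5)
v = rng.standard_normal(5) + 1j * rng.standard_normal(5)

lam = -inner(u, v) / norm2(v)
lhs = norm2(u + lam * v)
# Simplified right-hand side: ||u||^2 - |<u,v>|^2 / ||v||^2 (the C-S bound)
rhs = norm2(u) - abs(inner(u, v)) ** 2 / norm2(v)
assert np.isclose(lhs, rhs)
assert rhs >= 0  # nonnegative, which is exactly the C-S inequality
```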

There are a few possible ways of doing this. The first one is just to follow the proof of the Cauchy--Schwarz inequality. Namely, for real ##t## consider $$\|v_1 - t v_2\|^2 = \|v_1\|^2 +t^2\|v_2\|^2 - 2t (v_1, v_2) = \|v_1\|^2 +t^2\|v_2\|^2 - 2t \|v_1\|\cdot \|v_2\| = (\|v_1\|-t\|v_2\|)^2$$ (note that ##(v_1, v_2)## is real here, since by assumption it equals ##\|v_1\|\cdot\|v_2\|##, so no conjugates appear). The right hand side of this chain of equalities is ##0## when ##t=\|v_1\|/\|v_2\|##. So for this ##t## you get that ##v_1-tv_2=0##, which is exactly what you need.
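The first proof can also be checked numerically; a quick NumPy sketch (the `inner` and `norm` helpers are my own, and the equality case is produced by taking ##v_1## a positive real multiple of ##v_2##):

```python
import numpy as np

def inner(x, y):
    # Linear in the first argument, conjugate-linear in the second.
    return np.sum(x * np.conj(y))

def norm(x):
    return np.sqrt(inner(x, x).real)

v2 = np.array([1.0 + 2.0j, -0.5j, 3.0])
v1 = 1.7 * v2  # equality case: v1 is a positive multiple of v2

t = norm(v1) / norm(v2)
# The chain of equalities collapses ||v1 - t v2||^2 to
# (||v1|| - t ||v2||)^2, which is zero for this t, so v1 = t v2.
assert np.isclose(norm(v1 - t * v2) ** 2, (norm(v1) - t * norm(v2)) ** 2)
assert np.isclose(norm(v1 - t * v2), 0.0)
```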

Another way is more geometric and probably more intuitive. You define ##w## to be the orthogonal projection of ##v_2## onto the one dimensional subspace spanned by ##v_1##, ##w= \|v_1\|^{-2} (v_2, v_1) v_1##. Then ##(v_1, v_2)= (v_1, w)## (checked by direct calculation) and ##v_2-w## is orthogonal to ##v_1## (and so to ##w##).
Therefore ##\|v_2\|^2 =\|w\|^2+\|v_2-w\|^2##.

By Cauchy--Schwarz ## (v_1, w) \le \|v_1\|\cdot \|w\|##, but on the other hand ##(v_1, w) = (v_1, v_2) = \|v_1\|\cdot \|v_2\|##, so ##\|v_1\|\cdot \|v_2\| \le \|v_1\|\cdot \|w\|## and therefore ##\|v_2\|\le \|w\|##. Comparing this with ##\|v_2\|^2 =\|w\|^2+\|v_2-w\|^2## we conclude that ##v_2-w=0##.

The second proof is a bit longer, but it is more intuitive, in the sense that it is a pretty standard reasoning used when one works with orthogonal projections.
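The key facts in the projection argument (orthogonality of ##v_2-w## to ##v_1##, and the resulting Pythagorean identity) hold for any pair of vectors and are easy to verify numerically; a NumPy sketch with my own `inner` and `norm2` helpers:

```python
import numpy as np

rng = np.random.default_rng(2)

def inner(x, y):
    # Linear in the first argument, conjugate-linear in the second.
    return np.sum(x * np.conj(y))

def norm2(x):
    return inner(x, x).real

v1 = rng.standard_normal(4) + 1j * rng.standard_normal(4)
v2 = rng.standard_normal(4) + 1j * rng.standard_normal(4)

# Orthogonal projection of v2 onto span{v1}, as in the post:
# w = ||v1||^{-2} (v2, v1) v1
w = (inner(v2, v1) / norm2(v1)) * v1

# v2 - w is orthogonal to v1 (and hence to w) ...
assert abs(inner(v2 - w, v1)) < 1e-9
# ... which gives the Pythagorean identity ||v2||^2 = ||w||^2 + ||v2-w||^2
assert np.isclose(norm2(v2), norm2(w) + norm2(v2 - w))
```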

Hawkeye18 said:
Another way is more geometric and probably more intuitive. You define ##w## to be the orthogonal projection of ##v_2## onto the one dimensional subspace spanned by ##v_1##, ##w= \|v_1\|^{-2} (v_2, v_1) v_1##.

The second method is what I suggested in post #10. And, in fact, you can prove Cauchy-Schwarz more intuitively this way.

I see! Thank you all for your replies! I knew I had seen it somewhere, little did I know it was right there in the proof of the C-S inequality itself!

BiP

## 1. What is the Cauchy Schwarz inequality and how does it relate to parallel lines?

The Cauchy Schwarz inequality, also known as the Cauchy-Bunyakovsky-Schwarz inequality, states that for two vectors, the absolute value of their inner product is always less than or equal to the product of their norms. It is closely related to parallelism: equality holds exactly when one vector is a scalar multiple of the other, i.e. when the vectors are parallel.

## 2. How does the Cauchy Schwarz inequality imply parallel lines?

If the dot product of two real vectors is equal to the product of their magnitudes, then the cosine of the angle between them is 1, so the angle is zero and the vectors point in the same direction. In particular the vectors are parallel, and one is a positive scalar multiple of the other. This is why the equality case of the Cauchy Schwarz inequality is often used to prove parallelism.
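For real vectors this is easy to see numerically; a minimal NumPy sketch, with the example vectors chosen arbitrarily:

```python
import numpy as np

a = np.array([2.0, 1.0, -1.0])
b = 3.0 * a  # same direction: b is a positive multiple of a

# cos(theta) = <a, b> / (||a|| ||b||)
cos_theta = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
assert np.isclose(cos_theta, 1.0)  # angle 0, i.e. the C-S equality case
assert np.isclose(np.dot(a, b), np.linalg.norm(a) * np.linalg.norm(b))
```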

## 3. Can the Cauchy Schwarz inequality be used to prove the converse of parallel lines?

Yes, the converse also holds. If one vector is a positive scalar multiple of the other (so the two have the same direction), then their dot product equals the product of their magnitudes, thus attaining the equality case of the Cauchy Schwarz inequality.

## 4. Are there any other applications of the Cauchy Schwarz inequality?

Yes, the Cauchy Schwarz inequality has many applications in mathematics and other fields such as physics and economics. It is commonly used in optimization problems, functional analysis, and probability theory.

## 5. Is the Cauchy Schwarz inequality a generalization of other inequalities?

The Cauchy Schwarz inequality is closely related to several other important inequalities: it generalizes Cauchy's inequality for finite sums, it is the ##p=q=2## special case of the Hölder inequality, and it is the key step in proving the Minkowski (triangle) inequality for the Euclidean norm. This makes it a powerful tool across various mathematical fields.
