Why Are My Gram-Schmidt Results Different on Wolfram|Alpha and MATLAB?

  • Context: MHB
  • Thread starter: nedf
  • Tags: Process
SUMMARY

The discrepancy in Gram-Schmidt results between Wolfram|Alpha and MATLAB arises from the treatment of the dot product for complex numbers. MATLAB's dot function conjugates its first argument, i.e. $\mathbf a \cdot \mathbf b = \sum \overline{a_i}\,b_i$, while the user's Gram-Schmidt formula assumes the convention $\mathbf a \cdot \mathbf b = \sum a_i\,\overline{b_i}$. Mixing the two conventions produces different (and non-orthogonal) outputs in the Gram-Schmidt process. Additionally, the orthonormal basis generated for a given matrix is not unique, so different tools can return different, equally valid bases.

PREREQUISITES
  • Understanding of the Gram-Schmidt process
  • Familiarity with complex numbers and their dot product
  • Knowledge of MATLAB syntax and functions
  • Basic linear algebra concepts, including orthonormal bases
NEXT STEPS
  • Review the definition of the dot product in MATLAB documentation
  • Learn about the Gram-Schmidt process in detail, focusing on complex vectors
  • Explore the concept of orthonormal bases and their properties in linear algebra
  • Investigate the differences between various inner product definitions in mathematics
USEFUL FOR

Mathematicians, computer scientists, and engineers working with linear algebra, particularly those using MATLAB for computations involving complex numbers and orthonormal bases.
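As a quick MATLAB sketch of the convention difference described in the summary (the vectors below are made up purely for illustration):

```matlab
% Two made-up complex vectors, purely for illustration
a = [1+2i; 3-1i];
b = [2-1i; 1+1i];

d_matlab = dot(a, b);          % MATLAB's dot: sum(conj(a) .* b)
d_other  = sum(a .* conj(b));  % the other convention: sum(a_i * conj(b_i))

% The two results are complex conjugates of each other
disp([d_matlab, d_other])
```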

nedf said:
Wolfram|Alpha provides this solution:
[Wolfram|Alpha result]
However, when I compute the second term v2 = x2 - (x2.v1)/(v1.v1) * v1, the result is different from the one above. What's wrong?

Hi nedf! Welcome to MHB! ;)

Did you take into account that the dot product for complex numbers requires a conjugation?
That is, $\mathbf a \cdot \mathbf b = \sum a_i\overline{b_i}$.
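For instance, with the made-up vectors $\mathbf a = (i,\ 1)$ and $\mathbf b = (i,\ 2)$ this convention gives
$$\mathbf a \cdot \mathbf b = i\,\overline{i} + 1\cdot\overline{2} = i(-i) + 2 = 3,$$
whereas dropping the conjugation would give $i\cdot i + 2 = 1$. More importantly, without the conjugation we would get $\mathbf a \cdot \mathbf a = i\cdot i + 1\cdot 1 = 0$, so a nonzero vector could have zero "length" and the normalization step of Gram-Schmidt would break down.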
 
I like Serena said:
Hi nedf! Welcome to MHB! ;)

Did you take into account that the dot product for complex numbers requires a conjugation?
That is, $\mathbf a \cdot \mathbf b = \sum a_i\overline{b_i}$.
Thanks.
At the end of the page on https://www.mathworks.com/help/matlab/ref/dot.html, the dot product is defined the other way around:
$\mathbf a \cdot \mathbf b = \sum \overline{a_i}\,b_i$
Why is it defined this way instead? Wouldn't the answer be different?

Also, I computed orth on MATLAB:
[MATLAB output of orth]

Why is it different from Wolfram|Alpha's result?
[Wolfram|Alpha output]
Is the orthonormal (normalized) basis unique for a given matrix?
 
nedf said:
Thanks.
At the end of the page on https://www.mathworks.com/help/matlab/ref/dot.html, the dot product is defined the other way around:
$\mathbf a \cdot \mathbf b = \sum \overline{a_i}\,b_i$
Why is it defined this way instead? Wouldn't the answer be different?

It's a matter of convention. Both dot products are valid inner products, and one is the conjugate of the other.
However, your formula for the Gram-Schmidt process assumes the $\sum a_i\overline{b_i}$ version; otherwise the dot product in the fraction would have to be written the other way around.
So the results are the same, provided the formula matches the convention. As written, your Gram-Schmidt formula is incompatible with MathWorks's variant.

Note that with the standard $\sum a_i\overline{b_i}$ we have:
$$\begin{aligned}\mathbf v_1\cdot \mathbf v_2 &= \mathbf v_1 \cdot\left( \mathbf x_2-\frac{\mathbf x_2 \cdot \mathbf v_1}{\mathbf v_1 \cdot \mathbf v_1}\mathbf v_1\right)
= \mathbf v_1 \cdot \mathbf x_2- \mathbf v_1 \cdot \left(\frac{\mathbf x_2 \cdot \mathbf v_1}{\mathbf v_1 \cdot \mathbf v_1}\mathbf v_1\right)
= \mathbf v_1 \cdot \mathbf x_2- \overline{\left(\frac{\mathbf x_2 \cdot \mathbf v_1}{\mathbf v_1 \cdot \mathbf v_1}\right)}\,\mathbf v_1 \cdot \mathbf v_1 \\
&= \mathbf v_1 \cdot \mathbf x_2- \left(\frac{\mathbf v_1 \cdot \mathbf x_2}{\mathbf v_1 \cdot \mathbf v_1}\right)\mathbf v_1 \cdot \mathbf v_1
= \mathbf v_1 \cdot \mathbf x_2 - \mathbf v_1 \cdot \mathbf x_2 = 0
\end{aligned}$$
The coefficient picks up a conjugate because this dot product is conjugate-linear in its second argument; the next step then uses $\overline{\mathbf x_2 \cdot \mathbf v_1} = \mathbf v_1 \cdot \mathbf x_2$ and the fact that $\mathbf v_1 \cdot \mathbf v_1$ is real.
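
In MATLAB terms, here is a minimal sketch of the same point (with made-up vectors x1, x2): since dot(v1, x2) = sum(conj(v1).*x2) equals the $\mathbf x_2\cdot\mathbf v_1 = \sum x_{2,i}\,\overline{v_{1,i}}$ of your formula, putting $\mathbf v_1$ first keeps the Gram-Schmidt step consistent, while dot(x2, v1) conjugates the coefficient and spoils the orthogonality:

```matlab
% Made-up complex vectors, purely for illustration
x1 = [1+1i; 2; -1i];
x2 = [3; 1-2i; 1i];
v1 = x1;

% Consistent with the Gram-Schmidt formula above (x2.v1 = sum(x2 .* conj(v1)))
v2_ok  = x2 - (dot(v1, x2) / dot(v1, v1)) * v1;

% Mismatched convention: the coefficient gets conjugated
v2_bad = x2 - (dot(x2, v1) / dot(v1, v1)) * v1;

abs(dot(v1, v2_ok))    % ~0 up to round-off
abs(dot(v1, v2_bad))   % generally nonzero
```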

Also, I computed orth on MATLAB. Why is it different from Wolfram|Alpha's result? Is the orthonormal (normalized) basis unique for a given matrix?

And no, an orthonormal basis is typically not unique.
Consider for instance $\mathbb R^3$.
The standard orthonormal basis is $\{(1,0,0),(0,1,0),(0,0,1)\}$.
But $\{(1,0,0),(0,1/\sqrt 2,1/\sqrt 2),(0,1/\sqrt 2,-1/\sqrt 2)\}$ is also an orthonormal basis.
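
In the same spirit, a small MATLAB sketch (with a made-up matrix A, purely for illustration) shows that orth and a hand-rolled Gram-Schmidt both return orthonormal bases for the same column space, even though the columns differ:

```matlab
% Made-up complex matrix, purely for illustration
A = [1+1i, 3; 2, 1-2i; -1i, 1i];

Q1 = orth(A);    % MATLAB's orthonormal basis for the column space of A

% Hand-rolled Gram-Schmidt on the columns of A
v1 = A(:,1);
v2 = A(:,2) - (dot(v1, A(:,2)) / dot(v1, v1)) * v1;
Q2 = [v1/norm(v1), v2/norm(v2)];

% Both satisfy Q'*Q = I (' is the conjugate transpose), and both span
% the same column space, yet their columns differ.
disp(Q1'*Q1)
disp(Q2'*Q2)
```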
 
