What does it mean to say that two matrices or functions are orthogonal?

SUMMARY

In linear algebra, two vectors, matrices, or functions are orthogonal if their inner product equals zero. The discussion highlights various inner product definitions, such as the trace of A^T B for matrices and integrals for functions. Real-life applications include Fourier analysis, quantum theory, and polynomial approximations. The concept is foundational in mathematical physics, providing powerful tools for approximating complex functions.

PREREQUISITES
  • Understanding of linear algebra concepts, particularly inner products
  • Familiarity with Fourier analysis and its applications
  • Knowledge of quantum mechanics and wave functions
  • Basic comprehension of polynomial series and approximation methods
NEXT STEPS
  • Study the properties of inner products in vector spaces
  • Explore Fourier transforms and their applications in signal processing
  • Learn about quantum mechanics, focusing on wave functions and bra-ket notation
  • Investigate polynomial approximation techniques, including Chebyshev and Legendre polynomials
USEFUL FOR

Students of mathematics, physicists, and engineers interested in linear algebra applications, particularly in quantum theory and signal processing.

matqkks
What does it mean to say that two matrices or functions are orthogonal? What does this signify?
I suppose it depends on the inner product. Say the inner product is the trace of A^T B.
Is there a real life application of orthogonal vectors in these sorts of vector spaces?
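To make the matrix case concrete, here is a minimal sketch (my own illustration, assuming NumPy; the matrices are made-up examples) of the trace inner product <A, B> = tr(A^T B), often called the Frobenius inner product, together with a pair of matrices that are orthogonal under it:

```python
import numpy as np

# Trace (Frobenius) inner product: <A, B> = tr(A^T B)
def trace_inner(A, B):
    return np.trace(A.T @ B)

# Two made-up 2x2 matrices chosen so that tr(A^T B) = 0
A = np.array([[1.0, 0.0],
              [0.0, 1.0]])
B = np.array([[0.0, 1.0],
              [-1.0, 0.0]])

print(trace_inner(A, B))   # 0.0 -> A and B are orthogonal under this inner product
print(trace_inner(A, A))   # 2.0 -> the squared (Frobenius) norm of A
```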
 


Good morning matqkks

It is a fundamental result in linear algebra that two vectors in a vector space are orthogonal if their inner product is zero.

From the point of view of linear algebra, both the space of continuous functions and the space of square matrices form vector spaces, with the vectors being individual functions and matrices respectively.

When you study linear algebra you start with some stuff that seems pretty abstract.
That is because you are going beyond simple everyday experience and need some vocabulary to work with.

To know that two functions, or a set of functions, are orthogonal is of enormous mathematical value.
That is because you can use the theorems of linear algebra to develop approximations that are as close as you like to difficult functions that may be impossible to calculate any other way.

Examples of this abound in mathematical physics, for instance (a quick numerical check of two of these follows the list below):

Fourier analysis
Trigonometric and Polynomial series approximations
Chebyshev and Legendre polynomials
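Here is the promised check (my own sketch, assuming NumPy and SciPy are available): sin(2x) and sin(3x) are orthogonal over [-π, π], and the Legendre polynomials P1 and P2 are orthogonal over [-1, 1].

```python
import numpy as np
from scipy.integrate import quad

# Fourier (sine) basis on [-pi, pi]: integral of sin(m x) sin(n x) dx = 0 for m != n
val, _ = quad(lambda x: np.sin(2 * x) * np.sin(3 * x), -np.pi, np.pi)
print(val)   # ~0 (up to floating-point error)

# Legendre polynomials P1 and P2 under <f, g> = integral_{-1}^{1} f(x) g(x) dx
P1 = np.polynomial.legendre.Legendre.basis(1)   # P1(x) = x
P2 = np.polynomial.legendre.Legendre.basis(2)   # P2(x) = (3x^2 - 1)/2
val, _ = quad(lambda x: P1(x) * P2(x), -1.0, 1.0)
print(val)   # ~0
```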



I see you state your degree is mathematics. Have you just started? Good luck with the course.
 


To say "A and B are othogonal matrices" might mean "A is an orthogonal matrix and B is an orthogonal matrix" or it might mean "A is orthogonal to B". Which interpretation is the question about?
 


"A is orthogonal to B"
 


For functions, the usual meaning of orthogonality is that they are orthogonal with respect to a particular inner product. One can define various inner products for functions. For example, an inner product for real valued functions f(x) and g(x) might be defined as \(\langle f,g\rangle = \int_{-1}^1 f(x)\,g(x)\,dx\) or \(\langle f,g\rangle = \int_{-\infty}^{\infty} f(x)\,g(x)\,dx\), or we might pick a function k(x) (an "integrating kernel") which stays the same for all f and g and define \(\langle f,g\rangle = \int_{-1}^{1} f(x)\,g(x)\,k(x)\,dx\), etc. The functions are orthogonal to each other iff their inner product is zero.
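To illustrate the point that orthogonality depends on which inner product you choose (a sketch of my own, assuming SciPy; the functions and kernel are arbitrary examples): f(x) = x and g(x) = x^2 are orthogonal under the unweighted inner product on [-1, 1], but not under the weighted one with kernel k(x) = 1 + x.

```python
from scipy.integrate import quad

f = lambda x: x          # f(x) = x
g = lambda x: x**2       # g(x) = x^2
k = lambda x: 1.0 + x    # an example integrating kernel (weight function)

# <f, g> = integral_{-1}^{1} f(x) g(x) dx  -> 0, so f and g are orthogonal here
plain, _ = quad(lambda x: f(x) * g(x), -1.0, 1.0)

# <f, g>_k = integral_{-1}^{1} f(x) g(x) k(x) dx  -> 2/5, so not orthogonal here
weighted, _ = quad(lambda x: f(x) * g(x) * k(x), -1.0, 1.0)

print(plain, weighted)   # ~0.0   0.4
```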

I don't recall seeing any mathematics that hinged on two matrices being orthogonal to each other, or any standard definition of what that would mean. Of course, you could consider the entries of a matrix to be one long vector and determine the orthogonality of two matrices by looking at the inner product of those two vectors. However, I think you should state the complete context where you encountered two matrices being orthogonal to each other. There might be several ways to define it. For example, a matrix M can be associated with a vector valued function y = Mx, where x and y are column vectors. Any method of defining the orthogonality of two vector valued functions could be applied to matrices.
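For what it's worth, the "one long vector" idea mentioned above agrees with the trace inner product from the original question; a small sketch (assuming NumPy, with arbitrary matrices):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, -1.0],
              [2.0, 5.0]])

# Entry-wise ("one long vector") inner product of A and B
as_vectors = np.dot(A.ravel(), B.ravel())

# Trace inner product tr(A^T B) from the original question
as_trace = np.trace(A.T @ B)

print(as_vectors, as_trace)   # both are 24.0 -- the two definitions coincide
```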
 


It would be useful if we knew the OP's level of mathematical background before launching into too much detail.
 


matqkks said:
What does it mean to say that two matrices or functions are orthogonal? What does this signify?
I suppose it depends on the inner product. Say the inner product is the trace of A^T B.
Is there a real life application of orthogonal vectors in these sorts of vector spaces?

Yep, it depends on the inner product.
If it is zero they are orthogonal.
This orthogonality for matrices (as you defined it) is well-defined mathematically, but I guess you already knew that.
For matrices I've never seen it used in practice, only in mathematical courses.

However, for functions it is the foundation of quantum theory.
In quantum theory a wave function is defined, often denoted with the symbol ψ.
The inner product between 2 such functions is typically defined as the integral, from minus infinity to plus infinity, of the complex conjugate of the first function times the second: \(\langle\varphi|\psi\rangle = \int_{-\infty}^{\infty} \varphi^*(x)\,\psi(x)\,dx\).

The squared modulus |ψ|² of the (complex valued) wave function is equal to the probability density (not just the probability) of finding the particle in an infinitesimal space element surrounding a point in space and time.

The inner product between 2 such functions is denoted as <φ|ψ>; with an operator A sandwiched between them it is written <φ|A|ψ>.
This also gave rise to the so-called bra-ket notation, where <φ| is the "bra" and |ψ> is the "ket".
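A standard concrete example (my own sketch, not from the thread, assuming SciPy): the particle-in-a-box eigenfunctions ψ_n(x) = √(2/L) sin(nπx/L) on [0, L] are orthonormal under <φ|ψ> = ∫ φ*(x) ψ(x) dx.

```python
import numpy as np
from scipy.integrate import quad

L = 1.0   # box length (arbitrary choice)

def psi(n, x):
    """Particle-in-a-box eigenfunction psi_n on [0, L] (real-valued here)."""
    return np.sqrt(2.0 / L) * np.sin(n * np.pi * x / L)

def braket(m, n):
    """<psi_m | psi_n> = integral of conj(psi_m(x)) * psi_n(x) over the box."""
    val, _ = quad(lambda x: np.conj(psi(m, x)) * psi(n, x), 0.0, L)
    return val

print(braket(1, 2))   # ~0 -> different eigenstates are orthogonal
print(braket(1, 1))   # ~1 -> |psi_1|^2 integrates to 1, a normalised probability density
```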
 


Orthogonal vectors in "interesting" spaces (matrix spaces, function spaces) often follow intuitions set by vectors at "right angles" in 2- and 3-D space.

Orthogonal signifies a certain kind of "independence" or a complete absence of interference.
 
