What does it mean to say that two matrices or functions are orthogonal?
  • #1
matqkks
What does it mean to say that two matrices or functions are orthogonal? What does this signify?
I suppose it depends on the inner product. Say, for instance, the inner product is the trace of A^T B.
Is there a real-life application of orthogonal vectors in these sorts of vector spaces?
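A quick sketch in plain Python (not from the thread): for real matrices, the trace inner product ⟨A, B⟩ = tr(A^T B) works out to the sum of elementwise products, so orthogonality under it means that sum is zero.

```python
# Sketch (not from the thread): the trace inner product <A, B> = tr(A^T B).
# For real matrices this equals the sum of elementwise products, so A and B
# are orthogonal under it exactly when that sum is zero.

def frobenius_inner(A, B):
    """<A, B> = tr(A^T B) = sum over i, j of A[i][j] * B[i][j]."""
    return sum(a * b for ra, rb in zip(A, B) for a, b in zip(ra, rb))

A = [[1, 0],
     [0, 1]]   # identity matrix
B = [[0, 1],
     [-1, 0]]  # skew-symmetric: a 90-degree rotation generator
print(frobenius_inner(A, B))  # 0 -> orthogonal under the trace inner product
print(frobenius_inner(A, A))  # 2 -> the squared Frobenius norm of A
```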
 
  • #2


Good morning matqkks

It is a fundamental result in linear algebra that two vectors in a vector space are orthogonal if their inner product is zero.

From the point of view of linear algebra both the space of continuous functions and the space of square matrices form vector spaces with the vectors being individual functions and matrices respectively.

When you study linear algebra you start with some stuff that seems pretty abstract.
That is because you are going beyond simple everyday experience and need some vocabulary to work with.

Knowing that two functions, or a set of functions, are orthogonal is mathematically very powerful.
That is because you can then use the theorems of linear algebra to develop approximations, as close as you like, to difficult functions that may be impossible to calculate any other way.

Examples of this abound in mathematical physics, for instance:

Fourier analysis
Trigonometric and Polynomial series approximations
Chebyshev and Legendre polynomials
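As an illustration of that last item (not part of the original post), here is a plain-Python midpoint-rule sketch checking that the first two Legendre polynomials are orthogonal on [-1, 1]:

```python
# Sketch (not from the post): numerically checking that the Legendre
# polynomials P1(x) = x and P2(x) = (3x^2 - 1)/2 are orthogonal under
# <f, g> = integral from -1 to 1 of f(x) g(x) dx (midpoint rule).

def inner(f, g, a=-1.0, b=1.0, n=100_000):
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) * g(a + (i + 0.5) * h) for i in range(n))

P1 = lambda x: x
P2 = lambda x: (3 * x ** 2 - 1) / 2
print(round(inner(P1, P2), 6))  # 0.0 -> orthogonal on [-1, 1]
print(round(inner(P1, P1), 6))  # 0.666667, i.e. 2/3, the squared norm of P1
```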



I see you state your degree is mathematics. Have you just started? Good luck with the course.
 
  • #3


To say "A and B are orthogonal matrices" might mean "A is an orthogonal matrix and B is an orthogonal matrix" or it might mean "A is orthogonal to B". Which interpretation is the question about?
 
  • #4


"A is orthogonal to B"
 
  • #5


For functions, the usual meaning of orthogonality is that they are orthogonal with respect to a particular inner product. One can define various inner products for functions. For example, an inner product for real valued functions [itex]f(x) [/itex] and [itex] g(x) [/itex] might be defined as [itex] <f,g> = \int_{-1}^1 f(x) g(x) dx [/itex] or [itex] <f,g> = \int_{-\infty}^{\infty} f(x) g(x) dx [/itex] or we might pick a function [itex] k(x) [/itex] (an "integrating kernel") which stays the same for all [itex] f [/itex] and [itex] g [/itex] and define [itex] <f,g> = \int_{-1}^{1} f(x) g(x) k(x) dx [/itex] etc. The functions are orthogonal to each other iff their inner product is zero.

I don't recall seeing any mathematics that hinged on two matrices being orthogonal to each other or any standard definition of what that would mean. Of course, you could consider the entries of a matrix to be one long vector and determine the orthogonality of two matrices by looking at the inner product of those two vectors. However, I think you should state the complete context where you encountered two matrices being orthogonal to each other. There might be several ways to define it. For example, a matrix [itex] M [/itex] can be associated with a vector valued function [itex] y = M x [/itex] where [itex] y [/itex] is a row vector and [itex] x [/itex] is a column vector. Any method of defining the orthogonality of two vector valued functions could be applied to matrices.
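A hedged sketch of the weighted ("integrating kernel") case mentioned above, using the Chebyshev kernel k(x) = 1/√(1 − x²) as an example choice; the substitution x = cos θ is an assumption of this sketch, chosen to avoid the endpoint singularities:

```python
import math

# Sketch (an example choice, not from the post): the weighted inner product
# <f, g> = integral of f(x) g(x) k(x) dx with the Chebyshev kernel
# k(x) = 1/sqrt(1 - x^2) on (-1, 1). Substituting x = cos(theta) turns it
# into an ordinary integral over [0, pi] and avoids the endpoint
# singularities.

def chebyshev_inner(f, g, n=100_000):
    h = math.pi / n
    total = 0.0
    for i in range(n):
        x = math.cos((i + 0.5) * h)  # midpoint in theta, mapped back to x
        total += f(x) * g(x)
    return h * total

T1 = lambda x: x               # Chebyshev polynomial T_1
T2 = lambda x: 2 * x ** 2 - 1  # Chebyshev polynomial T_2
print(round(chebyshev_inner(T1, T2), 6))  # 0.0 -> orthogonal under this weight
```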
 
  • #6


It would be useful if we knew the OP's level of mathematical background before launching into too much detail.
 
  • #7


matqkks said:
What does it mean to say that two matrices or functions are orthogonal? What does this signify?
I suppose it depends on the inner product. Say, for instance, the inner product is the trace of A^T B.
Is there a real-life application of orthogonal vectors in these sorts of vector spaces?

Yep, it depends on the inner product.
If it is zero they are orthogonal.
This orthogonality for matrices (as you defined it) is well-defined mathematically, but I guess you already knew that.
For matrices I've never seen it used in practice, only in mathematical courses.

However, for functions it is the foundation of quantum theory.
In quantum theory a wave function is defined, often denoted with the symbol ψ.
The inner product between 2 such functions is typically defined as the product of the conjugate of the first function with the other function, integrated from minus infinity to plus infinity.

The squared modulus |ψ|² of the (complex-valued) wave function equals the probability density (not the probability itself) of finding the particle in an infinitesimal volume element surrounding a point in space and time.

The inner product between 2 such functions is denoted as <φ|ψ> or <φ|A|ψ>.
This also gave rise to the so called bra-ket notation, where <φ| is "bra" and |ψ> is "ket".
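A minimal numerical sketch (not from the post) of this quantum-style inner product; the grid limits, point count, and the choice of the unnormalized harmonic-oscillator states as test functions are all assumptions of this example:

```python
import math

# Sketch (not from the post): the quantum-style inner product
# <phi|psi> = integral of conj(phi(x)) * psi(x) dx, approximated on a
# finite grid. The unnormalized harmonic-oscillator ground and first
# excited states, exp(-x^2/2) and x*exp(-x^2/2), are orthogonal.

def braket(phi, psi, a=-10.0, b=10.0, n=200_000):
    h = (b - a) / n
    total = 0j
    for i in range(n):
        x = a + (i + 0.5) * h
        total += phi(x).conjugate() * psi(x)  # conjugate the "bra" function
    return h * total

psi0 = lambda x: complex(math.exp(-x * x / 2))
psi1 = lambda x: complex(x * math.exp(-x * x / 2))
print(abs(braket(psi0, psi1)))  # ~0: the two states are orthogonal
print(abs(braket(psi0, psi0)))  # ~1.772454, i.e. sqrt(pi)
```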
 
  • #8


Orthogonal vectors in "interesting" spaces (matrix spaces, function spaces) often follow the intuition set by vectors at right angles in 2- and 3-dimensional space.

Orthogonal signifies a certain kind of "independence" or a complete absence of interference.
 

