What does it mean to say that two matrices or functions are orthogonal?


Discussion Overview

The discussion revolves around the concept of orthogonality in the context of matrices and functions, exploring its mathematical significance and potential real-life applications. Participants examine the definitions and implications of orthogonality, particularly in relation to inner products.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • Some participants propose that two vectors or functions are orthogonal if their inner product is zero, with various definitions of inner products being discussed.
  • One participant mentions that orthogonality can be defined differently depending on the context, such as using the trace of AᵀB for matrices.
  • Another participant highlights the mathematical value of knowing functions are orthogonal, as it allows for approximations of complex functions using linear algebra theorems.
  • There is a suggestion that the term "orthogonal matrices" could refer to either matrices being orthogonal to each other or each being an orthogonal matrix individually, leading to a clarification request.
  • One participant notes that while orthogonality for matrices is mathematically well-defined, they have not encountered practical applications for it, unlike for functions, which are foundational in quantum theory.
  • Another participant emphasizes that orthogonal vectors in higher-dimensional spaces maintain intuitions from lower dimensions, signifying independence and absence of interference.

Areas of Agreement / Disagreement

Participants express varying interpretations of orthogonality, particularly regarding matrices versus functions. There is no consensus on the practical applications of orthogonality for matrices, while functions are acknowledged to have significant implications in quantum theory.

Contextual Notes

Some discussions hinge on the definitions of inner products and the context in which orthogonality is applied, indicating that the understanding of orthogonality may depend on specific mathematical frameworks or applications.

Who May Find This Useful

This discussion may be of interest to students and practitioners in linear algebra, quantum mechanics, and mathematical physics, as well as those exploring the theoretical aspects of vector spaces.

matqkks
What does it mean to say that two matrices or functions are orthogonal? What does this signify?
I suppose it depends on the inner product. Say the inner product is the trace of [itex]A^T B[/itex].
Is there a real-life application of orthogonal vectors in these sorts of vector spaces?
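The trace form mentioned in the question can be checked numerically. A minimal sketch (assuming NumPy; the example matrices are made up for illustration) of the inner product <A, B> = tr(AᵀB), usually called the Frobenius inner product:

```python
import numpy as np

def frobenius_inner(A, B):
    """Inner product <A, B> = tr(A^T B) on the space of n x n matrices."""
    return np.trace(A.T @ B)

# Two hypothetical example matrices, chosen to be orthogonal under this product.
A = np.array([[1.0, 0.0],
              [0.0, 0.0]])
B = np.array([[0.0, 1.0],
              [0.0, 0.0]])

print(frobenius_inner(A, B))  # 0.0: A and B are orthogonal under tr(A^T B)
print(frobenius_inner(A, A))  # 1.0: the squared Frobenius norm of A
```

Under this inner product, "A is orthogonal to B" just means tr(AᵀB) = 0, exactly as for ordinary vectors.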
 
Good morning matqkks

It is a fundamental result in linear algebra that two vectors in a vector space are orthogonal if their inner product is zero.

From the point of view of linear algebra both the space of continuous functions and the space of square matrices form vector spaces with the vectors being individual functions and matrices respectively.

When you study linear algebra you start with some stuff that seems pretty abstract.
That is because you are going beyond simple everyday experience and need some vocabulary to work with.

To know that two functions or a set of functions are orthogonal is of enormous mathematical value.
That is because you can use the theorems of linear algebra to develop approximations that are as close as you like to difficult functions that may be impossible to calculate any other way.

Examples of this abound in mathematical physics, for instance:

Fourier analysis
Trigonometric and Polynomial series approximations
Chebyshev and Legendre polynomials
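As a concrete check of one of these (an illustrative sketch, not from the thread; assuming NumPy), the Legendre polynomials are orthogonal on [-1, 1] under the inner product <f, g> = ∫ f(x) g(x) dx, which can be verified numerically:

```python
import numpy as np
from numpy.polynomial import legendre

x = np.linspace(-1.0, 1.0, 200001)

def legendre_P(n, x):
    """Evaluate the degree-n Legendre polynomial P_n at the points x."""
    coeffs = np.zeros(n + 1)
    coeffs[n] = 1.0
    return legendre.legval(x, coeffs)

def inner(f_vals, g_vals, x):
    """<f, g> = integral of f*g over [-1, 1], composite trapezoid rule."""
    y = f_vals * g_vals
    return np.sum((y[:-1] + y[1:]) * np.diff(x)) / 2.0

p2, p3 = legendre_P(2, x), legendre_P(3, x)
print(abs(inner(p2, p3, x)) < 1e-8)        # True: different degrees are orthogonal
print(abs(inner(p3, p3, x) - 2/7) < 1e-6)  # True: <P_n, P_n> = 2/(2n+1)
```

It is exactly this orthogonality that lets you read off the coefficients of a series approximation one at a time.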



I see you state your degree as mathematics. Have you just started? Good luck with the course.
 


To say "A and B are orthogonal matrices" might mean "A is an orthogonal matrix and B is an orthogonal matrix" or it might mean "A is orthogonal to B". Which interpretation is the question about?
 


"A is orthogonal to B"
 


For functions, the usual meaning of orthogonality is that they are orthogonal with respect to a particular inner product. One can define various inner products for functions. For example, an inner product for real valued functions [itex]f(x)[/itex] and [itex]g(x)[/itex] might be defined as [itex]<f,g> = \int_{-1}^1 f(x) g(x) dx[/itex] or [itex]<f,g> = \int_{-\infty}^{\infty} f(x) g(x) dx[/itex], or we might pick a function [itex]k(x)[/itex] (an "integrating kernel") which stays the same for all [itex]f[/itex] and [itex]g[/itex] and define [itex]<f,g> = \int_{-1}^{1} f(x) g(x) k(x) dx[/itex], etc. The functions are orthogonal to each other iff their inner product is zero.

I don't recall seeing any mathematics that hinged on two matrices being orthogonal to each other, or any standard definition of what that would mean. Of course, you could consider the entries of a matrix to be one long vector and determine the orthogonality of two matrices by looking at the inner product of those two vectors. However, I think you should state the complete context where you encountered two matrices being orthogonal to each other. There might be several ways to define it. For example, a matrix [itex]M[/itex] can be associated with a vector valued function [itex]y = M x[/itex], where [itex]x[/itex] and [itex]y[/itex] are column vectors. Any method of defining the orthogonality of two vector valued functions could be applied to matrices.
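The "one long vector" idea can be sketched numerically (assuming NumPy, with random example matrices): flattening the two matrices and taking an ordinary dot product agrees exactly with the trace definition from the original question, since tr(AᵀB) = Σᵢⱼ AᵢⱼBᵢⱼ.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

vec_inner = A.ravel() @ B.ravel()  # entries treated as one long vector
trace_inner = np.trace(A.T @ B)    # the trace definition

print(np.isclose(vec_inner, trace_inner))  # True: the two definitions agree
```

So the "flatten to a vector" inner product and the trace inner product are the same thing, not two competing definitions.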
 


It would be useful if we knew the OP's level of mathematical background before launching into too much detail.
 


matqkks said:
What does it mean to say that two matrices or functions are orthogonal? What does this signify?
I suppose it depends on the inner product. Say the inner product is the trace of [itex]A^T B[/itex].
Is there a real-life application of orthogonal vectors in these sorts of vector spaces?

Yep, it depends on the inner product.
If it is zero they are orthogonal.
This orthogonality for matrices (as you defined it) is well-defined mathematically, but I guess you already knew that.
For matrices I've never seen it used in practice, only in mathematical courses.

However, for functions it is the foundation of quantum theory.
In quantum theory a wave function is defined, often denoted with the symbol ψ.
The inner product between 2 such functions is typically defined as the product of the conjugate of the first function with the other function, integrated from minus infinity to plus infinity.

The squared modulus |ψ|² (ψ itself is complex valued) is equal to the probability density (not just the probability) of finding a particle in an infinitesimal space element surrounding a point in space and time.

The inner product between two such functions is denoted as <φ|ψ> (and, with an operator A sandwiched in between, as <φ|A|ψ>).
This also gave rise to the so called bra-ket notation, where <φ| is "bra" and |ψ> is "ket".
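A minimal numerical sketch of this inner product (assuming NumPy; the two Gaussian-weighted "wave functions" below are hypothetical, chosen because they match the shapes of harmonic-oscillator eigenstates and so should come out orthogonal):

```python
import numpy as np

# Discretize <phi|psi> = integral of conj(phi(x)) * psi(x) dx on a grid.
x = np.linspace(-10.0, 10.0, 200001)
dx = x[1] - x[0]

phi = np.exp(-x**2 / 2)       # even, ground-state-like shape
psi = x * np.exp(-x**2 / 2)   # odd, first-excited-state-like shape

inner = np.sum(np.conj(phi) * psi) * dx  # Riemann-sum approximation
print(abs(inner) < 1e-8)  # True: an even times an odd function integrates to 0
```

The conjugate matters for genuinely complex ψ: it is what makes <ψ|ψ> real and non-negative, so it can serve as a squared norm (total probability).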
 


Orthogonal vectors in "interesting" spaces (matrix spaces, function spaces) often follow intuitions set by vectors at "right angles" in 2- and 3-D space.

Orthogonal signifies a certain kind of "independence" or a complete absence of interference.
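One way to sketch this "independence" (an illustrative example, assuming NumPy): with an orthonormal basis, each expansion coefficient is obtained by its own projection, with no interference from the other basis directions.

```python
import numpy as np

rng = np.random.default_rng(1)
# Build a random orthonormal basis of R^4 via QR decomposition;
# the columns of Q are mutually orthogonal unit vectors.
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))

v = rng.standard_normal(4)
coeffs = Q.T @ v            # each coefficient is an independent projection <q_i, v>
reconstructed = Q @ coeffs  # summing the orthogonal components recovers v

print(np.allclose(reconstructed, v))  # True
```

Changing one coefficient moves v only along its own basis direction and leaves every other projection untouched; that is the "absence of interference" in concrete form.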
 
