
- Thread starter matqkks

- #1


I suppose it depends on the inner product. Say the inner product is the trace of [itex]A^T B[/itex].

Is there a real-life application of orthogonal vectors in this sort of vector space?

- #2


Good morning, matqkks.

By definition, two vectors in an inner product space are orthogonal if their inner product is zero.

From the point of view of linear algebra, both the space of continuous functions and the space of square matrices form vector spaces, with the vectors being individual functions and matrices respectively.

When you study linear algebra you start with some stuff that seems pretty abstract.

That is because you are going beyond simple everyday experience and need some vocabulary to work with.

Knowing that two functions, or a set of functions, are orthogonal is incredibly powerful mathematically.

That is because you can use the theorems of linear algebra to develop approximations that are as close as you like to difficult functions that may be impossible to calculate any other way.

Examples of this abound in mathematical physics, for instance:

Fourier analysis

Trigonometric and Polynomial series approximations

Chebyshev and Legendre polynomials
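As a small numeric sketch of the approximation idea above (the choice of target function and quadrature rule is mine, purely for illustration): because sin(kx) and sin(mx) are orthogonal on [−π, π] for k ≠ m, each coefficient of a Fourier sine series can be computed independently with inner products.

```python
import math

# Numerical inner product <f, g> = integral of f(x) g(x) over [-pi, pi] (trapezoid rule)
def inner(f, g, n=10000):
    a, b = -math.pi, math.pi
    h = (b - a) / n
    s = 0.5 * (f(a) * g(a) + f(b) * g(b))
    for i in range(1, n):
        x = a + i * h
        s += f(x) * g(x)
    return s * h

# Orthogonality lets each Fourier sine coefficient of f(x) = x be computed
# on its own as <f, sin(kx)> / <sin(kx), sin(kx)>.
f = lambda x: x
coeffs = [inner(f, lambda x, k=k: math.sin(k * x)) /
          inner(lambda x, k=k: math.sin(k * x), lambda x, k=k: math.sin(k * x))
          for k in range(1, 6)]

def approx(x):
    """Five-term sine-series approximation of f(x) = x on (-pi, pi)."""
    return sum(c * math.sin(k * x) for k, c in enumerate(coeffs, start=1))

print([round(c, 4) for c in coeffs])  # the classic 2, -1, 2/3, -1/2, 2/5 pattern
```

Adding more terms makes the approximation as close as you like, which is exactly the payoff described above.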

I see you state your degree is mathematics. Have you just started? Good luck with the course.

- #3

Stephen Tashi

Science Advisor


To say "A and B are orthogonal matrices" might mean "A is an orthogonal matrix and B is an orthogonal matrix" or it might mean "A is orthogonal to B". Which interpretation is the question about?

- #4


"A is orthogonal to B"

- #5

Stephen Tashi

Science Advisor


For functions, the usual meaning of orthogonality is that they are orthogonal with respect to a particular inner product. One can define various inner products for functions. For example, an inner product for real valued functions [itex]f(x) [/itex] and [itex] g(x) [/itex] might be defined as [itex] <f,g> = \int_{-1}^1 f(x) g(x) dx [/itex] or [itex] <f,g> = \int_{-\infty}^{\infty} f(x) g(x) dx [/itex] or we might pick a function [itex] k(x) [/itex] (an "integrating kernel") which stays the same for all [itex] f [/itex] and [itex] g [/itex] and define [itex] <f,g> = \int_{-1}^{1} f(x) g(x) k(x) dx [/itex] etc. The functions are orthogonal to each other iff their inner product is zero.
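To make this concrete, here is a rough numeric sketch of the first inner product above (the example functions and the midpoint quadrature rule are my own choices):

```python
# <f, g> = integral of f(x) g(x) over [-1, 1], approximated by the midpoint rule
def inner(f, g, a=-1.0, b=1.0, n=10000):
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) * g(a + (i + 0.5) * h) for i in range(n)) * h

# An odd function and an even function are orthogonal on a symmetric interval:
print(inner(lambda x: x, lambda x: x * x))                 # ≈ 0
# The Legendre polynomials P1(x) = x and P2(x) = (3x^2 - 1)/2 are orthogonal on [-1, 1]:
print(inner(lambda x: x, lambda x: (3 * x * x - 1) / 2))   # ≈ 0
```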

I don't recall seeing any mathematics that hinged on two matrices being orthogonal to each other or any standard definition of what that would mean. Of course, you could consider the entries of a matrix to be one long vector and determine the orthogonality of two matrices by looking at the inner product of those two vectors. However, I think you should state the complete context where you encountered two matrices being orthogonal to each other. There might be several ways to define it. For example, a matrix [itex] M [/itex] can be associated with a vector valued function [itex] y = M x [/itex] where [itex] x [/itex] and [itex] y [/itex] are column vectors. Any method of defining the orthogonality of two vector valued functions could be applied to matrices.
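Taking the inner product the thread started with, the trace of [itex]A^T B[/itex] (which equals the entrywise "one long vector" inner product just described), a quick sketch looks like this; the example matrices are arbitrary:

```python
# Frobenius inner product <A, B> = trace(A^T B) = sum of A[i][j] * B[i][j],
# written out in plain Python for matrices given as nested lists.
def frobenius_inner(A, B):
    return sum(A[i][j] * B[i][j]
               for i in range(len(A)) for j in range(len(A[0])))

A = [[1, 0],
     [0, -1]]
B = [[0, 1],
     [1, 0]]

print(frobenius_inner(A, B))  # 0  -> A and B are orthogonal under this inner product
print(frobenius_inner(A, A))  # 2  -> squared "length" of A
```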

- #6


It would be useful if we knew the OP's level of mathematical background before launching into too much detail.

- #7

I like Serena

Homework Helper


Yep, it depends on the inner product: with the one you defined, the trace of [itex]A^T B[/itex], two matrices are orthogonal if that trace is zero.

This orthogonality for matrices (as you defined it) is well-defined mathematically, but I guess you already knew that.

For matrices I've never seen it used in practice, only in mathematical courses.

However, for functions it is the foundation of quantum theory.

In quantum theory a wave function is defined, often denoted with the symbol ψ.

The inner product of two such functions is typically defined as the integral, from minus infinity to plus infinity, of the complex conjugate of the first function times the second.

The square of the (complex-valued) modulus, |ψ|², gives the probability density of finding the particle at a given position.

This inner product is denoted <φ|ψ>; with an operator A sandwiched in between, one writes <φ|A|ψ>.

This also gave rise to the so called bra-ket notation, where <φ| is "bra" and |ψ> is "ket".
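As a rough numeric illustration of this inner product (the Gaussian wave packets and the truncation of the integral to a finite interval are my own choices, purely for demonstration):

```python
import cmath
import math

# <phi|psi> = integral of conj(phi(x)) * psi(x) dx, truncated to [-L, L]
# for a numerical check (midpoint rule).
def braket(phi, psi, L=20.0, n=40000):
    h = 2 * L / n
    return sum(phi(-L + (i + 0.5) * h).conjugate() * psi(-L + (i + 0.5) * h)
               for i in range(n)) * h

# Hypothetical example wave functions: Gaussian packets exp(ikx) * exp(-x^2/2)
def packet(k):
    return lambda x: cmath.exp(1j * k * x) * math.exp(-x * x / 2)

norm = braket(packet(0.0), packet(0.0))           # <psi|psi> = sqrt(pi) for this Gaussian
overlap = abs(braket(packet(0.0), packet(40.0)))  # nearly zero: almost orthogonal packets
print(norm.real, overlap)
```

Note the conjugate on the first argument: that is what makes <ψ|ψ> real and non-negative, so it can play the role of a squared length.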

- #8


Orthogonal vectors in "interesting" spaces (matrix spaces, function spaces) often follow intuitions set by vectors at "right angles" in 2- and 3-D space.

Orthogonal signifies a certain kind of "independence" or a complete absence of interference.
