# Orthonormal Basis Times a Real Matrix

In summary, if $\{u_1, u_2, \ldots, u_n\}$ is an orthonormal basis of a real inner product space and $(v_1, v_2, \ldots, v_n) = (u_1, u_2, \ldots, u_n)A$ for a real $n \times n$ matrix $A$, then the $v_j$ form an orthonormal basis if and only if $A$ is orthogonal.
linearishard
Hi!

I have an orthonormal basis for vector space $V$, $\{u_1, u_2, ..., u_n\}$. If $(v_1, v_2, ..., v_n) = (u_1, u_2, ... u_n)A$ where $A$ is a real $n\times n$ matrix, how do I prove that $(v_1, v_2, ... v_n)$ is an orthonormal basis if and only if $A$ is an orthogonal matrix?

Thanks!

Could you explain how $(u_1, u_2, \ldots, u_n)A$ is defined? I don't understand, since the $u_j$ are vectors in an arbitrary $n$-dimensional (real) vector space, while $A$ is an $n \times n$ matrix.

Do you mean to write $v_j = Au_j$ where $A$ is an endomorphism of a real inner product space $V$ and then prove that the $v_j$ form an orthonormal basis iff $A$ is an orthogonal transformation?

Hi, yes that is what I meant! Sorry!

Hi linearishard,

First suppose $A$ is orthogonal. Then $A^TA = I$, so for all $i,j\in \{1,2,\ldots, n\}$, $$\langle Au_i, Au_j\rangle = \langle A^TAu_i, u_j\rangle =\langle Iu_i, u_j\rangle = \langle u_i, u_j\rangle = \delta_{ij}$$ where the last equality follows from orthonormality of $\{u_1,\ldots, u_n\}$. Therefore, $\{v_1,\ldots, v_n\}$ is an orthonormal set. The set $\{v_1,\ldots, v_n\}$ is also linearly independent, for if $$\sum_{i = 1}^n c_i v_i = 0$$ is a linear dependence relation, then $$A\left(\sum_{i = 1}^n c_i u_i\right) = 0$$ Pre-multiplying both sides of the equation by $A^T$ and using the fact $A^TA = I$, one receives $$\sum_{i = 1}^n c_i u_i = 0$$ whence $c_1 = \cdots = c_n = 0$ by linear independence of $\{u_1,\ldots, u_n\}$. Thus, $\{v_1,\ldots, v_n\}$ is a basis for $V$.
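The forward direction above can be checked numerically. A minimal sketch using NumPy (the random orthogonal matrix, the choice of the standard basis for the $u_j$, and the seed are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# A random orthogonal matrix A, obtained from the QR decomposition
# of a random real matrix.
A, _ = np.linalg.qr(rng.standard_normal((n, n)))

# An orthonormal basis u_1, ..., u_n: here, the standard basis
# (the columns of the identity matrix).
U = np.eye(n)

# v_j = A u_j, collected as the columns of V.
V = A @ U

# The Gram matrix <v_i, v_j> should be the identity, i.e. delta_ij.
gram = V.T @ V
print(np.allclose(gram, np.eye(n)))  # True
```

The Gram matrix coming out as the identity is exactly the statement $\langle v_i, v_j \rangle = \delta_{ij}$ from the proof.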

Conversely, suppose $\{v_1,\ldots, v_n\}$ an orthonormal basis. For all indices $i$ and $j$, $\langle Au_i, Au_j\rangle = \delta_{ij}$. So since $\{u_1,\ldots, u_n\}$ is an orthonormal basis then for every $i$ $$A^T Au_i = \sum_{j = 1}^n \langle A^T A u_i, u_j\rangle u_j = \sum_{j = 1}^n \langle Au_i, Au_j\rangle u_j = \sum_{j = 1}^n \delta_{ij}u_j = u_i$$ Consequently $A^TA = I$, i.e., $A$ is orthogonal. $\blacksquare$
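The converse argument rests on the expansion of a vector in an orthonormal basis, $x = \sum_j \langle x, u_j\rangle u_j$, applied to $x = A^TAu_i$. A short sketch verifying that expansion identity (the particular basis and seed are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# Any orthonormal basis works; take the columns of a random orthogonal Q.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
u = [Q[:, j] for j in range(n)]

# Expansion of an arbitrary vector x in the orthonormal basis:
# x = sum_j <x, u_j> u_j
x = rng.standard_normal(n)
expansion = sum(np.dot(x, u[j]) * u[j] for j in range(n))
print(np.allclose(expansion, x))  # True
```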

Thank you for your response. Could you please explain the logic behind the last line? How do you go from $\langle A^TAu_i, u_j\rangle$ to $\langle Au_i, Au_j\rangle$?

Since $A^T$ is the adjoint of $A$, $\langle A^Tx,y\rangle = \langle x, Ay\rangle$ for all $x,y\in V$. In particular, the equality holds when $x = Au_i$ and $y = u_j$.

Since we're apparently working with real numbers, we can write:
$$\langle A^Tx,y\rangle = (A^Tx)^T \cdot y = (x^TA)\cdot y =x^T\cdot (Ay) =\langle x, Ay\rangle$$
This holds in general for the dot product of real vectors.
For complex numbers the same thing holds, just with the conjugate transpose or adjoint (usually written as $A^\dagger$, $A^*$, or $A^H$) instead of the transpose $A^T$.
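The adjoint identity is easy to verify numerically in the real case. A minimal sketch (the dimension and seed are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5

A = rng.standard_normal((n, n))
x = rng.standard_normal(n)
y = rng.standard_normal(n)

# <A^T x, y> == <x, A y> for the real dot product.
lhs = np.dot(A.T @ x, y)
rhs = np.dot(x, A @ y)
print(np.isclose(lhs, rhs))  # True
```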

## What is an orthonormal basis?

An orthonormal basis is a set of vectors in a vector space that are mutually perpendicular and have unit length. This means that each vector is perpendicular to every other vector in the set, and each vector has a magnitude of 1.

## What is a real matrix?

A real matrix is a matrix that contains only real numbers as its elements. These matrices are used to represent linear transformations in real vector spaces.

## How are orthonormal bases used in conjunction with real matrices?

Orthonormal bases simplify changes of coordinates: the coordinates of a vector are just its inner products with the basis vectors. Multiplying an orthonormal basis by a real matrix $A$ produces a new family of vectors, and, as shown above, this family is again an orthonormal basis exactly when $A$ is orthogonal.

## What is the significance of an orthonormal basis times a real matrix?

The product of an orthonormal basis and a real matrix is a fundamental concept in linear algebra and has many applications in physics, engineering, and computer graphics. It allows for efficient calculation and manipulation of vectors in different coordinate systems.

## How do you compute the product of an orthonormal basis times a real matrix?

Each new vector is the linear combination $v_j = \sum_{i=1}^n A_{ij}\, u_i$; equivalently, if the $u_i$ are stacked as the columns of a matrix $U$, then the $v_j$ are the columns of the product $UA$. This can be computed by hand or with software such as MATLAB or Python's NumPy library.
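The column-by-column computation described above can be sketched in NumPy as follows (the dimension and seed are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3

# Stack an orthonormal basis u_1, ..., u_n as the columns of U
# (QR decomposition of a random matrix provides one).
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = rng.standard_normal((n, n))

# (v_1, ..., v_n) = (u_1, ..., u_n) A means v_j = sum_i A[i, j] * u_i,
# which is the j-th column of the matrix product U @ A.
V = U @ A
v_0 = sum(A[i, 0] * U[:, i] for i in range(n))
print(np.allclose(V[:, 0], v_0))  # True
```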
