About orthogonality and nullspaces

  • Thread starter: applechu
  • Tags: Orthogonality
applechu
Hi:
I saw an example about null spaces and orthogonality; the example is the following:

$$Ax=\begin{bmatrix} 1 & 3 &4\\ 5 & 2& 7 \end{bmatrix} \times \left[ \begin{array}{c} 1 \\ 1\\-1 \end{array} \right]=\begin{bmatrix} 0\\0\end{bmatrix}$$
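Writing out the multiplication row by row, each row of ##A## is orthogonal to ##x = (1, 1, -1)##:

$$\begin{bmatrix} 1 & 3 & 4 \end{bmatrix}\left[ \begin{array}{c} 1 \\ 1 \\ -1 \end{array} \right] = 1 + 3 - 4 = 0, \qquad \begin{bmatrix} 5 & 2 & 7 \end{bmatrix}\left[ \begin{array}{c} 1 \\ 1 \\ -1 \end{array} \right] = 5 + 2 - 7 = 0.$$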

The conclusion says the null space of ##A^T## is only the zero vector (which is orthogonal to every vector). I don't understand why the column space of ##A## and the null space of ##A^T## are orthogonal spaces.
I know the null space is the set of solutions of ##Ax=0##; but in this theorem, why are the columns of ##A## related to the null space of ##A^T##?
Thanks.
 
Notice that the null space of ##A^T## is orthogonal to ##(1, 5)##, ##(3, 2)##, and ##(4, 7)##. So it is orthogonal to the space spanned by those vectors, which is the column space of ##A##.
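To spell that out for this particular ##A## (a worked step along the same lines): if ##y## is in the null space of ##A^T##, then

$$y \cdot \begin{bmatrix} 1 \\ 5 \end{bmatrix} = 0 \quad\text{and}\quad y \cdot \begin{bmatrix} 3 \\ 2 \end{bmatrix} = 0,$$

and since ##(1, 5)## and ##(3, 2)## are linearly independent, they span ##\mathbb{R}^2##, so the only vector orthogonal to both is ##y = 0##. That is why the example concludes that the null space of ##A^T## contains only the zero vector.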
 
$$A^T= \begin{bmatrix}1 & 5 \\ 3 & 2 \\ 4 & 7\end{bmatrix}$$
so the condition that a vector be in the null space of ##A^T## is
$$\begin{bmatrix}1 & 5 \\ 3 & 2 \\ 4 & 7\end{bmatrix}\begin{bmatrix}x \\ y \end{bmatrix}= \begin{bmatrix}0 \\ 0 \\ 0 \end{bmatrix}$$
which is the same as
$$\begin{bmatrix}x+ 5y \\ 3x+ 2y \\ 4x+ 7y\end{bmatrix}= \begin{bmatrix}0 \\ 0 \\ 0 \end{bmatrix}$$
Do you see now how the columns of ##A##, which become the rows of ##A^T##, are relevant here?
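If you want to verify this numerically, here is a minimal sketch (not part of the original replies, and assuming NumPy and SciPy are available; scipy.linalg.null_space returns an orthonormal basis of the null space as columns):

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1, 3, 4],
              [5, 2, 7]])

# Null space of A: solutions of A x = 0; contains (1, 1, -1) up to scaling.
print(null_space(A))

# Null space of A^T: solutions of A^T y = 0.
N_AT = null_space(A.T)
print(N_AT)          # shape (2, 0): only the zero vector

# Each basis vector of N(A^T) (there are none here) is orthogonal to every
# column of A, so this product is always a zero (possibly empty) matrix.
print(A.T @ N_AT)
```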
 