Inner products and orthogonal basis

  • Thread starter: NullSpaceMan
  • Tags: Basis, Orthogonal
Hi all!

This looks a pretty nice forum. So here's my question:

How do I find/show a basis or orthogonal basis relative to an inner product? The reason I ask, is because in my mind I see the inner product as a scalar, and thus I find it difficult to "imagine" how a scalar lives in a space.

Many thanks! I would like to discuss.

Have a good one:cool:
 
Try the Gram-Schmidt algorithm - given any basis, it will construct a basis that is orthogonal with respect to your inner product.

The inner product of two vectors is a scalar (it encodes, in a sense, the angle between the two vectors) - what exactly are you trying to imagine?
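The Gram-Schmidt process mentioned above can be sketched in a few lines. This is a minimal illustration, not anyone's reference implementation; the function names and the example vectors are my own, and I use the standard dot product on ##\mathbb{R}^2## for concreteness (any inner product could be passed in its place):

```python
import numpy as np

def gram_schmidt(vectors, inner):
    """Orthogonalise a list of linearly independent vectors
    with respect to the inner product `inner`."""
    basis = []
    for v in vectors:
        w = np.array(v, dtype=float)
        # Subtract the projection of w onto each vector found so far
        for u in basis:
            w = w - (inner(w, u) / inner(u, u)) * u
        basis.append(w)
    return basis

# Standard dot product as the inner product (an assumption for illustration)
dot = lambda x, y: x @ y

b = gram_schmidt([np.array([1.0, 1.0]), np.array([1.0, 0.0])], dot)
print(dot(b[0], b[1]))  # pairwise inner products vanish: the basis is orthogonal
```

Swapping `dot` for any other valid inner product yields a basis orthogonal with respect to that product instead.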
 
What river_rat said. If you have a set of linearly independent vectors that span the inner product space, you can use the Gram-Schmidt orthogonalisation process to find an orthonormal basis of identical span.

The inner product is nothing more than a (symmetric, positive-definite) bilinear form defined on a vector space. It helps to distinguish the operation from the vector space itself, since every "inner-product space" is also a vector space once you forget its inner product.

EDIT: Typo correction
 
Thanks.

I am familiar with the Gram-Schmidt algorithm, but I was wondering how I would go about proving that a given basis is orthogonal relative to an inner product? How do I picture that in my head?

thanks again,

:cool:
 
Well, check whether ##\langle \vec{e_i}, \vec{e_j} \rangle = 0## for all ##i \neq j##. If that holds, then your basis is orthogonal relative to that inner product. For ##\mathbb{R}^2##, a non-standard inner product amounts to declaring some other angle to mean "at ##\frac{\pi}{2}##" - so all you have done is tilt your axes so that they no longer meet at right angles (relative to the usual inner product, that is; changing the inner product also changes the metric, so you are squishing some directions and stretching others).
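That check is easy to carry out concretely. As a sketch (my own example, not from the thread): any symmetric positive-definite matrix ##A## defines a non-standard inner product ##\langle x, y \rangle_A = x^T A y##, and a basis can be orthogonal with respect to it while failing to be orthogonal for the ordinary dot product:

```python
import numpy as np

# A symmetric positive-definite matrix defines a non-standard
# inner product <x, y>_A = x^T A y  (hypothetical example matrix)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

def ip(x, y):
    return x @ A @ y

def is_orthogonal_basis(vectors, inner, tol=1e-12):
    """Check pairwise orthogonality of a list of vectors under `inner`."""
    return all(abs(inner(vectors[i], vectors[j])) < tol
               for i in range(len(vectors))
               for j in range(i + 1, len(vectors)))

e1 = np.array([1.0, 0.0])
v  = np.array([1.0, -2.0])

print(is_orthogonal_basis([e1, v], ip))  # True: orthogonal under <.,.>_A
print(abs(e1 @ v) < 1e-12)               # False: not orthogonal for the usual dot product
```

This is exactly the "tilted axes" picture above: ##e_1## and ##v## meet at ##\frac{\pi}{2}## as measured by ##\langle \cdot, \cdot \rangle_A##, even though they do not look perpendicular in the standard metric.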
 