## Maximum inner product between two orthogonal vectors (in the standard dot product)

Hello buddies,

Here is my question. It looks simple, but the answer is not obvious to me.

Given two vectors $\mathbf{u},\mathbf{v}\in\mathbb{R}^M$:
• They are orthogonal with respect to the standard dot product: $\mathbf{u}^T\mathbf{v}=0$.
• They have unit norm with respect to the standard dot product: $||\mathbf{u}||=||\mathbf{v}||=1$.
• Define the weighted inner product $\mathbf{u}^T\left(\begin{matrix}\lambda_1\\&\ddots\\&&\lambda_M\end{matrix}\right)\mathbf{v}$, where $M$ is the number of components. Both vectors also have unit norm with respect to this inner product: $\mathbf{u}^T\left(\begin{matrix}\lambda_1\\&\ddots\\&&\lambda_M\end{matrix}\right)\mathbf{u}=\mathbf{v}^T\left(\begin{matrix}\lambda_1\\&\ddots\\&&\lambda_M\end{matrix}\right)\mathbf{v}=1$. Note that the standard dot product is the particular case where the matrix is the identity.
• Edit: I forgot to add the constraint $\lambda_1+\cdots+\lambda_M=M$. It does not make the problem more complicated; it just narrows down the possible lambdas.

What, then, is the maximum inner product (in absolute value) between two vectors satisfying the previous conditions? That is,
$\max\limits_{\mathbf{u},\mathbf{v}} \left| \mathbf{u}^T\left(\begin{matrix}\lambda_1\\&\ddots\\&&\lambda_M\end{matrix}\right)\mathbf{v} \right|$
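Since the problem is easy to probe numerically, here is a quick sanity check for the $M=2$ case. The values $\lambda=(1.5,0.5)$ are my own illustrative choice (they satisfy $\lambda_1+\lambda_2=M=2$), not part of the question:

```python
import numpy as np

# Hypothetical example: M = 2 with lam = (1.5, 0.5), so lam.sum() == M.
lam = np.array([1.5, 0.5])

# u(a) = (cos a, sin a) and v(a) = (-sin a, cos a) are orthonormal in the
# standard dot product for every angle a. Scan a grid of angles and keep
# only those where BOTH weighted norms are (numerically) 1 as well.
a = np.linspace(0.0, 2.0 * np.pi, 200001)
wu = lam[0] * np.cos(a) ** 2 + lam[1] * np.sin(a) ** 2  # u^T diag(lam) u
wv = lam[0] * np.sin(a) ** 2 + lam[1] * np.cos(a) ** 2  # v^T diag(lam) v
ok = (np.abs(wu - 1.0) < 1e-3) & (np.abs(wv - 1.0) < 1e-3)

ip = (lam[1] - lam[0]) * np.sin(a) * np.cos(a)  # u^T diag(lam) v
best = np.max(np.abs(ip[ok]))
print(best)
```

In this sketch the best value found approaches $|\lambda_2-\lambda_1|/2 = 0.5$.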

Cheers
 Do you require that BOTH norms, the standard one and the weighted one, are 1?

 Quote by Hawkeye18 Do you require that BOTH norms, the standard one and the weighted one, are 1?
Yes.

Take M=2

write $u=\left(\begin{matrix}\cos(\alpha)\\ \sin(\alpha)\end{matrix}\right);\quad v=\left(\begin{matrix}-\sin(\alpha)\\ \cos(\alpha)\end{matrix}\right)$

The weighted norm is then
$u^T\lambda u = \lambda_1 \cos^2(\alpha) + \lambda_2 \sin^2(\alpha) =1$
$v^T\lambda v = \lambda_1 \sin^2(\alpha) + \lambda_2 \cos^2(\alpha)=1$

Summing these two equations gives $\lambda_1+\lambda_2 = 2$.

Taking the difference gives $(\lambda_1 - \lambda_2)(\cos^2(\alpha)-\sin^2(\alpha))=0$

Either you use the standard norm, $\lambda_1=\lambda_2=1$, or $\alpha=\pi/2$ (or the corresponding angles in the other three quadrants) with no further restrictions on $\lambda_{1,2}$

Then $u^T \lambda v = \frac{1}{2}(\lambda_2 - \lambda_1)$

For M>2, pick the two $\lambda$ with the largest difference but sum 2. However, I am not entirely sure that there cannot be another, larger solution.

BTW, is this a homework problem?

 Quote by M Quack The weighted norm is then $u^T\lambda u = \lambda_1 \cos^2(\alpha) + \lambda_2 \sin^2(\alpha) =1$ $v^T\lambda v = \lambda_1 \sin^2(\alpha) + \lambda_2 \cos^2(\alpha)=1$ The sum between these gives $\lambda_1+\lambda_2 = 2$ The difference gives $(\lambda_1 - \lambda_2)(\cos^2(\alpha)-\sin^2(\alpha))=0$ Either you use the standard norm, $\lambda_1=\lambda_2=1$ or $\alpha=\pi/2$ (or the 3 other quadrants) and no further restrictions on $\lambda_{1,2}$
I think you meant $\alpha=\pi/4$ (or $\pi/4+\pi$)
For $M=2$, the solution only allows these values for the lambdas. I am interested in generic $M$, which is less obvious.
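To spell out M Quack's last step with this angle (my own expansion, using the same parametrization as above):

$u^T \lambda v = \lambda_1\cos(\alpha)\left(-\sin(\alpha)\right) + \lambda_2\sin(\alpha)\cos(\alpha) = \tfrac{1}{2}(\lambda_2-\lambda_1)\sin(2\alpha)$

which indeed attains $\tfrac{1}{2}(\lambda_2-\lambda_1)$ in absolute value at $\alpha=\pi/4$.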

I actually forgot to mention $\lambda_1+\cdots+\lambda_M=M$, i.e. the average lambda is one.

 Quote by M Quack For M>2, pick the two $\lambda$ with the largest difference but sum 2. However, I am not entirely sure that there cannot be another, larger solution.
But you derived these conditions by imposing orthogonality and unit norm for vectors with two components. This probably does not carry over to larger $M$.
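That said, the two-component construction at least embeds consistently for larger $M$, so it gives a lower bound. A quick numerical check (the values are my own illustrative choice): for $M=3$, pad the $M=2$ solution with a zero third coordinate and take $\lambda_3=1$, so the sum constraint $\lambda_1+\lambda_2+\lambda_3=3$ still holds:

```python
import numpy as np

# Hypothetical M = 3 example: lam[0] + lam[1] = 2 (as in the M = 2 case)
# and lam[2] = 1, so lam.sum() == M == 3.
lam = np.array([1.8, 0.2, 1.0])

a = np.pi / 4  # the alpha = pi/4 solution of the M = 2 case
u = np.array([np.cos(a), np.sin(a), 0.0])
v = np.array([-np.sin(a), np.cos(a), 0.0])

print(np.isclose(u @ v, 0.0))                            # standard orthogonality
print(np.isclose(u @ u, 1.0), np.isclose(v @ v, 1.0))    # standard norms are 1
print(np.isclose(u @ (lam * u), 1.0),
      np.isclose(v @ (lam * v), 1.0))                    # weighted norms are 1
print(abs(u @ (lam * v)))                                # |lam[1] - lam[0]| / 2
```

This only exhibits a lower bound, of course; it does not rule out a larger value that uses all three coordinates.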

 Quote by M Quack BTW, is this a homework problem?
Not at all. I am a theoretical radar engineer.

 Tags inner product, maximum, weighted function