Can Constant Vectors Solely Satisfy the Kernel Conditions of a Quadratic Form?

topcomer
Here is an interesting problem I came up with during my research. I first present a slightly simplified version. Let us define, component-wise, the following symmetric bilinear form, returning a vector:

$$a_i(u,v) = \frac{1}{2}\left(u^T A_i v - d_i\right), \qquad i = 1, \ldots, m$$

where ##u, v \in V = \mathbb{R}^n##, ##d_i = 1##, and the ##A_i## are symmetric ##n##-by-##n## matrices having the constant vectors ##u = \text{const}## in their kernel, i.e. the rows and columns of each ##A_i## sum to zero. Given the ##A_i##, I want to solve the following:

$$\text{Find } u : a_i(u,u) = 0 \quad \forall i = 1, \ldots, m,$$

or, equivalently, I want to show that the only ##u## satisfying these equations are constant vectors.

I believe that a non-trivial ##u## always exists in this simplified problem for a very large class of given ##A##. The idea is to use the fact that ##A## is diagonalizable: one can then build ##u## as a "sampling" proportional to ##(\cos t, \sin t)## in the eigenvector basis, so that each component of ##a## picks up only two eigenvalues of the matrix, and then the identity ##\cos^2 t + \sin^2 t = 1## cancels the entries of ##d##. However, I don't know whether it is possible to implement a numerical test to check if this is true for a given ##A##.
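The cos/sin construction above can be checked numerically. Below is a minimal sketch in Python/NumPy (the thread mentions Matlab/Mathematica, but the idea carries over) for a single constraint ##m = 1##, under the extra assumption that ##A## is positive semidefinite, so that two positive eigenvalues are guaranteed; the matrix construction via the projector ##P## is just one way to get rows and columns summing to zero.

```python
import numpy as np

# Assumption for this sketch: A is PSD with constant vectors in its kernel,
# so its two largest eigenvalues are positive and the construction applies.
rng = np.random.default_rng(0)
n = 5
P = np.eye(n) - np.ones((n, n)) / n      # projector orthogonal to constant vectors
B = rng.standard_normal((n, n))
A = P @ (B @ B.T) @ P                    # symmetric PSD, rows/columns sum to zero

lam, V = np.linalg.eigh(A)               # eigenvalues in ascending order
i, j = n - 2, n - 1                      # two largest (positive) eigenvalues

t = 0.7                                  # any angle t works
u = (np.cos(t) / np.sqrt(lam[i])) * V[:, i] \
  + (np.sin(t) / np.sqrt(lam[j])) * V[:, j]

# u is orthogonal to the constants (hence non-trivial), and
# u^T A u = cos^2 t + sin^2 t = 1, so a(u,u) = (u^T A u - 1)/2 = 0.
print("a(u,u) =", 0.5 * (u @ A @ u - 1.0))
```

The residual printed at the end should vanish to machine precision, which confirms the two-eigenvalue sampling idea at least in this PSD special case.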
 
To complete the statement, the more general problem consists in replacing ##d_i## with a 2-by-2 identity matrix. It can be formulated as follows:

$$a_{ij}(u,v) = \frac{1}{2}\left(u^T A_{ij} v - d_{ij}\right), \qquad i = 1, \ldots, m, \quad j = 1, \ldots, 4,$$

where ##d_{i1} = d_{i2} = 1## and ##d_{i3} = d_{i4} = 0##. Then:

$$\text{Find } u : a_{ij}(u,u) = 0 \quad \forall i = 1, \ldots, m, \quad \forall j = 1, \ldots, 4.$$

My conjecture is that in this case a non-trivial solution does not exist unless a very special ##A## is given. What test can I implement in Matlab or Mathematica? Newton's method seems like overkill just for proving non-existence.
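One cheap heuristic, sketched below in Python/NumPy under the same conventions as before, is a multi-start minimization of the squared residual ##\sum_k a_k(u,u)^2##: if the minimum stays bounded away from zero over many random starts, that is numerical evidence (not a proof) of non-existence, while a near-zero minimum produces an explicit candidate solution. The function names and the plain gradient descent with Armijo backtracking are my own illustrative choices, not anything from the thread.

```python
import numpy as np

def residual(u, As, ds):
    """Constraint values a_k(u,u) = (u^T A_k u - d_k) / 2."""
    return np.array([0.5 * (u @ A @ u - d) for A, d in zip(As, ds)])

def min_residual_multistart(As, ds, n_starts=20, iters=500, seed=0):
    """Heuristic feasibility test: minimize ||residual||^2 by gradient
    descent with Armijo backtracking from several random starts, and
    return the smallest residual norm found."""
    rng = np.random.default_rng(seed)
    n = As[0].shape[0]
    best = np.inf
    for _ in range(n_starts):
        u = rng.standard_normal(n)
        for _ in range(iters):
            F = residual(u, As, ds)
            f = F @ F
            if f < 1e-18:
                break
            # gradient of ||F||^2; note d(u^T A u)/du = 2 A u for symmetric A
            g = sum(2.0 * Fk * (A @ u) for Fk, A in zip(F, As))
            t, gg = 1.0, g @ g
            while t > 1e-12:                      # Armijo backtracking
                Fn = residual(u - t * g, As, ds)
                if Fn @ Fn <= f - 1e-4 * t * gg:
                    break
                t *= 0.5
            u = u - t * g
        F = residual(u, As, ds)
        best = min(best, F @ F)
    return np.sqrt(best)

if __name__ == "__main__":
    # Demo on a random instance with m = 2 constraints and d_k = 1.
    rng = np.random.default_rng(1)
    n, m = 5, 2
    P = np.eye(n) - np.ones((n, n)) / n
    As = []
    for _ in range(m):
        B = rng.standard_normal((n, n))
        As.append(P @ (B + B.T) @ P)   # symmetric, constant vectors in kernel
    print("min residual norm:", min_residual_multistart(As, np.ones(m)))
```

A persistent non-zero minimum over many seeds only suggests infeasibility; for an actual proof one would need something like a certificate from sum-of-squares/SDP relaxation of the quadratic system.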
 