Dimension of an intersection between a random subspace and a fixed subspace


Discussion Overview

The discussion revolves around the problem of determining the number of zero constraints that can be imposed on a symmetric matrix X, whose column space is contained in a random k-dimensional vector subspace V of ℝ^n, before the only solution becomes X = 0. The conversation touches on theoretical aspects, mathematical reasoning, and the role of Grassmannian manifolds.

Discussion Character

  • Exploratory
  • Technical explanation
  • Mathematical reasoning
  • Debate/contested

Main Points Raised

  • One participant expresses difficulty in approaching the problem and seeks hints or advice regarding the constraints on the symmetric matrix X.
  • Another participant suggests that the problem may relate to Grassmannian manifolds, indicating a need for alternative terminology or keywords to find relevant literature.
  • A participant questions the clarity of the problem statement, particularly regarding the definition of "high probability" and the uniform distribution over k-dimensional subspaces.
  • Clarifications are made regarding the notation and the meaning of "non-zero solution," emphasizing the intersection of the spaces M_V and M_X.
  • There is speculation about the probability of finding a non-zero element in the intersection of M_V and M_X, with conjectures about its behavior relative to the number of constraints q.
  • One participant introduces the concept of singular value decomposition (SVD) and its potential relevance to understanding the symmetric matrix's properties.
  • Another participant references an external thread, questioning whether certain observations about the SVD are universally evident from a geometric perspective.

Areas of Agreement / Disagreement

Participants express varying levels of understanding and clarity regarding the problem, with no consensus on the best approach or terminology. Multiple competing views and uncertainties remain regarding the implications of the constraints and the behavior of the intersection of vector spaces.

Contextual Notes

There are unresolved questions about the implementation of a uniform distribution over k-dimensional subspaces and the precise interpretation of "high probability." The discussion also highlights the complexity of the mathematical framework involved.

Petawa
I've been struggling with this problem for about two weeks. My supervisor is also stumped - though the problem is easy to state, I don't know the proper approach to tackle it. I'd appreciate any hints or advice.

Let $V$ be a random $k$-dimensional vector subspace of $\mathbb{R}^n$, chosen uniformly over all possible $k$-dimensional subspaces. Let $X \in \mathbb{R}^{n \times n}$ be a symmetric matrix whose column space is contained in $V$. Now I add constraints to $X$: given some pairs $(i, j)$ with $1 \leq i < j \leq n$, I require $X_{ij} = 0$. The pairs are fixed and independent of $V$. How many of these zero constraints can I impose before (with high probability) the only solution is $X = 0$?

I've found a sufficient condition for a non-zero solution to exist: the number of constraints $q$ must satisfy $q < k(k+1)/2$. From simulations, I think it is also a necessary condition (with probability one), but I can't seem to show it. I'd appreciate any ideas on how I might proceed.
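For reference, here is a minimal sketch of the kind of simulation I mean (not my exact code; the function names are just illustrative): each constraint $X_{ij} = 0$ is a linear equation in the $k(k+1)/2$ coordinates of $X$ with respect to a basis of $M_V$ (the space defined in the proof below), so a non-zero solution exists exactly when the constraint matrix has a non-trivial null space.

```python
import itertools
import numpy as np

def random_subspace_basis(n, k, rng):
    # Orthonormal basis of a random k-dimensional subspace of R^n:
    # QR-factorise an n x k matrix with i.i.d. standard Gaussian entries.
    G = rng.standard_normal((n, k))
    Q, _ = np.linalg.qr(G)
    return Q

def has_nonzero_solution(n, k, pairs, rng):
    # Basis of M_V: Q (e_a e_b^T + e_b e_a^T) Q^T for 1 <= a <= b <= k,
    # so every X in M_V is described by k(k+1)/2 coordinates.
    Q = random_subspace_basis(n, k, rng)
    basis = [np.outer(Q[:, a], Q[:, b]) + np.outer(Q[:, b], Q[:, a])
             for a in range(k) for b in range(a, k)]
    # Each constraint X_ij = 0 is one linear equation in those coordinates.
    A = np.array([[B[i, j] for B in basis] for (i, j) in pairs])
    # A non-zero X in M_V satisfying all constraints exists iff A has a
    # non-trivial null space.
    return np.linalg.matrix_rank(A) < k * (k + 1) // 2

rng = np.random.default_rng(0)
n, k = 8, 3
all_pairs = list(itertools.combinations(range(n), 2))  # all pairs i < j (0-based)
for q in range(1, len(all_pairs) + 1):
    pairs = all_pairs[:q]            # fixed pairs, chosen independently of V
    exists = has_nonzero_solution(n, k, pairs, rng)
    print(q, exists)                 # conjecture: True iff q < k(k+1)/2 = 6
```

This is the kind of experiment behind the conjecture above; the QR-of-a-Gaussian construction for the random subspace is discussed a bit further below.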

Proof of sufficient condition
Let $M_V$ be the space of symmetric $n \times n$ matrices whose column space is contained in $V$. An orthogonal basis for $M_V$ is $\{\, Q(e_i e_j^{T} + e_j e_i^{T})Q^{T} : 1 \leq i \leq j \leq k \,\}$, where the columns of $Q \in \mathbb{R}^{n \times k}$ form an orthonormal basis for $V$ and the $e_i$ are the standard basis vectors of $\mathbb{R}^k$, so $\dim(M_V) = k(k+1)/2$.
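(Spelling out a step that's easy to check: since $Q^{T}Q = I_k$, for any $k \times k$ matrices $A, B$ we have
$$
\big\langle QAQ^{T},\, QBQ^{T} \big\rangle_F
= \operatorname{tr}\!\left(QA^{T}Q^{T}\,QBQ^{T}\right)
= \operatorname{tr}\!\left(A^{T}B\right),
$$
so the listed matrices inherit the pairwise orthogonality of the standard symmetric basis matrices $e_i e_j^{T} + e_j e_i^{T}$, and there are exactly $k(k+1)/2$ of them.)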

Let $M_X$ be the space of symmetric $n \times n$ matrices that satisfy the $q$ zero constraints, so $\dim(M_X) = n(n+1)/2 - q$. Now suppose no non-zero $X$ satisfying the constraints exists; this means $M_V \cap M_X = \{0\}$. Hence $\dim(M_V + M_X) = \dim(M_V) + \dim(M_X) = k(k+1)/2 + n(n+1)/2 - q$. Since $M_V + M_X$ is contained in the space of symmetric matrices, its dimension is at most $n(n+1)/2$. Thus "no non-zero $X$" implies $q \geq k(k+1)/2$, and by the contrapositive a non-zero solution exists whenever $q < k(k+1)/2$.
 
I'm still struggling with this. It seems to be related to Grassmannian manifolds, which parametrise the set of all $k$-dimensional subspaces and so carry the distribution I need. However, I can't find any papers that address this problem. Perhaps I'm using the wrong terminology? Does anyone have alternative keywords or anything else that might help?

Thanks again.
 
I probably couldn't solve this problem if I understood it, but for the sake of someone who could, I think you should clarify it.

Petawa said:
Let $X \in \mathbb{R}^{n \times n}$ be a symmetric matrix whose column space is contained in $V$.

How many of these zero constraints can I impose before (with high probability) the only solution is $X = 0$?

I don't understand the use of "Let $X$ be..." in combination with the request to find $X$ as a solution to something. Is "high probability" any non-zero probability?

I'm also curious how you implement a uniform distribution over all $k$-dimensional subspaces of real $n$-space.
 
You're right, my choice of words is poor. $X$ is a matrix lying in the space $M_V \cap M_X$ (using the notation from the proof in the original post). That is, $X$'s column space is contained in $V$ and $X$ satisfies the zero constraints. When I said "a non-zero solution exists", I should have said "$M_V \cap M_X$ contains a non-zero element".

Is "high probability" any non-zero probability?
If my conjecture is right, $P(M_V \cap M_X \text{ contains a non-zero element})$ is $1$ for $q < k(k+1)/2$ and $0$ for $q \geq k(k+1)/2$. However, a result with any non-zero probability would be equally useful.

I'm also curious how you implement a uniform distribution over all $k$-dimensional subspaces of real $n$-space.
I had the exact same question when I started looking into this. The distribution has support on a so-called "Grassmann manifold", the theory of which seems very well developed. My favourite paper is this one: they use a nice graphical approach to relate random subspaces to random points on a sphere. Hopefully, $P(M_V \cap M_X \text{ contains a non-zero element})$ has a transition point from $1$ to $0$ at $q = k(k+1)/2$, so the exact specification of the distribution over the Grassmannian can be ignored. However, working with that distribution directly might be the correct approach; I just haven't had any luck with it so far, as it gets very complex very quickly.
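For concreteness, one standard construction (a sketch with illustrative names, independent of the paper above) is to orthonormalise a Gaussian matrix: since $UG$ has the same distribution as $G$ for every orthogonal $U$, the column space of the resulting $Q$ is rotation-invariant, which is exactly the uniform (invariant) distribution on the Grassmannian of $k$-planes in $\mathbb{R}^n$.

```python
import numpy as np

def sample_uniform_subspace(n, k, rng):
    # Orthonormal basis for a uniformly distributed k-dimensional subspace:
    # the QR factorisation of an i.i.d. Gaussian matrix.
    G = rng.standard_normal((n, k))
    Q, _ = np.linalg.qr(G)
    return Q

rng = np.random.default_rng(1)
Q = sample_uniform_subspace(10, 3, rng)
print(np.allclose(Q.T @ Q, np.eye(3)))  # True: columns are orthonormal
```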
 
I don't know enough about this problem to make any suggestions. However, maybe if you answer some of my questions, you'll be struck by a blinding flash of inspiration.

I've always been impressed by the utility of the singular value decomposition of a matrix, and the most natural interpretation of a matrix (to me) is to view it as defining a linear mapping. So what would this point of view say in the case of your symmetric matrix, and how would the SVD exhibit the column span of the matrix?
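To make the second question concrete (just summarising standard facts): if $X = U\Sigma V^{T}$ is an SVD with exactly $r$ non-zero singular values, then
$$
\operatorname{col}(X) = \operatorname{span}\{u_1, \ldots, u_r\},
$$
the span of the left singular vectors belonging to the non-zero singular values. For a symmetric $X$ the spectral decomposition $X = W\Lambda W^{T}$ is already an SVD up to signs ($U = W$, $\Sigma = |\Lambda|$), so the condition $\operatorname{col}(X) \subseteq V$ says that every eigenvector of $X$ with a non-zero eigenvalue lies in $V$.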

(Another thing that currently puzzles me is whether sunjin09's observation in the thread https://www.physicsforums.com/showthread.php?p=3849082#post3849082 is something that "anybody can see" from the geometric interpretation of the SVD. I can't see it! A summary of that is on another forum: http://www.sosmath.com/CBB/viewtopic.php?f=5&t=57763 )
 
