Ill-posed inverse problem (linear operator estimation)

The discussion centers on estimating a constant symmetric real matrix A in a dynamical system defined by x_{t+1} = Ax_t, given only the first two vectors x_0 and x_1. The problem is ill-posed because there are more unknowns than data points. A Bayesian approach is suggested: choose a maximum-entropy prior distribution for A, then update it on the observed data and take the matrix that makes the observed x_i most probable. An operator-algebra approach is also proposed, relating powers of A to the known vectors and refining the estimate iteratively. The conversation highlights the complexity of the problem and the need for robust mathematical frameworks to address it effectively.
bpet
OK, this kind of question seems to come up a lot in research and applications, but it has me completely stumped.

Say we have a dynamical system x_{t+1} = Ax_t where for simplicity we'll assume A is a constant symmetric real matrix but otherwise unknown.

1. What is the "best" estimate of A, when only the vectors x_0 and x_1 are known?

2. What is the new "best" estimate of A when x_2 is observed?

Obviously the problem is ill-posed, because there are more unknowns than data points. Various generalizations could include noise, observations of only y = Bx, infinite dimensions, etc., but question 1 captures it in a nutshell.
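
For concreteness, here's a quick numerical illustration of how underdetermined question 1 is (a NumPy sketch; the dimension n = 4 and the vectors are arbitrary):

```python
import numpy as np

n = 4
rng = np.random.default_rng(0)
x0 = rng.standard_normal(n)
x1 = rng.standard_normal(n)

# A symmetric n x n matrix has n(n+1)/2 free entries, but A x_0 = x_1
# supplies only n scalar equations.
print(n * (n + 1) // 2, "unknowns vs", n, "equations")   # 10 vs 4

# One symmetric matrix satisfying A x_0 = x_1:
A = (np.outer(x1, x0) + np.outer(x0, x1)) / (x0 @ x0) \
    - (x0 @ x1) / (x0 @ x0) ** 2 * np.outer(x0, x0)

# Adding any symmetric matrix that annihilates x_0 gives another one:
v = rng.standard_normal(n)
v -= (v @ x0) / (x0 @ x0) * x0        # make v orthogonal to x_0
B = A + np.outer(v, v)                # still symmetric, still maps x_0 to x_1

print(np.allclose(A @ x0, x1), np.allclose(B @ x0, x1))  # True True
```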

Any thoughts?
 
No thoughts!?

I would have replied if I fully understood the problem.
 
How do you define "best" estimate?
 
bpet said:
1. What is the "best" estimate of A, when only the vectors x_0 and x_1 are known?

2. What is the new "best" estimate of A when x_2 is observed?

Being a Bayesian, I would look for the "prior distribution" for the matrix A that has maximum entropy. Then, after a Bayesian update on the observed data, I would take A to be the matrix (or one of the matrices) that makes the observed x_i most probable.

If that proved too involved, I'd try to find a model for the joint distribution of the first n observations x_1, x_2, ..., x_n (assuming A is n by n). I'd want a model that was at least good enough to eliminate things that I think are absurd. (For example, the sequence (3.9, 4.2, 0, 0, 0, 0, ..., 0) might be implausible based on the physics of the problem at hand.) Given the first k observations, you could pick the matrix A by various criteria. You could pick it to be one that produces a subsequent series of observations that has a high probability. Or you could pick A to be one whose predictions are the best mean-square-error estimator of the subsequent observations.
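
As a minimal sketch of how that might look numerically, under assumptions added purely for illustration: take an independent zero-mean Gaussian prior on the free entries of the symmetric A (the maximum-entropy distribution for a fixed variance) and Gaussian observation noise, in which case the posterior mode is just a ridge-regularized least-squares fit. The noise scale sigma, prior scale tau, and the dimension below are made up:

```python
import numpy as np

def _free_entries(n):
    # ordering of the free entries (i <= j) of a symmetric n x n matrix
    return [(i, j) for i in range(n) for j in range(i, n)]

def design_matrix(x):
    # rows of the linear map theta -> A x, where theta holds the free entries of A
    n = len(x)
    idx = _free_entries(n)
    M = np.zeros((n, len(idx)))
    for k, (i, j) in enumerate(idx):
        M[i, k] += x[j]
        if i != j:
            M[j, k] += x[i]
    return M

def map_estimate(io_pairs, n, sigma=0.1, tau=1.0):
    # Posterior mode for symmetric A given (input, output) pairs:
    # [(x0, x1)] for question 1, [(x0, x1), (x1, x2)] once x2 is observed.
    M = np.vstack([design_matrix(xin) for xin, _ in io_pairs])
    y = np.concatenate([xout for _, xout in io_pairs])
    k = n * (n + 1) // 2
    theta = np.linalg.solve(M.T @ M / sigma**2 + np.eye(k) / tau**2,
                            M.T @ y / sigma**2)
    A = np.zeros((n, n))
    for t, (i, j) in zip(theta, _free_entries(n)):
        A[i, j] = A[j, i] = t
    return A

# e.g. with simulated data:
rng = np.random.default_rng(1)
S = rng.standard_normal((3, 3)); A_true = (S + S.T) / 2
x0 = rng.standard_normal(3); x1 = A_true @ x0; x2 = A_true @ x1
A_after_x1 = map_estimate([(x0, x1)], 3)            # question 1
A_after_x2 = map_estimate([(x0, x1), (x1, x2)], 3)  # question 2
```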
 
I'm wondering if you can use some kind of operator algebra technique to solve for A.

Essentially you are given relations in terms of powers of A, something like:

x_1 = Ax_0
x_n = A^n x_0

If you can express some root involving the various powers of A and x_0 in terms of x_1, x_2, ..., x_n, then you have an operator relationship between A, its powers, and x_0. Then, using an iterative technique, you could extract a good estimate of the linear operator A.
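
As a crude numerical sketch of that idea (the least-squares objective, zero initialization, and step size below are my own choices for illustration): treat x_1 = A x_0 and x_2 = A^2 x_0 as soft constraints and run gradient descent over symmetric matrices, symmetrizing the gradient at each step.

```python
import numpy as np

def fit_from_powers(x0, x1, x2, steps=5000, lr=1e-2):
    # Minimize ||A x0 - x1||^2 + ||A^2 x0 - x2||^2 over symmetric A
    # by gradient descent, projecting the gradient onto symmetric matrices.
    n = len(x0)
    A = np.zeros((n, n))
    for _ in range(steps):
        r1 = A @ x0 - x1                       # residual of x_1 = A x_0
        r2 = A @ (A @ x0) - x2                 # residual of x_2 = A^2 x_0
        G = 2 * np.outer(r1, x0)
        G += 2 * (np.outer(r2, A @ x0) + np.outer(A @ r2, x0))
        G = (G + G.T) / 2                      # keep the iterate symmetric
        A -= lr * G
    return A
```

With data generated from a true symmetric A this should settle on a symmetric matrix consistent with both constraints, though as in question 1 it is only one of many, and the step size is untuned.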

There is already an established theory for working out functions of linear operators, provided the operator satisfies certain conditions, and combined with an iterative method I think this could be useful.

You do have to check, though, what requirements the operator must satisfy for the operator-algebraic techniques to apply and give something useful.
 