Solve derivative of least squares matrix equation

In summary: the problem is to find an optimum transform matrix T for a MIMO communication system using a least squares criterion. The received signal is corrupted by noise, and the estimate s_hat is computed as s_hat = inv(TH)*y = inv(H)inv(T)THs + inv(H)inv(T)Tn. However, the matrix T cancels out of this expression, so it is unclear how to form the derivative dD/dT and solve for T. A book recommendation for this kind of matrix algebra is also requested.
  • #1
beyondlight

Homework Statement



I am designing a MIMO communication system, with input signal s, channel H and transform matrix T. The received signal is corrupted by noise.

Homework Equations


The received signal is r = Hs+n

And then it is transformed (compressed) by:

y = Tr

And then its estimate s_hat is computed:

s_hat = inv(TH)*y = inv(H)inv(T)THs + inv(H)inv(T)Tn

Set C = inv(H)inv(T)Tn
I want to find an optimum T based on the least squares solution:

D = norm(s-s_hat)^2
dD/dT = 0
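As a concrete sketch of this model (illustrative numpy code; the dimensions, noise level, and random matrices are assumptions, not given in the problem):

[code]
import numpy as np

rng = np.random.default_rng(1)
n = 4
s = rng.standard_normal(n)            # transmitted signal
H = rng.standard_normal((n, n))       # channel (generic, so invertible)
T = rng.standard_normal((n, n))       # transform (generic, so invertible)
noise = 0.1 * rng.standard_normal(n)  # additive noise

r = H @ s + noise                     # received signal r = Hs + n
y = T @ r                             # transformed (compressed) signal y = Tr
s_hat = np.linalg.solve(T @ H, y)     # estimate: inv(TH) @ y
D = np.linalg.norm(s - s_hat) ** 2    # least squares objective

# Note: algebraically s_hat = s + inv(H) @ noise, so D does not
# depend on T at all -- see reply #2 below.
[/code]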

The Attempt at a Solution


[itex]D = (s-\hat{s})^{H}(s-\hat{s})[/itex]
[itex]D = (s-H^{-1}T^{-1}THs-C)^{H}(s-H^{-1}T^{-1}THs-C)[/itex]
[itex]D = \|s\|^{2} - s^{H}H^{-1}T^{-1}THs - s^{H}C - s^{H}H^{H}T^{H}(T^{-1})^{H}(H^{-1})^{H}s + s^{H}H^{H}T^{H}(T^{-1})^{H}(H^{-1})^{H}H^{-1}T^{-1}THs + s^{H}H^{H}T^{H}(T^{-1})^{H}(H^{-1})^{H}C - C^{H}s + C^{H}H^{-1}T^{-1}THs + C^{H}C[/itex]

How do I find the derivative dD/dT? And supposing I do find it, how do I then isolate T on one side of the equation?

I would also appreciate suggestions for a book that covers this kind of matrix algebra.
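For reference, the standard real-valued identity of this type (a generic sketch; ##A##, ##b##, and ##x## below are an arbitrary matrix, data vector, and unknown, not the problem's variables) is ##\nabla_{x}\|Ax-b\|^{2} = 2A^{T}(Ax-b)##, and setting it to zero gives the normal equations ##A^{T}Ax = A^{T}b##. Identities of this kind, including complex (Hermitian-transpose) versions, are collected in matrix-calculus references such as The Matrix Cookbook by Petersen and Pedersen.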
 
  • #2
beyondlight said:

The received signal is r = Hs+n, it is transformed by y = Tr, and the estimate is

s_hat = inv(TH)*y = inv(H)inv(T)THs + inv(H)inv(T)Tn

[...]

How do I find the derivative dD/dT? And supposing I do find it, how do I then isolate T on one side of the equation?

The matrix ##T## disappears from your expression for ##D## as you have written it:
##H^{-1} T^{-1} T H = H^{-1} H = I## (the unit matrix), because ##T^{-1}T = I## and ##H^{-1}H = I##. So, what you have written is, basically, ##D = (s-s)^H (s-s)##, which is just 0 for all ##H, T##.
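A quick numerical check of this cancellation (a minimal numpy sketch; the sizes and random matrices are arbitrary illustrative choices):

[code]
import numpy as np

rng = np.random.default_rng(0)
n = 4
H = rng.standard_normal((n, n))  # stand-in channel matrix (generic, so invertible)
T = rng.standard_normal((n, n))  # stand-in transform matrix (generic, so invertible)

# H^{-1} T^{-1} T H collapses to the identity, so T drops out of s_hat.
M = np.linalg.inv(H) @ np.linalg.inv(T) @ T @ H
print(np.allclose(M, np.eye(n)))  # True, up to floating-point round-off
[/code]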
 
  • #3
Ray Vickson said:
The matrix ##T## disappears from your expression for ##D## as you have written it:
##H^{-1} T^{-1} T H = H^{-1} H = I## (the unit matrix), because ##T^{-1}T = I## and ##H^{-1}H = I##. So, what you have written is, basically, ##D = (s-s)^H (s-s)##, which is just 0 for all ##H, T##.
But since s_hat is corrupted by noise, this will not be exactly true, will it?
 
  • #4
beyondlight said:
But since s_hat is corrupted by noise, this will not be exactly true, will it?

I am just going by what you wrote. Perhaps what you wrote is not appropriate.
 

What is a derivative of a least squares matrix equation?

The derivative of a least squares matrix equation is the rate of change, with respect to the model parameters, of the objective function that sums the squared differences between the observed data and the predicted values.
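In generic notation (a sketch; the design matrix ##A##, observation vector ##b##, and parameter vector ##x## are illustrative, not tied to the thread above), the objective is ##D(x) = \|Ax-b\|^{2} = (Ax-b)^{T}(Ax-b)## and its derivative is the gradient ##\nabla_{x}D = 2A^{T}(Ax-b)##.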

Why do we need to solve for the derivative of a least squares matrix equation?

Setting the derivative equal to zero and solving locates the minimum of the function, which is the point where the squared differences between the observed data and the predicted values are smallest. This is important in data analysis because it identifies the best fit for the data.
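Continuing the generic notation above: setting ##\nabla_{x}D = 2A^{T}(Ax-b) = 0## yields the normal equations ##A^{T}Ax = A^{T}b##, and when ##A## has full column rank the unique minimizer is ##x = (A^{T}A)^{-1}A^{T}b##.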

How do you solve for the derivative of a least squares matrix equation?

To solve for the derivative, we use calculus: take the partial derivatives of the objective function with respect to each variable and set them equal to zero. Solving the resulting system of equations gives the values of the variables that minimize the function.
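A minimal sketch of this procedure in Python (the design matrix and observations are made-up illustrative values):

[code]
import numpy as np

# Fit x minimizing ||Ax - b||^2.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])       # example design matrix (intercept + slope)
b = np.array([1.0, 2.0, 2.9])    # example observations

# Setting the gradient 2 A^T (Ax - b) to zero gives the normal equations.
x_normal = np.linalg.solve(A.T @ A, A.T @ b)

# The same answer from numpy's SVD-based least squares solver.
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)

print(x_normal, x_lstsq)         # both approximately [1.02, 0.95]
[/code]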

What are the applications of solving for the derivative of a least squares matrix equation?

The derivative of a least squares matrix equation is commonly used in regression analysis, where we try to find the best fit line or curve for a set of data points. It is also used in machine learning algorithms, such as linear regression, to find the optimal parameters for a predictive model.

Can a derivative of a least squares matrix equation be negative?

Yes, a component of the derivative can be negative. This indicates that the function is decreasing in that direction: the squared difference between the predicted values and the observed data can still be reduced by moving the parameters that way. The derivative is zero only at the minimum itself.
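As a one-dimensional illustration: for ##f(x) = (x-3)^{2}## the derivative is ##f'(x) = 2(x-3)##, so ##f'(1) = -4 < 0##; the squared error is still decreasing at ##x = 1## and keeps decreasing until the minimizer ##x = 3##, where ##f'(3) = 0##.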
