Lagrangian Gradient Simplification

zmalone
From the attached image problem:
When differentiating the third term in the Lagrangian, ##\lambda_{2}(w^{T}\Sigma w - \sigma^{2}_{\rho})##, with respect to ##w##: are ##w^{T}## and ##w## used like a ##w^{2}## to arrive at the gradient, or am I oversimplifying and it just happens to work out on certain problems like this?

(##\Sigma## is an ##n \times n## symmetric covariance matrix and ##w## is an ##n \times 1## vector.)
 

Attachments

  • LinAlgQuestion1.jpg
zmalone said:
When differentiating the third term in the Lagrangian, ##\lambda_{2}(w^{T}\Sigma w - \sigma^{2}_{\rho})##, with respect to ##w##: are ##w^{T}## and ##w## used like a ##w^{2}## to arrive at the gradient, or am I oversimplifying and it just happens to work out on certain problems like this?

(##\Sigma## is an ##n \times n## symmetric covariance matrix and ##w## is an ##n \times 1## vector.)

The best way of avoiding errors, at least when you are starting, is to write it out in full:
$$w^T \Sigma w = \sum_{i=1}^n \sum_{j=1}^n \sigma_{ij} w_i w_j = \sum_{i=1}^n \sigma_{ii} w_i^2 + 2 \sum_{i < j} \sigma_{ij} w_i w_j$$
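Differentiating this expanded form with respect to a single component ##w_k## picks up the diagonal term ##\sigma_{kk} w_k^2## and every cross term containing ##w_k##:
$$\frac{\partial}{\partial w_k}\left(w^T \Sigma w\right) = 2\sigma_{kk} w_k + 2\sum_{j \neq k} \sigma_{kj} w_j = 2\sum_{j=1}^n \sigma_{kj} w_j,$$
which is the ##k##-th component of ##2\Sigma w##. So ##\nabla_w \left(w^T \Sigma w\right) = 2\Sigma w##, and the "treat it like ##w^2##" shortcut gives the right answer precisely because ##\Sigma## is symmetric; for a general square matrix ##A##, the gradient of ##w^T A w## is ##(A + A^T)w##.

If you want to convince yourself numerically, here is a minimal sketch (using NumPy, with an arbitrary symmetric matrix made up for illustration, not the one from your attachment) comparing ##2\Sigma w## against a central finite-difference approximation of the gradient:

[code]
import numpy as np

# Arbitrary symmetric positive semi-definite matrix (illustrative only)
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
Sigma = A @ A.T
w = rng.standard_normal(4)

def f(w):
    """The quadratic form w^T Sigma w."""
    return w @ Sigma @ w

# Closed-form gradient derived above: 2 * Sigma @ w
grad_exact = 2 * Sigma @ w

# Central finite differences, one component at a time
eps = 1e-6
grad_fd = np.zeros_like(w)
for k in range(w.size):
    e = np.zeros_like(w)
    e[k] = eps
    grad_fd[k] = (f(w + e) - f(w - e)) / (2 * eps)

print(np.allclose(grad_exact, grad_fd, atol=1e-6))  # expect True
[/code]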

BTW: you don't need all those '[ itex]' tags: just enclose the whole expression between '[ itex]' and ' [/itex]' (although I prefer an opening '# #' (no space) followed by a closing '# #' (again, no space). Compare this: ##\lambda_2 (w^T \Sigma w - \sigma_{\rho}^2)## with what you wrote. (Press the quote button, as though you were composing a reply; that will show you the code. You can then just abandon the reply and not post it.)
 