Changing diagonal elements of a matrix

  • Context: Undergrad 
  • Thread starter: adelaide_user_1009
  • Tags: Elements, Matrix

Discussion Overview

The discussion revolves around the manipulation of a variance-covariance matrix, specifically the implications of changing only the diagonal elements while leaving the off-diagonal elements unchanged. Participants explore the theoretical and practical consequences of such modifications, questioning the validity and meaning of the resulting matrix.

Discussion Character

  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant proposes scaling the diagonal elements of a variance-covariance matrix W using a vector of weights v, while questioning the implications of leaving off-diagonal elements unchanged.
  • Another participant argues that changing the variances inherently alters the covariances, suggesting that if variances are modified without adjusting covariances, the resulting matrix would lose its meaning as a covariance matrix.
  • It is noted that a covariance matrix must remain symmetric and positive semi-definite, and that increasing diagonal entries is permissible, but doing so would represent a different set of random variables.
  • A suggestion is made to explore a convex combination of the original matrix W and a diagonal matrix, indicating that this approach might preserve some properties of the covariance matrix.
  • Participants discuss the implications of introducing new uncorrelated random variables to adjust variances, with some emphasizing the need to maintain the overall structure of the covariance matrix.
  • One participant expresses uncertainty about how the vector of weights is intended to modify the variances, indicating a need for clarification on this aspect.

Areas of Agreement / Disagreement

Participants initially differ on the feasibility and implications of modifying only the diagonal elements of the covariance matrix, but broadly agree that such a modified matrix, if it remains symmetric and positive semi-definite, describes a different set of random variables rather than a reweighted version of the original one.

Contextual Notes

Participants highlight the importance of maintaining the properties of symmetry and positive semi-definiteness in the covariance matrix. There are unresolved questions regarding the specific conditions under which the proposed modifications can be made without altering the meaning of the matrix.

Who May Find This Useful

This discussion may be of interest to those studying statistics, particularly in the context of covariance matrices, as well as researchers exploring the implications of modifying statistical models.

adelaide_user_1009
TL;DR
Can I transform only diagonal elements of a variance-covariance matrix?
I have a variance-covariance matrix W with diagonal elements diag(W), and a vector of weights v. I want to scale W with these weights, but only so as to change the variances, not the covariances. One way would be to turn v into a diagonal matrix (say V) and form VW or WV, but that changes both the diagonal and off-diagonal elements of W. Does it make sense to multiply only diag(W) by v and leave the off-diagonal elements of W untouched?

I have searched for intuition around this but found nothing that either supports or rules it out. Any help would be appreciated, including references or texts that deal with such a situation.
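For concreteness, here is a minimal numpy sketch contrasting the two options described above; the matrix W and weight vector v are made-up example values, not anything from a specific application:

```python
import numpy as np

# Made-up example covariance matrix W and weight vector v.
W = np.array([[4.0, 1.5, 0.5],
              [1.5, 2.0, 0.8],
              [0.5, 0.8, 1.0]])
v = np.array([2.0, 0.5, 1.2])
V = np.diag(v)

# Option 1: VW (or WV) rescales entire rows (or columns), so both the
# variances and the covariances change, and symmetry is lost in general.
VW = V @ W

# Option 2: overwrite only the diagonal; off-diagonal entries stay as they are.
W_new = W.copy()
np.fill_diagonal(W_new, v * np.diag(W))

# W_new is still symmetric, but it is a valid covariance matrix only if it
# stays positive semi-definite, which has to be checked.
print("VW symmetric?", np.allclose(VW, VW.T))
print("min eigenvalue of W_new:", np.linalg.eigvalsh(W_new).min())
```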
 
Why?

Variance is just a special case of covariance, so changing a variance automatically means changing a covariance.

Assume for a moment that you were successful. What would the new matrix represent? Definitely no covariances anymore.

Assume a very simple case: ##X_k \longmapsto X'_k:=\alpha_kX_k.## Then ##\operatorname{cov}(X'_m,X'_n)=\alpha_m\alpha_n \operatorname{cov}(X_m,X_n).## Since all covariances should remain unchanged, we get ##\alpha_m\alpha_n = 1## for all ##m\neq n##. With enough indices this forces ##|\alpha_k|=1## for all ##k##, so the variances ##\operatorname{var}(X'_k)=\alpha_k^2\operatorname{var}(X_k)## cannot change either.

You are asking the wrong question.
What do you intend to do, in the sense that your result will still have a meaning?
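As a quick numerical sanity check of the scaling relation above (made-up values for W and ##\alpha##; Gaussian variables are assumed only for convenience):

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up covariance matrix and scaling factors alpha.
W = np.array([[4.0, 1.5, 0.5],
              [1.5, 2.0, 0.8],
              [0.5, 0.8, 1.0]])
alpha = np.array([2.0, 0.5, 3.0])

# Draw correlated (Gaussian) variables with covariance W, then rescale
# each component: X'_k = alpha_k * X_k.
X = rng.multivariate_normal(np.zeros(3), W, size=200_000)
X_scaled = X * alpha

# cov(X'_m, X'_n) = alpha_m * alpha_n * cov(X_m, X_n),
# i.e. the new covariance matrix is diag(alpha) @ W @ diag(alpha).
print(np.round(np.cov(X_scaled, rowvar=False), 2))
print(np.round(np.outer(alpha, alpha) * W, 2))
```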
 
A matrix is a covariance matrix if and only if it is symmetric and positive semi-definite. So you can change the diagonal as long as you don't make it too small, so that it continues to satisfy these properties. It will be the covariance matrix for a totally different set of random variables though, as fresh points out.

As long as you want to increase the diagonal entries, you can just add some new random variable that is uncorrelated with all your existing random variables and whose variance is equal to the increase. Making them smaller requires something more clever (and isn't always possible).
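A minimal sketch of that construction (same made-up example matrix as above; Gaussian noise is assumed only for convenience):

```python
import numpy as np

rng = np.random.default_rng(1)

# Made-up covariance matrix and desired increase of each variance.
W = np.array([[4.0, 1.5, 0.5],
              [1.5, 2.0, 0.8],
              [0.5, 0.8, 1.0]])
increase = np.array([1.0, 0.5, 2.0])

X = rng.multivariate_normal(np.zeros(3), W, size=200_000)
# Noise that is independent of X and across components, with the chosen variances.
Z = rng.normal(scale=np.sqrt(increase), size=X.shape)
Y = X + Z

# The diagonal grows by `increase`; the off-diagonal covariances are
# (up to sampling error) unchanged.
print(np.round(np.cov(Y, rowvar=False), 2))
print(np.round(W + np.diag(increase), 2))
```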
 
Office_Shredder said:
A matrix is a covariance matrix if and only if it is symmetric and positive semi-definite. So you can change the diagonal as long as you don't make it too small, so that it continues to satisfy these properties. It will be the covariance matrix for a totally different set of random variables though, as fresh points out.

As long as you want to increase the diagonal entries, you can just add some new random variable that is uncorrelated with all your existing random variables and whose variance is equal to the increase. Making them smaller requires something more clever (and isn't always possible).
You are right: changing only the diagonal of the covariance matrix would not be a valid transformation, and the result would no longer be the covariance matrix of the variables. I will try to explore a convex combination of the full matrix W and the diagonal matrix diag(W), something like ##(1-a)\,\operatorname{diag}(W) + aW##.
 
That is the covariance matrix of ##\sqrt{a}## multiplied by your original variables, plus new uncorrelated random variables added to each of them with variance ##(1-a)## times the variance of the variable it is being added to.
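A quick numerical check of this interpretation (same made-up example matrix; Gaussian noise is assumed only for convenience):

```python
import numpy as np

rng = np.random.default_rng(2)

# Made-up covariance matrix and mixing weight a.
W = np.array([[4.0, 1.5, 0.5],
              [1.5, 2.0, 0.8],
              [0.5, 0.8, 1.0]])
a = 0.6

# Target: the convex combination (1 - a) * diag(W) + a * W.
target = (1 - a) * np.diag(np.diag(W)) + a * W

# Construction described above: sqrt(a) times the original variables, plus
# independent noise whose variance is (1 - a) times each original variance.
X = rng.multivariate_normal(np.zeros(3), W, size=200_000)
Z = rng.normal(scale=np.sqrt((1 - a) * np.diag(W)), size=X.shape)
Y = np.sqrt(a) * X + Z

print(np.round(np.cov(Y, rowvar=False), 2))
print(np.round(target, 2))
```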
 
Office_Shredder said:
That is the covariance matrix of ##\sqrt{a}## multiplied by your original variables, plus new uncorrelated random variables added to each of them with variance ##(1-a)## times the variance of the variable it is being added to.
Right. So ##a = 1## corresponds to imposing no restriction on the correlations, recovering W, while ##a = 0## corresponds to treating the variables as completely uncorrelated, giving the diagonal matrix of diag(W). I need to figure out how to specify ##a## in my case.
 
I think I don't fully understand how your vector of weights is supposed to modify the variances.
 
