Search results

  1. A

    I Eigenvalues of block matrices

    This is close to what I am thinking of. If X is an nxn matrix in a certain vector space, with entries representing some coefficients, it would be wonderful if we could reduce this representation to some form of 2x2 blocks of its states, even if it is in some other space. For example, in the Markov case...
  2. A

    I Eigenvalues of block matrices

    Thank you for the comprehensive analysis. But should there be any closed form for that function of the eigenvalues of A and the eigenvalues of B? For example, suppose X represents the transition probabilities in a Markov chain process; will I be able to represent it, using a certain transformation, by a...
  3. A

    I Eigenvalues of block matrices

    But the matrices in the last example have rows and columns which are linear combinations of each other. What if we have a matrix with random entries, like the one in your post #4? Again, will there be any relationship between the eigenvalues of the matrix and its block matrices?
  4. A

    I Eigenvalues of block matrices

    I used an online matrix calculator and found no obvious correlation between the eigenvectors or eigenvalues of ##\begin{bmatrix}\mathbf{A} & 0 \\ 0 & \mathbf{B}\end{bmatrix}## and ##\begin{bmatrix}0 & \mathbf{A} \\ \mathbf{B} & 0\end{bmatrix}##. There is of course an obvious similarity, as...
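A correlation between the two arrangements does exist, though not between the raw eigenvectors: the block-diagonal matrix has exactly the eigenvalues of A and B together, while the anti-diagonal matrix squares to ##\begin{bmatrix}\mathbf{AB} & 0 \\ 0 & \mathbf{BA}\end{bmatrix}##, so its eigenvalues are plus-and-minus square roots of the eigenvalues of AB. A minimal numerical check (a sketch, using small random blocks):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 2))
B = rng.standard_normal((2, 2))
Z = np.zeros((2, 2))

# Block-diagonal: eigenvalues are the union of eig(A) and eig(B).
D = np.block([[A, Z], [Z, B]])
union = np.concatenate([np.linalg.eigvals(A), np.linalg.eigvals(B)])
assert np.allclose(np.sort(np.linalg.eigvals(D)), np.sort(union))

# Anti-diagonal: M squared is blkdiag(AB, BA), so the squares of eig(M)
# match eig(AB) together with eig(BA).
M = np.block([[Z, A], [B, Z]])
squares = np.linalg.eigvals(M) ** 2
both = np.concatenate([np.linalg.eigvals(A @ B), np.linalg.eigvals(B @ A)])
assert np.allclose(np.sort(squares), np.sort(both))
```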
  5. A

    I Eigenvalues of block matrices

    This is clear if C is a diagonal matrix whose entries are real numbers; in that case, the eigenvectors of C are ##(1,0,0,\dots,0)^T##, ..., ##(0,0,0,\dots,1)^T## and the eigenvalues are the numbers themselves. The eigenvectors and eigenvalues of each block will follow the same pattern. What if C has...
  6. A

    I Eigenvalues of block matrices

    If there is a matrix that is formed from blocks of 2 x 2 matrices, what will be the relation between the eigenvalues and eigenvectors of that matrix and the eigenvalues and eigenvectors of the sub-matrices?
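For the block-diagonal special case the relation is clean: the eigenvalues of the big matrix are the eigenvalues of the diagonal blocks taken together, and each block eigenvector, padded with zeros, is an eigenvector of the big matrix. A minimal check with two arbitrary 2 x 2 blocks (a sketch, not tied to any particular matrix above):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, -1.0], [1.0, 0.0]])
C = np.block([[A, np.zeros((2, 2))], [np.zeros((2, 2)), B]])

# Eigenvalues of C = eigenvalues of A together with eigenvalues of B.
eig_C = np.sort(np.linalg.eigvals(C))
eig_blocks = np.sort(np.concatenate([np.linalg.eigvals(A),
                                     np.linalg.eigvals(B)]))
assert np.allclose(eig_C, eig_blocks)

# An eigenvector of A, padded with zeros, is an eigenvector of C.
w, v = np.linalg.eig(A)
padded = np.concatenate([v[:, 0], np.zeros(2)])
assert np.allclose(C @ padded, w[0] * padded)
```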
  7. A

    I Can we retrieve the inverse of matrix A in this example?

    So in the special case where all elements of A are real and positive, no additional information is required and A is solved.
  8. A

    I Can we retrieve the inverse of matrix A in this example?

    But why is A, if it is real and diagonal, not unique? If I get ##A^2##, then each diagonal element of A is ##\sqrt{A^2}##.
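The non-uniqueness is only a sign ambiguity, which can be checked directly: flipping the sign of any diagonal entry of a real diagonal A leaves ##A^2## unchanged, while restricting to positive entries restores uniqueness. A short numerical illustration (assumed values):

```python
import numpy as np

# A real diagonal matrix is not determined by its square: flipping the
# sign of any diagonal entry leaves A @ A unchanged.
A1 = np.diag([2.0, 3.0, 5.0])
A2 = np.diag([2.0, -3.0, 5.0])
assert np.allclose(A1 @ A1, A2 @ A2)

# Requiring every diagonal entry to be positive removes the ambiguity:
recovered = np.diag(np.sqrt(np.diag(A1 @ A1)))
assert np.allclose(recovered, A1)
```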
  9. A

    I Can we retrieve the inverse of matrix A in this example?

    We will get ##A^2## because ##U^TU=1##, right?
  10. A

    I Can we retrieve the inverse of matrix A in this example?

    I appreciate your contribution to answering, but frankly I have no clear idea what your words mean. My question has been clear from the beginning and I still have no answer to it: will I be able to decompose UA into U and A, where U is a unique unitary matrix and A is a unique diagonal matrix, or not?
  11. A

    I Can we retrieve the inverse of matrix A in this example?

    What if we find that UA is invertible, can we decompose it?
  12. A

    I Can we retrieve the inverse of matrix A in this example?

    U and A are square matrices of the same rank, so UA is a square matrix too and it should be invertible. But even in this case, how do we find A and its inverse from UA? In other words, can we decompose UA into U and A?
  13. A

    I Can we retrieve the inverse of matrix A in this example?

    Suppose we have a product formed by the multiplication of a unitary matrix U and a diagonal matrix A. Can we retrieve the inverse of A without knowing either U or A?
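A partial answer can be sketched numerically: from the product P = UA alone one can form ##P^HP = A^HU^HUA = |A|^2##, so the magnitudes of A's diagonal entries, and hence A and its inverse when A is known to be positive, are recoverable; the signs or phases are not. A sketch assuming a real orthogonal U (built from a QR factorization) and a positive diagonal A:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical setup: U a random orthogonal matrix, A a positive diagonal.
U, _ = np.linalg.qr(rng.standard_normal((3, 3)))
A = np.diag([1.0, 2.0, 4.0])
P = U @ A  # only the product is assumed known

# P^T P = A U^T U A = A^2, so the entries of A are recoverable up to sign;
# assuming positivity fixes them, and the inverse follows.
A_recovered = np.diag(np.sqrt(np.diag(P.T @ P)))
assert np.allclose(A_recovered, A)
A_inv = np.linalg.inv(A_recovered)
assert np.allclose(A_inv @ A, np.eye(3))
```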
  14. A

    I Which reduced vector should be used in cosine similarity?

    In latent semantic analysis, the truncated singular value decomposition (SVD) of a term-document matrix ##A_{mn}## is $$A=U_rS_rV^T_r$$ In many references, including Wikipedia, the new reduced document column vector in r-space is scaled by the singular values ##S## before being compared with other...
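The two conventions at issue can be placed side by side. In the sketch below, a small random matrix stands in for a real term-document matrix; the reduced document vectors are the rows of ##V_r##, either used directly or scaled coordinate-wise by the singular values, and the two choices generally yield different cosine similarities:

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.random((6, 5))              # stand-in term-document matrix
U, S, Vt = np.linalg.svd(A, full_matrices=False)
r = 2
docs_unscaled = Vt[:r, :].T         # rows of V_r: one vector per document
docs_scaled = docs_unscaled * S[:r] # same vectors, weighted by singular values

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

# The two conventions give (generally different) valid similarities:
c_unscaled = cosine(docs_unscaled[0], docs_unscaled[1])
c_scaled = cosine(docs_scaled[0], docs_scaled[1])
assert -1.0 <= c_unscaled <= 1.0 and -1.0 <= c_scaled <= 1.0
```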
  15. A

    Singular spectral analysis of periodic series with period L

    OK, here is an attached image of the trajectory matrix X, whose column vectors have length L, the window length of the series. Now suppose that the time series represented by this matrix has a period which is exactly equal to the time between two successive X values. For example, the...
  16. A

    Singular spectral analysis of periodic series with period L

    Let's take a time series with period L. Suppose we arbitrarily choose the window length of the trajectory matrix to be equal to L, which is also the period of the time series. Then the second column of the matrix will also start with the same entry as the first column, because all...
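The repetition described here can be demonstrated with a toy series: when the window length equals the period, the columns of the trajectory matrix repeat, so only L distinct columns occur. A sketch with an assumed period-4 series:

```python
import numpy as np

# A periodic series with period L; trajectory (Hankel) matrix with window L.
L, N = 4, 16
x = np.tile([1.0, 3.0, 2.0, 5.0], N // L)   # period exactly L
K = N - L + 1
X = np.column_stack([x[i:i + L] for i in range(K)])

# Columns repeat with lag L ...
assert np.allclose(X[:, 0], X[:, L])
# ... so only L distinct columns occur in the whole trajectory matrix.
assert np.unique(X.T, axis=0).shape[0] == L
```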
  17. A

    What is more efficient, autocorrelation or SSA?

    By efficient, I meant the ability of the algorithm to extract all possible information about the spectral components of the time series. Real-time analysis is an important concern too.
  18. A

    What is more efficient, autocorrelation or SSA?

    What is more efficient at extracting the pattern in time series analysis: autocorrelation or singular spectrum analysis?
  19. A

    SVD of a reduced rank matrix still has non-zero U and V`?

    So I have two concerns here: 1) does that mean we can construct an infinite number of matrices U and V' by arbitrarily choosing orthonormal column vectors of U and row vectors of V'? 2) back to my question: U and V' are constructed from AA'=USSU' and A'A=VSSV', but if this is applied to the...
  20. A

    SVD of a reduced rank matrix still has non-zero U and V`?

    For a given matrix A, the singular value decomposition (SVD) yields A=USV`. Now let's reduce the rank of the matrix by keeping only one column vector from U, one singular value from S and one row vector from V`. Then do another SVD of the resulting rank-reduced matrix Ar. Now, if Ar is...
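The behaviour asked about can be reproduced directly: the SVD of the rank-reduced matrix still returns orthonormal factors, but all singular values beyond the first are numerically zero, which is how the reduced rank shows up. A sketch with an assumed random matrix:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 3))
U, S, Vt = np.linalg.svd(A, full_matrices=False)

# Keep one column of U, one singular value and one row of V': rank-1 matrix.
Ar = S[0] * np.outer(U[:, 0], Vt[0, :])

# The SVD of Ar still returns orthonormal factors, but only one singular
# value is numerically non-zero.
Ur, Sr, Vrt = np.linalg.svd(Ar, full_matrices=False)
assert np.isclose(Sr[0], S[0])
assert np.allclose(Sr[1:], 0.0, atol=1e-12)
# The returned U still has orthonormal columns despite the reduced rank:
assert np.allclose(Ur.T @ Ur, np.eye(3))
```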
  21. A

    Can the system response function be calculated?

    But in linear regression, we seek to calculate the regressors β0 and β1 by using the different xij to form the χ matrix of independent variables. In my example, I am doing the opposite: seeking to calculate the system response function represented by a matrix, χ, in analogy with linear...
  22. A

    Can the system response function be calculated?

    Do you mean multivariate linear regression like Y=XB, where Y is a random vector, B is a regressor vector and X is a matrix? Can you explain more, please?
  23. A

    Can the system response function be calculated?

    This is a good and simple method to calculate all the ai,m. But it still cannot guarantee that the rows and columns of A are not linearly dependent. Suppose, for simplicity, that A is a (2x2) matrix. For A to be diagonalizable, the following condition must be satisfied: a21/a11 ≠ a22/a12. So no matter which...
  24. A

    Can the system response function be calculated?

    In my mind, the linear dependency depends on the values of the matrix, because the inputs can be chosen arbitrarily. But the values of the matrix cells are themselves unknown, so how can we make sure that the matrix coefficients maintain linearly independent sets of equations before they are calculated?
  25. A

    Can the system response function be calculated?

    Suppose we represent the input information as an (nx1) column vector, the output information as another (nx1) column vector, and the system response function as an (nxn) matrix. My question: is it possible to calculate the values of the cells of the matrix knowing the input and the output? For...
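One standard answer can be sketched: a single input/output pair under-determines the (nxn) matrix, but n pairs with linearly independent inputs determine it completely, since stacking them gives Y = AX with X invertible. A numerical illustration (assumed random system and inputs):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3
A_true = rng.standard_normal((n, n))  # hypothetical unknown system matrix

# n input column vectors stacked into X, with measured outputs Y = A X.
X = rng.standard_normal((n, n))       # random inputs: almost surely independent
Y = A_true @ X

# With X invertible, the system matrix is recovered as Y X^{-1}
# (a single input/output pair alone would not suffice).
A_est = Y @ np.linalg.inv(X)
assert np.allclose(A_est, A_true)
```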