Recent content by Luka

  1. Writing a matrix as sum of a constant * matrix

    Isn't the solvability of the system we get an indicator of whether those numbers exist? If the system has a solution, such x and y exist; otherwise, they don't. (A worked sketch of the system appears after this list.)
  2. Writing a matrix as sum of a constant * matrix

    Well, you only need a few of the equations to solve the system, so it doesn't matter if the matrix is 10 x 10; it wouldn't take too long. As far as I know, there is no more efficient method. Today, if we want to deal with 100 x 100 matrices, we use computers. That's why we've made them in the first...
  3. Writing a matrix as sum of a constant * matrix

    You can check whether there are x and y that make the equation true by solving the system of equations generated by the matrices. Let A=\left| \begin{array}{cc} a_{11} & a_{12} \\ a_{21} & a_{22} \end{array} \right| and B=\left| \begin{array}{cc} b_{11} & b_{12} \\ b_{21} & b_{22}...
  4. Linear independence of sin (x), cos (x) and 1, proof

    It does if we want to prove the linear independence (because of the definition itself). I'm worried about the fact that not all x satisfy the conditions \sin(x)\neq 0, \cos(x)\neq 0 that allow us to prove it. (See the sketch after this list.)
  5. Linear independence of sin (x), cos (x) and 1, proof

    For x=\pi, we get \gamma - \beta = 0, which means that \alpha can be of any value and the expression would still equal zero. Then those elements (f(x), g(x) and h(x)) would not be linearly independent according to the definition of linear independence. I think that we need all three scalars to be...
  6. Linear independence of sin (x), cos (x) and 1, proof

    What would be the best way to show that the functions f(x)=1, g(x)=\sin(x) and h(x)=\cos(x) are linearly independent elements of the vector space \mathbb{R}^{\mathbb{R}}? I know that linear independence means that an expression like \alpha \mathbf{x}_1 + \beta \mathbf{x}_2 + \gamma \mathbf{x}_3...
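A minimal sketch of the system described in the first three posts, assuming we want to write a given 2 x 2 matrix M as M = xA + yB (the letter M and its entries m_{ij} are my notation, since the previews are cut off). Comparing the two sides entry by entry gives four linear equations in the two unknowns x and y:

\begin{aligned}
x\,a_{11} + y\,b_{11} &= m_{11} \\
x\,a_{12} + y\,b_{12} &= m_{12} \\
x\,a_{21} + y\,b_{21} &= m_{21} \\
x\,a_{22} + y\,b_{22} &= m_{22}
\end{aligned}

Two of the equations determine x and y (when the corresponding coefficient pairs are not proportional); the remaining equations must then also hold, and the decomposition exists exactly when the whole system is consistent.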
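For the linear independence question in the last three posts, a minimal sketch, assuming the combination is written \alpha \cdot 1 + \beta \sin(x) + \gamma \cos(x) = 0 (the posts' own assignment of the scalars to the three functions is cut off in the previews). The key point is that the equation must hold for every x, so evaluating it at a few convenient points pins down all three scalars:

\begin{aligned}
x = 0: &\quad \alpha + \gamma = 0 \\
x = \tfrac{\pi}{2}: &\quad \alpha + \beta = 0 \\
x = \pi: &\quad \alpha - \gamma = 0
\end{aligned}

Adding the first and third equations gives \alpha = 0, hence \gamma = 0 and then \beta = 0. Since the only solution is \alpha = \beta = \gamma = 0, the functions 1, \sin(x) and \cos(x) are linearly independent, and no condition like \sin(x)\neq 0 is needed.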