Recent content by onako

  1. Looking for a common solution of two systems

    Given two systems, Ax=b and Cy=d, for n x n matrices A and C and n-dimensional vectors b and d, each of which has at least one solution, it is known that one solution is common to both (i.e., it satisfies both equations). Could such a common solution z be found by solving (A+C)z=b+d? I understand that a common...
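
    A quick numeric sketch of the idea (NumPy; the construction of A, C, and the shared solution is my own): if z is common to both systems, then (A+C)z = b+d, so whenever A+C is invertible, solving the combined system recovers z.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 4
        A = rng.standard_normal((n, n))
        C = rng.standard_normal((n, n))
        z_common = rng.standard_normal(n)      # shared solution, by construction
        b, d = A @ z_common, C @ z_common      # so Az=b and Cz=d hold for z_common

        # z_common satisfies (A+C)z = b+d; if A+C is invertible, it is the only solution.
        z = np.linalg.solve(A + C, b + d)
        print(np.allclose(z, z_common))        # True here; not guaranteed if A+C is singular
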
  2. Understanding displacements of points by interpreting directions

    So, adding a(x-y) to x means that the distance from x to y changes depending on a: positive a implies increased distance, and negative a implies decreased distance? It's a bit "non-rigid" to state that a vector is added to a point.
  3. Understanding displacements of points by interpreting directions

    Suppose that points x and y are given in Euclidean space. Point x is displaced to point x_1 by x_1 = x + a(x-y). Given that a is a positive number, how can it be shown that the distance from x_1 to y is larger than the distance from x to y? I'm mainly interested in a vector interpretation of the above update...
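
    A short vector derivation (my own sketch) answers this directly:

        x_1 - y = (x - y) + a(x - y) = (1 + a)(x - y)
        ||x_1 - y|| = |1 + a| ||x - y|| > ||x - y||   for a > 0
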
  4. Weighted average of arbitrary k points from a line

    I'm concerned with the 2D case, with a straight line.
  5. Weighted average of arbitrary k points from a line

    Let's simplify the question. Suppose that arbitrary non-negative weights o_i are associated with each x_i chosen from the straight line. Does x_o lie on that same line? The point is: the points x_i are chosen arbitrarily from the line, and each is associated with a non-negative coefficient o_i.
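
    A small numeric check of this (NumPy; the line and weights are my own choices): points are generated on a line and the weighted average is tested for collinearity.

        import numpy as np

        rng = np.random.default_rng(1)
        p, v = np.array([1.0, 2.0]), np.array([3.0, -1.0])   # line: p + t*v
        t = rng.standard_normal(5)
        x = p + t[:, None] * v                               # 5 points on the line
        o = rng.random(5)                                    # non-negative weights
        x_o = (o[:, None] * x).sum(axis=0) / o.sum()         # weighted average

        # x_o is on the line iff x_o - p is parallel to v (2D cross product is zero)
        w = x_o - p
        print(np.isclose(w[0]*v[1] - w[1]*v[0], 0.0))        # True
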
  6. Weighted average of arbitrary k points from a line

    Suppose a set of k arbitrary points x_i, 1<=i<=k, x_i in R^2, is selected from a line. How can it be shown that the weighted barycenter x_o=(o_1*x_1+...+o_k*x_k)/(o_1+o_2+...+o_k) also belongs to that line (assume the o_i are arbitrary weights)? Does the choice of weights restrict the solutions (i.e., a...
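
    A sketch of the standard argument (my own wording): write each point as x_i = p + t_i*v for some point p and direction v of the line. Then

        x_o = (o_1*x_1 + ... + o_k*x_k)/(o_1 + ... + o_k)
            = p + ((o_1*t_1 + ... + o_k*t_k)/(o_1 + ... + o_k))*v,

    which is again of the form p + t*v, hence on the line. Only the fact that the coefficients sum to 1 after normalization (an affine combination) is used, so any weights with a non-zero sum work.
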
  7. Equivalence of the nullspace and eigenvectors corresponding to zero eigenvalue

    @mathwonk Thanks. What is the exact meaning of "at least" in your statement? Could the set of null vectors be a superset of the set of eigenvectors corresponding to the zero eigenvalue (of course, any linear combination of such eigenvectors will also be a vector of the null space)?
  8. Equivalence of the nullspace and eigenvectors corresponding to zero eigenvalue

    Suppose a square matrix A is given. Is it true that the null space of A corresponds to the eigenvectors of A associated with its zero eigenvalue? I'm a bit confused about the terms 'algebraic multiplicity' and 'geometric multiplicity' of eigenvalues in relation to the previous statement. How does this affect the...
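
    A small numeric illustration (NumPy; the matrix is my own example): a basis of the null space, extracted from the SVD, consists of vectors v with Av = 0, i.e., eigenvectors for the eigenvalue 0.

        import numpy as np

        A = np.array([[1.0, 2.0, 3.0],
                      [2.0, 4.0, 6.0],   # rank-deficient: row 2 = 2 * row 1
                      [1.0, 0.0, 1.0]])

        # Right singular vectors with (numerically) zero singular value span the null space.
        _, s, Vt = np.linalg.svd(A)
        null_basis = Vt[s < 1e-10]
        for v in null_basis:
            print(np.allclose(A @ v, 0))   # each null vector satisfies Av = 0*v
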
  9. Derivative of a function involving square root of sum of squares

    It then means you're squaring each term, not the function itself. If the function itself were squared, these would be equivalent. Given a set of points in 2D, the point that minimizes the sum of squared distances to those points is the barycenter; I'm not sure about the sum of distances (so, not...
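
    For the squared case, the claim follows from a one-line gradient computation (my own sketch): with g(x) = \sum_j ||x - x_j||^2, the gradient is \nabla g(x) = 2 \sum_j (x - x_j), which vanishes exactly at x = (1/n) \sum_j x_j, the barycenter.
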
  10. Derivative of a function involving square root of sum of squares

    Thanks. Now, faced with the problem of minimizing f(x) for the provided 2D parameters x_1, x_2, x_3, ..., x_k, one sets the derivative to zero and solves for x. However, in the case of more than one dimension this problem is non-trivial, I think. What would be the minimizer of f(x), provided 2D...
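
    Indeed, there is no closed form in general: the minimizer is the geometric median of the points. One standard approach is Weiszfeld's iteration; a minimal sketch (my own, assuming the iterate never lands exactly on a data point):

        import numpy as np

        def geometric_median(points, iters=100):
            """Weiszfeld's iteration for min_x sum_j ||x - x_j||."""
            x = points.mean(axis=0)                    # start from the barycenter
            for _ in range(iters):
                d = np.linalg.norm(points - x, axis=1)
                w = 1.0 / d                            # assumes x != x_j for all j
                x = (w[:, None] * points).sum(axis=0) / w.sum()
            return x

        pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [2.0, 2.0]])
        print(geometric_median(pts))
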
  11. Derivative of a function involving square root of sum of squares

    Provided is a function f(x)=\sum_{j=1}^n ||x-x_j||, for x a two-dimensional vector, where ||.|| denotes the Euclidean norm in 2D space. How could one obtain the derivative of such a function?
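
    The usual answer, sketched: away from the points x_j, each term ||x - x_j|| = ((x - x_j) \cdot (x - x_j))^{1/2} is differentiable, and the chain rule gives

        \nabla f(x) = \sum_{j=1}^n (x - x_j)/||x - x_j||,

    a sum of unit vectors pointing from each x_j toward x; f is not differentiable at the points x_j themselves.
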
  12. Transforming a matrix to an orthogonal one

    Thank you for such a good explanation.
  13. Transforming a matrix to an orthogonal one

    Thanks. Just one note: I suppose you've taken into account that there are p columns in X (which is an n x p matrix). If I'm not wrong, only n linearly independent vectors in R^n define a basis of R^n. So, given an input X with linearly independent columns, such columns could...
  14. Transforming a matrix to an orthogonal one

    Suppose a matrix X of size n x p is given, n>p, with p linearly independent columns. Can it be guaranteed that there exists a matrix A of size p x p that converts the columns of X to orthonormal columns? In other words, is there an A such that Y=XA and Y^TY=I, where I is the p x p identity matrix?
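
    A sketch of why the answer is yes (NumPy; using the thin QR factorization): writing X = QR with Q^TQ = I and R invertible (since the columns of X are independent), taking A = R^{-1} gives Y = XA = Q.

        import numpy as np

        rng = np.random.default_rng(2)
        n, p = 6, 3
        X = rng.standard_normal((n, p))        # generically has independent columns

        Q, R = np.linalg.qr(X)                 # thin QR: Q is n x p, R is p x p
        A = np.linalg.inv(R)                   # R is invertible when rank(X) = p
        Y = X @ A
        print(np.allclose(Y.T @ Y, np.eye(p))) # True: columns of Y are orthonormal
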
  15. Use of a derivative or a gradient to minimize a function

    Could you provide a simple example?
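
    For instance (my own minimal example, not from the thread): gradient descent on f(x) = (x - 3)^2, whose derivative f'(x) = 2(x - 3) vanishes at the minimizer x = 3.

        def f_prime(x):
            return 2.0 * (x - 3.0)   # derivative of f(x) = (x - 3)^2

        x, step = 0.0, 0.1
        for _ in range(100):
            x -= step * f_prime(x)   # move against the gradient
        print(x)                     # converges to 3.0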