Recent content by ekkilop

  1. Derivative of best approximation

    Say that we have a continuous, differentiable function $f(x)$ and we have found the best approximation (in the sense of the infinity norm) of $f$ from some set of functions forming a finite-dimensional vector space (say, polynomials of degree less than $n$ or trigonometric polynomials of degree less...
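
    One numerical way to explore this question is to compare the derivative of a near-minimax approximation against $f'$. The sketch below is my own illustration, not from the thread: the choice $f = \cos$, the degree, and the use of Chebyshev interpolation as a stand-in for the true best uniform approximation are all assumptions.

    ```python
    import numpy as np
    from numpy.polynomial.chebyshev import Chebyshev

    # Near-minimax polynomial approximation of f on [-1, 1] via Chebyshev
    # interpolation (a proxy for the true best-in-infinity-norm approximation)
    f = np.cos
    p = Chebyshev.interpolate(f, deg=10, domain=[-1, 1])
    dp = p.deriv()

    x = np.linspace(-1, 1, 1001)
    err_f = np.max(np.abs(p(x) - np.cos(x)))     # uniform error of p against f
    err_df = np.max(np.abs(dp(x) + np.sin(x)))   # uniform error of p' against f' = -sin
    print(err_f, err_df)
    ```

    For a smooth $f$ like this, the derivative of the near-minimax approximant tracks $f'$ well, though its error is typically a factor of order $n^2$ larger than the error in $f$ itself.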
  2. Function + tangent line = 0

    Thank you for your reply! Yes, $f(x)$ is continuous. And indeed $f(x) + g(x,y)$ is monotone. What I meant to ask was whether there is a way to explicitly find the value of $x$ at which $f+g=0$, other than the "brute force" way of inverting the expression. Or perhaps, more generally - does an equation of...
  3. Function + tangent line = 0

    Say we have two functions with the following properties: $f(x)$ is negative and monotonically approaches zero as $x$ increases. $g(x,y)$ is linear in $x$ and is, for any given $y$, tangent to $f(x)$ at some point $x_0(y)$ that depends on the choice of $y$ in a known way. Additionally, for any...
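
    A concrete instance shows what an "explicit" solution of $f+g=0$ looks like. The sketch below is my own example, not from the thread: I take $f(x) = -e^{-x}$ (negative, monotonically approaching zero) and let $g$ be the tangent line to $f$ at a chosen $x_0$, then find the root numerically with SciPy.

    ```python
    import numpy as np
    from scipy.optimize import brentq

    # Concrete instance (my own choice): f(x) = -exp(-x) with g the tangent
    # line to f at the point x0
    x0 = 1.0
    f = lambda x: -np.exp(-x)
    g = lambda x: f(x0) + np.exp(-x0) * (x - x0)   # slope f'(x0) = exp(-x0)

    h = lambda x: f(x) + g(x)       # strictly increasing, so the root is unique
    root = brentq(h, x0, x0 + 50)   # h(x0) = 2 f(x0) < 0 and h grows linearly
    print(root)
    ```

    For this particular $f$ the root can actually be written in closed form, $x = x_0 + 1 + W(1/e)$ with the Lambert $W$ function, which illustrates the general pattern: "explicit" solutions of function-plus-tangent equations tend to require special functions rather than elementary inversion.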
  4. Max of sum of sines

    Hi! Consider the function $$\frac{d^n}{dx^n} \sum_{k=1}^m \sin{kx}, \quad 0 \leq x \leq \pi/2.$$ If $n$ is odd this function attains its largest value, $\sum_{k=1}^m k^n$, at $x=0$. But what about if $n$ is even? Where does the maximum occur and what value does it take? Any help is much...
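
    The even case is easy to probe numerically using $\frac{d^n}{dx^n}\sin(kx) = k^n \sin(kx + n\pi/2)$. A sketch (the specific values of $m$ and $n$ are mine, chosen for illustration):

    ```python
    import numpy as np

    def deriv_sum_sines(x, m, n):
        # n-th derivative of sum_{k=1}^m sin(kx), using
        # d^n/dx^n sin(kx) = k^n sin(kx + n*pi/2)
        k = np.arange(1, m + 1)
        return np.sum(k[:, None] ** n * np.sin(np.outer(k, x) + n * np.pi / 2), axis=0)

    # Scan [0, pi/2] on a fine grid to locate the maximum for an even n
    m, n = 5, 4
    x = np.linspace(0.0, np.pi / 2, 200001)
    y = deriv_sum_sines(x, m, n)
    i = int(np.argmax(y))
    print(f"max ~ {y[i]:.4f} at x ~ {x[i]:.4f}")
    ```

    A grid scan like this does not prove where the maximum sits, but it is a quick way to form a conjecture about how the maximizer moves as $m$ and $n$ vary.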
  5. Approximating function by trigonometric polynomial

    Thank you mfb for your reply! Yes, that was my original idea as well. If $g$ is the approximation in the RHS of (1), then I reasoned that the optimal result should be when $(f-g) \perp f$. However, $(f-g, f)$ is a linear function in the coefficients $a_n$ so there are no extrema (I am...
  6. Approximating function by trigonometric polynomial

    Hi! Say that we wish to approximate a function $f(x), \, x\in [0, 2\pi]$ by a trigonometric polynomial such that $$f(x) \approx \sum_{|n|\leq N} a_n e^{inx} \qquad (1)$$ The best approximation theorem says that in a function space equipped with the inner product $(f,g) = \frac{1}{2...
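
    With the $L^2$ inner product, the optimal coefficients in (1) are the Fourier coefficients $a_n = \frac{1}{2\pi}\int_0^{2\pi} f(x)\, e^{-inx}\, dx$. A small sketch (the test function is my own choice) approximates them by a Riemann sum, which is spectrally accurate for periodic integrands:

    ```python
    import numpy as np

    def fourier_coeffs(f, N, M=4096):
        # a_n = (1/2pi) * integral of f(x) e^{-inx} over [0, 2pi],
        # approximated by an equispaced Riemann sum
        x = np.linspace(0.0, 2 * np.pi, M, endpoint=False)
        fx = f(x)
        return {n: np.mean(fx * np.exp(-1j * n * x)) for n in range(-N, N + 1)}

    def trig_poly(coeffs, x):
        # Evaluate sum_{|n| <= N} a_n e^{inx}
        return sum(a * np.exp(1j * n * x) for n, a in coeffs.items())

    # Example target (my own choice): f(x) = exp(sin x), smooth and 2pi-periodic
    f = lambda x: np.exp(np.sin(x))
    c = fourier_coeffs(f, N=10)
    xs = np.linspace(0, 2 * np.pi, 500)
    err = np.max(np.abs(trig_poly(c, xs) - f(xs)))
    print(f"max error of the degree-10 approximation: {err:.2e}")
    ```

    For a smooth periodic $f$ the error decays extremely fast in $N$, which is why truncated Fourier series are the canonical answer to the best-approximation question in this norm.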
  7. A tricky finite series!

    Hi! I've encountered the series below: $$\sum_{l=0}^{k-1} (r+l)^j (r+l-k)^i$$ where $r, k, i, j$ are positive integers and $i \leq j$. I am interested in expressing this series as a polynomial in $k$ - or rather - finding the coefficients of that polynomial as $i,j$ change. I have reasons to...
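
    Since the summand is a polynomial in $l$, a computer algebra system can produce the closed form directly, and from it the coefficients of the polynomial in $k$. A sketch with SymPy (the small values $i=1$, $j=2$ are just an example):

    ```python
    import sympy as sp

    r, k, l = sp.symbols('r k l')

    def series_poly(i, j):
        # Closed form of sum_{l=0}^{k-1} (r+l)^j (r+l-k)^i; the summand is
        # polynomial in l, so sympy sums it via Faulhaber-type formulas
        return sp.expand(sp.summation((r + l)**j * (r + l - k)**i, (l, 0, k - 1)))

    p = series_poly(1, 2)                 # small example with i=1, j=2
    coeffs = sp.Poly(p, k).all_coeffs()   # coefficients as a polynomial in k
    print(coeffs)                         # each coefficient is a polynomial in r
    ```

    Generating these closed forms for a range of $i, j$ and inspecting `coeffs` is a practical way to hunt for the pattern in the coefficients.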
  8. Cauchy expansion of determinant of a bordered matrix

    Hi! It just dawned on me that any such matrices (I suppose there are only 4 places $A$ could go ^^, ) are related by simple permutations. Since any permutation matrix has determinant $+1$ or $-1$, what you say must be true. Thank you for the enlightenment! =)
  9. Cauchy expansion of determinant of a bordered matrix

    The Cauchy expansion says that $$\det \begin{bmatrix} A & x \\ y^T & a \end{bmatrix} = a \det(A) - y^T \operatorname{adj}(A)\, x,$$ where $A$ is an $(n-1) \times (n-1)$ matrix, $y$ and $x$ are vectors with $n-1$ elements, and $a$ is a scalar. There is a proof in Matrix Analysis by Horn and...
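
    The identity is easy to sanity-check numerically before working through the proof. A sketch with random data (the dimension is an arbitrary choice, and $\operatorname{adj}(A) = \det(A)\, A^{-1}$ is used, valid for invertible $A$):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 5
    A = rng.standard_normal((n - 1, n - 1))
    x = rng.standard_normal((n - 1, 1))
    y = rng.standard_normal((n - 1, 1))
    a = rng.standard_normal()

    # adj(A) = det(A) * inv(A) when A is invertible
    adjA = np.linalg.det(A) * np.linalg.inv(A)

    M = np.block([[A, x], [y.T, np.array([[a]])]])
    lhs = np.linalg.det(M)
    rhs = a * np.linalg.det(A) - (y.T @ adjA @ x).item()
    print(abs(lhs - rhs))   # agreement up to floating-point rounding
    ```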
  10. Addition to a random matrix element

    Hi all! I have no application in mind for the following question but I find it curious to think about: Say that we have a square matrix where the sum of the elements in each row and each column is zero. Clearly such a matrix is singular (the all-ones vector lies in its null space). Suppose that no row or column of the matrix is the...
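
    The setup is easy to experiment with. A sketch (the construction of the zero-sum matrix and the size are my own choices): subtracting row means zeroes the row sums, and subtracting column means afterwards zeroes the column sums without disturbing the row sums, since the column means of a zero-row-sum matrix add up to zero.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 4

    # Random matrix with all row and column sums equal to zero
    M = rng.standard_normal((n, n))
    M -= M.mean(axis=1, keepdims=True)   # zero the row sums
    M -= M.mean(axis=0, keepdims=True)   # zero the column sums, row sums stay zero

    # Singular: M @ ones = 0, so the rank is at most n - 1
    print(np.linalg.matrix_rank(M))

    # Perturb a single, randomly chosen element
    i, j = rng.integers(0, n, size=2)
    P = M.copy()
    P[i, j] += 1.0
    print(np.linalg.det(P))   # generically nonzero after the perturbation
    ```

    By cofactor expansion, $\det(P) = \det(M) + C_{ij} = C_{ij}$, so the perturbed matrix is nonsingular exactly when the $(i,j)$ cofactor of $M$ is nonzero, which holds generically when $M$ has rank $n-1$.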
  11. Properties of a special block matrix

    Thank you! I think I shall have to return to the drawing board for a closer investigation :)
  12. Properties of a special block matrix

    That's a fair point. I was playing around with a different matrix - Hermitian and also symmetric about the anti-diagonal. Turns out that the eigenvectors are closely related to the eigenvectors of the matrix R above so I was curious about their structure. It seems reasonable that the upper and...
  13. Properties of a special block matrix

    Hi folks! I've encountered the matrix below and I'm curious about its properties: $$R= \begin{pmatrix} 0 & N-S\\ N+S & 0 \end{pmatrix}$$ where $R$, $N$ and $S$ are real matrices, $R$ is $2n \times 2n$, $N$ is $n \times n$ symmetric and $S$ is $n \times n$ skew-symmetric. Clearly $R$ is symmetric so the...
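
    One concrete structural fact worth checking numerically: writing $B = N + S$, we have $N - S = B^T$, so $R = \begin{pmatrix} 0 & B^T \\ B & 0 \end{pmatrix}$, and the eigenvalues of a symmetric anti-diagonal block matrix of this form are $\pm$ the singular values of $B$. A sketch (the sizes and random construction are my choices):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 4
    G = rng.standard_normal((n, n))
    N = (G + G.T) / 2    # symmetric part
    S = (G - G.T) / 2    # skew-symmetric part

    B = N + S            # note N - S = B.T, so R = [[0, B.T], [B, 0]]
    R = np.block([[np.zeros((n, n)), B.T], [B, np.zeros((n, n))]])

    print(np.allclose(R, R.T))   # R is symmetric, as noted in the thread

    # Eigenvalues of [[0, B.T], [B, 0]] are +/- the singular values of B
    eigs = np.sort(np.linalg.eigvalsh(R))
    svals = np.linalg.svd(B, compute_uv=False)
    print(np.allclose(eigs, np.sort(np.concatenate([-svals, svals]))))
    ```

    The eigenvectors are correspondingly built from the left and right singular vectors of $B$: if $B v = \sigma u$ and $B^T u = \sigma v$, then $(v, \pm u)^T$ is an eigenvector of $R$ with eigenvalue $\pm\sigma$, which explains the coupled structure of the upper and lower halves.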
  14. Prove the Lyapunov equation

    The sufficiency can be obtained by considering $$B=\int_{0}^{\infty} e^{A^T t} Q e^{A t} \, dt.$$ Inserting into the Lyapunov equation gives $$A^T B + BA = A^T \int_{0}^{\infty} e^{A^T t} Q e^{A t} \, dt + \int_{0}^{\infty} e^{A^T t} Q e^{A t} \, dt \, A = \int_{0}^{\infty} \frac{d}{dt} \left( e^{A^T t} Q e^{A t} \right) dt = [e^{A^T...
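
    The integral formula can be verified numerically. The sketch below (my own construction: a randomly generated Hurwitz $A$, $Q = I$, and a trapezoidal approximation of the integral) checks that $B = \int_0^\infty e^{A^T t} Q e^{A t}\, dt$ satisfies $A^T B + B A = -Q$, and cross-checks against SciPy's direct Lyapunov solver.

    ```python
    import numpy as np
    from scipy.linalg import expm, solve_continuous_lyapunov

    rng = np.random.default_rng(3)
    n = 3
    A = rng.standard_normal((n, n)) - 3 * n * np.eye(n)   # shifted to be Hurwitz
    Q = np.eye(n)

    # Approximate B = integral of e^{A^T t} Q e^{A t} dt by the trapezoidal rule,
    # propagating P(t) = e^{A^T t} Q e^{A t} one step at a time
    dt, T = 1e-3, 20.0
    E = expm(A * dt)          # one-step propagator
    P = Q.copy()              # P(0) = Q
    B = 0.5 * P * dt
    for _ in range(int(T / dt)):
        P = E.T @ P @ E
        B += P * dt
    B -= 0.5 * P * dt         # trapezoidal end correction

    # B should satisfy A^T B + B A = -Q
    print(np.allclose(A.T @ B + B @ A, -Q, atol=1e-4))

    # Cross-check: scipy solves a x + x a^T = q, so pass (A.T, -Q)
    X = solve_continuous_lyapunov(A.T, -Q)
    print(np.allclose(B, X, atol=1e-5))
    ```

    The boundary term $[e^{A^T t} Q e^{A t}]_0^\infty = -Q$ in the derivation requires $A$ to be Hurwitz (all eigenvalues in the open left half-plane), which is why the integral converges here.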
  15. Zero as an element of an eigenvector

    Thank you for your reply! Is there a way to determine from the matrix whether a zero will appear without calculating the eigenvector explicitly?