Recent content by winterfors

  1. Integration of functions mapping into a vector space

    Thanks Fredrik, I've looked briefly at Serge Lang's book and it's along the lines of what I'm looking for. He seems to use the norm on the vector space K that the function maps into to define a metric to ensure convergence, so maybe it would be sufficient to assume that K is a subset of a metric...
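    A rough sketch of that norm-based construction (assuming, as a working hypothesis, that K sits inside a complete normed space (V, \|\cdot\|)): for a simple function s = \sum_i v_i 1_{A_i} with v_i \in V and A_i \in \Sigma_\Gamma, define

        \int_\Gamma s \, dP_\Gamma = \sum_i v_i \, P_\Gamma(A_i),

    and then set \int_\Gamma f \, dP_\Gamma = \lim_n \int_\Gamma s_n \, dP_\Gamma for any sequence of simple functions s_n with \int_\Gamma \|f - s_n\| \, dP_\Gamma \to 0, the limit being taken in the norm of V.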
  2. Integration of functions mapping into a vector space

    The context is that I have a function f:\Gamma \to {\rm K} where \Gamma is a set of probability measures all defined on the same sigma-algebra \Sigma, and {\rm K} is some subset of a vector space equipped with a partial ordering. Now, \Gamma is also a probability space (\Gamma ,{\Sigma _\Gamma...
  3. Integration of functions mapping into a vector space

    Given a measurable function f that is not real- or complex-valued, but that maps into some vector space, what are the necessary conditions for it to be integrable? I've looked through over 20 books on integration and measure theory, but they all only deal with integration of real (or...
  4. Jensen inequality, unexplained distribution, very confusing problem

    I suggest you look up the definition of a convex/concave function. A function does not have to be differentiable to be convex; for instance, f(x)=abs(x) is convex even though it is not differentiable at x=0. Also, a function that is not convex is not necessarily concave, and vice versa.
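    A minimal numerical illustration of both points (the midpoint check and sample grid below are my own, not from the thread):

    ```python
    # Midpoint convexity: f((a+b)/2) <= (f(a)+f(b))/2 must hold for every
    # pair (a, b) if f is convex; collect the pairs where it fails.
    def convexity_violations(f, points, tol=1e-12):
        return [(a, b) for a in points for b in points
                if f((a + b) / 2) > (f(a) + f(b)) / 2 + tol]

    pts = [i / 10 for i in range(-30, 31)]

    # abs is convex even though it is not differentiable at 0: no violations.
    print(convexity_violations(abs, pts))                      # []

    # x**3 is neither convex nor concave: both it and its negation fail.
    print(bool(convexity_violations(lambda x: x ** 3, pts)))   # True
    print(bool(convexity_violations(lambda x: -x ** 3, pts)))  # True
    ```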
  5. Jensen inequality, unexplained distribution, very confusing problem

    Well, you cannot simply "remove" X=0 by setting 1/0=0. Then your function is no longer convex, and this is the source of your problem.
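    A one-line check of that claim (the test points are arbitrary; g below is the "patched" function with 1/0 set to 0):

    ```python
    # With g(0) := 0, midpoint convexity already fails on [0, 2]:
    def g(x):
        return 0.0 if x == 0 else 1.0 / x

    a, b = 0.0, 2.0
    print(g((a + b) / 2))      # 1.0
    print((g(a) + g(b)) / 2)   # 0.25 -> g(midpoint) > average, so g is not convex
    ```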
  6. Jensen inequality, unexplained distribution, very confusing problem

    Without having looked at the details of your calculations, you certainly have a problem with Y = 1/X at X=0.
  7. N:th derivative of exp(x)/x type function?

    Well, using the general Leibniz rule you can quite easily get to \frac{\partial^k}{\partial x^k}f(x) = f(x)\,k!\sum\limits_{\mathbf{m} \in \Theta_k^{n+1}} \frac{1}{\prod\limits_{j=1}^{n+1} m_j!} \prod\limits_{j=1}^{n} \frac{(z_j - 1 + m_j)!}{(z_j - 1)!} (y_j...
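    Not from the thread, but a small sympy sketch of how a closed form like this can be sanity-checked against direct differentiation; n = 2, y = (1, 2), z = (2, 3) and k = 3 are arbitrary test values:

    ```python
    import sympy as sp

    x = sp.symbols('x')
    y = (1, 2)      # arbitrary poles y_j
    z = (2, 3)      # arbitrary positive integer exponents z_j
    k = 3           # order of the derivative

    f = sp.exp(x) * sp.Mul(*[(x - yj) ** (-zj) for yj, zj in zip(y, z)])

    # Direct k-th derivative, and the factor left after pulling out f(x):
    dkf = sp.diff(f, x, k)
    print(sp.simplify(dkf / f))

    # A conjectured closed form expr for the k-th derivative can be checked via
    # sp.simplify(dkf - expr) == 0, or numerically at a few sample points.
    ```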
  8. N:th derivative of exp(x)/x type function?

    I have a function f(x) = \exp(x)\prod\limits_{j=1}^{n} (x - y_j)^{-z_j} of which I need to find an expression for the k-th derivative with respect to x: \frac{\partial^k}{\partial x^k}f(x). I have been able to make a conjecture that seems to be correct...
  9. Average of function vs. function evaluated at average

    If the function is concave, the direction of the inequality does not depend on the shape of p. See http://en.wikipedia.org/wiki/Jensen%27s_inequality
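    For reference, the standard statement behind this: if \varphi is concave and X has any distribution p with finite mean, Jensen's inequality gives

        E[\varphi(X)] \le \varphi(E[X]),

    with the inequality reversed for convex \varphi, so which side is larger does not depend on the shape of p.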
  10. How to prove Bayes' rule for probability measures?

    Yes, reformulating this using probability measures instead of probability densities would allow me to prove what I want.
  11. How to prove Bayes' rule for probability measures?

    Yeah, I meant that dP_{\Theta\times\Omega}/dP_\Omega would not be a function \Theta\rightarrow[0,\infty), but a function \Theta\rightarrow\Gamma, where \Gamma is a set of probability measures on \Sigma_\Omega. It might be a bit of a stretch to call it a Radon-Nikodym derivative...
  12. How to prove Bayes' rule for probability measures?

    In the equation you refer to, the integration is not over a subset A \in \Sigma_\Theta\times\Sigma_\Omega but over a subset of \Theta.
  13. How to prove Bayes' rule for probability measures?

    I only meant that for every x \in \Theta there is one probability measure P_{\Omega | x} on \Sigma_\Omega over the space \Omega. The probability measures P_{\Omega | x} are thus conditional on x. This allows us to define a joint probability measure on the (Cartesian) product space (\Theta...
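    Presumably the joint measure is built along these lines (assuming x \mapsto P_{\Omega|x}(B) is measurable for every B \in \Sigma_\Omega): on measurable rectangles A \times B with A \in \Sigma_\Theta and B \in \Sigma_\Omega, set

        P_{\Theta\times\Omega}(A \times B) = \int_A P_{\Omega|x}(B) \, dP_\Theta(x),

    and extend to the product sigma-algebra in the usual way.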
  14. How to prove Bayes' rule for probability measures?

    As far as I can see, it's not quite what I'm looking for. What I'm trying to do above is to reformulate Bayes' rule for probability densities, usually expressed p(x|y) = \frac{p(y|x)}{p(y)}p(x), which follows trivially from the definition of a joint probability density p(x,y) = p(y|x)...
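    Spelling out that step for densities: from p(x,y) = p(y|x)\,p(x) = p(x|y)\,p(y) one gets, wherever p(y) > 0,

        p(x|y) = \frac{p(y|x)\,p(x)}{p(y)}, \qquad p(y) = \int p(y|x)\,p(x)\,dx,

    and the question is how to state and prove the analogous identity directly for the probability measures P_\Theta, P_{\Omega|x} and P_{\Theta\times\Omega}.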
  15. How to prove Bayes' rule for probability measures?

    Consider a probability space (\Theta, \Sigma_\Theta, P_\Theta), where P_\Theta is a probability measure on the sigma-algebra \Sigma_\Theta. Each element x \in \Theta maps to another probability measure P_{\Omega | x} on a sigma-algebra \Sigma_\Omega over another space \Omega. In this...