Thanks Fredrik,
I've looked briefly at Serge Lang's book and it's along the lines of what I'm looking for. He seems to use the norm on the vector space K that the function maps into to define a metric and ensure convergence, so maybe it would be sufficient to assume that K is a subset of a metric...
The context is that I have a function
f:\Gamma \to {\rm K}
where \Gamma is a set of probability measures all defined on the same sigma-algebra \Sigma, and {\rm K} is some subset of a vector space equipped with a partial ordering.
Now, \Gamma is also a probability space (\Gamma ,{\Sigma _\Gamma...
Given a measurable function f that is not real- or complex valued, but that maps into some vector space, what are the necessary conditions for it to be integrable?
I've looked through over 20 books on integration and measure theory, but they all only deal with integration of real (or...
I suggest you look up the definition of a convex/concave function. A function does not have to be differentiable to be convex; for instance, f(x) = |x| is convex even though it is not differentiable at x = 0. Also, a function that is not convex need not be concave, and vice versa.
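To make this concrete, here is a small numerical sketch (my own illustration, not from the thread) that checks the convexity inequality f(ta + (1-t)b) \le t f(a) + (1-t) f(b) on a grid of points. It confirms that |x| passes the test, while x^3 fails it in both directions, i.e. it is neither convex nor concave:

```python
import itertools

def is_convex_on_grid(f, xs, ts, tol=1e-12):
    """Check f(t*a + (1-t)*b) <= t*f(a) + (1-t)*f(b) for all grid points a, b
    and weights t in [0, 1]. A True result only supports convexity on the grid;
    a False result disproves it."""
    for a, b in itertools.product(xs, xs):
        for t in ts:
            if f(t * a + (1 - t) * b) > t * f(a) + (1 - t) * f(b) + tol:
                return False
    return True

xs = [k / 10 for k in range(-30, 31)]   # grid on [-3, 3]
ts = [k / 10 for k in range(11)]        # weights 0.0, 0.1, ..., 1.0

cube = lambda x: x**3
print(is_convex_on_grid(abs, xs, ts))                 # True: |x| is convex
print(is_convex_on_grid(cube, xs, ts))                # False: x^3 is not convex
print(is_convex_on_grid(lambda x: -cube(x), xs, ts))  # False: x^3 is not concave either
```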
I have a function
f(x) = \exp(x) \prod\limits_{j = 1}^{n} (x - y_j)^{-z_j}
of which I need to find an expression for the k-th derivative with respect to x:
\frac{\partial^k}{\partial x^k} f(x) .
I have been able to make a conjecture that seems to be correct...
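One way to test such a conjecture (a sketch of my own, not the poster's method) is to compute the derivatives symbolically for small n and k and compare against the conjectured closed form. Below, SymPy checks the n = 1, k = 1 case, where the product rule gives f'(x) = \exp(x)(x - y_1)^{-z_1}(1 - z_1/(x - y_1)); the symbols y1, z1 stand in for the parameters y_j, z_j:

```python
import sympy as sp

x, y1, z1 = sp.symbols('x y1 z1')

# The n = 1 case of f(x) = exp(x) * prod_j (x - y_j)^(-z_j)
f = sp.exp(x) * (x - y1)**(-z1)

# Conjectured first derivative from the product rule
conjecture = sp.exp(x) * (x - y1)**(-z1) * (1 - z1 / (x - y1))

# The difference simplifies to 0 when the conjecture is correct
residual = sp.simplify(sp.diff(f, x) - conjecture)
print(residual)  # 0
```

The same pattern extends to higher k and larger n by looping over `sp.diff(f, x, k)`, which is a cheap way to gather evidence before attempting an inductive proof.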
Yeah, I meant that {dP_{\Theta\times\Omega}}/{dP_\Omega} would not be a function \Theta\rightarrow[0,\infty), but a function \Theta\rightarrow\Gamma , where \Gamma is a set of probability measures on \Sigma_\Omega. It might be a bit of a stretch to call it a Radon-Nikodym derivative...
I only meant that for every x \in \Theta there is one probability measure P_{\Omega | x} on \Sigma_\Omega over the space \Omega. The probability measures P_{\Omega | x} are thus conditional on x.
This allows us to define a joint probability measure on the (Cartesian) product space (\Theta...
As far as I can see, it's not quite what I'm looking for. What I'm trying to do above is to reformulate Bayes' rule for probability densities, usually expressed
p(x|y) = \frac{p(y|x)}{p(y)}\, p(x)
which follows trivially from the definition of a joint probability density p(x,y) = p(y|x)...
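For a concrete sanity check of the density version (my own illustration, with an arbitrary example table), one can verify Bayes' rule numerically on a small discrete joint distribution, where the conditionals and marginals all come from the joint p(x, y):

```python
# An arbitrary joint distribution p(x, y) over two binary variables; sums to 1.
joint = {
    ('x0', 'y0'): 0.1, ('x0', 'y1'): 0.3,
    ('x1', 'y0'): 0.2, ('x1', 'y1'): 0.4,
}

def p_x(x):
    """Marginal p(x) = sum_y p(x, y)."""
    return sum(v for (xi, _), v in joint.items() if xi == x)

def p_y(y):
    """Marginal p(y) = sum_x p(x, y)."""
    return sum(v for (_, yi), v in joint.items() if yi == y)

def p_x_given_y(x, y):
    """Conditional p(x|y) = p(x, y) / p(y)."""
    return joint[(x, y)] / p_y(y)

def p_y_given_x(y, x):
    """Conditional p(y|x) = p(x, y) / p(x)."""
    return joint[(x, y)] / p_x(x)

# Bayes' rule: p(x|y) = p(y|x) p(x) / p(y)
lhs = p_x_given_y('x1', 'y0')
rhs = p_y_given_x('y0', 'x1') * p_x('x1') / p_y('y0')
print(abs(lhs - rhs) < 1e-12)  # True
```

The measure-theoretic question in the thread is precisely how to state this identity when the conditionals are measures P_{\Omega|x} rather than densities.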
Consider a probability space (\Theta, \Sigma_\Theta, P_\Theta), where P_\Theta is a probability measure on the sigma-algebra \Sigma_\Theta.
Each element x \in \Theta maps onto another probability measure P_{\Omega | x}, on a sigma-algebra \Sigma_\Omega on another space \Omega.
In this...