Taylor's Theorem for Vector-Valued Functions (Real Analysis)

1. Dec 23, 2013

Antiderivative

1. The problem statement, all variables and given/known data

"Formulate and prove an inequality which follows from Taylor's theorem and which remains valid for vector-valued functions."

2. Relevant equations

I know that Taylor's theorem states: if $f$ is a real function on $[a,b]$, $n$ is a positive integer, $f^{(n-1)}$ is continuous on $[a,b]$, $f^{(n)}(t)$ exists for every $t \in (a,b)$, and $\alpha, \beta$ are distinct points of $[a,b]$, then there exists a point $x$ between $\alpha$ and $\beta$ such that $f(\beta) = P(\beta) + \frac{f^{(n)}(x)}{n!}(\beta - \alpha)^{n}$, where $P$ is the Taylor polynomial $P(t) = \sum\limits_{k=0}^{n-1} \frac{f^{(k)}(\alpha)}{k!}(t - \alpha)^{k}$.
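As a quick numerical sanity check of the theorem as stated (my own sketch, not part of the problem; the choices $f = \exp$, $\alpha = 0$, $\beta = 1$, $n = 3$ are mine), the remainder $f(\beta) - P(\beta)$ should equal $\frac{f^{(n)}(x)}{n!}(\beta - \alpha)^n$ for some $x$ between $\alpha$ and $\beta$, and hence lie between the values of that expression at the endpoints:

```python
import math

# Sanity check of Taylor's theorem (scalar case) for f = exp, alpha = 0,
# beta = 1, n = 3. Every derivative of exp is exp, so f^{(k)}(0) = 1.
alpha, beta, n = 0.0, 1.0, 3

# Taylor polynomial of degree n-1 = 2 around alpha, evaluated at beta
P_beta = sum(math.exp(alpha) / math.factorial(k) * (beta - alpha) ** k
             for k in range(n))
remainder = math.exp(beta) - P_beta

# The theorem says remainder = exp(x)/3! * (beta - alpha)^3 for some x in
# (0, 1), so it must lie strictly between exp(0)/6 and exp(1)/6.
lo = math.exp(alpha) / math.factorial(n) * (beta - alpha) ** n
hi = math.exp(beta) / math.factorial(n) * (beta - alpha) ** n
print(lo < remainder < hi)  # True
```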

3. The attempt at a solution

I'm not sure how to set up an equation from here. I know that vector-valued functions can depend on several variables and often take values in the complex plane, but I'm not entirely sure how one proceeds in creating an inequality. Does it stem from the definition of a vector-valued function? I know that part of it is similar to a mean value theorem form...

Any help would be appreciated. Thanks!

2. Dec 24, 2013

HallsofIvy

You are misleading yourself. Yes, "vector valued functions" can depend on multiple variables or complex variables but that is NOT what is intended here. The given function here depends upon a single real variable and the generalization you are asked to make is not to the variable but to the value of the function.

You are correct that you cannot have an inequality with vectors since vector spaces do not have a linear order. Instead use the "norm" or length of the vector.

3. Dec 24, 2013

Antiderivative

I see. So we're trying to make a relationship for $f(t)$, not just the single-variable $t$ itself.

As for the length, what justifies using that as the manner in which one orders these vectors? I mean I can do the math (it's just $|z| = \sqrt{z\overline{z}}$ in the complex case), but why are we allowed to? Or is it implicitly just asking for that, since that's what will arise from Taylor's theorem anyway?

4. Dec 24, 2013

Antiderivative

I can derive the first-order version of the inequality, which is

$\left|\mathbf{f}(b) - \mathbf{f}(a)\right| \leq (b - a)\left|\mathbf{f}'(x)\right|$ for some $x \in (a,b)$.

As I said, this is a "mean value theorem" for vector-valued functions. How would I extend this to an $n^{th}$-order inequality so that it "follows from Taylor's theorem" directly? Looking at the first-order case, it seems like it should be quite possible, yet for some odd reason I cannot seem to fashion a proof that does so.

Can somebody help me out? I used the norm to derive this inequality, as HallsofIvy suggested.
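(The first-order inequality can also be checked numerically; this is my own sketch with a function of my choosing, $\mathbf{f}(t) = (\cos t, \sin t)$, for which $|\mathbf{f}'(t)| = 1$ everywhere, so the bound with $\sup_t |\mathbf{f}'(t)|$ in place of $|\mathbf{f}'(x)|$ reads $|\mathbf{f}(b) - \mathbf{f}(a)| \leq (b - a) \cdot 1$:)

```python
import math

# Check |f(b) - f(a)| <= (b - a) * sup|f'| for f(t) = (cos t, sin t).
# Here f'(t) = (-sin t, cos t), so |f'(t)| = 1 for every t.
a, b = 0.0, 1.0

def f(t):
    return (math.cos(t), math.sin(t))

lhs = math.hypot(f(b)[0] - f(a)[0], f(b)[1] - f(a)[1])  # chord length
rhs = (b - a) * 1.0                                     # sup |f'| = 1
print(lhs <= rhs)  # True: lhs = 2*sin(1/2), roughly 0.959
```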

Edit: Update, I figured something out. I think one cannot determine an exact expression for the "remainder" in Taylor's theorem when the function is vector-valued, so instead we can bound it using an upper bound on $|\mathbf{f}^{(n)}|$. Doing that generates an inequality, since the actual value must be $\leq$ the value obtained from the upper bound. I think the easiest route is to find the upper bound on $f^{(n)}$ for real-valued functions $f$ and then apply it to each component function when the function is vector-valued. My question now is whether that is a valid approach; it feels like it might be committing a faux pas in some form. Can somebody let me know if this is a plausible approach? Thanks, I appreciate it.
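(To illustrate the bound I have in mind numerically, here is my own sketch, with a function and points of my choosing: if $M = \sup_t |\mathbf{f}^{(n)}(t)|$, the target inequality would be $|\mathbf{f}(\beta) - \mathbf{P}(\beta)| \leq \frac{M}{n!}(\beta - \alpha)^n$. For $\mathbf{f}(t) = (\cos t, \sin t)$ every derivative has norm exactly 1, so $M = 1$:)

```python
import math

# Check |f(beta) - P(beta)| <= (M/n!)*(beta - alpha)^n for
# f(t) = (cos t, sin t), where M = sup |f^{(n)}| = 1.
alpha, beta, n = 0.0, 1.0, 3

def deriv(k, t):
    # k-th derivative of (cos t, sin t) is (cos(t + k*pi/2), sin(t + k*pi/2))
    return (math.cos(t + k * math.pi / 2), math.sin(t + k * math.pi / 2))

# Taylor polynomial of degree n-1 at alpha, evaluated at beta, componentwise
P = [sum(deriv(k, alpha)[i] / math.factorial(k) * (beta - alpha) ** k
         for k in range(n))
     for i in range(2)]
fb = (math.cos(beta), math.sin(beta))

lhs = math.hypot(fb[0] - P[0], fb[1] - P[1])
rhs = 1.0 / math.factorial(n) * (beta - alpha) ** n  # M = 1
print(lhs <= rhs)  # True
```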

Last edited: Dec 24, 2013