Are you sure this is a well-defined vector, though?
The matrix notation, \vec{u} = \begin{bmatrix}x_1 \\ x_2 \\ \vdots \\ x_n\end{bmatrix} is defined such that x_1, \dots, x_n are the components of your vector in a given basis of your vector space, so they must be scalars.
That's why I don't...
Another way to prove it using basic linear algebra would be as follows.
First of all, the set E of sequences (u_n) such that u_{n+2} = u_{n+1} + u_n for all n \in \mathbf N is a vector space over \mathbf R (this is very easy to show).
Also, E is of dimension 2 (intuitively, any sequence of E is...
Even for n as large as 1 million, the sum \sum\limits_{k=1}^{+\infty} \left\lfloor\dfrac{n}{5^k}\right\rfloor only contains 8 nonzero terms, namely 200 000, 40 000, 8 000, 1 600, 320, 64, 12, 2, which makes a total of 249 998 trailing zeroes.
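If you want to check this yourself, here is a short sketch of that sum (Legendre's formula for the exponent of 5 in n!); the function name is just for illustration:

```python
def trailing_zeros_factorial(n):
    """Count trailing zeros of n! by summing floor(n / 5**k),
    i.e. Legendre's formula for the exponent of 5 in n!."""
    total = 0
    power = 5
    while power <= n:  # terms vanish once 5**k > n
        total += n // power
        power *= 5
    return total

print(trailing_zeros_factorial(1_000_000))  # 249998
```

The loop stops as soon as 5^k exceeds n, which is why only 8 terms contribute for n = 1 000 000.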
Let me try to explain it formally. I'll be using the 2-dimensional plane \mathbf R^2, but you can easily generalize this with \mathbf R^n.
In affine geometry, we can see \mathbf R^2 as a set:
of points (let's simply write \mathbf R^2 for the set \mathbf R^2 seen as a set of points)
or a...
Actually, having 1 as the lower bound of integration makes it more coherent, since it makes \ln the inverse function of the exponential (with another lower bound a, the inverse would be the exponential multiplied by the constant a).
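To make that parenthetical concrete, here is the short computation, writing L_a for the integral with lower bound a (the name L_a is only for this sketch):

```latex
L_a(x) = \int_a^x \frac{dt}{t} = \ln(x) - \ln(a),
\qquad\text{so}\qquad
L_a(y) = x \iff \ln(y) = x + \ln(a) \iff y = a\,e^x.
```

In particular, a = 1 gives back y = e^x, the plain exponential.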
The first fundamental theorem only states that the derivative of F is f.
And when you differentiate, constants don't matter, so the fact that F is an antiderivative of f is actually independent of which a you choose. More concretely, and without any mathematical rigor ...
Hi,
I'm guessing you meant:
If you're interested in a proof, here you go.
Let x \in \mathbb R, n \in \mathbb N.
I'll be using radians instead of degrees, so I'll show that we have:
\sum_{k=0}^n \sin(kx) = f(x) \sin^2\left(\dfrac{nx}{2}\right)+\dfrac{\sin(nx)}{2}
Where : f(x) =...
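The sum also has the well-known equivalent closed form \sum_{k=0}^n \sin(kx) = \dfrac{\sin\left(\frac{nx}{2}\right)\sin\left(\frac{(n+1)x}{2}\right)}{\sin\left(\frac{x}{2}\right)}, and a quick numerical check of it (purely an illustration, not part of the proof) looks like this:

```python
import math

def sine_sum(n, x):
    """Direct computation of sum_{k=0}^{n} sin(k*x)."""
    return sum(math.sin(k * x) for k in range(n + 1))

def closed_form(n, x):
    """Standard closed form sin(nx/2) * sin((n+1)x/2) / sin(x/2)."""
    return math.sin(n * x / 2) * math.sin((n + 1) * x / 2) / math.sin(x / 2)

# Compare the two on a few values of n and x (x not a multiple of 2*pi).
for n in (1, 5, 50):
    for x in (0.3, 1.0, 2.7):
        assert math.isclose(sine_sum(n, x), closed_form(n, x), abs_tol=1e-12)
print("identity verified")
```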
Isn't this a much simpler way to prove that f'(x) \propto \dfrac{1}{x}?
If you suppose f differentiable on its domain, and if you differentiate the equation with respect to y, you get that f satisfies:
\forall x, y \in \mathbb R, \hspace{10pt} xf'(xy) = f'(y)
By choosing y = 1:
\forall x \in...
Usually, the set of sequences whose elements lie in a set A is denoted by A^{\mathbb N}.
More generally, if A and B are two sets, then the set of functions from A to B is written B^A.
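The notation is motivated by counting: when A and B are finite, there are exactly |B|^{|A|} functions from A to B. A small illustration (enumerating each function as its tuple of values, one per element of A):

```python
from itertools import product

A = ("a", "b", "c")   # domain, |A| = 3
B = (0, 1)            # codomain, |B| = 2

# Each function A -> B is determined by its tuple of values on the elements of A.
functions = list(product(B, repeat=len(A)))

print(len(functions), len(B) ** len(A))  # 8 8
```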
Using the inverse function derivative theorem, you can show that \ln' = \frac{1}{\exp' \circ \ln} = \frac{1}{\exp \circ \ln} = \frac{1}{\text{Id}}
Thus \ln'(x) = \frac{1}{x}
And by integrating, using \ln(1) = 0:
\int_1^x \frac{dt}{t} = \int_1^x \ln'(t)\,dt = \ln(x) - \ln(1) = \ln(x)
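As a quick numerical sanity check of \int_1^x \frac{dt}{t} = \ln(x), here is a midpoint-rule approximation (just an illustration; the function name is mine):

```python
import math

def integral_reciprocal(x, steps=100_000):
    """Approximate the integral of 1/t from 1 to x with the midpoint rule."""
    h = (x - 1) / steps
    return sum(h / (1 + (i + 0.5) * h) for i in range(steps))

for x in (2.0, 5.0, 10.0):
    assert math.isclose(integral_reciprocal(x), math.log(x), abs_tol=1e-6)
print("integral matches log")
```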