How powerful are continued fraction representations?
From what I understand, they can be used to represent some irrational numbers exactly.
So, could they represent any root of an nth degree polynomial equation?
Especially where n > 4, since 5th degree roots are not guaranteed to have an...
Can I have more clues, please? I'm getting nowhere, and I'm not that good with integral calculus since we were never taught it in high school or university.
I tried that, but to know whether it is indeed monotonic I have to show that the selected expression in the term in the picture I attached is always < 1 or always > 1.
So the proof of the original problem requires a proof of:
n*(log(n+1)-log(n)) < 1
Right now it looks pretty circular.
I need to prove:
(n+1)*(log(n+1) - log(n)) > 1 for all n > 0.
I tried exponentiating it and got
( (n+1)/n )^(n+1) > e.
And from there I couldn't get any further, but I do know it is true just from looking at its graph.
Could anybody help me please?
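Not a proof, but a quick numeric sanity check (my own Python sketch, not from the thread) that both inequalities discussed here hold at sample values of n:

```python
import math

# Check, for several n, that
#   n     * (log(n+1) - log(n)) < 1   i.e.  (1 + 1/n)^n     < e
#   (n+1) * (log(n+1) - log(n)) > 1   i.e.  (1 + 1/n)^(n+1) > e
for n in (1, 2, 10, 100, 10**6):
    gap = math.log(n + 1) - math.log(n)
    assert n * gap < 1 < (n + 1) * gap
print("both inequalities hold for the sampled n")
```

This only samples a few points, of course; the actual proof still needs the monotonicity argument.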
Ohh, the function log(F(x))
-----------------------------
So this means that
F*(x) = ( log(F(x)) )' = F'(x) / F(x), right?
What else does it imply or what else could it be applied to?
If you take the logarithm, the denominator is still the same Δx, but the numerator has logs wrapped around it:
log(F(x+Δx)) - log(F(x)) instead of F(x+Δx) - F(x).
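A minimal sketch of that modified quotient (my own illustration, assuming Python; the helper name `log_deriv` is made up). For small Δx it approaches F'(x)/F(x), the ordinary derivative of log(F(x)):

```python
import math

def log_deriv(F, x, h=1e-6):
    """Difference quotient with logs wrapped around the numerator."""
    return (math.log(F(x + h)) - math.log(F(x))) / h

# Example: F(x) = x^3, so F'(x)/F(x) = 3/x, which is 1.5 at x = 2.
approx = log_deriv(lambda x: x**3, 2.0)
assert abs(approx - 1.5) < 1e-4
```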
I have been playing around with calculus for a while, and I wondered what it would be like to make some changes to the definition of the derivative.
I'd like to look at the usual definition of the derivative this way (everything is under lim Δx→0):
F(x+Δx) - F(x) = F'(x) * Δx
The Δx factor...
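For concreteness, that definition can be watched converging numerically (a sketch of my own, not part of the question):

```python
# The ordinary difference quotient (F(x+dx) - F(x)) / dx approaches
# F'(x) as dx -> 0.  Here F(x) = x**2, so F'(3) = 6.
F = lambda x: x**2
x = 3.0
for dx in (1e-1, 1e-3, 1e-6):
    q = (F(x + dx) - F(x)) / dx
    print(dx, q)  # the quotient tends toward 6 as dx shrinks
```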
I've been taught that any system of linear equations has a corresponding matrix.
Why do people sometimes use systems of linear equations to describe something and other times matrices?
Is it all just a way of writing things down faster or are there things you could do to matrices that...
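Part of the answer is that a matrix lets you treat the whole system as one object, with determinants, inverses, and row operations acting on it at once. As a small illustration (my own sketch, not from the thread), here is a 2×2 system solved through its coefficient matrix via Cramer's rule:

```python
# System:  2x + y = 5
#          x  - y = 1
# Matrix form A @ [x, y] = b:
A = [[2, 1],
     [1, -1]]
b = [5, 1]

# Determinant of the 2x2 coefficient matrix.
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]   # = -3

# Cramer's rule: replace a column of A by b and divide determinants.
x = (b[0] * A[1][1] - A[0][1] * b[1]) / det   # = 2.0
y = (A[0][0] * b[1] - b[0] * A[1][0]) / det   # = 1.0
print(x, y)
```

The determinant test (det != 0) is exactly the kind of thing you can ask of the matrix but not easily of the equations written out longhand.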
Given a new constraint that a+b = c+d = 1,
does showing that
d/da[ -(a*log(a) + (1-a)*log(1-a)) ] * d/da[ a*(1-a) ]
is always greater than or equal to zero prove the original claim?
Since satisfying this means that the two functions grow and shrink together (albeit not in the exact...
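Whether the sign condition alone proves the claim is a separate question, but the condition itself can be checked numerically on (0, 1). A sketch of my own, using the closed forms of the two derivatives:

```python
import math

# H(a) = -(a*log(a) + (1-a)*log(1-a))  =>  H'(a) = log((1-a)/a)
# g(a) = a*(1-a)                       =>  g'(a) = 1 - 2a
# Both change sign at a = 1/2, so their product should be >= 0.
for i in range(1, 100):
    a = i / 100
    product = math.log((1 - a) / a) * (1 - 2 * a)
    assert product >= 0
print("H'(a) * g'(a) >= 0 at all sampled a")
```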
The closest I've gotten: I tried taking the log of both sides of the 1st inequality, giving log(a)+log(b) < log(c)+log(d), then tried to make one side look like the 2nd inequality, but then I realized I was going in circles.
How do I use the fact that the function is concave down?
Can somebody help me please? I've tried solving this for hours but still couldn't get it.
Given that a, b, c, d are positive integers and a+b=c+d.
Prove that if a*b < c*d,
then a*log(a) + b*log(b) > c*log(c) + d*log(d).
How do I do it?
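Before hunting for a proof, it can help to brute-force the claim over small integers to make sure it is stated correctly. A quick Python check of my own:

```python
import math

def f(a, b):
    """a*log(a) + b*log(b), the quantity from the claim."""
    return a * math.log(a) + b * math.log(b)

# For every pair of splits (a, b) and (c, d) of the same sum s,
# verify: a*b < c*d  implies  f(a, b) > f(c, d).
for s in range(2, 31):
    for a in range(1, s):
        for c in range(1, s):
            b, d = s - a, s - c
            if a * b < c * d:
                assert f(a, b) > f(c, d)
print("claim holds for all sums up to 30")
```

Intuitively, with a+b fixed, a smaller product a*b means the pair is more spread out, and x*log(x) is convex, which is where the concavity/convexity hint comes in.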