Math Challenge - April 2020

Summary
The Math Challenge - April 2020 discussion covers various mathematical problems and their solutions, including topics in functional analysis, statistics, differential equations, and number theory. Key problems involve showing the unique continuation of linear operators in normed spaces, testing hypotheses for normally distributed variables, and solving initial value problems for differential equations. Participants also explore properties of continuous functions and polynomials, as well as combinatorial problems related to tiling rectangles. The thread emphasizes the importance of clear, complete solutions and rigorous mathematical reasoning.
  • #31
archaic said:
We have ##n\in\mathbb{N}=\{0,\,1,\,2,\,...\}## (just clarifying because I initially thought of ##\mathbb{N}## as not having ##0##).
Since all the zeroes of ##P(x)## are negative, and its domain is ##\mathbb{R}##, ##\frac{1}{P(x)}## is well defined for ##x\geq0##.
Moreover, the sign of ##\frac{1}{P(x)}## is constant for ##x>r_0##, where ##r_0## is the largest zero of ##P(x)##, and agrees with the sign of ##P(x)##. It is positive there because ##P(x)\underset{\infty}{\sim}x^n## and ##x^n>0## as ##x\to+\infty##.
The integral ##\int_1^\infty\frac{dx}{P(x)}## is improper only at ##\infty##, because of what was said above about the zeroes of ##P(x)##. I will use this limit comparison test for the convergence of improper integrals:
If ##f## and ##g## are two (piecewise) continuous, positive real functions on ##[a,\,b)##, where ##a\in\mathbb{R}## and ##b## is either real or infinite, and ##f(x)\underset{b}{\sim}g(x)##, then ##\int_a^bf(x)\,dx## converges ##\Leftrightarrow\int_a^bg(x)\,dx## converges.
$$\lim_{x\to\infty}\frac{\frac{1}{P(x)}}{\frac{1}{x^n}}=\lim_{x\to\infty}\frac{x^n}{P(x)}=\lim_{x\to\infty}\frac{x^n}{x^n\left(1+\frac{a_{n-1}}{x}+\frac{a_{n-2}}{x^2}+...+\frac{a_0}{x^n}\right)}=1$$
$$\int_1^\infty\frac{dx}{x^n}\text{ converges }\Leftrightarrow n>1\text{, by p-test}$$
Using the theorem stated above, ##\int_1^\infty\frac{dx}{P(x)}## converges if and only if ##n>1##, or ##n\geq2##.
I still don't like the inaccuracy of "behaves like" because it doesn't capture lower terms quantitatively. We have ##\dfrac{1}{x^n}<\dfrac{1}{x^{n-1}}##, so why can we neglect those terms of lower orders? In other words: Why is ##g(x)=x^{-n}## suited, if ##f(x)=P(x)^{-1}## is actually bigger than ##g(x)?##

A better function ##g(x)## should be chosen, one that is actually greater than ##x^{-n}## and not merely "near" it, so that the estimate ##\int 1/P(x)\, dx < \int g(x)\,dx## is accurate.

Your argument goes: "I have a function (##x^{-n}##), which is possibly smaller than ##1/P(x)##, so it cannot guarantee convergence, but I don't care, since my error is small." However, you did not show that this heuristic is allowed.
 
Last edited:
  • #32
fresh_42 said:
I still don't like the inaccuracy of "behaves like" because it doesn't capture lower terms quantitatively. We have ##\dfrac{1}{x^n}<\dfrac{1}{x^{n-1}}##, so why can we neglect those terms of lower orders? In other words: Why is ##g(x)=x^{-n}## suited, if ##f(x)=P(x)^{-1}## is actually bigger than ##g(x)?##

A better function ##g(x)## should be chosen, one that is actually greater than ##x^{-n}## and not merely "near" it, so that the estimate ##\int 1/P(x)\, dx < \int g(x)\,dx## is accurate.

Your argument goes: "I have a function (##x^{-n}##), which is possibly smaller than ##1/P(x)##, so it cannot guarantee convergence, but I don't care, since my error is small." However, you did not show that this heuristic is allowed.
It is actually a theorem:
(attached image: statement of the limit comparison theorem)

Do you want me to prove it?
 
  • #33
archaic said:
It is actually a theorem:
View attachment 259904
Do you want me to prove it?
You don't need the proof for a specific example. An idea, i.e. a better choice of ##g(x)##, would do the job without the machinery. The proof must use a measure for the error, too, so this idea should be applied to the example. My version takes only four short lines, and it makes clear why this is true despite the fact that ##1/x^n < 1/x^{n-1}##. This is the crucial point of the entire question, and hiding it in a theorem doesn't really help understanding.
 
Last edited:
  • #34
fresh_42 said:
You don't need the proof for a specific example. An idea, i.e. a better choice of ##g(x)##, would do the job without the machinery. The proof must use a measure for the error, too, so this idea should be applied to the example. My version takes only four short lines, and it makes clear why this is true despite the fact that ##1/x^n < 1/x^{n-1}##. This is the crucial point of the entire question, and hiding it in a theorem doesn't really help understanding.
Hm, maybe you're looking for ##f(x)=\frac{c}{x^n}## where ##c>1##, so that ##1/P(x)<f(x)## for large ##x##?
With that, I can also do a comparison test:
Let ##g(x)=1/f(x)##. We know that after some ##x=N>0## we have ##P(x)\geq g(x)## since ##\lim_{x\to\infty}\left(g(x)-P(x)\right)=-\infty<0##.
It follows that for ##x\geq N## we get ##\frac{1}{P(x)}\leq\frac{c}{x^n}##, and so ##\int_N^\infty\frac{dx}{P(x)}\leq\int_N^\infty c\frac{dx}{x^n}##.
If ##n\geq2##, the second integral converges by the p-test, so the first does too by the comparison test. And since ##\int_1^N\frac{dx}{P(x)}## converges (because ##\frac{1}{P(x)}## is continuous on ##[1,\,N]##), ##\int_1^N\frac{dx}{P(x)}+\int_N^\infty\frac{dx}{P(x)}=\int_1^\infty\frac{dx}{P(x)}## also converges.
If ##n=1##, then ##\int_1^\infty\frac{dx}{x+a_0}=\int_{1+a_0}^\infty\frac{du}{u}## which diverges by the p-test.
If ##n=0##, then ##\int_1^\infty dx## surely diverges.
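As a numeric sanity check of this comparison argument (the cubic ##P## and the constant ##c## below are illustrative choices of mine, not part of the problem), a short script can verify the pointwise bound and estimate the integral:

```python
# Sanity check: P(x) = (x + 1)(x + 2)(x + 3) has only negative roots (n = 3).
# Verify 1/P(x) <= c/x^n for x >= 1 and estimate the improper integral; it
# must stay below the comparison bound: integral of c/x^3 on [1, inf) = c/2.

def P(x):
    return (x + 1) * (x + 2) * (x + 3)

c, n = 2.0, 3

# pointwise bound on an integer grid
assert all(1 / P(x) <= c / x**n for x in range(1, 10001))

# trapezoidal estimate of the integral of 1/P on [1, 991]; the tail beyond
# that point is of order 1/(2 * 991^2), i.e. negligible here
h = 0.01
xs = [1 + k * h for k in range(99001)]
integral = sum(h / 2 * (1 / P(a) + 1 / P(b)) for a, b in zip(xs, xs[1:]))

assert 0 < integral < c / 2   # finite, consistent with the p-test bound
```

For this particular ##P##, partial fractions give the exact value ##\frac{1}{2}\ln\frac{9}{8}\approx 0.0589##, comfortably below the bound ##c/2=1##.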
 
Last edited:
  • #35
archaic said:
Hm, maybe you're looking for ##f(x)=\frac{c}{x^n}## where ##c>1##, so that ##1/P(x)<f(x)## for large ##x##?
With that, I can also do a comparison test:
Let ##g(x)=1/f(x)##. We know that after some ##x=N>0## we have ##P(x)\geq g(x)## since ##\lim_{x\to\infty}\left(g(x)-P(x)\right)=(1-c)/c<0##.
It follows that for ##x\geq N## we get ##\frac{1}{P(x)}\leq\frac{c}{x^n}##, and so ##\int_N^\infty\frac{dx}{P(x)}\leq\int_N^\infty c\frac{dx}{x^n}##.
If ##n\geq2##, the second integral converges by the p-test, so the first also does by comparison test. And since ##\int_1^N\frac{dx}{P(x)}## converges (because ##\frac{1}{P(x)}## is well defined for ##x\geq0##), ##\int_1^N\frac{dx}{P(x)}+\int_N^\infty\frac{dx}{P(x)}=\int_1^\infty\frac{dx}{P(x)}## also converges.
If ##n=1##, then ##\int_1^\infty\frac{dx}{x+a_0}=\int_{1+a_0}^\infty\frac{du}{u}## which diverges by the p-test.
If ##n=0##, then ##\int_1^\infty dx## surely diverges.
Nothing follows from this, and how can we know what you claim without specifying the functions and constants you use? How do you find such a ##c##? What are ##f## and ##g##? You are being sloppy and hand-wavy in your arguments.

We have to show that
$$
\int_1^\infty \left|\dfrac{1}{P(x)}\right|\,dx = \int_1^\infty \left|\dfrac{1}{x^n+a_{n-1}x^{n-1}+\ldots+a_0}\right|\,dx < \infty
$$
If you use a comparison with ##\int_1^\infty \left|P(x)^{-1}\right|\,dx \leq \int_1^\infty g(x)\,dx <\infty## then I want to know what ##g(x)## is and why the inequality holds. The argument is obviously wrong if we integrate over zeroes of ##P(x)##. Where did you use that we do not have this case, except within the theorem somewhere? E.g. ##g(x)= x^n > x^n-x^2 =:P(x)## does not work. So which ##g(x)## can we use and why?
 
  • #36
fresh_42 said:
##x^n-x^2=:P(x)##
This can't be as it has a positive zero.
I have shown in my second solution post that ##P(x)## is positive after its greatest zero, which is negative, given the premises, so ##|P(x)|=P(x)## for ##x\geq0##.
##f(x)=c/x^n## where ##c\in(1,\,\infty)##, any number does the job, and ##g(x)=1/f(x)##.
Why does any ##c## in that interval do the job? Because (for ##n\geq2##):
$$\lim_{x\to\infty}\left(g(x)-P(x)\right)=\lim_{x\to\infty}\left(\frac{x^n}{c}-x^n-a_{n-1}x^{n-1}-...-a_0\right)=-\infty<0$$
This tells me that, at some point, ##P(x)>g(x)\Leftrightarrow \frac{1}{P(x)}<f(x)##.
EDIT: I have initially written the result of that limit as ##(1-c)/c##, even in the previous post (I corrected it), sorry!
 
Last edited:
  • #37
This is what I meant by sloppy: the limit above is infinite, not ##(1-c)/c##, and one of the last occurrences of ##P(x)## should probably be ##P(x)^{-1}##. A constant is not sufficient as long as you have no control over the other coefficients: they can either exceed or remain below the bounds any specific choice of ##c## gives. So you either choose a non-constant numerator, or you deal with the arbitrary terms of lower degree.

Of course, the negative zeroes are the key, but you only used the leading term of the polynomial, so all others can be arbitrary. The negative roots constrain this arbitrariness, but I cannot see how. Given any polynomial, you cannot see where the roots are, so the low order terms play a role and cannot be ignored.

The error is less in the proof, which is valid if we apply the theorem and only if, but rather in your logic.
 
  • #38
fresh_42 said:
Of course, the negative zeroes are the key, but you only used the leading term of the polynomial, so all others can be arbitrary. The negative roots constrain this arbitrariness, but I cannot see how. Given any polynomial, you cannot see where the roots are, so the low order terms play a role and cannot be ignored.
##P(x)## can be written as ##(x+|r_n|)(x+|r_{n-1}|)...(x+|r_1|)##, where the ##r_i## are its (negative) roots, by the fundamental theorem of algebra. Since every ##|r_i|>0##, the coefficients of the expanded product are all positive, and thus:
$$x^n \leq x^n+a_{n-1}x^{n-1}\leq...\leq P(x)\Leftrightarrow \frac{1}{x^n}\geq\frac{1}{P(x)}\Leftrightarrow\int_1^\infty\frac{dx}{x^n}\geq\int_1^\infty\frac{dx}{P(x)}$$
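This expansion can be checked mechanically (the roots below are an arbitrary illustrative choice): the coefficient of ##x^{n-k}## in ##\prod_i(x+|r_i|)## is the ##k##-th elementary symmetric polynomial of the ##|r_i|##, hence positive, which gives ##x^n\le P(x)## for ##x\ge 0##.

```python
# Expand prod_i (x + |r_i|) via elementary symmetric polynomials and check
# that all coefficients are positive, hence x^n <= P(x) for x >= 0.
# The roots are an arbitrary illustrative choice (all negative).
from itertools import combinations
from math import prod

roots = [-0.5, -2.0, -3.5]
mods = [abs(r) for r in roots]
n = len(mods)

# coefficient of x^(n-k) is e_k(|r_1|, ..., |r_n|)
coeffs = [sum(prod(c) for c in combinations(mods, k)) for k in range(n + 1)]
assert all(a > 0 for a in coeffs)

def P(x):
    return sum(a * x**(n - k) for k, a in enumerate(coeffs))

# positive coefficients give x^n <= P(x), so 1/P(x) <= 1/x^n for x >= 1
assert all(x**n <= P(x) for x in [0.0, 0.3, 1.0, 7.0, 100.0])
```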
 
Last edited:
  • #39
fresh_42 said:
This is what I meant by sloppy: the limit above is infinite, not ##(1-c)/c##, and one of the last occurrences of ##P(x)## should probably be ##P(x)^{-1}##.
Thank you, corrected it also.
 
  • #40
fresh_42 said:
A constant is not sufficient as long as you have no control over the other coefficients: they can either exceed or remain below the bounds any specific choice of ##c## gives.
But since the limit of the difference is ##-\infty##, mustn't it be the case that, after some ##x##, ##P(x)## is strictly bigger than ##x^n/c## forever?
 
  • #41
archaic said:
##P(x)## can be written as ##x(x+|r_{n-1}|)...(x+|r_1|)##, where ##r_i## are the negative roots, because it is a polynomial. Since ##|r_i|>0##, and thus the coefficients are all positive:
$$x^n \leq x^n+a_{n-1}x^{n-1}\leq...\leq P(x)\Leftrightarrow \frac{1}{x^n}\geq\frac{1}{P(x)}\Leftrightarrow\int_1^\infty\frac{dx}{x^n}\geq\int_1^\infty\frac{dx}{P(x)}$$
Well, you used methods which are normally beyond the scope of high school students: the fundamental theorem of algebra here, the integral comparison earlier.

It would have been much easier and more straightforward with ##g(x)=x^{-n+(1/2)}##.

And ... there is again a sloppiness: ##P(0)\neq 0##.
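To illustrate why ##g(x)=x^{-n+(1/2)}## does the job (the quadratic below is my own example, not from the thread): since ##x^{n-1/2}/P(x)\to 0##, eventually ##1/P(x)\le x^{-n+1/2}##, and ##\int_1^\infty x^{-n+1/2}\,dx## converges for ##n\ge 2## because the exponent ##n-\frac12## exceeds ##1##.

```python
# Illustration with n = 2: P(x) = (x + 1)(x + 4) has roots -1 and -4.
# g(x) = x^(-n + 1/2) = x^(-1.5) is integrable on [1, inf) and
# dominates 1/P(x) -- here already from x = 1 on.

def P(x):
    return (x + 1) * (x + 4)

n = 2

def g(x):
    return x ** (-n + 0.5)

# the bound 1/P(x) <= g(x) holds on the whole grid
assert all(1 / P(x) <= g(x) for x in range(1, 10001))
```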
 
  • #42
fresh_42 said:
And ... there is again a sloppiness: ##P(0)\neq 0##.
:cry:
 
  • #43
Question 13 a) (after the error being pointed out by @archaic, he deserves the credits)

$$ \lim_{x\to \pi/2} \frac{x-\pi/2}{\sqrt{1-\sin x}} $$ Let ##y= x- \pi/2##. $$ \lim_{y\to 0} \frac{y}{\sqrt{1- \sin (y+\pi/2)}} = \lim_{y\to 0} \frac{y}{\sqrt{1- \cos y}} = \lim_{y\to 0} \frac{y}{\sqrt{2 \sin^2 (y/2)}} = \lim_{y\to 0} \frac{1}{\sqrt 2} \frac{y}{\sqrt{\sin^2 (y/2)}} $$
Now, let’s look at a crucial fact: ##\sqrt{x^2} = |x|##. Writing ##\sqrt{x^2} = x## breaks down when ##x## is negative, so if ##x## can take both positive and negative values we must keep the absolute value, ##|x|##.
Coming back to our limit $$ \frac{1}{\sqrt 2} \lim_{y\to 0} \frac{y}{|\sin (y/2)|} $$
$$ \frac{1}{\sqrt 2} \lim_{y\to 0^-} \frac{y/(y/2)}{-\sin (y/2)/(y/2)} = -\sqrt 2$$ And $$ \frac{1}{\sqrt 2} \lim_{y\to 0^+} \frac{y/(y/2)}{\sin (y/2)/(y/2)} = \sqrt 2 $$
Hence, the limit doesn’t exist.
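The two one-sided limits can also be sanity-checked numerically (a quick check, not a proof):

```python
# Numeric check of the one-sided limits of (x - pi/2)/sqrt(1 - sin x)
# at x = pi/2: approximately -sqrt(2) from the left, +sqrt(2) from the right.
import math

def f(x):
    return (x - math.pi / 2) / math.sqrt(1 - math.sin(x))

left = f(math.pi / 2 - 1e-5)    # approach from below
right = f(math.pi / 2 + 1e-5)   # approach from above

assert abs(left + math.sqrt(2)) < 1e-3   # left limit near -sqrt(2)
assert abs(right - math.sqrt(2)) < 1e-3  # right limit near +sqrt(2)
```

The two different values confirm that the two-sided limit does not exist.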
 
  • #44
Problem #4
Solve$$
y'(x)=y^2(x)-2(x+\frac{1}{2})y(x) +x^2 +x+1
$$We complete the square on r.h.s.$$
y'(x)=(y-( x+\frac{1}{2}))^2 +\frac{3}{4}
$$and make the substitution,$$
v(x)=y(x)-( x+\frac{1}{2})\\
v'(x)=y'(x) - 1
$$to obtain,$$
v'(x)=v^2(x) -\frac{1}{4}$$Separate variables,$$
\int \frac{dv}{(v^2 -\frac{1}{4})}= \int dx\\

\int \frac{dv}{(v^2 -\frac{1}{4})}=\int \frac{dv}{(v+\frac{1}{2})(v-\frac{1}{2})}=\int\left[\frac{-1}{(v+\frac{1}{2})} +\frac{1}{(v-\frac{1}{2})} \right]dv\\

=\ln\left(\frac{v-\frac{1}{2}}{v+\frac{1}{2}}\right)\\

\frac{(v-\frac{1}{2})}{(v+\frac{1}{2})}=Ce^x$$ where C is the constant of integration.$$

v(x)=\frac{1}{2}\frac{(1+Ce^x)}{(1-Ce^x)}\\

y(x)=\frac{1}{2}[\frac{(1+Ce^x)}{(1-Ce^x)}+2x+1]\\

y(0)=\frac{1}{1-C}$$ For the case ##y(0)=0## we take the limit of the solution family as ##C## goes to infinity:$$

\lim_{C \rightarrow \infty}y(x)=x$$For the case ##y(0)=1##,$$
C=0\\
y=1+x$$For the case ##y(0)=2##$$
C=\frac{1}{2}\\
y(x)=\frac{1}{2}[\frac{(1+\frac{1}{2}e^x)}{(1-\frac{1}{2}e^x)}+2x+1]
$$
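As an independent spot-check of these three cases (a numeric sketch, not part of the original post), one can verify that each claimed ##y## satisfies ##y'=y^2-2(x+\frac{1}{2})y+x^2+x+1##, using a central difference for the ##C=\frac{1}{2}## case:

```python
# Verify the claimed solutions of y' = y^2 - 2(x + 1/2) y + x^2 + x + 1.
import math

def rhs(x, y):
    return y**2 - 2 * (x + 0.5) * y + x**2 + x + 1

# y(0) = 0 case: y = x has y' = 1, and rhs(x, x) = 1 for all x
assert abs(rhs(0.7, 0.7) - 1.0) < 1e-9
# y(0) = 1 case: y = 1 + x also has y' = 1
assert abs(rhs(0.7, 1.7) - 1.0) < 1e-9

def y3(x):
    # the y(0) = 2 case with C = 1/2 (valid for x < ln 2)
    return 0.5 * ((1 + 0.5 * math.exp(x)) / (1 - 0.5 * math.exp(x)) + 2 * x + 1)

h = 1e-6
for x in [0.0, 0.1, 0.3]:
    dy = (y3(x + h) - y3(x - h)) / (2 * h)   # central-difference derivative
    assert abs(dy - rhs(x, y3(x))) < 1e-4
```

Note the third solution blows up at ##x=\ln 2##, where the denominator ##1-\frac12 e^x## vanishes.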
(attached plot of the three solutions)
 
  • #45
Fred Wright said:
Problem #4

Solve$$
y'(x)=y^2(x)-2(x+\frac{1}{2})y(x) +x^2 +x+1
$$We complete the square on r.h.s.$$
y'(x)=(y-( x+\frac{1}{2}))^2 +\frac{3}{4}
$$and make the substitution,$$
v(x)=y(x)-( x+\frac{1}{2})\\
v'(x)=y'(x) - 1
$$to obtain,$$
v'(x)=v^2(x) -\frac{1}{4}$$Separate variables,$$
\int \frac{dv}{(v^2 -\frac{1}{4})}= \int dx\\

\int \frac{dv}{(v^2 -\frac{1}{4})}=\int \frac{dv}{(v+\frac{1}{2})(v-\frac{1}{2})}=\int\left[\frac{-1}{(v+\frac{1}{2})} +\frac{1}{(v-\frac{1}{2})} \right]dv\\

=\ln\left(\frac{v-\frac{1}{2}}{v+\frac{1}{2}}\right)\\

\frac{(v-\frac{1}{2})}{(v+\frac{1}{2})}=Ce^x$$ where C is the constant of integration.$$

v(x)=\frac{1}{2}\frac{(1+Ce^x)}{(1-Ce^x)}\\

y(x)=\frac{1}{2}[\frac{(1+Ce^x)}{(1-Ce^x)}+2x+1]\\

y(0)=\frac{1}{1-C}$$ For the case ##y(0)=0## we take the limit of y(C) as C goes to infinity
I assume that means that y(x)=x is the solution.
$$lim_{C \rightarrow \infty}y(C)=x$$For the case ##y(0)=1##,$$
C=0\\
y=1+x$$For the case ##y(0)=2##$$
C=\frac{1}{2}\\
y(x)=\frac{1}{2}[\frac{(1+\frac{1}{2}e^x)}{(1-\frac{1}{2}e^x)}+2x+1]
$$
Have you cross checked this expression? Mine is different.

Edit: This could either mean that there is more than one solution or one of us is wrong. I checked your result and couldn't retrieve the differential equation, but I'm not sure whether I didn't make a mistake. Your shift by 1/2 makes a big mess in the calculations.
 
Last edited:
  • #46
Adesh said:
Well ##2!## is just 2.

And by the definition of derivative $$ \lim_{h\to 0} \frac{f’(a+h) - f’(a)}{h} = f”(a)$$

Yes, I (obviously) know that ##2!## is 2. I mentioned it just in case you missed it. Also, it may be useful in the context of the hint I gave.

Now, going on: the definition of the derivative is what you wrote, but you have an equality and you take the limit of the left side only, then equate it with the right-hand side and write an equality. I can't make sense of it. In any case, if you take both limits from the start they're both ##f''(a)##, at least as I see it, so what does it give anyway?
 
Last edited:
  • #47
Adesh said:
Question 13 a)

$$\lim_{x\to \pi/2} \frac{x-\pi/2}{\sqrt { 1-\sin x}} $$ Let ##y = x - \pi/2## $$\lim_{y\to 0} \frac{y}{\sqrt{ 1- \sin(y+ \pi/2)}} $$. $$ \lim_{y\to 0} \frac{y}{\sqrt {1- \cos y}}$$ $$\lim_{y\to 0} \frac{y}{\sqrt{2\sin^2 y/2}} $$
$$\lim_{y \to 0} \frac{y}{\sqrt 2 \sin (y/2)} \\ \lim_{y\to 0} \frac{\frac{y}{y/2}}{\sqrt 2\,\frac{\sin (y/2)}{y/2}} $$ The denominator in the above limit goes to ##\sqrt2## and the numerator cancels to ##2##, hence the answer is ##\sqrt 2##.

Not correct. It's easy to find the mistake and it has to do with some absolute value you missed.
 
  • #48
QuantumQuest said:
Now, going on: the definition of the derivative is what you wrote, but you have an equality and you take the limit of the left side only, then equate it with the right-hand side and write an equality. I can't make sense of it. In any case, if you take both limits from the start they're both ##f''(a)##, at least as I see it, so what does it give anyway?
I can explain it this way, write everything on one side, then we can use the Theorem “limit of a sum is the sum of the limits” so, we can take the limit of that “derivative” part and can leave everything as it is with limit sign being there. Am I being legitimate here?
 
  • #49
QuantumQuest said:
Also, it may be useful in the context of the hint I gave.
Sir, I tried your hint. I Taylor expanded everything but, you know... I couldn't get anything.
 
  • #50
Adesh said:
I can explain it this way, write everything on one side, then we can use the Theorem “limit of a sum is the sum of the limits” so, we can take the limit of that “derivative” part and can leave everything as it is with limit sign being there. Am I being legitimate here?

Well, to be honest, I don't quite understand. If you have an equality that holds true then either you take the limit of both sides or not. I don't really know if you wanted to write something else and you wrote it like this.
 
  • #51
Adesh said:
Sir, I tried your hint. I Taylor expanded everything but, you know... I couldn't get anything.

So, let me help a little more. What you're interested in is putting the third derivative into the expansion. From there, you can do some math, also taking into account the given expression.
 
Last edited:
  • #52
QuantumQuest said:
Well, to be honest, I don't quite understand. If you have an equality that holds true then either you take the limit of both sides or not. I don't really know if you wanted to write something else and you wrote it like this.
Let’s consider this example (no relation with problem 7) $$ \lim_{h\to 0} \frac{ f(a+h) - f(a)}{h} = \frac{f”(a+h)}{h}$$ I think your argument is that equality above holds only if I take the limit both sides, I have something like this in my mind $$ \lim_{h\to 0} \frac{f(a+h) -f(a)}{h} = \frac{ f”(a+h)}{h} \\
\lim_{h\to 0} \frac{f(a+h) - f(a)}{h} - \frac{ f”(a+h)}{h} =0$$ Now, if we use the law that “limit of a sum is the sum of the limits” then we write $$ f’(a) -\lim_{h\to 0} \frac{f”(a+h)}{h} = 0 $$
 
  • #53
Adesh said:
Question 13 a) (after the error being pointed out by @archaic, he deserves the credits)

$$ \lim_{x\to \pi/2} \frac{x-\pi/2}{\sqrt{1-\sin x}} $$ Let ##y= x- \pi/2##. $$ \lim_{y\to 0} \frac{y}{\sqrt{1- \sin (y+\pi/2)}} = \lim_{y\to 0} \frac{y}{\sqrt{1- \cos y}} = \lim_{y\to 0} \frac{y}{\sqrt{2 \sin^2 (y/2)}} = \lim_{y\to 0} \frac{1}{\sqrt 2} \frac{y}{\sqrt{\sin^2 (y/2)}} $$
Now, let’s look at a crucial fact: ##\sqrt{x^2} = |x|##. Writing ##\sqrt{x^2} = x## breaks down when ##x## is negative, so if ##x## can take both positive and negative values we must keep the absolute value, ##|x|##.
Coming back to our limit $$ \frac{1}{\sqrt 2} \lim_{y\to 0} \frac{y}{|\sin (y/2)|} $$
$$ \frac{1}{\sqrt 2} \lim_{y\to 0^-} \frac{y/(y/2)}{-\sin (y/2)/(y/2)} = -\sqrt 2$$ And $$ \frac{1}{\sqrt 2} \lim_{y\to 0^+} \frac{y/(y/2)}{\sin (y/2)/(y/2)} = \sqrt 2 $$
Hence, the limit doesn’t exist.

Well done @Adesh.
 
  • #54
Adesh said:
Let’s consider this example (no relation with problem 7) $$ \lim_{h\to 0} \frac{ f(a+h) - f(a)}{h} = \frac{f”(a+h)}{h}$$ I think your argument is that equality above holds only if I take the limit both sides, I have something like this in my mind $$ \lim_{h\to 0} \frac{f(a+h) -f(a)}{h} = \frac{ f”(a+h)}{h} \\
\lim_{h\to 0} \frac{f(a+h) - f(a)}{h} - \frac{ f”(a+h)}{h} =0$$ Now, if we use the law that “limit of a sum is the sum of the limits” then we write $$ f’(a) -\lim_{h\to 0} \frac{f”(a+h)}{h} = 0 $$

No. In the example you give here there is already a limit on the LHS. What I said was about having an equality without limits - as was the case in your solution for question ##7## and you took the limit of the left side only.
 
  • #55
QuantumQuest said:
No. In the example you give here there is already a limit on the LHS. What I said was about having an equality without limits - as was the case in your solution for question ##7## and you took the limit of the left side only.
Are you talking about this step
Adesh said:
$$\lim_{h\to 0}\frac{f'(a+h)-f'(a)}{h}=f''(a+\theta h)+\frac{\theta h}{2}f'''(a+\theta h)\\ f''(a)=f''(a+\theta h)+\frac{\theta h}{2}f'''(a+\theta h)$$
Well, it is like this $$\lim_{h\to 0} \frac{f’(a+h) - f’(a) }{h} = f”(a +\theta h) + \frac{\theta h}{2} f”’(a +\theta h) \\

f”(a) = \lim_{h\to 0} f”(a+ \theta h) +\frac{\theta h}{2} f”’(a + \theta h)$$
 
  • #56
Adesh said:
Are you talking about this step

Well, it is like this $$\lim_{h\to 0} \frac{f’(a+h) - f’(a) }{h} = f”(a +\theta h) + \frac{\theta h}{2} f”’(a +\theta h) \\

f”(a) = \lim_{h\to 0} f”(a+ \theta h) +\frac{\theta h}{2} f”’(a + \theta h)$$

So, in fact, you're taking both limits in the expression ##\frac{f’(a+h) - f’(a)}{h} = f”(a+\theta h) + \frac{\theta h}{2} f”’(a+ \theta h) ## and not the way you wrote it in your solution to question ##7##. That's what I'm talking about.
 
Last edited:
  • #57
QuantumQuest said:
So, in fact, you're taking both limits in the expression ##f’(a+h) = f’(a) + h f”(a+ \theta h) + \frac{\theta h^2}{2} f”’(a +\theta h) \\ \frac{f’(a+h) - f’(a)}{h} = f”(a+\theta h) + \frac{\theta h}{2} f”’(a+ \theta h)## and not the way you wrote it in your solution to ##7##. That's what I'm talking about.
Yes, I thought writing the limit before the whole equation meant the limit was applied to everything, but as I have experienced, this caused some ambiguity. I apologize for the unclearness; I really meant limits on both sides.
 
  • #58
Adesh said:
Yes, I thought writing the limit before the whole equation meant the limit was applied to everything, but as I have experienced, this caused some ambiguity. I apologize for the unclearness; I really meant limits on both sides.

See again my post #56, as there was a mistake as I copied the expression which I corrected.
 
  • #59
QuantumQuest said:
See again my post #56, as there was a mistake as I copied the expression which I corrected.
Sorry but I’m unable to spot the mistake. Please help me in seeing it.
 
  • #60
Adesh said:
Sorry but I’m unable to spot the mistake. Please help me in seeing it.

I mean that I made a mistake in copying the expression. The one now written in post #56 is the one I'm talking about, and in it you took the limit of the left-hand side only, while you have to take the limits of both sides; but as you answered in post #57, that was understood, so it's OK on that point. Still, taking both limits into account, and since there are further things I don't understand, I don't know where this solution leads.
 
