Need help proving solution of a differential equation

furth721
I need help proving that the general solution to this equation, dy/dx = (y - y^2)/x, is y = x/(x + C), where x ≠ -C. When I separate the variables and integrate I get

ln|y| - ln|1 - y| = ln|x| + C,

and I cannot make this look like the general solution. I'm not sure if I did the integration wrong, but I think it is right. I am not too sure what to do after I integrate. Any help would be greatly appreciated. Thanks
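(For reference, here is the separation step described above written out with the standard partial-fraction split; this working is not part of the original post.)

$$\frac{dy}{y-y^{2}}=\frac{dx}{x},\qquad \frac{1}{y(1-y)}=\frac{1}{y}+\frac{1}{1-y},$$
$$\int\left(\frac{1}{y}+\frac{1}{1-y}\right)dy=\int\frac{dx}{x}\;\Longrightarrow\;\ln|y|-\ln|1-y|=\ln|x|+C.$$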
 
Your integration is correct. Try combining the two log terms in y into a single logarithm and then exponentiating both sides (applying e to each side). From there, it should just be a matter of simplification.
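(Spelled out, the suggested step looks like this; the notation K for the new constant is mine, not from the original reply.)

$$\ln|y|-\ln|1-y|=\ln\left|\frac{y}{1-y}\right|=\ln|x|+C\;\Longrightarrow\;\frac{y}{1-y}=e^{C}|x|=Kx,$$

where K = ±e^C is simply relabeled as a constant.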
 
Once you simplify, you may not see it right away; I didn't either when I did the simplification myself.

You should end up with y = -Cx/(1 - Cx), but if we multiply the top and the bottom by (-1/C)/(-1/C), we get x/((-1/C) + x). Here -1/C = C2, or just C, depending on how you want to notate it, yielding y = x/(C + x).
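(In symbols, the relabeling of the constant described above, using the same sign convention; C2 is just a renamed constant.)

$$y=\frac{-Cx}{1-Cx}=\frac{(-Cx)\left(-\tfrac{1}{C}\right)}{(1-Cx)\left(-\tfrac{1}{C}\right)}=\frac{x}{x-\tfrac{1}{C}}=\frac{x}{x+C_{2}},\qquad C_{2}\equiv-\frac{1}{C}.$$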
 
I'm not sure how you ended up with y = -Cx/(1 - Cx). When I combine the ln functions and raise to the e power, I get y/(1 - y) = x + C. Then you can change that into (1 - y)/y = 1/(x + C), and then (1/y) - 1 = 1/(x + C). I am not sure if that is the way you did it, so can you please elaborate on how you got y = -Cx/(1 - Cx)? Sorry if I am being difficult; I usually don't have this much trouble with these kinds of problems. Thanks
 
First of all, e^(ln|x| + c) = Cx, not x + c; exponentiating turns the added constant into a multiplicative one, C = e^c.

That means you will have y/(1 - y) = Cx.

Next, you need to multiply both sides by (1 - y), obtaining y = Cx(1 - y) = Cx - Cxy.

Now just add Cxy to both sides: y + Cxy = Cx.

Then factor: y(1 + Cx) = Cx.

y = Cx/(1 + Cx)

Multiply by (1/C)/(1/C), which is equal to 1.

y = x/((1/C) + x), and 1/C is just another constant, call it C2.

y = x/(C2 + x)
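(As a quick sanity check, not from the thread, here is a minimal sketch using SymPy that verifies y = x/(x + C) satisfies dy/dx = (y - y^2)/x.)

```python
import sympy as sp

x, C = sp.symbols('x C')
y = x / (x + C)                 # candidate general solution

lhs = sp.diff(y, x)             # dy/dx
rhs = (y - y**2) / x            # right-hand side of the ODE

# Should print 0, confirming y = x/(x + C) solves dy/dx = (y - y^2)/x
print(sp.simplify(lhs - rhs))
```

This should print 0, since both sides reduce to C/(x + C)^2 away from x = -C and x = 0.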
 
A-HA! That makes perfect sense now. You have to exponentiate the entire right side, e^(ln|x| + c), not take e^(ln|x|) and e^c separately, because you exponentiated ln|y/(1 - y)| as a whole on the left. Wow, thanks, you definitely saved me from a lot of frustration.
 