## Trying to solve any equation ever with Recurrences

So, I'm trying to make up this theory of solving any equation ever using recurrences... I'll show you what I mean.

f(x) = x^2 - 3x + 2

Well, obviously you could do this the easy way, factor it, and figure out x = 2 or x = 1... no big deal, right?

But what if we try a different way... albeit harder and more complicated... it'll make a bit more sense later in this post.

So what if we do this

x^2 - 3x + 2 = 0
x(x) - 3x + 2 = 0
x(x) = 3x-2
x = (3x-2)/x

So what does this mean?

Well, plug a random number into your calculator... say 10...
Now compute (3x-2)/x, and you'll get 2.8...
Now (if you're using a TI-84 like me), enter ((3*Ans)-2)/Ans.
With Ans = 2.8, this yields about 2.286.
With Ans = 2.286, this yields about 2.125.
From 2.125, you get 2.0588...

Basically, if you repeat this indefinitely, it converges to 2... which is one of the solutions of the equation...
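The repeated-Ans procedure above can be sketched in a few lines of Python; `iterate` is just an illustrative helper name, assuming nothing beyond the rearrangement shown.

```python
# A minimal sketch (in Python rather than on the TI-84) of the calculator
# procedure above: feed each output of (3x - 2)/x back in as the next input.
def iterate(f, x0, n):
    """Apply f to x0 repeatedly, n times, and return the final value."""
    x = x0
    for _ in range(n):
        x = f(x)
    return x

f = lambda x: (3 * x - 2) / x
print(iterate(f, 10.0, 1))    # 2.8, the first step
print(iterate(f, 10.0, 50))   # settles near the root x = 2
```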

Now say you rearrange the equation differently...
x^2 -3x+2 = 0
x(x) - 3x + 2 = 0
(x-3)x = -2
x = (-2)/(x-3)

If you do the same method as above, it will converge to 1 for almost any initial value you put in for x, which is the other zero of the quadratic equation...
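The second rearrangement can be run the same way; a sketch, assuming only the algebra above:

```python
# Same iteration scheme, but for the other rearrangement x = -2/(x - 3);
# starting from 10 the values settle near the other root, x = 1.
def iterate(f, x0, n):
    x = x0
    for _ in range(n):
        x = f(x)
    return x

g = lambda x: -2 / (x - 3)
print(iterate(g, 10.0, 60))   # approaches the other root, x = 1
```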

So... why am I wasting my time with this?

Well, consider this equation

4x = e^(.5x)

Well... I don't know how to solve this equation algebraically... but what if we do the thing I just did, where we rearrange so one side is x, then iterate the other side...

4x = e^(.5x)
ln(4x) = .5x
2ln(4x) = x

If you pick 20 for the first x, 2ln(4x) will yield 8.76, then 7.11, then 6.69, and so on, until it converges to 6.523371369...

Is this right? Well,
4(x) = e^(.5x)
4(6.523371369) = e^(.5(6.523371369))
26.09 = e^(3.26)
26.09 = 26.09

So... that's the solution... by recursively feeding one side of the rearranged equation back into itself, we get some sort of solution...
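The transcendental example can be sketched the same way, with a check of both sides at the end; `iterate` is again just an illustrative helper.

```python
import math

# Sketch of the transcendental example: iterate x = 2*ln(4x) for
# 4x = e^(x/2), starting from x0 = 20, then verify both sides agree.
def iterate(f, x0, n):
    x = x0
    for _ in range(n):
        x = f(x)
    return x

h = lambda x: 2 * math.log(4 * x)
root = iterate(h, 20.0, 100)
print(root)                           # ~6.523371369
print(4 * root, math.exp(root / 2))   # left and right sides match
```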

Now... this doesn't always work... for example, say we did (from the previous equation)

4x = e^(.5x)
x = (e^(.5x))/4

If you do this, then x will shoot off to infinity. Now you may think that this is wrong... but I feel this is right, since they both technically (intersect) at infinity. In theory, 4(infinity) = e^(infinity)... so is it really wrong? I don't believe so...
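A quick sketch of this rearrangement's behaviour, for what it's worth: started high, the iterates just blow up rather than "meeting at infinity", while (perhaps surprisingly) small starting values settle on a second, smaller root of 4x = e^(.5x) near x ≈ 0.289.

```python
import math

# Sketch of the rearrangement x = e^(x/2)/4.  Started above the root
# near 6.52, each step grows without bound (one more step from ~5506
# would overflow e^(x/2)); started low, the iterates settle on the
# equation's other, smaller root near 0.289.
k = lambda x: math.exp(x / 2) / 4

x = 20.0
x = k(x)
print(x)          # ~5506.6, already running away

x = 1.0
for _ in range(60):
    x = k(x)
print(x)          # ~0.2889; check: 4x and e^(x/2) agree here too
```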

Anyway... my theory is that you can find a solution to an equation by rewriting it in the form
x = f(x)
and using an infinite recursive calculation.

Now, I really don't know how to dive deeper into this theory, since I think it involves math that's beyond what I know. I'm a computer engineer, so I don't have the biggest arsenal of theoretical math at my disposal. I do know that this might have something to do with damping, since some of the recurrences show an overdamped, underdamped, or critically damped response while converging to the solution... but... I don't know...

So... any thoughts on this? Any suggestions on how to move forward with this? Or has someone done this before?

And no, this does not have anything to do with homework; this is just fun theoretical stuff I'm doing on my own, and I would like input/other ideas to help me move forward with it. If the admins decide this post doesn't belong here, I'll delete it (no questions asked) and move it to the Homework section (or wherever they tell me to put it). All I want is some outside input.
 A lot of these methods are pretty generic numerical techniques (see e.g. Newton's method). Note that you're not really "solving", only approximating solutions.

 Quote by Gackhammer Now... this doesnt always work... for example, say we did (from the previous equation) 4x = e^(.5x) x = (e^(.5x))/4 If you do this, then x will converge to infinity. Now you may think that this is wrong... but I feel this is right, since they both technically (intersect) at infinity. In theory, 4(infinity) = e^(infinity).... so is it really wrong? I dont believe so....
You don't converge to infinity, you diverge to infinity.
Two real valued functions cannot intersect at infinity.
And "in theory" (i.e. in real analysis), 4 times infinity is not defined. Neither is e to the power of infinity.
You are also using the term "technically" in a very non-technical manner.

 Quote by Gackhammer Anyway... my theory is that you can find a solution to an equation by rewriting it in the form x = f(x) and using an infinite recursive calculation.
Try it with the logistic map.


 Quote by pwsnafu Try it with the logistic map.
if x := f(x) gets you in a cycle,

x := k f(x) + (1-k)x with 0 < k < 1 will probably work. Use a smaller k if there are still problems.
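The logistic-map challenge and this damping fix can be sketched together; r = 3.2 is an illustrative choice where plain iteration fails.

```python
# Logistic map f(x) = r*x*(1-x) with r = 3.2 (illustrative choice).
# Plain iteration x := f(x) falls into a 2-cycle; the damped update
# x := k*f(x) + (1-k)*x with k = 0.5 converges to the fixed point
# x* = 1 - 1/r = 0.6875.
r = 3.2
f = lambda x: r * x * (1 - x)

x = 0.3
for _ in range(200):
    x = f(x)
print(x)              # stuck on one point of the 2-cycle, not 0.6875

k = 0.5
x = 0.3
for _ in range(200):
    x = k * f(x) + (1 - k) * x
print(x)              # ~0.6875, the fixed point
```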

 Quote by Gackhammer So, Im trying to make up this theory of trying to solve any equation ever using recurrences... [full post quoted above; snipped]
I've learnt this before; what you found is actually called fixed-point iteration...
The idea is that as you keep feeding the values generated by f(x) back into x = f(x), you converge to the solution.

However, the sequence won't always converge (it can diverge from the solution). As you might have noticed, there are different ways of isolating x from the equation, but the values will only converge to a solution if |f'(x)| < 1 near that solution.

Therefore, you should try different rearrangements of the form x = f(x), then find the slope f'(x) near the root. If it lies between -1 and 1, the iteration will converge. You can also modify the equation so that the slope of f(x) near the solution is close to zero; then the iteration will converge more rapidly!
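That |f'(x)| < 1 test can be checked numerically for the two quadratic rearrangements from the original post; `dfdx` is a hypothetical central-difference helper, not a library function.

```python
# Numerically check |f'(x)| < 1 at the roots x = 2 and x = 1 of
# x^2 - 3x + 2 for both rearrangements from the original post.
def dfdx(f, x, h=1e-6):
    """Central-difference estimate of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

f = lambda x: (3 * x - 2) / x   # f'(x) = 2/x^2
g = lambda x: -2 / (x - 3)      # g'(x) = 2/(x-3)^2

print(abs(dfdx(f, 2.0)))   # 0.5 -> x = 2 attracts under f
print(abs(dfdx(f, 1.0)))   # 2.0 -> x = 1 repels under f
print(abs(dfdx(g, 1.0)))   # 0.5 -> x = 1 attracts under g
print(abs(dfdx(g, 2.0)))   # 2.0 -> x = 2 repels under g
```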

 Quote by Gackhammer So, Im trying to make up this theory of trying to solve any equation ever using recurrences...
You should get together with Matt Benesi: http://www.physicsforums.com/showthread.php?t=539396

 Quote by Stephen Tashi You should get together with Matt Benesi: http://www.physicsforums.com/showthread.php?t=539396

Thanks! I will!
 Hey Gackhammer, What you're doing is similar to Newton's method, as Number Nine said. You might also want to check out the root-finding algorithm Wikipedia page, as this is more or less exactly what you are doing. B
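For comparison, a sketch of Newton's method on g(x) = 4x - e^(x/2): it reaches the same root near 6.523, but quadratically, so far fewer steps are needed (x0 = 7 is an illustrative starting point where it behaves).

```python
import math

# Newton's method sketch for g(x) = 4x - e^(x/2): x := x - g(x)/g'(x).
# Converges quadratically to the root near 6.523 from x0 = 7.
def newton(g, dg, x0, n=20):
    x = x0
    for _ in range(n):
        x = x - g(x) / dg(x)
    return x

g  = lambda x: 4 * x - math.exp(x / 2)
dg = lambda x: 4 - math.exp(x / 2) / 2

root = newton(g, dg, 7.0)
print(root)   # ~6.523371369
```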
 What does this have to do with Newton's method? It looks to me like the Banach fixed point theorem at work here.
 So, just wanted to make some other interesting points (on the damping point I mentioned in the first post).

Here's the equation $x^2 - 4x - 9$, with an input of 1. For the recurrence equation $x = \frac{9+4x}{x}$, here is the plot of the number of recurrences (n = 1, 2, 3, ...) against the value produced (it's in Excel, I'm lazy):

[plot: iterates of x = (9+4x)/x, oscillating toward the root]

As you can see, there is a damped curve, which is underdamped (I hope I got that right, it's been a while since I took Circuits and did the RLC stuff). As n approaches infinity, it reaches one of the zeros of the equation, about 5.6.

Now, if we manipulate the recurrence function to be $x = \frac{9}{x-4}$, here is the plot:

[plot: iterates of x = 9/(x-4), also underdamped]

Also underdamped... and it gets to the -1.6 zero of the quadratic function.

So... how about x = ln(x+3)? Can we find something overdamped or critically damped? Well, here is that recurrence function with two different inputs:

[plot: iterates of x = ln(x+3) from two starting values]

I'm not sure if this is critically damped or overdamped... I thought this was interesting to note, since (and I'm not an expert on damping systems) we could probably describe this with a second-order ODE...
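The "underdamped" picture can be reproduced without Excel; a sketch that prints the first iterates of x = (9+4x)/x and their distance from the root 2 + √13. The oscillation comes from f'(x) being negative at the root.

```python
import math

# Iterates of x = (9 + 4x)/x from x0 = 1, compared with the root
# 2 + sqrt(13) ~ 5.6056.  The error changes sign every step (the
# "underdamped" look): f'(x) = -9/x^2 is negative at the root.
f = lambda x: (9 + 4 * x) / x
root = 2 + math.sqrt(13)

x = 1.0
for i in range(8):
    x = f(x)
    print(i, x, x - root)   # x - root alternates in sign, shrinking
```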

 Quote by Vargo What does this have to do with Newton's method? It looks to me like the Banach fixed point theorem at work here.
It looked similar to Newton's method: it converges to a "fixed point" in some cases and diverges in others. It also looked similar to several root-finding algorithms for the same reason.

Never heard of a fixed point theorem before now, although it sounds like an excellent term for describing several things I've seen through the years.

