So I've always done simple ODEs by the method of separation of variables. You know,

dy/dx = A*y

dy/y = A*dx

∫ (1/y) dy = ∫ A dx

ln(y) = A*x + Constant

y = Constant*e^(A*x)

It's easy to remember and it usually works. A lot of the PDEs I know how to do involve this process at some point.
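(Not part of the original post: a quick numerical sanity check that the separated solution really does satisfy the ODE. The constants `A` and `C` here are arbitrary illustrative values, and the derivative is approximated by a central finite difference.)

```python
import math

A = 0.7  # arbitrary growth-rate constant (assumed for illustration)
C = 2.5  # arbitrary integration constant (assumed for illustration)

def y(x):
    # the candidate solution from separation of variables: y = C * e^(A*x)
    return C * math.exp(A * x)

def dydx(x, h=1e-6):
    # central finite-difference approximation to dy/dx
    return (y(x + h) - y(x - h)) / (2 * h)

# check dy/dx = A*y at a few sample points
for x in [0.0, 0.5, 1.3]:
    assert abs(dydx(x) - A * y(x)) < 1e-4
```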

The problem is, all my professors were quick to point out that this is an abuse of notation. That is, dy/dx isn't really a fraction and can't necessarily be treated as one; it just happens to work out if you pretend it is in a simple ODE.

So what's the rigorous way to do this ODE? What's really going on when you use the 'pretend derivatives are fractions' trick?
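(For reference, not part of the original question: the standard rigorous reading of the manipulation goes through the chain rule and the substitution rule, never treating dy/dx as a fraction. Dividing the ODE by y and integrating both sides with respect to x:)

```latex
\frac{1}{y(x)}\,\frac{dy}{dx} = A
\;\Longrightarrow\;
\int \frac{1}{y(x)}\,\frac{dy}{dx}\,dx = \int A\,dx
\;\Longrightarrow\;
\int \frac{du}{u} = \int A\,dx
\quad\text{with } u = y(x),\; du = \frac{dy}{dx}\,dx,
```

which gives ln|y| = A*x + Constant, the same result as the informal "fraction" shortcut.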

**Physics Forums | Science Articles, Homework Help, Discussion**


# Separation of Variable - What's REALLY going on?
