# Integrating Factor and absolute value

1. Jun 19, 2011

### awelex

Hi,

I have a general question about the integrating factor for first-order linear DEs. All the textbooks I've seen (which aren't too many) simply drop the absolute value when the factor has the form exp(ln(abs(x))). This would evaluate to abs(x), yet the books use simply x. Why is that valid?

Thanks,

Alex

2. Jun 19, 2011

### shelovesmath

I've been wondering too.

3. Jun 19, 2011

### HallsofIvy

Staff Emeritus
An "integrating factor" for a linear first order equation, dy/dx+ f(x)y= g(x), is a function, $\mu(x)$ such that if we multiply the entire equation by it, $\mu(x)(y'+ f(x)y)= \mu(x)y'+ f(x)\mu(x)y= \mu(x)g(x)$, the left side becomes an "exact derivative":
$$\frac{d(\mu(x)y)}{dx}$$

If that is true, then by the product rule,
$$\frac{d(\mu(x)y)}{dx}= \mu(x)y'+ \mu'(x) y= \mu(x)y'+ f(x)\mu(x)y$$
which leads immediately to $\mu'(x)= f(x)\mu(x)$, a simple separable equation:
$$\frac{d\mu}{\mu}= f(x)dx$$
so that $\ln|\mu|= \int f(x)dx$ and so
$$|\mu|= e^{\int f(x)dx}$$
Of course, that means that either $\mu= e^{\int f(x)dx}$ or $\mu(x)= -e^{\int f(x)dx}$

But since we are multiplying both sides of the equation by $\mu(x)$ it really doesn't matter whether it is positive or negative!
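To connect this back to the original question, here is a worked sketch (my own example, not from the thread) with $f(x) = 1/x$, which is exactly the case that produces $e^{\ln|x|}$:

```latex
% For the equation y' + \frac{1}{x}\,y = g(x), we have f(x) = 1/x, so
\int f(x)\,dx = \int \frac{dx}{x} = \ln|x|,
\qquad
|\mu| = e^{\ln|x|} = |x| .
% Either sign works. Taking \mu = x:
x\left(y' + \frac{1}{x}\,y\right) = x y' + y = \frac{d(xy)}{dx},
% an exact derivative, while taking \mu = -x merely multiplies both
% sides of the equation by -1 and yields the same solutions.
```

So writing $\mu = x$ instead of $|x|$ is harmless: the two choices differ only by a sign, and that sign cancels because it multiplies both sides of the equation.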