I assume you are asking if it is possible to find the zeroes of y(x) = Ax^a + Bx^b, where a,b are real numbers?
omg, my question is simple! I would like to know if it is possible to isolate the x variable in the equation [tex]f(x)=ax^{\alpha}+bx^{\beta}[/tex]
In general, no. If [itex]\alpha[/itex] and [itex]\beta[/itex] are nonnegative integers both less than 5, then yes: the polynomial has degree at most 4, so its zeroes can be expressed in radicals. If they're both less than three, then it's a quadratic or simpler and hence easily done by the quadratic formula. In most other cases it's either difficult or impossible (by Abel–Ruffini, a general polynomial of degree 5 or higher has no solution in radicals).
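To make the "degree at most 4" case concrete, here is a minimal sketch for the biquadratic special case f(x) = ax⁴ + bx²: substituting u = x² reduces it to a quadratic in u, which radicals handle directly. The values of a and b are arbitrary illustrative choices, not from the thread.

```python
import cmath

# Sketch: for f(x) = a*x**4 + b*x**2, substitute u = x**2, giving
# a*u**2 + b*u = u*(a*u + b).  So u = 0 or u = -b/a, and x = ±sqrt(u).
# a, b below are arbitrary example values (an assumption for illustration).
a, b = 1.0, -4.0
u = -b / a                                  # nonzero root of the quadratic in u
roots = [0.0, 0.0, cmath.sqrt(u), -cmath.sqrt(u)]

# verify each candidate annihilates f
for x in roots:
    assert abs(a * x**4 + b * x**2) < 1e-12
```

The same substitution idea is what keeps every case with both exponents below 5 within reach of the classical quadratic/cubic/quartic formulas.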
Yes, it's possible, and quite easy, assuming [itex]\alpha[/itex] and [itex]\beta[/itex] are positive integers with [itex]\alpha > \beta[/itex]. Let [itex]\zeta = e^{2\pi i/(\alpha - \beta)}[/itex] be a primitive [itex](\alpha - \beta)[/itex]-th root of unity. Then the polynomial factors as: [tex]ax^\beta \prod_{k=1}^{\alpha - \beta} \left(x - \left( -\frac{b}{a} \right)^{1/(\alpha - \beta)} \zeta^k\right)[/tex]
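The factorization above can be checked numerically: the nonzero zeroes of f(x) = ax^α + bx^β are r·ζ^k for k = 1, …, α−β, where r is any (α−β)-th root of −b/a. A minimal sketch, with arbitrary example values for a, b, α, β (they are assumptions for illustration, not from the thread):

```python
import numpy as np

# Nonzero zeroes of f(x) = a*x**alpha + b*x**beta, for integers alpha > beta > 0:
#   x = r * zeta**k,  k = 1, ..., alpha - beta,
# where r**(alpha-beta) = -b/a and zeta = exp(2*pi*i/(alpha-beta)).
a, b = 2.0, 3.0          # example coefficients (assumed for illustration)
alpha, beta = 5, 2       # example exponents (assumed for illustration)
n = alpha - beta

r = complex(-b / a) ** (1.0 / n)     # principal complex n-th root of -b/a
zeta = np.exp(2j * np.pi / n)
zeros = [r * zeta**k for k in range(1, n + 1)]

# each candidate should make f vanish (up to floating-point error)
residuals = [abs(a * z**alpha + b * z**beta) for z in zeros]
```

Any n-th root of −b/a works for r, since multiplying by ζ^k cycles through the full set of roots either way.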
In mathematics, as in other areas of science, just because a question is 'simple', it does not necessarily follow that the solution will be 'simple'. For example, consider Fermat's Last Theorem: http://en.wikipedia.org/wiki/Fermat's_Last_Theorem
He's not looking to factor the polynomial but rather to find the inverse [itex]f^{-1}(x)[/itex], I think...
Yes, if the exponents are positive integers. Consider the general algebraic function ##y(x)##, written implicitly as: $$f(x,y)=a_1(x)+a_2(x)y+a_3(x)y^2+\cdots+a_n(x)y^n=0$$ with the ##a_i(x)## polynomials. In your case we would simply have: $$f(x,y)=x-ay^{\alpha}-by^{\beta}=0$$ Then by the Newton–Puiseux theorem, we can compute power series representations of the various branches (solutions) of ##y(x)##, each having the form: $$y_d(x)=\sum_{n=-p}^{\infty} c_n\left(x^{1/d}\right)^n$$ with radii of convergence extending at least to the nearest singular point of ##f(x,y)##, and often further than that. Do a search for "Newton polygon" if you're interested in knowing how to compute these Puiseux series.