For which PDEs is the solution in the form of F(x)*G(t)?

  • Thread starter Nikitin
  • #1
Nikitin
So we just started finding general solutions of homogeneous, linear two-variable PDEs using separation of variables in my engineering-math class. There the professor tells us to assume the solution of a PDE is of the form F(x)*G(t).

But when is the solution of the form F(x)*G(t)? When does separation of variables work and when does it not (does it always work, in theory at least, on linear PDEs)? For multivariable linear PDEs, will the solution be of the form F(x)*G(y)*H(z)*J(t)... etc.?

I'm a bit confused currently... all help is appreciated.
 
  • #2
A good question!

Basically, you should think of product solutions you find as a single term in a series solution like the following:
[tex]U(x,t)=\sum_{i=0}^{\infty}F_{i}(x)G_{i}(t) \qquad (*)[/tex]
where linearity and homogeneity of the differential equation shows that we can solve the "same" problem at each "i" level.
If we take a wave equation like u_tt=A*u_xx, each product term in the general solution is found by solving:
[tex]\frac{1}{F_{i}}\frac{d^{2}F_{i}}{dx^{2}}=\frac{1}{A\,G_{i}}\frac{d^{2}G_{i}}{dt^{2}}=k_{i}[/tex], where the k_i's are constants related to the wavenumbers/frequencies of the component waves.
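As a concrete check (a sketch, assuming the wave equation u_tt = A*u_xx above), one can verify symbolically with SymPy that a single product term satisfies the PDE and that both separated ratios equal the same constant:

```python
# Sketch: verify that a product solution F(x)*G(t) satisfies u_tt = A*u_xx
# and that both separated ratios equal the same constant k_i.
import sympy as sp

x, t = sp.symbols('x t', real=True)
A = sp.symbols('A', positive=True)

# One sample product term: F(x) = sin(x), G(t) = cos(sqrt(A)*t)
F = sp.sin(x)
G = sp.cos(sp.sqrt(A) * t)
u = F * G

# The PDE residual u_tt - A*u_xx simplifies to zero:
residual = sp.diff(u, t, 2) - A * sp.diff(u, x, 2)
print(sp.simplify(residual))                    # 0

# Both separated ratios equal the same constant, here k = -1:
print(sp.simplify(sp.diff(F, x, 2) / F))        # -1
print(sp.simplify(sp.diff(G, t, 2) / (A * G)))  # -1
```

Here k_i = -1 corresponds to the wavenumber 1 of this particular component wave; other choices of F and G give other (negative) constants.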
-----------------------------------------------------------
Now, it is STILL a problem:
How can we be sure that an infinite series of product functions is ADEQUATE for specifying a particular solution?
This requires that the expansion functions F_i and G_i form what we call a COMPLETE set of functions in "x" and "t", respectively (essentially, that ALL functions of x must be representable by a sum of weighted F_i's, and similarly for all functions of t).

Is this trivial to show is true in every single case?
Not at all!
-----------------------
However, remember that you DO know of a similar series expansion of an ARBITRARY, nice function u(x,t) already, namely its two-variable Taylor series in product powers of x and t.
So:
The product powers of "x" and "t" DO represent what we call a complete set of expansion functions; lots of other such sets exist, and for different types of differential equations one set of expansion functions will be more natural to use, as dictated by the shape of the diff.eq. The diff.eq effectively tells you what such expansion functions must look like (i.e., they arise as solutions to the simplified, separated problems).
 
  • #3
Thanks for the clarifications :)! I understand more now, but I am still a bit unsure why the form F(x)*G(t) is the (only?) correct one for each partial solution.

You were comparing the Fourier series one gets out to its Taylor series - so this means there are many ways to solve, say, the wave equation? Is it always an infinite-series expansion? Hell, assuming the initial conditions are satisfied and if v=1, then simply the polynomial [tex]Qx^2 + Qt^2 + ax + bt +cxt[/tex] should be a solution to the wave equation too... but we use the Fourier series because it's the most practical?
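(The polynomial claim is easy to check symbolically; a minimal SymPy sketch, assuming v = 1 so the equation is u_tt = u_xx:)

```python
# Sketch: check that Q*x**2 + Q*t**2 + a*x + b*t + c*x*t solves u_tt = u_xx.
import sympy as sp

x, t, Q, a, b, c = sp.symbols('x t Q a b c')
u = Q*x**2 + Q*t**2 + a*x + b*t + c*x*t

residual = sp.diff(u, t, 2) - sp.diff(u, x, 2)   # u_tt - u_xx
print(sp.simplify(residual))                     # prints 0, so it is a solution
```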
 
  • #4
What series expansion you use depends entirely on the problem at hand. For example, vacuum electrostatics is essentially all about solving Laplace's equation ##\nabla^{2}\varphi = 0## for the electrostatic potential ##\varphi##. Now if I write this in Cartesian coordinates (which I would use if I have for example a system consisting of infinite charged rectangular plates) then I just have ##\{\frac{\partial^2 }{\partial x^2} + \frac{\partial^2 }{\partial y^2} + \frac{\partial^2 }{\partial z^2} \}\varphi = 0## and I can represent the solution, using separation of variables, as a series expansion in terms of sinusoidal functions and exponential functions. If I then specify the boundary conditions (for example the values of the potential on the plates, say two are grounded and another two are kept at some non-zero potential), I know this must be the only possible solution thanks to the uniqueness theorem for solutions to Laplace's equation.
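As a numerical illustration of the Cartesian case (a sketch with a hypothetical geometry, not a problem from this thread: a 2-D box [0,a] x [0,b] with three grounded sides and the side y = b held at potential V0), the separated sine/sinh series can be summed directly:

```python
# Sketch: Laplace's equation on [0,a] x [0,b], three sides grounded,
# the side y = b held at V0.  Separation of variables gives
#   phi(x,y) = sum_{n odd} (4*V0/(n*pi)) * sin(n*pi*x/a)
#                          * sinh(n*pi*y/a) / sinh(n*pi*b/a)
import numpy as np

def phi(x, y, a=1.0, b=1.0, V0=1.0, n_terms=50):
    """Partial sum of the separated series for the box potential."""
    total = 0.0
    for n in range(1, 2 * n_terms, 2):     # only odd n contribute
        k = n * np.pi / a
        total += (4 * V0 / (n * np.pi)) * np.sin(k * x) \
                 * np.sinh(k * y) / np.sinh(k * b)
    return total

print(phi(0.5, 0.5))    # centre of the square box: V0/4 by symmetry
print(phi(0.5, 0.0))    # 0.0 on the grounded bottom edge
```

The value V0/4 at the centre follows from superposition: rotating the "hot" side through the four positions and adding the solutions gives the constant solution V0, so each contributes equally at the centre.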

If, on the other hand, I write ##\nabla^2 \varphi = 0## in spherical coordinates (which I would use for, say, a grounded sphere of charge), Fourier series would not be of help to me. Rather, I would want to write my solution as a series expansion of Legendre polynomials. In cylindrical coordinates (say an infinitely long cylinder of charge) I would want to use Bessel function expansions and so on. The uniqueness theorem still holds as long as I specify boundary conditions.
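A small sketch of the Legendre case (a hypothetical textbook example, not from this thread: a grounded conducting sphere of radius R in a uniform external field E0 along z, where only the l = 1 term of the Legendre expansion survives):

```python
# Sketch: exterior potential of a grounded conducting sphere of radius R
# in a uniform field E0 along z.  The Legendre expansion reduces to
#   phi(r, theta) = -E0 * (r - R**3 / r**2) * P_1(cos theta)
import numpy as np
from numpy.polynomial import legendre

def phi(r, theta, R=1.0, E0=1.0):
    # legval evaluates sum_l c_l * P_l(x); the coefficient array [0, 1]
    # picks out P_1(cos theta) = cos theta
    P1 = legendre.legval(np.cos(theta), [0.0, 1.0])
    return -E0 * (r - R**3 / r**2) * P1

print(phi(1.0, 0.7))   # 0.0 on the sphere surface, as the BC demands
print(phi(100.0, 0.0)) # far away: approaches the uniform-field value -E0*r
```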

The point is that once you have more or less "guessed" a solution e.g. one of the series expansions mentioned above using separation of variables, and you have provided sufficient boundary conditions, the uniqueness theorem will guarantee that this can be the only solution so you're done.
 
  • #5
Nikitin:
Note that even in the cases where the nature of the PDE sort of specifies the type of expansion functions we ought to use, this does NOT mean that the solution isn't representable in, say, power products or Fourier series.
After all, these classes DO form complete sets of functions!

The critical issues here would be:
1. What the coefficient structure within the new class of expansion functions ought to be
2. Extremely slow convergence of finite approximations to the exact solution.

Thus, SOME expansion sets tend to be less clumsy to use than others! :smile:
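The slow-convergence point can be seen numerically; a sketch comparing truncated Fourier sine series on [0, 1] for a function that matches the zero boundary values (coefficients decay like 1/n^3) against one that does not (decay only like 1/n), with the coefficient integrals done by a simple midpoint rule:

```python
# Sketch: convergence speed of a Fourier sine series depends on how well
# the expanded function fits the expansion set's boundary behaviour.
import numpy as np

N_GRID = 4000
xs = (np.arange(N_GRID) + 0.5) / N_GRID   # midpoint grid on [0, 1]
dx = 1.0 / N_GRID

def sine_partial_sum(f, x, n_terms):
    """Truncated Fourier sine series of f on [0, 1]."""
    s = np.zeros_like(x)
    for n in range(1, n_terms + 1):
        bn = 2.0 * np.sum(f(xs) * np.sin(n * np.pi * xs)) * dx
        s += bn * np.sin(n * np.pi * x)
    return s

x = np.linspace(0.05, 0.95, 19)
# f(x) = x*(1-x) matches the zero endpoint values; f(x) = x does not.
err_smooth = np.max(np.abs(sine_partial_sum(lambda u: u * (1 - u), x, 20)
                           - x * (1 - x)))
err_rough = np.max(np.abs(sine_partial_sum(lambda u: u, x, 20) - x))
print(err_smooth, err_rough)   # the mismatched case is far worse
```

With 20 terms the well-matched expansion is already accurate to a fraction of a percent, while the mismatched one still shows errors orders of magnitude larger near the endpoint.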
 
  • #6
Also, just to clarify:
The uniqueness theorem does not say that a particular REPRESENTATION of the solution is the only one possible.
It "merely" says that once you have found a solution to a problem, there are no others to be found; not that your solution cannot be recast in an equivalent formulation in another expansion set of functions.
 
  • #7
Nikitin said:
When does separation of variables work and not (does it always work, in theory atleast, on linear PDEs?)?

There are a few conditions required for separation of variables to work on a linear PDE.

1. The equation must be separable. That is, you can separate the variables and get an ODE for each variable
2. Boundaries must be constant coordinate surfaces (at infinity is also okay)
3. Boundary conditions cannot be completely arbitrary. If I remember right, if the boundary is at a constant [itex]\eta[/itex] (to pick an arbitrary coordinate system that may or may not be Cartesian), then the boundary condition cannot depend upon partial derivatives with respect to the other coordinates. There may be other restrictions as well.

Basic problems in heat transfer, electrodynamics, quantum mechanics, etc., can often be solved with this approach. However, it may be required to solve the problem in a non-Cartesian coordinate system in order for the approach to work (e.g. solving Laplace's equation inside a circle requires polar coordinates). Many real-world cases no longer fall into these categories, so other techniques are required. There are other exact approaches (e.g. conformal mapping), approximate approaches (e.g. perturbation theory), and of course for anything with really complicated geometry numerical methods eventually win out as the most logical approach.
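As an illustration of the separable case (a sketch, using the standard textbook series for the 1-D heat equation u_t = alpha*u_xx on [0, 1] with grounded ends and uniform initial temperature u(x, 0) = 1; the finite-difference check at the end is just a numerical sanity test, not part of the method):

```python
# Sketch: separated-series solution of u_t = alpha*u_xx on [0, 1],
# u(0,t) = u(1,t) = 0, u(x,0) = 1.  Separation of variables gives
#   u(x,t) = sum_{n odd} (4/(n*pi)) * sin(n*pi*x) * exp(-alpha*(n*pi)**2 * t)
import numpy as np

def u(x, t, alpha=1.0, n_terms=100):
    """Truncated separated series for the rod temperature."""
    total = 0.0
    for n in range(1, 2 * n_terms, 2):   # only odd n contribute
        total += (4 / (n * np.pi)) * np.sin(n * np.pi * x) \
                 * np.exp(-alpha * (n * np.pi) ** 2 * t)
    return total

# Crude check that the series really satisfies u_t = u_xx (alpha = 1),
# using central finite differences at an interior point:
x0, t0, h = 0.3, 0.05, 1e-4
u_t = (u(x0, t0 + h) - u(x0, t0 - h)) / (2 * h)
u_xx = (u(x0 + h, t0) - 2 * u(x0, t0) + u(x0 - h, t0)) / h**2
print(u_t, u_xx)    # the two sides of the PDE agree closely
```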

jason
 

What is a PDE?

A PDE, or partial differential equation, is an equation that involves multiple variables and their partial derivatives. It describes how a physical quantity changes with respect to these variables.

What is the form of F(x)*G(t)?

F(x)*G(t) is a product of two functions, F(x) and G(t), where x and t are independent variables. It is a separated form: each factor depends on only one variable, and the solutions of many PDEs can be built as sums of such products.

What are some examples of PDEs that have solutions in the form of F(x)*G(t)?

Examples include the heat equation, wave equation, and diffusion equation. These are commonly found in physics, engineering, and other fields.

Can any PDE have a solution in the form of F(x)*G(t)?

No, not all PDEs have solutions in this form. It depends on the specific PDE and the boundary conditions that are imposed.

What are the advantages of using the form of F(x)*G(t) for solving PDEs?

Using this form can simplify the mathematical calculations and make it easier to find solutions. It also allows for the use of separation of variables, which is a common technique for solving PDEs.
