How should I find the nontrivial stationary paths?

  • Thread starter: Math100

Homework Help Overview

The discussion revolves around finding nontrivial stationary paths for a variational problem involving a functional defined with boundary conditions and constraints. The subject area includes calculus of variations and eigenvalue problems, specifically focusing on the Euler-Lagrange equation and the implications of Lagrange multipliers.

Discussion Character

  • Exploratory, Mathematical reasoning, Assumption checking

Approaches and Questions Raised

  • Participants explore the derivation of the Euler-Lagrange equation from an auxiliary functional and discuss the implications of boundary conditions on the solutions. There are inquiries about the nature of the solutions when the Lagrange multiplier is zero and how to handle the eigenvalue conditions arising from the problem setup.

Discussion Status

The discussion is ongoing with participants examining various aspects of the problem, including the implications of boundary conditions and the normalization of eigenfunctions. Some participants suggest integrating by parts and question assumptions regarding the trivial and nontrivial solutions. There is a recognition of multiple interpretations regarding the eigenvalue conditions.

Contextual Notes

Participants note that the problem involves constraints on the functional and that the boundary conditions lead to an eigenvalue problem, which complicates the search for nontrivial solutions. The discussion highlights the need for careful consideration of the conditions under which the constants in the solutions are determined.

Math100
Homework Statement
Consider the functional ## S[y]=\alpha y(1)^2+\int_{0}^{1}\beta y'^2dx, y(0)=0 ##, with a natural boundary condition at ## x=1 ## and subject to the constraint ## C[y]=\gamma y(1)^2+\int_{0}^{1}w(x)y^2dx=1 ##, where ## \alpha, \beta ## and ## \gamma ## are nonzero constants.
a) Show that the stationary paths of this system satisfy the Euler-Lagrange equation ## \beta\frac{d^2y}{dx^2}+\lambda w(x)y=0, y(0)=0, (\alpha-\gamma\lambda)y(1)+\beta y'(1)=0 ##, where ## \lambda ## is a Lagrange multiplier.
b) Let ## w(x)=1 ## and ## \alpha=\beta=\gamma=1 ##. Find the nontrivial stationary paths, stating clearly the eigenfunctions ## y ## (normalized so that ## C[y]=1 ##) and the values of the associated Lagrange multiplier.
Relevant Equations
None.
a) Proof:
Let ## \lambda ## be the Lagrange multiplier.
Then the auxiliary functional is ## \overline{S}[y]=\alpha y(1)^2+\int_{0}^{1}\beta y'^2dx-\lambda (\gamma y(1)^2+\int_{0}^{1}w(x)y^2dx-1) ##.
This gives ## \overline{S}[y+\epsilon h]=\alpha (y(1)+\epsilon h(1))^2+\int_{0}^{1}\beta (y'+\epsilon h')^2dx-\lambda (\gamma(y(1)+\epsilon h(1))^2+\int_{0}^{1}w(x)(y+\epsilon h)^2dx-1) ##, where ## y+\epsilon h ## is an admissible perturbation, so that ## h(0)=0 ##.
Note that the Gateaux differential ## \triangle\overline{S}[y, h] ## is given by ## \frac{d}{d\epsilon}\overline{S}[y+\epsilon h]\vert_{\epsilon=0} ##.
Thus ## \frac{d}{d\epsilon}\overline{S}[y+\epsilon h]\vert_{\epsilon=0}=2\alpha y(1)h(1)+2\int_{0}^{1}\beta y'h'dx-2\lambda (\gamma y(1)h(1)+\int_{0}^{1}wyhdx) ##.

From here, how should I show that the stationary paths of this system satisfy the given Euler-Lagrange equation?

b) Let ## w(x)=1 ## and ## \alpha=\beta=\gamma=1 ##.
Consider the Euler-Lagrange equation ## \beta\frac{d^2y}{dx^2}+\lambda w(x)y=0, y(0)=0, (\alpha-\gamma\lambda)y(1)+\beta y'(1)=0 ##, where ## \lambda ## is a Lagrange multiplier.
Then we have ## \frac{d^2y}{dx^2}+\lambda y=0, y(0)=0, (1-\lambda)y(1)+y'(1)=0 ##, where ## \lambda ## is a Lagrange multiplier.
This gives ## y=c_{1}\sin(\sqrt{\lambda}x)+c_{2}\cos(\sqrt{\lambda}x) ##.

From here, how should I find the nontrivial stationary paths?
 
Math100 said:
Homework Statement: Consider the functional ## S[y]=\alpha y(1)^2+\int_{0}^{1}\beta y'^2dx, y(0)=0 ##, with a natural boundary condition at ## x=1 ## and subject to the constraint ## C[y]=\gamma y(1)^2+\int_{0}^{1}w(x)y^2dx=1 ##, where ## \alpha, \beta ## and ## \gamma ## are nonzero constants.
a) Show that the stationary paths of this system satisfy the Euler-Lagrange equation ## \beta\frac{d^2y}{dx^2}+\lambda w(x)y=0, y(0)=0, (\alpha-\gamma\lambda)y(1)+\beta y'(1)=0 ##, where ## \lambda ## is a Lagrange multiplier.
b) Let ## w(x)=1 ## and ## \alpha=\beta=\gamma=1 ##. Find the nontrivial stationary paths, stating clearly the eigenfunctions ## y ## (normalized so that ## C[y]=1 ##) and the values of the associated Lagrange multiplier.
Relevant Equations: None.

a) Proof:
Let ## \lambda ## be the Lagrange multiplier.
Then the auxiliary functional is ## \overline{S}[y]=\alpha y(1)^2+\int_{0}^{1}\beta y'^2dx-\lambda (\gamma y(1)^2+\int_{0}^{1}w(x)y^2dx-1) ##.
This gives ## \overline{S}[y+\epsilon h]=\alpha (y(1)+\epsilon h(1))^2+\int_{0}^{1}\beta (y'+\epsilon h')^2dx-\lambda (\gamma(y(1)+\epsilon h(1))^2+\int_{0}^{1}w(x)(y+\epsilon h)^2dx-1) ##, where ## y+\epsilon h ## is an admissible perturbation, so that ## h(0)=0 ##.
Note that the Gateaux differential ## \triangle\overline{S}[y, h] ## is given by ## \frac{d}{d\epsilon}\overline{S}[y+\epsilon h]\vert_{\epsilon=0} ##.
Thus ## \frac{d}{d\epsilon}\overline{S}[y+\epsilon h]\vert_{\epsilon=0}=2\alpha y(1)h(1)+2\int_{0}^{1}\beta y'h'dx-2\lambda (\gamma y(1)h(1)+\int_{0}^{1}wyhdx) ##.

From here, how should I show that the stationary paths of this system satisfy the given Euler-Lagrange equation?

Assuming your work is correct, you have ## (\alpha - \gamma\lambda)y(1)h(1) + \int_0^1 (\beta y'h' - \lambda w y h) \,dx = 0 ##. What is the next step in all of these problems? Integrate ## y'h' ## by parts.
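
A minimal sketch of that integration by parts (assuming ## y ## is twice differentiable, and using ## h(0)=0 ##):
$$\int_0^1 \beta y'h'\,dx = \Big[\beta y'h\Big]_0^1 - \int_0^1 \beta y''h\,dx = \beta y'(1)h(1) - \int_0^1 \beta y''h\,dx,$$
so the stationarity condition above becomes
$$\big[(\alpha-\gamma\lambda)y(1)+\beta y'(1)\big]h(1) - \int_0^1\big(\beta y'' + \lambda w y\big)h\,dx = 0.$$
Since this must vanish for every admissible ## h ##, choosing ## h ## with ## h(1)=0 ## forces ## \beta y''+\lambda w y=0 ## on ## (0,1) ##, and the remaining boundary term then gives ## (\alpha-\gamma\lambda)y(1)+\beta y'(1)=0 ##.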

b) Let ## w(x)=1 ## and ## \alpha=\beta=\gamma=1 ##.
Consider the Euler-Lagrange equation ## \beta\frac{d^2y}{dx^2}+\lambda w(x)y=0, y(0)=0, (\alpha-\gamma\lambda)y(1)+\beta y'(1)=0 ##, where ## \lambda ## is a Lagrange multiplier.
Then we have ## \frac{d^2y}{dx^2}+\lambda y=0, y(0)=0, (1-\lambda)y(1)+y'(1)=0 ##, where ## \lambda ## is a Lagrange multiplier.
This gives ## y=c_{1}\sin(\sqrt{\lambda}x)+c_{2}\cos(\sqrt{\lambda}x) ##.

This assumes ## \lambda \neq 0 ##. What happens if ## \lambda = 0 ##? Do you get a non-zero solution for ## y ##?

For ## \lambda = k^2 > 0 ##, you know from the condition ## y(0) = 0 ## that ## c_2 = 0 ##. That leaves you with the condition at ## y(1) ##, which takes the form ## c_1 f(k) = 0 ##. We know that ## c_1 \neq 0 ##, so that requires ## f(k) = 0 ##. The condition ## C[c_1\sin kx] = 1 ## then gives you ## c_1 ## in terms of ## k ##.
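
As a quick worked check of the ## \lambda = 0 ## case, using part b)'s values ## \alpha=\beta=\gamma=1 ## as an example: the equation reduces to ## y''=0 ##, so ## y=ax+b ##. Then ## y(0)=0 ## gives ## b=0 ##, and the condition at ## x=1 ## becomes ## (1-0)\,a + a = 2a = 0 ##, so ## a=0 ## and only the trivial solution remains.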
 
pasmith said:
Assuming your work is correct, you have ## (\alpha - \gamma\lambda)y(1)h(1) + \int_0^1 (\beta y'h' - \lambda w y h) \,dx = 0 ##. What is the next step in all of these problems? Integrate ## y'h' ## by parts.

This assumes ## \lambda \neq 0 ##. What happens if ## \lambda = 0 ##? Do you get a non-zero solution for ## y ##?

For ## \lambda = k^2 > 0 ##, you know from the condition ## y(0) = 0 ## that ## c_2 = 0 ##. That leaves you with the condition at ## y(1) ##, which takes the form ## c_1 f(k) = 0 ##. We know that ## c_1 \neq 0 ##, so that requires ## f(k) = 0 ##. The condition ## C[c_1\sin kx] = 1 ## then gives you ## c_1 ## in terms of ## k ##.
So for part b), I've got ## y=A\sin(\sqrt{\lambda}x)+B\cos(\sqrt{\lambda}x) ##, where ## A, B ## are constants. The condition ## y(0)=0 ## gives ## B=0 ## and the boundary condition at ## x=1 ## gives ## y(1)=0\implies 0=A\sin(\sqrt{\lambda})\implies \sin(\sqrt{\lambda})=0 ## since ## A\neq 0 ##. This means ## \sqrt{\lambda}=n\pi\implies y=A\sin(n\pi x) ##.
Thus, the constraint gives ## 1=\int_{0}^{1}[A\sin(n\pi x)]^2dx\implies 1=A^2\int_{0}^{1}\sin^2(n\pi x)dx\implies A=\sqrt{2} ##.
Hence, ## y=\sqrt{2}\sin(\sqrt{\lambda}x) ##.
Is this the correct stationary path?
 
Try again. The condition at ## x = 1 ## is ## (1 - \lambda)y(1) + y'(1) = 0 ##.
 
pasmith said:
Try again. The condition at ## x = 1 ## is ## (1 - \lambda)y(1) + y'(1) = 0 ##.
I still don't get this one. How does ## (1-\lambda)y(1)+y'(1)=0 ## determine the other constant ## A ##?
 
You are dealing with an eigenvalue problem. The condition at ## x = 1 ## tells you that either ## A = 0 ##, which is the trivial solution, or else ## \lambda ## must satisfy a certain condition. Then the constraint ## C[y] = 1 ## determines ## A ##.
 
pasmith said:
You are dealing with an eigenvalue problem. The condition at ## x = 1 ## tells you that either ## A = 0 ##, which is the trivial solution, or else ## \lambda ## must satisfy a certain condition. Then the constraint ## C[y] = 1 ## determines ## A ##.
How can ## \lambda ## satisfy a certain condition? And how do I find the constant ## A ##?
 
pasmith said:
You are dealing with an eigenvalue problem. The condition at ## x = 1 ## tells you that either ## A = 0 ##, which is the trivial solution, or else ## \lambda ## must satisfy a certain condition. Then the constraint ## C[y] = 1 ## determines ## A ##.
Since the constant ## B=0 ##, we have ## y=A\sin(\sqrt{\lambda}x) ## and using the boundary condition at ## x=1 ## gives ## (1-\lambda)y(1)+y'(1)=0\implies (1-\lambda)A\sin(\sqrt{\lambda})+A\sqrt{\lambda}\cos(\sqrt{\lambda})=0 ##. But then this means ## \sin(\sqrt{\lambda})=0\implies \sqrt{\lambda}=n\pi ## for ## n\neq 0 ## and ## \cos(\sqrt{\lambda})=0\implies \sqrt{\lambda}=(n+\frac{1}{2})\pi ## for some ## n\in\mathbb{Z} ##. What's wrong here?
 
Math100 said:
Since the constant ## B=0 ##, we have ## y=A\sin(\sqrt{\lambda}x) ## and using the boundary condition at ## x=1 ## gives ## (1-\lambda)y(1)+y'(1)=0\implies (1-\lambda)A\sin(\sqrt{\lambda})+A\sqrt{\lambda}\cos(\sqrt{\lambda})=0 ##. But then this means ## \sin(\sqrt{\lambda})=0\implies \sqrt{\lambda}=n\pi ## for ## n\neq 0 ## and ## \cos(\sqrt{\lambda})=0\implies \sqrt{\lambda}=(n+\frac{1}{2})\pi ## for some ## n\in\mathbb{Z} ##. What's wrong here?
What's wrong is your assumption that the ##\sin## and ##\cos## terms must vanish individually. Following the suggestion of @pasmith, set ##\lambda=k^2## and write your boundary (eigenvalue) condition as ##\left(1-k^{2}\right)\sin k+k\cos k=0##. Beyond the trivial solution ##k=0##, a plot of the function on the left-hand side suggests that the condition has an infinity of roots, the first few of which are (using Mathematica): ##k_1=1.20779,k_2=3.44824,k_3=6.44095,k_4=9.53048,k_5=12.6458##. The squares of these are the first five allowed values of the Lagrange multiplier ##\lambda## in your variational problem part b). All that remains to do now is to plug your eigensolutions ##y_\lambda (x)## into ##C[y_\lambda]=1## and calculate the normalization factors ##A_\lambda##.
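
For anyone who wants to reproduce those roots without Mathematica, here is a minimal sketch in Python; SciPy's `brentq` is just one choice of root finder (an assumption, not the tool used above). It scans for sign changes of ## f(k)=(1-k^2)\sin k+k\cos k ## on a grid and refines each bracketed root:

```python
import numpy as np
from scipy.optimize import brentq

# Eigenvalue condition from part b): (1 - k^2) sin k + k cos k = 0,
# obtained from (1 - lambda) y(1) + y'(1) = 0 with y = A sin(kx), lambda = k^2.
def f(k):
    return (1.0 - k**2) * np.sin(k) + k * np.cos(k)

# Scan a grid for sign changes and refine each bracketed root.
grid = np.linspace(0.1, 13.0, 2000)   # start past k = 0 to skip the trivial root
roots = [brentq(f, a, b) for a, b in zip(grid[:-1], grid[1:]) if f(a) * f(b) < 0]

for n, k in enumerate(roots, start=1):
    print(f"k_{n} = {k:.5f}, lambda_{n} = k^2 = {k*k:.5f}")
```

This should reproduce ## k_1\approx 1.20779 ##, ## k_2\approx 3.44824 ##, and so on, matching the values quoted above.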
 
renormalize said:
What's wrong is your assumption that the ##\sin## and ##\cos## terms must vanish individually. Following the suggestion of @pasmith, set ##\lambda=k^2## and write your boundary (eigenvalue) condition as ##\left(1-k^{2}\right)\sin k+k\cos k=0##. Beyond the trivial solution ##k=0##, a plot of the function on the left-hand side suggests that the condition has an infinity of roots, the first few of which are (using Mathematica): ##k_1=1.20779,k_2=3.44824,k_3=6.44095,k_4=9.53048,k_5=12.6458##. The squares of these are the first five allowed values of the Lagrange multiplier ##\lambda## in your variational problem part b). All that remains to do now is to plug your eigensolutions ##y_\lambda (x)## into ##C[y_\lambda]=1## and calculate the normalization factors ##A_\lambda##.
I don't understand. If the condition has an infinity of roots, then how are we supposed to plug those ## k ## values into the eigensolutions ## y_\lambda (x) ## and then into ## C[y_\lambda]=1 ## in order to find the constant ## A ##?
 
Math100 said:
I don't understand. If the condition has an infinity of roots, then how are we supposed to plug those ## k ## values into the eigensolutions ## y_\lambda (x) ## and then into ## C[y_\lambda]=1 ## in order to find the constant ## A ##?
It's exactly analogous to what you would do for the simple boundary/eigenvalue condition ##\sin k_n=0##. Of course, for that case you know that the infinity of roots are explicitly given by ##k_n=n\pi##, where ##n## is any natural number, and you use those values to express the normalization ##A_n## as a function of ##n\pi##. But what if you didn't know that explicit solution? You'd simply replace ##n\pi## in ##A_n## by ##k_n##, along with the statement that the allowed values of ##k_n## are the roots of ##\sin k_n=0##. (And perhaps display one or more of the eigenvalues that you find numerically.) Just do the same thing for your variational problem.
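
As a concrete sketch of that final normalization step for part b) (with ## w=1 ##, ## \gamma=1 ## and writing ## y_n=A_n\sin(k_n x) ##):
$$C[y_n]=A_n^2\sin^2 k_n + A_n^2\int_0^1\sin^2(k_n x)\,dx = A_n^2\left(\sin^2 k_n + \frac{1}{2} - \frac{\sin 2k_n}{4k_n}\right) = 1,$$
so
$$A_n=\left(\sin^2 k_n + \frac{1}{2} - \frac{\sin 2k_n}{4k_n}\right)^{-1/2},$$
where ## k_n ## is the ## n ##-th positive root of ## (1-k^2)\sin k + k\cos k = 0 ## and ## \lambda_n = k_n^2 ##.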
 
