Just a quick question
A delta potential is given by -Vδ(x-a). Suppose we write the wave function like this:
For E>0
u(x) = Aexp(ikx) + A'exp(-ikx) for x<a
u(x) = Bexp(ikx) + B'exp(-ikx) for x>a
do I have to impose A' = B = 0 so that u(x) does not diverge as x → ±∞?
The problem is that the source where I read this imposes that condition only for E < 0.
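For comparison, here is my reading of where that E < 0 condition comes from (an assumption based on the usual textbook treatment, not stated explicitly above): for E < 0 the wavenumber becomes imaginary, k = iκ with κ = √(-2mE)/ħ, so the same ansatz turns into real exponentials, and normalizability forces two coefficients to vanish:

```latex
% Bound-state (E<0) ansatz, with \kappa = \sqrt{-2mE}/\hbar real and positive:
u(x) = A\,e^{\kappa x} + A'\,e^{-\kappa x} \quad (x < a), \qquad
u(x) = B\,e^{\kappa x} + B'\,e^{-\kappa x} \quad (x > a).
% Here e^{-\kappa x} blows up as x \to -\infty and e^{+\kappa x} blows up as
% x \to +\infty, so normalizability requires A' = B = 0.
```

For E > 0, by contrast, both exp(±ikx) with real k have constant modulus, so neither term diverges at infinity.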
Thanks!