MrPacane:
Hi everyone,
In my signals assignment, I'm asked to show that, for a continuous-time, real-valued signal x(t):
Ex_even = Ex_odd = 0.5 * Ex
So here's what I've done:
Ex_even = ∫ |(x(t) + x(-t))/2|² dt
Ex_even = 0.25 * ∫ [ x(t)² + 2·x(t)·x(-t) + x(-t)² ] dt
Ex_even = 0.5 * [ 0.5·∫ x(t)² dt + ∫ x(t)·x(-t) dt + 0.5·∫ x(-t)² dt ]
Now, for the claimed result to hold, the cross term ∫ x(t)·x(-t) dt (integrated from -∞ to +∞) must go to zero, but I don't understand why it should. Could someone explain it to me?
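To make the quantities concrete, here is a small numerical sketch of the decomposition. The signal choice is mine, not from the assignment: a causal decaying exponential x(t) = e^(-t) for t ≥ 0, zero otherwise. It computes Ex, Ex_even, Ex_odd, and the cross term on a discrete, symmetric time grid:

```python
import numpy as np

# Symmetric time grid around t = 0 (odd point count so t = 0 is a sample)
t = np.linspace(-10.0, 10.0, 200001)
dt = t[1] - t[0]

# Example signal (my choice): causal decaying exponential
x = np.where(t >= 0, np.exp(-t), 0.0)
x_rev = x[::-1]  # x(-t); valid because the grid is symmetric about 0

# Even/odd decomposition
x_even = 0.5 * (x + x_rev)
x_odd = 0.5 * (x - x_rev)

# Energies via a simple Riemann sum
Ex = np.sum(x**2) * dt            # total energy, analytically 0.5
Ex_even = np.sum(x_even**2) * dt  # even-part energy
Ex_odd = np.sum(x_odd**2) * dt    # odd-part energy
cross = np.sum(x * x_rev) * dt    # the cross term in question

print("Ex      =", Ex)       # approx. 0.5
print("Ex_even =", Ex_even)  # approx. 0.25
print("Ex_odd  =", Ex_odd)   # approx. 0.25
print("cross   =", cross)    # approx. 0 for this causal signal
```

Note that Ex_even + Ex_odd = Ex holds exactly by algebra, while Ex_even = Ex_odd = 0.5·Ex additionally needs the cross term to vanish, which is what happens here since x(t) and x(-t) do not overlap (except at t = 0).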
Thanks!