Energy calculation using Parseval's theorem

In summary, the conversation discusses the relationship between energy and power in Parseval's theorem, which states that the energy of a signal can be calculated by integrating the square of the signal over time. The conversation also explores using the Fourier transform to calculate the energy of a signal, and differentiates between energy signals (which have finite total energy but zero average power) and power signals (which have infinite total energy but finite average power).
  • #1
Jiho

Homework Statement


In Parseval's theorem, the energy equals the integral of x(t) squared when x(t) represents a voltage.
I wonder: if x(t) represents power, does the integral of x(t) dt equal the energy? Because 'power multiplied by time' means energy.

Homework Equations


(attached image: upload_2019-1-22_16-44-2.png)


The Attempt at a Solution

 

  • #2
Yes
 
  • #3
Then, in Parseval's theorem, E = integral of x(t) squared dt = integral of X(f) squared df. But if x(t) represents power, then 'integral of x(t) dt' = energy, as you said.

I wonder, when x(t) represents power, whether 'integral of x(t) dt' = 'integral of X(f) df' or not.
 
  • #4
You can clearly see the square in the expressions. Why ask about x itself in a convoluted 'if' question?
Use ##\LaTeX## and clear formulas.
And mind the dimensions:
Jiho said:
In Parseval's theorem, energy equals the integral of x(t) squared when x(t) represents a voltage
is incorrect: the dimension of power is not V².
 
  • #5
Sorry for my expression. I didn't know how to write LaTeX.

When x(t) represents voltage,
E = ## \int x(t)^2 \, dt ## = ## \int |X(f)|^2 \, df ## by Parseval's theorem,

But when x(t) represents power, I wonder if
E = ## \int x(t) \, dt ## = ## \int X(f) \, df ## or not.

X(f) denotes the continuous Fourier transform of x(t).
 
  • #6
Jiho said:
When x(t) represents voltage,
E = ## \int x(t)^2 \, dt ## = ## \int |X(f)|^2 \, df ## by Parseval's theorem,
Yes. In that case, though, E is not the physical energy, because ##x^2## is not a power.
Jiho said:
But when x(t) represents power, I wonder if
E = ## \int x(t) \, dt ## = ## \int X(f) \, df ## or not.

You want to know whether the Fourier transform of the power, integrated over all frequencies, equals the energy.
I think not: in general ## \int x(t) \, dt \ne \int X(f) \, df ##, since ## \int x(t) \, dt = X(0) ## while ## \int X(f) \, df = x(0) ## -- example: a sine wave.
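A quick numerical sanity check of this point (a sketch using numpy, not part of the original thread; the random test signal is an arbitrary choice). In numpy's DFT convention, the plain (unsquared) sum of the samples equals the DC bin X[0], while the plain sum of the spectrum equals N·x[0], so the two unsquared sums agree only in special cases:

```python
import numpy as np

# Unsquared sums, discretely: sum(x) equals the DC bin X[0], while
# sum(X) equals N * x[0] -- so the two agree only in special cases.
rng = np.random.default_rng(0)    # arbitrary reproducible test signal
x = rng.normal(size=64)
X = np.fft.fft(x)

sum_time = np.sum(x)              # equals X[0]
sum_freq = np.sum(X)              # equals N * x[0]

print(np.isclose(sum_time, X[0].real))        # True
print(np.isclose(sum_freq.real, 64 * x[0]))   # True
# For a generic signal, sum_time and sum_freq are unrelated numbers:
print(sum_time, sum_freq.real)
```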
 
  • #7
Hi.

$$ \textbf{Definition of Energy and power of a signal} \tag*{} $$
This derives from the definitions given in pure physics and from electrical circuits.
$$ \text{Energy} = \text{Power} \cdot \text{Time} \tag*{} $$
$$ \text{Power} = \dfrac{\text{Work done (Energy) } }{\text{Time taken} } \tag*{} $$
The energy and power of a signal are related by the above equations.
In circuit theory, the instantaneous power through an element (given that the lumped-matter constraints are obeyed) is:
$$ p(t) = i(t) \cdot v(t) = \dfrac{v^{2}(t)}{R} = i^{2}(t) \cdot R = v^{2}(t) \cdot G \tag*{} $$
The instantaneous power of any function is thus defined against a reference resistance of 1 ohm for all signals as:
$$ \text{P}_{inst} = f^{2}(t) \tag*{} $$
The total energy of a function is defined as:
$$ \text{E} = \displaystyle \int_{-\infty}^{\infty} f^{2}(t) \,\,\,\, \, \text{d}t \tag*{} $$
$$ \text{E} = \displaystyle \sum_{n = -\infty}^{\infty} x^{2}[n] \tag*{} $$
The average power of a function is given by:
$$ \text{P}_{avg} = \displaystyle \lim_{T \to \infty} \dfrac{1}{2T} \cdot \text{E} = \displaystyle \lim_{T \to \infty} \dfrac{1}{2T} \displaystyle \int_{-T}^{T} f^{2}(t) \,\,\,\, \, \text{d}t \tag*{} $$
$$ \text{P}_{avg} = \displaystyle \lim_{n \to \infty} \dfrac{1}{2n + 1} \displaystyle \sum_{k = -n}^{n} x^{2}[k] \tag*{} $$
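These definitions are easy to check numerically (a sketch using numpy, not part of the original post; the decaying exponential is an arbitrary test signal). For ##f(t) = e^{-t}## for ##t \ge 0##, the energy integral is ##\int_0^\infty e^{-2t}\,dt = 1/2##, and the average power tends to zero:

```python
import numpy as np

# Riemann-sum check: f(t) = exp(-t) for t >= 0 is an energy signal,
# with E = 1/2 and average power E/(2T) -> 0 as the window T grows.
dt = 1e-3
t = np.arange(0.0, 50.0, dt)     # 50 s is plenty: exp(-100) is negligible
f = np.exp(-t)

E = np.sum(f**2) * dt            # Riemann sum for the energy integral
T = t[-1]
P_avg = E / (2 * T)              # finite-window estimate of average power

print(E)                         # ≈ 0.5  -> finite energy
print(P_avg)                     # ≈ 0.005, tends to 0 as T grows
```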
$$ \textbf{There are signals whose power has no physical significance} $$
For example, let there be an average total cost function of a firm:
$$ c(x) = { ( x - 120) }^{2} + 10 $$
The units of this function are monetary, and money cannot be multiplied by another unit to produce watts. But we can still use the definition above to find the "power" of this signal, because the point is analysing signals, from what we were told.
If there is a power function ## p(t) ##, then we can still find the power and energy content of this signal by applying the definitions above; to me, and from the books I read, it is as simple as that. It is about the ways we can characterise and analyse signals. Signals exist for which power has no physical meaning, but it is still very important to be able to classify signals as power or energy signals, as you will see in this post.
$$ \textbf{Parsevals relation and the Fourier transform} \tag*{} $$
A possible alternative for finding the total energy of such signals is Parseval's relation, which states that if f(t) has a Fourier transform ##X(j \omega)##, then:
$$ E = \displaystyle \int_{-\infty}^{\infty} f^{2}(t) \,\,\,\, \text{dt} = \dfrac{1}{2 \pi} \displaystyle \int_{-\infty}^{\infty} {| X(j \omega)| }^{2} \,\,\,\, \text{d}\omega \tag*{} $$
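The discrete analogue of this relation can be verified directly (a sketch using numpy's FFT conventions, not part of the original post; the random signal is an arbitrary test input). For numpy's unnormalised DFT, Parseval reads ##\sum |x[n]|^2 = \frac{1}{N}\sum |X[k]|^2##, with the ##1/N## playing the role of ##1/2\pi##:

```python
import numpy as np

# Discrete Parseval check: sum |x[n]|^2 == (1/N) * sum |X[k]|^2
# for numpy's unnormalised FFT convention.
rng = np.random.default_rng(1)
x = rng.normal(size=128)         # arbitrary real test signal
X = np.fft.fft(x)

E_time = np.sum(np.abs(x)**2)
E_freq = np.sum(np.abs(X)**2) / len(x)   # the 1/N mirrors the 1/(2*pi)

print(np.isclose(E_time, E_freq))        # True
```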
$$ \textbf{Energy and power signals} $$
A signal that has a finite total energy E has zero average power over all time. This is called an energy signal. Intuitively, energy signals start somewhere and they die out somewhere.

A signal that has a finite total average power P has infinite total energy over all time. This is called a power signal. Power signals have infinite total energy because for every moment of time from minus infinity to infinity, they are delivering some average power.

It is of critical importance to be able to classify signals as power or energy.
$$ \textbf{Importance of energy signals} \tag*{} $$
If a signal is not an energy signal, then its Fourier transform does not exist in the ordinary sense. If a signal is an energy signal, then this is a good start: its Fourier transform might exist, subject to other conditions. An LTI system is BIBO stable if its impulse response is absolutely integrable.
If an LTI system is not BIBO stable, then some bounded input to that system produces an output that grows without bound.
The Laplace transform exists for many power signals. The ordinary Fourier transform is mostly limited to energy signals. Apart from that, being able to classify signals as power or energy signals lets you understand them intuitively.

$$ \textbf{Addendum : Sinusoids whose Fourier transform exists}$$
Sines and cosines are the only functions I know of whose Fourier transforms exist (as distributions, via Dirac impulses) despite them not being energy signals.
$$ \textbf{Intuition of putting power or energy signals into systems} $$
Well, let's say you have some physical system: an electric circuit, specifically an RC circuit. The resistor and capacitor have voltage and power ratings. If you apply a power signal as your input and don't turn it off, the capacitor and resistor will eventually reach voltages exceeding their safe operating values.

In another system, say one modelling a town's financial response, if you insert a spender whose spending is a power signal (it never dies out), that person or entity will blow up the town's financial response. Sorry for these crude analogies, but there is an intuition behind classifying signals the way we do.
$$ \textbf{Summary} $$
The power of a signal derives from a reference resistance of one ohm; however, for many signals the power and energy have no physical significance. This is still fine, as the concepts of power and energy have been generalised to all signals that give us numbers. I do not know how to define power for functions that return objects other than numbers.
It is important to know whether a signal is an energy signal or a power signal. If it is a power signal, it has infinite energy, i.e. it isn't dying out; if it is an energy signal, it does die out. I've glossed over some issues and taken some liberties, but the most important take-away is that, despite deriving from the power and energy definitions of physics, power and energy content exist for all single-variable functions whose domain and codomain are the complex numbers or a subset.
 
  • #8
Nice post by @AVBs2Systems. I just want to emphasise one point of his/her post.
In mathematics we can define the energy of a complex-valued function ##f(t)## as ##E=\int_{-\infty}^{+\infty}|f(t)|^2dt##. We can call this E the "mathematical energy" of ##f(t)##, and it might or might not coincide with the notion of physical energy, depending on what ##f(t)## represents. If, for example, ##f(t)## is the current through some resistor R, then this mathematical energy E coincides (up to the constant R) with the physical energy dissipated in the resistor due to the current ##I(t)=f(t)##, since the physical energy would be ##W=\int I^2(t)R\,dt##. If, however, ##f(t)## is something else, for example ##f(t)=v(t)I(t)## where ##v(t)## and ##I(t)## are the voltage across and current through a resistor R, then the mathematical energy E does not coincide with the physical energy dissipated in the resistor. And if ##f(t)## is something like the money that something costs, there is no way to relate it to a physical notion of energy.
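The resistor case can be made concrete with a small numerical sketch (using numpy; not part of the original post, and the current waveform and R = 10 Ω are arbitrary illustrative choices). For ##I(t) = e^{-t}## (##t \ge 0##), the mathematical energy is ##E = \int I^2\,dt = 1/2## while the physical dissipated energy is ##W = R \cdot E##:

```python
import numpy as np

# "Mathematical" vs physical energy: for current I(t) = exp(-t), t >= 0,
# through a (hypothetical) R = 10 ohm resistor, E = 1/2 in units of A^2*s,
# while the dissipated physical energy is W = R * E joules.
dt = 1e-4
t = np.arange(0.0, 30.0, dt)
I = np.exp(-t)
R = 10.0                         # hypothetical resistance in ohms

E = np.sum(I**2) * dt            # ≈ 0.5 -- dimensionally A²·s, not joules
W = R * E                        # ≈ 5.0 J, the actual dissipated energy

print(round(E, 3), round(W, 2))  # 0.5 5.0
```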
 
  • #9
Hi.

I made some mistakes in the previous post, and I want to add another class of signals besides energy and power signals: signals that are neither. This also has a physical meaning.

But first, I would like to clarify one thing about the signal fed into the RC circuit example.
I am not sure whether a signal can be a power signal while being unbounded; my claim is that unbounded signals are not power signals, i.e. all power signals are bounded.

So, to clarify:
$$ \textbf{Further inquiry required: Claim: All power signals are bounded} $$
An example of a bounded power signal: the unit step function, ## \big| \epsilon(t) \big| \le 1 ##
$$
\textbf{P} \Big[ \epsilon(t) \Big] = \displaystyle \lim_{T \to \infty} \dfrac{1}{2T} \displaystyle \int_{0}^{T} 1^{2} \,\,\,\, \text{dt} = \displaystyle \lim_{T \to \infty} \Bigg( \dfrac{1}{2T} \cdot \Big[ T - 0 \Big] \Bigg) = \dfrac{1}{2}
$$

But I am not sure whether there is an unbounded power signal, as such a signal could have infinite energy as well as infinite power. I think all power signals are bounded. The Dirac pulse falls into the third class of signals, from the internet searches I carried out.
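The 1/2 above is easy to reproduce numerically (a sketch using numpy, not part of the original post): for the unit step, the windowed average power is exactly ##T/(2T) = 1/2## for every window:

```python
import numpy as np

# Finite-window average power of the unit step:
# P(T) = (1/2T) * integral of eps(t)^2 over [-T, T] = T/(2T) = 1/2 for all T.
for T in (10.0, 100.0, 1000.0):
    dt = T / 1e5
    t = np.arange(-T, T, dt)
    eps = (t >= 0).astype(float)         # the unit step sampled on the grid
    P = np.sum(eps**2) * dt / (2 * T)    # windowed average power
    print(round(P, 3))                   # 0.5 each time
```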

Now the third class of signals:
$$ \textbf{There exists a third class of signals: Neither energy nor power signals} $$
The unit ramp function, is an example of this:
$$\text{R}(t) = t \epsilon(t) = \begin{cases} t & \text{t $ \ge 0 $ } \\ 0 & \text{t $ \lt 0 $ } \end{cases} $$
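For the ramp, both the windowed energy and the windowed average power diverge as the window grows, which is what makes it neither an energy nor a power signal. A quick numerical sketch (using numpy, not part of the original post):

```python
import numpy as np

# The ramp R(t) = t*eps(t): its windowed energy E(T) ~ T^3/3 and its
# windowed average power P(T) = E(T)/(2T) ~ T^2/6 both diverge with T.
for T in (10.0, 100.0, 1000.0):
    dt = T / 1e5
    t = np.arange(0.0, T, dt)
    E = np.sum(t**2) * dt            # ≈ T³/3, grows without bound
    P = E / (2 * T)                  # ≈ T²/6, also grows without bound
    print(E, P)
```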
$$ \textbf{All periodic signals are power signals, and bounded, absolutely integrable signals are energy signals} $$
A signal that repeats its pattern over some fixed interval does so for all time, so it has been delivering some average power forever and hence has infinite energy (it isn't dying out). But just because a signal has a finite average power does not make it periodic; it may be a random waveform with no repeating pattern.
$$ f( t + k\tau) = f(t) \,\,\,\,\, k \in \mathbb{Z} \,\,\,\, \tau \in \mathbb{R} \implies \textbf{P}_{avg} \in \mathbb{R}_{+} $$

A bounded signal that is absolutely integrable has finite energy. (The converse fails in general: the sinc function has finite energy but is not absolutely integrable.)
$$ \big| f(t) \big| \le M \,\,\,\, \text{and} \,\,\,\, \displaystyle \int_{-\infty}^{\infty} \Big |f(t) \Big| \,\,\,\, \text{dt} \in \mathbb{R}_{+} \implies \textbf{E} \,\,\,\, \in \mathbb{R}_{+} $$
$$\textbf{Summary} $$
I needed to correct that mistake first: a power signal may or may not be bounded; I am not sure about this, but I believe all power signals are bounded. Here is a summary of the other properties:
$$ \begin{array}{|c|c|}
\hline \textbf{Energy signal }& \textbf{Properties} \\
\hline \text{Average power} & \text{Zero average power.} \\
\hline \text{Integrability} & \text{Bounded, absolutely integrable signals are energy signals.} \\
\hline
\end{array} $$

$$ \begin{array}{|c|c|}
\hline \textbf{Power signal }& \textbf{Properties} \\
\hline \text{Infinite energy} & \text{Delivers nonzero average power for all time, so the total energy diverges} \\
\hline \text{Periodicity} & \text{Periodicity implies a power-signal classification} \\
\hline \text{R.M.S. value relation} & \text{The average power is the square of the R.M.S. value} \\
\hline
\end{array} $$

$$ \begin{array}{|c|c|}
\hline \textbf{Neither E nor P signals }& \textbf{Properties} \\
\hline \text{Infinite power} & \text{Has infinite average power} \\
\hline \text{Energy} & \text{Has infinite energy} \\
\hline \text{Divergent} & \text{The windowed average power diverges to infinity} \\
\hline
\end{array} $$
The total average power can be used to classify the signal completely, as far as these properties are concerned:
$$ \begin{array}{|c|c|}
\hline \textbf{Power classification of all signals}& \textbf{Properties} \\
\hline \text{0 avg power} & \text{ Finite energy, is an energy signal} \\
\hline \text{Finite average power} & \text{Has infinite energy, is a power signal} \\
\hline \text{Infinite average power} & \text{Neither E nor P signal} \\
\hline
\end{array} $$
Most of the signals encountered in real life are energy signals; they switch off somewhere. Also, power and energy are usually analysed not over all time but over a shorter interval.
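The classification in the last table can be sketched numerically (a rough heuristic in numpy, not part of the original post; the window size and the ratio thresholds 0.75 and 1.5 are arbitrary choices, not a standard test): compare the average power over a window and over a window twice as wide. For an energy signal it halves, for a power signal it stays put, and for a "neither" signal it grows.

```python
import numpy as np

def classify(f, T=1000.0, n=400001):
    """Heuristic: compare average power over |t| <= T/2 and |t| <= T."""
    P1 = np.mean(np.abs(f(np.linspace(-T / 2, T / 2, n))) ** 2)
    P2 = np.mean(np.abs(f(np.linspace(-T, T, n))) ** 2)
    ratio = P2 / P1
    if ratio < 0.75:
        return "energy signal"    # P(T) ~ E/(2T) halves when the window doubles
    if ratio > 1.5:
        return "neither"          # the windowed power itself diverges
    return "power signal"         # the windowed power settles to a constant

print(classify(lambda t: np.exp(-np.abs(t))))   # energy signal
print(classify(np.sin))                          # power signal
print(classify(lambda t: t))                     # neither
```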
 

1. What is Parseval's theorem?

Parseval's theorem is a mathematical principle that states the total energy of a signal can be calculated by taking the sum of the squared magnitudes of its Fourier coefficients. It is commonly used in signal processing and engineering to analyze the frequency content of a signal.

2. How is Parseval's theorem used to calculate energy?

To calculate energy using Parseval's theorem, the signal must first be transformed into its Fourier representation. Then, the squared magnitudes of each Fourier coefficient are summed together to obtain the total energy of the signal.
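For a periodic signal, this can be illustrated numerically (a sketch using numpy, not part of the original FAQ): the average power over one period equals the sum of the squared magnitudes of the Fourier-series coefficients. For ##f(t) = \cos t##, the coefficients are ##c_{\pm 1} = 1/2##, so the sum is ##1/4 + 1/4 = 1/2##, the familiar average power of a cosine:

```python
import numpy as np

# Fourier-series Parseval for f(t) = cos(t): average power over one period
# equals sum |c_k|^2 = |1/2|^2 + |1/2|^2 = 1/2.
N = 1024
t = np.linspace(0.0, 2 * np.pi, N, endpoint=False)
f = np.cos(t)

c = np.fft.fft(f) / N             # DFT bins recover the c_k for this f
P_time = np.mean(f**2)            # average power over one period
P_freq = np.sum(np.abs(c)**2)     # Parseval: sum of |c_k|^2

print(np.isclose(P_time, 0.5))    # True
print(np.isclose(P_freq, 0.5))    # True
```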

3. What types of signals can be analyzed using Parseval's theorem?

Parseval's theorem can be applied to any signal that can be represented as a sum of sinusoidal components, such as audio signals, electrical signals, and even images.

4. Are there any limitations to using Parseval's theorem for energy calculation?

One limitation of Parseval's theorem is that its classical (Fourier-series) form assumes the signal is periodic, i.e. that it repeats itself for all time, which may not hold in real-world applications. For non-periodic energy signals, the Fourier-transform (Plancherel) form of the theorem is used instead.

5. How is Parseval's theorem related to the conservation of energy?

Parseval's theorem reflects a conservation principle: the total energy of a signal is the same whether it is computed in the time domain or in the frequency domain, since the Fourier transform merely re-expresses the signal rather than creating or destroying energy.
