Hi.
$$ \textbf{Definition of Energy and power of a signal} \tag*{} $$
This derives from the definitions given in pure physics and from electrical circuits.
$$ \text{Energy} = \text{Power} \cdot \text{Time} \tag*{} $$
$$ \text{Power} = \dfrac{\text{Work done (Energy) } }{\text{Time taken} } \tag*{} $$
The energy and power of a signal are related by the equations above.
In circuit theory, the instantaneous power dissipated in an element, provided the lumped-matter constraints are obeyed, is given by:
$$ p(t) = i(t) \cdot v(t) = \dfrac{v^{2}(t)}{R} = i^{2}(t) \cdot R = v^{2}(t) \cdot G \tag*{} $$
The instantaneous power of any signal f(t) is thus defined against a reference resistance of 1 ohm as:
$$ \text{P}_{inst} = f^{2}(t) \tag*{} $$
The total energy of a function is defined as:
$$ \text{E} = \displaystyle \int_{-\infty}^{\infty} f^{2}(t) \,\,\,\, \, \text{d}t \tag*{} $$
For a discrete-time signal x[n], the total energy is:
$$ \text{E} = \displaystyle \sum_{n = -\infty}^{\infty} x^{2}[n] \tag*{} $$
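As a quick numerical sanity check, here is a minimal Python sketch using the hypothetical example signal f(t) = e^(-|t|), whose total energy is analytically 1:

```python
import numpy as np

# Approximate E = integral of f^2(t) dt for f(t) = e^(-|t|) (analytically E = 1).
dt = 1e-4
t = np.arange(-50.0, 50.0, dt)   # wide window; f has decayed to ~0 at the edges
f = np.exp(-np.abs(t))
E = np.sum(f**2) * dt            # Riemann-sum approximation of the energy integral
print(E)                         # close to 1.0
```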
The average power of a function is given by:
$$ \text{P}_{avg} = \displaystyle \lim_{T \to \infty} \dfrac{1}{2T} \displaystyle \int_{-T}^{T} f^{2}(t) \,\, \text{d}t \tag*{} $$
$$ \text{P}_{avg} = \displaystyle \lim_{N \to \infty} \dfrac{1}{2N + 1} \displaystyle \sum_{n = -N}^{N} x^{2}[n] \tag*{} $$
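This limit can be approximated numerically. A minimal sketch, using cos(t) as an example signal, whose average power is analytically 1/2:

```python
import numpy as np

# Estimate P_avg = lim (1/2T) * integral over [-T, T] of f^2(t) dt for f(t) = cos(t).
# Analytically P_avg = 1/2; the window average settles near that value.
dt = 1e-3
for T in (10, 100, 1000):
    t = np.arange(-T, T, dt)
    P = np.sum(np.cos(t)**2) * dt / (2 * T)   # windowed average power
    print(T, P)
```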
$$ \textbf{There are signals whose power has no physical significance} $$
For example, let there be an average total cost function of a firm:
$$ c(x) = { ( x - 120) }^{2} + 10 $$
The units of this function are monetary, and money cannot be multiplied by some other unit to produce watts. Even so, we can still apply the definition of power to this signal, because, from what we were taught, this is about analysing signals rather than measuring a physical quantity.
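As a sketch of what such a computation looks like (the window sizes below are arbitrary choices), note that for this particular c(x) both the energy and the windowed average power actually diverge as the window grows, so it turns out to be neither an energy signal nor a power signal:

```python
import numpy as np

# Treat the cost function c(x) = (x - 120)^2 + 10 as a signal and compute its
# energy over [-X, X] for growing X. Both the energy and the window-averaged
# power keep growing, so c is neither an energy signal nor a power signal.
dx = 0.01
for X in (100, 1000, 10000):
    x = np.arange(-X, X, dx)
    c = (x - 120.0)**2 + 10.0
    E = np.sum(c**2) * dx        # energy over the window [-X, X]
    print(X, E, E / (2 * X))     # both columns diverge as X grows
```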
Given any function:
$$ p(t) \tag*{} $$
we can still find its power and energy content by taking the defining integrals and solving. To me, and from the books I have read, it is as simple as that: it is about signals and the ways we can characterise and analyse them. Signals exist whose power has no physical meaning, but it is still very important to be able to classify signals as power signals or energy signals, as you will see in this post.
$$ \textbf{Parseval's relation and the Fourier transform} \tag*{} $$
A possible alternative for finding the total energy of such signals is Parseval's relation, which states that if f(t) has the Fourier transform F(jω), then:
$$ E = \displaystyle \int_{-\infty}^{\infty} f^{2}(t) \,\, \text{d}t = \dfrac{1}{2 \pi} \displaystyle \int_{-\infty}^{\infty} {| F(j \omega)| }^{2} \,\, \text{d}\omega \tag*{} $$
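Parseval's relation has a discrete analogue, sum of x²[n] equals (1/N) times the sum of |X[k]|², which is easy to verify with an FFT. A minimal Python sketch:

```python
import numpy as np

# Discrete analogue of Parseval's relation: for a length-N sequence,
# sum x^2[n] = (1/N) * sum |X[k]|^2, where X = FFT(x).
rng = np.random.default_rng(0)
x = rng.standard_normal(256)               # arbitrary finite-energy sequence
X = np.fft.fft(x)
E_time = np.sum(x**2)                      # energy computed in the time domain
E_freq = np.sum(np.abs(X)**2) / len(x)     # energy computed in the frequency domain
print(E_time, E_freq)                      # the two agree
```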
$$ \textbf{Energy and power signals} $$
A signal that has a finite total energy E has zero average power over all time. This is called an energy signal. Intuitively, energy signals start somewhere and they die out somewhere.
A signal that has a finite, nonzero average power P has infinite total energy over all time. This is called a power signal. Power signals have infinite total energy because, at every moment of time from minus infinity to infinity, they are delivering some power.
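One way to see the two classes numerically is to compute the energy over the window [-T, T] for growing T: for an energy signal E(T) converges, while for a power signal E(T) grows without bound but E(T)/(2T) converges. A rough sketch, with e^(-|t|) and cos(t) as illustrative examples:

```python
import numpy as np

# Energy over [-T, T] for growing T: an energy signal's E(T) converges,
# while a power signal's E(T) grows linearly and E(T)/(2T) converges.
dt = 1e-3

def window_energy(f, T):
    t = np.arange(-T, T, dt)
    return np.sum(f(t)**2) * dt

for name, f in [("e^-|t| (energy signal)", lambda t: np.exp(-np.abs(t))),
                ("cos(t) (power signal)", np.cos)]:
    for T in (10, 100, 1000):
        E = window_energy(f, T)
        print(name, T, E, E / (2 * T))
```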
It is of critical importance to be able to classify signals as power or energy.
$$ \textbf{Importance of energy signals} \tag*{} $$
If a signal is not an energy signal, then its Fourier transform does not exist in the ordinary sense (though, as we will see, it may still exist in terms of impulses). If a signal is an energy signal, that is a good start: its Fourier transform exists in the mean-square sense. An LTI system is BIBO stable exactly when its impulse response is absolutely integrable (absolutely summable in discrete time); being an energy signal, or merely bounded, is not by itself quite enough.
If an LTI system is not BIBO stable, then there exists some bounded input signal for which the output keeps growing without bound.
The Laplace transform exists for many power signals, thanks to its region of convergence, while the Fourier transform is mostly limited to energy signals. Apart from that, being able to classify signals as power or energy signals allows you to understand them intuitively. Systems whose impulse responses are absolutely integrable are stable.
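The stability criterion can be checked numerically. A small sketch, using the standard condition that a discrete-time LTI system is BIBO stable exactly when its impulse response is absolutely summable; the geometric impulse responses here are illustrative assumptions:

```python
import numpy as np

# BIBO stability of a discrete-time LTI system requires an absolutely
# summable impulse response. For h[n] = a^n * u[n] the sum is 1/(1 - a)
# when 0 < a < 1, and diverges when a > 1.
def abs_sum(a, N):
    n = np.arange(N)
    return np.sum(np.abs(float(a)) ** n)   # partial sum of |h[n]| up to N terms

print(abs_sum(0.5, 200))   # converges to 2.0       -> stable system
print(abs_sum(1.1, 200))   # keeps growing with N   -> unstable system
```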
$$ \textbf{Addendum: Sinusoids whose Fourier transforms exist} $$
Your sines and cosines are the classic functions whose Fourier transforms exist, in the sense of impulses (Dirac deltas), despite their not being energy signals.
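For reference, with the transform convention used above, the distributional Fourier transforms of cosine and sine are the following standard impulse pairs:
$$ \mathcal{F}\{\cos(\omega_{0} t)\} = \pi \left[ \delta(\omega - \omega_{0}) + \delta(\omega + \omega_{0}) \right] \tag*{} $$
$$ \mathcal{F}\{\sin(\omega_{0} t)\} = j\pi \left[ \delta(\omega + \omega_{0}) - \delta(\omega - \omega_{0}) \right] \tag*{} $$
These are not ordinary functions but impulses, which is how the sinusoids escape the finite-energy requirement.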
$$ \textbf{Intuition of putting power or energy signals into systems} $$
Well, let's say you have some physical system: an electric circuit, specifically an RC circuit. The resistor and capacitor have voltage and power ratings. If you apply a power signal as the input and never turn it off, the resistor and capacitor may eventually reach voltages exceeding their safe operating limits.
In another system, say one modelling a town's financial response, suppose you insert a spender whose spending signal is an energy signal: if the town's system is itself unstable, even this finite-energy person or entity will blow up the town's financial response. Sorry for these crude analogies, but there is an intuition behind classifying signals the way we do.
$$ \textbf{Summary} $$
The power of a signal derives from a reference resistance of one ohm. For many signals, the power and energy have no physical significance, and that is still fine, since the concepts of power and energy have been generalised to all signals that return numbers. (I do not know how to define power for functions that return objects other than numbers.)
It is important to be able to tell whether a signal is an energy signal or a power signal. A power signal has infinite energy, i.e., it is not dying out; an energy signal does die out. I have glossed over some issues and taken some liberties, but the most important takeaway is that, despite deriving from the definitions of power and energy in physics, power and energy content exist for every single-variable function whose domain and codomain are the complex numbers or a subset of them.