When a point source emits sound, the sound travels away from the source as a series of wavefronts, each a spherical shell centered on the source. If we neglect damping forces in the medium, energy is conserved, so the power delivered by the source must equal the power passing through any single wavefront, at whatever distance from the source. That power equals the intensity at the wavefront times the area of the wavefront, and the intensity of a sound wave is proportional to the square of its pressure amplitude. Since the area of a spherical wavefront grows as 4πr², the intensity must fall as 1/r², and the pressure amplitude as 1/r.

So, as a sound wave travels to a greater distance from the source, its pressure amplitude should decrease: at that distance it is part of a wavefront whose sound intensity is less than that of a wavefront closer to the source.

But we also consider energy to be conserved for a single wave. That seems impossible if we follow the above logic, which says the pressure amplitude of a wave must decrease as it moves away from the source. How can the amplitude of a single wave fall without its energy going somewhere? This apparent paradox is going to make me mad... someone help, please!
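To make the first part of the argument concrete, here is a minimal numeric sketch of the relation I described: pressure amplitude falling as 1/r while the total power through every spherical wavefront stays the same. The reference amplitude (2 Pa at 1 m) is just an assumed illustrative value, and I've used standard room-air figures for the density ρ and sound speed c in the plane-wave intensity formula I = p²/(2ρc).

```python
import math

def pressure_amplitude(p_ref, r_ref, r):
    # For a lossless point source, p falls as 1/r
    # (because I is proportional to p^2 and I = P / (4*pi*r^2)).
    return p_ref * r_ref / r

def power_through_wavefront(p, r, rho=1.21, c=343.0):
    # Intensity of a (locally plane) sound wave: I = p^2 / (2*rho*c),
    # with rho, c taken as typical values for air at room temperature.
    intensity = p**2 / (2 * rho * c)
    # Power = intensity times the area of the spherical wavefront.
    return intensity * 4 * math.pi * r**2

p1 = 2.0   # assumed pressure amplitude (Pa) at r = 1 m, for illustration
P1 = power_through_wavefront(p1, 1.0)
for r in (1.0, 2.0, 5.0, 10.0):
    p = pressure_amplitude(p1, 1.0, r)
    P = power_through_wavefront(p, r)
    # P comes out the same at every radius, even though p shrinks as 1/r:
    # the same total power is just spread over a larger and larger sphere.
    print(f"r = {r:5.1f} m   p = {p:6.3f} Pa   P = {P:.6f} W")
```

Running this shows the amplitude dropping at each radius while the power column stays constant, which is exactly the tension I can't resolve: the wave's amplitude decreases, yet no energy is supposed to be lost.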