Time-averages of superposition of waves.

Homework Statement

Consider the superposition of two waves:

$\zeta_1 + \zeta_2 = \zeta_{01} e^{i(kr_1 - \omega t)} + \zeta_{02} e^{i(kr_2 - \omega t + \phi)}$

where $\phi$ is a phase difference that varies randomly with time. Show that the time-averages satisfy:

$\langle|\zeta_1 + \zeta_2|^2\rangle = \langle|\zeta_1|^2\rangle + \langle|\zeta_2|^2\rangle$

Homework Equations

(1) In case it wasn't clear, the two waves are:

$\zeta_1 = \zeta_{01} e^{i(kr_1 - \omega t)}$ and
$\zeta_2 = \zeta_{02} e^{i(kr_2 - \omega t + \phi)}$

The Attempt at a Solution

Unless I have my definition of the time-average wrong, I can't seem to get this to work.

$|\zeta_1 + \zeta_2|^2 = (\zeta_1 + \zeta_2)(\zeta_1^* + \zeta_2^*) = |\zeta_1|^2 + |\zeta_2|^2 + \zeta_1\zeta_2^* + \zeta_2\zeta_1^* = |\zeta_1|^2 + |\zeta_2|^2 + 2\zeta_{01}\zeta_{02}\cos(k(r_1 - r_2) - \phi)$
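As a quick sanity check on that expansion, here is a minimal numerical sketch (the amplitude, wavenumber, position, and phase values are arbitrary assumptions, chosen only to test the algebra; the amplitudes are taken to be real):

```python
import cmath
import math

# Arbitrary sample values (assumptions, just to verify the identity numerically)
z01, z02 = 1.3, 0.7        # real amplitudes
k, r1, r2 = 2.0, 0.4, 1.1  # wavenumber and positions
w, t, phi = 5.0, 0.3, 0.9  # angular frequency, time, phase difference

z1 = z01 * cmath.exp(1j * (k * r1 - w * t))
z2 = z02 * cmath.exp(1j * (k * r2 - w * t + phi))

# Left-hand side: |z1 + z2|^2 computed directly
lhs = abs(z1 + z2) ** 2
# Right-hand side: the expanded form with the cosine cross term
rhs = z01**2 + z02**2 + 2 * z01 * z02 * math.cos(k * (r1 - r2) - phi)

print(abs(lhs - rhs))  # should be ~0 (up to floating-point error)
```

The two sides agree to machine precision, so the expansion itself is fine; the question is what happens to the cross term under the time-average.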

Then, I believe, the time average is given by:

$\frac{1}{T}\int^T_0 \left[\,|\zeta_1|^2 + |\zeta_2|^2 + 2\zeta_{01}\zeta_{02}\cos(k(r_1 - r_2) - \phi)\,\right] dt$

However, I don't see how this turns into the form I want. It would require the last term (the one containing the cosine) to time-average to zero. Can this be the case? Also, can $\phi$ even be treated as a function of time when it varies randomly?
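One way to convince yourself the cosine term does average to zero: if the random phase $\phi$ is uniformly distributed over $[0, 2\pi)$ (an assumption, but the usual one for incoherent sources), then $\langle\cos(\delta - \phi)\rangle = 0$ for any fixed $\delta = k(r_1 - r_2)$. A Monte Carlo sketch of that average:

```python
import math
import random

random.seed(0)

delta = 1.234  # fixed path-difference phase k(r1 - r2); value is arbitrary
N = 1_000_000  # number of random phase samples

# Average cos(delta - phi) over phi drawn uniformly from [0, 2*pi)
avg = sum(math.cos(delta - random.uniform(0.0, 2 * math.pi))
          for _ in range(N)) / N

print(avg)  # close to 0
```

The sample mean shrinks toward zero as $N$ grows (roughly like $1/\sqrt{N}$), mirroring how the cross term washes out of the time-average while $\langle|\zeta_1|^2\rangle$ and $\langle|\zeta_2|^2\rangle$ survive.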