Problem applying divergence theorem to wave equation

1. Dec 22, 2011

diligence

I'm an undergrad doing research in PDE and my adviser gave me some material to read over the holiday. But I'm getting stuck at the beginning where the divergence theorem is applied to a calculation. Maybe somebody can help me?

Without getting too detailed about the context of the problem (so we don't get bogged down in material irrelevant to my question), let u(x,t) be a solution to the wave equation, and suppose it's already been shown that the integrand below is a divergence, so that the integral can be rewritten as a surface integral:

$\int\!\!\int (u_{tt} - \Delta_x u)\left[(r^2 + t^2)u_t + 2t(ru)_r\right]\,dx\,dt = \int_{\partial} (\mathbf{p}\cdot\mathbf{n} + q\,n_t)\,dS = 0$

where ∂ is a 3-dimensional surface with surface element dS, n is the spatial part of the outward normal, and n_t is its time component.

My question is: how do I find p and q? I know I have to integrate by parts on the left-hand side a gajillion times until the integrand is written as a spatial divergence plus a time derivative, but I'm confused as to whether I actually evaluate any of the integrals or not. Is the following the right form to be aiming for?

$\int\!\!\int (u_{tt} - \Delta_x u)\left[(r^2 + t^2)u_t + 2t(ru)_r\right]\,dx\,dt = \int\!\!\int (\nabla \cdot \mathbf{p} + q_t)\,dx\,dt$
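As a sanity check on what I think I'm aiming for: with the simpler multiplier $u_t$ (instead of the complicated one above) I can do the manipulation by hand, using the product rule rather than any actual integration:

$(u_{tt} - \Delta_x u)\,u_t = \partial_t\!\left(\tfrac{1}{2}u_t^2\right) - \nabla\cdot(u_t \nabla u) + \nabla u_t \cdot \nabla u = \partial_t\!\left(\tfrac{1}{2}u_t^2 + \tfrac{1}{2}|\nabla u|^2\right) - \nabla\cdot(u_t \nabla u)$

so in that case I'd read off $q = \tfrac{1}{2}(u_t^2 + |\nabla u|^2)$ and $\mathbf{p} = -u_t \nabla u$. Is the idea just to repeat this kind of bookkeeping with the full multiplier $(r^2 + t^2)u_t + 2t(ru)_r$, without ever evaluating an integral?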

By the way, this is from Appendix 3 of the book Scattering Theory by Lax, if anybody happens to know this stuff...