Gradient and Divergence Identities

bugatti79
Homework Statement



I need to show that ##\displaystyle\int_\Omega (\nabla G)w \, dx\, dy=-\int_\Omega (\nabla w) G \, dx\, dy+\oint_\Gamma \hat{n} w G \, ds## given

##\displaystyle \int_\Omega \nabla F \, dx\, dy=\oint_\Gamma \hat{n} F \, ds## where ##\Omega## and ##\Gamma## are the domain and boundary respectively. F, G, and w are scalar functions... any ideas?

I attempted to expand the LHS, but I didn't feel it was leading anywhere...

Homework Equations


The Attempt at a Solution



##\displaystyle \int_\Omega \left(\hat{e}_x\frac{\partial G}{\partial x}+\hat{e}_y\frac{\partial G}{\partial y}\right)w \, dx\, dy##...

NOTE: I posted this query on MHF 3 days ago and nobody has answered. Here is the link in case somebody has replied there. Thanks: http://mathhelpforum.com/calculus/200911-gradient-divergent-identities.html
 
HallsofIvy
Integration by parts, taking ##u = w##, ##dv = \nabla G \, dx\, dy##.
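
For reference, the one-dimensional version of this hint is the standard integration-by-parts formula:

##\displaystyle \int_a^b u \, dv = \big[uv\big]_a^b - \int_a^b v \, du##

In the identity to be shown, the boundary integral ##\displaystyle \oint_\Gamma \hat{n}\, w G \, ds## plays the role of the evaluated term ##\big[uv\big]_a^b##, with the given identity standing in for the fundamental theorem of calculus.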
 
HallsofIvy said:
Integration by parts, taking ##u = w##, ##dv = \nabla G \, dx\, dy##.

If we let ##u=w##, then ##du=dw=\nabla w##?

##\displaystyle dv=\nabla G dxdy## then

##\displaystyle v=\int_\Omega \nabla G \, dx\, dy=\int_\Gamma (\hat{n}_x \hat{e}_x+ \hat{n}_y \hat{e}_y)G\, ds##

Thus

##\displaystyle \int_\Omega(\nabla G)w\, dx\, dy= \int_\Gamma (\hat{n}_x \hat{e}_x+ \hat{n}_y \hat{e}_y)G w \, ds- \int \int_\Gamma (\hat{n}_x \hat{e}_x+ \hat{n}_y \hat{e}_y)G \,\nabla w \, ds##

Clearly I have gone wrong somewhere...?
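
A minimal sketch of where this seems to go astray, assuming only the product rule ##\nabla(Gw)=(\nabla G)w+G\,\nabla w##: rather than integrating ##dv## over ##\Omega## first, apply the given identity with ##F=Gw##:

##\displaystyle \int_\Omega \nabla(Gw)\, dx\, dy=\oint_\Gamma \hat{n}\, G w \, ds##

Expanding the left side with the product rule,

##\displaystyle \int_\Omega (\nabla G)w \, dx\, dy+\int_\Omega G\,\nabla w \, dx\, dy=\oint_\Gamma \hat{n}\, w G \, ds##

and moving the second integral to the right-hand side gives the required identity. Taking ##v## to be the definite integral ##\displaystyle \int_\Omega \nabla G \, dx\, dy## (a fixed quantity rather than a function of position) is where the attempt above appears to break down.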
 