Consider a curve in R^2 given in polar coordinates by r = r(θ) for θ1 ≤ θ ≤ θ2. Show that the line integral of f along the curve equals the integral from θ1 to θ2 of f(r cos θ, r sin θ) sqrt(r^2 + (dr/dθ)^2) dθ.
x = r cos θ, y = r sin θ
The Attempt at a Solution
I understand that the curve does not have to be a straight line. In polar coordinates we can take x = r cos θ and y = r sin θ as a parametrization of the curve, i.e. g(θ) = (r(θ) cos θ, r(θ) sin θ). The definition of the line integral says it equals the integral over [θ1, θ2] of f(g(θ)) multiplied by the magnitude of the derivative, |g'(θ)|, and the square-root term should be that magnitude. My question is: where does the dr/dθ come from? What does it mean graphically?
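If it helps to see the formula in action, it can be checked numerically. Here is a minimal sketch (the helper name `line_integral_polar` and the test curve are my own, not from the problem): it approximates the stated integral with a midpoint Riemann sum, and on the circle r(θ) = 2 with f ≡ 1 (so dr/dθ = 0) the result should be the circumference 4π.

```python
import math

def line_integral_polar(f, r, dr, theta1, theta2, n=100_000):
    """Midpoint-rule approximation of
    integral_{theta1}^{theta2} f(r cos t, r sin t) * sqrt(r^2 + (dr/dt)^2) dt,
    where r and dr are r(theta) and dr/dtheta as Python callables."""
    h = (theta2 - theta1) / n
    total = 0.0
    for i in range(n):
        t = theta1 + (i + 0.5) * h  # midpoint of the i-th subinterval
        rt, drt = r(t), dr(t)
        total += f(rt * math.cos(t), rt * math.sin(t)) * math.sqrt(rt**2 + drt**2) * h
    return total

# Circle of radius 2, f = 1: the line integral is the arc length 4*pi.
arc = line_integral_polar(lambda x, y: 1.0, lambda t: 2.0, lambda t: 0.0,
                          0.0, 2 * math.pi)
print(arc)  # ≈ 12.566, i.e. 4*pi
```

Trying a curve where dr/dθ ≠ 0 (e.g. the spiral r = θ) and dropping the (dr/dθ)^2 term makes the answer come out wrong, which is one way to see that the term is really needed.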