1. The problem statement, all variables and given/known data

A long, straight power line is made from a wire with radius ra = 1.0 cm and carries a line charge density λ = 2.6 μC/m. Assuming there are no other charges present, calculate the potential difference between the surface of the wire and the ground, a distance rb = 22 m below.

2. Relevant equations

ΔV = -∫E⋅ds

E due to an infinite line of charge: E = 2kλ/r

3. The attempt at a solution

What I did was ΔV = -2kλ∫(1/r) dr, with limits 0.01 m to 22 m, so ΔV = -2kλ ln(22/0.01). I feel like this might be wrong because I am only taking into account the point directly below the wire, aren't I?
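A quick numerical check of the integral as set up (this only evaluates the expression above; it assumes the infinite-line field E = 2kλ/r is valid all the way from the wire surface to the ground, and uses k ≈ 8.99 × 10⁹ N·m²/C²):

```python
import math

k = 8.99e9    # Coulomb constant, N*m^2/C^2 (assumed value)
lam = 2.6e-6  # line charge density, C/m
ra = 0.01     # wire radius, m
rb = 22.0     # distance from wire to ground, m

# Integrating E = 2*k*lam/r from ra to rb gives
# V(surface) - V(ground) = 2*k*lam*ln(rb/ra);
# the minus sign in dV = -int E ds just reflects which point
# is taken as the reference.
dV = 2 * k * lam * math.log(rb / ra)
print(f"{dV:.3g} V")
```

This comes out to roughly a few hundred kilovolts, which is the magnitude of the potential difference the integral predicts; the radial coordinate r in the line-charge field is measured from the wire's axis, so choosing the path straight down is just one convenient radial path, not a restriction to "the bottom of the wire."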