Ever since seeing the first definition of electric potential difference, ΔV = ΔU/q, I have had a somewhat harder time understanding it, or rather, using it. While I have begun to understand it better now, I am still quite confused about what it means in a circuit with an emf device.

Take, for example, a simple single-loop circuit with one battery of potential difference V and one resistor of resistance R. Assume the wires connecting the circuit have negligible resistance. Why is the electric potential constant along any one side of the circuit, i.e., why is its difference zero there? In other words, why must the potential difference be taken across the battery's terminals or across the resistor to obtain any nonzero value?

The reason this doesn't make sense to me is that if ΔV = ΔU/q = −∫E·ds, and there is an electric field set up in the wire causing the current to flow, then as each electron flows it surely does lose potential, even where there is no resistance, because there is both an electric field E and a displacement ds.

I find this easier to grasp when thinking of a similar circuit with a capacitor instead. There it makes sense to me: once the capacitor is charged, there is no longer any flow, and the build-up of charge on the plates of the capacitor creates an electric field over the distance of their separation, which is indeed ΔV. But when there is current flow, it just confuses me.
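To make my confusion concrete, here is a small numeric sketch of the circuit I have in mind (the values V = 9 V and R = 3 Ω are made up by me). It just walks once around the loop and sums the potential changes over each element, treating the wires as zero-resistance segments:

```python
# Hypothetical single-loop circuit: one battery, one resistor,
# two ideal (zero-resistance) wire segments connecting them.
V = 9.0        # battery potential difference in volts (assumed value)
R = 3.0        # resistance in ohms (assumed value)
R_wire = 0.0   # ideal wire: negligible resistance

# Current in the loop from Ohm's law over the total resistance.
I = V / (R + 2 * R_wire)

# Potential change across each element, traversing the loop once
# in the direction of the current (from - to + inside the battery).
changes = {
    "battery (- to +)": +V,
    "wire (top)": -I * R_wire,      # zero, since R_wire = 0
    "resistor": -I * R,
    "wire (bottom)": -I * R_wire,   # zero, since R_wire = 0
}

for element, dV in changes.items():
    print(f"{element}: {dV:+.1f} V")

total = sum(changes.values())
print(f"sum around loop = {total:.1f} V")
```

The sum around the loop comes out to zero, with the entire drop appearing across the resistor and none along the wires, and that is exactly the part I can write down but cannot reconcile with ΔV = −∫E·ds for a wire that carries current.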