1. The problem statement, all variables and given/known data

Every second at Niagara Falls, some 5.0 × 10^3 m³ of water falls a distance of 50.0 m. What is the increase in entropy per second due to the falling water? Assume that the mass of the surroundings is so great that its temperature and that of the water stay nearly constant at 20.0°C. Also assume a negligible amount of water evaporates.

2. Relevant equations

ΔS = ∫ dQ/T

3. The attempt at a solution

Well, if I divide through by Δt, I have an equation set up for the quantity I need. The temperature is constant, but since there is no real change in temperature or phase, is there any real change in heat? I'm not sure where to go from here.

EDIT: Never mind, I figured it out. Potential energy converts to kinetic energy and then to heat, which flows into the surroundings at constant temperature, so ΔS = Q/T.
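For anyone following along, the arithmetic behind that final insight can be sketched in a few lines of Python. Note the density of water (1000 kg/m³) and g = 9.80 m/s² are assumed standard values, not stated in the problem:

```python
# Entropy produced per second by the falling water.
# Idea from the edit above: PE -> KE -> heat Q = mgh, dumped into
# surroundings at constant T, so the entropy rate is (Q per second) / T.

volume_per_s = 5.0e3      # m^3 of water per second (given)
height = 50.0             # m fallen (given)
T = 20.0 + 273.15         # K, constant temperature (given as 20.0 °C)

rho = 1000.0              # kg/m^3, density of water (assumed)
g = 9.80                  # m/s^2, gravitational acceleration (assumed)

mass_per_s = rho * volume_per_s          # kg of water per second
heat_per_s = mass_per_s * g * height     # J/s dissipated as heat
entropy_rate = heat_per_s / T            # J/(K·s)

print(f"dS/dt ≈ {entropy_rate:.2e} J/(K·s)")
```

With these numbers the rate comes out on the order of 8 × 10^6 J/(K·s), i.e. a few million joules per kelvin every second.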