Thank you in advance...

1. The problem statement, all variables and given/known data

Suppose a power plant delivers energy at 918 MW using steam turbines. The steam enters the turbines superheated at 626 K and deposits its unused heat in river water at 286 K. Assume that the turbine operates as an ideal Carnot engine. If the river flow rate is 40.4 m^3/s, calculate the average temperature increase (in Celsius degrees) of the river water downstream from the power plant. What is the entropy increase per kilogram of the downstream river water?

2. Relevant equations

Q_H/T_H = Q_L/T_L
delta(Q) = m*c*delta(T)
delta(S) = m*c*ln(T_F/T_I)

3. The attempt at a solution

(918 MW)/(626 K) = Q_L/(286 K), so Q_L = 419.4 MW

419,400,000 J/s = (40.4 m^3/s)*(10^6 g/m^3)*(4.186 J/(g*K))*delta(T), so delta(T) = 2.5 Celsius degrees

right?
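A quick numerical sketch of the same steps, plus the entropy part of the question. It assumes, as the attempt does, that the 918 MW is the rate of heat drawn from the steam (Q_H per second); for the per-kilogram entropy change it applies delta(S) = c*ln(T_F/T_I) to one kilogram of river water warming from 286 K by the computed delta(T):

```python
import math

# Given data from the problem statement
P_H = 918e6   # W; assumed here to be the heat input rate Q_H/s, as in the attempt
T_H = 626.0   # steam temperature, K
T_L = 286.0   # river temperature, K
flow = 40.4   # river flow rate, m^3/s
rho = 1000.0  # density of water, kg/m^3
c = 4186.0    # specific heat of water, J/(kg*K)

# Ideal Carnot engine: Q_H/T_H = Q_L/T_L, so the rejected heat rate is
P_L = P_H * T_L / T_H  # W

# Heat balance on the river: P_L = (mass flow rate)*c*delta(T)
m_dot = flow * rho            # kg/s
dT = P_L / (m_dot * c)        # K (same size as Celsius degrees)

# Entropy increase per kilogram of downstream water:
# delta(S)/m = c*ln(T_F/T_I), with T_F = T_L + dT
T_F = T_L + dT
dS_per_kg = c * math.log(T_F / T_L)  # J/(kg*K)

print(f"delta(T) = {dT:.2f} C degrees")
print(f"delta(S) per kg = {dS_per_kg:.1f} J/(kg*K)")
```

With these numbers it reproduces delta(T) of about 2.5 Celsius degrees and gives roughly 36 J/(kg*K) for the entropy increase per kilogram. One thing worth double-checking against your textbook's wording: if "delivers energy at 918 MW" means the work output W rather than the heat input Q_H, then Q_L = W*T_L/(T_H - T_L) instead, and delta(T) comes out larger.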