Suppose a power plant delivers energy at 980 MW using steam turbines. The steam enters the turbine superheated at 625 K and deposits its unused heat in river water at 285 K. Assume the turbine operates as an ideal Carnot engine. If the river flow rate is 37 m^3/s, estimate the average temperature increase of the river water immediately downstream from the power plant. What is the entropy increase per kilogram of the downstream river water, in J/(kg·K)?

What I've got so far — is this right?

e_ideal = (Th - Tl) / Th = (625 - 285) / 625 = 0.544 (I don't know whether this has any relevance to the question being asked.)

Density of water: 1000 kg/m^3, so the mass flow is 37 m^3/s × 1000 kg/m^3 = 37,000 kg/s.

980 MW = 980 MJ/s = 980,000 kJ/s

980,000 kJ/s = flow (kg/s) × specific heat (kJ/(kg·°C)) × ΔT (°C)

ΔT = 980,000 / (37,000 × 4.186) = 6.327 °C?
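As a sanity check on the arithmetic, here is a short sketch. It assumes (as one reading of the problem) that the 980 MW is the delivered *work* output, so the river absorbs only the rejected heat Ql = Qh − W with Qh = W / e; the variable names and that interpretation are mine, not from the problem statement:

```python
import math

c = 4186.0               # specific heat of water, J/(kg*K)
rho = 1000.0             # density of water, kg/m^3
Th, Tl = 625.0, 285.0    # steam and river temperatures, K
W = 980e6                # delivered power, W (assumed to be work output)
mdot = 37.0 * rho        # river mass flow, kg/s

e = (Th - Tl) / Th       # Carnot efficiency, ~0.544

# Heat drawn from the steam and heat rejected to the river
Qh = W / e               # ~1.80e9 W
Ql = Qh - W              # ~8.21e8 W dumped into the river

# Temperature rise of the river from the rejected heat
dT = Ql / (mdot * c)     # ~5.3 K

# Entropy gained per kg of river water warmed from Tl to Tl + dT
dS = c * math.log((Tl + dT) / Tl)  # ~77 J/(kg*K)

print(e, Ql, dT, dS)
```

Under this interpretation the 0.544 efficiency is indeed relevant: using the full 980 MW as heat into the river (as in the ΔT = 6.327 °C step above) would be correct only if 980 MW were the rejected heat rather than the delivered power.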