Calculating Entropy Change in a Thermal Conduction System

  • Thread starter bpaterni
  • Start date
  • Tags
    Entropy
In summary, the problem asks for the total change in entropy from the conduction of 1096 J of heat through a wire with one end at 752 K and the other at 345 K. Applying dS = dQ / T to each reservoir gives an answer of 1.72 J/K.
  • #1
bpaterni

Homework Statement


Each end of a metal wire is in thermal contact with a different heat reservoir.
Reservoir 1 is at a temperature of 752 K, and reservoir 2 is at a temperature of
345 K. Compute the total change in entropy that occurs from the conduction of
1096 J of heat through the wire.


Homework Equations


dS = dQ / T


The Attempt at a Solution


I'm really unsure how to solve this question. Would I do something like this:

ΔS = -1096 J / 752 K + 1096 J / 345 K
   = -1.457 J/K + 3.177 J/K = 1.72 J/K

Or am I totally off base on this one?
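The arithmetic above can be checked with a minimal Python sketch (variable names are mine, not from the problem statement). Since each reservoir stays at a fixed temperature, ΔS = Q/T for each one, with the hot reservoir losing heat and the cold one gaining it:

```python
# Total entropy change for heat Q conducted from a hot to a cold reservoir.
# Reservoir temperatures are constant, so dS = dQ/T reduces to Q/T each.

Q = 1096.0       # heat conducted, J
T_hot = 752.0    # hot reservoir temperature, K
T_cold = 345.0   # cold reservoir temperature, K

dS_hot = -Q / T_hot    # hot reservoir loses heat: entropy decreases
dS_cold = Q / T_cold   # cold reservoir gains heat: entropy increases
dS_total = dS_hot + dS_cold

print(round(dS_total, 2))  # 1.72 (J/K)
```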
 
  • #2
bpaterni said:

The Attempt at a Solution


I'm really unsure how to solve this question. Would I do something like this:

-1096/752 + 1096/345 =
-1.457 + 3.177 = 1.72 J/K
I am not sure why you are unsure. You have the right equation and you have the right answer.

AM
 
  • #3
Wow, okay... Thanks for the reassurance then! :)

The question is worth 6 of the 25 points on the homework assignment, so I thought there might be more to it than that, but apparently not.
 

1. What is entropy?

Entropy is a scientific concept that refers to the measure of disorder or randomness in a system. It is often used in thermodynamics and information theory to describe the amount of energy or information that is unavailable for work or organization.

2. How is entropy related to the Second Law of Thermodynamics?

The Second Law of Thermodynamics states that the total entropy of an isolated system never decreases over time. This means that disorder and randomness in such a system naturally tend to increase, and organized structures eventually break down. Entropy is the measure of this change in disorder.

3. Can entropy be reversed?

In an isolated system, the total entropy never decreases over time. However, the entropy of one part of the system can be decreased by expending energy, at the cost of a larger entropy increase elsewhere. Living organisms do exactly this: they maintain a low-entropy state by continually taking in energy and using it to maintain order.

4. How is entropy used in information theory?

In information theory, entropy is used to measure the uncertainty or randomness in a set of data. The higher the entropy, the less predictable the data is. This is often used in fields such as cryptography and data compression.
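As a concrete illustration of the information-theoretic definition (a minimal sketch; the function name is mine), Shannon entropy is H = -Σ p·log₂(p) over the outcome probabilities. A fair coin, being maximally unpredictable for two outcomes, has 1 bit of entropy, while a biased coin has less:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero terms."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # 1.0 bit  (fair coin, least predictable)
print(shannon_entropy([0.9, 0.1]))  # ~0.47 bits (biased coin, more predictable)
```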

5. Can entropy be calculated?

Yes, entropy can be calculated using mathematical formulas and equations. In thermodynamics, it is calculated using the change in temperature and the heat transfer within a system. In information theory, it is calculated using the probability of different outcomes in a data set.
