### Definition of Entropy

Posted: Sun Jan 28, 2018 7:47 pm
Since we should not use the concept of disorder when defining/explaining entropy, what would be the correct definition? How would you explain entropy?

### Re: Definition of Entropy

Posted: Sun Jan 28, 2018 7:52 pm
We could define entropy as a property of a particular system that describes the likelihood/probability that the system will be in a particular state.

### Re: Definition of Entropy

Posted: Sun Jan 28, 2018 8:23 pm
The second law of thermodynamics states that an isolated system's entropy never decreases!

### Re: Definition of Entropy

Posted: Sun Jan 28, 2018 11:05 pm
The entropy of a system can be defined as the thermodynamic quantity that demonstrates how a system's thermal energy cannot be converted into mechanical work. It is usually interpreted as the degree of disorder or randomness in a system.

### Re: Definition of Entropy

Posted: Sun Jan 28, 2018 11:18 pm
Entropy can be thought of as the number of possible states that a system can occupy. This idea is closely related to disorder, because the more disorder there is, the more places things can be.

### Re: Definition of Entropy

Posted: Mon Jan 29, 2018 2:49 pm
Kaileigh Yang 2I wrote:Entropy of a system can be defined as being the thermodynamic quantity that demonstrates how a system's thermal energy cannot be converted into mechanical work. Usually it is interpreted as being the degree of disorder or randomness in a system.

Could you elaborate more on this? I think I understand what you're saying, but I just need clarification on what is meant by "how a system's thermal energy cannot be converted into mechanical work." Thanks!

### Re: Definition of Entropy

Posted: Mon Jan 29, 2018 3:33 pm
I have a question about how to determine whether entropy is increasing or decreasing in a chemical reaction. The solutions manual says an increase in moles on the product side indicates an increase in entropy. Can someone explain this?

### Re: Definition of Entropy

Posted: Tue Jan 30, 2018 11:28 pm
Another way to think about entropy is to consider the number of possible positions and motions (microstates) a molecule can take on in a given system. As entropy increases, so does the number of positions and motions available to the molecule.

For example: in the expansion of a gas, the molecules of gas are given more space and are thus less constrained and able to move about more freely. They have more positions available to them and move about at a larger range of speeds, so they have more entropy.
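As a numerical sketch of that example (assuming an ideal gas undergoing reversible isothermal expansion, where the standard result is ΔS = nR ln(V₂/V₁); the function name here is just illustrative):

```python
import math

R = 8.314  # gas constant, J/(mol·K)

def delta_S_expansion(n_moles, V_initial, V_final):
    """Entropy change for reversible isothermal expansion of an ideal gas."""
    return n_moles * R * math.log(V_final / V_initial)

# Doubling the volume of 1 mol of gas gives a positive entropy change,
# consistent with the molecules having more positions available:
dS = delta_S_expansion(1.0, 1.0, 2.0)
print(round(dS, 2))  # about 5.76 J/K
```

Note the sign: expanding (V₂ > V₁) gives ΔS > 0, while compressing would give ΔS < 0, matching the "fewer positions available" intuition.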

### Re: Definition of Entropy

Posted: Wed Jan 31, 2018 10:32 am
To add some foundation: entropy is an extensive property (it depends on the number of particles we are calculating for), so when the number of particles N increases, the degeneracy increases as well (for a system where each particle has two available states, W = 2^N). Thus, based on the entropy equation S = k_B ln W, as W increases, so does S.
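A quick sketch of why S = k_B ln W makes entropy extensive (assuming the two-states-per-particle case above, so W = 2^N and S = N·k_B·ln 2):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(W):
    """S = k_B ln W."""
    return k_B * math.log(W)

# Each particle has 2 available states, so W = 2**N and S = N * k_B * ln 2.
# Doubling the particle count doubles the entropy (extensive property):
S_10 = boltzmann_entropy(2**10)  # N = 10 particles
S_20 = boltzmann_entropy(2**20)  # N = 20 particles
print(S_20 / S_10)  # 2.0
```

The logarithm is what turns the multiplicative growth of W (states multiply when you combine systems) into additive growth of S.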

### Re: Definition of Entropy

Posted: Wed Jan 31, 2018 11:05 am
I think we should think about entropy as corresponding to the number of possible states that could exist, which comes from the equation S = k_B ln W. As volume increases, the number of possible states a gas could be in increases, and so does the entropy of the system.

### Re: Definition of Entropy

Posted: Wed Jan 31, 2018 11:33 am
The more moles of gas there are in the system, the greater the entropy, because W = (number of states available to each molecule)^(number of molecules), and when you increase the number of moles in a system you also increase the number of molecules.
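That relationship can be sketched numerically (a minimal illustration, assuming some fixed number of states per molecule; computing ln W directly as N·ln(states) avoids overflowing on huge W):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy(states_per_molecule, n_molecules):
    """S = k_B ln W, where W = states_per_molecule ** n_molecules.

    Uses the log identity ln(s**N) = N * ln(s) so huge W never has to
    be computed explicitly.
    """
    return k_B * n_molecules * math.log(states_per_molecule)

# With states per molecule held fixed, more molecules means larger W
# and therefore larger S:
print(entropy(4, 100) < entropy(4, 200))  # True
```

Because S grows linearly with the molecule count (at fixed states per molecule), adding moles of gas on the product side of a reaction raises the entropy, which is the pattern the solutions manual is pointing at.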