Entropy

Entropy (S) is a measure of the part of a system's energy that cannot be used to do work. In a wider sense, it can also be interpreted as a measure of the disorder of a system.

Entropy is defined as

S = k ln Ω

where k is Boltzmann's constant and Ω is the number of possible states that the system may be in. This definition of entropy is similar to the definition of entropy in information theory and therefore reveals deep connections between information theory and thermodynamics.
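
For illustration, here is a minimal Python sketch of the formula above; the helper name boltzmann_entropy and the toy system of ten two-state particles are assumptions for the example, not part of the definition:

 import math

 K_B = 1.380649e-23  # Boltzmann's constant in joules per kelvin

 def boltzmann_entropy(omega):
     # S = k ln(Omega), where omega is the number of possible states of the system
     return K_B * math.log(omega)

 # Toy example (assumed): 10 independent two-state particles have 2**10 possible states
 print(boltzmann_entropy(2 ** 10))  # about 9.57e-23 J/K

With k replaced by 1/ln 2, the same expression gives log2 Ω, the Shannon entropy in bits of a uniform distribution over Ω equally likely outcomes; this is one way to see the connection to information theory mentioned above.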

