History of Entropy

HomePage | Recent Changes | Preferences

Revision 5 . . (edit) December 18, 2001 4:53 pm by (logged).132.139.xxx
Revision 4 . . (edit) November 21, 2001 3:32 am by Chenyu
Revision 3 . . November 20, 2001 2:01 pm by Chenyu
Revision 2 . . November 20, 2001 2:01 pm by Chenyu
  

Difference (from prior major revision) (minor diff, author diff)

Changed: 1c1
Entropy (S): A measure of the part of a system's energy that cannot be used to do work. In a wider sense, it can also be interpreted as a measure of the disorder of a system.
Entropy (S) is a measure of the part of a system's energy that cannot be used to do work. In a wider sense, it can also be interpreted as a measure of the disorder of a system.

Changed: 5c5
S = k ln &omega
S = k ln Ω

Changed: 7c7
where k is Boltzmann's constant and &omega is the number of possible states that a system may be in. This definition of entropy is similar to the definition of entropy in information theory and therefore reveals deep connections between information theory and thermodynamics.
where k is Boltzmann's constant and Ω is the number of possible states that a system may be in. This definition of entropy is similar to the definition of entropy in information theory and therefore reveals deep connections between information theory and thermodynamics.
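The statistical definition above can be evaluated directly. The following is a minimal sketch (the function name and the example state counts are illustrative, not from the article), using the CODATA value of Boltzmann's constant:

```python
import math

# Boltzmann's constant in joules per kelvin (CODATA value)
k_B = 1.380649e-23

def boltzmann_entropy(omega: int) -> float:
    """Return S = k ln(Omega) for a system with omega accessible microstates."""
    if omega < 1:
        raise ValueError("number of microstates must be at least 1")
    return k_B * math.log(omega)

# A system with exactly one accessible microstate has zero entropy.
print(boltzmann_entropy(1))       # 0.0
# Entropy grows only logarithmically with the number of microstates.
print(boltzmann_entropy(10**6))
```

Note that the logarithm is what makes entropy additive: combining two independent systems multiplies their state counts, so their entropies add.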
