Entropy

Entropy (S): A measure of the portion of a system's energy that is unavailable for doing work. In a wider sense, it can also be interpreted as a measure of the disorder of a system.

Entropy is defined as

S = k ln Ω

where k is Boltzmann's constant and Ω is the number of possible microstates the system may be in. This definition of entropy parallels the definition of entropy in information theory and thereby reveals deep connections between information theory and thermodynamics.
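As a minimal sketch (not part of the original page), the formula can be evaluated directly. The coin example, the function name boltzmann_entropy, and the variable names are illustrative assumptions; the value of k is the CODATA constant in J/K.

  import math

  # Boltzmann's constant in J/K (CODATA value)
  k_B = 1.380649e-23

  def boltzmann_entropy(omega):
      """Entropy S = k ln(Omega) for Omega equally likely microstates."""
      return k_B * math.log(omega)

  # Illustrative example: N fair coins have Omega = 2**N microstates,
  # so S = k * N * ln(2), the same as boltzmann_entropy(2**N).
  N = 100
  S = k_B * N * math.log(2)
  print(f"S for {N} coins: {S:.3e} J/K")

  # Connection to information theory: the Shannon entropy of the same
  # uniform distribution, measured in bits, is log2(Omega) = N bits.
  H_bits = N * math.log2(2)
  print(f"Shannon entropy: {H_bits:.0f} bits")

Note how the two printed quantities differ only by a constant factor (k ln 2 per bit), which is one way the thermodynamic and information-theoretic definitions connect.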

