Entropy is defined as
$$ S = k \ln \Omega $$
where $k$ is Boltzmann's constant and $\Omega$ is the number of microstates accessible to the system. This definition closely parallels the definition of entropy in information theory, revealing a deep connection between information theory and thermodynamics.
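To make the parallel explicit, here is a short derivation, assuming the $\Omega$ microstates are equally probable and that the information-theoretic (Shannon) entropy is measured in nats, i.e. using the natural logarithm. With $p_i = 1/\Omega$ for every state,

$$ H = -\sum_{i=1}^{\Omega} p_i \ln p_i = -\sum_{i=1}^{\Omega} \frac{1}{\Omega} \ln \frac{1}{\Omega} = \ln \Omega $$

so that $S = kH$: under these assumptions, thermodynamic entropy is the Shannon entropy of the distribution over microstates, scaled by Boltzmann's constant.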