
Information entropy

Boltzmann's entropy of a macrostate
$$S=k_B\ln\Omega$$ all microstates are equally probable
statistical weight of a macrostate, $\Omega$
$$S=-k_B\sum_ip_i\ln{p_i}$$ probability of a microstate, $p_i$
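Quick consistency check (assuming all $\Omega$ microstates are equally probable, $p_i=1/\Omega$, as stated above): the Gibbs form reduces to Boltzmann's expression,
$$S=-k_B\sum_{i=1}^{\Omega}\frac{1}{\Omega}\ln\frac{1}{\Omega}=-k_B\,\Omega\cdot\frac{1}{\Omega}\left(-\ln\Omega\right)=k_B\ln\Omega$$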
information
binary chunks: bits
information entropy
$$S=-\sum_ip_i\log_2{p_i}\qquad\qquad\color{grey}{[\ln{x}=\ln{2}\cdot\log_2x]}$$
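A minimal sketch of this formula in code (the two probability distributions are arbitrary examples):

```python
import math

def shannon_entropy(p):
    """Information entropy in bits: S = -sum_i p_i * log2(p_i)."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

print(shannon_entropy([0.5, 0.5]))   # fair coin: exactly 1 bit per toss
print(shannon_entropy([0.9, 0.1]))   # biased coin: ~0.47 bits per toss
```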


thermodynamics:
Gibbs's algorithm
$${\rm d}\ln{\Omega}\overset{!}{=}0$$ find the most probable macrostate
Gaussian (random) distribution
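A toy illustration (a two-state system is assumed purely as an example) of the most probable macrostate and the Gaussian spread of $\Omega$ around it:

```python
import math

# Two-state toy model: N coins, n heads; multiplicity Omega(N, n) = C(N, n).
N = 100
omega = [math.comb(N, n) for n in range(N + 1)]
total = sum(omega)                                   # = 2**N

# The maximum of ln(Omega) (d ln(Omega) = 0) sits at n = N/2, and Omega(n)
# around it is approximately Gaussian with width ~ sqrt(N)/2.
n_star = max(range(N + 1), key=lambda n: omega[n])
print(n_star, omega[n_star] / total)                 # 50, ~0.0796
```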


information theory:
find the information content of a message
arbitrary distribution functions
surprisal of an outcome, $I_i=-\log_2{p_i}$

Messages


letter frequencies
surprisal of a particular letter
letter combinations: th, qu, ...
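A rough sketch of letter surprisals (the frequencies below are approximate English values, quoted only for illustration):

```python
import math

# Approximate relative frequencies of some letters in English text.
freq = {'e': 0.127, 't': 0.091, 'a': 0.082, 'q': 0.001, 'z': 0.0007}

# Surprisal I = -log2(p): rare letters carry more information per occurrence.
for letter, p in freq.items():
    print(f"{letter}: {-math.log2(p):.2f} bits")
# e: 2.98 bits ... z: 10.48 bits
```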


data compression
chain of binary decisions
Huffman encoding
Huffman tree
average number of binary decisions needed ≈ information entropy of the message
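A compact sketch of Huffman coding (the symbol probabilities are made up for illustration); the mean code length, i.e. the mean number of binary decisions per symbol, lands within one bit of the entropy:

```python
import heapq
import math

def huffman_code(probs):
    """Build a Huffman code for {symbol: probability}; returns {symbol: bitstring}."""
    # Heap entries: (probability, tie-breaker, {symbol: code-so-far}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)   # the two least probable subtrees ...
        p2, i, c2 = heapq.heappop(heap)
        # ... are merged; one branch gets prefix '0', the other '1'.
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (p1 + p2, i, merged))
    return heap[0][2]

p = {'a': 0.45, 'b': 0.25, 'c': 0.15, 'd': 0.10, 'e': 0.05}   # illustrative only
code = huffman_code(p)

avg_len = sum(p[s] * len(code[s]) for s in p)             # binary decisions per symbol
entropy = -sum(pi * math.log2(pi) for pi in p.values())   # Shannon entropy in bits
print(code)                      # e.g. {'a': '0', 'b': '10', 'c': '110', ...}
print(avg_len, entropy)          # 2.0 vs ~1.98: entropy <= avg_len < entropy + 1
```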

Analogy or equivalence?


Maxwell's daemon
second law


Szilard's engine
1 bit of information destroyed per $$k_BT\ln{2}$$ of work extracted
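For scale (room temperature, $T\approx300\,\mathrm{K}$, assumed as an example):
$$k_BT\ln{2}\approx(1.38\times10^{-23}\,\mathrm{J/K})(300\,\mathrm{K})(0.693)\approx2.9\times10^{-21}\,\mathrm{J}\approx0.018\,\mathrm{eV}$$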

Case study:
S Toyabe, T Sagawa, M Ueda, E Muneyuki, M Sano
Information heat engine: converting information to energy by feedback control
Nature Phys 6 (2010) 988


information heat engine
Landauer's principle: erasing one bit of information dissipates at least $k_BT\ln{2}$ of heat