# Entropy
The following introduces Shannon entropy before von Neumann entropy.

## Shannon entropy

The Shannon entropy of a random variable, say $X$, measures the uncertainty of $X$. Intuition: if $X$ takes one of its values with certainty, there is no uncertainty at all, while if $X$ is equally likely to take any of its possible values, the uncertainty is maximal.

The following definition of entropy, denoted by $H(X)$ and measured in number of bits, reflects the two extreme cases above [MC12, Definition 5.4]:

$$H(X) = -\sum_{i} p_i \log_2 p_i,$$

where $p_i$ denotes the probability of $X$ taking on the $i$th value.

⚠ Note: 1️⃣ by convention, $0 \log_2 0 \equiv 0$; 2️⃣ the number of bits is discrete in practice, but as a metric of comparison, we need entropy to be a continuous-valued metric.

If $X$ has $n$ possible values and $Y$ has $m$, the joint entropy of $X$ and $Y$ is defined as [MC12, Definition 6.2]:

$$H(X, Y) = -\sum_{i=1}^{n} \sum_{j=1}^{m} p_{ij} \log_2 p_{ij},$$

where $p_{ij}$ denotes the probability of $X$ taking on its $i$th value and $Y$ taking on its $j$th value.

The conditional entropy of $Y$ given $X$ is defined as [MC12, (6.53)]:

$$H(Y \mid X) = -\sum_{i=1}^{n} \sum_{j=1}^{m} p_{ij} \log_2 p_{j \mid i} = \sum_{i=1}^{n} p_i \, H(Y \mid X = x_i),$$

where $p_{j \mid i}$ denotes the conditional probability of $Y$ taking on its $j$th value given that $X$ takes on its $i$th value.

The chain rule for entropy states [MC12, p. 151; Gra21, (2.48)]:

$$H(X, Y) = H(X) + H(Y \mid X),$$

or more generally,

$$H(X_1, X_2, \ldots, X_n) = \sum_{i=1}^{n} H(X_i \mid X_1, \ldots, X_{i-1}).$$
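To make these definitions concrete, here is a minimal NumPy sketch. The joint distribution `p_xy` is a made-up example (not taken from [MC12]), and the helper name `shannon_entropy` is my own; the sketch evaluates the two extreme cases and checks the chain rule numerically.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits, using the convention 0 * log2(0) = 0."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]                       # drop zero-probability outcomes
    return -np.sum(nz * np.log2(nz))

# The two extreme cases: deterministic vs. uniform over four values.
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))      # zero bits: no uncertainty
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits: maximal uncertainty

# A made-up joint distribution p_ij; rows index values of X, columns values of Y.
p_xy = np.array([[0.25, 0.25],
                 [0.00, 0.50]])
p_x = p_xy.sum(axis=1)                  # marginal distribution of X

h_xy = shannon_entropy(p_xy.ravel())    # joint entropy H(X, Y) = 1.5 bits
h_x = shannon_entropy(p_x)              # marginal entropy H(X) = 1.0 bit
# Conditional entropy computed directly as sum_i p_i * H(Y | X = x_i).
h_y_given_x = sum(px * shannon_entropy(row / px)
                  for row, px in zip(p_xy, p_x) if px > 0)

print(np.isclose(h_xy, h_x + h_y_given_x))  # True: H(X, Y) = H(X) + H(Y | X)
```

Filtering out zero-probability outcomes before taking logarithms implements the convention $0 \log_2 0 \equiv 0$ from the note above.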
## Von Neumann entropy

The Shannon entropy measures the uncertainty associated with a classical probability distribution. Quantum states are described in a similar fashion, with density operators replacing probability distributions. The von Neumann entropy of a quantum state, $\rho$, is defined as [NC10, Sec. 11.3]:

$$S(\rho) = -\operatorname{tr}(\rho \log_2 \rho),$$

where $\log_2 \rho$ denotes the base-2 matrix logarithm of $\rho$ and not the element-wise application of the base-2 logarithm to $\rho$. Equivalently, if $\lambda_i$ are the eigenvalues of $\rho$, then $S(\rho) = -\sum_i \lambda_i \log_2 \lambda_i$; that is, the von Neumann entropy is the Shannon entropy of the eigenvalue distribution of $\rho$.
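Because the eigenvalues of a density operator form a probability distribution, $S(\rho)$ can be computed without an explicit matrix logarithm. A minimal NumPy sketch along those lines; the helper name `von_neumann_entropy` and the `1e-12` cutoff are my own choices, not from [NC10]:

```python
import numpy as np

def von_neumann_entropy(rho):
    """Von Neumann entropy in bits, via the eigenvalues of rho."""
    eigvals = np.linalg.eigvalsh(rho)   # rho is Hermitian, so eigvalsh applies
    eigvals = eigvals[eigvals > 1e-12]  # drop numerically zero eigenvalues: 0 log2 0 = 0
    return -np.sum(eigvals * np.log2(eigvals))

# A pure state |0><0| has zero entropy; the maximally mixed qubit I/2 has one bit.
pure = np.array([[1.0, 0.0],
                 [0.0, 0.0]])
mixed = np.eye(2) / 2
print(von_neumann_entropy(pure))   # zero bits
print(von_neumann_entropy(mixed))  # 1.0 bit
```

Diagonalizing and reusing the Shannon formula is both simpler and numerically safer here than forming the matrix logarithm and taking the trace.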
## References