The natural unit of information (symbol: nat),[1] sometimes also nit or nepit, is a unit of information or information entropy, based on natural logarithms and powers of e, rather than the powers of 2 and base 2 logarithms, which define the shannon. This unit is also known by its unit symbol, the nat. One nat is the information content of an event when the probability of that event occurring is 1/e.
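Stated as a short worked equation (the symbol I for self-information is a standard convention introduced here for illustration, not taken from the text above):

```latex
% I(p) = self-information of an event with probability p
% (the symbol I is an illustrative convention, not from the source text).
I(p) = \ln\frac{1}{p}\ \text{nat},
\qquad
I\!\left(\tfrac{1}{e}\right) = \ln e = 1\ \text{nat}.
```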
One nat is equal to 1/ln 2 shannons ≈ 1.44 Sh or, equivalently, 1/ln 10 hartleys ≈ 0.434 Hart.[1]
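These conversion factors follow from the change-of-base rule for logarithms; the following sketch makes the arithmetic explicit:

```latex
% Change of base: log_b x = (ln x) / (ln b), so one unit of natural-log
% information equals 1/(ln b) units of base-b information.
1\ \text{nat} = \frac{1}{\ln 2}\ \text{Sh} \approx 1.4427\ \text{Sh},
\qquad
1\ \text{nat} = \frac{1}{\ln 10}\ \text{Hart} \approx 0.4343\ \text{Hart}.
```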
Boulton and Wallace used the term nit in conjunction with minimum message length,[2] which was subsequently changed by the minimum description length community to nat to avoid confusion with the nit used as a unit of luminance.[3]
Alan Turing used the natural ban.[4]
Shannon entropy (information entropy), being the expected value of the information of an event, is inherently a quantity of the same type and with a unit of information. The International System of Units, by assigning the same unit (joule per kelvin) both to heat capacity and to thermodynamic entropy, implicitly treats information entropy as a quantity of dimension one, with 1 nat = 1.[a] Systems of natural units that normalize the Boltzmann constant to 1 are effectively measuring thermodynamic entropy with the nat as its unit.
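A one-line illustration of the last point, using the standard Boltzmann entropy formula (a well-known result brought in here for illustration; it is not stated in the text above):

```latex
% Boltzmann entropy: W = number of microstates, k_B = Boltzmann constant.
S = k_\mathrm{B} \ln W
\qquad\Longrightarrow\qquad
S = \ln W \ \text{nat} \quad \text{when } k_\mathrm{B} = 1.
```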
When the Shannon entropy is written using a natural logarithm, the resulting value is implicitly measured in nats.
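As a concrete illustration (a minimal sketch, not from the source), the following Python computes the entropy of a distribution in nats via the natural logarithm, then converts to shannons by dividing by ln 2:

```python
import math

def entropy_nats(probs):
    """Shannon entropy H = -sum(p * ln p); the natural log gives nats."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def nats_to_shannons(h_nats):
    """Convert nats to shannons: 1 nat = 1/ln 2 Sh."""
    return h_nats / math.log(2)

# A fair coin has entropy ln 2 nat, which is exactly 1 Sh.
h = entropy_nats([0.5, 0.5])
print(f"{h:.6f} nat = {nats_to_shannons(h):.6f} Sh")
# 0.693147 nat = 1.000000 Sh
```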