Information theory MOC

Shannon entropy

The Shannon entropy (or information-theoretic entropy) of a discrete random variable $X : \Xi \to M$ is the Expectation of its Shannon information $I_X$,

$$H[X] = \mathbb{E}[I_X(X)] = -\sum_{x \in M} p_X(x) \log_b p_X(x)$$

where $b = 2$ corresponds to the unit shannon (Sh), $b = e$ to the nat, and $b = 10$ to the hartley (Hart). Shannon entropy is closely related to thermodynamic entropy.
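The definition above can be sketched directly in code. A minimal, hypothetical helper (the name `shannon_entropy` and the dict-based probability mass function are assumptions for illustration, not from the source):

```python
import math

def shannon_entropy(pmf, base=2):
    """Shannon entropy H[X] = -sum_x p(x) log_b p(x).

    `pmf` maps outcomes in M to probabilities p_X(x).
    base=2 gives shannons (Sh), base=math.e gives nats,
    base=10 gives hartleys (Hart).
    """
    # Terms with p(x) = 0 are skipped, using the convention 0 log 0 = 0.
    return -sum(p * math.log(p, base) for p in pmf.values() if p > 0)

# A fair coin carries exactly 1 Sh of entropy.
fair_coin = {"heads": 0.5, "tails": 0.5}
print(shannon_entropy(fair_coin))  # 1.0
```

A degenerate (deterministic) variable has entropy 0 in any base, while the uniform distribution on $|M|$ outcomes maximizes entropy at $\log_b |M|$.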


develop | en | SemBr