[Information Theory] Entropy, KL Divergence, and Mutual Information

2023, Feb 04    


Contents

  • Entropy, Cross Entropy, Conditional Entropy
  • KL Divergence
  • Mutual Information
  • KL Divergence of Two Different Normal Distributions & Bernoulli Distributions

Notes