
Sunday, October 13, 2019

Information Entropy and Data

Coming from a background in physics, I find this a great topic. It links the universe to information technologies in interesting ways, and it even includes a hint at the nature of 'surprise'. At the very minimum, it will impress your friends.

A Gentle Introduction to Information Entropy, by Jason Brownlee, in Probability

Information theory is a subfield of mathematics concerned with transmitting data across a noisy channel.

A cornerstone of information theory is the idea of quantifying how much information there is in a message. More generally, this can be used to quantify the information in an event and a random variable, called entropy, and is calculated using probability.
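To make that concrete, here is a minimal Python sketch (not from the article) that computes the information, or surprise, of a single event from its probability; the probability value is a made-up example.

    # Information content of one event, in bits: h(x) = -log2(p(x)).
    # The probability below is an invented example (a fair coin landing heads).
    from math import log2

    p = 0.5
    h = -log2(p)
    print(f"p(x) = {p}, information = {h:.3f} bits")   # -> 1.000 bits

Rarer events (smaller p) yield larger values, which is exactly the sense in which information measures surprise.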

Calculating information and entropy is a useful tool in machine learning and is used as the basis for techniques such as feature selection, building decision trees, and, more generally, fitting classification models. As such, a machine learning practitioner requires a strong understanding and intuition for information and entropy.
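As a hedged illustration of how entropy underpins decision-tree splitting, the sketch below computes the information gain of one candidate split; the class counts and the split itself are invented purely for illustration, not taken from the article.

    # Information gain = entropy(parent) - weighted entropy of the child splits.
    # All class counts here are hypothetical.
    from math import log2

    def entropy(counts):
        total = sum(counts)
        return -sum((c / total) * log2(c / total) for c in counts if c > 0)

    parent = [8, 6]               # class counts before the split
    left, right = [7, 1], [1, 5]  # class counts in each child after the split

    n = sum(parent)
    gain = entropy(parent) - sum(sum(child) / n * entropy(child) for child in (left, right))
    print(f"information gain = {gain:.3f} bits")   # -> about 0.396 bits

A split with higher information gain reduces the entropy of the class labels more, which is why decision-tree algorithms prefer it.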

In this post, you will discover a gentle introduction to information entropy.

After reading this post, you will know:

Information theory is concerned with data compression and transmission, builds upon probability, and supports machine learning.

Information provides a way to quantify the amount of surprise for an event measured in bits.

Entropy provides a measure of the average amount of information needed to represent an event drawn from a probability distribution for a random variable.
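For example, here is a minimal sketch of entropy as the average information of a random variable, using a made-up four-outcome distribution (not from the article):

    # Shannon entropy in bits: H(X) = -sum(p * log2(p)).
    # The distribution below is a hypothetical example.
    from math import log2

    probs = [0.5, 0.25, 0.125, 0.125]
    H = -sum(p * log2(p) for p in probs)
    print(f"entropy = {H:.3f} bits")   # -> 1.750 bits

A uniform distribution over the same four outcomes would give 2 bits, the maximum; the skewed distribution above is more predictable and so needs fewer bits on average.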

Discover Bayes optimization, Naive Bayes, maximum likelihood, distributions, cross-entropy, and much more in my new book, with 28 step-by-step tutorials and full Python source code.

Let’s get started. ... "
