Variable length coding exercises

First of all, read my Information Theory Notes.  These exercises are similar to, but not exactly the same as, the 3 exercises at the end of the Notes.

You'll be writing your answers into the Comments-to-Teacher.

For each of the exercises below, I'll be asking you to provide:

a) The Shannon entropy, which equals the minimum average number of bits required to send the result of a single flip of the coin in question.

b) The coding itself that you've come up with (see example below)

c) The average number of bits required by your coding scheme.
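For part (a), a small Python sketch (just for checking your arithmetic; the exercises can be done by hand) of the standard entropy formula H = -Σ p·log2(p):

```python
import math

def shannon_entropy(probs):
    """Minimum average bits per outcome for the given probability list."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# The weighted coin from class, H: 0.75, T: 0.25
print(round(shannon_entropy([0.75, 0.25]), 3))  # -> 0.811
```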

Note:  You do not have to assume that grouping is always necessary to reduce the number of bits.  Sometimes variable-length encoding of a single result is already quite good.

Example from class: come up with a better-than-the-trivial coding scheme for a two-sided weighted coin whose probabilities are: H: 0.75, T: 0.25

Answer:

a) Shannon entropy is ~0.811 bits per flip

b) use a grouping of 2 flips:  HH: 0,  HT: 10,  TH: 110,  TT: 111

c) that coding averages ~0.844 bits per flip
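A sketch of the part (c) arithmetic, assuming the prefix-free grouped code HH: 0, HT: 10, TH: 110, TT: 111: weight each group's codeword length by the group's probability, then divide by the 2 flips each group covers.

```python
# Grouped code for pairs of flips; H has prob 0.75, T has prob 0.25
code = {"HH": "0", "HT": "10", "TH": "110", "TT": "111"}
p = {"H": 0.75, "T": 0.25}

# Expected codeword length per group, then per single flip
avg_per_group = sum(p[a] * p[b] * len(code[a + b]) for a, b in code)
print(round(avg_per_group / 2, 3))  # -> 0.844
```

Note this is slightly above the ~0.811 entropy bound, which is expected for any real code.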

Exercises:

Ex. 1: You have a 2-sided coin, with probabilities H: 0.75, T: 0.25, and you'll use a group of 3 results at a time, instead of the 2 we used in class.

Ex. 2: You have a 3-sided coin, with sides H (heads), T (tails), and M (midriffs).  M has prob. 0.5, and the other two have 0.25 each.

Ex. 3: You have a 5-sided coin with equal probabilities.
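If you'd like to sanity-check your own answers, here's a hedged helper sketch (names like is_prefix_free and avg_bits_per_flip are my own, not from the Notes): it verifies that no codeword is a prefix of another, and computes the average bits per flip for a grouped code.

```python
import math

def is_prefix_free(codewords):
    """True if no codeword is a prefix of another (needed for unambiguous decoding)."""
    words = sorted(codewords)
    # In sorted order, any prefix of a word immediately precedes words it prefixes
    return all(not words[i + 1].startswith(words[i]) for i in range(len(words) - 1))

def avg_bits_per_flip(code, probs, flips_per_group=1):
    """Expected codeword length per group, divided by flips per group."""
    total = sum(
        len(word) * math.prod(probs[s] for s in group)
        for group, word in code.items()
    )
    return total / flips_per_group

# Checking the class example
code = {"HH": "0", "HT": "10", "TH": "110", "TT": "111"}
probs = {"H": 0.75, "T": 0.25}
print(is_prefix_free(code.values()))      # -> True
print(avg_bits_per_flip(code, probs, 2))  # -> 0.84375
```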