In 1948, Claude E. Shannon introduced, in "The Mathematical Theory of Communication", his famous formula for the entropy of a discrete set of probabilities p_1, ..., p_n:

    H = -∑ p_i log2 p_i
We will apply this formula to an arbitrary text string by letting p_i be the relative frequencies of occurrence of characters in the string. For example, the entropy of the string "Northeastern European Regional Contest", which is 38 characters long (including 3 spaces), is 3.883 rounded to 3 digits after the decimal point. The following table shows the relative frequencies and the corresponding summands for the entropy of this string.
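The definition can be checked with a short sketch (the function name `entropy` is my own choice, not part of the problem statement); it treats characters case-sensitively and counts spaces, which reproduces the stated value for the example string:

```python
from collections import Counter
from math import log2

def entropy(s: str) -> float:
    """Shannon entropy H = -sum p_i * log2(p_i), where p_i is the
    relative frequency of each distinct character in s."""
    n = len(s)
    counts = Counter(s)  # character -> number of occurrences
    return -sum(c / n * log2(c / n) for c in counts.values())

h = entropy("Northeastern European Regional Contest")
print(round(h, 3))  # 3.883
```

Each summand in the table corresponds to one term `-(c/n) * log2(c/n)` of this sum.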
Your task is to find a string with the given entropy.