A Note on the Shannon Entropy of Short Sequences

 

We introduce the information fluctuation of a source, $F(U)$, defined as the second central moment of the random variable that measures the information content of a source symbol. For source sequences of length $L$ symbols, we propose a benchmark based on $F(U)$ that is more realistic than the usual figure of the number of code letters per source letter. This approach also yields an alternative interpretation of typical sequences.
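As a sketch of the quantity involved (the notation $p_i$, $I$, $H(U)$ is assumed here for illustration, not taken from the paper body): writing $I(u_i) = -\log p_i$ for the self-information of symbol $u_i$, the entropy is its mean and $F(U)$ is its variance,

$$
H(U) = \mathbb{E}[I] = -\sum_i p_i \log p_i, \qquad
F(U) = \mathbb{E}\!\left[\bigl(I - H(U)\bigr)^{2}\right]
     = \sum_i p_i \bigl(\log p_i + H(U)\bigr)^{2}.
$$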