Non-asymptotic Equipartition Properties with Respect to Information Quantities

Prof. En-hui YANG
University of Waterloo
Date: Wednesday, 7 December 2011
Time: 11:00 am – 12:00 pm
Venue: Room 833, Ho Sin Hang Engineering Building, CUHK
Abstract: 

The asymptotic equipartition property (AEP) is fundamental to information theory. It is not only instrumental to source coding theorems, but also underlies almost all asymptotic coding theorems in information theory (including source, channel, and multi-user coding) through the concepts of typical sets and typical sequences.
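In its standard asymptotic form for a memoryless source X_1, X_2, ... with entropy H(X), the AEP states that

    -(1/n) log p(X_1, X_2, ..., X_n) → H(X)   in probability as n → ∞,

so that, for large n, almost all of the probability mass concentrates on roughly 2^{nH(X)} typical sequences, each with probability about 2^{-nH(X)}.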

However, in the non-asymptotic regime, where one wants to establish coding results for finite block length n, the AEP in its current form cannot be applied in general. In this talk, we will first present our newly established non-asymptotic counterpart of the AEP, broadly referred to as the non-asymptotic equipartition property (NEP), which can be applied to any information quantity such as entropy, conditional entropy, mutual information, and relative entropy (divergence). Using the NEP with respect to entropy, conditional entropy, or relative entropy, we will then discuss our newly established non-asymptotic source coding and channel coding theorems. For any finite block length n, these theorems reveal a complete picture of the trade-off between the rate gap (defined as the difference between the coding rate and the entropy, in lossless source coding, or the channel capacity, in channel coding) and the word error probability, when the word error probability is a constant or goes to zero with block length n at a sub-polynomial, polynomial, or sub-exponential speed.

For example, this complete picture tells us that for any binary-input memoryless channel with a uniform capacity-achieving distribution, if the word error probability goes to zero with block length n at a polynomial speed with degree d, the coding rate of linear block codes is only about σ_H(X|Y)·√(2d ln n / n) + d ln n / n away from the channel capacity; if the word error probability is kept slightly above 0.5, the coding rate of linear block codes can even be slightly above the capacity! In the case of a binary symmetric channel with crossover probability p = 0.12 and block length 1000, the word error probability is around 0.65 and the coding rate is 0.21% above the capacity! A similar trade-off also holds for fixed-rate lossless source coding.
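As a rough numerical illustration (not the talk's exact construction), the short Python sketch below evaluates the rate-gap expression σ_H(X|Y)·√(2d ln n / n) + d ln n / n quoted above for a binary symmetric channel, taking σ_H(X|Y) to be the square root of the conditional varentropy under the uniform input distribution; both that choice and the use of natural logarithms follow the abstract's wording rather than a precise statement of the NEP results.

    # Illustrative sketch only: evaluates the rate-gap approximation
    #   sigma_H(X|Y) * sqrt(2*d*ln(n)/n) + d*ln(n)/n
    # for a binary symmetric channel BSC(p), assuming sigma_H(X|Y) is the
    # square root of the conditional varentropy under the uniform
    # (capacity-achieving) input distribution.
    import math

    def bsc_capacity(p):
        """Capacity of BSC(p) in bits per channel use: 1 - H_b(p)."""
        hb = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
        return 1.0 - hb

    def bsc_sigma_H(p):
        """sqrt(Var[-log2 P(X|Y)]) for BSC(p) with uniform input (assumed definition)."""
        # -log2 P(X|Y) takes the value -log2(1-p) w.p. (1-p) and -log2(p) w.p. p.
        return math.sqrt(p * (1 - p)) * abs(math.log2((1 - p) / p))

    def rate_gap(p, n, d):
        """Approximate gap between coding rate and capacity when the word
        error probability decays polynomially in n with degree d."""
        sigma = bsc_sigma_H(p)
        return sigma * math.sqrt(2 * d * math.log(n) / n) + d * math.log(n) / n

    if __name__ == "__main__":
        p, n, d = 0.12, 1000, 1.0
        print(f"Capacity of BSC({p}): {bsc_capacity(p):.4f} bits/use")
        print(f"Approximate rate gap at n={n}, d={d}: {rate_gap(p, n, d):.4f} bits/use")

With p = 0.12, n = 1000, and d = 1, for instance, it prints the capacity (about 0.47 bits per channel use) together with the corresponding approximate gap.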


Biography: 

En-hui Yang has been with the Department of Electrical and Computer Engineering, University of Waterloo, Ontario, Canada, since June 1997. He spent his sabbatical leave at the Chinese University of Hong Kong from 2003 to 2004. He is the founding director of the Leitch-University of Waterloo Multimedia Communications Lab and a co-founder of SlipStream Data Inc. (now a subsidiary of Research In Motion). He currently serves as an Associate Editor for the IEEE Transactions on Information Theory and sits on the Overseas Expert Advisory Committee for the Overseas Chinese Affairs Office of the State Council of China.

A Fellow of the IEEE, the Canadian Academy of Engineering, and the Royal Society of Canada (The Academies of Arts, Humanities and Sciences of Canada), Dr. Yang is also a recipient of several research awards, including the prestigious inaugural (2007) Ontario Premier's Catalyst Award for the Innovator of the Year and the 2007 Ernest C. Manning Award of Distinction, one of Canada's most prestigious innovation prizes. Products based on his inventions and commercialized by SlipStream received the 2006 Ontario Global Traders Provincial Award. With over 180 papers and around 200 patents and patent applications worldwide, his research has had a great impact on the daily lives of hundreds of millions of people in over 170 countries, through commercialized products, open-source video coding software, and video coding standards. In 2011, he was selected for inclusion in Canadian Who's Who.
