Entropies, Codes and Groups

Dr. Terence CHAN
University of South Australia
Monday, 17 December, 2012
2:30 – 3:30 pm
Room 833, Ho Sin Hang Engineering Building

Entropy functions and information inequalities are some of the most important tools in communications. They are particularly useful in deriving converse coding theorems for communication and compression problems. In this talk, we will examine some interesting properties of entropies and information inequalities. Our focus will be on the interplay among codes, entropies and groups.

By treating a code (whether a classical error control code, a secret sharing code, or a network code) as a set of random variables, many properties of the code can be re-interpreted "information-theoretically". One example is a generalised Greene's theorem, which says that the weight enumerating polynomial of a code is determined by the entropies of the codeword symbol random variables. By exploiting the link between codes and entropies, we have extended Delsarte's linear programming bound to a broader class of codes, including secret sharing codes. Furthermore, by constructing the random variables (and hence the induced code) from a set of subgroups, we obtained a new group inequality.
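To illustrate the flavour of the code–entropy link (this is an illustrative sketch, not the exact statement of the theorem in the talk; the function names are invented for the example), consider a uniformly random codeword X = (X_1, ..., X_n) of a binary linear code. The entropy H(X_A) of any subset A of symbol positions equals log2 of the size of the code's projection onto A, and inclusion-exclusion over these entropies recovers the full weight distribution:

```python
from itertools import combinations
from math import log2

def entropy_of_projection(codewords, positions):
    """H(X_A) in bits: log2 of the number of distinct projections onto A.
    A uniform codeword of a linear code projects uniformly onto the image."""
    return log2(len({tuple(c[i] for i in positions) for c in codewords}))

def weight_distribution_from_entropies(codewords, n):
    """Recover A_w (number of codewords of Hamming weight w) from entropies.
    For a linear code, #{c : supp(c) subset of S} = 2^(H(X) - H(X_{S^c}));
    inclusion-exclusion then isolates codewords whose support is exactly S."""
    H_full = entropy_of_projection(codewords, range(n))

    def supp_within(S):
        # Count codewords whose support lies inside S, via entropies only.
        comp = [i for i in range(n) if i not in S]
        return round(2 ** (H_full - entropy_of_projection(codewords, comp)))

    A = [0] * (n + 1)
    for w in range(n + 1):
        for S in combinations(range(n), w):
            A[w] += sum((-1) ** (w - t) *
                        sum(supp_within(T) for T in combinations(S, t))
                        for t in range(w + 1))
    return A

# The [3,2] binary even-weight code: codewords 000, 110, 101, 011.
code = [(0, 0, 0), (1, 1, 0), (1, 0, 1), (0, 1, 1)]
print(weight_distribution_from_entropies(code, 3))  # [1, 0, 3, 0]
```

The key point is that the weight distribution here is computed purely from the entropy values H(X_A), never by inspecting individual codeword weights directly.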


Terence Chan received his Ph.D. degree in Information Engineering in 2001 from The Chinese University of Hong Kong. Upon receiving his degree, he was a visiting assistant professor in the Department of Information Engineering at the same university. From February 2002 to June 2004, he was a Postdoctoral Fellow in the Department of Electrical and Computer Engineering at the University of Toronto. He was an assistant professor at the University of Regina between 2004 and 2006. He is now a Senior Research Fellow at the Institute for Telecommunications Research, University of South Australia.

His research interests are in information theory and network communications. He received the Sir Edward Youde Fellowship in 2000 and the Croucher Foundation Fellowship in 2002.