Multivariate mutual information: From secret key agreement to clustering of random variables
A new notion of multivariate mutual information (MMI) is introduced based on the concept of residual independence. Using the theory of Dilworth truncation for submodular functions, we show that the proposed MMI precisely characterizes the fundamental limits of multi-terminal secret key agreement. Furthermore, we show that the proposed MMI satisfies many fundamental information-theoretic properties that previously proposed notions of multivariate correlation fail to satisfy. We conclude the talk by demonstrating that the proposed MMI leads to efficient clustering of random variables in machine learning.
This is joint work with Prof. Chung Chan of the Institute of Network Coding at the Chinese University of Hong Kong.
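To give a flavor of the clustering application mentioned in the abstract, here is a toy sketch of my own (not the MMI algorithm from the talk): random variables are grouped by their empirical pairwise mutual information, so that noisy copies of a common hidden source cluster together while an independent variable does not. All names (`z`, `x1`, `x2`, `w`, `mutual_information`) are hypothetical, chosen for illustration.

```python
import numpy as np

def mutual_information(x, y):
    """Empirical mutual information in bits between two discrete samples."""
    mi = 0.0
    for a in np.unique(x):
        for b in np.unique(y):
            pxy = np.mean((x == a) & (y == b))     # joint probability estimate
            if pxy > 0:
                mi += pxy * np.log2(pxy / (np.mean(x == a) * np.mean(y == b)))
    return mi

rng = np.random.default_rng(0)
n = 5000
z = rng.integers(0, 2, n)                          # hidden common source
flip = lambda p: rng.random(n) < p                 # random bit-flip mask
x1 = np.where(flip(0.05), 1 - z, z)                # noisy copy of z
x2 = np.where(flip(0.05), 1 - z, z)                # another noisy copy of z
w = rng.integers(0, 2, n)                          # independent of z

mi_x1x2 = mutual_information(x1, x2)               # large: shared source
mi_x1w = mutual_information(x1, w)                 # near zero: independent
print(f"I(x1;x2) ~ {mi_x1x2:.3f} bits, I(x1;w) ~ {mi_x1w:.3f} bits")
```

Thresholding such pairwise scores recovers the intended clusters {x1, x2} and {w}; the MMI of the talk generalizes this idea from pairs to arbitrary subsets of variables.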
A brief bio of Prof. Tie Liu:
Tie Liu was born in Jilin, China in 1976. He received his B.S. (1998) and M.S. (2000) degrees, both in Electrical Engineering, from Tsinghua University, Beijing, China, and a second M.S. degree in Mathematics (2004) and a Ph.D. degree in Electrical and Computer Engineering (2006) from the University of Illinois at Urbana-Champaign. Since August 2006 he has been with Texas A&M University, where he is currently an Associate Professor in the Department of Electrical and Computer Engineering. His primary research interests are information theory and statistical information processing.
Dr. Liu received an M. E. Van Valkenburg Graduate Research Award (2006) from the University of Illinois at Urbana-Champaign and a Faculty Early Career Development (CAREER) Award (2009) from the National Science Foundation. He was a Technical Program Committee Co-Chair for the 2008 IEEE Global Communications Conference (GLOBECOM) and a General Co-Chair for the 2011 IEEE North American School of Information Theory. He currently serves as an Associate Editor for Shannon Theory for the IEEE Transactions on Information Theory.