Yury Polyanskiy
        Assistant Professor

              Curriculum Vitae
              Publications & Preprints

Research:   Information theory, coding theory & related fields

Education:     2010   Princeton University, Ph.D. (Advisers: H. V. Poor and S. Verdú)
               2005   MIPT, M.S. (honors), Dept. of General and Applied Physics

Address:     32-D668, MIT
77 Massachusetts Avenue
Cambridge, MA 02139
Phone:     (617) 324-0047

About me

I am the Robert J. Shillman Assistant Professor of Electrical Engineering and Computer Science in the Dept. of EECS at MIT and a member of LIDS. Previously I was a postdoc at Princeton University hosted by Sergio Verdú, with whom I worked on various topics in information theory. In my spare time, I enjoy playing with mathematics (especially algebraic geometry and algebraic topology) and hacking on the Linux kernel. You can find more information about me in my CV and papers.

Channel dispersion

My current research interests revolve around finding exact and approximate answers to non-asymptotic questions in communication theory. This direction has been explored in my thesis from an information-theoretic (fundamental limits) angle. For example, the capacity of an additive white Gaussian noise (AWGN) channel with SNR = 0 dB is given by Shannon's formula

C = ½ log₂(1 + SNR) = ½ log₂ 2 = 0.5 bit per channel use.

This classical result means that one can communicate at rates arbitrarily close to 0.5 bits per channel use with an arbitrarily small probability of error ε in the limit of an infinite number of channel uses (and no such communication is possible at any higher rate). This fundamental observation, however, means little for the practical engineer, who is always limited by delay requirements. In such circumstances he would rather ask the question:

Assume I agree to step back from the capacity by a factor η = 0.9 and I also agree to tolerate a probability of error ε = 10⁻³. What is the minimum number n of channel uses that I need?

Obtaining an exact solution is beyond the reach of current computers. Computationally feasible bounds, however, show that [PPV10]

2550   ≤   n   ≤  2850

These bounds are hard to compute and even harder to analyze. One can, however, get a good approximation and insight into the behavior of n via the following expression:

log₂ M*(n, ε) ≈ nC − √(nV) Q⁻¹(ε)

Here M*(n, ε) is the maximal number of messages that can be sent over n channel uses with probability of error at most ε, C is Shannon's capacity, Q⁻¹ is the functional inverse of the standard Gaussian complementary CDF, while V is a new fundamental characteristic of the channel, the channel dispersion, which for the AWGN channel is given by [PPV10]

V = SNR(SNR + 2) / (2(SNR + 1)²) · log₂²e
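Spelled out for a target rate R = ηC, the required blocklength n solves the following implicit equation (a derivation sketch; I keep the ½ log₂ n correction term from the refined expansion in [PPV10], which is what brings the estimate close to the quoted number):

```latex
n\,\eta C \;=\; nC \;-\; \sqrt{nV}\,Q^{-1}(\varepsilon) \;+\; \tfrac{1}{2}\log_2 n
```

Dropping the log term yields the closed form n ≈ V (Q⁻¹(ε))² / (C(1 − η))², which gives a somewhat larger estimate.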

In our running example of the AWGN with SNR = 0 dB, η = 0.9 and ε = 10⁻³, one gets

n ≈ 2750

an excellent match with the firm bounds. The closest classical approach to the same question (that of error-exponents) gives an estimate

n ≈ 4120

which is evidently way off.
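The back-of-the-envelope number above is easy to reproduce. Below is a minimal Python sketch (the function name is my own; it finds the smallest n satisfying the normal approximation with the ½ log₂ n correction, assuming C in bits and V in bits² per channel use):

```python
from math import e, log2, sqrt
from statistics import NormalDist


def min_blocklength(snr, eta, eps):
    """Smallest n with n*eta*C <= n*C - sqrt(n*V)*Qinv(eps) + 0.5*log2(n)."""
    C = 0.5 * log2(1 + snr)                                     # capacity, bits/ch. use
    V = snr * (snr + 2) / (2 * (snr + 1) ** 2) * log2(e) ** 2   # AWGN dispersion, bits^2
    q_inv = NormalDist().inv_cdf(1 - eps)                       # Q^{-1}(eps)
    n = 1
    while n * C * (1 - eta) < sqrt(n * V) * q_inv - 0.5 * log2(n):
        n += 1
    return n


# SNR = 0 dB (i.e. 1 in linear scale), eta = 0.9, eps = 1e-3:
print(min_blocklength(snr=1.0, eta=0.9, eps=1e-3))  # a value near 2750
```

The simple linear scan suffices here since the crossing point is only in the low thousands; for large blocklengths one would bisect instead.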

For a much more detailed introduction to this line of research please check Chapter 1 of my thesis (requires no information theory background!).

    [Figure: Bounds on maximal achievable rate for the AWGN(0 dB) channel]