Research: Information theory, coding theory & related fields
Education: 2010 — Princeton University, Ph.D. (Advisers: H. V. Poor and S. Verdú)
           2005 — MIPT, M.S. (honors), Dept. of General and Applied Physics
Address: 32-D668, MIT, 77 Massachusetts Avenue, Cambridge, MA 02139
Phone: (617) 324-0047
I am the Robert J. Shillman Assistant Professor in the Department of Electrical Engineering and Computer Science (EECS) at MIT and a member of LIDS. Previously I was a postdoc at Princeton University, hosted by Sergio Verdú, with whom I worked on various topics in information theory. In my spare time I enjoy playing with mathematics (especially algebraic geometry and algebraic topology) and hacking on the Linux kernel. You can find more information about me in my CV and papers.
My current research interests revolve around finding exact and approximate answers to non-asymptotic questions in communication theory. This direction was explored in my thesis from an information-theoretic (fundamental limits) angle. For example, the capacity of an additive white Gaussian noise (AWGN) channel with SNR = 0 dB is given by Shannon's formula: C = ½ log₂(1 + SNR) = ½ log₂(1 + 1) = 0.5 bit per channel use.
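As a quick numerical sanity check of Shannon's formula, here is a small sketch (the dB-to-linear conversion 10^(dB/10) is the standard convention, assumed rather than stated on this page):

```python
import math

def awgn_capacity(snr_db):
    # Shannon capacity of a real AWGN channel, in bits per channel use:
    # C = 1/2 * log2(1 + SNR), with the SNR supplied in dB.
    snr = 10 ** (snr_db / 10)
    return 0.5 * math.log2(1 + snr)

print(awgn_capacity(0))  # SNR = 0 dB means SNR = 1, so C = 0.5
```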
This classical result means that one can communicate at rates arbitrarily close to 0.5 bits per channel use with an arbitrarily small probability of error ε in the limit of an infinite number of channel uses (and no such communication is possible at any higher rate). This fundamental observation, however, means little for a practical engineer, who is always limited by delay requirements. In such circumstances one would rather ask: what is the maximal rate at which one can communicate with a given finite blocklength n and error probability ε? Non-asymptotic bounds of the kind plotted below answer such questions, pinning down, for example, the blocklength needed to approach capacity to a range as narrow as 2550 ≤ n ≤ 2850.
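The finite-blocklength question can also be answered approximately in closed form. Below is a minimal sketch of the Gaussian (normal) approximation R(n) ≈ C − √(V/n)·Q⁻¹(ε) + log₂(n)/(2n) for the AWGN channel, where V is the channel dispersion; the target ε = 10⁻³ and the blocklength values are illustrative assumptions, not figures taken from this page:

```python
import math
from statistics import NormalDist

def normal_approx_rate(snr_db, n, eps):
    """Normal approximation to the maximal achievable rate (bits per
    channel use) over n uses of a real AWGN channel at error probability
    eps:  R(n) ~ C - sqrt(V/n) * Qinv(eps) + log2(n) / (2n)."""
    snr = 10 ** (snr_db / 10)
    cap = 0.5 * math.log2(1 + snr)  # Shannon capacity
    # Channel dispersion V of the AWGN channel, in bits^2:
    disp = snr * (snr + 2) / (2 * (snr + 1) ** 2) * math.log2(math.e) ** 2
    q_inv = NormalDist().inv_cdf(1 - eps)  # Q^{-1}(eps)
    return cap - math.sqrt(disp / n) * q_inv + math.log2(n) / (2 * n)

# Illustrative blocklengths for the AWGN(0 dB) channel:
for n in (500, 1000, 2850):
    r = normal_approx_rate(0, n, 1e-3)
    print(f"n = {n:4d}: R ~ {r:.4f} bits ({r / 0.5:.1%} of capacity)")
```

Running the loop shows the approximate rate climbing toward the capacity of 0.5 bits as n grows, which is exactly the delay-versus-rate trade-off the question above is about.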
For a much more detailed introduction to this line of research, please see Chapter 1 of my thesis (it requires no information theory background!).
[Figure: Bounds on the maximal achievable rate for the AWGN(0 dB) channel]