We characterize fundamental limits for the Slepian-Wolf problem, the multiple-access channel and the asymmetric broadcast channel in the finite blocklength setting. For the Slepian-Wolf problem (distributed lossless source coding), we introduce a fundamental quantity known as the entropy dispersion matrix. We show that if this matrix is positive-definite, the optimal rate region under the constraint of a fixed blocklength and non-zero error probability has a curved boundary, in contrast to the polyhedral boundary of the asymptotic Slepian-Wolf rate region. In addition, the entropy dispersion matrix governs the rate of convergence of the non-asymptotic region to the asymptotic one. We develop a general universal achievability procedure for finite blocklength analyses of other network information theory problems such as the multiple-access channel and the broadcast channel. We provide inner bounds to these problems using a key result known as the vector rate redundancy theorem, which is proved using a multidimensional version of the Berry-Esseen theorem. We show that a so-called information dispersion matrix characterizes these inner bounds.
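As a concrete illustration of the entropy dispersion matrix, the following sketch estimates it numerically for a toy two-source Slepian-Wolf setting. It assumes the standard entropy density vector h(x, y) = (-log P(x|y), -log P(y|x), -log P(x, y)), so that the entropy dispersion matrix is the covariance of h under the joint pmf; the function name and this particular parameterization are illustrative choices, not taken from the text above.

```python
import numpy as np

def entropy_dispersion_matrix(P):
    """Given a 2-D array P with P[x, y] = joint pmf of (X, Y), return the
    mean entropy vector and the entropy dispersion matrix (covariance of
    the entropy density vector), both in bits. Illustrative sketch only."""
    P = np.asarray(P, dtype=float)
    Px = P.sum(axis=1)  # marginal pmf of X
    Py = P.sum(axis=0)  # marginal pmf of Y
    mean = np.zeros(3)
    second = np.zeros((3, 3))
    for x in range(P.shape[0]):
        for y in range(P.shape[1]):
            p = P[x, y]
            if p == 0.0:
                continue  # zero-probability pairs contribute nothing
            # Assumed entropy density vector at (x, y):
            h = np.array([
                -np.log2(p / Py[y]),  # -log2 P(x | y)
                -np.log2(p / Px[x]),  # -log2 P(y | x)
                -np.log2(p),          # -log2 P(x, y)
            ])
            mean += p * h
            second += p * np.outer(h, h)
    cov = second - np.outer(mean, mean)
    return mean, cov

# Two independent uniform bits: every entropy density is constant, so the
# dispersion matrix vanishes and the mean is the entropy vector (1, 1, 2).
mean, V = entropy_dispersion_matrix(np.full((2, 2), 0.25))
```

For a degenerate source like the independent uniform pair above, the dispersion matrix is zero and the non-asymptotic region collapses onto the asymptotic one; correlated sources generically yield a positive-definite matrix, which is the regime where the curved boundary described above appears.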