We characterize second-order coding rates (or dispersions) for distributed lossless source coding (the Slepian-Wolf problem). We introduce a fundamental quantity, the entropy dispersion matrix, which is analogous to scalar dispersion quantities. We show that if this matrix is positive definite, the optimal rate region under a fixed-blocklength, non-zero-error-probability constraint has a curved boundary, in contrast to the polyhedral asymptotic Slepian-Wolf region. In addition, the entropy dispersion matrix governs the rate at which the non-asymptotic region converges to the asymptotic one. As a by-product of our analysis, we develop a general universal achievability procedure for dispersion analysis of other network information theory problems, such as the multiple-access channel. Numerical examples show how the region given by the Gaussian approximation compares to the Slepian-Wolf region.
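To make the central quantity concrete, the following is a minimal sketch, assuming the entropy dispersion matrix is taken as the covariance matrix of the entropy density vector $(-\log P(x|y), -\log P(y|x), -\log P(x,y))$ under the joint distribution; the doubly symmetric binary source with crossover probability $p = 0.11$ is purely an illustrative choice, not from the text above.

```python
import numpy as np

# Illustrative source: doubly symmetric binary source (DSBS).
# X is uniform on {0,1} and Y = X xor Z with Z ~ Bernoulli(p).
p = 0.11  # hypothetical crossover probability for illustration
P = np.array([[(1 - p) / 2, p / 2],
              [p / 2, (1 - p) / 2]])  # joint pmf P(x, y)

Px = P.sum(axis=1)  # marginal of X
Py = P.sum(axis=0)  # marginal of Y

# Entropy density vector h(x, y) =
#   (-log2 P(x|y), -log2 P(y|x), -log2 P(x, y))
h, probs = [], []
for x in range(2):
    for y in range(2):
        pxy = P[x, y]
        h.append([-np.log2(pxy / Py[y]),
                  -np.log2(pxy / Px[x]),
                  -np.log2(pxy)])
        probs.append(pxy)
h = np.array(h)
probs = np.array(probs)

# Mean entropy vector [H(X|Y), H(Y|X), H(X,Y)] in bits
H = probs @ h

# Entropy dispersion matrix: covariance of the entropy density vector
V = (h - H).T @ (probs[:, None] * (h - H))
```

For the DSBS, symmetry gives $H(X|Y) = H(Y|X) = h_b(p)$ and $H(X,Y) = 1 + h_b(p)$, where $h_b$ is the binary entropy function; the computed `H` and the symmetric, positive semidefinite `V` can be checked against these values.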