Traditional analysis in information theory has focused on the asymptotic regime, in which an information-processing task is assumed to be repeated an unlimited number of times. This approach has given operational meaning to quantities such as entropy and mutual information. In recent years there has been renewed interest in non-asymptotic (one-shot) information-theoretic analysis, where an information-processing task is performed only once. We carry this analysis over to several multi-party scenarios (multiple senders and/or receivers). Our results are asymptotically optimal, i.e., in the limit of unlimited repetitions of the task they recover the answers given by the asymptotic analysis. In particular, our results cover the distributed encoding of correlated memoryless sources, whose rate region was first characterized by Slepian and Wolf.
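For reference, the asymptotic Slepian–Wolf rate region mentioned above is the following standard statement (assuming two correlated memoryless sources $X_1$ and $X_2$, encoded separately at rates $R_1$ and $R_2$ and decoded jointly):

```latex
% Achievable rate pairs (R_1, R_2) for distributed encoding
% of correlated memoryless sources X_1, X_2 (Slepian-Wolf):
\begin{align*}
R_1       &\ge H(X_1 \mid X_2), \\
R_2       &\ge H(X_2 \mid X_1), \\
R_1 + R_2 &\ge H(X_1, X_2).
\end{align*}
```

The one-shot analysis in this work recovers this region in the limit of unlimited repetitions of the task.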