dc.description.abstract | Normalizing flows are a promising avenue in both density estimation and variational inference, promising models that can both generate new samples and evaluate the exact density, each at reasonable computational cost. In addition, normalizing flows incorporate deep learning, which guarantees the existence of arbitrarily good approximations of any distribution. This thesis has two purposes. First, we observe that normalizing flows consist of several components, each of which is not well defined; we provide a formalisation of these components that lays the groundwork for future theoretical work. Through this formalisation, we both derive new theoretical results and give an overview of the current literature. The second purpose is to fill the gap left by normalizing flows that are computationally fast and have many attractive properties, but are less complex than other flows in the literature. We introduce new normalizing flows that retain the attractive qualities of the less complex flows while increasing their flexibility. We show this by proving that their asymptotic behaviour is exactly the same as that of the more complex flows, and then confirming empirically that our proposed flows improve upon the simpler ones and indeed fill the gap. In addition, we find interesting results for variational inference, showing that as the dimension increases, more complex flows can perform much better in a variational inference setting than the simpler flows studied in the literature. | eng |