File Download
There are no files associated with this item.
- Appears in Collections:
Conference Paper: Overcoming Posterior Collapse in Variational Autoencoders Via EM-Type Training
Title | Overcoming Posterior Collapse in Variational Autoencoders Via EM-Type Training |
---|---|
Authors | Li, Ying; Cheng, Lei; Yin, Feng; Zhang, Michael Minyi; Theodoridis, Sergios |
Issue Date | 4-Jun-2023 |
Abstract | Variational autoencoders (VAEs) are among the most prominent deep generative models for learning the underlying statistical distribution of high-dimensional data. However, VAE training suffers from a severe issue called posterior collapse: the learned posterior distribution collapses to the assumed, pre-selected prior distribution, which limits the capacity of the learned posterior to convey data information. Previous work has proposed a heuristic training scheme to mitigate this issue, whose core idea is to train the encoder and the decoder in an alternating fashion. However, this scheme still lacks a theoretical interpretation, and this paper, for the first time, fills this gap by inspecting the scheme through the lens of the expectation-maximization (EM) framework. Under this framework, we propose a novel EM-type training algorithm that yields a controllable optimization process and allows for further extensions, e.g., employing implicit distribution models. Experimental results corroborate the superior performance of the proposed EM-type VAE training algorithm across various metrics. |
Persistent Identifier | http://hdl.handle.net/10722/333894 |
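The record contains no code or algorithmic detail beyond the abstract. As a generic illustration of the expectation-maximization template the abstract invokes (alternating an E-step that updates the posterior and an M-step that updates the model parameters), and explicitly not the paper's EM-type VAE algorithm, a minimal EM fit of a two-component Gaussian mixture in NumPy might look like:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy 1-D data drawn from two well-separated Gaussians
x = np.concatenate([rng.normal(-2.0, 0.5, 200), rng.normal(3.0, 0.5, 200)])

# Initial parameters for K = 2 components (means, std devs, mixture weights)
mu = np.array([-1.0, 1.0])
sigma = np.array([1.0, 1.0])
pi = np.array([0.5, 0.5])

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

for _ in range(50):
    # E-step: posterior responsibilities r[n, k] of component k for point n
    dens = pi * normal_pdf(x[:, None], mu, sigma)   # shape (N, K)
    r = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate parameters from the responsibilities
    nk = r.sum(axis=0)
    mu = (r * x[:, None]).sum(axis=0) / nk
    sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    pi = nk / len(x)
```

In the scheme the abstract describes, the encoder update plays a role analogous to the E-step and the decoder update to the M-step; the exact objectives and update rules are defined in the paper, not here.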
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Li, Ying | - |
dc.contributor.author | Cheng, Lei | - |
dc.contributor.author | Yin, Feng | - |
dc.contributor.author | Zhang, Michael Minyi | - |
dc.contributor.author | Theodoridis, Sergios | - |
dc.date.accessioned | 2023-10-06T08:39:58Z | - |
dc.date.available | 2023-10-06T08:39:58Z | - |
dc.date.issued | 2023-06-04 | - |
dc.identifier.uri | http://hdl.handle.net/10722/333894 | - |
dc.description.abstract | Variational autoencoders (VAEs) are among the most prominent deep generative models for learning the underlying statistical distribution of high-dimensional data. However, VAE training suffers from a severe issue called posterior collapse: the learned posterior distribution collapses to the assumed, pre-selected prior distribution, which limits the capacity of the learned posterior to convey data information. Previous work has proposed a heuristic training scheme to mitigate this issue, whose core idea is to train the encoder and the decoder in an alternating fashion. However, this scheme still lacks a theoretical interpretation, and this paper, for the first time, fills this gap by inspecting the scheme through the lens of the expectation-maximization (EM) framework. Under this framework, we propose a novel EM-type training algorithm that yields a controllable optimization process and allows for further extensions, e.g., employing implicit distribution models. Experimental results corroborate the superior performance of the proposed EM-type VAE training algorithm across various metrics. | - |
dc.language | eng | - |
dc.relation.ispartof | IEEE International Conference on Acoustics, Speech and Signal Processing - ICASSP 2023 (04/06/2023-10/06/2023, Rhodes Island) | - |
dc.title | Overcoming Posterior Collapse in Variational Autoencoders Via EM-Type Training | - |
dc.type | Conference_Paper | - |
dc.identifier.doi | 10.1109/ICASSP49357.2023.10096746 | - |