A Discrete Variational Recurrent Topic Model without the Reparametrization Trick
Author/Creator
Mehdi Rezaee; Francis Ferraro
Date
2020-10-22
Citation of Original Publication
Mehdi Rezaee and Francis Ferraro, A Discrete Variational Recurrent Topic Model without the Reparametrization Trick, 34th Conference on Neural Information Processing Systems (NeurIPS 2020), https://papers.nips.cc/paper/2020/file/9f1d5659d5880fb427f6e04ae500fc25-Paper.pdf
Rights
This item is likely protected under Title 17 of the U.S. Copyright Law. Unless on a Creative Commons license, for uses protected by Copyright Law, contact the copyright holder or the author.
Abstract
We show how to learn a neural topic model with discrete random variables---one that explicitly models each word's assigned topic---using neural variational inference that does not rely on stochastic backpropagation to handle the discrete variables. The model combines the expressive power of neural methods for representing sequences of text with a topic model's ability to capture global, thematic coherence. Using neural variational inference, we show improved perplexity and document understanding across multiple corpora. We examine the effect of prior parameters on both the model and the variational parameters, and demonstrate how our approach can compete with, and surpass, a popular topic model implementation on an automatic measure of topic quality.
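To illustrate the core idea of handling a discrete latent variable without stochastic backpropagation, the sketch below marginalizes a per-word topic assignment analytically inside the ELBO, so no sampling (and hence no reparametrization or score-function estimator) is needed. This is a minimal, hypothetical illustration of exact enumeration over K topics, not the paper's actual model or implementation; the function name and inputs are assumptions for the example.

```python
import numpy as np

def per_word_elbo(log_beta, q_z, log_prior):
    """Per-word ELBO contribution with the discrete topic assignment z
    marginalized in closed form over K topics (no sampling needed).

    log_beta : (K,) log p(w | z=k) for the observed word under each topic
    q_z      : (K,) variational posterior q(z=k), a probability vector
    log_prior: (K,) log p(z=k), e.g. document-level topic proportions
    """
    # Expected log-likelihood: sum_k q(z=k) * log p(w | z=k)
    expected_ll = np.dot(q_z, log_beta)
    # KL(q(z) || p(z)) computed exactly over the K discrete values
    kl = np.sum(q_z * (np.log(q_z + 1e-12) - log_prior))
    return expected_ll - kl
```

Because the expectation over z is a finite sum, gradients with respect to the variational parameters flow through `q_z` deterministically; this is the sense in which discrete variables can be handled without the reparametrization trick.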