A Discrete Variational Recurrent Topic Model without the Reparametrization Trick
dc.contributor.author | Rezaee, Mehdi | |
dc.contributor.author | Ferraro, Francis | |
dc.date.accessioned | 2020-12-08T20:13:39Z | |
dc.date.available | 2020-12-08T20:13:39Z | |
dc.date.issued | 2020-10-22 | |
dc.description | 34th Conference on Neural Information Processing Systems (NeurIPS 2020), Vancouver, Canada. | en_US |
dc.description.abstract | We show how to learn a neural topic model with discrete random variables---one that explicitly models each word's assigned topic---using neural variational inference that does not rely on stochastic backpropagation to handle the discrete variables. The model we utilize combines the expressive power of neural methods for representing sequences of text with the topic model's ability to capture global, thematic coherence. Using neural variational inference, we show improved perplexity and document understanding across multiple corpora. We examine the effect of prior parameters on both the model and the variational parameters, and demonstrate how our approach can compete with and surpass a popular topic model implementation on an automatic measure of topic quality. | en_US |
dc.description.sponsorship | We would like to thank members and affiliates of the UMBC CSEE Department, including Edward Raff, Cynthia Matuszek, Erfan Noury and Ahmad Mousavi. We would also like to thank the anonymous reviewers for their comments, questions, and suggestions. Some experiments were conducted on the UMBC HPCF. This material is based in part upon work supported by the National Science Foundation under Grant No. IIS-1940931. This material is also based on research that is in part supported by the Air Force Research Laboratory (AFRL), DARPA, for the KAIROS program under agreement number FA8750-19-2-1003. The U.S. Government is authorized to reproduce and distribute reprints for Governmental purposes notwithstanding any copyright notation thereon. The views and conclusions contained herein are those of the authors and should not be interpreted as necessarily representing the official policies or endorsements, either express or implied, of the Air Force Research Laboratory (AFRL), DARPA, or the U.S. Government. | en_US |
dc.description.uri | https://papers.nips.cc/paper/2020/file/9f1d5659d5880fb427f6e04ae500fc25-Paper.pdf | en_US |
dc.format.extent | 16 pages | en_US |
dc.genre | conference papers and proceedings preprints | en_US |
dc.identifier | doi:10.13016/m2pzsd-zgcf | |
dc.identifier.citation | Mehdi Rezaee and Francis Ferraro, A Discrete Variational Recurrent Topic Model without the Reparametrization Trick, 34th Conference on Neural Information Processing Systems (NeurIPS 2020), https://papers.nips.cc/paper/2020/file/9f1d5659d5880fb427f6e04ae500fc25-Paper.pdf | en_US |
dc.identifier.uri | http://hdl.handle.net/11603/20207 | |
dc.language.iso | en_US | en_US |
dc.publisher | Conference on Neural Information Processing Systems | |
dc.relation.isAvailableAt | The University of Maryland, Baltimore County (UMBC) | |
dc.relation.ispartof | UMBC Computer Science and Electrical Engineering Department Collection | |
dc.relation.ispartof | UMBC Faculty Collection | |
dc.rights | This item is likely protected under Title 17 of the U.S. Copyright Law. Unless on a Creative Commons license, for uses protected by Copyright Law, contact the copyright holder or the author. | |
dc.subject | UMBC Ebiquity Research Group | |
dc.subject | UMBC High Performance Computing Facility (HPCF) | |
dc.title | A Discrete Variational Recurrent Topic Model without the Reparametrization Trick | en_US |
dc.type | Text | en_US |