Large Scale Taxonomy Classification using BiLSTM with Self-Attention
dc.contributor.advisor | UMBC Faculty Collection | |
dc.contributor.advisor | UMBC Student Collection | |
dc.contributor.author | Gao, Hang | |
dc.contributor.author | Oates, Tim | |
dc.date.accessioned | 2018-09-05T19:50:02Z | |
dc.date.available | 2018-09-05T19:50:02Z | |
dc.date.issued | 2018-07 | |
dc.description.abstract | In this paper we present a deep learning model for the task of large scale taxonomy classification, where the model is expected to predict the corresponding category ID path given a product title. The proposed approach relies on a Bidirectional Long Short-Term Memory network (BiLSTM) to capture the context information for each word, followed by a multi-head attention model to aggregate useful information from these words as the final representation of the product title. Our model adopts an end-to-end architecture that does not rely on any hand-crafted features, and is regularized by various techniques. | en_US
dc.description.uri | https://doi.org/10.475/123_4 | en_US |
dc.format.extent | 5 pages | en_US
dc.genre | journal articles; preprints | en_US
dc.identifier | doi:10.13016/M2154DS3X | |
dc.identifier.citation | Hang Gao and Tim Oates. 2018. Large Scale Taxonomy Classification using BiLSTM with Self-Attention. In Proceedings of ACM SIGIR Workshop on eCommerce (SIGIR 2018 eCom Data Challenge). ACM, New York, NY, USA, Article 4, 5 pages. https://doi.org/10.475/123_4 | en_US |
dc.identifier.isbn | 123-4567-24-567/08/06. | |
dc.identifier.uri | http://hdl.handle.net/11603/11240 | |
dc.language.iso | en_US | en_US |
dc.publisher | ACM | en_US |
dc.relation.isAvailableAt | The University of Maryland, Baltimore County (UMBC) | |
dc.relation.ispartof | UMBC Computer Science and Electrical Engineering Department Collection | |
dc.rights | This item may be protected under Title 17 of the U.S. Copyright Law. It is made available by UMBC for non-commercial research and education. For permission to publish or reproduce, please contact the author. | |
dc.subject | taxonomy classification | en_US |
dc.subject | BiLSTM | en_US |
dc.subject | attention | en_US |
dc.title | Large Scale Taxonomy Classification using BiLSTM with Self-Attention | en_US |
dc.type | Text | en_US |
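
Note on the architecture described in the abstract: the model encodes each product title with a BiLSTM and then aggregates the word-level outputs with multi-head self-attention to form a single title representation used for classification. The sketch below is an illustration only, not the authors' implementation; the vocabulary size, embedding and hidden dimensions, number of attention heads, number of category classes, dropout rate, and the mean-pooling step used to collapse the attended words into one vector are all assumptions introduced here for the example.

# Minimal sketch (assumed PyTorch, assumed hyperparameters) of a BiLSTM
# encoder with multi-head self-attention pooling for product-title
# classification. All sizes below are placeholders, not the paper's settings.
import torch
import torch.nn as nn


class TitleClassifier(nn.Module):
    def __init__(self, vocab_size=50000, embed_dim=300, hidden_dim=256,
                 num_heads=4, num_classes=3000, dropout=0.3):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # Bidirectional LSTM captures left and right context for each word.
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        # Multi-head self-attention over the BiLSTM outputs.
        self.attention = nn.MultiheadAttention(embed_dim=2 * hidden_dim,
                                               num_heads=num_heads,
                                               batch_first=True)
        self.dropout = nn.Dropout(dropout)
        # Final classifier over the aggregated title representation,
        # predicting a (flattened) category ID path as a single label.
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids, padding_mask=None):
        # token_ids: (batch, seq_len); padding_mask: True where padded.
        x = self.embedding(token_ids)                   # (B, T, E)
        context, _ = self.bilstm(x)                     # (B, T, 2H)
        attended, _ = self.attention(context, context, context,
                                     key_padding_mask=padding_mask)
        # Mean-pool the attended word vectors into one title vector
        # (an assumed aggregation step for this sketch).
        if padding_mask is not None:
            keep = (~padding_mask).unsqueeze(-1).float()
            title_vec = (attended * keep).sum(1) / keep.sum(1).clamp(min=1.0)
        else:
            title_vec = attended.mean(dim=1)
        return self.classifier(self.dropout(title_vec))


if __name__ == "__main__":
    model = TitleClassifier()
    tokens = torch.randint(1, 50000, (2, 12))           # two dummy titles
    logits = model(tokens)
    print(logits.shape)                                  # torch.Size([2, 3000])

In this sketch the category ID path is treated as a single flat label; whether the paper predicts the path jointly or level by level is not specified in the abstract, so that choice is an assumption of the example.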