Recasting Self-Attention with Holographic Reduced Representations
dc.contributor.author | Alam, Mohammad Mahmudul | |
dc.contributor.author | Raff, Edward | |
dc.contributor.author | Oates, Tim | |
dc.contributor.author | Holt, James | |
dc.date.accessioned | 2022-10-10T14:39:46Z | |
dc.date.available | 2022-10-10T14:39:46Z | |
dc.date.issued | 2022-08-15 | |
dc.description.abstract | Self-attention has become a fundamentally new approach to set and sequence modeling, particularly within transformer-style architectures. Given a sequence of T items, standard self-attention has O(T²) memory and compute needs, leading to many recent works that build approximations to self-attention with reduced computational or memory complexity. In this work, we instead re-cast self-attention using the neuro-symbolic approach of Holographic Reduced Representations (HRR). In doing so we follow the same logical strategy as standard self-attention. Implemented as a “Hrrformer”, we obtain several benefits: faster compute (O(T log T) time complexity), lower memory use per layer (O(T) space complexity), convergence in 10× fewer epochs, near state-of-the-art accuracy, and the ability to learn with just a single layer. Combined, these benefits make our Hrrformer up to 370× faster to train on the Long Range Arena benchmark. | en_US
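For context on the HRR primitives the abstract refers to, the following is a minimal sketch: binding via FFT-based circular convolution and unbinding via circular correlation, which is what lets a key-value lookup avoid the T×T similarity matrix of standard attention. The function names and the toy retrieval at the end are illustrative assumptions, not the paper's Hrrformer implementation.

```python
import numpy as np

def bind(x, y):
    # HRR binding: circular convolution, computed via the FFT in O(d log d).
    return np.fft.irfft(np.fft.rfft(x) * np.fft.rfft(y), n=x.shape[-1])

def unbind(s, y):
    # HRR unbinding: circular correlation with y, i.e. binding with the
    # approximate inverse of y (complex conjugate in the Fourier domain).
    return np.fft.irfft(np.fft.rfft(s) * np.conj(np.fft.rfft(y)), n=s.shape[-1])

# Toy key-value retrieval (illustrative): bind each key to its value,
# superpose all T pairs into one fixed-width trace, then query the trace.
T, d = 8, 256
rng = np.random.default_rng(0)
keys = rng.standard_normal((T, d)) / np.sqrt(d)    # variance 1/d, Plate's standard HRR init
values = rng.standard_normal((T, d)) / np.sqrt(d)

trace = bind(keys, values).sum(axis=0)    # one d-dim vector for the whole sequence
retrieved = unbind(trace[None, :], keys)  # noisy copies of the values, one per key
print(np.mean(np.sum(retrieved * values, axis=-1)))  # ~1: retrieval correlates with the true values
```

Because the whole sequence is superposed into a single fixed-width trace, per-layer memory stays linear in T, and the FFT-based binding replaces the quadratic query-key similarity matrix; the paper's Hrrformer builds on these primitives with learned projections and further steps not shown here.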
dc.description.uri | https://kdd-milets.github.io/milets2022/papers/MILETS_2022_paper_5942.pdf | en_US |
dc.format.extent | 9 pages | en_US |
dc.genre | journal articles | en_US |
dc.genre | preprints | en_US |
dc.identifier | doi:10.13016/m2lqpd-f2gn | |
dc.identifier.uri | http://hdl.handle.net/11603/26128 | |
dc.language.iso | en_US | en_US |
dc.relation.isAvailableAt | The University of Maryland, Baltimore County (UMBC) | |
dc.relation.ispartof | UMBC Computer Science and Electrical Engineering Department Collection | |
dc.relation.ispartof | UMBC Faculty Collection | |
dc.relation.ispartof | UMBC Student Collection | |
dc.rights | This item is likely protected under Title 17 of the U.S. Copyright Law. Unless on a Creative Commons license, for uses protected by Copyright Law, contact the copyright holder or the author. | en_US |
dc.title | Recasting Self-Attention with Holographic Reduced Representations | en_US |
dc.type | Text | en_US |