Neuro-symbolic Representations and their Applications to Deep Learning
| dc.contributor.advisor | Oates, Tim | |
| dc.contributor.advisor | Raff, Edward | |
| dc.contributor.author | Alam, Mohammad Mahmudul | |
| dc.contributor.department | Computer Science and Electrical Engineering | |
| dc.contributor.program | Computer Science | |
| dc.date.accessioned | 2025-02-13T15:34:53Z | |
| dc.date.available | 2025-02-13T15:34:53Z | |
| dc.date.issued | 2024-01-01 | |
| dc.description.abstract | Neuro-symbolic AI (NSAI) is an emerging field of research becoming prominent in deep learning applications. It combines the strengths of connectionist systems, such as neural networks, with the reasoning and inference capabilities of symbolic AI. In this dissertation, the benefits and prospects of neuro-symbolic representations are explored across four diverse applications, concluding with the introduction of a novel Vector Symbolic Architecture (VSA). First, a pseudo-encryption method, Connectionist Symbolic Pseudo Secrets (CSPS), is developed using 2D Holographic Reduced Representations (HRR): by symbolically representing a one-time-pad strategy within a neural network, it allows convolutional networks to be deployed on an untrusted platform for secure inference while preventing data and model theft. The proposed CSPS is ≈5000× faster and transmits ≈18,000× less data per query compared to the state-of-the-art (SOTA). Next, a linked key-value-pair prediction and loss strategy is devised using HRR for subitizing. The impact of the proposed loss function on the learning capabilities of both CNNs and ViTs is analyzed through saliency maps and out-of-distribution performance. Then, a neuro-symbolic self-attention mechanism, termed "Hrrformer", is developed using HRR by re-casting the logical strategy of self-attention in symbolic form. It has several benefits, including linear time and space complexity with respect to sequence length. Afterward, leveraging the properties of HRR, we introduce a global convolutional network called Holographic Global Convolutional Networks (HGConv), designed to encode and decode features from sequence elements. With log-linear complexity, HGConv has achieved new SOTA results on the Microsoft Malware Classification Challenge, Drebin, and EMBER malware benchmarks, and it scales efficiently even to sequence lengths of ≈100,000.
Finally, a novel VSA derived from the Walsh-Hadamard transform, termed Hadamard-derived Linear Binding (HLB), is proposed. The proposed symbolic method offers several desirable properties for both classical VSA tasks and differentiable systems. Overall, this dissertation explores the prospects of neuro-symbolic representation across diverse fields of AI. In CSPS, external information in the form of a secret is incorporated into the network using HRR, while in subitizing, symbolic computation is performed through a loss function. Additionally, Hrrformer and HGConv are sequence models, both built from symbolic operations, highlighting their functionality in sequential applications. Furthermore, the newly proposed HLB exhibits superior properties for advancing the development of NSAI. | |
| dc.format | application/pdf | |
| dc.genre | dissertation | |
| dc.identifier | doi:10.13016/m2yr01-5qrq | |
| dc.identifier.other | 12954 | |
| dc.identifier.uri | http://hdl.handle.net/11603/37622 | |
| dc.language | en | |
| dc.relation.isAvailableAt | The University of Maryland, Baltimore County (UMBC) | |
| dc.relation.ispartof | UMBC Computer Science and Electrical Engineering Department Collection | |
| dc.relation.ispartof | UMBC Theses and Dissertations Collection | |
| dc.relation.ispartof | UMBC Graduate School Collection | |
| dc.relation.ispartof | UMBC Student Collection | |
| dc.rights | This item may be protected under Title 17 of the U.S. Copyright Law. It is made available by UMBC for non-commercial research and education. For permission to publish or reproduce, please see http://aok.lib.umbc.edu/specoll/repro.php or contact Special Collections at speccoll(at)umbc.edu | |
| dc.source | Original File Name: Alam_umbc_0434D_12954.pdf | |
| dc.subject | Global Convolution | |
| dc.subject | Hadamard-derived Linear Binding | |
| dc.subject | Holographic Reduced Representations | |
| dc.subject | Neuro-symbolic AI | |
| dc.subject | Sequence Modeling | |
| dc.subject | Vector Symbolic Architecture | |
| dc.title | Neuro-symbolic Representations and their Applications to Deep Learning | |
| dc.type | Text | |
| dcterms.accessRights | Distribution Rights granted to UMBC by the author. |
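The abstract repeatedly relies on HRR binding and unbinding as the core symbolic operation. As a minimal illustrative sketch (not code from the dissertation itself), HRR binding is circular convolution, computable in the FFT domain; with the spectral projection used in this line of work (constraining each vector's Fourier spectrum to unit magnitude), unbinding with the complex conjugate recovers the bound value exactly. All function and variable names below are illustrative:

```python
import numpy as np

def projection(x):
    """Project x onto vectors whose Fourier spectrum has unit magnitude,
    so that unbinding with the conjugate is an exact inverse."""
    f = np.fft.fft(x)
    return np.real(np.fft.ifft(f / np.abs(f)))

def bind(a, b):
    """HRR binding: circular convolution of a and b via the FFT."""
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def unbind(s, a):
    """HRR unbinding: circular correlation of trace s with a
    (multiplication by the conjugate spectrum of a)."""
    return np.real(np.fft.ifft(np.fft.fft(s) * np.conj(np.fft.fft(a))))

rng = np.random.default_rng(0)
d = 1024  # vector dimensionality
# Draw i.i.d. Gaussian vectors with variance 1/d, then project.
key = projection(rng.normal(0.0, 1.0 / np.sqrt(d), d))
value = projection(rng.normal(0.0, 1.0 / np.sqrt(d), d))

bound = bind(key, value)        # key-value trace; traces can be superposed
recovered = unbind(bound, key)  # exact recovery because key is projected
cos = recovered @ value / (np.linalg.norm(recovered) * np.linalg.norm(value))
# cos is ≈ 1.0: the recovered vector matches the bound value
```

Because binding distributes over addition, several key-value pairs can be summed into one trace and each value retrieved by unbinding with its key, with crosstalk noise that shrinks as the dimensionality grows; this is the mechanism the abstract's CSPS, subitizing loss, and Hrrformer applications build on.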
