Neuro-symbolic Representations and their Applications to Deep Learning

dc.contributor.advisor: Oates, Tim
dc.contributor.advisor: Raff, Edward
dc.contributor.author: Alam, Mohammad Mahmudul
dc.contributor.department: Computer Science and Electrical Engineering
dc.contributor.program: Computer Science
dc.date.accessioned: 2025-02-13T15:34:53Z
dc.date.available: 2025-02-13T15:34:53Z
dc.date.issued: 2024-01-01
dc.description.abstract: Neuro-symbolic AI (NSAI) is an emerging field of research that is becoming prominent in deep learning applications. It combines the strengths of connectionist systems, such as neural networks, with the reasoning and inference capabilities of symbolic AI. This dissertation explores the benefits and prospects of neuro-symbolic representations across four diverse applications and concludes by introducing a novel Vector Symbolic Architecture (VSA). First, a pseudo-encryption method called Connectionist Symbolic Pseudo Secrets (CSPS) is developed to deploy convolutional networks on an untrusted platform, securing inference and preventing data and model theft by symbolically representing a one-time-pad strategy within the neural network using 2D Holographic Reduced Representations (HRR). CSPS is ≈5,000× faster and sends ≈18,000× less data per query than the SOTA. Next, a linked key-value-pair prediction and loss strategy is devised using HRR for subitizing; the impact of the proposed loss function on the learning capabilities of both CNNs and ViTs is analyzed through saliency maps and out-of-distribution performance. Then, a neuro-symbolic self-attention mechanism, termed "Hrrformer," is developed by re-casting the logical strategy of self-attention using HRR; it offers several benefits, including linear time and space complexity with respect to sequence length. Afterward, leveraging the properties of HRR, a global convolutional network called Holographic Global Convolutional Networks (HGConv) is introduced, designed to encode and decode features from sequence elements. With log-linear complexity, HGConv achieves new SOTA results on the Microsoft Malware Classification Challenge, Drebin, and EMBER malware benchmarks and scales efficiently to sequence lengths of ≈100,000. Finally, a novel VSA derived from the Walsh-Hadamard transform, termed Hadamard-derived Linear Binding (HLB), is proposed; it offers several desirable properties for both classical VSA tasks and differentiable systems. Overall, this dissertation explores the prospects of neuro-symbolic representation across diverse fields of AI. In CSPS, external information in the form of a secret is incorporated into the network using HRR, while in subitizing, the symbolic computation is performed through a loss function. Hrrformer and HGConv are sequence models, both built from symbolic operations, highlighting their utility in sequential applications. Furthermore, the newly proposed HLB exhibits superior properties for advancing the development of NSAI.
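
Note: the HRR-based methods summarized above (CSPS, Hrrformer, HGConv) all rely on HRR's binding and unbinding operations. The following is a minimal NumPy sketch of those two operations under the standard HRR definitions (circular convolution for binding, circular correlation via the approximate inverse for unbinding); it is illustrative only and is not code taken from the dissertation.

import numpy as np

def bind(a, b):
    # HRR binding: circular convolution, computed in the Fourier domain.
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def inverse(a):
    # Approximate (involution) inverse: keep element 0, reverse the rest.
    return np.concatenate(([a[0]], a[1:][::-1]))

def unbind(s, a):
    # Recover b from s = bind(a, b) by binding s with the approximate inverse of a.
    return bind(s, inverse(a))

d = 1024
rng = np.random.default_rng(0)
a = rng.normal(0.0, 1.0 / np.sqrt(d), d)  # i.i.d. N(0, 1/d) keeps vectors near unit norm
b = rng.normal(0.0, 1.0 / np.sqrt(d), d)

s = bind(a, b)
b_hat = unbind(s, a)
cos = np.dot(b, b_hat) / (np.linalg.norm(b) * np.linalg.norm(b_hat))
print(f"cosine(b, b_hat) = {cos:.3f}")  # well above the ~0 expected for unrelated vectors

The retrieved b_hat is a noisy copy of b; its cosine similarity to b is far above chance, which is what makes the bound representation usable inside a neural network layer.
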
dc.format: application/pdf
dc.genre: dissertation
dc.identifier: doi:10.13016/m2yr01-5qrq
dc.identifier.other: 12954
dc.identifier.uri: http://hdl.handle.net/11603/37622
dc.language: en
dc.relation.isAvailableAt: The University of Maryland, Baltimore County (UMBC)
dc.relation.ispartof: UMBC Computer Science and Electrical Engineering Department Collection
dc.relation.ispartof: UMBC Theses and Dissertations Collection
dc.relation.ispartof: UMBC Graduate School Collection
dc.relation.ispartof: UMBC Student Collection
dc.rights: This item may be protected under Title 17 of the U.S. Copyright Law. It is made available by UMBC for non-commercial research and education. For permission to publish or reproduce, please see http://aok.lib.umbc.edu/specoll/repro.php or contact Special Collections at speccoll(at)umbc.edu
dc.source: Original File Name: Alam_umbc_0434D_12954.pdf
dc.subject: Global Convolution
dc.subject: Hadamard-derived Linear Binding
dc.subject: Holographic Reduced Representations
dc.subject: Neuro-symbolic AI
dc.subject: Sequence Modeling
dc.subject: Vector Symbolic Architecture
dc.title: Neuro-symbolic Representations and their Applications to Deep Learning
dc.type: Text
dcterms.accessRights: Distribution Rights granted to UMBC by the author.

Files

Original bundle

Name: Alam_umbc_0434D_12954.pdf
Size: 22.5 MB
Format: Adobe Portable Document Format

License bundle

Name: Alam-Mohammad_Mahmudul_Openm.pdf
Size: 205.24 KB
Format: Adobe Portable Document Format