Deep Fusion of Neurophysiological and Facial Features for Enhanced Emotion Detection

dc.contributor.author: Safavi, Farshad
dc.contributor.author: Venkannagari, Vikas Reddy
dc.contributor.author: Parikh, Dev
dc.contributor.author: Vinjamuri, Ramana
dc.date.accessioned: 2025-06-05T14:03:49Z
dc.date.available: 2025-06-05T14:03:49Z
dc.date.issued: 2025
dc.description.abstract: The fusion of facial and neurophysiological features for multimodal emotion detection is vital for applications in healthcare, wearable devices, and human-computer interaction, as it enables a more comprehensive understanding of human emotions. Traditionally, the integration of facial expressions and neurophysiological signals has required specialized knowledge and complex preprocessing. With the rise of deep learning and artificial intelligence (AI), new methodologies in affective computing allow for the seamless fusion of multimodal signals, advancing emotion recognition systems. In this paper, we present a novel multimodal deep network that leverages transformers to extract comprehensive features from neurophysiological data, which are then fused with facial expression features for emotion classification. Our transformer-based model analyzes neurophysiological time-series data, while transformer-inspired methods extract facial expression features, enabling the classification of complex emotional states. We compare single-modality and multimodal systems, testing our model on electroencephalography (EEG) signals from the DEAP and Lie Detection datasets. Our hybrid approach effectively captures intricate temporal and spatial patterns in the data, significantly enhancing the system's emotion recognition accuracy. Validated on the DEAP dataset, our method achieves near state-of-the-art performance, with accuracy rates of 97.78%, 97.64%, 97.91%, and 97.62% for arousal, valence, liking, and dominance, respectively. Furthermore, we achieve a precision of 97.9%, a ROC AUC score of 97.6%, an F1-score of 98.1%, and a recall of 98.2%, demonstrating the model's robust performance. Finally, we demonstrate the effectiveness of this method for emotion detection with wearable devices, specifically EEG caps with a limited number of electrodes.
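The abstract describes a fusion architecture in which a transformer encodes EEG time series and the resulting representation is combined with facial-expression features for classification. The sketch below is a minimal, hypothetical illustration of that idea, not the authors' released implementation: it assumes PyTorch, late fusion by concatenation, and illustrative shapes (32 EEG channels, a 128-dimensional facial feature vector, binary labels); the class name FusionEmotionNet and all hyperparameters are invented for the example.

    # Hypothetical sketch (assumptions noted above), not the paper's actual code.
    import torch
    import torch.nn as nn

    class FusionEmotionNet(nn.Module):
        def __init__(self, eeg_channels=32, d_model=64, n_heads=4, n_layers=2,
                     face_dim=128, n_classes=2):
            super().__init__()
            # Project the EEG channels at each time step into the model dimension.
            self.eeg_proj = nn.Linear(eeg_channels, d_model)
            layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                               dim_feedforward=4 * d_model,
                                               batch_first=True)
            self.eeg_encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
            # Classifier over the fused (EEG + facial) representation.
            self.classifier = nn.Sequential(
                nn.Linear(d_model + face_dim, 128),
                nn.ReLU(),
                nn.Linear(128, n_classes),
            )

        def forward(self, eeg, face_feats):
            # eeg: (batch, time, channels); face_feats: (batch, face_dim)
            x = self.eeg_proj(eeg)        # (batch, time, d_model)
            x = self.eeg_encoder(x)       # contextualized EEG tokens
            x = x.mean(dim=1)             # temporal average pooling
            fused = torch.cat([x, face_feats], dim=-1)
            return self.classifier(fused)

    # Example with dummy data: 8 trials, 256 EEG samples, 32 channels,
    # plus 128-dimensional facial features; two-class output (e.g. low/high valence).
    model = FusionEmotionNet()
    logits = model(torch.randn(8, 256, 32), torch.randn(8, 128))
    print(logits.shape)  # torch.Size([8, 2])

Concatenation-based late fusion is only one plausible reading of the abstract; the paper itself should be consulted for the exact fusion mechanism and feature extractors.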
dc.description.sponsorship: This work was supported by the National Science Foundation Faculty Early Career Development (CAREER) Award HCC 2053498.
dc.description.uri: https://ieeexplore.ieee.org/document/10945364
dc.format.extent: 12 pages
dc.genre: journal articles
dc.identifier: doi:10.13016/m2vskf-ali1
dc.identifier.citation: Safavi, Farshad, Vikas Reddy Venkannagari, Dev Parikh, and Ramana Kumar Vinjamuri. “Deep Fusion of Neurophysiological and Facial Features for Enhanced Emotion Detection.” IEEE Access, 2025, 1–1. https://doi.org/10.1109/ACCESS.2025.3555934.
dc.identifier.uri: https://doi.org/10.1109/ACCESS.2025.3555934
dc.identifier.uri: http://hdl.handle.net/11603/38760
dc.language.iso: en_US
dc.publisher: IEEE
dc.relation.isAvailableAt: The University of Maryland, Baltimore County (UMBC)
dc.relation.ispartof: UMBC Computer Science and Electrical Engineering Department
dc.relation.ispartof: UMBC Faculty Collection
dc.relation.ispartof: UMBC Center for Accelerated Real Time Analysis
dc.relation.ispartof: UMBC Student Collection
dc.rights: Attribution 4.0 International
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/deed.en
dc.subject: Computer architecture
dc.subject: Feature extraction
dc.subject: Transformer
dc.subject: Electroencephalography
dc.subject: Visualization
dc.subject: Videos
dc.subject: Brain modeling
dc.subject: Emotion recognition
dc.subject: Facial features
dc.subject: Transformers
dc.subject: Multimodal Emotion Recognition
dc.subject: Affective Computing
dc.subject: Deep learning
dc.subject: Emotion Detection
dc.subject: Accuracy
dc.title: Deep Fusion of Neurophysiological and Facial Features for Enhanced Emotion Detection
dc.type: Text
dcterms.creator: https://orcid.org/0000-0003-1650-5524
dcterms.creator: https://orcid.org/0000-0002-6905-0781
dcterms.creator: https://orcid.org/0009-0000-4615-5823

Files

Original bundle

Name: DeepFusionofNeurophysiological.pdf
Size: 1.34 MB
Format: Adobe Portable Document Format