USING FACIAL EMOTION READING TO ASSIST IN THE EXTRACTION OF EEG EMOTIONAL BIOMARKERS

Date

2022-01-01

Department

Computer Science and Electrical Engineering

Program

Engineering, Computer

Rights

Access limited to the UMBC community. Item may possibly be obtained via Interlibrary Loan through a local library, pending author/copyright holder's permission.
This item may be protected under Title 17 of the U.S. Copyright Law. It is made available by UMBC for non-commercial research and education. For permission to publish or reproduce, please see http://aok.lib.umbc.edu/specoll/repro.php or contact Special Collections at speccoll@umbc.edu.

Abstract

While electroencephalography (EEG) has been used in research for decades, comparatively little is known about how emotional states are reflected in the EEG readings of test subjects. In this thesis, we work to identify the occurrence of four emotional states (happy, sad, surprise, and disgust) based on activity in the brain's networks (specifically the default mode network (DMN), frontoparietal network (FPN), salience network, and attention network), identified through the use of source localization. To validate the EEG emotional-biomarker extraction, we used open-source facial-recognition software that we modified to perform emotion-sensing logic based on the Facial Action Coding System (FACS). These modifications take the Action Unit (AU) data that the software extracts through its facial-recognition algorithms and perform logic determinations based on established combinations of facial muscle movements. FACS has long been a gold standard for reading emotion and has been used in many research studies as a highly accurate method of facial analysis. While different publications have proposed slightly varied sets of AUs for each emotion, they largely agree on a specific set that reliably represents the different emotional expressions. We recorded the EEG waveforms of several subjects while monitoring their emotional states with the emotion-sensing software, then compared the activations of the brain networks against the observed emotional states to look for identifiable patterns. Our analysis concluded that the brain-network activations were independent of the subjects' emotional states, and there were no discernible patterns.
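For illustration, below is a minimal sketch of the kind of AU-to-emotion logic the abstract describes, using commonly cited EMFACS-style AU combinations for the four target emotions. The rule set, the detect_emotion helper, and the intensity threshold are assumptions made for this sketch and are not taken from the thesis itself.

```python
# Sketch of FACS-based emotion detection from Action Unit (AU) activations.
# The AU combinations below follow commonly cited EMFACS-style mappings;
# the exact rules and thresholds used in the thesis may differ.

# AU combinations for the four target emotions.
EMOTION_RULES = {
    "happy":    {6, 12},        # cheek raiser + lip corner puller
    "sad":      {1, 4, 15},     # inner brow raiser + brow lowerer + lip corner depressor
    "surprise": {1, 2, 5, 26},  # inner/outer brow raiser + upper lid raiser + jaw drop
    "disgust":  {9, 15},        # nose wrinkler + lip corner depressor
}

def detect_emotion(au_intensities, threshold=1.0):
    """Return the emotions whose full AU combination is active in one frame.

    au_intensities: dict mapping AU number -> intensity, as produced by an
    AU-extraction tool. `threshold` is a hypothetical cutoff for treating
    an AU as "present".
    """
    active = {au for au, level in au_intensities.items() if level >= threshold}
    return [emotion for emotion, required in EMOTION_RULES.items()
            if required <= active]

# Example frame: strong AU6 and AU12 activations classify as "happy".
frame = {1: 0.2, 4: 0.0, 6: 2.3, 12: 3.1, 15: 0.1}
print(detect_emotion(frame))  # ['happy']
```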