Abstract
Children with autism spectrum disorders (ASD) often have trouble understanding emotions (Baron-Cohen, Golan, & Ashwin, 2009). Emotions play an important role in everyday life. For children, being able to understand the emotions of others allows them to gain an enormous amount of information in social settings. Furthermore, the ability to interpret and process emotions allows one to build healthy relationships and improve communication. Research has shown that autistic children face challenges with communication, social interaction, and cognitive skills (Kouijzer et al., 2008). Findings have also indicated that neurofeedback can help improve such executive functions in children with autism.
Understanding the impact that neurofeedback has on autistic children's executive function skills can help us understand the mechanism behind the deficit in their emotion recognition skills. We aim to propose an EEG-based BCI that will detect emotions. Participants will be shown a video clip in order to evoke a certain emotion, and an auditory stimulus will be played based on their ability to control their brainwaves. For the participants who receive neurofeedback training, we expect to see improvements in their ability to understand emotions, which could in turn improve their quality of life socially, physically, and mentally.
Keywords: Autism spectrum disorder, neurofeedback, emotional understanding, EEG-BCI
The ability to detect the emotions of others is essential in everyday life. For some children with autism spectrum disorders (ASD), recognizing and processing the emotions of other people can be a difficult task (Baron-Cohen, Golan, & Ashwin, 2009). Emotions can be expressed through different mediums such as body language, facial expressions, and physical gestures. It is important that individuals know how to process other people's emotions because it improves social interactions by allowing one to react appropriately. Autistic children also struggle with executive functions such as communication, social interaction, working memory, and cognitive flexibility (Kouijzer et al., 2009).
Currently, there is no cure or medication for this disorder, and previous research has demonstrated the value of neurofeedback treatment for improving such functions (Kouijzer et al., 2009). Understanding the role of proper executive functioning in children with autism can allow one to better understand their emotion recognition skills. Rueda et al. (2013) suggest that certain parts of executive functioning "are strongly related to increased emotional understanding." If previous findings indicate that autistic children are able to improve mental, cognitive, and self-regulation skills through neurofeedback treatment, then the effectiveness of this treatment on emotion recognition skills in autistic children can also be tested.
Research on the eye movements of individuals with autism has shown that during social interactions, they do not attend to the important features (eyes and mouth) of people's faces when assessing others' emotions (Frith, 2003). Lorey et al. (2012) suggest that for individuals to understand the emotions of others, they need to be aware of their own emotions. In social interactions, being able to recognize and process other people's emotions is important for forming an emotional connection with another individual. Not only do autistic children who cannot recognize and process the emotions of others face challenges; their caregivers and teachers do as well. Children who cannot understand the emotions of an authority figure can end up misinterpreting a situation and exhibiting an inappropriate emotional response.
Autistic children typically have more difficulty with emotion recognition than their age-matched peers (Cibralic et al., 2019). Previous research has attributed this poor ability to recognize emotions to a failure to attend to people's facial expressions. However, such claims remain contested, as other studies have reached different conclusions. Gepner, Deruelle, and Grynfelt (2001) suggest that autistic children's ability to attend to and understand other people's facial expressions is not impaired. Rump et al. (2009) suggest that "researchers appear to be more focused on how individuals with autism perform relative to controls, and are failing to consider the developmental course of their emotional ability and how the age of the participants and the methodology employed might affect their results." To date, there is still no single answer as to whether children with autism indeed lack the ability to understand people's emotions.
Researchers have tested the idea of using brain-computer interfaces (BCIs) to help autistic children regulate their emotions. Previous findings show that such neurofeedback training helps children regulate their brain activity and improve symptoms associated with ASD (Pineda et al., 2008). However, there is still a lack of studies investigating how neurofeedback can help autistic children better understand and recognize emotions. As previously discussed, emotion recognition is an important skill because it allows us to effectively communicate, interact, and socialize with others. When this skill is lacking in children, they often cannot form positive social relationships.
This study aims to propose a BCI that can help autistic children better understand the emotions of other people. Participants will watch a series of videos that evoke basic emotions such as happiness, anger, and sadness. Based on the electrical activity of their brains, participants will receive an auditory stimulus as feedback whenever they correctly understand the emotion of the subject in the video. This study focuses on getting autistic children to recognize basic emotions and to train their brain waves to associate certain facial expressions and body language with the correct emotions. We propose this BCI to investigate whether neurofeedback can indeed be effective in training autistic children to understand the emotions of other people, and thereby improve their quality of life and ability to form meaningful relationships.
Method
Participants
Forty-four children with autism spectrum disorder (15 females, 29 males) between the ages of 7 and 13 (mean age = 10.1) will be recruited via flyer advertisements placed in local centers for children with ASD across Dallas, Texas, and assigned to the treatment group. All participants will have an IQ score above 80 and will have been diagnosed with ASD by a medical professional using the Diagnostic and Statistical Manual of Mental Disorders (DSM-5). Children who have co-existing disorders or who are on medication will be excluded. The control group will include an additional 44 children with ASD, matched to the treatment group in gender and age. Parents of the participants will be given an informed consent form to sign. Participants will be compensated $300 for their participation in our study.
Design
Forty-four participants will be tested on their ability to recognize three basic emotions: happiness, sadness, and anger. The features extracted from the brain signal will be theta, delta, and alpha band power. Previous research has shown that changes in alpha, beta, and gamma power are associated with changes in the right hemisphere of the brain, specifically the parietal lobe (Yuvaraj et al., 2014). Furthermore, changes in delta power are associated with changes in the right hemisphere near the occipital lobes (Woaswi et al., 2016). Past studies on emotion detection using EEG have found that specific changes in brain waves are associated with different emotional reactions (Woaswi et al., 2016). When participants experienced angry emotional reactions, theta waves were activated in the right hemisphere (Woaswi et al., 2016). Happy emotional reactions were associated with the activation of alpha waves on the EEG, and the presence of delta and theta waves in the right hemisphere corresponded to sad emotional reactions (Woaswi et al., 2016).
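The extraction of these band-power features can be sketched as follows. This is a minimal illustration, not the clinical pipeline: the sampling rate, band boundaries, and Welch parameters are assumptions, and real recordings would first be cleaned of artifacts.

```python
import numpy as np
from scipy.signal import welch

FS = 256  # sampling rate in Hz (assumed; set by the EEG amplifier)

# Illustrative band boundaries; exact cut-offs would follow the cited literature.
BANDS = {"delta": (0.5, 4.0), "theta": (4.0, 8.0), "alpha": (8.0, 13.0)}

def band_powers(signal, fs=FS):
    """Estimate mean spectral power in each band via Welch's method."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    return {name: psd[(freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in BANDS.items()}

# Synthetic one-channel trace: a 10 Hz (alpha-range) oscillation plus noise.
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / FS)
trace = np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(t.size)

powers = band_powers(trace)  # alpha should dominate for this signal
```

In the actual study, one such feature vector would be computed per channel and per analysis window rather than over a whole trace.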
Participants will not be given strategies to control their brainwaves. They will be asked to use the feedback as a guide to help them regulate their brainwaves. Feedback presented to participants will be an auditory stimulus, specifically music. The auditory stimulus will be discrete (i.e. the music will only play when participants are able to get their brain signal to a certain emotional state). The feedback will be proportional to each feature (theta, delta, and alpha) individually. A waitlist control group will be used.
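The discrete feedback rule described above can be sketched as a simple decision function: music plays only when the band-power pattern matches the emotion targeted by the clip. The pattern-per-emotion mapping follows the associations cited from Woaswi et al. (2016), but the threshold and the relative-power criterion are hypothetical placeholders, not the calibrated clinical rule.

```python
# Band patterns per target emotion (after Woaswi et al., 2016); illustrative.
TARGET_PATTERN = {
    "happy": ("alpha",),          # alpha activation
    "angry": ("theta",),          # right-hemisphere theta
    "sad":   ("delta", "theta"),  # delta plus theta
}

def feedback_on(powers, target_emotion, threshold=1.5):
    """Play music (True) when every band in the target pattern exceeds
    `threshold` times the mean power of the remaining bands."""
    wanted = TARGET_PATTERN[target_emotion]
    rest = [p for band, p in powers.items() if band not in wanted]
    baseline = sum(rest) / len(rest) if rest else 1.0
    return all(powers[band] > threshold * baseline for band in wanted)

# Strong alpha relative to delta/theta triggers feedback for "happy" only.
example = {"delta": 0.4, "theta": 0.5, "alpha": 1.2}
```

Because the rule is evaluated per feature, it naturally extends to per-band proportional feedback if a graded signal is preferred over the on/off music.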
Pre- and post-neurofeedback training, participants in the treatment and control groups will be tested on their ability to recognize certain emotions (i.e., participants will be shown three video clips and asked to verbally identify the emotion evoked in each clip). EEG activity will be recorded throughout. Pre- and post-measurements will each last up to ten minutes.
Neurofeedback training will consist of 25 sessions, held three times a week. Each session will begin with a ten-minute baseline recording, followed by seven five-minute blocks separated by one-minute rest periods. The total length of each session will be approximately 50 minutes.
Materials
- Computer with monitor
- Electrode cap
- CONTEC KT88-3200 EEG
- EEG conductive gel, used to secure the electrodes to the scalp and improve signal contact
Software Install
- The software provided on the accompanying CD will be installed on the computer to run the EEG applications
- The "Safnet Microdog" driver software will be needed for the proper functioning of the EEG
Hardware Install
- CONTEC KT88-3200 EEG: used to record the EEG signals and to determine where the electrodes will be placed.
- FP1-A1, FP2-A2, F3-A1, F4-A2, C3-A1, C4-A2, P3-A1, P4-A2, O1-A1, O2-A2, F7-A1, F8-A2, T7-A1, T8-A2, P7-A1, P8-A2, Fz-AV, Pz-AV, and Cz-AV will be the 19 referential channels recorded in this study (A1/A2 denote ear references; AV denotes the average reference)
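For later per-channel analysis, the referential montage above can be represented programmatically. A small sketch (the channel labels are copied from the list, which contains 19 derivations; the hemisphere split uses the 10-20 convention that even-numbered sites are on the right):

```python
# Referential montage from the hardware list; "electrode-reference" labels.
MONTAGE = [
    "FP1-A1", "FP2-A2", "F3-A1", "F4-A2", "C3-A1", "C4-A2",
    "P3-A1", "P4-A2", "O1-A1", "O2-A2", "F7-A1", "F8-A2",
    "T7-A1", "T8-A2", "P7-A1", "P8-A2", "Fz-AV", "Pz-AV", "Cz-AV",
]

# Split each label into (recording electrode, reference).
pairs = [tuple(label.split("-")) for label in MONTAGE]

# Even-numbered 10-20 sites lie over the right hemisphere, which is where
# the cited emotion effects are expected.
right_hemisphere = [e for e, _ in pairs if e[-1].isdigit() and int(e[-1]) % 2 == 0]
```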
Participants will be shown video clips taken from a database called “LIRIS-ACCEDE.” This database contains over 1000 videos from various movie scenes meant to induce different emotions.
Auditory stimuli will be used as feedback to the participants. The audio stimuli will be taken from the International Affective Digitized Sound System (IADS). This is a database that has over 100 auditory stimuli that are meant to correspond to positive and negative emotional states.
Procedure
Pre-test measurements
All participants will be briefed on the protocol of the study before beginning. At baseline, before neurofeedback training, participants in the treatment and control groups will be pre-tested on their ability to recognize certain emotions. Participants will sit in front of a computer screen, and an EEG cap with electrodes at the sites listed above will be placed on their heads. Participants will be asked to minimize body movement to reduce artifacts in the EEG data. Three video clips that evoke happy, sad, and angry emotions will be selected from the LIRIS-ACCEDE database. Participants will be shown the sad, happy, and angry videos in that order, with a twenty-second rest period between clips. Each video, three minutes in length, will be shown while EEG data are recorded simultaneously.
Neurofeedback training
The CONTEC KT88-3200 EEG and the software/hardware installation described above will be used for neurofeedback training, which will be conducted by a licensed clinician. Each neurofeedback session will consist of seven five-minute blocks separated by one-minute rest periods. During each rest period, participants will relax in silence while a white screen is presented. In each of the seven blocks, participants will be shown different video clips that evoke a happy, sad, or angry emotion. When participants' brain waves show that they are evoking the correct emotion, auditory feedback will be presented to them. Two auditory stimuli, one a pleasant "reward" sound and the other a "wrong" sound (e.g., the sound of a storm), will be taken from the IADS database. Participants will not be told how to control their brainwaves but will be instructed to use the auditory feedback as a guide. After all neurofeedback sessions are complete, post-test measurements will be taken.
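The structure of one training session can be sketched as a loop. The `decoded_emotion` stub stands in for the real-time EEG decoder, and the sound names are placeholders for the two IADS stimuli; both are assumptions for illustration, not part of the protocol.

```python
import random

BLOCKS = 7                           # seven five-minute blocks per session
EMOTIONS = ["happy", "sad", "angry"]

def decoded_emotion(rng):
    """Placeholder for the online EEG emotion decoder."""
    return rng.choice(EMOTIONS)

def run_session(seed=0):
    rng = random.Random(seed)
    log = []
    for block in range(BLOCKS):
        target = rng.choice(EMOTIONS)   # emotion evoked by this block's clip
        decoded = decoded_emotion(rng)  # state read from the participant's EEG
        sound = "reward_music" if decoded == target else "storm_sound"
        log.append((block, target, decoded, sound))
        # ...one-minute rest with a white screen would follow here...
    return log

session_log = run_session()
```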
Post-test measurements
All data collected in pre-test measurements will be re-collected for the treatment and control group.
After the completion of all 25 neurofeedback sessions, three-month follow-up measurements will be taken for both groups. In addition, participants in the control group will be invited to participate in neurofeedback training after the follow-up.
Data Collection
A 2 × 2 (group × time) mixed MANOVA will be conducted to analyze the data.
In this study, we aim to test whether participants, simply by exposure to emotion-evoking clips via neurofeedback, can learn to correctly identify and understand the emotions exhibited. Pre-test measurements will establish where participants (intervention and control groups) stand in their ability to recognize basic emotions (happiness, anger, and sadness). A MANOVA will be conducted to compare the treatment group with the control group on their ability to understand the emotion expressed in the video. Post neurofeedback training, we expect participants in the treatment group to classify emotions more accurately than the control group (Fig. 1). For example, when a participant in the treatment group is shown a video clip meant to elicit angry emotions, we expect to see theta waves present in the right hemisphere of the brain (Fig. 2).
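As a sketch of the planned group comparison, a MANOVA over three dependent variables (recognition accuracy per emotion) can be run with `statsmodels`. This simplified one-way version omits the pre/post factor of the full 2 × 2 mixed design, and the synthetic scores below are illustrative only, not expected results.

```python
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(42)
n = 44  # per group, as in the recruitment plan

# Synthetic post-test recognition accuracies (illustrative, not real data).
df = pd.DataFrame({
    "group": ["treatment"] * n + ["control"] * n,
    "happy": np.r_[rng.normal(0.80, 0.1, n), rng.normal(0.60, 0.1, n)],
    "sad":   np.r_[rng.normal(0.75, 0.1, n), rng.normal(0.55, 0.1, n)],
    "angry": np.r_[rng.normal(0.70, 0.1, n), rng.normal(0.50, 0.1, n)],
})

mod = MANOVA.from_formula("happy + sad + angry ~ group", data=df)
result = mod.mv_test()  # Wilks' lambda, Pillai's trace, etc. per term
```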
Previous research has found that when participants evoke happy emotions, we should see the activation of alpha waves predominantly in the parietal cortex (Woaswi et al., 2016). In addition, when participants evoke sad emotions, the activation of theta and delta waves should be seen in the right hemisphere (Woaswi et al., 2016). Research is still limited on the use of BCIs for emotion recognition using EEG. However, our study will provide insight into whether participants can learn to understand emotions by training their brain waves to associate certain emotional cues with the right emotional state. Furthermore, previous studies have used simple techniques such as looking at facial expressions or physical gestures to understand emotions (Baron-Cohen, Golan, & Ashwin, 2009). However, deciphering different emotions based on physical attributes has not always been a good indicator of truly understanding emotions, given the possibility of deception (Baron-Cohen, Golan, & Ashwin, 2009). Therefore, in this study we intend to look at the electrical activity of the brain associated with different emotions and train autistic children to understand the basic emotions. If participants in the treatment group become better able to understand emotions, this could help improve their quality of life socially, mentally, and physically.
If participants in the treatment group and the control group perform the same on their ability to understand emotions, then we will need to reconsider the design of the BCI. This could mean that our EEG-based BCI for detecting emotions using auditory feedback is not useful for treating autistic children in this respect. In that case, other feedback modalities can be considered for training participants to control their brain waves. Kouijzer et al. (2008) presented feedback to participants in a visual format, and this proved successful: guided by the visual feedback, participants were able to improve their ability to decrease or increase their theta/beta ratio. Future research can test whether our proposed BCI with visual feedback would result in the treatment group outperforming the control group on an emotional understanding task.
Future Research/Limitations
To confirm the original hypothesis, participants will need to be re-tested on their ability to understand emotions three months later. This will allow us to determine whether the neurofeedback sessions were effective in helping them understand emotions in the long run.
Pan, Li, and Wang (2016) suggest that for emotion processing it is best to use "subject specific frequency bands for emotion recognition." In this study, however, we focus on EEG frequency bands ranging from 2 Hz to 40 Hz to assess emotional understanding skills. It has been suggested that, to accurately differentiate which brain wave corresponds to a specific emotion, it is more useful to select subject-specific frequency bands (Pan, Li, & Wang, 2016). Future research can implement our BCI with such an approach to see whether there is a difference in emotional understanding outcomes.
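Restricting the analysis to the 2–40 Hz range can be sketched with a zero-phase band-pass filter; the sampling rate and filter order below are assumptions, and a subject-specific variant would simply pass different `low`/`high` values per participant.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 256  # assumed sampling rate in Hz

def bandpass(signal, low=2.0, high=40.0, fs=FS, order=4):
    """Zero-phase Butterworth band-pass over the 2-40 Hz range used here."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, signal)

def amplitude_at(x, f, fs=FS):
    """Single-bin amplitude of `x` at frequency `f` via the FFT."""
    spec = np.abs(np.fft.rfft(x)) / len(x)
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    return spec[np.argmin(np.abs(freqs - f))]

# A 60 Hz component (e.g. line noise) is attenuated; 10 Hz passes through.
t = np.arange(0, 4, 1 / FS)
raw = np.sin(2 * np.pi * 10 * t) + np.sin(2 * np.pi * 60 * t)
clean = bandpass(raw)
```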
A limitation of our study is the modest sample size and the lack of a sham control. The task used to assess participants' ability to understand emotions could also be improved; for example, a questionnaire could be included to assess where participants stand in their ability to understand emotions. Although our study focuses on a few basic emotions, future research may find it more effective to include additional emotions such as fear, shame, guilt, and disgust.
In this study, we proposed a BCI to detect autistic children's emotional state and provide feedback based on their ability to correctly identify the emotion evoked in the video clips. If successful, children in the treatment group will outperform the control group on a task involving understanding emotions. Being able to understand the emotions of others allows us to gain an enormous amount of information in social settings. In addition, the ability to interpret and process emotions helps us build healthy relationships and improve communication. If our EEG-based BCI improves emotional understanding in autistic children, it could become an effective treatment for emotion recognition deficits in developmental disorders.