Detecting Attention of Children with Autism Spectrum through Face-Tracking Technology

Bilikis Banire

Research article | Online open access | Available online: 05 October 2020 | Last update: 28 October 2021

Nafath, Volume 6, Issue 14

Autism spectrum disorder (ASD) is a neurodevelopmental disorder characterized by deficits in social communication and repetitive patterns of behavior. It affects one in 160 children worldwide. In Qatar, a cross-sectional survey of ASD prevalence estimates that 1,575 children under age 5 and 5,025 individuals aged 5–19 are affected [1]. These figures underline the need for evidence-based research that can support individuals with ASD. Children with ASD have difficulty sustaining attention and are easily distracted from learning tasks, so teachers find it challenging to monitor their attention and deliver learning material at the same time. Assistive technology can help: face-tracking technology, which relies on a webcam and artificial intelligence, can monitor attention automatically.

Detecting Attention of Children with Autism

Attention requires behavioral and cognitive processing of discrete information while ignoring other, distracting information [2]. It is a fundamental component of productive learning and supports the acquisition of skills needed for daily activities [3]. A review of attention assessment in children with ASD shows that one of the most common strategies is direct observation or video data analysis [4]. In video data analysis, the attentional behaviors of participants are coded or rated from a recorded learning session by experts, parents, or caregivers. This approach requires experience of how children with ASD pay attention, and the process of coding attentional behaviors is time-consuming and tedious.

The dynamics of attention assessment have shifted from subjective evaluation to objective techniques. Key benefits of objective techniques such as face-tracking include easier attention assessment, personalized pedagogical support, and adaptive learning [5]. Face-tracking refers to detecting a face and describing its facial actions within a video stream in real time. The technique relies on a camera to capture video images and on artificial intelligence (AI) to analyze the facial actions. Facial actions provide a precise description of the facial expressions that convey an individual's feelings and emotions. Overall, face-tracking is a promising approach because it is ubiquitous, non-obtrusive, and cost-effective.
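
As an illustration of the face-detection step that underlies this kind of tracking, the minimal Python sketch below uses OpenCV's bundled Haar cascade to locate faces in a live webcam stream. This is not the pipeline used in the study, which relied on a commercial tool; the cascade file, webcam index, and quit key are assumptions made for the example, and a full tracker would add facial-landmark and action-unit estimation on top of this detection step.

```python
import cv2

# Load OpenCV's bundled Haar cascade for frontal faces (a classical detector;
# commercial tools such as iMotions use more advanced models).
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(0)  # 0 = default webcam (assumed device index)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Detect faces in the current frame and draw a box around each one.
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("Face tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to stop
        break

cap.release()
cv2.destroyAllWindows()
```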

Research has shown that children with ASD exhibit attentional behavior through emotions such as happiness and sadness [6]. To further understand how emotions, facial expressions, and facial landmarks describe attentional behaviors in children with ASD and typically developing (TD) children, we conducted an experimental study on face-tracking during attention tasks. The study simulated a continuous performance task (CPT) in a virtual classroom. The CPT displays random letters on a blackboard, and participants press a key when a specified target letter appears. This test is conventionally used to assess the selective and sustained attention of children with attention deficits [7]. Auditory and visual distractors were introduced into the test to simulate possible classroom distractions (Fig. 1). During the attention task, the iMotions software detected and generated facial features from a webcam feed [8]. iMotions is a commercial biometric software platform that uses computer vision and artificial intelligence to detect emotions, facial expressions, and facial landmarks.
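
For readers unfamiliar with the CPT format, the sketch below simulates the scoring logic of such a task: a random letter sequence containing a target letter, with responses scored as hits, misses, false alarms, or correct rejections. The target letter, trial count, and target rate are illustrative assumptions, not the parameters of the virtual-classroom task used in the study.

```python
import random
import string

TARGET = "X"      # assumed target letter; the study's letter set is not specified here
N_TRIALS = 120    # assumed number of trials

def generate_trials(n: int, target_rate: float = 0.2) -> list[str]:
    """Build a random letter sequence in which the target appears at a fixed rate."""
    non_targets = [c for c in string.ascii_uppercase if c != TARGET]
    return [TARGET if random.random() < target_rate else random.choice(non_targets)
            for _ in range(n)]

def score(trials: list[str], responses: list[bool]) -> dict:
    """Score key presses against the sequence: hit, miss, false alarm, correct rejection."""
    counts = {"hit": 0, "miss": 0, "false_alarm": 0, "correct_rejection": 0}
    for letter, pressed in zip(trials, responses):
        if letter == TARGET:
            counts["hit" if pressed else "miss"] += 1
        else:
            counts["false_alarm" if pressed else "correct_rejection"] += 1
    return counts

if __name__ == "__main__":
    trials = generate_trials(N_TRIALS)
    # Placeholder responses: this simulated "participant" presses only on target letters.
    responses = [letter == TARGET for letter in trials]
    print(score(trials, responses))
```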

The investigation of emotions and attentional behaviors in children with ASD and TD children revealed that positive emotions were prominent when the children paid attention. For example, children with ASD expressed joy, while the TD group expressed both joy and surprise. This finding is similar to a study [28] reporting that children with ASD show positive emotions during increased learning engagement. However, there was no significant correlation between positive or negative emotions and performance scores. This indicates that although emotions reflect students' interest in learning, they are not sufficient for attention assessment; emotion alone will not define attention at all times. In the next phase of the investigation, we explored whether the lower-level components of emotions, i.e., facial expressions, describe attention. Facial expressions are composed of facial action units. For example, the surprise emotion comprises three facial action units: mouth open, eye widen, and brow raise. We examined ten basic facial action units related to attentional behaviors in children with ASD and TD children, including brow furrow, brow raise, lip corner depressor, smile, nose wrinkle, lip suck, mouth open, chin raise, and lip pucker [9]. Four facial action units were common in children with ASD during the attention task: mouth open, brow raise, lip suck, and lip press. Similar facial expressions were identified in the TD children, except for lip press [10]. Further analysis of how these facial action units differentiate attention from inattention led to an attention detection model based on facial landmarks.
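
As a simple illustration of how facial action units compose an emotion, the sketch below encodes the surprise example from the text (mouth open, eye widen, brow raise) and flags the emotion when all of its units are active in one frame of tracker output. The 0–1 intensity scale, the threshold, and the example frame values are assumptions made for illustration, not values from the study.

```python
# Illustrative mapping from an emotion to its constituent facial action units,
# based on the surprise example in the text; other mappings would be added similarly.
AU_TO_EMOTION = {
    "surprise": ["mouth_open", "eye_widen", "brow_raise"],
}

def detect_emotion(au_intensities: dict[str, float], threshold: float = 0.5) -> list[str]:
    """Return emotions whose action units are all active in one frame of tracker output."""
    detected = []
    for emotion, units in AU_TO_EMOTION.items():
        if all(au_intensities.get(unit, 0.0) >= threshold for unit in units):
            detected.append(emotion)
    return detected

# Example frame of hypothetical action-unit intensities, scaled 0-1.
frame = {"mouth_open": 0.8, "eye_widen": 0.7, "brow_raise": 0.9, "lip_press": 0.1}
print(detect_emotion(frame))  # ['surprise']
```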

Facial landmarks are facial points that describe the facial action units; examples are depicted as red dots in Fig. 2. We annotated the facial landmarks generated during the attention task as attention or inattention based on the participants' responses, and then fed the annotated landmarks into a machine learning algorithm to build specific and generalized models. A specific model is trained on facial landmarks from a single participant, while a generalized model is trained on facial landmarks pooled across participants. The results show that the specific models outperformed the generalized models, indicating that each child has unique facial movements for attentional behavior. Conversely, the generalized model for TD children performed better than the one for the ASD participants, which suggests that face-based attentional behaviors are more consistent among TD children.
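
The distinction between specific and generalized models can be sketched with scikit-learn, as below. Synthetic random features stand in for real landmark coordinates, and the random-forest classifier is an assumption; the study does not prescribe a particular algorithm here. The specific model trains and tests within one participant, while the generalized model is evaluated with leave-one-subject-out cross-validation.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split, LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in data: 5 participants x 200 frames, 68 landmarks -> 136 (x, y) features,
# with binary attention/inattention labels. Real data would come from the tracker output.
n_participants, n_frames, n_features = 5, 200, 136
X = rng.normal(size=(n_participants * n_frames, n_features))
y = rng.integers(0, 2, size=n_participants * n_frames)    # 1 = attention, 0 = inattention
groups = np.repeat(np.arange(n_participants), n_frames)   # participant id per frame

# Specific model: trained and tested on frames from a single participant.
mask = groups == 0
X_tr, X_te, y_tr, y_te = train_test_split(X[mask], y[mask], test_size=0.3, random_state=0)
specific = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
print("specific-model accuracy:", specific.score(X_te, y_te))

# Generalized model: trained on some participants and tested on a held-out participant
# (leave-one-subject-out cross-validation).
general = RandomForestClassifier(random_state=0)
scores = cross_val_score(general, X, y, groups=groups, cv=LeaveOneGroupOut())
print("generalized-model accuracy per held-out participant:", scores)
```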

For effective attention assessment in children with ASD using a face-tracking tool, the attention detection model should be personalized: face-based attention assessment should be built on the facial actions of each individual child. This finding also aligns with a well-known attribute of ASD, namely that each child with ASD is entirely different from another. It is worth noting that facial features have potential for attention assessment, but there is no universal facial feature that describes attentional behavior in children with ASD.

References

  1. Alshaban, F., et al., Prevalence and correlates of autism spectrum disorder in Qatar: a national study. Journal of Child Psychology and Psychiatry, 2019. 60(12): p. 1254-1268.
  2. James, W., The Principles of Psychology. New York: Holt and Company, 1890.
  3. Moore, M. and S. Calvert, Brief report: Vocabulary acquisition for children with autism: Teacher or computer instruction. Journal of Autism and Developmental Disorders, 2000. 30(4): p. 359-362.
  4. Banire, B., et al., A systematic review: Attention assessment of virtual reality based intervention for learning in children with autism spectrum disorder. In 2017 7th IEEE International Conference on Control System, Computing and Engineering (ICCSCE), 2017.
  5. Dewan, M.A.A., M. Murshed, and F. Lin, Engagement detection in online learning: a review. Smart Learning Environments, 2019. 6(1): p. 1.
  6. Escobedo, L., et al., Using Augmented Reality to Help Children with Autism Stay Focused. IEEE Pervasive Computing, 2014. 13(1): p. 38-46.
  7. Rosvold, H.E., et al., A continuous performance test of brain damage. Journal of Consulting Psychology, 1956. 20(5): p. 343.
  8. iMotions, iMotions Biometric Tool, 2017.
  9. Magdin, M. and F. Prikler, Real time facial expression recognition using webcam and SDK Affectiva. IJIMAI, 2018. 5(1): p. 7-15.
  10. Banire, B., et al., Attention Assessment: Evaluation of Facial Expressions of Children with Autism Spectrum Disorder. In International Conference on Human-Computer Interaction, 2019. Springer.

 
