Objectives
Facial palsy causes variable facial disfigurement, ranging from subtle asymmetry to crippling deformity. No standard database currently exists to serve as a resource for facial palsy education and research. We present a standardized set of facial photographs and videos representing the entire spectrum of flaccid and nonflaccid (aberrantly regenerated or synkinetic) facial palsy. To demonstrate the utility of the dataset, we describe the relationship between the level of facial function and perceived emotional expression, as determined by an automated, machine learning-based emotion detection algorithm.
Methods
Photographs and videos of patients with both flaccid and nonflaccid facial palsy were prospectively gathered. The degree of facial palsy was quantified using the eFACE, House-Brackmann, and Sunnybrook scales. Perceived emotion during a standard video of facial movements was determined using an automated, machine learning-based algorithm.
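As an illustration of how per-frame emotion scores can be aggregated over a video of facial movements, the minimal Python sketch below assumes a hypothetical frame-level classifier, `score_emotions`, that returns per-emotion probabilities; it is not the specific algorithm used in this study, and the function and parameter names are illustrative only.

```python
# Minimal sketch: average per-frame emotion probabilities over a video.
# Assumption (not from the study): `score_emotions(frame)` is a
# hypothetical frame-level classifier returning a dict of per-emotion
# probabilities (e.g., joy, contempt). OpenCV is used only to decode frames.
from collections import defaultdict

import cv2  # pip install opencv-python


def score_emotions(frame):
    """Hypothetical placeholder for a frame-level emotion classifier.

    A real implementation would run face detection followed by an
    expression-recognition model and return per-emotion probabilities.
    """
    raise NotImplementedError("plug in an emotion-recognition model here")


def mean_emotions(video_path, sample_every=5):
    """Average per-emotion probabilities across sampled video frames."""
    capture = cv2.VideoCapture(video_path)
    totals, n_scored, frame_idx = defaultdict(float), 0, 0
    while True:
        ok, frame = capture.read()
        if not ok:  # end of video
            break
        if frame_idx % sample_every == 0:  # subsample frames for speed
            for emotion, prob in score_emotions(frame).items():
                totals[emotion] += prob
            n_scored += 1
        frame_idx += 1
    capture.release()
    return {emotion: total / n_scored for emotion, total in totals.items()}
```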
Results
Sixty participants were enrolled and categorized by eFACE score across the range of facial function. Patients with complete flaccid facial palsy (eFACE <60) showed a significant loss of perceived joy compared with the nonflaccid and normal groups. In addition, patients with only moderate flaccid and nonflaccid facial palsy showed a significant increase in perceived negative emotion (contempt) compared with the normal group.
Conclusion
We provide this open-source database to assist in comparing current and future scales of facial function, as well as to facilitate comprehensive investigation of the entire spectrum of facial palsy. The automated, machine learning-based algorithm detected negative emotions at moderate levels of facial palsy and suggested a threshold severity of flaccid facial palsy beyond which joy was no longer perceived.