Analysis of Various Facial Expressions of Horses as a Welfare Indicator Using Deep Learning

Su Min Kim, Gil Jae Cho

Research output: Contribution to journal › Article › peer-review

3 Scopus citations

Abstract

This study aimed to demonstrate that deep learning can be used effectively to identify various equine facial expressions as welfare indicators. A total of 749 horses (586 healthy and 163 experiencing pain) were investigated, and a model was developed to recognize facial expressions from images and classify them into four categories: resting horses (RH), horses with pain (HP), horses immediately after exercise (HE), and horseshoeing horses (HH). Normalization of equine facial posture showed that profile images (99.45%) yielded higher accuracy than frontal images (97.59%). The eyes–nose–ears detection model achieved an accuracy of 98.75% in training, 81.44% in validation, and 88.1% in testing, for an average accuracy of 89.43%. Overall, the average classification accuracy was high; however, the accuracy of pain classification was low. These results imply that horses may display a range of facial expressions beyond pain, depending on the situation and on the degree and type of pain experienced. Furthermore, automatic pain and stress recognition would greatly enhance the identification of pain and other emotional states, thereby improving the quality of equine welfare.
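As a quick sanity check on the reported figures, the average detection accuracy of 89.43% stated in the abstract can be reproduced as the arithmetic mean of the three per-split accuracies (a minimal sketch; the variable names are illustrative, not from the paper):

```python
# Accuracies reported in the abstract for the eyes-nose-ears detection model.
accuracies = {"training": 98.75, "validation": 81.44, "testing": 88.1}

# The stated 89.43% average is the plain arithmetic mean of the three splits.
average = sum(accuracies.values()) / len(accuracies)
print(round(average, 2))  # 89.43
```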

Original language: English
Article number: 283
Journal: Veterinary Sciences
Volume: 10
Issue number: 4
DOIs
State: Published - Apr 2023

Keywords

  • automatic recognition
  • deep learning
  • equine welfare
  • facial expression
  • horse
  • pain
  • profile