Automatic pain assessment from face video (continuous pain intensity estimation in adults and newborns)

Egede, Joy Onyekachukwu (2019) Automatic pain assessment from face video (continuous pain intensity estimation in adults and newborns). PhD thesis, University of Nottingham.



Pain assessment is a crucial aspect of medical diagnosis, as pain is symptomatic of many medical conditions. When a patient is unable to self-report pain, assessment is performed by a clinician through observation of behavioural changes and vital signs; however, this method is highly subjective and discontinuous in time. To introduce an objective measure into clinical pain assessment and support real-time pain monitoring, automatic pain recognition models have been proposed, but their performance is still limited by the small, imbalanced pain datasets available for training. In addition, there is currently a dearth of information on the usability and impact of such tools in clinical settings. This thesis aims to develop novel computer vision and machine learning techniques that achieve good pain estimation on small and sparse pain datasets, and to explore the applicability of automated pain assessment tools in clinical settings.

To address the shortage of data for automatic pain recognition, this thesis presents the collection of a geographically diverse, multimodal newborn and infant pain dataset containing over 200 participant videos with frame-wise pain intensity annotations. Furthermore, to address the problem of learning from small pain datasets with sparse representation of the higher pain levels, two novel methods of learning discriminative pain features for pain intensity estimation are proposed: a Hybrid Deep-learned and Hand-crafted (HDH) feature framework and a Cumulative Attribute (CA) learning framework. Evaluation of the HDH feature model on the UNBC-McMaster pain dataset yielded state-of-the-art performance, with an RMSE of 0.99 and a Pearson correlation of 0.67. Similarly, analysis of the CA learning framework showed that models trained on CA features consistently outperformed those trained on the corresponding low-level features, and that the improvement stemmed from the CA features' ability to make better predictions for the sparse higher pain classes. Additional evaluation on the newborn pain data yielded performance comparable to human error.
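The abstract does not detail the CA framework, but the general cumulative attribute idea can be sketched as follows: a discrete intensity label y on a 0..K scale is re-encoded as a vector of binary attributes, where attribute i is "on" whenever y >= i + 1, so neighbouring intensity levels share most of their attributes and the sparse high levels borrow statistical strength from the lower ones. This is a minimal illustration under that assumption; the function names and the simple threshold-and-count decoding are illustrative, not the thesis's actual implementation.

```python
def to_cumulative_attributes(y, num_levels):
    """Encode an intensity label y (0..num_levels) as a cumulative
    binary attribute vector: bit i is 1.0 iff y >= i + 1."""
    return [1.0 if y >= i + 1 else 0.0 for i in range(num_levels)]


def from_cumulative_attributes(scores, threshold=0.5):
    """Decode predicted per-attribute scores back to an intensity
    estimate by counting attributes above the threshold (an
    illustrative decoder; other decoding schemes are possible)."""
    return sum(1 for s in scores if s > threshold)


# A label of 3 on a 0..5 scale activates the first three attributes,
# so labels 3 and 4 differ in only one bit, unlike one-hot coding.
attrs = to_cumulative_attributes(3, 5)   # [1.0, 1.0, 1.0, 0.0, 0.0]

# Noisy predicted attribute scores still decode to the right level.
estimate = from_cumulative_attributes([0.9, 0.8, 0.6, 0.2, 0.1])  # 3
```

In practice one regressor or classifier is trained per attribute (or a single model with a vector target), and the decoded count serves as the intensity estimate.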

As a step towards investigating the suitability of computer-assisted pain assessment tools in clinical contexts, a user study was conducted with clinicians from a Neonatal Intensive Care Unit (NICU). Using the aforementioned models as early prototypes, qualitative and quantitative methods were employed to gauge acceptance and identify usability design issues. The study surfaced NICU environment-specific and context-specific design considerations that researchers should take into account when developing video-based pain assessment tools.

Preliminary experimental results from the proposed models, together with the insights garnered from the stakeholder study, could potentially lead to improved clinical pain management. Likewise, the dataset presented could promote research in automatic newborn pain assessment and serve as a benchmark for future methods.

Item Type: Thesis (University of Nottingham only) (PhD)
Supervisors: Valstar, Michel
Qiu, Guoping
Keywords: Automatic pain recognition; Pain estimation; Affective Computing; Automatic understanding of human behaviour; Facial expression analysis; Machine learning; Computer vision; Image processing.
Subjects: T Technology > TA Engineering (General). Civil engineering (General)
Faculties/Schools: UNNC Ningbo, China Campus > Faculty of Science and Engineering > School of Computer Science
Item ID: 55792
Depositing User: EGEDE, Joy
Date Deposited: 04 Apr 2019 07:09
Last Modified: 04 Jan 2021 08:11
