Facial recognition is a non-invasive method for analysing facial characteristics and changes in facial expression. There is evidence to suggest a link between physiological health and facial expression patterns. Many medical conditions are associated with specific facial characteristics: some are related to inherited genetic conditions, while others are acquired as a result of conditions such as stroke, nerve injury and dementia. If successful, this technology could prove useful for studies evaluating the monitoring of acute or chronic illness. As a proof of principle, the proposed study will ask participants to complete a maximal exercise test that physiologically challenges the body, to see how this correlates with facial expressions. Depending on the study outcomes and future work, the approach could become a useful clinical tool for evaluating and tracking patient health and well-being. The investigators therefore aim to conduct a proof-of-principle study in which young, healthy participants are physically stressed by a maximal exercise test while their facial expressions are recorded, in order to determine the relationship between physiological stress and changes in facial parameters.
No formal sample size calculation is required for this pilot study, which will recruit 15 participants as a proof of concept. The outcomes of this research will identify measurement parameters to be evaluated in future studies of facial characteristics. This will inform the next steps in applying the process to further healthy participants, with the ultimate aim of translating it to research on a range of patient populations in elective and acute medical settings.

Video and still images will be captured using a camera mounted on a tripod in front of the participant while they exercise. The data analysis consists of four processing stages: facial landmark detection, facial region identification, region-based facial analysis and data correlation. Facial landmark detection labels a total of 68 points at predefined locations on each face. Based on these landmarks, region identification segments the face into different areas, such as the brows, eyes and chin. Region-based analysis is then applied to extract the geometry, colour, emotion and temperature information of each facial region and to calculate their variation over time. Finally, vital-sign readings and region-based facial data are analysed together using statistical tools in order to identify correlations and demonstrate how facial appearance changes in different scenarios.

Lancaster University is the data owner/controller, and UCLan will follow Lancaster's data management requirements when processing the data. The data will be transferred on an encrypted hard drive and stored on secure online storage in accordance with UCLan data protection rules, accessible only to the FACIAL study collaborators and statisticians. Whilst sociodemographic and physiological data can be anonymised, the video recordings of facial expressions cannot; the two sets of data will therefore be given separate research codes so that protected data cannot be linked to identities by anyone other than the study team. Processing of protected data such as ethnicity and biometric data is central to the research question and may ultimately have utility in the care of high-dependency hospital patients.

The FACIAL study protocol has been formulated and reviewed by the FACIAL Steering Committee, and the research will be carried out in line with the University of Central Lancashire and Lancaster University codes of ethics. The study will be conducted by the FACIAL Steering Committee as a collaboration between East Lancashire Hospitals NHS Trust, Lancaster University and the University of Central Lancashire. The findings from this study will be presented at national or international conferences, and manuscript(s) will be prepared for submission to a peer-reviewed journal. Additional consent will be obtained from individuals for any facial images used in publications.
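The record does not name the software used for the first two processing stages. As an illustration only, 68-point landmark detection and region segmentation could be sketched along the following lines, here assuming the widely available dlib face detector and its standard pre-trained 68-landmark (iBUG 300-W) model; the file name, region index ranges and function names below are illustrative assumptions, not the study's actual code.

```python
# Illustrative sketch only: stages 1-2 of the described pipeline
# (68-point landmark detection and facial region segmentation),
# assuming dlib's frontal face detector and pre-trained 68-point model.
import dlib
import numpy as np

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

# Standard index ranges of the 68-point annotation scheme.
REGIONS = {
    "jaw":        range(0, 17),
    "right_brow": range(17, 22),
    "left_brow":  range(22, 27),
    "nose":       range(27, 36),
    "right_eye":  range(36, 42),
    "left_eye":   range(42, 48),
    "mouth":      range(48, 68),
}

def landmarks(image):
    """Stage 1: return the 68 (x, y) landmark points of the first detected face."""
    faces = detector(image, 1)
    if not faces:
        return None
    shape = predictor(image, faces[0])
    return np.array([(shape.part(i).x, shape.part(i).y) for i in range(68)])

def segment_regions(points):
    """Stage 2: group landmark points into named facial regions."""
    return {name: points[list(idx)] for name, idx in REGIONS.items()}

frame = dlib.load_rgb_image("frame_0001.png")   # one still image or video frame
pts = landmarks(frame)
if pts is not None:
    regions = segment_regions(pts)
    # Stages 3-4 (per-region geometry/colour/temperature features and their
    # correlation with vital-sign readings) would operate on `regions` over time.
```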
Study Type
OBSERVATIONAL
Enrollment
15
Healthy participants will be seated on a cycle ergometer fitted with a 12-lead ECG or chest-strap heart rate monitor. Data from the questionnaire will be stored in a study site file in a locked cabinet in an office. Baseline readings will be obtained on the bike for 3 minutes before the participant begins a 3-minute warm-up at 50 watts (W). The workload is then increased to a level corresponding to a heart rate of 120 bpm (gentle exercise), and subsequently increased by 25-50 W every three minutes until volitional exhaustion. Between each stage there will be a 30-second rest period during which the participant remains on the ergometer and pedals against minimal resistance ("freewheeling"). Still images of the face and facial video recordings will be taken during these 30-second rest periods between increases in exercise intensity. A finger-prick blood sample (glucose & lactate) will be taken at the start of the test and every 3 minutes thereafter.
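For illustration only, the staging of the incremental test (3-minute baseline, 3-minute warm-up at 50 W, then 25-50 W increments every 3 minutes with 30-second freewheeling rests) can be written out as a simple schedule. The starting workload that corresponds to 120 bpm, the increment size and the number of stages are set per participant (the test actually ends at volitional exhaustion), so the values below are placeholders.

```python
# Illustrative sketch of the incremental exercise schedule described above.
# start_watts (workload giving ~120 bpm), the 25-50 W increment and the
# number of stages vary per participant; the figures here are placeholders.
from dataclasses import dataclass

@dataclass
class Stage:
    label: str
    duration_s: int
    watts: int

def build_schedule(start_watts=100, increment_watts=25, n_stages=6):
    stages = [Stage("baseline (seated rest)", 180, 0),
              Stage("warm-up", 180, 50)]
    for i in range(n_stages):
        stages.append(Stage(f"stage {i + 1}", 180, start_watts + i * increment_watts))
        stages.append(Stage(f"rest after stage {i + 1} (freewheel, images taken)", 30, 0))
    return stages

for s in build_schedule():
    print(f"{s.label:45s} {s.duration_s:4d} s  {s.watts:3d} W")
```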
Faculty of Health and Medicine, Lancaster University
Lancaster, Lancashire, United Kingdom
Change in distance between facial features
Change in inter-feature distance (e.g., between the eyes or mouth corners) during different phases of exercise, derived from facial regions and landmarks detected in video and still images (see the illustrative sketch below). Unit: millimetres (mm).
Time frame: 60 minutes
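As an illustration of how an inter-feature distance could be derived from detected landmarks, the sketch below computes the Euclidean distance between two feature centres in pixels and converts it to millimetres via a calibration factor. The calibration method (e.g., a reference object of known size in the camera's field of view) is an assumption for this sketch and is not specified in the record.

```python
# Illustrative only: distance between two facial features, converted from
# pixels to millimetres with an assumed calibration factor (mm_per_pixel).
import numpy as np

def feature_centre(region_points):
    """Mean (x, y) of a region's landmark points, e.g. one eye."""
    return np.mean(region_points, axis=0)

def inter_feature_distance_mm(region_a, region_b, mm_per_pixel):
    """Euclidean distance between two feature centres, in millimetres."""
    d_px = np.linalg.norm(feature_centre(region_a) - feature_centre(region_b))
    return d_px * mm_per_pixel

# e.g., with the `regions` dict from the landmark sketch above:
# d = inter_feature_distance_mm(regions["left_eye"], regions["right_eye"], mm_per_pixel=0.21)
```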
Change in size and shape of facial features
Change in facial feature dimensions (e.g., width and height of the eyes and mouth) and geometric shape indices during different phases of exercise, also derived from facial regions and landmarks detected in video and still images. Units: millimetres² or shape index (unitless).
Time frame: 60 minutes
Change in facial blood flow patterns
Variation in blood flow across facial regions during exercise, captured through thermal imaging. Units: perfusion units or arbitrary units (a.u.).
Time frame: 60 minutes
Change in facial temperature
Alterations in surface temperature across facial regions during exercise, measured with thermal imaging. Units: degrees Celsius (°C)
Time frame: 60 minutes
Change in facial skin colour
Variation in facial colour intensity and hue across regions during exercise, analysed from captured video, thermal imaging and still images (see the illustrative sketch below). Units: RGB values.
Time frame: 60 minutes
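As an illustration of extracting region colour from video frames, the sketch below averages RGB values inside the convex hull of a region's landmark points, using OpenCV to build the mask. This is an assumed approach for illustration, not the study's specified method.

```python
# Illustrative only: mean RGB colour inside a facial region, with the
# region mask built from its landmark points (assumed approach).
import cv2
import numpy as np

def mean_region_rgb(frame_rgb, region_points):
    """Average R, G, B inside the convex hull of a region's landmarks."""
    mask = np.zeros(frame_rgb.shape[:2], dtype=np.uint8)
    hull = cv2.convexHull(region_points.astype(np.int32))
    cv2.fillConvexPoly(mask, hull, 255)
    r, g, b = [float(frame_rgb[..., c][mask == 255].mean()) for c in range(3)]
    return r, g, b

# Tracked frame by frame, e.g. mean_region_rgb(frame, regions["left_eye"]),
# this gives a per-region colour time series to compare across exercise stages.
```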
Change in heart rate
Heart rate is recorded at rest, during exercise, and during recovery. Units: Beats per minute (bpm)
Time frame: 60 minutes
Change in heart rate variability (HRV)
Time- and frequency-domain HRV indices analysed during different exercise stages (see the illustrative sketch below). Units: milliseconds (ms)
Time frame: 60 minutes
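The specific HRV indices are not listed in the record. As an illustrative sketch, the standard time-domain indices SDNN and RMSSD could be computed from successive RR intervals (in ms) as follows; variable names are placeholders.

```python
# Illustrative only: standard time-domain HRV indices from RR intervals (ms).
import numpy as np

def sdnn(rr_ms):
    """Standard deviation of the RR (NN) intervals, in ms."""
    return float(np.std(np.asarray(rr_ms, dtype=float), ddof=1))

def rmssd(rr_ms):
    """Root mean square of successive RR-interval differences, in ms."""
    diffs = np.diff(np.asarray(rr_ms, dtype=float))
    return float(np.sqrt(np.mean(diffs ** 2)))

# e.g., computed separately for the rest, exercise and recovery segments:
# sdnn(rr_rest), rmssd(rr_rest), sdnn(rr_stage1), ...
```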
Change in blood glucose
Blood glucose levels recorded at rest, during exercise, and during recovery. Unit: millimoles per litre (mmol/L)
Time frame: 60 minutes
Change in blood lactate
Blood lactate recorded at rest, during exercise, and during recovery. Units: millimoles per litre (mmol/L)
Time frame: 60 minutes