Publications
- Estimating Motor Symptom Presence and Severity in Parkinson's Disease from Wrist Accelerometer Time Series Using ROCKET and InceptionTime. Scientific Reports, 2025
- Predictive Model Development to Identify Failed Healing in Patients after Non-Union Fracture Surgery. Proceedings of the 46th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC 2024), IEEE, 2024, 1-4
- Assessing Human-Human Kinematics for the Implementation of Robot-Assisted Physical Therapy in Humanoids: A pilot study. International Conference on Rehabilitation Robotics (ICORR), 2023
- A neural network for the detection of soccer headers from wearable sensor data. Scientific Reports 12 (1), 2022
- High-Resolution Motor State Detection in Parkinson's Disease Using Convolutional Neural Networks. Scientific Reports 10 (1), 2020, 5860
- A Multi-layer Gaussian Process for Motor Symptom Estimation in People with Parkinson's Disease. IEEE Transactions on Biomedical Engineering 66 (11), 2019, 3038-3049
- Dynamics-based estimation of Parkinson's disease severity using Gaussian Processes. Second IFAC Conference on Cyber-Physical & Human Systems, 2018
- Acquisition, Validation and Preprocessing of Wrist-Worn Sensor Data in Patients with Parkinson's Disease and Healthy Controls. International Parkinson and Movement Disorder Society 21st International Congress, Vancouver, BC, 2017
- Deep Learning in Objective Classification of Spontaneous Movement of Patients with Parkinson's Disease Using Large-Scale Free-Living Sensor Data. International Parkinson and Movement Disorder Society 21st International Congress, Vancouver, BC, 2017
- Data augmentation of wearable sensor data for Parkinson's disease monitoring using convolutional neural networks. Proceedings of the 19th ACM International Conference on Multimodal Interaction (ICMI 2017), 2017
Clinical Decision‑Support through Interpretable and Uncertainty‑Aware Machine Learning
Researchers: Cedric Donié, Neha Das, Satoshi Endo
Clinical decision‑making increasingly depends on moving beyond static, episodic assessments toward continuous, quantitative estimation of patient state. Detecting subtle changes, predicting future progression, and identifying early warning signs can transform how interventions are planned and adjusted. Continuous estimation is especially valuable when critical events are rare and embedded within large volumes of routine or noisy data.
To be useful in real‑world practice, models must perform reliably on heterogeneous, incomplete, and often small datasets, while producing outputs that are interpretable and actionable for clinicians. The challenge is to bridge the gap between powerful data‑driven learning methods and the requirements of safe, trusted, and clinically integrated systems.
Research questions
- How can machine learning be used to estimate patient state and predict outcomes in a way that is robust to imperfect, variable, and sparse clinical data?
- How can early‑warning outputs be made clinically interpretable, highlighting why an adverse trend is likely and how to respond?
- How can these methods adapt across multiple domains while remaining tailored to specific patient populations and goals?
Approach
We design machine‑learning frameworks that provide interpretable state estimation and outcome prediction for decision‑support in clinical contexts.
Our approach emphasises:
- Feature‑based and sequence‑learning methods to capture temporal and biomechanical patterns in patient data.
- Domain‑aware calibration to align model outputs with meaningful clinical measures.
- Robustness to small, imbalanced datasets through model regularisation and data‑efficient training (see the sketch after this list).
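The sketch below illustrates how these elements fit together, assuming windowed wrist-accelerometer data similar to our Parkinson's disease studies: simple hand-crafted features per window, a class-weighted classifier for imbalanced labels, and probability calibration so that scores can be read as risk estimates. The data, window length, and feature choices are hypothetical placeholders, not the pipeline of any published study.

```python
# Minimal sketch (illustrative only): window-level features from 3-axis wrist
# accelerometry, a class-weighted classifier, and probability calibration.
# All data and parameters below are hypothetical placeholders.
import numpy as np
from sklearn.calibration import CalibratedClassifierCV
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def window_features(windows: np.ndarray) -> np.ndarray:
    """Simple statistics per window; input shape (n_windows, n_samples, 3 axes)."""
    mag = np.linalg.norm(windows, axis=2)            # acceleration magnitude
    return np.column_stack([
        mag.mean(axis=1),                            # overall intensity
        mag.std(axis=1),                             # variability
        np.abs(np.diff(mag, axis=1)).mean(axis=1),   # jerkiness proxy
    ])

# Hypothetical cohort: 500 windows of 4 s at 50 Hz, with a rare positive class.
rng = np.random.default_rng(0)
windows = rng.normal(size=(500, 200, 3))
labels = (rng.random(500) < 0.1).astype(int)

X = window_features(windows)
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, stratify=labels, random_state=0
)

# Class weighting counteracts the imbalance; sigmoid calibration aligns the
# classifier's scores with probabilities that clinicians can act on.
base = LogisticRegression(class_weight="balanced", max_iter=1000)
model = CalibratedClassifierCV(base, method="sigmoid", cv=3)
model.fit(X_train, y_train)
print(model.predict_proba(X_test)[:5, 1])            # calibrated risk estimates
```

In a real pipeline, the hand-crafted features would typically be replaced by richer representations, for example ROCKET's random convolutional kernels or a trained sequence model such as InceptionTime.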
Exemplar case studies demonstrate the flexibility of this approach:
- Fracture‑healing prognosis with gradient‑boosted models to flag patients at risk of delayed union early after surgery (see the sketch after this list).
- Head‑impact exposure monitoring with recurrent networks applied to wearable sensor data for quantifying long‑term exposure in athletes.
- Abnormal movement detection with interpretable classifiers and biomechanical features to characterise compensatory motion patterns in rehabilitation.
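As a concrete illustration of the first case study, the sketch below trains a gradient-boosted classifier on a small synthetic tabular cohort and ranks inputs by permutation importance as a basic interpretability check. The feature names, cohort size, and "delayed union" label are hypothetical placeholders, not the variables or outcomes of the published EMBC 2024 study.

```python
# Minimal sketch (illustrative only): gradient boosting for fracture-healing
# prognosis on a small tabular cohort, with permutation importance as a
# simple interpretability check. Features and labels are synthetic placeholders.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(1)
feature_names = ["age", "bmi", "fracture_gap_mm", "weight_bearing_score", "crp_level"]
X = rng.normal(size=(120, len(feature_names)))                  # small cohort, typical for clinical data
y = (X[:, 2] + 0.5 * rng.normal(size=120) > 0.8).astype(int)    # synthetic "delayed union" label

clf = GradientBoostingClassifier(n_estimators=100, max_depth=2, learning_rate=0.05)

# Cross-validated AUC is more honest than a single split on small cohorts.
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=1)
print("CV AUC:", cross_val_score(clf, X, y, cv=cv, scoring="roc_auc").mean())

# Fit on the full cohort and rank features by permutation importance.
clf.fit(X, y)
imp = permutation_importance(clf, X, y, n_repeats=20, random_state=1)
for name, score in sorted(zip(feature_names, imp.importances_mean), key=lambda t: -t[1]):
    print(f"{name}: {score:.3f}")
```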
We are extending this work toward grey‑box modelling, embedding established clinical knowledge, biomechanical principles, and pathophysiological constraints directly into machine‑learning architectures. This hybrid approach combines the adaptability of data‑driven methods with the reliability and interpretability of mechanistic models, producing systems that are both evidence‑informed and patient‑specific.
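One common way to realise such a grey-box model is residual modelling: a mechanistic prior supplies the baseline prediction, and a data-driven component learns only what the prior cannot explain. The sketch below shows this pattern under strong simplifying assumptions; the exponential "recovery curve" prior and the synthetic data are hypothetical placeholders, not an established clinical model.

```python
# Minimal sketch (illustrative only) of a grey-box residual model: a mechanistic
# prior provides the baseline and a data-driven model learns the residual.
# The prior and the data are hypothetical placeholders.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def mechanistic_prior(x: np.ndarray) -> np.ndarray:
    """Stand-in for domain knowledge, e.g. an exponential recovery curve over time."""
    weeks_since_surgery = x[:, 0]
    return 1.0 - np.exp(-0.1 * weeks_since_surgery)

rng = np.random.default_rng(2)
X = np.column_stack([rng.uniform(0, 52, 300), rng.normal(size=300)])  # weeks, one covariate
y = mechanistic_prior(X) + 0.2 * np.tanh(X[:, 1]) + 0.05 * rng.normal(size=300)

# The learned component is trained on the residual only, so the mechanistic
# structure stays visible in the final prediction.
residual_model = RandomForestRegressor(n_estimators=200, random_state=2)
residual_model.fit(X, y - mechanistic_prior(X))

def grey_box_predict(x: np.ndarray) -> np.ndarray:
    return mechanistic_prior(x) + residual_model.predict(x)

print(grey_box_predict(X[:5]))
```

Restricting the learned part to the residual keeps the contribution of the mechanistic prior explicit, which supports interpretation and patient-specific tuning of the data-driven component.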
Another active area is model‑uncertainty quantification. By explicitly estimating prediction confidence, we can identify when the model is extrapolating beyond its reliable operating range and where additional clinical data or expert input is needed. This capability not only improves safety and trust but also guides targeted data collection for model refinement.
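The sketch below gives a minimal example of this idea with a Gaussian process regressor, in the spirit of our earlier Gaussian-process work: the predictive standard deviation grows away from the training data, and queries whose uncertainty exceeds a threshold are flagged for clinician review or targeted data collection. The data and threshold are hypothetical.

```python
# Minimal sketch (illustrative only): uncertainty-aware prediction with a
# Gaussian process. Large predictive standard deviation flags inputs outside
# the model's reliable operating range. Data and threshold are hypothetical.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(3)
X_train = rng.uniform(0, 5, size=(60, 1))                 # observed operating range
y_train = np.sin(X_train[:, 0]) + 0.1 * rng.normal(size=60)

gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(X_train, y_train)

X_query = np.array([[2.5], [9.0]])                        # inside vs. outside the training range
mean, std = gp.predict(X_query, return_std=True)

for x, m, s in zip(X_query[:, 0], mean, std):
    flag = "defer to clinician / collect more data" if s > 0.3 else "ok"
    print(f"x={x:.1f}  prediction={m:.2f} +/- {s:.2f}  -> {flag}")
```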
Key results and achievements
- Demonstrated reliable state‑estimation and event‑detection methods on challenging real‑world datasets.
- Showed that clinically interpretable features improve acceptance and adoption of model outputs.
- Established a pathway toward hybrid grey‑box models and uncertainty‑aware predictions for future decision‑support systems.
By focusing on interpretable, uncertainty‑aware, and clinically grounded machine‑learning frameworks, this research aims to advance decision‑support tools that can evolve into trusted partners for clinicians, informing both immediate care decisions and the design of future clinical studies.