AI for health: Applying AI to model and understand multimorbidity


Multimorbidity, i.e. the co-occurrence of multiple chronic or acute conditions in a single individual, represents one of the largest global health challenges of the 21st century and could become the next global pandemic. As people live longer and the incidence of chronic diseases continues to rise, the prevalence of multimorbidity is reaching unprecedented levels. New evidence suggests that multimorbidity may emerge not through the incidental accumulation of independent diseases, but may instead begin as a single-organ injury that incites damage in secondary organs through dysregulated inter-organ communication.

With the recent revolutions in measurement technologies (e.g., spatial transcriptomics, proteomics, metabolomics), multi-organ chip technologies and AI, we are now at a stage where we are beginning to gain an advanced understanding of homeostatic and disturbed inter-organ communication, allowing us not only to manage multimorbidity but also to therapeutically target, predict and even prevent it. Longitudinal biosamples from well-characterized patient cohorts, data from pilot, hypothesis-generating studies, and technological advances in multi-omics-based profiling, microphysiological systems, high-resolution imaging, and explainable AI now make it possible to comprehensively capture and model the aberrant signals leading to multimorbidity in both an explorative and a hypothesis-driven manner.

This workshop will discuss the potential of these technological breakthroughs for gaining a deeper understanding of the mechanisms behind diseases of the 21st century, such as multimorbidity.

Advances in multi-omics and AI: Confounder testing through explainable AI

Sofia Kirke Forslund-Startceva

Until recently, biomedical sciences advanced incrementally through consecutive experimental and statistical testing of one hypothesis at a time. Parallel -omics technologies now allow orders of magnitude more concurrent measurements per subject. Univariate statistics for biomarker discovery can be scaled up computationally, though key assumptions do not always hold. These risks are aggravated in patient cohort data because such data exhibit structure arising from co-, pre- and multimorbidity, treatment effects, and survival biases, presenting challenges when mining patient data for mechanisms or targets, and in generalizing diagnostic tests. In response, we and others have deployed covariate-aware statistical frameworks. These allow confounder correction and mediation analysis, but so far mostly on a variable-by-variable basis. I will introduce these approaches, then outline some challenges and opportunities in extending confounder-testing paradigms into the machine learning space, with examples from studies on host-microbiome factors in multimorbid cardiovascular, metabolic and renal disease.
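To illustrate the variable-by-variable confounder testing described above, the following is a minimal sketch on synthetic data, not the speaker's actual framework or cohort: a hypothetical drug exposure drives both disease status and a microbial feature, so a naive univariate test reports a spurious disease-feature association, while residualizing both variables on the confounder (a simple covariate-aware correction) removes it.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Hypothetical confounded setup: a drug influences both disease status
# and a microbiome feature; the feature has no direct link to disease.
drug = rng.binomial(1, 0.5, n).astype(float)
disease = (drug + rng.normal(0, 1, n) > 0.5).astype(float)
feature = 2.0 * drug + rng.normal(0, 1, n)

def corr(a, b):
    """Pearson correlation between two 1-D arrays."""
    return float(np.corrcoef(a, b)[0, 1])

def residualize(y, covar):
    """Remove the linear effect of a covariate via least squares."""
    X = np.column_stack([np.ones_like(covar), covar])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

# Naive univariate test: finds a spurious disease-feature association.
naive = corr(feature, disease)

# Confounder-corrected test: partial correlation given drug exposure.
adjusted = corr(residualize(feature, drug), residualize(disease, drug))

print(f"naive r = {naive:.3f}, drug-adjusted r = {adjusted:.3f}")
```

In cohort-scale -omics analyses this check would be repeated per feature and per candidate covariate (medication, comorbidity, survival-related variables), which is what makes the variable-by-variable paradigm costly and motivates extending it into the machine learning space.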
