Interpretable predictions with Convolutional Neural Networks for complex data

Giacomo Lancia

Research output: Thesis › Doctoral thesis 1 (Research UU / Graduation UU)


Deep Learning (DL) and Artificial Intelligence (AI) are nowadays among the most widely used tools for analyzing massive and complex data sets. Despite being very flexible and powerful, Artificial Neural Networks (ANN) are often described as "black-box" methods: the association between predictions and data is neither straightforward nor easy to explain. This thesis focuses on three applications of 1-D Convolutional Neural Networks (1-D CNN), a specific type of ANN, to complex data. Through Explainable Artificial Intelligence (XAI) algorithms, 1-D CNN-based predictions can be made interpretable. Firstly, we considered the possibility of improving the diagnosis of malignant tumors through the classification of Raman spectra of genomic DNA, with much of the focus dedicated to discerning different sub-cell lines of the same tumor. Next, a 1-D CNN was implemented to predict the El Niño Southern Oscillation (ENSO) from Zebiak-Cane (ZC) simulated data; by perturbing the parameters governing the ocean-atmosphere coupling, we tried to understand what the 1-D CNN learns about the physical dynamics of these events. Last, a joint work with the ICU Department at UMCU on improving the treatment of ICU patients is presented: a 1-D CNN was used to dynamically predict nosocomial ICU-Acquired Infections (ICU-AI). Specifically, the 1-D CNN was trained to score the risk of an ICU-AI onset solely by analyzing the massive amount of information available from the ICU monitors. The actual ICU-AI prediction was then provided through Survival Analysis techniques, after embedding the 1-D CNN analysis into a wider set of traditional, explainable variables.
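As a minimal illustration of the core operation behind a 1-D CNN (a sketch for this record, not the thesis code), the example below shows a single convolutional filter sliding over a 1-D signal, such as a Raman spectrum, in plain NumPy; the signal and filter values are made up for demonstration:

```python
import numpy as np

def conv1d_valid(signal, kernel):
    """Slide a filter over a 1-D signal ('valid' padding), as in a 1-D CNN layer."""
    n = len(signal) - len(kernel) + 1
    return np.array([np.dot(signal[i:i + len(kernel)], kernel) for i in range(n)])

# Toy "spectrum" with a single peak; a difference filter responds around the peak.
spectrum = np.array([0.0, 0.0, 1.0, 3.0, 1.0, 0.0, 0.0])
kernel = np.array([-1.0, 0.0, 1.0])  # simple edge-detecting filter

feature_map = conv1d_valid(spectrum, kernel)
print(feature_map)  # positive on the rising flank, negative on the falling one
```

A trained 1-D CNN stacks many such learned filters with nonlinearities; XAI methods then attribute the prediction back to the input positions that activated them.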
Original language: English
Qualification: Doctor of Philosophy
Awarding Institution
  • Utrecht University
Supervisors
  • Frank, Jason, Primary supervisor
  • Spitoni, Cristian, Co-supervisor
Award date: 14 Jun 2023
Place of Publication: Utrecht
Publication status: Published - 14 Jun 2023


Keywords
  • Convolutional Neural Networks
  • Artificial Neural Networks
  • Explainable Artificial Intelligence
  • Competing Risks models
  • Landmarking
  • Intensive care medicine
  • Zebiak-Cane equations
  • El Niño
  • Raman mapping
  • genomic DNA


