Fundamental Properties of Causal Entropy and Information Gain

Research output: Contribution to journal › Conference article › Academic › peer-review

Abstract

Recent developments enable the quantification of causal control given a structural causal model (SCM). This has been accomplished by introducing quantities that encode changes in the entropy of one variable when intervening on another. These measures, named causal entropy and causal information gain, aim to address limitations of existing information-theoretic approaches for machine learning tasks in which causality plays a crucial role. However, they have not yet been studied rigorously from a mathematical standpoint. Our research contributes to the formal understanding of causal entropy and causal information gain by establishing and analyzing fundamental properties of these concepts, including bounds and chain rules. Furthermore, we elucidate the relationship between causal entropy and stochastic interventions. We also propose definitions for causal conditional entropy and causal conditional information gain. Overall, this exploration paves the way for enhancing causal machine learning tasks through the study of recently proposed information-theoretic quantities grounded in considerations about causality.
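The core idea — the entropy of one variable under interventions on another — can be illustrated with a minimal sketch. The toy SCM, the variable names, and the `causal_entropy` helper below are illustrative assumptions, not the paper's notation; the sketch follows one common reading of causal entropy as the entropy of a target variable under atomic interventions do(X = x), averaged over an intervention distribution:

```python
import math

# Toy SCM over binary variables: X -> Y, with Y = X XOR N, N ~ Bernoulli(0.1).
# P_NOISE and the intervention distribution `pi` are illustrative choices.
P_NOISE = 0.1  # P(N = 1)

def entropy(dist):
    """Shannon entropy (in bits) of a distribution given as {value: prob}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def p_y_given_do_x(x):
    """Distribution of Y under the atomic intervention do(X = x): Y = x XOR N."""
    return {x ^ 1: P_NOISE, x: 1 - P_NOISE}

def causal_entropy(pi):
    """Average, over an intervention distribution pi(x), of H(Y | do(X = x))."""
    return sum(pi[x] * entropy(p_y_given_do_x(x)) for x in pi)

# Uniformly randomized interventions on X:
uniform = {0: 0.5, 1: 0.5}
print(causal_entropy(uniform))
```

In this toy model every intervention do(X = x) leaves Y with the same noise entropy, so the causal entropy reduces to the entropy of the Bernoulli(0.1) noise term regardless of the intervention distribution.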

Original language: English
Pages (from-to): 188-208
Number of pages: 21
Journal: Proceedings of Machine Learning Research
Volume: 236
Publication status: Published - Apr 2024
Event: 3rd Conference on Causal Learning and Reasoning, CLeaR 2024 - Los Angeles, United States
Duration: 1 Apr 2024 - 3 Apr 2024

Bibliographical note

Publisher Copyright:
© 2024 F.N.F.Q. Simoes, M. Dastani & T. van Ommen.

Keywords

  • Causal Inference
  • Information Theory
  • Structural Causal Models

