Efficient CHAD

Research output: Contribution to journal › Article › Academic › peer-review

Abstract

We show how the basic Combinatory Homomorphic Automatic Differentiation (CHAD) algorithm can be optimised, using well-known methods, to yield a simple, composable, and generally applicable reverse-mode automatic differentiation (AD) technique with the computational complexity we expect of reverse-mode AD. Specifically, we show that the standard optimisations of sparse vectors and state-passing style code (as well as defunctionalisation/closure conversion, for higher-order languages) give us a purely functional algorithm that is most of the way to the correct complexity, with (functional) mutable updates taking care of the final log factors. We provide an Agda formalisation of our complexity proof. Finally, we discuss how these techniques apply to differentiating parallel functional array programs: the key observations are 1) that all required mutability is (commutative, associative) accumulation, which lets us preserve task-parallelism, and 2) that we can write down data-parallel derivatives for most data-parallel array primitives.
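
As a rough, hypothetical illustration (not taken from the paper, and far simpler than the actual CHAD transformation), the Haskell sketch below shows reverse-mode AD over a toy first-order expression language, touching on two ingredients mentioned in the abstract: the cotangent is a sparse map rather than a dense vector, and it is threaded through the reverse pass in state-passing style, with contributions combined only by commutative, associative accumulation. All names here (Expr, Cotangent, grad) are invented for this sketch.

```haskell
-- Minimal, hypothetical sketch of reverse-mode AD over a tiny expression
-- language. Not the CHAD algorithm itself; it only illustrates the ideas of
-- sparse cotangent vectors and accumulation-only updates (modelled purely
-- with a Map here, rather than with mutable arrays).
import qualified Data.Map.Strict as M

-- Expressions over named inputs.
data Expr
  = Var String
  | Lit Double
  | Add Expr Expr
  | Mul Expr Expr

-- Sparse cotangent vector: only touched inputs appear; missing entries are 0.
type Cotangent = M.Map String Double

-- Forward (primal) evaluation.
eval :: M.Map String Double -> Expr -> Double
eval env (Var x)   = env M.! x
eval _   (Lit c)   = c
eval env (Add a b) = eval env a + eval env b
eval env (Mul a b) = eval env a * eval env b

-- Reverse pass: push an incoming cotangent d down the expression, threading
-- the accumulator in state-passing style and combining contributions with
-- (+), which is commutative and associative. A real implementation would
-- reuse stored primal values instead of re-evaluating subexpressions.
grad :: M.Map String Double -> Expr -> Double -> Cotangent -> Cotangent
grad _   (Var x)   d acc = M.insertWith (+) x d acc
grad _   (Lit _)   _ acc = acc
grad env (Add a b) d acc = grad env b d (grad env a d acc)
grad env (Mul a b) d acc =
  let va = eval env a
      vb = eval env b
  in grad env b (d * va) (grad env a (d * vb) acc)

main :: IO ()
main = do
  -- f(x, y) = x*y + x, so df/dx = y + 1 and df/dy = x.
  let env  = M.fromList [("x", 3), ("y", 4)]
      expr = Add (Mul (Var "x") (Var "y")) (Var "x")
  print (grad env expr 1 M.empty)  -- fromList [("x",5.0),("y",3.0)]
```
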
Original language: English
Article number: 36
Number of pages: 29
Journal: Proceedings of the ACM on Programming Languages
Volume: 8
DOIs:
Publication status: Published - 5 Jan 2024

Bibliographical note

Publisher Copyright:
© 2024 Owner/Author.

Funding

We would like to thank Gershom Bazerman for suggesting Okasaki's banker's method, which inspired the amortisation argument in this paper. This project has received funding via NWO Veni grant number VI.Veni.202.124.

Funder: NWO Veni grant
Funder number: VI.Veni.202.124

Keywords

• automatic differentiation
• functional programming
• source transformation
