How small frictions can stall machine learning - Recognition for a piece of infrastructure that few know but many rely on
On December 1, 2025, BIFOLD researcher Adrian Hill (TU Berlin) was part of a French-German team that received France’s Open Science Award for the open-source software DifferentiationInterface.jl. The prize was awarded by the French Ministry of Higher Education, Research and Space (MESRE) at the 4th edition of the Open Science Awards for Free Research Software. Together with Guillaume Dalle (École des Ponts, France), Hill was honored in the “Communauté” (Community) category. Their software targets developers who use the Julia programming language and automatic differentiation (AD), a core technique for computing derivatives that underpins modern machine learning, optimization, and scientific simulation.
Choosing the right AD system is critical for scientific machine learning. DifferentiationInterface.jl addresses this by offering a unified, composable interface to multiple AD tools in Julia. Combined with the package SparseConnectivityTracer.jl, it enables automatic sparse differentiation across more than a dozen Julia backends.
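To illustrate the unified interface, here is a minimal sketch following the package’s documented usage; it assumes the current API and the ForwardDiff.jl backend, and the test function is only an example:

```julia
using DifferentiationInterface
import ForwardDiff  # one of the many supported AD backends

# A simple test function: squared Euclidean norm.
f(x) = sum(abs2, x)
x = [1.0, 2.0, 3.0]

# The backend is passed as an ordinary argument; the call itself is generic.
value_and_gradient(f, AutoForwardDiff(), x)  # -> (14.0, [2.0, 4.0, 6.0])
```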
The hidden cost: Switching AD tools meant rewriting entire codebases
Until now, switching between different AD backends often required deep changes to the code. Each tool came with its own interface, data structures, and quirks. As a result, researchers would often stick with suboptimal tools or avoid experimenting with alternatives altogether.
DifferentiationInterface.jl tackles exactly this issue. The software provides a unified interface for multiple AD libraries in Julia. The benefit: write once, test all of them. Researchers can now easily compare different methods and pick the one that fits their problem best, boosting efficiency, reproducibility, and scientific progress.
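In practice, “write once, test all of them” means that only the backend object changes between runs. A hedged sketch, assuming the ForwardDiff.jl and Zygote.jl backends are installed:

```julia
using DifferentiationInterface
import ForwardDiff, Zygote  # two independent AD backends

f(x) = sum(abs2, x)
x = [1.0, 2.0, 3.0]

# Identical differentiation code; only the backend argument differs.
for backend in (AutoForwardDiff(), AutoZygote())
    @show backend gradient(f, backend, x)
end
```

Swapping in another supported backend, such as Enzyme.jl or FiniteDiff.jl, only requires changing the entries of that tuple.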
Technical finesse that makes a real difference
Whether for gradients, Jacobians, or Hessians, DifferentiationInterface.jl abstracts away the complexity, supports different memory strategies, and handles sparse matrices efficiently to save compute time. A built-in testing and benchmarking toolkit further ensures quality and performance.
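For example, first- and second-order operators share the same calling pattern, and in-place variants let users reuse preallocated memory. This is a sketch based on the documented operator names; exact signatures can vary between package versions:

```julia
using DifferentiationInterface
import ForwardDiff

f(x) = sum(abs2, x)              # scalar-valued: gradient and Hessian
g(x) = [x[1] * x[2], sin(x[2])]  # vector-valued: Jacobian
x = [1.0, 2.0]
backend = AutoForwardDiff()

gradient(f, backend, x)   # [2.0, 4.0]
hessian(f, backend, x)    # 2×2 diagonal matrix with 2.0 on the diagonal
jacobian(g, backend, x)   # 2×2 Jacobian matrix

# In-place variant: writes into a preallocated array instead of allocating.
grad = similar(x)
gradient!(f, grad, backend, x)
```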
A tool for the open science community
Launched only in 2024, DifferentiationInterface.jl has already become a cornerstone of the Julia ecosystem. As of January 2026, around 1,200 Julia software packages rely on it, and the tool is downloaded roughly 50,000 times per month — clear indicators of its growing importance for the machine learning community working with Julia.
The award from France’s Ministry of Higher Education, Research and Space underscores that scientific progress depends not only on groundbreaking ideas, but also on reliable, shared infrastructure — and on a strong commitment to openness.
France’s Open Science Award for free research software honors projects and research teams that develop and disseminate open-source tools and help build a digital commons of critical importance for science.
How it started
DifferentiationInterface.jl began with a chance encounter. Hill and Dalle met in a Slack channel while discussing their shared frustration with incompatible Julia tools. That exchange sparked the idea for a common solution.
Their work grew into a three-part framework: DifferentiationInterface.jl as the unified interface, SparseConnectivityTracer.jl for sparsity pattern detection, and SparseMatrixColorings.jl for matrix coloring. Earlier efforts to unify Julia’s differentiation tools had struggled to gain broad adoption; DifferentiationInterface.jl drew on their strengths while addressing their shortcomings.
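A hedged sketch of how the three packages fit together for a sparse Jacobian, following the documented AutoSparse wrapper (keyword names assume the current ADTypes-based interface, and the banded test function is only an example):

```julia
using DifferentiationInterface
using SparseConnectivityTracer: TracerSparsityDetector
using SparseMatrixColorings: GreedyColoringAlgorithm
import ForwardDiff

# Wrap a dense backend with sparsity detection and matrix coloring.
sparse_backend = AutoSparse(
    AutoForwardDiff();
    sparsity_detector=TracerSparsityDetector(),
    coloring_algorithm=GreedyColoringAlgorithm(),
)

f(x) = diff(x .^ 2)             # banded structure -> sparse Jacobian
x = collect(1.0:5.0)
jacobian(f, sparse_backend, x)  # stored and computed as a sparse matrix
```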
The decisive factor was close collaboration with the Julia community, which helped turn the project into a widely used standard. In 2025, the authors submitted their software to the Open Science Award and were selected among the winners.
Software Details
More award information: Prix Science Ouverte 2025
GitHub: https://github.com/JuliaDiff/DifferentiationInterface.jl
Julia Packages: https://juliapackages.com/p/differentiationinterface
Machine Learning Group Software: https://web.ml.tu-berlin.de/software/
Preprint accompanying the software: https://arxiv.org/abs/2505.05542
Further paper on sparsity detection: https://openreview.net/forum?id=GtXSN52nIW