TorchOpt: An Efficient Library for Differentiable Optimization

Abstract

Differentiable optimization algorithms often involve expensive computations of various meta-gradients. To address this, we design and implement TorchOpt, a new PyTorch-based differentiable optimization library. TorchOpt provides an expressive and unified programming interface that simplifies the implementation of explicit, implicit, and zero-order gradients. Moreover, TorchOpt has a distributed execution runtime capable of parallelizing diverse operations linked to differentiable optimization tasks across CPU and GPU devices. Experimental results demonstrate that TorchOpt achieves a 5.2× training time speedup in a cluster. TorchOpt is open-sourced at https://github.com/metaopt/torchopt and has become a PyTorch Ecosystem project.
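To illustrate the explicit-gradient interface mentioned above, here is a minimal MAML-style sketch using TorchOpt's differentiable `MetaSGD` optimizer. The toy model, data, step count, and learning rate are illustrative assumptions, not values from the paper; the overall pattern (a differentiable optimizer wrapping a module, inner `step(loss)` updates, then meta-gradients taken through them) follows TorchOpt's documented usage, but consult the repository for the current API.

```python
import torch
import torch.nn as nn
import torchopt

# Toy inner model and data (placeholders, not from the paper).
net = nn.Linear(4, 1)
x, y = torch.randn(8, 4), torch.randn(8, 1)
x_val, y_val = torch.randn(8, 4), torch.randn(8, 1)

# Keep references to the initial (meta) parameters before adaptation.
meta_params = tuple(net.parameters())

# Differentiable optimizer: inner updates stay on the autograd graph.
inner_opt = torchopt.MetaSGD(net, lr=0.1)

# Inner-loop adaptation.
for _ in range(3):
    inner_loss = nn.functional.mse_loss(net(x), y)
    inner_opt.step(inner_loss)

# Outer (meta) objective on held-out data.
outer_loss = nn.functional.mse_loss(net(x_val), y_val)

# Explicit meta-gradient of the outer loss w.r.t. the initial parameters,
# differentiating through the inner optimization steps.
meta_grads = torch.autograd.grad(outer_loss, meta_params)
```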

Publication
In Journal of Machine Learning Research
Jie Ren
Collaborator
Yao Fu
PhD Student (Winner of 2024 Rising Star in ML & Systems)
Luo Mai
Assistant Professor

