Welcome to CDOpt
A Python toolbox for optimization on Riemannian manifolds with support for deep learning
Riemannian optimization is a powerful framework for tackling nonlinear optimization problems with structural equality constraints. By transforming Riemannian optimization problems into the minimization of constraint dissolving functions, CDOpt allows various unconstrained optimization approaches to be implemented elegantly and applied directly to Riemannian optimization problems. CDOpt also provides user-friendly frameworks for training manifold-constrained neural networks with PyTorch and Flax.
CDOpt has the following key features:
Dissolved constraints: CDOpt transforms Riemannian optimization problems into equivalent unconstrained optimization problems. Therefore, we can utilize various highly efficient solvers for unconstrained optimization and apply them directly to Riemannian optimization problems (see the sketch after this feature list). Benefiting from the rich expertise gained over decades on unconstrained optimization, CDOpt is very efficient and naturally avoids the difficulties of extending unconstrained optimization solvers to their Riemannian versions.
High compatibility: CDOpt is highly compatible with various numerical backends, including NumPy, SciPy, PyTorch, JAX, and Flax. Users can directly apply the advanced features of these packages to accelerate optimization, including automatic differentiation, GPU/TPU support, distributed optimization frameworks, and just-in-time (JIT) compilation.
Customized constraints: CDOpt dissolves manifold constraints without involving any geometric structure of the manifold in question. Therefore, users can define various Riemannian manifolds in CDOpt directly through their constraint expressions \(c(x)\).
Plug-in neural layers: CDOpt provides various plug-in neural layers for the PyTorch and Flax packages. With minor changes to standard PyTorch/Flax code, users can easily build and train neural networks with various manifold constraints; a minimal usage sketch appears after the dissolved-constraint example below.
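As a concrete illustration of the dissolved-constraint idea, the sketch below builds a constraint dissolving function by hand for the Stiefel manifold \(\{X \in \mathbb{R}^{n \times p} : X^\top X = I_p\}\) and minimizes it with SciPy's general-purpose L-BFGS-B solver. This is a conceptual sketch rather than CDOpt's API: the objective, dimensions, the constraint dissolving mapping \(\mathcal{A}(X) = X(1.5 I_p - 0.5 X^\top X)\), and the penalty parameter \(\beta\) are illustrative choices, and \(\beta\) must be sufficiently large for the unconstrained problem to share stationary points with the Riemannian one.

```python
# Conceptual sketch (not CDOpt's API): dissolve the Stiefel constraint
# X^T X = I_p into a smooth unconstrained problem and hand it to SciPy.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, p, beta = 30, 5, 50.0          # illustrative sizes and penalty parameter
B = rng.standard_normal((n, n))
B = B + B.T                       # symmetric data matrix

def f(X):                         # objective on the manifold: -tr(X^T B X)
    return -np.trace(X.T @ B @ X)

def A(X):                         # constraint dissolving mapping for the Stiefel manifold
    return X @ (1.5 * np.eye(p) - 0.5 * (X.T @ X))

def h(x_vec):                     # constraint dissolving function: f(A(X)) + beta/2 * ||X^T X - I||_F^2
    X = x_vec.reshape(n, p)
    c = X.T @ X - np.eye(p)
    return f(A(X)) + 0.5 * beta * np.sum(c ** 2)

# Feasible starting point; gradients are approximated by finite differences for brevity.
x0 = np.linalg.qr(rng.standard_normal((n, p)))[0].reshape(-1)
res = minimize(h, x0, method="L-BFGS-B")      # any unconstrained solver can be plugged in here
X_opt = res.x.reshape(n, p)
print("feasibility ||X^T X - I||_F:", np.linalg.norm(X_opt.T @ X_opt - np.eye(p)))
```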
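The sketch below illustrates the plug-in layer pattern in PyTorch: swap a standard layer for its CDOpt counterpart and add a quadratic penalty term to the loss. The names Linear_cdopt, stiefel_torch, get_quad_penalty, and the manifold_class argument follow CDOpt's examples but should be treated as assumptions; consult the Tutorials and API Reference below for the exact signatures.

```python
# Sketch of the plug-in layer pattern (names assumed from CDOpt's examples;
# check the API Reference for the exact signatures).
import torch
from cdopt.nn import Linear_cdopt, get_quad_penalty
from cdopt.manifold_torch import stiefel_torch

class Net(torch.nn.Module):
    def __init__(self):
        super().__init__()
        # The weight of this layer is constrained to the Stiefel manifold.
        self.fc1 = Linear_cdopt(784, 128, manifold_class=stiefel_torch)
        self.fc2 = torch.nn.Linear(128, 10)

    def forward(self, x):
        return self.fc2(torch.relu(self.fc1(x)))

model = Net()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
x, y = torch.randn(64, 784), torch.randint(0, 10, (64,))

optimizer.zero_grad()
pred = model(x)
# Standard loss plus the quadratic penalty that dissolves the manifold constraint.
loss = torch.nn.functional.cross_entropy(pred, y) + get_quad_penalty(model)
loss.backward()
optimizer.step()
```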
- Overview
- Installation
- Tutorials
- Examples
- Optimization via SciPy
- Training Neural Networks with Manifold Constraints via PyTorch
- Training LeNet with Constrained Convolution Kernels
- Training Single-Layer RNN with Constrained Weights
- Training Multi-Layer RNN with Constrained Weights
- Training LSTM with Constrained Weights
- Time Sequence Prediction with Orthogonality Constrained LSTM
- Distributed Training for RNN with Constrained Weights
- Distributed Training for A Simple Network by Distributed RPC Framework
- Training Neural Networks with Manifold Constraints via JAX and Flax
- API Reference
- About CDOpt
- Update log