Short examples that illustrate basic features of CVXOPT.
Examples from the book Convex Optimization by Boyd and Vandenberghe.
- Optimal trade-off curve for a regularized least-squares problem (fig. 4.11)
- Risk-return trade-off (fig. 4.12)
- Penalty function approximation (fig. 6.2)
- Robust regression (fig. 6.5)
- Input design (fig. 6.6)
- Sparse regressor selection (fig. 6.7)
- Quadratic smoothing (fig. 6.8-6.10)
- Total variation reconstruction (fig. 6.11-6.14)
- Stochastic and worst-case robust approximation (fig. 6.15-6.16)
- Polynomial and spline fitting (fig. 6.19-6.20)
- Basis pursuit (fig. 6.21-6.23)
- Least-squares fit of a convex function (fig. 6.24)
- Consumer preference analysis (fig. 6.25-6.26)
- Logistic regression (fig. 7.1)
- Maximum entropy distribution (fig. 7.2-7.3)
- Chebyshev bounds (fig. 7.6-7.7)
- Chernoff lower bound (fig. 7.8)
- Experiment design (fig. 7.9-7.11)
- Ellipsoidal approximations (fig. 8.3-8.4)
- Centers of polyhedra (fig. 8.5-8.7)
- Approximate linear discrimination (fig. 8.10-8.12)
- Linear, quadratic, and fourth-order placement (fig. 8.15-8.17)
- Floor planning example (fig. 8.20)
Custom interior-point solvers
Examples from the book chapter Interior-point methods for large-scale cone programming by M. S. Andersen, J. Dahl, Z. Liu, and L. Vandenberghe, in: S. Sra, S. Nowozin, and S. J. Wright (editors), Optimization for Machine Learning, MIT Press, 2011.
The code for nuclear norm approximation can be found here.
Useful Python scripts that are not included in the distribution.