GSoC’20 - A summer full of optimization and Julia


Latest updates on the work done, and post-GSoC tasks

Hi all! This is the final post in the series marking my progress on Differentiable Optimization Problems. You may enjoy:

  1. Reading my first blog.
  2. Checking the code repository here.
  3. Reading the docs @ https://aks1996.github.io/DiffOpt.jl/dev/.

The project - progress

Milestones completed:

  1. Using MatrixOptInterface.jl as a dependency in DiffOpt.jl - PR#37
  2. Fix MathOptSetDistances.jl dependency - PR#35
  3. Support sparse structures in DiffOpt - PR#47. The biggest bottleneck in supporting sparsity was the constraint matrix A; thanks to @blegat, I drew inspiration from an SCS.jl PR
  4. Minor updates in converting MOI model to MatrixOptInterface.jl model - PR#7
  5. Updated docs with manual and examples

What do we mean by differentiating a program?

For a primer on differentiable optimization, refer to the introduction page in the documentation or this TF Dev Summit’20 video.
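To make the idea concrete, here is a minimal sketch (not the DiffOpt.jl API - just plain NumPy, with illustrative names `Q` and `p`) of what "differentiating a program" means for the simplest case, an unconstrained quadratic program. The solution map has a closed form, and implicit differentiation of the optimality condition gives its Jacobian:

```python
import numpy as np

# Unconstrained QP: x*(p) = argmin_x 0.5 x'Qx + p'x  =>  Qx* + p = 0  =>  x* = -Q^{-1} p.
# Differentiating the optimality condition Qx* + p = 0 w.r.t. p gives the
# Jacobian dx*/dp = -Q^{-1} (implicit function theorem).

Q = np.array([[2.0, 0.5],
              [0.5, 1.0]])          # symmetric positive definite
p = np.array([1.0, -1.0])

x_star = np.linalg.solve(Q, -p)     # the optimal solution
jac = -np.linalg.inv(Q)             # analytic sensitivity dx*/dp

# Finite-difference check of the Jacobian, column by column
eps = 1e-6
fd = np.column_stack([
    (np.linalg.solve(Q, -(p + eps * e)) - x_star) / eps
    for e in np.eye(2)
])
print(np.allclose(jac, fd, atol=1e-5))  # True
```

The same principle - differentiate the optimality conditions rather than unroll the solver - is what makes constrained and conic programs differentiable too.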

What have we achieved? How can you use it?

As of now, one can differentiate:

  • convex conic programs (with linear objectives), and
  • convex quadratic programs (with affine constraints)

written in MOI. The theoretical models behind the two methods come, respectively, from Agrawal et al., “Differentiating through a cone program” (2019), and Amos & Kolter, “OptNet: differentiable optimization as a layer in neural networks” (2017).

I’ve included some examples (for both methods) in the documentation, plus plenty of examples (with reference CVXPY code) in the tests folder. For a primer on matrix inversion, I would suggest this example. If you run into any problems, feel free to open an issue.
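For the constrained case, here is a hedged NumPy sketch (again not the DiffOpt.jl API; `Q`, `p`, `A`, `b` are illustrative data) of the QP-differentiation idea: solve the KKT system once, then reuse the same KKT matrix to get sensitivities of the solution with respect to a problem parameter - here, the right-hand side `b`:

```python
import numpy as np

# Equality-constrained QP:  min 0.5 x'Qx + p'x   s.t.  Ax = b.
# KKT system:  [[Q, A'], [A, 0]] [x; lam] = [-p; b].
# Differentiating the KKT system w.r.t. b gives [dx/db; dlam/db] = K^{-1} [0; I].

Q = np.diag([2.0, 1.0, 3.0])
p = np.array([1.0, 0.0, -1.0])
A = np.array([[1.0, 1.0, 1.0]])
b = np.array([1.0])

n, m = Q.shape[0], A.shape[0]
K = np.block([[Q, A.T],
              [A, np.zeros((m, m))]])
sol = np.linalg.solve(K, np.concatenate([-p, b]))
x_star, lam = sol[:n], sol[n:]

# Sensitivity of the primal solution to b: solve K * d = [0; I]
dsol_db = np.linalg.solve(K, np.vstack([np.zeros((n, m)), np.eye(m)]))
dx_db = dsol_db[:n]

# Finite-difference verification
eps = 1e-6
sol2 = np.linalg.solve(K, np.concatenate([-p, b + eps]))
fd = (sol2[:n] - x_star) / eps
print(np.allclose(dx_db.ravel(), fd, atol=1e-5))  # True
```

In practice the KKT matrix is large and sparse, which is exactly why supporting sparse structures (PR#47 above) matters.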

Post-GSoC improvements

Although we’re approaching the last week of GSoC, I’ll make sure to keep improving on the following things post-GSoC too.

  1. Making the code independent of SCS.jl (Issue#38) - although we moved the SCS-specific code into MatrixOptInterface.jl, part of the differentiation still relies on SCS-specific code. Removing this dependency should generalize differentiation to any available conic solver
  2. Derivatives via ChainRules.jl - this is one of the foremost suggestions from @matbesancon; although we couldn’t fit AD integration into the GSoC timeline, we’ve begun discussions with the ChainRules.jl community
  3. Time benchmarking - it’ll be really cool to make DiffOpt.jl fast. I’ve started a beginner PR to profile computation time - PR#40
  4. From MOI to JuMP - since the code already works with the MOI backend, it shouldn’t be hard to differentiate JuMP models too. I’m interested in improving the interface and API usage through JuMP
  5. Many specific improvements in MathOptSetDistances.jl and MatrixOptInterface.jl

Final thoughts

I had a really good experience with the GSoC project. As I’ve already mentioned in my previous blog, the JuMP developer community is indeed welcoming and helpful.

A shout-out to the Julia project maintainers, the JuMP developer community, and my mentors!

It wouldn’t be wrong to say that at times I was unsure how to make an effective segue from optimization theory to working models. We were able to develop the codebase thanks to the biweekly GSoC standups, the JuMP developer calls, numerous issue/PR discussions, and Slack threads.

100+ commits, 20+ PRs, and 3000+ lines of code - and counting

Finally, I would like to thank the Google Summer of Code team for providing us this great learning opportunity amid the COVID pandemic.
