├── README.md
├── LICENSE
└── ideas-list.md

--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------

# JuMP-NumFOCUS

[JuMP](http://www.juliaopt.org) will be participating as a sub-organization in [NumFOCUS](http://numfocus.org/)'s application for [Google Summer of Code 2019](https://summerofcode.withgoogle.com/). For more information about this application, see:
- [NumFOCUS's main GSoC page](https://github.com/numfocus/gsoc)
- [JuMP's project ideas list](https://github.com/JuliaOpt/GSOC2019/blob/master/ideas-list.md)

--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------

MIT License

Copyright (c) 2019 JuliaOpt

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
--------------------------------------------------------------------------------
/ideas-list.md:
--------------------------------------------------------------------------------

# JuMP

JuMP is a modeling interface and a collection of supporting packages for mathematical optimization that is embedded in Julia. With JuMP, users formulate various classes of optimization problems with easy-to-read code, and then solve these problems using state-of-the-art open-source and commercial solvers. JuMP also makes advanced optimization techniques easily accessible from a high-level language.

## Mentors

- [Chris Coey](https://github.com/chriscoey)
- [Carleton Coffrin](https://github.com/ccoffrin)
- [Joaquim Dias Garcia](https://github.com/joaquimg)
- [Oscar Dowson](https://github.com/odow)
- [Lea Kapelevich](https://github.com/lkapelevich)
- [Benoît Legat](https://github.com/blegat)
- [Miles Lubin](https://github.com/mlubin)
- [Sascha Timme](https://github.com/saschatimme)
- [Juan Pablo Vielma](https://github.com/juan-pablo-vielma)


## Project Ideas

### Solver Tests and Benchmarks

#### Abstract

MathOptInterface is a very general way of representing a mathematical optimization problem. This project will develop infrastructure to test the correctness of, and benchmark, MathOptInterface-compatible solvers in Julia.


| **Intensity** | **Priority** | **Involves** | **Mentors** |
| ------------- | ------------ | ------------ | ----------- |
| Moderate | Medium | Converting existing benchmark libraries into MathOptFormat. Developing infrastructure to make testing and benchmarking MathOptInterface-compatible solvers easy. | [Chris Coey](https://github.com/chriscoey), [Carleton Coffrin](https://github.com/ccoffrin) and [Oscar Dowson](https://github.com/odow) |

#### Technical Details

The generality of [MathOptInterface's standard form](http://www.juliaopt.org/MathOptInterface.jl/dev/apimanual/) enables it to model problems that cannot be written down in existing file formats for optimization models (e.g., a mixed-integer problem with both NLP and conic constraints). To overcome this problem, a new file format, [MathOptFormat](https://github.com/odow/MathOptFormat.jl), is under development. By converting existing benchmark libraries of optimization models into MathOptFormat, we can build a centralized and diverse library of models for testing and benchmarking Julia solvers that hook into MathOptInterface.

The main goals of this project are to:
- Automate the conversion of existing benchmark libraries such as [CBLib](http://cblib.zib.de/) and [MINLPLib](http://www.minlplib.org/) into MathOptFormat
- Make this library accessible online
- Develop Julia infrastructure that makes downloading, reading, and running these MathOptFormat instances easy for the purpose of testing and benchmarking optimization solvers in Julia
- Extend/improve [MathOptFormat](https://github.com/odow/MathOptFormat.jl) as necessary


#### Helpful Experience

- Basic knowledge of JuMP v0.19
- Prior work with optimization solvers
- Experience with testing or benchmarking numerical code


#### First steps

- Become familiar with [MathOptFormat](https://github.com/odow/MathOptFormat.jl) and various standard file formats for optimization models such as .MPS, .CBF, .NL, and .LP
- Become familiar with testing and benchmarking tools in Julia
- Become familiar with current testing and benchmarking infrastructure in [JuMP](https://github.com/JuliaOpt/JuMP.jl) and [MathOptInterface](https://github.com/JuliaOpt/MathOptInterface.jl), solvers such as [Pajarito](https://github.com/JuliaOpt/Pajarito.jl), and [MINLPLibJuMP](https://github.com/lanl-ansi/MINLPLibJuMP.jl) and [PowerModels](https://github.com/lanl-ansi/PowerModels.jl)

### New and Updated Example Notebooks

#### Abstract

With the upcoming v0.19 update to JuMP, many example notebooks need to be updated. This is also a great opportunity to add new examples and improve their accessibility.

| **Intensity** | **Priority** | **Involves** | **Mentors** |
| ------------- | ------------ | ------------ | ----------- |
| Easy | Medium | Porting notebook examples to JuMP v0.19; improving and standardizing notation, naming, and style; developing new notebooks; linking to JuMP documentation; etc. | [Chris Coey](https://github.com/chriscoey), [Joaquim Dias Garcia](https://github.com/joaquimg) and [Lea Kapelevich](https://github.com/lkapelevich) |

#### Technical Details

The soon-to-be-released version 0.19 of [JuMP](https://github.com/JuliaOpt/JuMP.jl) introduces many syntax changes that will require updates to many [example notebooks](https://github.com/JuliaOpt/juliaopt-notebooks). Many JuMP extensions such as [PolyJuMP](https://github.com/JuliaOpt/PolyJuMP.jl) could also benefit from new example notebooks.
This is also a good opportunity to improve these examples in various ways, such as:
- Standardizing the naming and structure of examples to enforce the [JuMP style guidelines](http://www.juliaopt.org/JuMP.jl/dev/style/)
- Linking to parts of specific examples from the [JuMP documentation](http://www.juliaopt.org/JuMP.jl/dev/index.html), and trying to cover all important JuMP functionality with examples
- Adding examples for all [major solvers](http://www.juliaopt.org/#solvers) and linking to the examples from solver readmes
- Making it easy for people to try out JuMP examples with open-source solvers in the browser without downloading Julia
- Using tools like [Weave](https://github.com/mpastell/Weave.jl) to facilitate the development and deployment of the examples


#### Helpful Experience

- Basic knowledge of JuMP v0.19
- Basic knowledge of mathematical optimization models and methods
- Basic knowledge of notebook-related tools like [Weave](https://github.com/mpastell/Weave.jl)

#### First steps

- Become familiar with the updated syntax of JuMP 0.19, its [documentation](http://www.juliaopt.org/JuMP.jl/dev/index.html) and [style guidelines](http://www.juliaopt.org/JuMP.jl/dev/style/)
- Become familiar with existing JuMP [example notebooks](https://github.com/JuliaOpt/juliaopt-notebooks)
- Become familiar with tools like [Weave](https://github.com/mpastell/Weave.jl)
- Make a small update to an existing notebook or develop a simple new notebook, and submit a pull request to [PolyJuMP](https://github.com/JuliaOpt/PolyJuMP.jl) on GitHub
- Explore JuMP extensions such as [PolyJuMP](https://github.com/JuliaOpt/PolyJuMP.jl)


### Experiments with Automatic Differentiation (AD) Backends for JuMP

#### Abstract

JuMP's automatic differentiation (AD) library for computing derivatives of nonlinear models was written before the recent emergence of advanced AD libraries in Julia. JuMP's AD imposes a number of [syntax limitations](http://www.juliaopt.org/JuMP.jl/0.18/nlp.html#syntax-notes) that are difficult to address within the current framework. Performance could also be improved. Hence, we would like to explore whether other libraries have the potential to integrate with JuMP as replacement AD backends.

Mentors: [Miles Lubin](https://github.com/mlubin) and [Juan Pablo Vielma](https://github.com/juan-pablo-vielma)

#### Technical Details

The goal of this project is to evaluate whether existing AD libraries in Julia can serve as replacements for JuMP's AD. This involves implementing benchmarks for gradient and Hessian computations in multiple backends and comparing performance and functionality.

Stretch goals:
- A prototype translation layer between JuMP and a new AD backend, or
- A frontend to [MOI](https://github.com/JuliaOpt/MathOptInterface.jl) nonlinear solvers using a different AD backend.
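To get a feel for what these backends do under the hood, forward-mode AD (the technique behind ForwardDiff) can be sketched in a few lines of plain Julia using dual numbers. The `Dual` type and `derivative` helper below are illustrative toys for this ideas list, not part of JuMP or any of the libraries mentioned:

```julia
# Toy forward-mode AD via dual numbers (a + b*ε with ε² = 0).
# Illustrative only -- real backends such as ForwardDiff are far more general.
struct Dual
    val::Float64  # function value
    der::Float64  # derivative value
end

# Propagate derivatives through the basic operations we need.
Base.:+(a::Dual, b::Dual) = Dual(a.val + b.val, a.der + b.der)
Base.:*(a::Dual, b::Dual) = Dual(a.val * b.val, a.der * b.val + a.val * b.der)
Base.sin(a::Dual) = Dual(sin(a.val), cos(a.val) * a.der)

# Seed the derivative slot with 1.0, evaluate f, and read off the derivative.
derivative(f, x) = f(Dual(x, 1.0)).der

# f(x) = x*x + sin(x) has f'(x) = 2x + cos(x), so f'(0) = 1.
derivative(x -> x * x + sin(x), 0.0)  # → 1.0
```

A real backend comparison would measure gradients and Hessians of full models like `acpower` and `clnlbeam`, but this sketch shows the operator-overloading idea the candidate libraries build on.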
#### Helpful Experience

- Familiarity with JuMP's nonlinear optimization interface
- Familiarity with some of Julia's automatic differentiation libraries, including:
  - [Cassette](https://github.com/jrevels/Cassette.jl)
  - [ChainRules](https://github.com/JuliaDiff/ChainRules.jl)
  - [Flux](https://github.com/FluxML)
  - [Zygote](https://github.com/FluxML/Zygote.jl)
  - [ForwardDiff](https://github.com/JuliaDiff/ForwardDiff.jl)
  - [ReverseDiff](https://github.com/JuliaDiff/ReverseDiff.jl)
  - [Nabla](https://github.com/invenia/Nabla.jl)

#### First steps

- Watch Miles's [JuliaCon talk](https://youtu.be/xtfNug-htcs) on JuMP's AD
- Read Section 5 of the [JuMP paper](https://mlubin.github.io/pdf/jump-sirev.pdf)
- Update the `acpower` and `clnlbeam` [benchmarks](https://github.com/mlubin/JuMPSupplement) for JuMP 0.19
- Implement these two benchmarks in another Julia AD library for a comparison of performance

### MutableArithmetics

#### Abstract

Create a MutableArithmetics (MA) package in which mutable versions of the arithmetic operations are defined. This package allows 1) mutable types to implement a mutable arithmetic, and 2) algorithms that could exploit mutable arithmetic to do so while still being completely generic.

| **Intensity** | **Priority** | **Involves** | **Mentors** |
| ------------- | ------------ | ------------ | ----------- |
| Hard | Medium | Creating/documenting/testing a new API. Benchmarking code. | [Benoît Legat](https://github.com/blegat) and [Sascha Timme](https://github.com/saschatimme) |

#### Technical Details

Julia defines standard arithmetic functions such as `+`, `-`, `*`, `/`, …, which allows generic code to be usable by any type implementing the appropriate methods.
For instance, `sum(::Vector{T})` works as long as `+(::T, ::T)` is defined.
However, this generic implementation relying on `+` may not be optimal if `T` is a mutable type.
For instance, if `T` is `JuMP.VariableRef` and the vector has length `n`,
the generic `sum` will allocate intermediate affine expressions of length 2, 3, 4, …, `n`,
resulting in `O(n^2)` complexity.
For this reason, a [specialized method is defined in JuMP for summing affine expressions](https://github.com/JuliaOpt/JuMP.jl/blob/f46a461c126fd1a7c309fb773a00b5dc529632b9/src/operators.jl#L304-L310)
that creates an empty affine expression and mutates it by adding each expression in the sum using `JuMP.add_to_expression!`.
This method has `O(n)` complexity for summing `n` variables.
The need for such specialized methods is not specific to JuMP; it arises in general for any mutable type.
See, for instance, the [`sum` method for `BigInt`](https://github.com/JuliaLang/julia/blob/d4018df968019d38943cd5009fe469f1e97ba0b6/base/gmp.jl#L549), which uses `MPZ.add!` to accumulate the sum in the same `BigInt`.
Because there is no standardized function for summing two objects while mutating the first one,
mutable types have to create specific methods (such as `JuMP.add_to_expression!` or `MPZ.add!`) and reimplement a specialized method for each function that can take advantage of their mutability (such as `sum`).
In JuMP, the mutability of its affine and quadratic expressions is exploited in two ways:

- First, specialized methods for standard Base functions such as `sum`, matrix products, … are implemented in `src/operators.jl`.
- Second, when written inside a JuMP macro such as `@constraint`, an expression such as `3x + 2y + z - u + w` is rewritten to use the mutating function `JuMP.destructive_add!` (see `src/parse_expr.jl`).
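The `BigInt` case above can be reproduced directly. The sketch below, which relies only on Base's GMP bindings, contrasts a generic accumulation that allocates a fresh `BigInt` at every step with an in-place accumulation via `Base.GMP.MPZ.add!`; the helper names `sum_generic` and `sum_mutating` are illustrative, not part of any package:

```julia
# Generic sum: `s + x` allocates a brand-new BigInt on every iteration,
# just like the generic `sum` would for any type with only `+` defined.
function sum_generic(v::Vector{BigInt})
    s = big(0)
    for x in v
        s = s + x
    end
    return s
end

# Mutating sum: accumulate into `s` in place with GMP's add!,
# the same trick Base's specialized BigInt `sum` uses.
function sum_mutating(v::Vector{BigInt})
    s = big(0)
    for x in v
        Base.GMP.MPZ.add!(s, x)  # s += x without allocating a new BigInt
    end
    return s
end

v = BigInt.(1:1000)
sum_generic(v) == sum_mutating(v) == big(500500)  # → true
```

A standardized MA function, say `operate!(+, s, x)`, would let one generic loop get the in-place behavior for every mutable type, instead of each type shipping its own `add!`-style method.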
While this covers most use cases,
it does not allow generic algorithms to take advantage of the mutability of JuMP expressions, as doing so requires using either `JuMP.add_to_expression!` or `JuMP.destructive_add!`.
For instance, when the coefficients of polynomials are JuMP expressions,
the generic operations defined in the https://github.com/JuliaAlgebra/ packages are suboptimal.
Moreover, the rewriting rules used in the JuMP macros do not exploit the mutability of mutable types such as polynomials.
Therefore, in `3x + 2y + z - u + w`, if `x`, `y`, `z`, `u`, and `w` are polynomials,
since they do not implement `JuMP.add_to_expression!` or `JuMP.destructive_add!`,
the rewriting falls back to the non-mutating `+`.
In this project, a MutableArithmetics (MA) package is created in which mutable versions of the arithmetic operations are defined.
This package allows:

1. mutable types to implement a mutable arithmetic, and
2. algorithms that could exploit mutable arithmetic to do so while still being completely generic.

After creating the MA package, the student will be asked to complete some of the following tasks:

- Implement the MA API in MOIU and JuMP
- Use the MA API in `JuMP.parseExpr` instead of `JuMP.destructive_add!`
- Implement the MA API for mutable types in Base, such as `BigInt`, in MA (so that MA can test itself)
- Use MA in MultivariatePolynomials/DynamicPolynomials/TypedPolynomials: this would allow polynomial operations to be more efficient when the element type is `BigInt`, MOI functions, or JuMP functions
- Implement MA for MultivariatePolynomials/DynamicPolynomials/TypedPolynomials: this would allow polynomial arithmetic to be done efficiently in JuMP macros
- Move `JuMP.parseExpr` and related code to MA. MA should be able to define an `@expression(expr)` macro that replaces `expr` with an equivalent expression that uses MA instead of allocating many intermediate results. For example, `@expression(a + b + c)` would be rewritten as `res = copy(a); operate!(+, res, b); operate!(+, res, c)`. This would allow types that implement the MA API to benefit from this feature.

#### Helpful Experience

- Notions of computational complexity; basic knowledge of benchmarking
- Familiarity with JuMP's representation of affine and quadratic expressions
- Familiarity with https://github.com/JuliaAlgebra/MultivariatePolynomials.jl

#### First steps

- Write simple benchmarks with https://github.com/JuliaAlgebra/MultivariatePolynomials.jl and JuMP/MOI that would benefit from MA
- Read the performance tips: https://docs.julialang.org/en/v1/manual/performance-tips/index.html

### Automatic Dualization

#### Abstract

Sometimes solving the dual of an optimization problem is faster than solving the original problem. The goal of this project is to create a tool similar to the [automatic dualization features in YALMIP](https://yalmip.github.io/tutorial/automaticdualization/).

| **Intensity** | **Priority** | **Involves** | **Mentors** |
| ------------- | ------------ | ------------ | ----------- |
| Moderate | Medium | Writing [MOI](https://github.com/JuliaOpt/MathOptInterface.jl) code. Using duality theory in optimization. | [Chris Coey](https://github.com/chriscoey), [Joaquim Dias Garcia](https://github.com/joaquimg) and [Benoît Legat](https://github.com/blegat) |

#### Technical Details

Duality [plays a central role in optimization](http://web.stanford.edu/~boyd/cvxbook/).
For many solvers, the primal and dual forms are different, and converting a problem into its dual form sometimes makes it easier to solve; see the [automatic dualization features in YALMIP](https://yalmip.github.io/tutorial/automaticdualization/).

Moreover, some applications require solving the KKT conditions of an optimization problem.
This is the case for multilevel optimization, in which some constraints of an optimization problem include another optimization problem.
Many classes of problems in [Extended Mathematical Programming](https://www.gams.com/latest/docs/UG_EMP.html) (EMP), such as MPECs,
EPECs, and so on, can be easily modeled if the KKT conditions can be obtained automatically from the primal problem.
(https://www.gams.com/latest/docs/UG_EMP.html, https://github.com/xhub/EMP.jl)

#### Helpful Experience

- Basic knowledge of JuMP v0.19 and MOI
- Understanding of basic duality theory for linear and conic optimization
- Basic knowledge of mathematical optimization models and methods

#### First steps

- Become familiar with [MathOptInterface's standard form](http://www.juliaopt.org/MathOptInterface.jl/dev/apimanual/)
--------------------------------------------------------------------------------