The MOSEK Python optimizer API manual Version 7.0 (Revision 141)
CHAPTER 15. SENSITIVITY ANALYSIS

In summary, the basis type sensitivity analysis is computationally cheap but does not provide complete information. Hence, the results of the basis type sensitivity analysis should be used with care.

15.4.3 The optimal partition type sensitivity analysis

Another method for computing the complete linearity interval is called the optimal partition type sensitivity analysis. The main drawback of the optimal partition type sensitivity analysis is that it is computationally expensive compared to the basis type analysis. This type of sensitivity analysis is currently provided as an experimental feature in MOSEK.

Given the optimal primal and dual solutions to (15.1), i.e. $x^*$ and $((s_l^c)^*, (s_u^c)^*, (s_l^x)^*, (s_u^x)^*)$, the optimal objective value is given by

\[ z^* := c^T x^*. \]

The left and right shadow prices $\sigma_1$ and $\sigma_2$ for $l_i^c$ are given by this pair of optimization problems:

\[
\begin{array}{ll}
\sigma_1 = \mbox{minimize} & e_i^T s_l^c \\
\mbox{subject to} & A^T (s_l^c - s_u^c) + s_l^x - s_u^x = c, \\
 & (l^c)^T (s_l^c) - (u^c)^T (s_u^c) + (l^x)^T (s_l^x) - (u^x)^T (s_u^x) = z^*, \\
 & s_l^c, s_u^c, s_l^x, s_u^x \geq 0
\end{array}
\]

and

\[
\begin{array}{ll}
\sigma_2 = \mbox{maximize} & e_i^T s_l^c \\
\mbox{subject to} & A^T (s_l^c - s_u^c) + s_l^x - s_u^x = c, \\
 & (l^c)^T (s_l^c) - (u^c)^T (s_u^c) + (l^x)^T (s_l^x) - (u^x)^T (s_u^x) = z^*, \\
 & s_l^c, s_u^c, s_l^x, s_u^x \geq 0.
\end{array}
\]

These two optimization problems make it easy to interpret the shadow price. Indeed, if $((s_l^c)^*, (s_u^c)^*, (s_l^x)^*, (s_u^x)^*)$ is an arbitrary optimal solution, then

\[ (s_l^c)_i^* \in [\sigma_1, \sigma_2]. \]

Next, the linearity interval $[\beta_1, \beta_2]$ for $l_i^c$ is computed by solving the two optimization problems

\[
\begin{array}{ll}
\beta_1 = \mbox{minimize} & \beta \\
\mbox{subject to} & l^c + \beta e_i \leq Ax \leq u^c, \\
 & c^T x - \sigma_1 \beta = z^*, \\
 & l^x \leq x \leq u^x
\end{array}
\]

and

\[
\begin{array}{ll}
\beta_2 = \mbox{maximize} & \beta \\
\mbox{subject to} & l^c + \beta e_i \leq Ax \leq u^c, \\
 & c^T x - \sigma_2 \beta = z^*, \\
 & l^x \leq x \leq u^x.
\end{array}
\]
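The distinction between the left and right shadow prices can be illustrated numerically: at a point where the optimal objective kinks as a function of $l_i^c$, the two prices differ. The sketch below estimates them by finite differences on a tiny invented LP, using `scipy.optimize.linprog` as a stand-in solver rather than the MOSEK API, so it is an illustration of the concept, not of the dual problems above.

```python
# Numeric illustration of left and right shadow prices for a
# constraint lower bound, via finite differences on a tiny LP.
# scipy stands in for MOSEK; the problem data is invented.
from scipy.optimize import linprog

def optimal_value(b):
    """Solve  min x1 + 2*x2  s.t.  x1 + x2 >= b,  0 <= x1 <= 3,  x2 >= 0."""
    # linprog uses A_ub @ x <= b_ub, so the >= constraint is negated.
    res = linprog([1.0, 2.0], A_ub=[[-1.0, -1.0]], b_ub=[-b],
                  bounds=[(0.0, 3.0), (0.0, None)], method="highs")
    assert res.success
    return res.fun

# At b = 3 the cheap variable x1 hits its upper bound, so the optimal
# value kinks there: decreasing b is absorbed by x1 (unit cost 1),
# while increasing b must be served by x2 (unit cost 2).
b0, eps = 3.0, 1e-6
z_star = optimal_value(b0)
sigma_left  = (z_star - optimal_value(b0 - eps)) / eps   # left shadow price
sigma_right = (optimal_value(b0 + eps) - z_star) / eps   # right shadow price
print(sigma_left, sigma_right)   # approximately 1.0 and 2.0
```

At a non-degenerate point the two prices coincide; the kink at $b = 3$ is exactly the situation where the optimal partition analysis reports $\sigma_1 \neq \sigma_2$.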
The linearity intervals and shadow prices for $u_i^c$, $l_j^x$, and $u_j^x$ are computed similarly to $l_i^c$.

The left and right shadow prices for $c_j$, denoted $\sigma_1$ and $\sigma_2$ respectively, are computed as follows:

\[
\begin{array}{ll}
\sigma_1 = \mbox{minimize} & e_j^T x \\
\mbox{subject to} & l^c \leq Ax \leq u^c, \\
 & c^T x = z^*, \\
 & l^x \leq x \leq u^x
\end{array}
\]

and

\[
\begin{array}{ll}
\sigma_2 = \mbox{maximize} & e_j^T x \\
\mbox{subject to} & l^c \leq Ax \leq u^c, \\
 & c^T x = z^*, \\
 & l^x \leq x \leq u^x.
\end{array}
\]

Once again the above two optimization problems make it easy to interpret the shadow prices. Indeed, if $x^*$ is an arbitrary primal optimal solution, then

\[ x_j^* \in [\sigma_1, \sigma_2]. \]
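The inclusion $x_j^* \in [\sigma_1, \sigma_2]$ can be checked directly: $\sigma_1$ and $\sigma_2$ are the minimum and maximum of $x_j$ over the optimal set, obtained by re-solving the problem with the added constraint $c^T x = z^*$. A minimal sketch, using `scipy.optimize.linprog` as a stand-in for MOSEK and an invented LP that has multiple optimal solutions:

```python
# sigma_1 and sigma_2 for a cost coefficient c_j: the min and max of
# x_j over the optimal set {x feasible : c^T x = z*}.  scipy stands
# in for MOSEK; the LP data is invented for illustration.
from scipy.optimize import linprog

c = [1.0, 1.0]
A_ub, b_ub = [[-1.0, -1.0]], [-2.0]      # x1 + x2 >= 2
bounds = [(0.0, 1.5), (0.0, 1.5)]

# Optimal value; every point with x1 + x2 = 2 is optimal here.
z_star = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds,
                 method="highs").fun

def extreme_xj(j, sense):
    """min (sense=+1) or max (sense=-1) of x_j over the optimal set."""
    obj = [0.0] * len(c)
    obj[j] = sense
    res = linprog(obj, A_ub=A_ub, b_ub=b_ub, A_eq=[c], b_eq=[z_star],
                  bounds=bounds, method="highs")
    return sense * res.fun

sigma1 = extreme_xj(0, +1.0)   # left shadow price for c_1
sigma2 = extreme_xj(0, -1.0)   # right shadow price for c_1
print(sigma1, sigma2)          # x1 ranges over [0.5, 1.5] at optimality
```

Any particular optimal $x_1^*$ the solver returns lies inside $[\sigma_1, \sigma_2]$; when the optimum is unique the interval collapses to a point.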
The linearity interval $[\beta_1, \beta_2]$ for $c_j$ is computed as follows:

\[
\begin{array}{ll}
\beta_1 = \mbox{minimize} & \beta \\
\mbox{subject to} & A^T (s_l^c - s_u^c) + s_l^x - s_u^x = c + \beta e_j, \\
 & (l^c)^T (s_l^c) - (u^c)^T (s_u^c) + (l^x)^T (s_l^x) - (u^x)^T (s_u^x) - \sigma_1 \beta \leq z^*, \\
 & s_l^c, s_u^c, s_l^x, s_u^x \geq 0
\end{array}
\]

and

\[
\begin{array}{ll}
\beta_2 = \mbox{maximize} & \beta \\
\mbox{subject to} & A^T (s_l^c - s_u^c) + s_l^x - s_u^x = c + \beta e_j, \\
 & (l^c)^T (s_l^c) - (u^c)^T (s_u^c) + (l^x)^T (s_l^x) - (u^x)^T (s_u^x) - \sigma_2 \beta \leq z^*, \\
 & s_l^c, s_u^c, s_l^x, s_u^x \geq 0.
\end{array}
\]
15.4.4 Example: Sensitivity analysis

As an example we will use the following transportation problem. Consider the problem of minimizing the transportation cost between a number of production plants and stores. Each plant supplies a number of goods and each store has a given demand that must be met. Supply, demand and cost of transportation per unit are shown in Figure 15.2. If we denote the number of goods transported from location $i$ to location $j$ by $x_{ij}$, the problem can be formulated as the linear optimization problem

\[
\begin{array}{ll}
\mbox{minimize} & 1x_{11} + 2x_{12} + 5x_{23} + 2x_{24} + 1x_{31} + 2x_{33} + 1x_{34} \\
\mbox{subject to} &
\end{array}
\]
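A sketch of this transportation model set up in code. The arc costs are taken from the objective above; the supply and demand figures below are assumed values, since Figure 15.2 is not reproduced here, and `scipy.optimize.linprog` stands in for the MOSEK API.

```python
# Transportation LP: ship goods from plants to stores at minimal cost.
# Arc costs come from the objective in the text; supply/demand numbers
# are assumed (Figure 15.2 is not available), and scipy stands in for
# MOSEK.
from scipy.optimize import linprog

# Arcs (plant i -> store j) and their unit costs, in the order
# x11, x12, x23, x24, x31, x33, x34 used in the objective.
arcs = [(1, 1), (1, 2), (2, 3), (2, 4), (3, 1), (3, 3), (3, 4)]
cost = [1.0, 2.0, 5.0, 2.0, 1.0, 2.0, 1.0]
supply = {1: 400.0, 2: 1200.0, 3: 1000.0}            # assumed capacities
demand = {1: 800.0, 2: 100.0, 3: 500.0, 4: 500.0}    # assumed demands

# Supply rows: total flow out of each plant <= its capacity.
A_ub = [[1.0 if i == p else 0.0 for (i, j) in arcs] for p in supply]
b_ub = list(supply.values())
# Demand rows: total flow into each store = its demand.
A_eq = [[1.0 if j == s else 0.0 for (i, j) in arcs] for s in demand]
b_eq = list(demand.values())

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0.0, None)] * len(arcs), method="highs")
print(res.fun)   # minimal transportation cost for the assumed data
```

With this assumed data the model is feasible (total supply 2600 covers total demand 1900), and once solved, the bound and cost-coefficient sensitivities of the preceding sections can be applied to its supply and demand rows.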