v2010.10.26 - Convex Optimization
Figure 5: Swiss roll from Weinberger & Saul [372]. The problem of manifold learning, illustrated for N = 800 data points sampled from a "Swiss roll" (1). A discretized manifold is revealed by connecting each data point and its k = 6 nearest neighbors (2). An unsupervised learning algorithm unfolds the Swiss roll while preserving the local geometry of nearby data points (3). Finally, the data points are projected onto the two-dimensional subspace that maximizes their variance, yielding a faithful embedding of the original manifold (4).
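The four stages in the caption can be sketched in code. Note one caveat: the unfolding in step (3) of Weinberger & Saul's method is itself a semidefinite program (maximum variance unfolding) and is omitted here; this sketch shows only the surrounding stages, the k-nearest-neighbor graph of step (2) and the maximum-variance projection of step (4). Applied to the raw points, the projection alone will not flatten the roll. All names below are illustrative, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Step (1): sample N = 800 points from a "Swiss roll" surface.
N = 800
t = 1.5 * np.pi * (1 + 2 * rng.random(N))       # roll parameter
height = 21 * rng.random(N)
X = np.column_stack([t * np.cos(t), height, t * np.sin(t)])

# Step (2): connect each point to its k = 6 nearest neighbors.
k = 6
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
neighbors = np.argsort(D, axis=1)[:, 1:k + 1]   # skip self at index 0

# Step (4): project onto the two-dimensional subspace of maximum
# variance (ordinary PCA via SVD of the centered data).  The SDP
# unfolding of step (3) would be applied before this projection.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Y = Xc @ Vt[:2].T                                # N x 2 embedding

print(neighbors.shape, Y.shape)
```

The neighbor graph is what makes the learning "local": only distances between adjacent points on the discretized manifold are preserved by the unfolding step.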
(biorthogonal expansion) is examined. It is shown that coordinates are unique in any conic system whose basis cardinality equals or exceeds spatial dimension. The conic analogue to linear independence, called conic independence, is introduced as a tool for study, analysis, and manipulation of cones; a natural extension and next logical step in the progression: linear, affine, conic. We explain conversion between halfspace- and vertex-description of a convex cone, we motivate the dual cone and provide formulae for finding it, and we show how first-order optimality conditions or alternative systems of linear inequality or linear matrix inequality can be explained by dual generalized inequalities with respect to convex cones. Arcane theorems of alternative generalized inequality are, in fact, simply derived from cone membership relations; generalizations of the algebraic Farkas' lemma translated to the geometry of convex cones.

Any convex optimization problem can be visualized geometrically. Desire to visualize in high dimension [Sagan, Cosmos: The Edge of Forever, 22:55′] is deeply embedded in the mathematical psyche. [1] Chapter 2 provides tools to make visualization easier, and we teach how to visualize in high dimension. The concepts of face, extreme point, and extreme direction of a convex Euclidean body are explained here; crucial to understanding convex optimization. How to find the smallest face of any pointed closed convex cone containing a convex set C, for example, is divulged; later shown to have practical application to presolving convex programs. The convex cone of positive semidefinite matrices, in particular, is studied in depth:

- We interpret, for example, the inverse image of the positive semidefinite cone under affine transformation. (Example 2.9.1.0.2)

- Subsets of the positive semidefinite cone, discriminated by rank exceeding some lower bound, are convex. In other words, high-rank subsets of the positive semidefinite cone boundary united with its interior are convex. (Theorem 2.9.2.9.3) There is a closed form for projection on those convex subsets.

- The positive semidefinite cone is a circular cone in low dimension, while Geršgorin discs specify inscription of a polyhedral cone into that positive semidefinite cone. (Figure 48)

Chapter 3, Geometry of convex functions, observes Fenchel's analogy between convex sets and functions: We explain, for example, how the real affine function relates to convex functions as the hyperplane relates to convex sets.
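The closed-form projection mentioned above can be illustrated in its simplest, rank-unconstrained instance: Euclidean (Frobenius-norm) projection of a symmetric matrix onto the positive semidefinite cone, obtained by eigendecomposing and clipping the negative eigenvalues at zero. This is a standard result, sketched here in numpy rather than the book's notation; the theorem cited concerns the more general rank-bounded subsets.

```python
import numpy as np

def proj_psd(A):
    """Frobenius-norm projection of a matrix onto the positive
    semidefinite cone: take the symmetric part, eigendecompose,
    and clip negative eigenvalues at zero."""
    S = (A + A.T) / 2                       # symmetric part
    w, V = np.linalg.eigh(S)
    return (V * np.clip(w, 0, None)) @ V.T

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 5))
P = proj_psd(A)                             # nearest PSD matrix to sym(A)
```

For a rank-bounded subset one would instead keep only eigenvalues above the bound's threshold; the eigendecomposition structure of the projection is the same.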