Convex Optimization
Construct H: Random Sensing. α is s-sparse and m measurements are selected uniformly at random from an ensemble. If m ≥ C·s·µ²_{H,Φ}·log n, then minimizing (P_eq) reconstructs α exactly with overwhelming probability.

Compressible signals/images and ℓ2–ℓ1 instance optimality: the (P_eq) solution recovers the s largest entries.

Stability to noise y = HΦα + ε: the decoder ∆_{Φ,σ}(y) solving

(P_σ): min ‖α‖_{ℓ1} s.t. ‖y − HΦα‖_{ℓ2} ≤ σ

is ℓ2–ℓ1 instance optimal up to a factor of the noise standard deviation σ.

Stanford seminar 08-5
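The exact-recovery claim can be checked numerically. The sketch below (an illustration, not the seminar's code) casts (P_eq), min ‖α‖₁ s.t. Aα = y with A = HΦ, as a linear program via the standard split α = u − v with u, v ≥ 0, and solves it with SciPy; the problem sizes and the Gaussian ensemble are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

# Random sensing demo: recover an s-sparse alpha from m << n random
# measurements by solving (P_eq): min ||alpha||_1  s.t.  A alpha = y.
# Cast as an LP through the split alpha = u - v with u, v >= 0.
rng = np.random.default_rng(0)
n, m, s = 60, 30, 4                         # illustrative sizes
A = rng.standard_normal((m, n)) / np.sqrt(m)  # random Gaussian ensemble
alpha = np.zeros(n)
alpha[rng.choice(n, s, replace=False)] = rng.standard_normal(s)
y = A @ alpha

c = np.ones(2 * n)                 # objective: sum(u) + sum(v) = ||alpha||_1
A_eq = np.hstack([A, -A])          # equality constraint A(u - v) = y
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
alpha_hat = res.x[:n] - res.x[n:]
err = np.linalg.norm(alpha_hat - alpha) / np.linalg.norm(alpha)
```

With m well above the s·log n threshold, the LP recovers α to solver precision, matching the "exact reconstruction with overwhelming probability" statement.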
Convex analysis and operator splitting Stanford seminar 08-6
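As a concrete instance of operator splitting, here is a minimal Douglas–Rachford sketch for (P_eq), splitting the objective into f(x) = ‖x‖₁ (whose proximity operator is soft thresholding) and g(x) = indicator of the affine set {x : Ax = y} (whose proximity operator is a projection). Function names, the step size gamma, and the iteration count are illustrative assumptions, not the seminar's implementation.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximity operator of t*||.||_1: componentwise soft thresholding.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def dr_basis_pursuit(A, y, gamma=0.1, n_iter=2000):
    """Douglas-Rachford splitting for min ||x||_1 s.t. A x = y.

    Alternates the prox of the affine-constraint indicator (a projection)
    with the prox of the l1 norm (soft thresholding). Assumes A has full
    row rank.
    """
    AAt_inv = np.linalg.inv(A @ A.T)

    def proj_affine(x):
        # Euclidean projection onto {x : A x = y}.
        return x - A.T @ (AAt_inv @ (A @ x - y))

    z = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = proj_affine(z)                            # prox of g
        z = z + soft_threshold(2 * x - z, gamma) - x  # reflect, prox of f
    return proj_affine(z)

# Demo on a small random-sensing instance: recover a sparse vector exactly.
rng = np.random.default_rng(0)
n, m, s = 60, 30, 4
A = rng.standard_normal((m, n)) / np.sqrt(m)
alpha = np.zeros(n)
alpha[rng.choice(n, s, replace=False)] = rng.standard_normal(s)
y = A @ alpha
alpha_hat = dr_basis_pursuit(A, y)
err = np.linalg.norm(alpha_hat - alpha) / np.linalg.norm(alpha)
```

The iterate proj_affine(z) is always feasible, and the fixed point of the z-recursion yields a minimizer of ‖·‖₁ over the constraint set.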