Convex Optimization
Problem statement

y = HΦα + ε,  Var(ε) = σ_ε².

Consider the following minimization problems:

(P_σ):  min_α Ψ(α)  s.t.  ‖y − HΦα‖_{ℓ₂} ≤ σ
(P_eq): min_α Ψ(α)  s.t.  y = HΦα
(P_τ):  min_α ½ ‖y − HΦα‖²_{ℓ₂}  s.t.  Ψ(α) ≤ τ

where Ψ(α) = Σ_{γ∈Γ} ψ(α_γ) and ψ is a sparsity-promoting penalty.

Stanford seminar 08-14
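As a concrete illustration of this family of problems, the sketch below solves the common penalized surrogate of (P_τ) with ψ = |·| (the ℓ₁ penalty), i.e. min_α ½‖y − Aα‖²_{ℓ₂} + λ‖α‖₁ with A = HΦ, by proximal gradient descent (ISTA). This is an assumption for illustration only: the slides pose the constrained forms, and the penalty parameter `lam`, function names, and iteration count here are all hypothetical choices, not the speaker's method.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximity operator of t*|.| (soft-thresholding),
    # i.e. the prox of the l1 sparsity penalty.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(y, A, lam, n_iter=2000):
    """Proximal-gradient (ISTA) sketch for the penalized surrogate
    min_a 0.5*||y - A a||^2 + lam*||a||_1, with A = H @ Phi."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    alpha = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ alpha - y)       # gradient of the data-fidelity term
        alpha = soft_threshold(alpha - grad / L, lam / L)
    return alpha

# Small synthetic example: recover a 2-sparse vector from Gaussian measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
alpha_true = np.zeros(10)
alpha_true[[2, 7]] = [1.5, -2.0]
y = A @ alpha_true
alpha_hat = ista(y, A, lam=0.1)
```

The soft-thresholding step is exactly the proximity operator of the ℓ₁ term, which is why each iteration only needs one gradient step on the smooth fit term followed by a cheap componentwise shrinkage.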
Characterizing Problem (P_τ)