Convex Optimization


Class of problems in CS (cont'd)

(P_σ), (P_τ) and (P_λ) can all be cast as

  (P1): min_α f_1(α) + f_2(α),

where f_1 and f_2 are proper lsc convex functions.

  (P_λ): min_α (1/2)‖y − HΦα‖²_{ℓ2} + λΨ(α)        f_1(α) = (1/2)‖y − HΦα‖²_{ℓ2},  f_2(α) = λΨ(α)
  (P_σ): min_α Ψ(α) s.t. ‖y − HΦα‖_{ℓ2} ≤ σ        f_1(α) = Ψ(α),                   f_2(α) = ı_{B_{ℓ2},σ}(α)
  (P_τ): min_α (1/2)‖y − HΦα‖²_{ℓ2} s.t. Ψ(α) ≤ τ  f_1(α) = (1/2)‖y − HΦα‖²_{ℓ2},  f_2(α) = ı_{B_Ψ,τ}(α)

Stanford seminar 08-8
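As a concrete sketch of solving an instance of (P1), the penalized form (P_λ) with Ψ = the ℓ1 norm can be attacked by iterative soft-thresholding (ISTA, one standard forward-backward splitting scheme; the slides do not commit to this particular algorithm). The matrix A below stands in for the composed operator HΦ; the problem sizes, sparsity pattern, and λ are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def soft(x, t):
    # Soft-thresholding: the proximity operator of t*||.||_1.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

# Illustrative instance of (P_lambda): A plays the role of H*Phi.
m, n = 20, 50
A = rng.standard_normal((m, n)) / np.sqrt(m)
alpha0 = np.zeros(n)
alpha0[[3, 17, 30]] = [2.0, -1.5, 1.0]      # sparse ground truth
y = A @ alpha0
lam = 0.05

# ISTA: gradient step on f_1 = 0.5*||y - A a||^2, prox step on f_2 = lam*||.||_1.
L = np.linalg.norm(A, 2) ** 2               # Lipschitz constant of grad f_1
alpha = np.zeros(n)
for _ in range(500):
    grad = A.T @ (A @ alpha - y)
    alpha = soft(alpha - grad / L, lam / L)

obj = 0.5 * np.linalg.norm(y - A @ alpha) ** 2 + lam * np.abs(alpha).sum()
```

Each iteration costs two applications of A (one forward, one adjoint) plus a componentwise threshold, which is what makes this splitting attractive when HΦ is a fast operator.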

Characterization

Theorem 1
(i) Existence: (P1) possesses at least one solution if f_1 + f_2 is coercive, i.e., lim_{‖α‖→+∞} f_1(α) + f_2(α) = +∞.
(ii) Uniqueness: (P1) possesses at most one solution if f_1 + f_2 is strictly convex. This occurs in particular when either f_1 or f_2 is strictly convex.
(iii) Characterization: Let α ∈ H. Then the following statements are equivalent:
  (a) α solves (P1).
  (b) α = (I + ∂(f_1 + f_2))^{−1}(α) (proximal iteration).

∂f_i is the subdifferential (a set-valued map), a maximal monotone operator. J_{∂(f_1+f_2)} = (I + ∂(f_1 + f_2))^{−1} is the resolvent of ∂(f_1 + f_2) (a firmly nonexpansive operator).

Stanford seminar 08-9
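The fixed-point equivalence in (iii) can be checked numerically in a small case where everything is in closed form. Take f_1(α) = (1/2)‖α − y‖² and f_2(α) = λ‖α‖₁ (both proper lsc convex, and f_1 strictly convex, so the solution is unique by (ii)); the choice of f_1, f_2 and the data below are assumptions for illustration, not from the slides. The minimizer is soft(y, λ), and the resolvent (I + ∂(f_1 + f_2))^{−1} works out to a ↦ soft((a + y)/2, λ/2), so the solution should be exactly its fixed point:

```python
import numpy as np

def soft(x, t):
    # Soft-thresholding: prox of t*||.||_1.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

# f_1(a) = 0.5*||a - y||^2, f_2(a) = lam*||a||_1.
# Resolvent of d(f_1 + f_2):
#   (I + d(f_1+f_2))^{-1}(a) = argmin_x 0.5||x - a||^2 + 0.5||x - y||^2 + lam*||x||_1
#                            = soft((a + y)/2, lam/2)   (separable, closed form)
def resolvent(a, y, lam):
    return soft((a + y) / 2.0, lam / 2.0)

y = np.array([3.0, -0.5, 1.2, 0.1])
lam = 1.0
alpha_star = soft(y, lam)           # known minimizer of f_1 + f_2

# Theorem 1(iii): alpha solves (P1)  <=>  alpha = (I + d(f_1+f_2))^{-1}(alpha)
fixed = resolvent(alpha_star, y, lam)
print(np.allclose(fixed, alpha_star))   # True: the minimizer is a fixed point
```

Points that are not minimizers are strictly moved by the resolvent, which is what makes iterating it (the proximal point algorithm) a convergent scheme for (P1).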

