v2010.10.26 - Convex Optimization
4.6. CARDINALITY AND RANK CONSTRAINT EXAMPLES

There are two features that distinguish problem formulation (856b) and our particular implementation of it [Matlab on Wıκımization]:

1) An image-gradient estimate may engage any combination of four adjacent pixels. In other words, the algorithm is not locked into a four-point gradient estimate (Figure 110); number of points constituting an estimate is directly determined by direction vector y.^{4.59} Indeed, we find only c = 5092 zero entries in y⋆ for the Shepp-Logan phantom; meaning, discrete image-gradient sparsity is actually closer to 1.9% than the 3% reported elsewhere; e.g., [355, II-B].

2) Numerical precision of the fixed point of contraction (860) (≈1E-2 for almost perfect reconstruction @ −103dB error) is a parameter to the implementation; meaning, direction vector y is updated after contraction begins but prior to its culmination. Impact of this idiosyncrasy tends toward simultaneous optimization in variables U and y while insuring y settles on a boundary point of its feasible set (nonnegative hypercube slice) in (862) at every iteration; for only a boundary point^{4.60} can yield the sum of smallest entries in |Ψ vec U⋆|.

Almost perfect reconstruction of the Shepp-Logan phantom at 103dB image/error is achieved in a Matlab minute with 4.1% subsampled data (2671 complex samples); well below an 11% least lower bound predicted by the sparse sampling theorem. Because reconstruction approaches optimal solution to a 0-norm problem, minimum number of Fourier-domain samples is bounded below by cardinality of the discrete image-gradient at 1.9%.

4.6.0.0.13 Exercise. Contraction operator.
Determine conditions on λ and ε under which (860) is a contraction and Ψᵀ δ(y) δ(|Ψ vec U_t| + ε1)⁻¹ Ψ + λP from (861) is positive definite.

4.59 This adaptive gradient was not contrived. It is an artifact of the convex iteration method for minimal-cardinality solution; in this case, cardinality minimization of a discrete image-gradient.
4.60 Simultaneous optimization of these two variables U and y should never be a pinnacle of aspiration; for then, optimal y might not attain a boundary point.
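A minimal Matlab-style sketch of the direction-vector update described in feature 1) follows; it is illustrative only and not the Wıκımization implementation. Assumed names: Psi is a matrix representing the discrete image-gradient operator acting on vec U, U is the current image estimate, and k is the presumed cardinality of the discrete image-gradient.

% Direction-vector update for convex iteration (sketch under assumed names).
g = abs(Psi*U(:));            % magnitudes of the discrete image-gradient
n = numel(g);
[~, idx] = sort(g, 'ascend'); % order entries from smallest to largest magnitude
y = zeros(n,1);
y(idx(1:n-k)) = 1;            % y = 1 on the n-k smallest entries, 0 on the k largest:
                              % a vertex of the hypercube slice {y : 0 <= y <= 1, 1'y = n-k}
% y'*g then equals the sum of the n-k smallest entries of |Psi vec U|.

Choosing y at a vertex in this way is what makes the weighted objective y'|Ψ vec U| equal the sum of smallest entries, consistent with the boundary-point requirement noted in feature 2).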
