v2009.01.01 - Convex Optimization

convexoptimization.com

APPENDIX D. MATRIX CALCULUS

The gradient of vector-valued function $v(x) : \mathbb{R}\to\mathbb{R}^N$ on real domain is a row-vector

$$\nabla v(x) \triangleq \begin{bmatrix} \dfrac{\partial v_1(x)}{\partial x} & \dfrac{\partial v_2(x)}{\partial x} & \cdots & \dfrac{\partial v_N(x)}{\partial x} \end{bmatrix} \in \mathbb{R}^N \tag{1639}$$

while the second-order gradient is

$$\nabla^2 v(x) \triangleq \begin{bmatrix} \dfrac{\partial^2 v_1(x)}{\partial x^2} & \dfrac{\partial^2 v_2(x)}{\partial x^2} & \cdots & \dfrac{\partial^2 v_N(x)}{\partial x^2} \end{bmatrix} \in \mathbb{R}^N \tag{1640}$$

Gradient of vector-valued function $h(x) : \mathbb{R}^K\to\mathbb{R}^N$ on vector domain is

$$\nabla h(x) \triangleq \begin{bmatrix}
\dfrac{\partial h_1(x)}{\partial x_1} & \dfrac{\partial h_2(x)}{\partial x_1} & \cdots & \dfrac{\partial h_N(x)}{\partial x_1}\\[1ex]
\dfrac{\partial h_1(x)}{\partial x_2} & \dfrac{\partial h_2(x)}{\partial x_2} & \cdots & \dfrac{\partial h_N(x)}{\partial x_2}\\[1ex]
\vdots & \vdots & & \vdots\\[1ex]
\dfrac{\partial h_1(x)}{\partial x_K} & \dfrac{\partial h_2(x)}{\partial x_K} & \cdots & \dfrac{\partial h_N(x)}{\partial x_K}
\end{bmatrix}
= \begin{bmatrix} \nabla h_1(x) & \nabla h_2(x) & \cdots & \nabla h_N(x) \end{bmatrix} \in \mathbb{R}^{K\times N} \tag{1641}$$

while the second-order gradient has a three-dimensional representation dubbed cubix;^{D.1}

$$\nabla^2 h(x) \triangleq \begin{bmatrix}
\nabla\dfrac{\partial h_1(x)}{\partial x_1} & \nabla\dfrac{\partial h_2(x)}{\partial x_1} & \cdots & \nabla\dfrac{\partial h_N(x)}{\partial x_1}\\[1ex]
\nabla\dfrac{\partial h_1(x)}{\partial x_2} & \nabla\dfrac{\partial h_2(x)}{\partial x_2} & \cdots & \nabla\dfrac{\partial h_N(x)}{\partial x_2}\\[1ex]
\vdots & \vdots & & \vdots\\[1ex]
\nabla\dfrac{\partial h_1(x)}{\partial x_K} & \nabla\dfrac{\partial h_2(x)}{\partial x_K} & \cdots & \nabla\dfrac{\partial h_N(x)}{\partial x_K}
\end{bmatrix}
= \begin{bmatrix} \nabla^2 h_1(x) & \nabla^2 h_2(x) & \cdots & \nabla^2 h_N(x) \end{bmatrix} \in \mathbb{R}^{K\times N\times K} \tag{1642}$$

where the gradient of each real entry is with respect to vector $x$ as in (1637).

^{D.1} The word matrix comes from the Latin for womb; related to the prefix matri-, derived from mater, meaning mother.
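As a concrete illustration of conventions (1639)-(1642) (a hypothetical example, not from the original text): for $v(x) = \begin{bmatrix} x & x^2 \end{bmatrix}^T$ on $\mathbb{R}$,

$$\nabla v(x) = \begin{bmatrix} 1 & 2x \end{bmatrix} \in \mathbb{R}^2, \qquad \nabla^2 v(x) = \begin{bmatrix} 0 & 2 \end{bmatrix} \in \mathbb{R}^2$$

and for $h(x) = \begin{bmatrix} x_1 x_2 & x_1^2 \end{bmatrix}^T$ on $\mathbb{R}^2$,

$$\nabla h(x) = \begin{bmatrix} x_2 & 2x_1\\ x_1 & 0 \end{bmatrix} \in \mathbb{R}^{2\times 2},$$

the transpose of the conventional Jacobian, while the cubix stacks the individual Hessians

$$\nabla^2 h(x) = \begin{bmatrix} \nabla^2 h_1(x) & \nabla^2 h_2(x) \end{bmatrix} \in \mathbb{R}^{2\times 2\times 2}, \qquad \nabla^2 h_1(x) = \begin{bmatrix} 0 & 1\\ 1 & 0 \end{bmatrix}, \quad \nabla^2 h_2(x) = \begin{bmatrix} 2 & 0\\ 0 & 0 \end{bmatrix}.$$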

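The layout of (1641) can also be checked numerically. Below is a minimal sketch (not part of the original text) using NumPy central differences on the same hypothetical function h; the helper name gradient_h is an illustrative assumption, not an established routine.

import numpy as np

# Hypothetical function h : R^2 -> R^2, h(x) = [x1*x2, x1^2]; illustrative only.
def h(x):
    return np.array([x[0] * x[1], x[0] ** 2])

# Finite-difference approximation of the gradient in the sense of (1641):
# entry (i, j) is dh_j/dx_i, so column j approximates nabla h_j(x).
def gradient_h(x, eps=1e-6):
    K, N = x.size, h(x).size
    G = np.zeros((K, N))
    for i in range(K):
        step = np.zeros(K)
        step[i] = eps
        G[i, :] = (h(x + step) - h(x - step)) / (2 * eps)
    return G

x = np.array([3.0, 2.0])
print(np.round(gradient_h(x), 6))   # approximately [[2. 6.] [3. 0.]] = [[x2 2*x1] [x1 0]]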
The gradient of real function $g(X) : \mathbb{R}^{K\times L}\to\mathbb{R}$ on matrix domain is

$$\nabla g(X) \triangleq \begin{bmatrix}
\dfrac{\partial g(X)}{\partial X_{11}} & \dfrac{\partial g(X)}{\partial X_{12}} & \cdots & \dfrac{\partial g(X)}{\partial X_{1L}}\\[1ex]
\dfrac{\partial g(X)}{\partial X_{21}} & \dfrac{\partial g(X)}{\partial X_{22}} & \cdots & \dfrac{\partial g(X)}{\partial X_{2L}}\\[1ex]
\vdots & \vdots & & \vdots\\[1ex]
\dfrac{\partial g(X)}{\partial X_{K1}} & \dfrac{\partial g(X)}{\partial X_{K2}} & \cdots & \dfrac{\partial g(X)}{\partial X_{KL}}
\end{bmatrix} \in \mathbb{R}^{K\times L}$$

$$= \begin{bmatrix} \nabla_{X(:,1)}\,g(X) & \nabla_{X(:,2)}\,g(X) & \cdots & \nabla_{X(:,L)}\,g(X) \end{bmatrix} \in \mathbb{R}^{K\times 1\times L} \tag{1643}$$

where the gradient $\nabla_{X(:,i)}$ is with respect to the $i$th column of $X$. The strange appearance of (1643) in $\mathbb{R}^{K\times 1\times L}$ is meant to suggest a third dimension perpendicular to the page (not a diagonal matrix). The second-order gradient has representation

$$\nabla^2 g(X) \triangleq \begin{bmatrix}
\nabla\dfrac{\partial g(X)}{\partial X_{11}} & \nabla\dfrac{\partial g(X)}{\partial X_{12}} & \cdots & \nabla\dfrac{\partial g(X)}{\partial X_{1L}}\\[1ex]
\nabla\dfrac{\partial g(X)}{\partial X_{21}} & \nabla\dfrac{\partial g(X)}{\partial X_{22}} & \cdots & \nabla\dfrac{\partial g(X)}{\partial X_{2L}}\\[1ex]
\vdots & \vdots & & \vdots\\[1ex]
\nabla\dfrac{\partial g(X)}{\partial X_{K1}} & \nabla\dfrac{\partial g(X)}{\partial X_{K2}} & \cdots & \nabla\dfrac{\partial g(X)}{\partial X_{KL}}
\end{bmatrix} \in \mathbb{R}^{K\times L\times K\times L}$$

$$= \begin{bmatrix} \nabla\nabla_{X(:,1)}\,g(X) & \nabla\nabla_{X(:,2)}\,g(X) & \cdots & \nabla\nabla_{X(:,L)}\,g(X) \end{bmatrix} \in \mathbb{R}^{K\times 1\times L\times K\times L} \tag{1644}$$

where the gradient $\nabla$ is with respect to matrix $X$.
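For a simple illustration of (1643) and (1644) (hypothetical, not from the original text): with $g(X) = \operatorname{tr}(X^T X) = \sum_{k,l} X_{kl}^2$,

$$\frac{\partial g(X)}{\partial X_{kl}} = 2X_{kl} \quad\Longrightarrow\quad \nabla g(X) = 2X \in \mathbb{R}^{K\times L}$$

and each entry of the second-order gradient is the constant matrix $\nabla\dfrac{\partial g(X)}{\partial X_{kl}} = 2\,e_k e_l^T \in \mathbb{R}^{K\times L}$, where $e_k e_l^T$ has a one in position $(k,l)$ and zeros elsewhere.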
