v2010.10.26 - Convex Optimization


D.2.5 determinant

$\nabla_X \det X = \nabla_X \det X^{T} = \det(X)\,X^{-T}$

$\nabla_X \det X^{-1} = -\det(X^{-1})\,X^{-T} = -\det(X)^{-1}\,X^{-T}$

$\nabla_X {\det}^{\mu} X = \mu\,{\det}^{\mu}(X)\,X^{-T}$

$\nabla_X \det X^{\mu} = \mu \det(X^{\mu})\,X^{-T}$

$\nabla_X \det X^{k} = k\,{\det}^{k-1}(X)\,\bigl(\operatorname{tr}(X)\,I - X^{T}\bigr), \qquad X \in \mathbb{R}^{2\times 2}$

$\nabla_X \det X^{k} = \nabla_X {\det}^{k} X = k \det(X^{k})\,X^{-T} = k\,{\det}^{k}(X)\,X^{-T}$

$\nabla_X {\det}^{\mu}(X + tY) = \mu\,{\det}^{\mu}(X + tY)\,(X + tY)^{-T}$

$\nabla_X \det(X + tY)^{k} = \nabla_X {\det}^{k}(X + tY) = k\,{\det}^{k}(X + tY)\,(X + tY)^{-T}$

$\frac{d}{dt} \det(X + tY) = \det(X + tY)\,\operatorname{tr}\bigl((X + tY)^{-1} Y\bigr)$

$\frac{d^{2}}{dt^{2}} \det(X + tY) = \det(X + tY)\,\Bigl(\operatorname{tr}^{2}\bigl((X + tY)^{-1} Y\bigr) - \operatorname{tr}\bigl((X + tY)^{-1} Y (X + tY)^{-1} Y\bigr)\Bigr)$

$\frac{d}{dt} \det(X + tY)^{-1} = -\det(X + tY)^{-1}\,\operatorname{tr}\bigl((X + tY)^{-1} Y\bigr)$

$\frac{d^{2}}{dt^{2}} \det(X + tY)^{-1} = \det(X + tY)^{-1}\,\Bigl(\operatorname{tr}^{2}\bigl((X + tY)^{-1} Y\bigr) + \operatorname{tr}\bigl((X + tY)^{-1} Y (X + tY)^{-1} Y\bigr)\Bigr)$

$\frac{d}{dt} {\det}^{\mu}(X + tY) = \mu\,{\det}^{\mu}(X + tY)\,\operatorname{tr}\bigl((X + tY)^{-1} Y\bigr)$

D.2.6 logarithmic

Matrix logarithm.

$\frac{d}{dt} \log(X + tY)^{\mu} = \mu\,Y (X + tY)^{-1} = \mu\,(X + tY)^{-1} Y, \qquad XY = YX$

$\frac{d}{dt} \log(I - tY)^{\mu} = -\mu\,Y (I - tY)^{-1} = -\mu\,(I - tY)^{-1} Y$  [203, p. 493]
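As a quick numerical sanity check of the determinant table above (not part of the original text), the following NumPy sketch compares $\nabla_X \det X = \det(X)\,X^{-T}$ and $\frac{d}{dt}\det(X + tY)$ against central finite differences; the random test matrices, the step size h, and the tolerances are arbitrary illustrative choices.

import numpy as np

rng = np.random.default_rng(0)
n, h = 4, 1e-6
X = rng.standard_normal((n, n)) + n * np.eye(n)   # shift keeps X safely invertible (illustrative choice)
Y = rng.standard_normal((n, n))

# grad_X det(X) = det(X) * inv(X).T, checked entrywise by central differences
analytic = np.linalg.det(X) * np.linalg.inv(X).T
numeric = np.zeros_like(X)
for i in range(n):
    for j in range(n):
        E = np.zeros((n, n))
        E[i, j] = h
        numeric[i, j] = (np.linalg.det(X + E) - np.linalg.det(X - E)) / (2 * h)
print(np.allclose(analytic, numeric, atol=1e-4))          # expected: True

# d/dt det(X + t*Y) = det(X + t*Y) * tr((X + t*Y)^{-1} Y), evaluated at t = 0
analytic_t = np.linalg.det(X) * np.trace(np.linalg.inv(X) @ Y)
numeric_t = (np.linalg.det(X + h * Y) - np.linalg.det(X - h * Y)) / (2 * h)
print(np.isclose(analytic_t, numeric_t, atol=1e-4))       # expected: True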

D.2.7 exponential

Matrix exponential. [77, §3.6, §4.5] [331, §5.4]

$\nabla_X\, e^{\operatorname{tr}(Y^{T} X)} = \nabla_X \det e^{Y^{T} X} = e^{\operatorname{tr}(Y^{T} X)}\, Y \qquad (\forall\, X, Y)$

$\nabla_X \operatorname{tr} e^{Y X} = e^{Y^{T} X^{T}} Y^{T} = Y^{T} e^{X^{T} Y^{T}}$

$\nabla_x \mathbf{1}^{T} e^{A x} = A^{T} e^{A x}$

$\nabla_x \mathbf{1}^{T} e^{|A x|} = A^{T} \delta(\operatorname{sgn}(A x))\, e^{|A x|}, \qquad (A x)_i \neq 0$

$\nabla_x \log(\mathbf{1}^{T} e^{x}) = \frac{1}{\mathbf{1}^{T} e^{x}}\, e^{x}$

$\nabla^{2}_x \log(\mathbf{1}^{T} e^{x}) = \frac{1}{\mathbf{1}^{T} e^{x}} \Bigl(\delta(e^{x}) - \frac{1}{\mathbf{1}^{T} e^{x}}\, e^{x} (e^{x})^{T}\Bigr)$

$\nabla_x \prod_{i=1}^{k} x_i^{1/k} = \frac{1}{k} \Bigl(\prod_{i=1}^{k} x_i^{1/k}\Bigr)\, 1/x$

$\nabla^{2}_x \prod_{i=1}^{k} x_i^{1/k} = -\frac{1}{k} \Bigl(\prod_{i=1}^{k} x_i^{1/k}\Bigr) \Bigl(\delta(x)^{-2} - \frac{1}{k}\,(1/x)(1/x)^{T}\Bigr)$

$\frac{d}{dt}\, e^{tY} = e^{tY} Y = Y e^{tY}$

$\frac{d}{dt}\, e^{X + tY} = e^{X + tY} Y = Y e^{X + tY}, \qquad XY = YX$

$\frac{d^{2}}{dt^{2}}\, e^{X + tY} = e^{X + tY} Y^{2} = Y e^{X + tY} Y = Y^{2} e^{X + tY}, \qquad XY = YX$

$\frac{d^{j}}{dt^{j}}\, e^{\operatorname{tr}(X + tY)} = e^{\operatorname{tr}(X + tY)} \operatorname{tr}^{j}(Y)$
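Similarly, here is a minimal sketch (not from the original text) that checks two entries of the exponential table: the log-sum-exp gradient $\nabla_x \log(\mathbf{1}^{T} e^{x})$ and $\frac{d}{dt}\, e^{X + tY} = Y e^{X + tY}$ under the commutativity assumption $XY = YX$, enforced here by taking $Y = X^{2}$. The matrix exponential comes from scipy.linalg.expm; the test data, step size, and tolerances are arbitrary.

import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(1)
h = 1e-6

# grad_x log(1' e^x) = e^x / (1' e^x), checked by central differences
x = rng.standard_normal(5)
lse = lambda v: np.log(np.sum(np.exp(v)))
analytic = np.exp(x) / np.sum(np.exp(x))
numeric = np.array([(lse(x + h * e) - lse(x - h * e)) / (2 * h) for e in np.eye(5)])
print(np.allclose(analytic, numeric, atol=1e-5))          # expected: True

# d/dt e^{X + t*Y} = Y e^{X + t*Y} at t = 0, with Y = X^2 so that XY = YX by construction
X = 0.5 * rng.standard_normal((4, 4))
Y = X @ X
analytic_t = Y @ expm(X)
numeric_t = (expm(X + h * Y) - expm(X - h * Y)) / (2 * h)
print(np.allclose(analytic_t, numeric_t, atol=1e-4))      # expected: True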

