Notes on Convex Optimization



$\|x\| \in R^+$

$\|x\|_2 = \sqrt{\sum_i x_i^2}$

Euclidean Ball

c = center vector, r = radius

$B(c,r) = \{x|\ \ ||x-c||_2 \le r\}$

$=\{c+ru | \ \ ||u||_2 \le 1\}$
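A minimal NumPy sketch of both descriptions of the ball (the helper name `in_ball` and the sample points are hypothetical):

```python
import numpy as np

# Hypothetical helper: membership test for the Euclidean ball B(c, r).
def in_ball(x, c, r):
    return np.linalg.norm(x - c) <= r

c = np.array([1.0, 2.0])
r = 0.5

# A point of the form c + r*u with ||u||_2 <= 1 always lies in B(c, r).
u = np.array([0.6, 0.8])                 # ||u||_2 = 1.0
x = c + r * u
print(in_ball(x, c, r))                  # -> True
print(in_ball(c + np.array([r + 0.1, 0.0]), c, r))  # -> False
```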


PSD: $A \in S_+ \iff x^TAx\ge 0,\ \forall x\in R^n$

PD: $A \in S_{++} \iff x^TAx\gt 0,\ \forall x\in R^n,\ x \ne 0$

$A \prec B \iff B-A \in S_{++}$

Ellipsoid: $\mathcal{E}(c,P) = \{x | \ \ (x-c)^TP^{-1}(x-c)\le 1\}\ \ (P \in S_{++})$
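The PSD/PD definitions can be checked via the smallest eigenvalue; a sketch (helper names `is_psd`/`is_pd` and matrices are hypothetical, and `eigvalsh` assumes the symmetric case $A \in S^n$):

```python
import numpy as np

# A is in S_+ iff its smallest eigenvalue is >= 0; in S_{++} iff > 0.
def is_psd(A, tol=1e-10):
    return np.min(np.linalg.eigvalsh(A)) >= -tol

def is_pd(A, tol=1e-10):
    return np.min(np.linalg.eigvalsh(A)) > tol

A = np.array([[2.0, 1.0], [1.0, 2.0]])   # eigenvalues 1 and 3 -> PD
B = np.array([[1.0, 2.0], [2.0, 1.0]])   # eigenvalues -1 and 3 -> not PSD
print(is_pd(A), is_psd(B))               # -> True False
```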

PSD Cone

$S^n = \{X\in R^{n\times n} | X^T=X\}$

$S_+^n = \{X\in S^n | z^TXz \ge 0\ \forall z\}$

$S_{++}^n = \{X\in S^n | z^TXz \gt 0\ \forall z\}$

Convexity-Preserving Operations

Intersection: if each $C_k$ is convex, $\bigcap_k C_k$ is convex

Affine transformation: $f(S) = \{Ax+b \mid x\in S\}$ is convex when $S$ is

Perspective transformation: $f(S)=\{[x_1/x_n,\dots,x_{n-1}/x_n] \mid x\in S,\ x_n > 0\}$

Linear-fractional transformation: $f(S)=\{\frac{Ax+b}{c^Tx+d} \mid x\in S,\ c^Tx+d > 0\}$
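Why affine maps preserve convexity: they send convex combinations to convex combinations. A numerical sketch (the random map and points are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(3, 2))
b = rng.normal(size=3)
f = lambda x: A @ x + b   # affine map

# f(theta*x + (1-theta)*y) = theta*f(x) + (1-theta)*f(y),
# so the image of a segment in S is a segment in f(S).
x, y, theta = rng.normal(size=2), rng.normal(size=2), 0.3
lhs = f(theta * x + (1 - theta) * y)
rhs = theta * f(x) + (1 - theta) * f(y)
print(np.allclose(lhs, rhs))   # -> True
```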

Generalized Inequality

$x \le_S y \iff y-x\in S$ (for a proper cone $S$)

$x \lt_S y \iff y-x\in int(S)$

Minimum element: $x_{min} \le_S x,\ \forall x \in C$

Minimal element: $x_{min}$ is minimal $\iff \{y \mid y \le_S x_{min}\} \cap C = \{x_{min}\}$

Dual Cone

$K^* = \{y | x^Ty \ge 0, \forall x \in K\}$

Dual inequality: using generalized inequality over $K^*$

$x \le_K y \iff \lambda^T x \le \lambda ^T y, \forall \lambda \ge _{K^*} 0$
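A sketch of the dual characterization for $K = R^n_+$, which is self-dual, so $\lambda \ge_{K^*} 0$ is just componentwise nonnegativity (the sampled vectors are hypothetical):

```python
import numpy as np

# For K = R^n_+: x <=_K y means y - x >= 0 componentwise, and the dual
# inequality says lam^T x <= lam^T y for every lam >= 0 in K* = K.
rng = np.random.default_rng(1)
x = np.array([1.0, 2.0])
y = np.array([1.5, 3.0])                    # y - x in K
lams = rng.uniform(0, 1, size=(1000, 2))    # samples lam >=_{K*} 0
print(np.all(lams @ x <= lams @ y))         # -> True
```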

Convex Function


First Order Condition

If $f$ is differentiable, then $f$ is convex $\iff \forall x, x_0 \in dom(f):\ f(x)\ge f(x_0) + \nabla f(x_0)^T(x-x_0)$

Second Order Condition

If $f$ is twice differentiable, then $f$ is convex $\iff \nabla^2f(x)\succeq0$ (PSD) for all $x$ in its (convex) domain
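Both conditions can be sketched on the quadratic $f(x) = x^TQx$, whose gradient is $2Qx$ and whose Hessian is the constant $2Q$ (the matrix and sample points are hypothetical):

```python
import numpy as np

# f(x) = x^T Q x with Q PSD: check first- and second-order conditions.
Q = np.array([[3.0, 1.0], [1.0, 2.0]])    # eigenvalues (5 ± sqrt(5))/2 > 0
f = lambda x: x @ Q @ x
grad = lambda x: 2 * Q @ x                # gradient of f
hess = 2 * Q                              # Hessian, constant for a quadratic

rng = np.random.default_rng(2)
x0, x = rng.normal(size=2), rng.normal(size=2)
first_order = f(x) >= f(x0) + grad(x0) @ (x - x0) - 1e-12
second_order = np.min(np.linalg.eigvalsh(hess)) >= 0
print(first_order, second_order)          # -> True True
```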


$epi(f) = \{(x,t)\in R^{n+1} \mid x\in dom(f),\ f(x)\le t\}$

$epi(f)$ is convex $\iff$ $f(x)$ is convex

Jensen's Inequality

if $f(x)$ is convex and $Z$ is any random variable (taking values in $dom(f)$), then $f(E(Z)) \le E(f(Z))$
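A Monte-Carlo illustration with the convex $f(z) = z^2$ (the distribution and sample size are hypothetical):

```python
import numpy as np

# Jensen: f(E[Z]) <= E[f(Z)] for convex f. Here E[Z] = 1, E[Z^2] = 1 + 4 = 5.
rng = np.random.default_rng(3)
Z = rng.normal(loc=1.0, scale=2.0, size=100_000)
f = lambda z: z ** 2
print(f(Z.mean()) <= f(Z).mean())   # -> True
```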

Convexity-Preserving Transformations

Conjugate Convex Function

$f^*(y) = \sup_{x\in dom{f}}(y^Tx - f(x))$
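A grid-based sketch of the conjugate for $f(x) = x^2/2$, where the sup is attained at $x = y$ and $f^*(y) = y^2/2$ (the grid and test point are hypothetical):

```python
import numpy as np

# Approximate f*(y) = sup_x (y*x - f(x)) by maximizing over a fine grid.
xs = np.linspace(-10, 10, 100_001)
f = xs ** 2 / 2

def conjugate(y):
    return np.max(y * xs - f)     # sup over the grid

y = 1.5
print(abs(conjugate(y) - y ** 2 / 2) < 1e-3)   # -> True
```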

Quasi Convex

$\forall x,y\in dom(f), \forall 0\le \theta\le 1 \implies f(\theta x+(1-\theta)y)\le \max(f(x), f(y))$

then $f$ is quasi convex
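A sketch of the max-inequality on $f(x) = \sqrt{|x|}$, which is quasi convex (its sublevel sets are intervals) but not convex (the sampling scheme is hypothetical):

```python
import numpy as np

# Check f(theta*x + (1-theta)*y) <= max(f(x), f(y)) on random pairs.
f = lambda x: np.sqrt(np.abs(x))
rng = np.random.default_rng(4)
x, y = rng.normal(size=1000), rng.normal(size=1000)
theta = rng.uniform(size=1000)
lhs = f(theta * x + (1 - theta) * y)
print(np.all(lhs <= np.maximum(f(x), f(y)) + 1e-12))   # -> True
```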

1st order condition: if $f$ is differentiable, $f$ is quasi convex $\iff$ $f(y) \le f(x) \implies \nabla f(x)^T(y-x) \le 0$

Log Convex Function

$f$ is log convex if $f > 0$ and $\log(f)$ is convex; log convexity implies convexity

Equivalent multiplicative form: $f(\theta x + (1-\theta)y) \le f(x)^\theta f(y)^{1-\theta}$, obtained by exponentiating $\log f(\theta x + (1-\theta)y) \le \theta\log f(x) + (1-\theta)\log f(y)$
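A quick numeric check on $f(x) = e^{x^2}$, which is log convex since $\log f = x^2$ is convex (the sample points are hypothetical):

```python
import numpy as np

# Check f(theta*x + (1-theta)*y) <= f(x)^theta * f(y)^(1-theta).
f = lambda x: np.exp(x ** 2)
rng = np.random.default_rng(5)
x, y, theta = rng.normal(), rng.normal(), rng.uniform()
print(f(theta * x + (1 - theta) * y) <= f(x) ** theta * f(y) ** (1 - theta))  # -> True
```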

Standard Form

Minimize $f(x)$

Subject to $f_i(x) \le 0,\ i = 1,\dots,m$ and $h_j(x)=0,\ j=1,\dots,p$

Convex Optimization

Minimize convex (or quasi convex) $f(x)$

Subject to convex $f_i(x) \le 0,\ i = 1,\dots,m$ and affine $h_j(x)=0,\ j=1,\dots,p$

The feasible set of a convex optimization problem is convex.

Optimality condition (differentiable $f$): $x_{opt}$ is optimal $\iff \nabla f(x_{opt})^T(y - x_{opt}) \ge 0$ for all feasible $y$, i.e. $\nabla f(x_{opt})^Tx_{opt} \le \nabla f(x_{opt})^Ty$

Supporting Hyperplane: the condition above says $-\nabla f(x_{opt})$ is the normal of a supporting hyperplane of the feasible set at $x_{opt}$

Linear Programming

minimize $f(x) = c^Tx + d$

subject to $f_i(x)=g_i^Tx-h_i \le 0$ and $h_j(x) = a_j^Tx-b_j = 0$ (the feasible set is a polyhedron)
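A small LP of this form can be solved with SciPy's `linprog`; a sketch (the specific $c$, $G$, $h$ data are hypothetical, and the sign/variable bounds follow `linprog`'s $Gx \le h$, $x \ge 0$ convention):

```python
import numpy as np
from scipy.optimize import linprog

# minimize c^T x  subject to  G x <= h,  x >= 0
c = np.array([-1.0, -2.0])                  # objective c^T x (d = 0)
G = np.array([[1.0, 1.0], [1.0, 0.0]])      # rows g_i^T, i.e. g_i^T x - h_i <= 0
h = np.array([4.0, 2.0])

res = linprog(c, A_ub=G, b_ub=h, bounds=[(0, None), (0, None)])
print(res.x, res.fun)    # optimal point is (0, 4) with objective -8
```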