Notes on Convex Optimization
Norm
$||x|| \in R^+$
- zero: $||x||=0 \iff x =0$
- scaling: $||ax||=|a|||x||$
- $||x+y||\le ||x||+||y||$
$||x||_2 = \sqrt{\Sigma x_i^2}$
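A quick numerical sketch of the three norm axioms above, checked on a pair of sample vectors (illustrative only, not a proof):

```python
import numpy as np

# Spot-check the norm axioms for the Euclidean norm ||.||_2.
x = np.array([3.0, 4.0])
y = np.array([-1.0, 2.0])
a = -2.5

norm = np.linalg.norm  # Euclidean norm by default

print(norm(x))                                    # sqrt(3^2 + 4^2) = 5.0
print(np.isclose(norm(a * x), abs(a) * norm(x)))  # scaling axiom: True
print(norm(x + y) <= norm(x) + norm(y))           # triangle inequality: True
```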
Euclidean Ball
c = center vector, r = radius
$B(c,r) = \{x|\ \ ||x-c||_2 \le r\}$
$=\{c+ru | \ \ ||u||_2 \le 1\}$
Ellipsoid
PSD: $A \in S_+ \iff x^TAx\ge 0,\ \forall x\in R^n$
PD: $A \in S_{++} \iff x^TAx\gt 0,\ \forall x\in R^n$
$A<B \to B-A \in S_{++}$
$\ell(c,P) = \{x | \ \ (x-c)^TP^{-1}(x-c)\le 1\}\ \ (P \in S_{++})$
PSD Cone
$S^n = \{X\in R^{n\times n} | X^T=X\}$
$S_+^n = \{X\in S^n | z^TXz \ge 0\ \forall z\}$
$S_{++}^n = \{X\in S^n | z^TXz \gt 0\ \forall z\}$
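The three sets above can be distinguished numerically via symmetry plus the eigenvalue sign pattern; a small sketch:

```python
import numpy as np

# Classify a matrix as PD / PSD / indefinite / not symmetric.
def psd_status(X, tol=1e-10):
    if not np.allclose(X, X.T):
        return "not symmetric"          # not even in S^n
    lam = np.linalg.eigvalsh(X)         # eigenvalues of a symmetric matrix
    if np.all(lam > tol):
        return "PD"                     # X in S^n_{++}
    if np.all(lam >= -tol):
        return "PSD"                    # X in S^n_+
    return "indefinite"

print(psd_status(np.array([[2.0, 0.0], [0.0, 3.0]])))  # PD
print(psd_status(np.array([[1.0, 1.0], [1.0, 1.0]])))  # PSD (eigenvalues 0, 2)
print(psd_status(np.array([[0.0, 1.0], [1.0, 0.0]])))  # indefinite (+-1)
```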
Convex-closed Operations
Intersection: $\bigcap C_k \in C$
Affine transformation: $f(S) = \{Ax+b, x\in S\} \in C$
Perspective transformation: $f(S)=\{[x_1/x_n,\dots,x_{n-1}/x_n]\ |\ x\in S,\ x_n > 0\}$
Linear fractional transformation: $f(S)=\{\frac{Ax+b}{c^Tx+d}\ |\ x\in S,\ c^Tx+d > 0\}$
Generalized Inequality
$x \le_S y \to y-x\in S$
$x \lt_S y \to y-x\in int(S)$
Minimum element: $x_{min} \in C$ such that $\forall x \in C,\ x_{min} \le_S x$
Minimal element: $x_{min} \in C$ such that $\{y | y \le_S x_{min}\} \cap C = \{x_{min}\}$
- Preserve addition
- Transitive
- Non-negative scaling
- Reflexivity
- Sequential limits
Dual Cone
$K^* = \{y | x^Ty \ge 0, \forall x \in K\}$
Dual inequality: using generalized inequality over $K^*$
$x \le_K y \iff \lambda^T x \le \lambda ^T y, \forall \lambda \ge _{K^*} 0$
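As a concrete case, for $K = R^n_+$ the dual cone is again $R^n_+$ (self-dual): $x^Ty \ge 0$ for all $x \ge 0$ forces every $y_i \ge 0$. A sampling-based sketch of the necessary condition:

```python
import numpy as np

# For K = R^3_+, test candidate y against many random x in K.
# (A sampling check, not a proof of dual-cone membership.)
rng = np.random.default_rng(0)
X = rng.random((1000, 3))           # random samples from K = R^3_+

def looks_in_dual(y):
    return np.all(X @ y >= 0)       # x^T y >= 0 for every sampled x

print(looks_in_dual(np.array([1.0, 2.0, 0.0])))   # True: y in R^3_+
print(looks_in_dual(np.array([1.0, -1.0, 0.0])))  # False: some x1 < x2
```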
Convex Function
- $dom(f)$ is convex set
- $\forall x,y \in dom(f), \forall 0 \le \theta \le 1,\ f(\theta x + (1-\theta)y)\le \theta f(x) + (1-\theta)f(y)$
Example
- $f(x) = a^Tx+b$
- $f(x) =e^{\alpha x}$
- $f(x) =x\log x$
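The defining inequality can be spot-checked numerically for one of the examples, $f(x)=x\log x$, on a small grid of points and $\theta$ values (illustrative only):

```python
import numpy as np

# Check f(θx + (1-θ)y) <= θf(x) + (1-θ)f(y) for f(x) = x log x, x > 0.
f = lambda x: x * np.log(x)

ok = True
for x in [0.5, 1.0, 3.0]:
    for y in [0.1, 2.0, 5.0]:
        for t in np.linspace(0.0, 1.0, 11):
            lhs = f(t * x + (1 - t) * y)
            rhs = t * f(x) + (1 - t) * f(y)
            ok &= lhs <= rhs + 1e-12   # tolerance for equality at t = 0, 1
print(ok)  # True: the inequality holds at every sampled point
```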
First Order Condition
If a function is differentiable and $\forall x,\ f(x)\ge f(x_0) + \nabla f(x_0)^T(x-x_0)$, it is a convex function
Second Order Condition
If a function is twice differentiable and $\nabla^2f\ge0$ (PSD), it is a convex function
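For a quadratic the second-order condition is easy to check directly, since the Hessian is constant; a small sketch:

```python
import numpy as np

# For f(x) = x^T Q x + b^T x the Hessian is 2Q everywhere,
# so f is convex iff Q (equivalently 2Q) is PSD.
Q = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # eigenvalues 1 and 3 -> PD, hence PSD
hessian = 2 * Q              # constant Hessian of the quadratic
print(np.all(np.linalg.eigvalsh(hessian) >= 0))  # True -> f is convex
```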
Epigraph
$epi(f) = \{(x,t)\in R^{n+1} |\ x\in dom(f),\ f(x)\le t\}$
$epi(f)$ is convex $\iff$ $f(x)$ is convex
Jensen's Inequality
if $f(x)$ is convex and Z is any RV, then $f(E(Z)) \le E(f(Z))$
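A Monte Carlo illustration of Jensen's inequality with the convex $f(x)=x^2$ and $Z$ standard normal, where $E(f(Z)) = Var(Z) = 1$:

```python
import numpy as np

# Estimate both sides of f(E[Z]) <= E[f(Z)] by sampling.
rng = np.random.default_rng(42)
z = rng.standard_normal(100_000)

f = lambda x: x ** 2
lhs = f(z.mean())      # f(E[Z]) ~ 0 (sample mean near zero)
rhs = f(z).mean()      # E[f(Z)] ~ Var(Z) = 1
print(lhs <= rhs)      # True
```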
Convex-closed Transformation
- $cf(x),\ c \ge 0$
- $\sum^N_{i=1}f_i(x), f_i$ is convex
- Pointwise maximum
Conjugate Convex Function
$f^*(y) = \sup_{x\in dom(f)}(y^Tx - f(x))$
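The sup can be approximated on a grid. For $f(x)=x^2$ the supremum of $yx - x^2$ is attained at $x = y/2$, giving the closed form $f^*(y) = y^2/4$ to compare against:

```python
import numpy as np

# Grid approximation of f*(y) = sup_x (y*x - f(x)) for f(x) = x^2.
xs = np.linspace(-10, 10, 100_001)   # dense grid standing in for dom(f)
fx = xs ** 2

def conjugate(y):
    return np.max(y * xs - fx)       # grid approximation of the sup

for y in [0.0, 1.0, -3.0]:
    print(conjugate(y), y ** 2 / 4)  # grid value vs closed form y^2/4
```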
Quasi Convex
- dom(f) is convex
- $\forall \alpha, S_\alpha = \{x\in dom(f) | f(x)\le\alpha\}$ is convex
Equivalently: if $\forall x,y\in dom(f), \forall 0\le \theta\le 1,\ f(\theta x+(1-\theta)y)\le \max(f(x), f(y))$,
then $f$ is quasi convex
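A standard example is $f(x)=\sqrt{|x|}$: it is quasi convex (sublevel sets are intervals) but not convex. A numerical spot-check of both claims:

```python
import numpy as np

# f(x) = sqrt(|x|): quasiconvex but not convex.
f = lambda x: np.sqrt(np.abs(x))

quasi = True
for x in [-4.0, 0.0, 1.0]:
    for y in [-1.0, 2.0, 9.0]:
        for t in np.linspace(0, 1, 11):
            # max-characterization of quasiconvexity
            quasi &= f(t * x + (1 - t) * y) <= max(f(x), f(y)) + 1e-12

# The plain convexity inequality fails at the midpoint of x=0, y=1:
# f(0.5) ~ 0.707 > 0.5 = (f(0) + f(1)) / 2
convex_mid = f(0.5) <= 0.5 * f(0.0) + 0.5 * f(1.0)
print(quasi, convex_mid)   # True False
```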
1st order conditions
- $f(y)\le f(x) \implies \nabla f^T(x)(y-x) \le 0$
Log Convex Function
$f$ is log convex if $\log(f)$ is convex, i.e. $f(\theta x + (1-\theta)y) \le f(x)^\theta f(y)^{1-\theta}$
(take $\log$ of both sides: $\log f(\theta x + (1-\theta)y) \le \theta\log f(x) + (1-\theta)\log f(y)$)
Standard Form
Minimize $f(x)$
Subject to $f_i(x) \le 0,\ i = 1\dots m$ and $h_j(x)=0,\ j=1\dots p$
Convex Optimization
Minimize (quasi) convex $f(x)$
Subject to convex $f_i(x) \le 0,\ i = 1\dots m$ and affine $h_j(x)=0,\ j=1\dots p$
Feasible set of convex opt problem is convex
Optimality condition: $x_{opt}$ is optimal $\iff \nabla f(x_{opt})^T(y - x_{opt}) \ge 0,\ \forall$ feasible $y$
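A one-dimensional sketch of this optimality condition: minimize $f(x)=x^2$ over the feasible interval $[1,2]$, where the optimum sits on the boundary at $x_{opt}=1$:

```python
import numpy as np

# f(x) = x^2 on the feasible set [1, 2]; optimum is x* = 1.
# Check grad f(x*) * (y - x*) >= 0 for every feasible y.
grad = lambda x: 2 * x
x_opt = 1.0
ys = np.linspace(1.0, 2.0, 101)                  # feasible points
print(np.all(grad(x_opt) * (ys - x_opt) >= 0))   # True
```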
Supporting Hyperplane
- X is entirely contained in one of the 2 closed half spaces bounded by the hyperplane
- X has at least one boundary point on the hyperplane
Linear Programming
minimize $f(x) = c^Tx + d$
subject to $f_i(x)=g_i^Tx-h_i \le 0$ and $h_j(x) = a_j^Tx-b_j = 0$ (feasible set is a polyhedron)
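A tiny LP in this form can be solved with `scipy.optimize.linprog`; the data below is made up for illustration:

```python
from scipy.optimize import linprog

# minimize   -x1 - 2*x2            (c^T x with d = 0)
# subject to  x1 + x2 <= 4
#             x1      <= 3
#             x1, x2  >= 0
res = linprog(c=[-1.0, -2.0],
              A_ub=[[1.0, 1.0], [1.0, 0.0]],
              b_ub=[4.0, 3.0],
              bounds=[(0, None), (0, None)])
print(res.x, res.fun)   # optimum at x = (0, 4) with value -8
```

The optimum lands on a vertex of the feasible polyhedron, as LP theory predicts.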