
Theory background

min:   f0(x)                          ==>  optimal value p*

s.t.:  fi(x) <= 0,   i = 1, …, m

       hj(x) = 0,    j = 1, …, p

Define the Lagrangian:

L(x, lambda, v) = f0(x) + sum( lambda_i * fi(x) ) + sum( v_j * hj(x) )

Then define its dual function by minimizing the Lagrangian over x:

g(lambda, v) = min over x of L(x, lambda, v) = min over x of ( f0(x) + sum( lambda_i * fi(x) ) + sum( v_j * hj(x) ) )

The dual problem is to maximize g(lambda, v) over lambda, v subject to lambda >= 0.    ==>  optimal value d*
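
As a concrete illustration (my own toy example, not from the linked articles), take minimize x^2 subject to 1 - x <= 0. The short Python sketch below evaluates the dual function numerically and compares d* with p*:

import numpy as np

# Toy problem (illustrative only):
#   minimize   f0(x) = x^2
#   subject to f1(x) = 1 - x <= 0        (i.e. x >= 1)
# The primal optimum is x* = 1, so p* = 1.

def f0(x):
    return x ** 2

def f1(x):
    return 1.0 - x

def lagrangian(x, lam):
    return f0(x) + lam * f1(x)

def g(lam):
    # Dual function: minimize the Lagrangian over x (numerically, on a grid;
    # in closed form g(lam) = lam - lam**2 / 4, attained at x = lam / 2).
    xs = np.linspace(-5.0, 5.0, 10001)
    return np.min(lagrangian(xs, lam))

p_star = f0(1.0)                                              # primal optimal value
d_star = max(g(lam) for lam in np.linspace(0.0, 5.0, 501))    # dual feasibility: lambda >= 0
print(p_star, d_star)   # both about 1.0: d* <= p*, and here d* = p* (e.g. x = 2 is strictly feasible, so Slater holds)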

It can be proved (weak duality) that:

d* <= p*
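
This is the standard weak-duality argument: for any feasible x' (so fi(x') <= 0 and hj(x') = 0) and any lambda >= 0,

g(lambda, v) <= L(x', lambda, v) = f0(x') + sum( lambda_i * fi(x') ) + sum( v_j * hj(x') ) <= f0(x'),

so g(lambda, v) <= p* for every such x'; maximizing the left-hand side over lambda >= 0 and v gives d* <= p*.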

In some cases, for example for a convex problem satisfying Slater's condition (there exists a strictly feasible point), strong duality holds:

d* = p*

If the problem is convex (f0 and the fi convex, the hj affine), the functions are differentiable, and strong duality holds (d* = p*), then a point x* with multipliers lambda*, v* is primal and dual optimal if and only if it satisfies the KKT conditions.
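
Spelled out in the notation above (this is the standard statement; see the Wikipedia link below), the KKT conditions for x* with multipliers lambda*, v* are:

    stationarity:              grad f0(x*) + sum( lambda_i* * grad fi(x*) ) + sum( v_j* * grad hj(x*) ) = 0
    primal feasibility:        fi(x*) <= 0 for all i,   hj(x*) = 0 for all j
    dual feasibility:          lambda_i* >= 0 for all i
    complementary slackness:   lambda_i* * fi(x*) = 0 for all i

In the toy example above (min x^2 s.t. 1 - x <= 0), x* = 1 and lambda* = 2 satisfy all four: 2*x* - lambda* = 0, 1 - x* = 0, lambda* >= 0, and lambda* * (1 - x*) = 0.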


https://zhuanlan.zhihu.com/p/36621652

https://en.wikipedia.org/wiki/Karush%E2%80%93Kuhn%E2%80%93Tucker_conditions

 

Algorithms

Unconstrained minimization

We can use gradient descent, steepest descent, or Newton's method to approach the minimum iteratively.

All of these methods follow the same framework: find a descent direction, then choose a step size that makes sufficient progress (e.g. by line search), take the step, and repeat until convergence.

(Duality and the KKT conditions play no role here.)
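
A minimal sketch of that framework (my own illustration, not taken from the references): gradient descent with a backtracking line search; swapping the direction d = -g for the Newton direction d = -H(x)^{-1} g would give Newton's method.

import numpy as np

def minimize_descent(f, grad, x0, tol=1e-8, max_iter=1000, alpha=0.3, beta=0.8):
    # Generic descent framework: 1) pick a descent direction,
    # 2) pick a step size by backtracking line search, 3) repeat.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:      # stop when the gradient is (almost) zero
            break
        d = -g                           # descent direction (Newton would use -H^-1 g)
        t = 1.0
        while f(x + t * d) > f(x) + alpha * t * g.dot(d):   # Armijo (sufficient decrease) condition
            t *= beta
        x = x + t * d
    return x

# Example: minimize f(x, y) = (x - 1)^2 + 10 * y^2, minimum at (1, 0)
f = lambda v: (v[0] - 1.0) ** 2 + 10.0 * v[1] ** 2
grad_f = lambda v: np.array([2.0 * (v[0] - 1.0), 20.0 * v[1]])
print(minimize_descent(f, grad_f, [5.0, 3.0]))   # approximately [1. 0.]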

 

Constrained minimization

For problems with equality constraints, the key idea is to take a second-order (quadratic) approximation of the objective function; together with the KKT conditions, this turns the search for a descent direction into a solvable convex problem, namely a linear KKT system. Once the descent direction is obtained, the usual Newton iteration drives us to the minimum.

For a convex optimization problem with equality constraints, we take a quadratic approximation of the objective function and use the KKT conditions to derive the condition an optimal solution must satisfy, the KKT system. Solving the KKT system gives the descent direction needed by Newton's method; the linked article also analyzes how to solve the KKT system efficiently.
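
As a rough sketch of that step (my own notation: H for the Hessian of the quadratic approximation, g for the gradient, equality constraints written as A x = b), one Newton step solves the linear KKT system and uses its solution as the descent direction:

import numpy as np

def newton_step_eq(grad, hess, A, x):
    # One Newton step for: minimize f(x) subject to A x = b,
    # assuming the current x is already feasible (A x = b).
    # The quadratic approximation of f plus the KKT conditions give the linear KKT system
    #     [ H   A^T ] [ dx ]   [ -g ]
    #     [ A    0  ] [ w  ] = [  0 ]
    # and dx is a feasibility-preserving descent direction.
    g = grad(x)
    H = hess(x)
    m, n = A.shape
    KKT = np.block([[H, A.T],
                    [A, np.zeros((m, m))]])
    rhs = np.concatenate([-g, np.zeros(m)])
    sol = np.linalg.solve(KKT, rhs)
    return sol[:n]                        # dx; sol[n:] holds the multipliers w

# Example: minimize x1^2 + x2^2 + x3^2  subject to  x1 + x2 + x3 = 3
grad_f = lambda x: 2.0 * x
hess_f = lambda x: 2.0 * np.eye(3)
A = np.array([[1.0, 1.0, 1.0]])
x = np.array([3.0, 0.0, 0.0])             # feasible starting point
x = x + newton_step_eq(grad_f, hess_f, A, x)
print(x)                                  # [1. 1. 1.] -- one full step suffices for a quadratic objective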

 

 

References:

 

https://zhuanlan.zhihu.com/p/50411305

https://zhuanlan.zhihu.com/p/36621652