# Cone programs and self-dual embeddings

This post aims to provide some intuition about cone programs from different perspectives; in particular:

1. Equivalence of different formulations of cone programs
2. Fenchel duality
3. Primal-dual optimality conditions (OC)
4. OCs as variational inequalities
5. Homogeneous self-dual embeddings (HSDEs)
6. OCs for HSDEs

# Projection on the epigraph of the squared Euclidean norm

As a follow-up to the previous post titled Projection on an epigraph, we discuss here how to project onto the epigraph of the squared norm function. Continue reading →
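For concreteness, here is one way such a projection can be computed; this is a sketch based on the KKT conditions of the projection problem, with the scalar multiplier found by bisection. It is not necessarily the method used in the post.

```python
import numpy as np

def project_epi_sq_norm(z, s, tol=1e-10):
    """Project (z, s) onto the epigraph {(x, t) : ||x||^2 <= t}.

    The KKT conditions of
        minimize 0.5*||x - z||^2 + 0.5*(t - s)^2  s.t.  ||x||^2 <= t
    give x = z / (1 + 2*lam) and t = s + lam for a multiplier lam >= 0
    with ||x||^2 = t when lam > 0; lam is found here by bisection.
    """
    z = np.asarray(z, dtype=float)
    if z @ z <= s:
        return z.copy(), float(s)       # (z, s) already lies in the epigraph
    # g is strictly decreasing on [0, inf) and g(0) = ||z||^2 - s > 0
    g = lambda lam: (z @ z) / (1.0 + 2.0 * lam) ** 2 - (s + lam)
    lo, hi = 0.0, 1.0
    while g(hi) > 0:                    # grow the bracket until a sign change
        hi *= 2.0
    while hi - lo > tol:                # plain bisection on [lo, hi]
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    return z / (1.0 + 2.0 * lam), s + lam
```

At the solution, complementarity forces the projected point onto the boundary $\|x\|^2 = t$ whenever the input lies outside the epigraph.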

# Convergence of the iterates of the gradient method with constant stepsize

The gradient method with constant stepsize is the simplest method for solving unconstrained optimisation problems involving a continuously differentiable function with a Lipschitz-continuous gradient. The motivation for this post came after reading this Wikipedia article, where it is stated that under certain assumptions the sequence $\{x_k\}$ converges to a local optimum, but no further discussion is provided. Continue reading →
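For concreteness, a minimal sketch of the method on a strongly convex quadratic, with stepsize $1/L$ where $L$ is the Lipschitz constant of the gradient; the test function and stepsize are my choices for illustration, not the article's.

```python
import numpy as np

def gradient_method(grad, x0, alpha, iters=1000):
    """Gradient method with constant stepsize: x_{k+1} = x_k - alpha * grad(x_k)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - alpha * grad(x)
    return x

# f(x) = 0.5 * x'Ax - b'x with A positive definite, so grad f(x) = Ax - b.
# The gradient is L-Lipschitz with L = lambda_max(A); any alpha in (0, 2/L)
# makes the iterates converge to the unique minimiser x* = A^{-1} b.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 1.0])
L = np.linalg.eigvalsh(A).max()
x_star = np.linalg.solve(A, b)
x_hat = gradient_method(lambda x: A @ x - b, np.zeros(2), alpha=1.0 / L)
```

On this quadratic each iteration contracts the error by the factor $\max_i |1 - \alpha \lambda_i(A)|$, which is strictly below one for any $\alpha \in (0, 2/L)$.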

# Pretty convexity result

Here we discover some interesting facts about continuous convex functions.

We know that a function $f:\mathbb{R}^n\to\mathbb{R}$ is convex if

$f(\tau x + (1-\tau)y) \leq \tau f(x) + (1-\tau)f(y)$

for all $\tau \in [0,1]$ and $x,y\in \mathbb{R}^n$.

We will see that, if $f$ is continuous, then convexity is equivalent to either of the following inequalities:

$$f\left(\frac{x+y}{2}\right) \leq \int_0^1 f(\theta x + (1-\theta)y)\,\mathrm{d}\theta \leq \frac{f(x)+f(y)}{2}$$

Continue reading →
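This double inequality is the Hermite–Hadamard inequality written along the segment from $y$ to $x$. A quick numerical sanity check (the choice of function and endpoints below is arbitrary):

```python
import numpy as np

def check_midpoint_integral(f, x, y, n=100_000):
    """Return the three quantities in
    f((x+y)/2) <= int_0^1 f(theta*x + (1-theta)*y) d(theta) <= (f(x)+f(y))/2,
    with the integral approximated by the midpoint rule on [0, 1]."""
    theta = (np.arange(n) + 0.5) / n
    integral = float(np.mean(f(theta * x + (1 - theta) * y)))
    return f((x + y) / 2), integral, (f(x) + f(y)) / 2

# Example with the convex function f = exp on the segment from 0 to 2.
lo, mid, hi = check_midpoint_integral(np.exp, 0.0, 2.0)
```

For $f = \exp$, $x = 0$, $y = 2$ the integral equals $(e^2 - 1)/2$, strictly between the midpoint value $e$ and the endpoint average $(1 + e^2)/2$.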
