In this post we show that it is possible to derive third- and higher-order Taylor expansions for functions of several variables. Given that the gradient of a function is vector-valued and its Hessian is matrix-valued, it is natural to guess that its third-order gradient will be tensor-valued. However, not only is the use of tensors not very convenient, but in this context it is also unnecessary.

Often, Taylor expansions of a function at a point $x$ are taken along a given direction $d$. This greatly facilitates our understanding, even for first-order expansions.

Let $f:\mathbb{R}^n\to\mathbb{R}$ be three times continuously differentiable, let $x, d \in \mathbb{R}^n$, and define a function $\phi:\mathbb{R}\to\mathbb{R}$ by $\phi(t) = f(x + t d)$, which describes $f$ along the direction $d$. Then $\phi$ is three times continuously differentiable and the third-order Taylor expansion of $\phi$ about $t = 0$ is

$$\phi(t) = \phi(0) + \phi'(0)\,t + \tfrac{1}{2}\phi''(0)\,t^2 + \tfrac{1}{6}\phi'''(0)\,t^3 + o(t^3).$$
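As a quick numerical illustration, we can compare $\phi$ with its third-order Taylor polynomial, estimating the derivatives of $\phi$ by finite differences. This is only a sketch: the function $f(x) = (x^\top x)^2$, the point $x$ and the direction $d$ below are arbitrary choices, not taken from the discussion above.

```python
import numpy as np

# Hypothetical test function (arbitrary choice): f(x) = (x'x)^2
def f(x):
    return float(np.dot(x, x) ** 2)

x = np.array([1.0, 2.0])   # arbitrary point
d = np.array([0.5, -1.0])  # arbitrary direction

def phi(t):
    # Restriction of f along the direction d
    return f(x + t * d)

# Central finite-difference estimates of phi'(0), phi''(0), phi'''(0)
h = 1e-3
phi1 = (phi(h) - phi(-h)) / (2 * h)
phi2 = (phi(h) - 2 * phi(0) + phi(-h)) / h**2
phi3 = (phi(2 * h) - 2 * phi(h) + 2 * phi(-h) - phi(-2 * h)) / (2 * h**3)

def taylor3(t):
    # Third-order Taylor expansion of phi about t = 0
    return phi(0) + phi1 * t + phi2 * t**2 / 2 + phi3 * t**3 / 6

# The cubic model should match phi(t) up to o(t^3) for small t
err = abs(taylor3(0.1) - phi(0.1))
```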

But $\phi'(0)$ is related to the directional derivative of $f$ at $x$ along the direction $d$, which is

$$\phi'(0) = \nabla f(x)^\top d.$$

Let us denote this by $Df(x)[d]$.
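For a concrete check, take the made-up function $f(x) = (x^\top x)^2$ (an arbitrary choice, not from the post), whose gradient is $\nabla f(x) = 4 (x^\top x)\, x$; the directional derivative $\nabla f(x)^\top d$ can then be compared against a finite difference of $\phi$:

```python
import numpy as np

# Hypothetical test function (arbitrary choice): f(x) = (x'x)^2
def f(x):
    return float(np.dot(x, x) ** 2)

def grad_f(x):
    # Closed-form gradient of the test function: grad f(x) = 4 (x'x) x
    return 4.0 * np.dot(x, x) * x

def directional_derivative(x, d):
    # Df(x)[d] = grad f(x)' d, i.e. phi'(0) with phi(t) = f(x + t d)
    return float(grad_f(x) @ d)

x = np.array([1.0, 2.0])
d = np.array([0.5, -1.0])
val = directional_derivative(x, d)

# Sanity check against a central finite difference of phi(t) = f(x + t d)
t = 1e-6
fd = (f(x + t * d) - f(x - t * d)) / (2 * t)
```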

Similarly, $\phi''(0)$ can be interpreted as the *directional Hessian* of $f$ at $x$ along the directions $d_1 = d_2 = d$, that is

$$\phi''(0) = d^\top \nabla^2 f(x)\, d.$$

Let us denote this by $D^2 f(x)[d_1, d_2] = d_1^\top \nabla^2 f(x)\, d_2$, so that $\phi''(0) = D^2 f(x)[d, d]$.
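Numerically, the directional Hessian is just the Hessian matrix sandwiched between the two directions. A sketch with a made-up function $f(x) = (x^\top x)^2$ (arbitrary, not from the post), whose Hessian is $\nabla^2 f(x) = 8\, x x^\top + 4 (x^\top x) I$:

```python
import numpy as np

# Hypothetical test function (arbitrary choice): f(x) = (x'x)^2
def f(x):
    return float(np.dot(x, x) ** 2)

def hess_f(x):
    # Closed-form Hessian of the test function: 8 x x' + 4 (x'x) I
    return 8.0 * np.outer(x, x) + 4.0 * np.dot(x, x) * np.eye(x.size)

def directional_hessian(x, d1, d2):
    # D^2 f(x)[d1, d2] = d1' (hess f(x)) d2
    return float(d1 @ hess_f(x) @ d2)

x = np.array([1.0, 2.0])
d = np.array([0.5, -1.0])
val = directional_hessian(x, d, d)  # equals phi''(0) for phi(t) = f(x + t d)

# Sanity check: second central difference of phi
t = 1e-4
fd = (f(x + t * d) - 2.0 * f(x) + f(x - t * d)) / t**2
```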

The term $\phi'''(0)$ – involving a *Tressian*, if we may call it so – is more difficult to represent. In fact, it will be a tensor. However, we are merely interested in the *directional Tressian* of $f$ at $x$ along directions $d_1$, $d_2$ and $d_3$. This construct is actually used in the context of convex optimization theory, and in particular the theory of self-concordant functions, where it is denoted by $D^3 f(x)[d_1, d_2, d_3]$, and we may write

$$D^3 f(x)[d_1, d_2, d_3] = d_1^\top \left( \nabla^3 f(x)[d_3] \right) d_2,$$

where $\nabla^3 f(x)$ is the third-order gradient of $f$ at $x$ which, in my opinion, is best understood via its directional variant:

$$\nabla^3 f(x)[d] = \lim_{t \to 0} \frac{\nabla^2 f(x + t d) - \nabla^2 f(x)}{t}.$$

Here $\nabla^3 f(x)[d]$ is a matrix – a directional derivative of the Hessian. Essentially, $\nabla^3 f(x)[d]$ describes how the Hessian of $f$ changes at $x$ along the direction $d$.
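This directional picture suggests a simple numerical scheme: differentiate the Hessian itself by a finite difference along $d_3$, then sandwich the resulting matrix between $d_1$ and $d_2$. A sketch, again with the made-up function $f(x) = (x^\top x)^2$ (arbitrary, not from the post), for which one can verify by hand that $D^3 f(x)[d, d, d] = 24\,(x^\top d)(d^\top d)$:

```python
import numpy as np

# Hypothetical test function (arbitrary choice): f(x) = (x'x)^2
def hess_f(x):
    # Closed-form Hessian: 8 x x' + 4 (x'x) I
    return 8.0 * np.outer(x, x) + 4.0 * np.dot(x, x) * np.eye(x.size)

def directional_tressian(x, d1, d2, d3, t=1e-5):
    # Central finite difference of the Hessian along d3 approximates
    # the matrix nabla^3 f(x)[d3]; then
    # D^3 f(x)[d1, d2, d3] = d1' (nabla^3 f(x)[d3]) d2
    M = (hess_f(x + t * d3) - hess_f(x - t * d3)) / (2.0 * t)
    return float(d1 @ M @ d2)

x = np.array([1.0, 2.0])
d = np.array([0.5, -1.0])
val = directional_tressian(x, d, d, d)

# Hand-derived value for this particular f: 24 (x'd)(d'd)
exact = 24.0 * float(x @ d) * float(d @ d)
```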

Similarly, we may produce fourth- and higher-order approximations.