Indeed, there are many interesting inequalities. A good starting point is the study of those related to the first- and second-order approximations of a function that is assumed to be either (i) continuously differentiable with a Lipschitz gradient or (ii) twice continuously differentiable with a Lipschitz Hessian. These inequalities are central to the analysis of first- and second-order methods (gradient descent and Newton's method).
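For concreteness, these are the two bounds I have in mind (writing L for the Lipschitz constant of the gradient in case (i), and M for that of the Hessian in case (ii); both appear early in Nesterov's book):

```latex
\begin{align*}
% (i) L-Lipschitz gradient: the linear model is accurate up to a quadratic term
\bigl| f(y) - f(x) - \langle \nabla f(x),\, y - x \rangle \bigr|
  &\le \frac{L}{2}\, \lVert y - x \rVert^{2}, \\[4pt]
% (ii) M-Lipschitz Hessian: the quadratic model is accurate up to a cubic term
\Bigl| f(y) - f(x) - \langle \nabla f(x),\, y - x \rangle
  - \tfrac{1}{2} \langle \nabla^{2} f(x)(y - x),\, y - x \rangle \Bigr|
  &\le \frac{M}{6}\, \lVert y - x \rVert^{3}.
\end{align*}
```

The first bound is what makes a gradient step with step size 1/L decrease the function value; the second justifies the local quadratic model used in Newton-type methods.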
I find Nesterov’s book “Introductory Lectures on Convex Optimization” a very good reference. I have also uploaded a new post you might find interesting: https://mathematix.wordpress.com/2016/08/03/second-order-approximation-of-twice-differentiable-functions/
