# Calculus for Machine Learning

### Model Preliminaries

Machine Learning often involves minimising a cost/objective function: a function of several parameters (variables) that measures the error of our model. We use methods from *differential calculus* to find the minimum of cost functions (or the maximum of reward functions).
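As a minimal sketch of this idea, the snippet below minimises a simple one-dimensional cost function with gradient descent, repeatedly stepping against the derivative. The cost function, learning rate, and step count are illustrative choices, not from any particular model.

```python
# Minimal sketch: minimising a 1-D cost function with gradient descent.
# Cost: f(w) = (w - 3)**2, with derivative f'(w) = 2*(w - 3).

def cost(w):
    return (w - 3) ** 2

def grad(w):
    return 2 * (w - 3)

def gradient_descent(w0, lr=0.1, steps=100):
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)  # move against the derivative to reduce the cost
    return w

w_min = gradient_descent(w0=0.0)
print(round(w_min, 4))  # converges towards the true minimiser, w = 3
```

The same loop generalises to many parameters once the derivative is replaced by the gradient vector, which is where the later posts on gradients and partial derivatives pick up.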

There are several ways to think about calculus:

- the study of the relationship between variables and their rates of change.
- a set of tools for analysing the relationship between functions and their inputs. Typically we want to find the parameter values which enable a function to best match the data.
- a set of tools for helping us navigate in high-dimensional spaces.

The following posts link mathematical concepts from calculus to optimisation and machine learning.

- Derivatives and functions
- Gradients, partial derivatives, directional derivatives and gradient descent
- Jacobian, Chain rule and backpropagation
- Hessian, second derivatives, function convexity, saddle points
- Taylor Series, Newton's method
- Lagrange Multipliers and Constrained Optimization
- Limits, delta-epsilon and theoretical guarantees
- Conjugate Gradients
- Discontinuity