# Differences between objective function, loss function, and cost function

# Some quotations

"Deep Learning" by Goodfellow > The function we want to minimize or maximize is called the **objective function**, or **criterion**. When we are minimizing it, we may also call it the **cost function**, **loss function**, or **error function**. In this book, we use these terms interchangeably, though some machine learning publications assign special meaning to some of these terms.

From Andrew Ng:

> "Finally, the loss function was defined with respect to a single training example. It measures how well you're doing on a single training example. I'm now going to define something called the cost function, which measures how well you're doing on an entire training set. So the cost function J, which is applied to your parameters W and B, is going to be the average, one over m of the sum, of the loss function applied to each of the training examples."
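Ng's definition of the cost as the average loss can be written compactly (here m is the number of training examples and ŷ⁽ⁱ⁾ denotes the prediction on example i; the symbols are chosen to match his phrasing, not taken verbatim from the lecture):

```latex
J(w, b) = \frac{1}{m} \sum_{i=1}^{m} L\big(\hat{y}^{(i)}, y^{(i)}\big)
```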

# Conclusion

- **Objective function**: the target we optimize, whether by minimizing or maximizing; e.g. the likelihood in maximum likelihood estimation (MLE).
- **Cost function**: the sum (or average) of the loss function over the training set, possibly with an additional regularization term; e.g. mean squared error (MSE).
- **Loss function**: a function defined on a single data point, its prediction, and its label, which measures the penalty; e.g. square loss, hinge loss.
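The distinction above can be sketched in a few lines of Python (a minimal sketch using NumPy; the function names and the regularization weight `lam` are illustrative, not from any particular library):

```python
import numpy as np

def square_loss(y_pred, y):
    # Loss: penalty on a single (prediction, label) pair.
    return (y_pred - y) ** 2

def hinge_loss(score, y):
    # Hinge loss on a single example, for a label y in {-1, +1}.
    return max(0.0, 1.0 - y * score)

def cost(w, X, y, lam=0.1):
    # Cost: average of the per-example square loss over the whole
    # training set, plus an L2 regularization term.
    preds = X @ w
    mse = np.mean((preds - y) ** 2)
    return mse + lam * np.sum(w ** 2)
```

Here `square_loss` and `hinge_loss` operate on one sample, while `cost` aggregates the loss over all of `X` and adds the regularizer, mirroring the single-example vs. training-set scope in the definitions above.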

# Reference

https://stats.stackexchange.com/questions/179026/objective-function-cost-function-loss-function-are-they-the-same-thing