Review Note

Last Update: 04/30/2024 09:10 AM

Current Deck: Data Science Interviews::metrics

Published

Fields:

Front
What is the definition of log loss (a.k.a. cross-entropy), and why is it used?
Back
\[ -\frac{1}{N}\sum_{i=1}^{N} \left[ y_i\log(p_i) + (1 - y_i)\log(1 - p_i) \right] \]

Binary classification metric that measures the dissimilarity between the predicted probabilities and the true labels; it heavily penalizes confident but wrong predictions.  
Natural error function (negative log-likelihood) for logistic regression.
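
For illustration (not part of the card text), a minimal NumPy sketch of this loss; the function name `log_loss` and the clipping epsilon are arbitrary choices here:

```python
import numpy as np

def log_loss(y_true, p_pred, eps=1e-15):
    """Binary cross-entropy, averaged over the N samples."""
    y = np.asarray(y_true, dtype=float)
    # Clip probabilities away from 0 and 1 so log(0) never occurs.
    p = np.clip(np.asarray(p_pred, dtype=float), eps, 1 - eps)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

# Confident wrong predictions cost far more than mildly wrong ones.
print(log_loss([1, 0], [0.9, 0.1]))    # ~0.105
print(log_loss([1, 0], [0.01, 0.99]))  # ~4.6
```

The same metric is available as `sklearn.metrics.log_loss` if you prefer a library implementation.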
Credit

Tags:

low-level

Suggested Changes:

Deck Changes (Suggestion to move the Note to the following Deck):

Field Changes:

Tag Changes: