Precision is the counterpart to recall (the true positive rate) in the following sense: recall is the proportion of actual positives that are predicted positive, i.e. TP / (TP + FN), while precision is the proportion of predicted positives that are actually positive, i.e. TP / (TP + FP). The sketch below illustrates the difference.
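As a quick illustration, here is a minimal Python sketch computing both quantities from counts of true positives, false positives and false negatives. The y_true / y_pred label lists are made-up example data, not from any particular dataset.

    # 1 = positive class, 0 = negative class (hypothetical example labels)
    y_true = [1, 1, 1, 0, 0, 0, 1, 0]
    y_pred = [1, 0, 1, 0, 1, 0, 1, 0]

    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)  # true positives
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)  # false positives
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # false negatives

    precision = float(tp) / (tp + fp)  # actual positives among predicted positives
    recall = float(tp) / (tp + fn)     # predicted positives among actual positives

    print("precision:", precision)  # 3 / (3 + 1) = 0.75
    print("recall:", recall)        # 3 / (3 + 1) = 0.75

Note that precision and recall use the same numerator (TP) but different denominators: precision penalizes false positives, recall penalizes false negatives.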