#1
AUC Metric
Can you comment on using the AUC metric for assessing the quality of a classifier?
Is this the best metric for assessing classifiers? What is the mathematical basis for AUC? Thanks! 
#2
Re: AUC Metric
There is no single "best" metric; there is a whole jungle of validation metrics and curves out there, each with its own merits.
Aside from AUC and ROC, the F1 score (together with precision-recall curves) is also widely used. Which one to prefer is problem-dependent; ROC has the advantage (and disadvantage) of being invariant to class skew. The AUC itself can be computed directly from the Mann-Whitney U statistic. hth 
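To make the Mann-Whitney connection concrete, here is a minimal sketch (pure Python, made-up toy scores) that computes the AUC from the rank-sum of the positive examples, AUC = U / (n_pos * n_neg), with average ranks for ties:

```python
# Sketch: AUC via the Mann-Whitney U statistic.
# The scores/labels below are made-up toy data, not from the thread.

def auc_mann_whitney(scores, labels):
    """AUC = U / (n_pos * n_neg), where U comes from the rank-sum
    of the positive examples. Ties get the average rank, which
    matches the usual tie-corrected AUC."""
    # Rank all scores (1-based), averaging ranks within tied groups.
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    ranks = [0.0] * len(scores)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and scores[order[j + 1]] == scores[order[i]]:
            j += 1
        avg_rank = (i + j) / 2 + 1  # average of 1-based positions i..j
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1

    n_pos = sum(labels)
    n_neg = len(labels) - n_pos
    rank_sum_pos = sum(r for r, y in zip(ranks, labels) if y == 1)
    u = rank_sum_pos - n_pos * (n_pos + 1) / 2
    return u / (n_pos * n_neg)

scores = [0.9, 0.8, 0.7, 0.6, 0.55, 0.5, 0.4, 0.3]
labels = [1,   1,   0,   1,   0,    1,   0,   0]
print(auc_mann_whitney(scores, labels))  # 0.8125
```

Of the 4x4 = 16 positive-negative pairs here, 13 are ranked correctly, so the AUC is 13/16 = 0.8125.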
#3
Re: AUC Metric
Mathematically, the AUC is also equivalent to the pairwise ranking accuracy induced by the (decision values of the) classifier: the fraction of positive-negative pairs that the classifier orders correctly. This paper http://www.icml2011.org/papers/567_icmlpaper.pdf is a fairly recent study of the connection between AUC and other metrics (such as the usual 0/1 error). Hope this helps.
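The pairwise-ranking view can be checked directly by brute force; a small sketch (toy scores, not from the paper) counting correctly ordered positive-negative pairs, with ties counted as half:

```python
# Sketch: AUC as pairwise ranking accuracy, by brute-force enumeration.
# Toy data for illustration only.

def pairwise_ranking_accuracy(scores, labels):
    """Fraction of (positive, negative) pairs where the positive
    example receives the higher score; ties count as 1/2."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    correct = sum(1.0 if p > n else (0.5 if p == n else 0.0)
                  for p in pos for n in neg)
    return correct / (len(pos) * len(neg))

scores = [0.9, 0.8, 0.7, 0.6, 0.55, 0.5, 0.4, 0.3]
labels = [1,   1,   0,   1,   0,    1,   0,   0]
print(pairwise_ranking_accuracy(scores, labels))  # 0.8125, the AUC
```

For any score vector this count agrees with the AUC obtained from the ROC curve (or from the Mann-Whitney U statistic), which is exactly the equivalence mentioned above.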
__________________
When one teaches, two learn. 
#4
Re: AUC Metric
The link to the paper is not valid. Please fix it.

#5
Re: AUC Metric
