08-09-2012, 05:20 AM
magdon
Re: Sampling bias and class imbalance for target variable

You raise an interesting point regarding unbalanced data, which is often the nature of the data in many "high risk" applications. In learning from data it is useful to distinguish between two distinct goals:

1) Obtaining the best possible classifier;

2) Evaluating the out-of-sample performance of your classifier.

The reason it is easy to conflate these two goals is that the most common way of approaching 1) is to solve 2) first and then optimize your estimate of out-of-sample performance over the hypotheses in \cal H. Using this typical approach, it is quite easy for many learning algorithms to largely ignore the minority class in a severely unbalanced data set.

Hence, in an effort to obtain the best possible classifier that pays some attention to the minority class, one might artificially reweight the data to emphasize the minority class so that its properties can be learned. Nevertheless, to evaluate your out-of-sample performance, you should go back to the unweighted, unbalanced data that represents the population.
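A minimal sketch of this two-step recipe, using a hypothetical fraud data set and a deliberately trivial threshold classifier (all names and numbers here are illustrative, not from the original post): oversample the minority class for training, but evaluate on untouched data drawn from the original unbalanced distribution.

```python
import random

random.seed(0)

# Hypothetical unbalanced population: label 1 (fraud) is the rare class.
population = [(random.gauss(0, 1), 0) for _ in range(950)] + \
             [(random.gauss(3, 1), 1) for _ in range(50)]
random.shuffle(population)
train, test = population[:800], population[800:]

# Goal 1: learn on a reweighted (oversampled) training set so the
# minority class is not ignored by the learning algorithm.
frauds = [p for p in train if p[1] == 1]
nonfrauds = [p for p in train if p[1] == 0]
balanced_train = nonfrauds + frauds * (len(nonfrauds) // max(len(frauds), 1))

# A toy "classifier": threshold at the midpoint of the two class means.
mean0 = sum(x for x, y in balanced_train if y == 0) / len(nonfrauds) \
    if nonfrauds else 0.0
xs1 = [x for x, y in balanced_train if y == 1]
mean1 = sum(xs1) / len(xs1)
threshold = (mean0 + mean1) / 2

def classify(x):
    return 1 if x > threshold else 0

# Goal 2: evaluate on the untouched, unbalanced test set, which
# reflects the true population distribution.
accuracy = sum(classify(x) == y for x, y in test) / len(test)
print(f"threshold={threshold:.2f}, test accuracy={accuracy:.2f}")
```

In practice you would also report a class-sensitive metric (e.g. precision/recall on the fraud class) on the unweighted test set, since plain accuracy can look deceptively high on unbalanced data.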

Originally Posted by rainbow
To avoid sampling bias, the general idea is to have the training distribution match the testing distribution (as stated in the book). Is this the same as having the sample (train + validation + test) match the population distribution?

How does this relate to the class imbalance of the target (y) distribution? For instance, consider training a machine to identify fraud, where the number of fraud transactions is much lower than the number of non-fraud transactions. Is it favourable to upweight the number of fraud transactions in your training data in order to have a balanced data set w.r.t. y? How does this relate to sampling bias, and how do you adjust for this upsampling of fraud cases so that the model generalizes well?
Have faith in probability