General-purpose and introductory examples for the imbalanced-learn toolbox.
Examples based on real-world datasets¶
Examples which use real-world datasets.
Examples using combine class methods¶
Combine methods mix over- and under-sampling. Generally, SMOTE is used for over-sampling, while cleaning methods (e.g., ENN and Tomek links) are used to under-sample.
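The idea behind these combine methods can be sketched without the library. The snippet below is a minimal numpy-only illustration, not imbalanced-learn's implementation: it substitutes random duplication for SMOTE and then applies Tomek-link cleaning (a link is a pair of mutual nearest neighbours with opposite labels; the majority member is dropped).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy imbalanced two-class data: 20 majority (label 0), 5 minority (label 1).
X = np.vstack([rng.normal(0.0, 1.0, size=(20, 2)),
               rng.normal(3.0, 1.0, size=(5, 2))])
y = np.array([0] * 20 + [1] * 5)

# Step 1: random over-sampling of the minority class up to parity
# (a stand-in for SMOTE, which would interpolate new points instead).
minority = np.flatnonzero(y == 1)
picks = rng.choice(minority, size=15, replace=True)
X_os = np.vstack([X, X[picks]])
y_os = np.concatenate([y, y[picks]])

# Step 2: Tomek-link cleaning — mutual nearest neighbours with opposite
# labels form a link; drop the majority member of each link.
d = np.linalg.norm(X_os[:, None, :] - X_os[None, :, :], axis=-1)
np.fill_diagonal(d, np.inf)
nn = d.argmin(axis=1)
idx = np.arange(y_os.size)
is_link = (nn[nn] == idx) & (y_os != y_os[nn])
keep = ~(is_link & (y_os == 0))
X_res, y_res = X_os[keep], y_os[keep]
```

In imbalanced-learn itself, `SMOTEENN` and `SMOTETomek` package these two stages behind a single `fit_resample` call.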
Examples concerning the imblearn.datasets module.
Examples using ensemble class methods¶
Under-sampling implies that samples of the majority class are lost during the balancing procedure. Ensemble methods offer an alternative that uses most of these samples: an ensemble of balanced sets is created and later used to train any classifier.
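The construction of such an ensemble of balanced sets can be sketched with numpy alone; this is an EasyEnsemble-style illustration of the idea, not the library's code. Each subset keeps every minority sample plus an equally sized random under-sample of the majority, so together the subsets cover far more of the majority class than any single under-sampled set would.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy labels: 100 majority (0), 10 minority (1).
y = np.array([0] * 100 + [1] * 10)
majority = np.flatnonzero(y == 0)
minority = np.flatnonzero(y == 1)

# Build 10 balanced subsets; one classifier would be trained per subset
# and their predictions combined (e.g., by majority vote).
subsets = [np.concatenate([rng.choice(majority, size=minority.size,
                                      replace=False), minority])
           for _ in range(10)]

# A single under-sampled set discards 90% of the majority class, but the
# ensemble of subsets collectively covers much more of it.
covered = np.unique(np.concatenate([s[:minority.size] for s in subsets]))
```

In imbalanced-learn, estimators such as `EasyEnsembleClassifier` and `BalancedBaggingClassifier` perform this resample-and-train loop internally.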
Examples illustrating how classification with an imbalanced dataset can be performed.
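A short numpy-only illustration of why evaluation needs care on imbalanced data: with a 95/5 class split, a classifier that always predicts the majority class scores 95% plain accuracy while being useless, and balanced accuracy (the mean of per-class recalls) exposes this.

```python
import numpy as np

# 95/5 class split: always predicting the majority class looks accurate.
y_true = np.array([0] * 95 + [1] * 5)
y_pred = np.zeros_like(y_true)

acc = (y_pred == y_true).mean()               # plain accuracy: 0.95

# Balanced accuracy averages per-class recall and exposes the failure:
# the minority class is never recovered.
recalls = [(y_pred[y_true == c] == c).mean() for c in (0, 1)]
bal_acc = float(np.mean(recalls))             # 0.5
```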
Examples related to the selection of balancing methods.
Examples using over-sampling class methods¶
Data balancing can be performed by over-sampling, such that new samples are generated in the minority class until a given balancing ratio is reached.
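The notion of a balancing ratio can be sketched directly; the snippet below is a numpy-only stand-in (random duplication rather than a synthetic-sample generator such as SMOTE) that over-samples the minority class until it holds half as many samples as the majority class.

```python
import numpy as np

rng = np.random.default_rng(0)

# 50 majority (0) vs 5 minority (1) samples with 3 features each.
y = np.array([0] * 50 + [1] * 5)
X = rng.normal(size=(y.size, 3))

# Over-sample the minority class until it reaches the target balancing
# ratio (here 0.5: one minority sample for every two majority samples).
ratio = 0.5
n_target = int(ratio * (y == 0).sum())        # 25 minority samples wanted
minority = np.flatnonzero(y == 1)
picks = rng.choice(minority, size=n_target - minority.size, replace=True)
X_res = np.vstack([X, X[picks]])
y_res = np.concatenate([y, y[picks]])
```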
Example of how to use a pipeline to include under-sampling with scikit-learn estimators.
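The key property such a pipeline provides is that resampling happens only during fitting, never at prediction time. A minimal numpy-only sketch of that behaviour, using a hypothetical nearest-centroid classifier in place of a scikit-learn estimator (`imblearn.pipeline.Pipeline` enforces the same fit-only resampling for real estimators):

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_with_undersampling(X, y):
    # Train-time only: randomly under-sample the majority class (0) down
    # to the minority size, then fit per-class centroids on the balanced set.
    maj, mino = np.flatnonzero(y == 0), np.flatnonzero(y == 1)
    keep = np.concatenate([rng.choice(maj, size=mino.size, replace=False),
                           mino])
    Xb, yb = X[keep], y[keep]
    return np.stack([Xb[yb == c].mean(axis=0) for c in (0, 1)])

def predict(centroids, X):
    # Prediction uses the raw inputs — no resampling step here.
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=-1)
    return d.argmin(axis=1)

# Two well-separated clusters with a 40/8 class imbalance.
X = np.vstack([rng.normal(-2, 1, (40, 2)), rng.normal(2, 1, (8, 2))])
y = np.array([0] * 40 + [1] * 8)
centroids = fit_with_undersampling(X, y)
pred = predict(centroids, X)
```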