Other Validation Functionalities
The model_selection module offers many functionalities related to model selection and validation, including cross-validation, learning curves, and hyperparameter tuning.
Cross-validation is a set of techniques that combine measures of prediction performance to get more accurate model estimates. One of the most widely used cross-validation methods is k-fold cross-validation. With it, you divide your dataset into k subsets, or folds, of equal size, where k is often five or ten, and then perform the training and test procedures k times. Each time, you use a different fold as the test set and all of the remaining folds as the training set.
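A minimal sketch of k-fold cross-validation with scikit-learn's cross_val_score(), which handles the fold splitting and repeated training for you. The dataset and classifier here are just placeholders; any estimator and data would work:

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Example data and model (assumptions for illustration)
X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1000)

# cv=5 performs 5-fold cross-validation: five train/test rounds,
# each using a different fold as the test set
scores = cross_val_score(model, X, y, cv=5)

print(scores)         # one accuracy score per fold
print(scores.mean())  # averaged estimate of model performance
```

Averaging the per-fold scores gives a more stable performance estimate than a single train/test split, because every observation gets used for testing exactly once.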
Hyperparameter tuning, also called hyperparameter optimization, is the process of determining the best set of hyperparameters to define your machine learning model. scikit-learn's model_selection module provides you with several options for this purpose, including GridSearchCV, RandomizedSearchCV, validation_curve(), and others.
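A short sketch of hyperparameter tuning with GridSearchCV, which combines an exhaustive search over a parameter grid with cross-validation. The estimator and the particular grid values below are illustrative assumptions, not recommendations:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Hypothetical grid: a few values of C and two kernel choices
param_grid = {"C": [0.1, 1.0, 10.0], "kernel": ["linear", "rbf"]}

# Every combination in the grid is evaluated with 5-fold cross-validation
search = GridSearchCV(SVC(), param_grid, cv=5)
search.fit(X, y)

print(search.best_params_)  # the best hyperparameter combination found
print(search.best_score_)   # its mean cross-validated score
```

RandomizedSearchCV follows the same pattern but samples a fixed number of parameter combinations at random, which scales better when the grid is large.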