Content Deep Dive
How to tune hyperparameters on XGBoost
Blog post from Anyscale
Post Details
Company: Anyscale
Date Published: -
Author: Juan Navas, Richard Liaw
Word Count: 1,305
Language: English
Hacker News Points: -
Summary
This blog post is the second in a series on hyperparameter tuning, focusing on practical examples with XGBoost and the MNIST dataset. The authors walk through preprocessing the data, building a baseline model without hyperparameter tuning, tuning hyperparameters with random search, and measuring the resulting improvement in accuracy. They also outline plans for part 3, which will cover distributed hyperparameter tuning with Ray Tune.
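The workflow the summary describes can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions, not the authors' code: it uses scikit-learn's RandomizedSearchCV for the random search, substitutes the small load_digits dataset for the full MNIST data so the example runs quickly, and the hyperparameter names and ranges are illustrative rather than the post's exact search space.

```python
# Sketch of the post's workflow: baseline XGBoost model, then random-search tuning.
# Assumptions: load_digits stands in for MNIST; the search space is illustrative.
from scipy.stats import randint, uniform
from sklearn.datasets import load_digits
from sklearn.metrics import accuracy_score
from sklearn.model_selection import RandomizedSearchCV, train_test_split
from xgboost import XGBClassifier

# Load and split the data (the post preprocesses MNIST in a similar spirit).
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Step 1: baseline model with default hyperparameters.
baseline = XGBClassifier(eval_metric="mlogloss")
baseline.fit(X_train, y_train)
print("baseline accuracy:", accuracy_score(y_test, baseline.predict(X_test)))

# Step 2: random search over an illustrative hyperparameter space.
param_distributions = {
    "max_depth": randint(3, 10),
    "learning_rate": uniform(0.01, 0.3),
    "n_estimators": randint(100, 400),
    "subsample": uniform(0.5, 0.5),
}
search = RandomizedSearchCV(
    XGBClassifier(eval_metric="mlogloss"),
    param_distributions,
    n_iter=20,
    cv=3,
    random_state=42,
    n_jobs=-1,
)
search.fit(X_train, y_train)
print("best params:", search.best_params_)
print("tuned accuracy:", accuracy_score(y_test, search.predict(X_test)))
```

Part 3 of the series, per the summary, moves this same search onto a cluster with Ray Tune rather than running it on a single machine.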