The Hugging Face transformers library has become central to natural language processing thanks to its implementations of cutting-edge research, its thousands of pretrained models, and its accessibility. Even simple experiments show that advanced tuning techniques such as genetic optimization can deliver large performance improvements over standard hyperparameter optimization methods. The Transformers 3.1 release integrated Ray Tune, a popular Python library for hyperparameter tuning, giving users simple yet powerful access to state-of-the-art search algorithms without sacrificing customizability. The integration also works with tooling such as Weights & Biases and TensorBoard, making it easy to run hyperparameter search while fine-tuning models on datasets such as MRPC. It is demonstrated below with an example that uses HyperOptSearch for the search, with optional Weights & Biases logging.
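
As a rough illustration of what that looks like in code, the sketch below fine-tunes a sequence-classification model on GLUE's MRPC task and hands the search to HyperOptSearch through `Trainer.hyperparameter_search`. It assumes the Transformers 3.1-era API and Ray 1.x import paths (`ray.tune.suggest.hyperopt`; newer Ray versions moved this to `ray.tune.search.hyperopt`), and the model choice (`distilbert-base-uncased`), trial count, and training arguments are illustrative assumptions, not the integration's only supported configuration.

```python
# Sketch: hyperparameter search on GLUE MRPC with the Ray Tune backend.
# Import paths and argument names vary across transformers/Ray versions.
from datasets import load_dataset, load_metric
from ray.tune.suggest.hyperopt import HyperOptSearch
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

# GLUE MRPC: classify whether two sentences are paraphrases.
dataset = load_dataset("glue", "mrpc")
metric = load_metric("glue", "mrpc")

def encode(examples):
    return tokenizer(examples["sentence1"], examples["sentence2"],
                     truncation=True, padding="max_length")

encoded = dataset.map(encode, batched=True)

def model_init():
    # A fresh model is instantiated for every trial.
    return AutoModelForSequenceClassification.from_pretrained(
        "distilbert-base-uncased", return_dict=True)

def compute_metrics(eval_pred):
    predictions = eval_pred.predictions.argmax(axis=-1)
    return metric.compute(predictions=predictions,
                          references=eval_pred.label_ids)

trainer = Trainer(
    args=TrainingArguments("tune_mrpc", evaluation_strategy="epoch",
                           disable_tqdm=True),
    tokenizer=tokenizer,
    train_dataset=encoded["train"],
    eval_dataset=encoded["validation"],
    model_init=model_init,
    compute_metrics=compute_metrics,
)

# HyperOptSearch drives the search over the backend's default
# hyperparameter space; n_trials and the direction are assumptions here.
best_trial = trainer.hyperparameter_search(
    direction="maximize",
    backend="ray",
    n_trials=10,
    search_alg=HyperOptSearch(metric="objective", mode="max"),
)
print(best_trial.hyperparameters)
```

Running this requires `pip install "ray[tune]" hyperopt` alongside transformers and datasets. Weights & Biases logging, if desired, can be attached through Ray Tune's W&B integration, whose import path likewise depends on the Ray version in use.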