
Comparison of Concrete ML regressors

Blog post from Zama

Post Details

- Company: Zama
- Date Published: —
- Author: The Zama Team
- Word Count: 2,170
- Language: English
- Hacker News Points: —
Summary

Season two of the Zama Bounty Program invited participants to write a tutorial comparing the performance of Concrete ML regressors with their scikit-learn counterparts, with a particular focus on Fully Homomorphic Encryption (FHE). Based on a contribution by GitHub user AmT42, the post shows how Concrete ML regressors, which closely mirror the scikit-learn API, can be compiled and evaluated in a simulated FHE environment. The tutorial compares three families of regression models (linear, neural network, and tree-based) using R2 scores, and finds that Concrete ML models hold up well on encrypted data despite quantization: linear models lose almost no accuracy, and tree-based models maintain good scores. Neural networks, although heavily quantized, still perform well thanks to Quantization Aware Training, while models such as XGBRegressor need further hyperparameter optimization to close the gap between their FHE and fp32 counterparts. The post suggests using GridSearch for better optimization and notes that the small sample size may affect runtime measurements, especially for complex models.
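To make the quantization trade-off concrete, here is a minimal standard-library sketch of the evaluation the tutorial describes: computing an R2 score for exact (fp32-style) predictions versus uniformly quantized ones. The helper names `r2_score` and `quantize` are illustrative stand-ins, not Concrete ML APIs, and the bit width and data are arbitrary assumptions.

```python
def r2_score(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(y_true) / len(y_true)
    ss_tot = sum((y - mean) ** 2 for y in y_true)
    ss_res = sum((y - p) ** 2 for y, p in zip(y_true, y_pred))
    return 1 - ss_res / ss_tot

def quantize(values, n_bits):
    """Uniformly quantize values to 2**n_bits levels over their range."""
    lo, hi = min(values), max(values)
    scale = (hi - lo) / (2 ** n_bits - 1)
    return [round((v - lo) / scale) * scale + lo for v in values]

# A noiseless linear relation y = 2x + 1 on a small grid.
xs = [i / 10 for i in range(50)]
ys = [2 * x + 1 for x in xs]

# An exact model predicts perfectly; a 4-bit quantized model loses a
# little precision, which shows up as a slightly lower R2 score.
exact_preds = ys
quant_preds = quantize(ys, n_bits=4)

print(round(r2_score(ys, exact_preds), 4))  # 1.0
print(round(r2_score(ys, quant_preds), 4))  # close to, but below, 1.0
```

This mirrors the pattern the post reports: for well-behaved models, the score of the quantized version stays close to the floating-point baseline, and the gap shrinks as the bit width grows.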