Concrete ML v1.4: Encrypted Training and Faster Decision Trees
Blog post from Zama
Concrete ML v1.4 introduces two significant enhancements: the ability to train models on encrypted data without compromising accuracy, so sensitive data remains secure throughout the process, and substantially faster tree-based models.

Tree-based models such as XGBoost, random forests, and decision trees now run 2-3x faster in common quantization scenarios, leaving room for more complex models without slowing inference. The precision parameter for these models has also been optimized for Fully Homomorphic Encryption: performance is faster and more predictable, and higher quantization precision can be used without increased latency.

The new training feature allows logistic regression models to be trained directly on encrypted data, preserving data confidentiality end to end and making it easier for organizations to collaborate on sensitive information.

Together, these advancements make it feasible to analyze large datasets securely and efficiently, improving both speed and model complexity.
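To see why the quantization precision of tree-based models matters, here is a minimal pure-Python sketch of uniform quantization, the kind of integer discretization that FHE-friendly models apply to their inputs and thresholds. This is an illustration of the concept, not the Concrete ML API; the function names `quantize` and `dequantize` are hypothetical.

```python
# Illustrative sketch of uniform n-bit quantization (hypothetical helpers,
# not Concrete ML's actual API). Higher n_bits means finer-grained integer
# levels, so decision-tree split thresholds lose less information.

def quantize(values, n_bits):
    """Map floats in [min, max] onto 2**n_bits integer levels."""
    lo, hi = min(values), max(values)
    levels = 2 ** n_bits - 1
    scale = (hi - lo) / levels if hi > lo else 1.0
    return [round((v - lo) / scale) for v in values]

def dequantize(quantized, lo, hi, n_bits):
    """Map integer levels back to approximate float values."""
    levels = 2 ** n_bits - 1
    scale = (hi - lo) / levels if hi > lo else 1.0
    return [lo + q * scale for q in quantized]

values = [0.0, 0.1, 0.5, 0.9, 1.0]
q6 = quantize(values, 6)  # 64 levels: nearby values stay distinguishable
q2 = quantize(values, 2)  # 4 levels: coarse, nearby splits can merge
print(q6)  # -> [0, 6, 32, 57, 63]
print(q2)  # -> [0, 0, 2, 3, 3] (0.0 and 0.1 collapse to the same level)
```

The point of the release's optimization is that, in FHE, moving from a coarse setting like 2 bits to a finer one no longer has to cost extra latency for tree-based models.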
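For intuition on what encrypted logistic regression training computes, the sketch below runs one gradient-descent update per example in plaintext. In Concrete ML's encrypted training, the same arithmetic is evaluated homomorphically on ciphertexts; this example uses no encryption and is purely illustrative, with hypothetical helper names.

```python
import math

# Plaintext sketch of SGD for logistic regression: the kind of arithmetic
# that encrypted training evaluates under FHE. Illustrative only; this is
# not the Concrete ML API and involves no encryption.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def sgd_step(w, b, x, y, lr=0.1):
    """One weight update on a single labeled example (x, y)."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    err = sigmoid(z) - y  # gradient factor of the log loss
    w = [wi - lr * err * xi for wi, xi in zip(w, x)]
    b = b - lr * err
    return w, b

# Toy run: learn to separate x > 0.5 from x <= 0.5 in one dimension.
w, b = [0.0], 0.0
data = [([0.1], 0), ([0.9], 1), ([0.2], 0), ([0.8], 1)]
for _ in range(200):
    for x, y in data:
        w, b = sgd_step(w, b, x, y)
print(sigmoid(w[0] * 0.9 + b) > 0.5)  # classifies a positive example correctly
print(sigmoid(w[0] * 0.1 + b) < 0.5)  # classifies a negative example correctly
```

When this loop runs on encrypted data, the server performing the updates never sees the training examples or the learned weights in the clear, which is what enables secure collaboration on sensitive datasets.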