
Concrete ML v1.6: Bigger Neural Networks and Pre-trained Tree-based Models

Blog post from Zama

Post Details
Company: Zama
Date Published: -
Author: Andrei Stoian
Word Count: 453
Language: English
Hacker News Points: -
Summary

Concrete ML v1.6 delivers lower latency for large neural networks, support for pre-trained tree-based models, and improved collaborative computation through DataFrame schemas and logistic regression training deployment. Pre-trained tree models can now be imported with the from_sklearn function while preserving accuracy on encrypted data. Two notebooks showcase the latency gains: a 20-layer deep MLP running with 1-second latency on encrypted data, and a ResNet18 model running 4x faster than previous results. On the deployment side, logistic regression training can now be exposed as a client-server service, with options for parametrization and cloud deployment. DataFrame schemas let users control schema details, making encrypted data compatible across different users. GPU support is planned next, and users are encouraged to engage with the project through Zama's community channels and to participate in the Zama Bounty Program.
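The post notes that imported tree models maintain accuracy on encrypted data. A key reason this is possible is that FHE computation in Concrete ML operates on low-bit integers, so a tree's learned float thresholds must be quantized; as long as quantization is monotonic, every branch decision is preserved. The stdlib-only sketch below illustrates this idea. It is a conceptual sketch, not the library's actual conversion code, and all names in it (quantize, thresholds, inputs) are illustrative, not Concrete ML API.

```python
# Conceptual sketch: why quantizing a tree's thresholds to low-bit integers
# can preserve its decisions. This is NOT Concrete ML's from_sklearn code;
# all names here are illustrative.

def quantize(values, n_bits=6):
    """Map floats to integers on a uniform n-bit grid over their range."""
    lo, hi = min(values), max(values)
    scale = (hi - lo) / (2 ** n_bits - 1) or 1.0
    return [round((v - lo) / scale) for v in values], scale, lo

# A toy set of split thresholds from a trained tree, plus feature values
# we want to classify. A tree branch tests "feature > threshold".
thresholds = [0.15, 0.42, 0.77]
inputs = [0.10, 0.30, 0.50, 0.90]

# Quantize thresholds and inputs on the SAME grid, as a converter would.
quantized, scale, lo = quantize(thresholds + inputs)
q_thr, q_in = quantized[:3], quantized[3:]

# Every float comparison agrees with its integer counterpart, because
# uniform quantization preserves the ordering of values.
for t, qt in zip(thresholds, q_thr):
    for x, qx in zip(inputs, q_in):
        assert (x > t) == (qx > qt)
```

In the real library, the integer comparisons are then evaluated homomorphically on encrypted inputs; the quantization step is what lets the encrypted result match the clear-text model's predictions.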