
The Environmental Impact of ML Inference

Blog post from Seldon

Post Details
Company
Seldon
Date Published
Author
Alex Buckalew
Word Count
810
Language
English
Hacker News Points
-
Summary

The environmental impact of machine learning (ML) inference is a growing concern, given the substantial compute resources it requires and the global carbon emissions that result. While most research has focused on the energy consumed during model training, it is the inference phase that accounts for the majority of resources, an estimated 70-90% of total compute usage. This imbalance between research attention and actual resource consumption points to an opportunity: reducing the environmental footprint by optimizing inference. Tools like Seldon Core aim to improve efficiency through features such as multi-model serving and auto-scaling, which can lower both infrastructure costs and carbon emissions. The post also stresses the value of businesses publishing technical reports on the energy consumed by both training and inference, since such data would be invaluable for advancing research and developing strategies to mitigate the environmental impact of ML applications.
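
To make the link between inference infrastructure and emissions concrete, here is a minimal back-of-the-envelope sketch. It is not taken from the post: the server counts, power draw, and grid carbon intensity below are all hypothetical assumptions. It illustrates why consolidating models onto fewer shared servers (the idea behind multi-model serving) and scaling replicas down when idle can reduce the energy that ultimately becomes carbon emissions.

```python
# Illustrative estimate of monthly inference emissions.
# All figures below are hypothetical assumptions, not numbers from the post.

AVG_SERVER_POWER_KW = 0.35    # assumed average draw of one inference server (kW)
GRID_CO2_PER_KWH = 0.4        # assumed grid carbon intensity (kg CO2e per kWh)
HOURS_PER_MONTH = 730


def monthly_emissions_kg(num_servers: int) -> float:
    """Convert the energy of always-on servers (kWh) into kg CO2e."""
    energy_kwh = num_servers * AVG_SERVER_POWER_KW * HOURS_PER_MONTH
    return energy_kwh * GRID_CO2_PER_KWH


# Baseline: one dedicated server per model, 20 models in total.
baseline = monthly_emissions_kg(num_servers=20)

# Multi-model serving: the same 20 models consolidated onto 4 shared servers.
consolidated = monthly_emissions_kg(num_servers=4)

print(f"Baseline:     {baseline:,.0f} kg CO2e/month")
print(f"Consolidated: {consolidated:,.0f} kg CO2e/month")
print(f"Reduction:    {1 - consolidated / baseline:.0%}")
```

Under these assumed numbers the consolidated deployment cuts emissions in direct proportion to the servers removed; the same arithmetic applies when auto-scaling reduces replica counts during low-traffic periods.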