Behind the math: how we built the annotation savings calculator
Blog post from Voxel51
The article introduces an Annotation Savings Estimator designed to compare the cost and efficiency of human labeling against auto-labeling, particularly with the Verified Auto Labeling system. It walks through the estimator's inputs, such as the number of images, task type, and complexity, and explains how each factor drives the cost of both human and automated labeling. Using benchmark data sources and models, the estimator computes labeling cost and time, with worked examples illustrating potential savings. The article also notes that human annotation carries hidden costs, such as onboarding and quality assurance, which can substantially increase total spend. For large-scale projects, the estimator suggests auto-labeling can reduce costs by up to 100,000x. The tool aims to give users a clear picture of the potential return on investment from automating data annotation.
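The estimator's exact rates and formulas are not reproduced in this summary, so the sketch below is purely illustrative: it shows the general shape of such a calculation (per-label human costs inflated by an overhead factor for onboarding and QA, versus per-image compute costs for auto-labeling). Every function name, rate, and factor here is an assumption, not the published methodology.

```python
def estimate_savings(num_images,
                     labels_per_image=5,          # assumed annotation density
                     human_cost_per_label=0.08,   # assumed $/label
                     human_secs_per_label=30.0,   # assumed seconds per label
                     overhead_factor=1.4,         # assumed QA/onboarding markup
                     auto_cost_per_image=0.001,   # assumed compute $/image
                     auto_secs_per_image=0.05):   # assumed inference seconds/image
    """Hypothetical sketch: return (human_cost, auto_cost, cost_ratio, hours_saved)."""
    total_labels = num_images * labels_per_image
    # Human cost scales with label count, inflated by hidden overhead.
    human_cost = total_labels * human_cost_per_label * overhead_factor
    # Auto-labeling cost scales with image count (batch inference).
    auto_cost = num_images * auto_cost_per_image
    human_hours = total_labels * human_secs_per_label / 3600
    auto_hours = num_images * auto_secs_per_image / 3600
    return human_cost, auto_cost, human_cost / auto_cost, human_hours - auto_hours


# Example: a 100,000-image detection project under the assumed rates.
human_cost, auto_cost, ratio, hours_saved = estimate_savings(100_000)
```

With these placeholder rates the cost ratio is in the hundreds; the much larger ratios the article cites would follow from different (real-world) rate assumptions and project scales.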