Company:
Date Published:
Author: Chainlink Labs Research
Word count: 1410
Language: English
Hacker News points: None

Summary

The sixth post in a series by the Chainlink Labs Research Team, authored by Chenkai Weng, examines the efficient generation of Vector Oblivious Linear Evaluation (VOLE) correlations for commitments, using recent techniques based on the Learning Parity with Noise (LPN) assumption. The post presents a protocol built from single-point VOLE (SPVOLE) and multi-point VOLE (MPVOLE) sub-protocols, which sharply reduce the communication overhead of traditional oblivious-transfer extension. The core idea is to generate a sparse VOLE correlation cheaply and then apply LPN to expand it into a large set of pseudorandom correlations, at a communication cost sublinear in the vector length N. Because the expansion is a local linear map applied by both parties, a small number of initial correlations yields a large-scale pseudorandom set without sacrificing cryptographic security. The post also reviews the historical development of VOLE protocols over various fields and describes a bootstrapping process that further reduces the need for repeated communication-intensive setup steps.
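The expansion step described above relies on the fact that a VOLE correlation w = v + u·Δ is preserved under any public linear map, since both parties can apply the map locally. Below is a minimal Python sketch of this idea over a toy prime field; the parameter sizes and the uniformly random public matrix H are illustrative assumptions (real protocols use much larger parameters and a structured LPN matrix for fast multiplication), not the post's actual construction.

```python
import random

p = 2**61 - 1        # toy prime field (assumption, for illustration)
N, n, t = 16, 64, 4  # output length, intermediate length, noise weight (toy sizes)

rng = random.Random(0)

# Public expansion matrix H (N x n), known to both parties.
# In practice this is a structured LPN matrix; uniform here for simplicity.
H = [[rng.randrange(p) for _ in range(n)] for _ in range(N)]

# Sparse VOLE correlation: the prover holds (u, w), the verifier holds
# (delta, v), with w[i] = v[i] + u[i]*delta mod p and u being t-sparse.
# This is the kind of output the SPVOLE/MPVOLE sub-protocols produce.
delta = rng.randrange(p)
u = [0] * n
for idx in rng.sample(range(n), t):
    u[idx] = rng.randrange(1, p)
v = [rng.randrange(p) for _ in range(n)]
w = [(v[i] + u[i] * delta) % p for i in range(n)]

def matvec(M, x):
    """Matrix-vector product over the toy field."""
    return [sum(row[i] * x[i] for i in range(len(x))) % p for row in M]

# Both parties expand locally -- no communication needed for this step.
u2 = matvec(H, u)   # prover: under LPN, H*u is pseudorandom
w2 = matvec(H, w)   # prover
v2 = matvec(H, v)   # verifier

# The linear map preserves the correlation: H*w = H*v + (H*u)*delta.
assert all(w2[i] == (v2[i] + u2[i] * delta) % p for i in range(N))
```

The point of the sketch is the last assertion: after the local expansion, the parties hold N fresh correlations derived from only t nonzero entries, which is why the overall communication can be sublinear in N.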