Company
Date Published
Author
Yusuf Ishola
Word count
1289
Language
English
Hacker News points
None

Summary

Chain-of-Draft (CoD) prompting is a new technique designed to enhance the efficiency of Large Language Models (LLMs) by reducing computational costs without compromising reasoning quality. Developed by researchers at Zoom Communications, CoD simplifies complex reasoning tasks by encouraging LLMs to generate concise reasoning steps, typically five words or fewer per step, mirroring the way humans jot brief notes. This approach contrasts with the more verbose Chain-of-Thought (CoT) prompting. CoD achieves accuracy comparable to CoT across various tasks, such as math problem-solving and symbolic reasoning, while cutting token usage by up to 92% and reducing latency. Ideal for high-volume, cost-sensitive applications, CoD excels in few-shot settings but may struggle in zero-shot scenarios and with smaller models. Despite these limitations, CoD offers practical value for production applications, striking a balance between reasoning efficiency and quality, and can be combined with other prompting techniques to further enhance performance.
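
To make the contrast with Chain-of-Thought concrete, here is a minimal sketch of what a CoD-style request might look like in practice. It assumes the OpenAI Python SDK and a generic chat model; the instruction wording paraphrases the technique's core idea (short draft steps of roughly five words each) and is not the authors' verbatim prompt.

```python
# Minimal sketch of Chain-of-Draft prompting, assuming the OpenAI Python SDK.
# The model name and the exact instruction wording are illustrative assumptions,
# not the paper's verbatim prompt.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

COD_SYSTEM_PROMPT = (
    "Think step by step, but keep only a minimal draft of each thinking step, "
    "five words at most. Return the final answer after '####'."
)

response = client.chat.completions.create(
    model="gpt-4o",  # hypothetical choice; any capable chat model should work
    messages=[
        {"role": "system", "content": COD_SYSTEM_PROMPT},
        {
            "role": "user",
            "content": (
                "Jason had 20 lollipops. He gave Denny some. "
                "Now he has 12. How many did he give Denny?"
            ),
        },
    ],
)

print(response.choices[0].message.content)
# Expected shape of the output (terse drafts, then the answer):
# 20 - x = 12; x = 20 - 12 = 8. #### 8
```

Compared with a CoT prompt ("think step by step" with full sentences of reasoning), the only change is the instruction to keep each step to a few words, which is what drives the token and latency savings described above.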