Several autonomous AI developers were assessed on their ability to make a specific improvement to Replay’s devtools: indicating which CSS selectors are important. OpenHands, Copilot Workspace, Devin, and Amazon Q were tested, and detailed instructions about data flow significantly improved their performance; OpenHands did best when given such annotations. A Replay-based analysis was then developed to annotate the source code automatically with data-flow comments, which let OpenHands complete the task reliably from a much simpler prompt. The goal of this approach is to streamline development workflows by generating detailed data-flow information automatically, reducing the need for exhaustive task specifications and making AI-driven improvements more effective. The results suggest that data-flow annotations are an essential tool for AIs, improving their ability to perform complex tasks by supplying information they cannot infer on their own.
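
As a rough illustration of what such an automatically generated annotation could look like, the sketch below shows a data-flow comment placed above a devtools-style function. The comment format, component names, and the `markImportantSelectors` function are hypothetical and are not taken from Replay's codebase; they only illustrate the kind of "where this value comes from and where it goes" information the analysis would record from a recording.

```typescript
// Hypothetical sketch of a data-flow annotation inserted by an automated
// Replay-based analysis. Everything here is illustrative, not real Replay code.

interface RuleInfo {
  selector: string;  // e.g. ".theme-dark .rule-header"
  matched: boolean;  // did this selector match the inspected element?
}

/**
 * DATA FLOW (generated from a recording; hypothetical format):
 * - `rules` originates in the CSS rules fetched for the inspected element
 *   and reaches this function through the rules-view container component.
 * - The return value is rendered by the row component that decides which
 *   selectors are visually indicated as important.
 */
function markImportantSelectors(rules: RuleInfo[]): RuleInfo[] {
  // Put matched selectors first so the important ones are easy to indicate.
  return [...rules].sort((a, b) => Number(b.matched) - Number(a.matched));
}
```

Comments like this give the AI the runtime context it would otherwise have to be told in a long task specification, which is why a much shorter prompt becomes sufficient.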