The ClickPipes team, part of ClickHouse, is developing high-performance connectors that move data from a variety of sources, including Delta Lake, into ClickHouse. The team has already built Change Data Capture (CDC) connectors for databases such as Postgres, MySQL, and MongoDB, and is now turning to CDC from data lakes, starting with Delta Lake. A reference implementation of this pipeline is available as open source and uses Python to move data from Delta Lake to ClickHouse.

Delta Lake provides a transactional storage layer on top of object storage and handles large data volumes well, while ClickHouse is optimized for fast analytical queries. Combining the two enables efficient data replication and near-real-time access. The pipeline uses Delta Lake's Change Data Feed (CDF) to capture row-level changes and ClickHouse's ReplacingMergeTree table engine to model those changes.

The current implementation is not production-ready, but it lays the groundwork for a robust CDC pipeline, with plans to integrate it into ClickPipes in the future. It demonstrates the potential for real-time analytics built on Delta Lake's CDF and ClickHouse's capabilities, although challenges remain, such as handling schema evolution and supporting delete operations.
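To make the CDF-to-ReplacingMergeTree flow concrete, here is a minimal Python sketch, not the reference implementation itself: it reads a range of Delta commit versions from the Change Data Feed with the `deltalake` package and inserts the surviving rows into ClickHouse via `clickhouse-connect`. The table URI, column names, and version numbers are illustrative assumptions.

```python
"""Hedged sketch: replicate one slice of a Delta table's Change Data Feed
into a ClickHouse ReplacingMergeTree table. Not production-ready."""
import clickhouse_connect
import pyarrow as pa
import pyarrow.compute as pc
from deltalake import DeltaTable

# ReplacingMergeTree deduplicates rows sharing the same ORDER BY key during
# background merges, keeping the row with the highest version column; the
# Delta commit version is carried along for that purpose.
DDL = """
CREATE TABLE IF NOT EXISTS events
(
    id              UInt64,
    payload         String,
    _commit_version UInt64
)
ENGINE = ReplacingMergeTree(_commit_version)
ORDER BY id
"""

client = clickhouse_connect.get_client(host="localhost")
client.command(DDL)

# Read the Change Data Feed for a range of commit versions. load_cdf returns
# an Arrow stream whose rows carry _change_type / _commit_version metadata.
dt = DeltaTable("s3://my-bucket/events")  # hypothetical table location
changes = dt.load_cdf(starting_version=10, ending_version=12).read_all()

# Keep inserts and the post-image of updates; deletes need separate handling
# (one of the open challenges mentioned above) and are skipped here.
keep = pc.is_in(
    changes.column("_change_type"),
    value_set=pa.array(["insert", "update_postimage"]),
)
rows = changes.filter(keep).select(["id", "payload", "_commit_version"])

client.insert_arrow("events", rows)
```

Because ReplacingMergeTree deduplicates only at merge time, queries that need the latest state of each row would typically use `FINAL` or an `argMax`-style aggregation; the sketch above only covers the ingest side.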