While SingleStore and Hadoop are both data stores, they fill different roles in the data processing and analytics stack. The Hadoop Distributed File System (HDFS) is optimized for high-throughput sequential reads and writes, which makes it well suited to batch processing but a poor fit for low-latency queries. Meanwhile, newer execution frameworks are challenging MapReduce as a business's batch processing interface of choice.

A number of SingleStore customers have implemented systems using the Lambda Architecture, a common design pattern for stream-based workloads in which recent data requires fast updates and analytics. Using SingleStore as the real-time path and HDFS as the historical path has been a winning combination for many companies: this architecture enables fast real-time analytics on large datasets while maintaining long-term history on cheaper storage. For example, Comcast uses SingleStore and Hadoop together to proactively diagnose potential issues from real-time intelligence and deliver the best possible video experience. SingleStore enables this by providing lightning-fast analytics on changing datasets and making the overall analytics infrastructure more performant.
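To make the pattern concrete: in a Lambda Architecture, a serving-layer query merges a precomputed batch view (derived from the historical path) with a real-time view covering data that arrived since the last batch run. The sketch below is a minimal illustration, with in-memory dictionaries standing in for the HDFS-derived batch view and the SingleStore speed-layer table; the names and data are hypothetical.

```python
# Hypothetical stand-ins for the two Lambda Architecture paths:
# in a real deployment, batch_view would be recomputed periodically
# from data in HDFS, and speed_view would be a SingleStore table
# receiving streaming updates in real time.
batch_view = {          # precomputed daily event counts (historical path)
    "2024-01-01": 120,
    "2024-01-02": 95,
}
speed_view = {          # events that arrived since the last batch run
    "2024-01-03": 7,
}

def total_events():
    """Serving-layer query: merge the batch view with the real-time view."""
    return sum(batch_view.values()) + sum(speed_view.values())

print(total_events())  # 222
```

The key property is that the batch layer can be slow and thorough while the speed layer stays small and fast; queries see a complete, up-to-date answer by combining the two.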