Many legacy database systems are not equipped to handle the high-velocity, high-volume data workloads that modern applications demand, driven by near-ubiquitous connectivity. To support such workloads, a database system must offer real-time ingestion and processing, subsecond response times, anomaly detection as events occur, and the ability to generate reports over changing datasets. Companies are already meeting these requirements with in-memory solutions; examples include Pinterest's real-time analytics pipeline and Novus's portfolio management platform, both of which give analysts instant answers when they query their datasets. As more data comes online, organizations will need systems that can rapidly ingest data while simultaneously making it accessible for analysis, which makes building real-time data pipelines on in-memory architectures all the more important.
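To make that last requirement concrete, here is a minimal, hypothetical Python sketch (not tied to any of the products mentioned above): one thread continuously ingests events into an in-memory store while the main thread repeatedly runs an aggregate report over the still-changing dataset, with no batch ETL step in between. The event schema, rates, and the simple list-plus-lock store are illustrative assumptions, not a real pipeline.

```python
import random
import threading
import time

events = []                    # in-memory event store (illustrative stand-in)
lock = threading.Lock()        # guards concurrent reads and writes
stop = threading.Event()

def ingest():
    """Simulate a high-velocity event stream writing into memory."""
    while not stop.is_set():
        with lock:
            events.append({"ts": time.time(), "value": random.random()})
        time.sleep(0.001)      # roughly 1,000 events per second

writer = threading.Thread(target=ingest, daemon=True)
writer.start()

# Query while ingestion continues: each report reflects whatever
# has arrived so far, not a stale batch snapshot.
for _ in range(5):
    time.sleep(0.5)
    with lock:
        snapshot = list(events)            # point-in-time view
    count = len(snapshot)
    avg = sum(e["value"] for e in snapshot) / count if count else 0.0
    print(f"events so far: {count:>5}  running average: {avg:.3f}")

stop.set()
writer.join(timeout=1)
```

A production system would replace the list and lock with a distributed in-memory database, but the shape of the workload is the same: concurrent writes and subsecond analytical reads over the same data.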