The convergence of AI and data streaming - Part 1: The coming brick walls
Blog post from Redpanda
At the AI-by-the-Bay Conference in Oakland, the author discussed the convergence of artificial intelligence (AI) and real-time data streaming, highlighting the challenges the AI industry faces as it evolves. The presentation examined the impact of the transformer model, which has dramatically accelerated AI development but remains predominantly batch-trained, a constraint that imposes systemic limitations. The "d20 test" was introduced as a metaphorical gauge of AI's current capabilities, illustrating how difficult a seemingly basic task, accurately drawing a d20 die, remains for today's models. The author argued that AI systems must transition from public data to vast private data reservoirs to overcome the shrinking supply of ethically sourced training data. The discussion also touched on the inefficiencies of large-scale AI models, which are costly and energy-intensive, and on the need for real-time training and adaptive strategies to clear these hurdles. This sets the stage for further exploration of adaptive strategies and the role of data streaming in advancing enterprise AI architectures.