How Apple made the F1 movie trailer literally shake things up
Blog post from Mux
While watching a trailer for the upcoming F1 movie starring Brad Pitt on his iPhone, the author, an avid Formula 1 fan, noticed an unexpected feature: the phone emitted haptic feedback synchronized with on-screen action such as engine roars and tire screeches.

Intrigued, he dug into how Apple achieved this effect and found that Apple uses a custom extension of the HTTP Live Streaming (HLS) protocol: haptic feedback data is referenced from the video manifest via the #EXT-X-SESSION-DATA tag, pointing to Apple Haptic and Audio Pattern (.ahap) files.

His attempts to replicate Apple's implementation in personal projects were unsuccessful, likely due to platform restrictions or custom app logic, but he suggests that developers could define their own haptic data formats and attach them to HLS streams using the same mechanism. Despite those hurdles, the exploration highlights Apple's clever reuse of existing technology to enhance the multimedia experience, and points to the potential for wider adoption of haptic feedback in video streaming.
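To make the mechanism concrete, a multivariant HLS playlist carrying a haptics reference via #EXT-X-SESSION-DATA might look roughly like the sketch below. The DATA-ID value, file names, and stream attributes are hypothetical illustrations, not Apple's actual identifiers; per the HLS spec, a session-data URI must resolve to JSON, which an .ahap file happens to be.

```
#EXTM3U
#EXT-X-VERSION:7
# Hypothetical session-data entry pointing at a haptic pattern file.
# The DATA-ID shown here is illustrative, not Apple's real identifier.
#EXT-X-SESSION-DATA:DATA-ID="com.example.hls.haptics",URI="trailer-haptics.ahap"
#EXT-X-STREAM-INF:BANDWIDTH=6000000,RESOLUTION=1920x1080,CODECS="avc1.640028,mp4a.40.2"
video_1080p.m3u8
```

The referenced .ahap file itself is JSON describing timed haptic events. A minimal example in the documented AHAP shape (a single sustained rumble, with assumed timing and intensity values) could be:

```
{
  "Version": 1.0,
  "Pattern": [
    {
      "Event": {
        "Time": 0.0,
        "EventType": "HapticContinuous",
        "EventDuration": 0.6,
        "EventParameters": [
          { "ParameterID": "HapticIntensity", "ParameterValue": 1.0 },
          { "ParameterID": "HapticSharpness", "ParameterValue": 0.4 }
        ]
      }
    }
  ]
}
```

On Apple platforms, a pattern like this can be played with Core Haptics (e.g. `CHHapticEngine`'s `playPattern(from:)`); how Apple's player wires the session-data entry to playback is the part the author could not reproduce.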