Smart glasses equipped with front-facing cameras and onboard AI analyzed real-time visual data to guide visually impaired runners during the London Marathon. The glasses convert environmental imagery into audio cues delivered through built-in speakers. The application is niche, but Fortune highlights how the hardware integrates vision models for immediate accessibility, providing a concrete use case for real-time edge inference.
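The Fortune piece does not describe the implementation, but the pipeline it outlines — camera frames in, spoken cues out — can be sketched at a high level. The Python below is a minimal, hypothetical illustration, not the product's actual stack: `detect_obstacles` is a placeholder for whatever on-device vision model the glasses run, and OpenCV and `pyttsx3` are assumed stand-ins for frame capture and speech output.

```python
# Hypothetical sketch of a camera-to-audio guidance loop.
import time

import cv2        # pip install opencv-python
import pyttsx3    # pip install pyttsx3 (offline text-to-speech)


def detect_obstacles(frame):
    """Placeholder for an on-device vision model (e.g. a quantized
    object detector running edge inference). Returns a list of
    (label, horizontal_offset) pairs, where offset ranges from
    -1.0 (far left) to 1.0 (far right)."""
    # A real implementation would run model inference on `frame` here.
    return []


def offset_to_cue(label, offset):
    """Map a detection to a short spoken cue."""
    if offset < -0.33:
        return f"{label} on your left"
    if offset > 0.33:
        return f"{label} on your right"
    return f"{label} ahead"


def main():
    cap = cv2.VideoCapture(0)   # front-facing camera
    tts = pyttsx3.init()        # built-in speaker output
    last_spoken = 0.0
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            detections = detect_obstacles(frame)
            # Throttle cues so speech does not lag behind the scene.
            if detections and time.time() - last_spoken > 2.0:
                label, offset = detections[0]
                tts.say(offset_to_cue(label, offset))
                tts.runAndWait()
                last_spoken = time.time()
    finally:
        cap.release()


if __name__ == "__main__":
    main()
```

The throttling step reflects a practical constraint of this kind of system: inference can run per-frame, but audio cues must be rationed so speech stays usable for a runner in motion.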