Smart glasses fitted with front-facing cameras and onboard AI analyze visual input and convert the surroundings into real-time audio cues, allowing visually impaired athletes to navigate the London Marathon independently. The hardware translates complex environmental data into spoken guidance, demonstrating a practical, narrow use case for multimodal AI in assistive technology.
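The actual pipeline inside such glasses is proprietary, but the core idea of turning detections into spoken cues can be sketched. The snippet below is a hypothetical illustration, not the product's code: the `Detection` record, `bearing_to_clock`, and `spoken_cues` names are all invented for this example, which prioritizes the nearest hazards and phrases them as clock-face directions.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One object the vision model found (hypothetical schema)."""
    label: str       # e.g. "curb", "runner"
    bearing: float   # degrees; 0 = straight ahead, negative = left
    distance: float  # metres

def bearing_to_clock(bearing: float) -> int:
    """Map a bearing in degrees to a clock-face direction (10 to 2 o'clock)."""
    # Clamp to the forward field of view, then quantise to 30-degree steps.
    clamped = max(-60.0, min(60.0, bearing))
    return {-60: 10, -30: 11, 0: 12, 30: 1, 60: 2}[round(clamped / 30) * 30]

def spoken_cues(detections: list[Detection], max_cues: int = 2) -> list[str]:
    """Turn raw detections into short spoken cues, nearest hazards first."""
    nearest = sorted(detections, key=lambda d: d.distance)[:max_cues]
    return [
        f"{d.label}, {d.distance:.0f} metres, {bearing_to_clock(d.bearing)} o'clock"
        for d in nearest
    ]

# Example: two detections become two prioritized spoken cues.
cues = spoken_cues([
    Detection("runner", bearing=-20.0, distance=8.0),
    Detection("curb", bearing=5.0, distance=3.0),
])
for cue in cues:
    print(cue)
```

Limiting output to a couple of cues per update reflects a real constraint of audio interfaces: a runner can only absorb so much speech at pace, so the system must rank, not merely describe.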