Frontier stage
Keynote

Smartphone as a guide dog: AI for blind mobility

In person
  • Date: 9 July 2025
  • Timeframe: 15:25–15:45 CEST
  • Duration: 20 minutes

For decades, blind individuals have relied on guide dogs for navigation. With recent advances in AI, could a virtual guide dog on a smartphone offer a more scalable, affordable, and accessible alternative?

We introduce AI Guide Dog (AIGD), a smartphone-based navigation assistant developed through a collaboration between Carnegie Mellon University and industry researchers. AIGD is designed to help blind users navigate diverse environments confidently and independently, just like sighted people. Inspired by self-driving technology, AIGD integrates smartphone camera video, GPS, and map data to predict future movements and walkable paths, generating real-time, natural-language navigation instructions delivered audibly.
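
The abstract does not expose implementation details, but the fusion step it describes can be sketched in a few lines. The following is a minimal, hypothetical illustration: the names, data fields, and thresholds are assumptions for exposition, not the actual AIGD code.

```python
from dataclasses import dataclass

@dataclass
class SensorSnapshot:
    walkable_heading_deg: float  # heading of the walkable path, from the camera model
    gps_heading_deg: float       # user's current heading, from GPS
    next_turn: str               # upcoming turn from map data, e.g. "left" or "right"
    distance_to_turn_m: float    # distance to that turn, in meters

def instruction(s: SensorSnapshot) -> str:
    """Fuse camera, GPS, and map signals into one short spoken instruction."""
    drift = s.walkable_heading_deg - s.gps_heading_deg
    if abs(drift) > 15:  # user is drifting off the walkable path
        side = "right" if drift > 0 else "left"
        return f"Veer slightly {side}."
    if s.distance_to_turn_m < 10:  # an intersection from map data is close
        return f"Turn {s.next_turn} in {s.distance_to_turn_m:.0f} meters."
    return "Continue straight."

print(instruction(SensorSnapshot(4.0, 2.0, "left", 8.0)))
# -> Turn left in 8 meters.
```

In this toy version, the camera-derived heading corrects short-range drift while map data supplies turn-by-turn context; the real system presumably makes this trade with learned models rather than fixed thresholds.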

Leveraging multimodal generative AI models and on-device edge processing, AIGD maintains high performance while preserving user privacy. This talk explores our approach to balancing accuracy, explainability, and efficiency on resource-constrained mobile devices. We also share insights from our research into blind mobility patterns, which inform a user-centered design that lowers both financial and technical barriers to adoption.
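
The privacy claim rests on keeping raw camera frames on the phone. A hedged sketch of that boundary follows; run_local_model is a hypothetical stand-in for the on-device multimodal model, which the abstract does not name.

```python
from typing import Iterator

def run_local_model(frame: bytes) -> float:
    """Placeholder for on-device inference; returns a walkable-path heading."""
    return 0.0  # hypothetical output

def navigate(frames: Iterator[bytes]) -> Iterator[dict]:
    """Consume raw frames locally; emit only derived, non-identifying state."""
    for frame in frames:
        heading = run_local_model(frame)  # raw pixels never leave the device
        yield {"heading_deg": heading}    # only coarse, derived data escapes
```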

Beyond its technical innovations, AIGD offers data-driven insights into broader societal challenges, highlighting accessibility gaps in urban infrastructure. Our work emphasizes the need for cross-sector collaboration among AI researchers, urban planners, regulators, and advocacy groups to establish shared standards for AI-powered mobility and to optimize urban environments for accessibility and scalable deployment of assistive AI.
