Ground Truth at Super Bowl LX: Teaching Robots to Play in a Digital World
This past week, the energy in the Bay Area was electric. As the Super Bowl returned to our backyard for the first time in a decade, San Francisco transformed into a global stage for more than just football. For those of us at Niantic Spatial, it was a homecoming that offered the perfect opportunity to showcase how we are connecting the bits to the atoms in the city we call home.
To celebrate the occasion, we set out to build a "moonshot" experience in under two weeks. We wanted to explore a simple but profound question: What happens when a physical robot and a digital AR experience share the exact same reality?
Imagine looking through your phone at a busy event space and seeing a digital field goal post that is not just floating in space, but precisely tracked to a roaming robot. As the robot moves across the room, the digital goal moves with it in real time. In this shared AR space, the digital and physical become indistinguishable parts of the same experience. Multiple users can flick digital footballs through the moving goal to score, while others can spawn digital balloons for the physical robot to "pop" by driving over them.
This was not just a fun experience; it was a display of our Visual Positioning System (VPS) acting as a bridge between the digital and physical realms. Using our app, Scaniverse, we quickly created a high-fidelity digital twin of the venue, which became the foundational shared language for every device in the room. Because the robot and the phones were all localized against that same digital twin, they all had exactly the same understanding of where they were in space. That is the core story of the digital twin: it allows a machine to interact with a digital object as if it were a physical one.
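To make that idea concrete, here is a minimal sketch of the coordinate math behind a shared map frame. It does not use Niantic's actual VPS APIs; the `pose` helper and the localization results (`map_from_phone`, `map_from_robot`) are hypothetical stand-ins for what a visual positioning query against the same digital twin might return. The point is simply that once every device knows its own transform into the common frame, an anchor reported by the robot can be placed correctly on any phone's screen.

```python
# Minimal sketch (not Niantic's API): how a shared "digital twin" frame lets a
# phone and a robot agree on where things are. Each device's localization
# result is modeled as a 4x4 rigid transform from its private tracking frame
# into the common map frame.

import numpy as np

def pose(rotation_deg: float, x: float, y: float, z: float = 0.0) -> np.ndarray:
    """Build a 4x4 rigid transform: a rotation about the z-axis plus a translation."""
    theta = np.radians(rotation_deg)
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
    T[:3, 3] = [x, y, z]
    return T

# Hypothetical localization results: in a real system these would come from
# visually localizing each device against the same scanned venue.
map_from_phone = pose(90.0, 2.0, 1.0)    # phone's tracking origin, in map coordinates
map_from_robot = pose(-30.0, 8.0, 5.0)   # robot's odometry origin, in map coordinates

# The robot reports the goal post anchored 0.5 m in front of it, in its own frame.
goal_in_robot_frame = np.array([0.5, 0.0, 0.0, 1.0])

# Lift the anchor into the shared map frame, then into the phone's frame so the
# phone can render the goal exactly where the physical robot is carrying it.
goal_in_map = map_from_robot @ goal_in_robot_frame
goal_in_phone_frame = np.linalg.inv(map_from_phone) @ goal_in_map

print("Goal in shared map frame:", goal_in_map[:3].round(2))
print("Goal as seen by the phone:", goal_in_phone_frame[:3].round(2))
```

In this toy setup the map frame plays the role the scanned venue played at the event: every participant expresses its poses relative to the same reference, so a moving anchor stays consistent across devices without them ever talking to each other directly.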
A Shared Map for a Smarter Future
Our geospatial AI technology is what made this project possible so quickly. The same digital twin also allowed us to bring our Project Jade AI companion, Dot, into the space to guide attendees to different areas of the event. It shows that once a space is understood by machines, the possibilities for entertainment and utility are limited only by our imagination.
This demo was the perfect stage for the next frontier of our work: AI that understands the physical world. Artificial intelligence is already making a massive impact on our digital lives, from the way we work to the way we create. At Niantic Spatial, however, we believe there is significant, untapped potential that is realized when AI moves beyond the screen and into our physical reality. Our mission is to move past the idea of AI as a digital-only tool by giving it a sense of place.
Connecting the Dots in the Bay Area
We were honored to showcase this work alongside the Bay Area Host Committee (BAHC) at their Innovation Summit and Tech Playground. While the action on the field in Santa Clara was the main event, the BAHC spent the week ensuring the surrounding celebration was a showcase of the region's lasting legacy of innovation.
By using our Large Geospatial Model (LGM) to reconstruct and understand environments, we are building the infrastructure for a new era: a future in which robots, smartphones, AI glasses, and digital content all share the same ground truth as humans.
–Asim Ahmed, Head of Product Marketing at Niantic Spatial