AIR
Augmented Indexed Reality, better known as AIR, is the first social platform in Mobile AR that lets friends connect through objects in the real world. We believe we can give people the power to connect with friends in new, fun ways through objects — bringing the magic and meaning of digital and social connections to the world around you, today.
-
2021
-
The app was built with React Native and Firebase. For the MVP, we used QR codes as a stand-in for object detection, since Detectron was not yet reliable enough for production.
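To illustrate the QR stand-in, here is a minimal sketch of how a scanned QR payload could map to an object record. The payload format, `ObjectTag` shape, and function name are all hypothetical, not the shipped schema:

```typescript
// Hypothetical: a scanned QR payload stands in for a detected object.
type ObjectTag = {
  objectId: string;
  ownerId: string;
};

// Parse a QR payload of the (assumed) form "air://object/<objectId>?owner=<ownerId>".
function parseObjectTag(payload: string): ObjectTag | null {
  const match = payload.match(/^air:\/\/object\/([\w-]+)\?owner=([\w-]+)$/);
  if (!match) return null;
  return { objectId: match[1], ownerId: match[2] };
}
```

Swapping the scanner's payload parser for a real detector later only changes how `ObjectTag` is produced, not how the rest of the app consumes it.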
-
See UXR deck below
A month after joining Meta, I began exploring new technologies coming out of Meta AI Research (FAIR). Among them was Detectron2Go, a mobile version of Detectron that detects objects, masks subjects, and provides bounding boxes. I quickly became obsessed with its promise and teaser video, and began prototyping experiences I found interesting. The first was a new AR interface for Instagram, shown here.
Early on at Meta, I prototyped mostly in AR Studio. After hitting a few roadblocks due to my inexperience, I decided to speed things up and focus on the experience design to test desirability among my peers. I got a few people interested and willing to help, so we participated in the 2020 Meta summer hackathon. A few days later, after taking the initial design prototype and building an MVP from it, we won the hackathon. The prize was an invitation to join NPE (New Product Experimentation) for 6 months to develop our product. During my time at NPE, I was a Founder, Product Designer, and Engineer, with the help of a PM and a contract engineer and designer (Connected.io, hired by NPE to advise).
Below you will find the V0 AIR prototype.
This is the product MVP. After many design revisions, we arrived at something a bit different from our initial assumptions about people's relationships to objects.
AIR MVP Problem and Opportunity
How can we create a fun, lightweight experience that makes people feel excited to share everyday objects, while being able to easily navigate and organize their content?
After a few revisions, we ran a UXR study with people in the US, ages 15-54. We used SplitMetrics, conducted usability interviews, and ran a painted-door test for desirability metrics. Please refer to this deck for more information.
We learned that the Gen Z audience wanted a richer digital relationship with the objects around them, especially those with meaning beyond the superficial. We also learned that many people were interested in digital property, such as NFTs. Toward the end of the 6 months, we wrapped up our learnings and pivoted to prototype Omni, a link-in-bio NFT gallery. Check it out here.