At Google I/O ’17, Google Lens was first introduced to the world. Coming on the heels of VR and AR, I saw the potential: for AR to work well, it needed to truly understand the world. In the fall of 2017, I became the UX lead for Lens. By the spring of 2018, I managed a team of UX designers, UX engineers, and UX researchers working to scale Lens as a platform. At I/O 2018, Lens worked as promised, and this time it was designed to scale and integrate.



What is Lens?

 
 
 

Scan and translate text

Translate text in real time, look up words, add events to your calendar, call a number, and more. Or just copy and paste to save some time.

See what’s popular on menus

Wondering what to order at a restaurant? Look up dishes and see what’s popular, right on the menu, with photos and reviews from Google Maps.

Unlock print media

Instead of jumping to a website with a recipe, just point Lens at a recipe in a magazine to see it come alive.

Find a look you like

See an outfit that caught your eye? Or a chair that's perfect for your living room? Get inspired by similar clothes, furniture, and home decor—without having to describe what you're looking for in a search box.

Explore nearby places

Learn more about popular landmarks. See ratings, hours of operation, historical facts and more.

Identify plants and animals

Find out what plant is in your friend's apartment, or what kind of dog you saw in the park.

Availability

A core focus of my team was to enable Lens to become the Google Search for what you see. We did this by offering Lens as a standalone app and by making it accessible from Google Assistant, Google Photos, and Google Search.

Google Lens app

Use as a standalone experience

Google Assistant

Integrated with voice actions or manual control


Google Camera & Photos

Search directly from the camera app or from any photo in your library


Google Search

Open Lens from the Google Search bar or from any image within Google Image Search

Press

The second announcement, at Google I/O ’18, was very well received as a solid foundation for Lens, and Google I/O ’19 showed how Lens grew to solve more real-world problems.

 
 

WWD: “Describing fashion can be a daunting task, even for retailers. For consumers, the challenge can be a major obstacle. ‘You see a polka-dotted dress, with a V-neck, a bias cut, flowing fabric, etc....how do you type all that into a search box?’ she said. With Style Match, it’s about using cameras instead of keyboards to search for things.”

CNET: “Google Lens is pushing that territory further. This year, Lens feels like even more of a push toward the future of computer vision, where apps "know" what the camera is "seeing." It's what's needed before anyone makes a killer pair of smartglasses.”

Engadget: “During our demo, a Google rep used Lens to snap a picture of a book's inside cover, and highlighted the text in the photo. Just like you'd expect with words in a messaging or editing app, a menu popped up to let you copy your selection to paste elsewhere. I was especially impressed that the words in the picture didn't have to be perfectly aligned for Lens to recognize them.”