Google Lens can now tell you about works from local artists
Ever ponder the meaning of a portrait or painting hanging on the wall of your favorite coffee shop? Thanks to Google Lens, Google’s AI-powered search and computer vision tool, you’re now only a few taps away from learning everything there is to know about it. Wescover, a San Francisco-based startup building a catalog of local artists and their work, today announced that it’s teaming up with Google to supply Google Lens with information about art and design installations.

As of this week, Google Lens can match designs and artwork in Wescover’s database with artists and stories in places like hotels, restaurants, and city streets. It’s as easy as launching Lens and pointing your phone’s camera at the art in question.

Wescover curated an initial map of art pieces throughout San Francisco that Google Lens recognizes, and the company says that it’ll continue to publish more content to Lens globally in the near future. To date, it claims to have documented more than 50,000 images of unique art and design from 6,000 local brands and independent artists.

“We’re excited to see the difference our content is making. Each exact match gives creators the credit they deserve and enables consumers with trust to find what they’re looking for,” said Wescover CEO Rachely Esman in a statement. “If you love the chair at the Ace Hotel, it’ll take hard work to find the exact same product. While another blue chair may look similar, it won’t have the same quality, materials, or story.”

Wescover’s integration follows on the heels of a Google Lens feature that highlights top meals at a restaurant, along with dish ratings and reviews. In other news, Lens recently gained the ability to split a bill or calculate a tip after a meal; read signs and other text for people who can’t read or don’t understand the printed language; and overlay videos atop real-world publications like Bon Appetit in augmented reality.

Google Lens began as a feature exclusive to Pixel smartphones, but it’s evolved dramatically in recent years. It quickly spread to Google Photos, and Lens now ships natively in flagship smartphones from companies like Sony and LG.

The growing list of things Lens can recognize spans over 1 billion products from Google Shopping, including furniture, clothing, books, movies, music albums, and video games. (That’s in addition to landmarks, points of interest, notable buildings, Wi-Fi network names and passwords, flowers, pets, beverages, and celebrities.) Additionally, Lens can read words in signage and prompt you to take action on them, and it can surface stylistically similar outfits or home decor. Perhaps most useful of all, it’s able to extract phone numbers, dates, and addresses from business cards and add them to your contacts list.

Back in May at its I/O keynote, Google took the wraps off of a real-time analysis mode for Lens that superimposes recognition dots over actionable elements in the live camera feed — a feature that launched first on the Pixel 3 and 3 XL. Lens recently came to Google image searches on the web, and more recently, Google brought Lens to iOS through the Google app and launched a redesigned experience across Android and iOS.


