Google Lens’ video search tool begins to roll out

DATE POSTED:October 1, 2024

The Google Lens video feature, first announced at I/O 2024, now appears to be rolling out to more Android users.

The search tool enables people to discover content based on a photograph or a video. When you search with a video, for example, it’ll bring up related content to help you identify objects, discover similar products, and find visually similar images.

Users can record a short video and ask the tool questions about it. For those in supported regions, AI-generated responses will follow too.

On September 30, Mishaal Rahman, technology reporter and co-host of the Android Faithful podcast, took to X to say: “Google announced this feature back at I/O in May but it’s started to roll out for some users in the last few days. LMK if you have it!”

You can now send a video to Google to ask questions about it!

If you open Google Lens on Android and hold down the shutter button, it'll record a short video that you can ask a question about.

If you're in a region where AI overviews are enabled, then you'll get an AI-generated… pic.twitter.com/qeGWy6u1TM

— Mishaal Rahman (@MishaalRahman) September 30, 2024

How does Google Lens work?

The feature is built on a set of vision-based computing capabilities that enable it to understand the content of an image or video.

It does so by comparing objects in the image or video to other images online, then ranking those results by their similarity and relevance to the original picture.

It then uses its understanding of the objects in the picture or video to find other relevant results from the web, drawing on additional signals such as the words, language, and metadata on the image’s host site.
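To make the ranking idea concrete, here is a minimal, purely illustrative sketch of how a visual search engine might blend a visual-similarity score with relevance signals from each result’s host page. The weights, field names, and scoring scheme are assumptions for illustration, not Google’s actual implementation:

```python
# Hypothetical sketch: rank candidate matches by combining a precomputed
# visual-similarity score with a simple text-relevance signal taken from
# the hosting page's title and metadata. All weights are illustrative.

def rank_candidates(query_terms, candidates, visual_weight=0.7):
    """Return candidates sorted by a blended visual + metadata score."""
    ranked = []
    for c in candidates:
        # Visual similarity is assumed to be precomputed in [0, 1].
        visual = c["visual_similarity"]
        # Text relevance: fraction of query terms found in page metadata.
        text = " ".join([c.get("title", ""), c.get("metadata", "")]).lower()
        matches = sum(1 for t in query_terms if t.lower() in text)
        relevance = matches / len(query_terms) if query_terms else 0.0
        score = visual_weight * visual + (1 - visual_weight) * relevance
        ranked.append((score, c))
    # Highest combined score first.
    ranked.sort(key=lambda pair: pair[0], reverse=True)
    return [c for _, c in ranked]
```

The point of the blend is the one the article describes: a result that both looks like the query and comes from a page whose words and metadata match it outranks a result that scores well on only one signal.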

On the Google Lens website, the team explains the analysis that takes place when a search begins: “When analyzing an image, Lens often generates several possible results and ranks the probable relevance of each result.

“Lens may sometimes narrow these possibilities to a single result. Let’s say that Lens is looking at a dog that it identifies as probably 95% German shepherd and 5% corgi. In this case, Lens might only show the result for a German shepherd, which Lens has judged to be most visually similar.”
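The single-result behaviour described in that quote can be sketched as a simple dominance check: if the top label’s probability is high enough, show only that label; otherwise show every plausible candidate. The threshold value below is an assumption for illustration:

```python
# Illustrative sketch of the narrowing behaviour quoted above: if one
# label's probability dominates, return it alone; otherwise return all
# labels ordered by probability. The 0.9 threshold is an assumption.

def narrow_results(label_probs, dominance_threshold=0.9):
    """Return the labels to display, given a {label: probability} dict."""
    top_label, top_p = max(label_probs.items(), key=lambda kv: kv[1])
    if top_p >= dominance_threshold:
        return [top_label]  # confident: show a single result
    # Ambiguous: show all candidates, most probable first.
    return sorted(label_probs, key=label_probs.get, reverse=True)

# The article's example: 95% German shepherd, 5% corgi.
print(narrow_results({"German shepherd": 0.95, "corgi": 0.05}))
# ['German shepherd']
```

With a less lopsided distribution, say 60/40, the same function would return both labels instead of collapsing to one.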

If Lens is confident in its understanding of the object in the picture or video you’re interested in, it will return Search results related to it.

If you allow the tool to use your location, it also uses that information to return better results – especially when it comes to places and landmarks.

This means that if you’re in Paris and you take a video of a landmark, curious to know the details, Google Lens will use the location data to help determine where you are to provide more accurate information.

Featured Image: AI-generated via Ideogram

The post Google Lens’ video search tool begins to roll out appeared first on ReadWrite.