Google is testing additional text searches in Google Lens

Google is letting Google Lens users add a text query to an image search to get better results. The feature, called multisearch, currently only works for English-speaking users in the US and is still in beta.

Users of the Google app for Android and iOS can enter an additional text query after taking a photo through Google Lens to get a more specific search result. Google itself gives the example of a photographed piece of clothing for which a different color can then be selected afterwards. The new results show clothing similar to the original image, but in the color that was typed in.

According to Google, this broader, contextual search is made possible by its new search algorithm, the Multitask Unified Model, or MUM. The American search giant says this algorithm is several times more powerful than BERT, the algorithm that was used for searches until recently. “MUM understands information from images and text, and soon also from video and audio,” Google said when the algorithm was announced in 2021. “The algorithm has also been trained in 75 languages and can automatically carry a search in one language over into results in another.”


Multisearch in Google Lens