Google’s Multisearch Feature Goes Global, Combining Images and Text for Better Search Results

This article was initialized by a human, created by AI, updated by a human, copy edited by AI, final copy edited by a human, and posted by a human. For human benefit. Enjoy!

Google has announced that its Multisearch feature is now available globally on mobile devices, wherever Google Lens is available. Multisearch lets users search with text and images at the same time, giving them a more flexible way to describe what they’re looking for. A variation called “Multisearch near me,” aimed at surfacing results from local businesses, will roll out globally in the coming months, along with Multisearch for the web and a new Lens feature for Android users.

Multisearch is powered by an AI technology called Multitask Unified Model (MUM), which can understand information across formats, including text, photos, and videos, and then draw connections between topics, concepts, and ideas. Google first integrated MUM into its Google Lens visual search feature, letting users add text to a visual search query.
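
Google hasn’t released MUM publicly, but the general idea of scoring an image against text descriptions can be sketched with the open-source CLIP model, used here purely as a stand-in for Google’s technology. A minimal example, assuming the Hugging Face transformers library and a hypothetical local photo:

```python
# Illustrative only: CLIP is an open-source multimodal model used here as a
# stand-in for MUM, which Google has not released publicly.
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

image = Image.open("bike_part.jpg")  # hypothetical local photo
candidates = ["a bicycle derailleur", "a brake lever", "a coffee table"]

# Encode the image and all candidate texts in a single pass.
inputs = processor(text=candidates, images=image,
                   return_tensors="pt", padding=True)
outputs = model(**inputs)

# logits_per_image holds image-text similarity scores; softmax turns them
# into a probability-like ranking over the candidate descriptions.
probs = outputs.logits_per_image.softmax(dim=1)
for text, p in zip(candidates, probs[0].tolist()):
    print(f"{text}: {p:.2f}")
```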

Prabhakar Raghavan, Google’s SVP in charge of Search, Assistant, Geo, Ads, Commerce and Payments products, said at a press event in Paris, “We redefined what we mean to search by introducing Lens. We’ve since brought Lens directly to the search bar and we continue to bring new capabilities like shopping and step-by-step homework help.”

Multisearch lets users express queries that are difficult to put into text alone, and helps Google handle searches it previously couldn’t. For instance, a user can pull up a photo of a shirt they like in Google Search and ask Lens where to find the same pattern on a different type of apparel, like a skirt or socks. Or they can point their phone at a broken part on their bike and type a query like “how to fix” into Google Search.
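
Google hasn’t said how Multisearch fuses the two inputs, but a common baseline for this kind of “image plus text modifier” retrieval is to embed both and average them before searching a catalog. The sketch below is a hypothetical version built on CLIP embeddings; the model choice, file names, and catalog are illustrative assumptions, not Google’s implementation:

```python
# A naive composed image+text retrieval baseline -- not Google's method.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

def embed_image(path: str) -> torch.Tensor:
    inputs = processor(images=Image.open(path), return_tensors="pt")
    feats = model.get_image_features(**inputs)
    return feats / feats.norm(dim=-1, keepdim=True)  # unit-normalize

def embed_text(query: str) -> torch.Tensor:
    inputs = processor(text=[query], return_tensors="pt", padding=True)
    feats = model.get_text_features(**inputs)
    return feats / feats.norm(dim=-1, keepdim=True)

# Fuse the shirt photo with the text modifier by simple averaging.
query_vec = (embed_image("shirt.jpg") +
             embed_text("the same pattern on a skirt")) / 2
query_vec = query_vec / query_vec.norm(dim=-1, keepdim=True)

# Hypothetical product catalog: pre-computed image embeddings.
catalog_paths = ["skirt_a.jpg", "skirt_b.jpg", "socks_a.jpg"]
catalog = torch.cat([embed_image(p) for p in catalog_paths])

# Rank catalog items by cosine similarity to the fused query.
scores = (query_vec @ catalog.T).squeeze(0)
for path, score in sorted(zip(catalog_paths, scores.tolist()),
                          key=lambda x: -x[1]):
    print(f"{path}: {score:.3f}")
```

Averaging embeddings is about the simplest possible fusion; production systems typically train a dedicated composed-retrieval model rather than relying on this trick.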

Multisearch is most useful for shopping searches, where users can find clothing they like in different colors or styles. They can also take a photo of a piece of furniture, like a dining set, to find items that match it, such as a coffee table. According to Google, Multisearch users can narrow and refine their results by brand, color, and visual attributes.
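
Multisearch exposes no public API for these refinements, but as a toy sketch of the idea, attribute-based narrowing over structured result metadata might look like this (the records and fields are hypothetical):

```python
# Toy illustration of refining search results by structured attributes.
# The result records and fields are hypothetical; Multisearch has no public API.
results = [
    {"item": "Oak coffee table", "brand": "Acme", "color": "natural", "score": 0.92},
    {"item": "Walnut coffee table", "brand": "Birchwood", "color": "brown", "score": 0.88},
    {"item": "Oak side table", "brand": "Acme", "color": "brown", "score": 0.81},
]

def refine(results, **attrs):
    """Keep only results whose metadata matches every requested attribute."""
    return [r for r in results if all(r.get(k) == v for k, v in attrs.items())]

print(refine(results, brand="Acme", color="brown"))
# -> [{'item': 'Oak side table', 'brand': 'Acme', 'color': 'brown', 'score': 0.81}]
```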

Google initially launched Multisearch in the US in October 2022 and expanded it to India that December. Now, Google says Multisearch is available on mobile devices in all languages and countries where Lens is available. The “Multisearch near me” variant will reach the same languages and countries over the next few months, and Multisearch will also expand beyond mobile devices with support for the web in the coming months.

Google teased an upcoming Google Lens feature called “search your screen,” noting that Android users would soon be able to search what they see in photos and videos across apps and websites on their phone while remaining in the app or on the website. Google shared a new milestone for Google Lens, too, noting that people now use the technology more than 10 billion times per month.

Google’s Multisearch feature is a significant step toward a more flexible way of searching, and its global availability on mobile devices brings combined image-and-text queries to users worldwide. The “Multisearch near me” rollout and the new Lens feature for Android will further streamline the experience, making it easier and faster to find what you’re looking for.

As AI continues to evolve, we can expect Google to introduce even more advanced features that make searching more accurate and efficient. The company’s continued investment in search is a testament to the importance of AI in the tech industry.

Google is not the only tech giant investing heavily in AI. Companies like Microsoft, Amazon, and Apple are also making significant strides in developing and deploying AI-powered technologies.

The increasing use of AI in everyday life is both exciting and concerning. On the one hand, AI has the potential to revolutionize many industries, from healthcare to finance, by improving efficiency and accuracy. On the other hand, there are concerns about the ethical implications of AI, such as privacy concerns and the potential for AI to be used for malicious purposes.

As we move forward, it is crucial that we continue to monitor the development of AI and its impact on society. While we cannot predict the future, it is clear that AI is here to stay and will continue to play an increasingly important role in our lives.

In short, Multisearch, “Multisearch near me,” and the upcoming Lens features mark a meaningful evolution in search technology. With global availability on mobile devices, users everywhere can now combine images and text in a single query, and as AI advances, still more capable search features are likely to follow.

Interested in the latest updates on AI technology? Follow us on Facebook and join our group (Link to Group) to leave your comments and share your thoughts on this exciting topic!