Using AI (artificial intelligence) in the Google search engine makes it easier to find what we are looking for on the web. It allows us to find information more precisely and quickly, using increasingly natural language.
In a recent event related to its search engine, the North American company announced a series of new features that will arrive thanks to the use of the MUM model.
What is MUM?
MUM is the so-called “Multitask Unified Model”, an AI model that allows a better understanding of information. It will make possible a series of changes to the search engine.
The aim is to achieve more detailed and contextualized answers, and a richer, deeper search experience.
Google tries to offer us other ways to search, in addition to the classic one of writing our search in the search engine’s text box.
It also wants to boost the use of its Google Lens image recognition software, which it will include in the Google app on iOS and in the Chrome browser on laptops.
Main News And Changes
Multimodal Search
MUM makes a new way of searching visually possible: we can now include text in visual searches.
Using an image, we can ask questions related to what we are seeing.
Context, therefore, will become much more important in new searches.
Google gives us a couple of examples so that we understand it better.
In the first example, we see how, using the image of a shirt with a printed pattern, we can search for socks with the same pattern in order to buy a matching pair.
Secondly, by taking a photo of a broken part of a bicycle, we can search directly for how to fix it, and the search engine will recognize the specific part that needs repairing and how to do it.
This will make it easy to find the exact moment in a video where the repair is explained.
Redesigned Search Page
The use of advances in AI will also serve this redesign. The goal is for searches to become more natural and intuitive.
On the results page of the search that we carry out, there will be a new section called “Things to know”.
Depending on what we are looking for, we will be shown topics related to the main search that may interest us.
Below that, the “Refine this search” and “Broaden this search” sections will appear, offering related search terms or suggestions that help us find what we are looking for.
Better Results in Video
It is already possible to get search results for key moments within a video, such as the decisive goal in a football match or the steps of a recipe.
But Google wants to go one step further and aims to identify related topics in a video, offering links to go deeper into the subject and continue learning.
The AI can recommend content you may be interested in, and it does not necessarily have to be a topic expressly mentioned in the video.
Google Shopping
This shopping tool will also receive new features, although some will initially only reach the United States.
Among other things, it will make it easier for us to find garments of a similar style to the ones we have searched for, or in different colours.
It will also tell us whether a product is in stock in nearby stores.
According to Google, all of these new features are intended to help people find the answers they’re looking for and inspire further questions during the search process.