MUM: Google introduces a new AI milestone for understanding information

If you asked Google, "Is there any work left to be done?", the short answer would be an emphatic "Yes!" There are endless challenges, and Google keeps working on them so that we get better search results.

Today the Google team is sharing how it can address one of them: often you have to type out many searches to get the answer you need. Say you are planning a mountain hike. Google can help you with this today, but it can take a lot of searching: you need to know each mountain's height, the average fall temperature, the difficulty of the trails, the right gear to use, and more. Only after several searches will you finally get the answer you need.

But if you were talking to a hiking expert, you could ask a single question: "What should I do differently to prepare?" You would get a thoughtful answer that takes your needs into account and points you to the many things to consider. This example is not unique: people bring all sorts of tasks that require multiple searches to Google each day.

Today's search engines aren't quite sophisticated enough to answer the way an expert would. But with a new technology called Multitask Unified Model, or MUM, Google is getting closer to helping you with these types of complex needs, so you'll need fewer searches to get things done in the future.


MUM has the potential to transform how Google helps you with complex tasks where there isn't a simple answer. Like BERT, MUM is built on a Transformer architecture, but it is 1,000 times more powerful. MUM not only understands language but also generates it. It is trained across 75 different languages and many different tasks at once, allowing it to develop a more comprehensive understanding of information and world knowledge than previous models. And MUM is multimodal, so it understands information across text and images, and it can expand to more modalities like video and audio in the future.
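MUM itself is not publicly available, so as a rough intuition for the "multimodal" idea described above, here is a minimal toy sketch: text and image inputs are each mapped into the same fixed-size vector space and then fused into one representation a single model could reason over. The hash-based "encoders" and all function names here are hypothetical stand-ins, not Google's actual implementation.

```python
# Toy sketch of multimodal fusion: both modalities land in one shared
# vector space. Real systems use learned Transformer encoders; here a
# hash is a deterministic stand-in so the example is self-contained.
import hashlib

EMBED_DIM = 8  # toy embedding size


def _hash_to_vector(data: bytes) -> list:
    """Deterministically map raw bytes to a small float vector in [0, 1]."""
    digest = hashlib.sha256(data).digest()
    return [b / 255.0 for b in digest[:EMBED_DIM]]


def encode_text(text: str) -> list:
    """Stand-in for a text encoder (e.g. a Transformer like BERT)."""
    return _hash_to_vector(text.lower().encode("utf-8"))


def encode_image(pixels: bytes) -> list:
    """Stand-in for an image encoder (e.g. a vision Transformer)."""
    return _hash_to_vector(pixels)


def fuse(text_vec: list, image_vec: list) -> list:
    """Fuse the two modalities by element-wise averaging."""
    return [(t + i) / 2.0 for t, i in zip(text_vec, image_vec)]


if __name__ == "__main__":
    query = encode_text("what should I do differently to prepare?")
    photo = encode_image(b"\x00\x7f\xff" * 4)  # fake 'pixel' bytes
    joint = fuse(query, photo)
    print(len(joint))
```

The key design point the sketch illustrates is that once both modalities share one vector space, downstream components don't need to know which modality a signal came from.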

MUM could also surface helpful subtopics for deeper exploration, like the top-rated gear or the best training exercises, with pointers to helpful articles, videos, and images from across the web.

Language can be a significant barrier to accessing information, but MUM has the potential to break down these boundaries by transferring knowledge across languages. It can learn from sources that weren't written in the language of your search and help bring that information to you.
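One way to picture this cross-lingual transfer is a shared embedding space: if documents in different languages are embedded by topic rather than by language, an English query can match a document written in Japanese. The sketch below is a toy illustration with hand-assigned, hypothetical vectors, not Google's method.

```python
# Toy cross-lingual retrieval: documents in any language live in one
# shared, language-agnostic embedding space, and search is just a
# nearest-neighbor lookup by cosine similarity.
import math

# Hypothetical shared-space embeddings for two documents.
DOCS = {
    "ja: 富士山 秋 ハイキング ガイド": [0.9, 0.1, 0.0],   # fall hiking guide, in Japanese
    "en: chocolate cake recipe": [0.0, 0.2, 0.95],
}


def cosine(a: list, b: list) -> float:
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)


def search(query_vec: list) -> str:
    """Return the document closest to the query, regardless of language."""
    return max(DOCS, key=lambda d: cosine(query_vec, DOCS[d]))


if __name__ == "__main__":
    # Pretend this embeds the English query "how to prepare for a fall hike".
    query = [0.89, 0.11, 0.01]
    print(search(query))  # matches the Japanese hiking guide
```

Because the query vector sits near the hiking topic, the Japanese guide is returned even though it shares no words with the English query.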

Because MUM is multimodal, it can simultaneously understand information across formats such as webpages and images. And whenever Google applies AI to make the world's information more accessible, it aims to do so responsibly: every improvement to Google Search undergoes a rigorous evaluation process to ensure it delivers more relevant, helpful results.

Just as Google has carefully tested the many applications of BERT launched since 2019, MUM will go through the same process as these models are applied in Search. In particular, Google will look for patterns that may indicate machine learning bias, to avoid introducing bias into its systems. Google will also apply learnings from its latest research on reducing the carbon footprint of training systems like MUM, to make sure Search runs as efficiently as possible.

Google will bring MUM-powered features and improvements to its products in the coming months and years. Though these are still the early days of exploring MUM, it marks an important milestone toward a future where Google can understand the many ways people communicate and interpret information.
