So many of our searches now run through Google that we no longer “search” or “look up” something we need to know – we “Google” it. For a company this important, Google doesn’t overhaul its core product very often, so when it does, it’s noteworthy. To mark its twentieth anniversary, Google has announced some major changes to search.

Ben Gomes, Google’s VP of Search, News, and Assistant, announced the changes in a blog post. He wrote about growing up in India with a thirst for information, moving to the United States to study computer science, and eventually joining Google as one of its first employees.

Seemingly to beat back allegations that its search results are politically biased, Gomes emphasized Google’s focus on making information available with the user in mind: giving people the most relevant, highest-quality information, taking an algorithmic approach, and testing every change the company makes.

Alongside this milestone, Google is announcing new changes, with Gomes explaining that the company is looking to “make information more accessible everywhere.”

This “next chapter” is driven by three shifts in how Google approaches search: from one-off answers to ongoing journeys, from typed queries to information that surfaces without an explicit query, and from text to more visual ways of finding information.

And how will Google make all these changes? With artificial intelligence. Google plans to improve how it understands language, in ways that are possible in 2018 and beyond but weren’t twenty years ago.

The thought is that neural networks can now help them search with concepts instead of words. Gomes explains, “neural embeddings, an approach developed in the field of neural networks, allows us to transform words to fuzzier representations of the underlying concepts and then match the concepts in the query with the concepts in the document.”

Google calls this “neural matching,” a process that lets it surface the most relevant results for a question even when the exact words of the query don’t appear in them.
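To make the idea concrete, here is a minimal toy sketch of concept-level matching. The vocabulary and vector values below are invented purely for illustration and have nothing to do with Google’s actual models: each word maps to a small dense vector, a query and a document are each averaged into a rough “concept” vector, and cosine similarity ranks documents – so a document that uses different words than the query can still score highest if its concepts are close.

```python
import math

# Toy "embeddings": hand-made 3-dimensional vectors standing in for the
# dense representations a real neural model would learn from data.
EMBEDDINGS = {
    "tv":      [0.90, 0.10, 0.00],
    "screen":  [0.85, 0.15, 0.05],
    "strange": [0.10, 0.90, 0.10],
    "odd":     [0.15, 0.85, 0.10],
    "recipe":  [0.00, 0.05, 0.95],
}

def embed(text):
    """Average the word vectors to get a rough 'concept' vector for the text."""
    vectors = [EMBEDDINGS[w] for w in text.split() if w in EMBEDDINGS]
    return [sum(dim) / len(vectors) for dim in zip(*vectors)]

def cosine(a, b):
    """Cosine similarity: 1.0 means the concept vectors point the same way."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

query = embed("tv strange")
doc_match = embed("screen odd")  # shares no words with the query
doc_other = embed("recipe")

# The document about an "odd screen" outranks the recipe page even though
# it contains none of the query's words.
print(cosine(query, doc_match) > cosine(query, doc_other))  # True
```

Real systems learn these vectors from huge corpora and compare whole queries and documents rather than averaged word lists, but the ranking step – match by vector similarity instead of exact words – works the same way.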

The first change is that when you visit the homepage on a mobile device, you’ll see a new version of the Google feed, one that was first introduced last year in the search app. Google calls it “Discover.” It will surface content and relevant articles based on what Google already knows about you.

Another change is “Activity cards,” information cards and images shown at the top of your search results. They will display previous searches and activity when Google thinks they’re useful to the current search, and you can remove the cards that don’t apply.

“Featured Video” cards will show up in a carousel when your search is related to a video. Frequently searched subjects will get “enhanced topics,” additional tabs of information on a common theme. There will also be an option, called “Collections,” to save your searches for possible use in the future.

Since every other tech service is moving into “Stories,” Google will too. Just like Instagram and Snapchat, it will offer AMP stories with curated text, video, and images.

Image search will give more weight to the quality of the pages that images appear on, and it is being integrated with Lens, Google’s tool that lets you identify what you’re looking at by using your own photo.

But will all this really change public opinion of Google? The company gets plenty of flak for gathering information on users, especially around search, and these changes seem likely only to make that worse.

Is that what you think too? Are you worried this will only make Google soak up more information on you? Let us know what you think about Google’s new changes in their search function.
