10 Google Algorithm Updates That Shook the SEO World & How to Be Ready for the Next One

Google announced the release of another core update today. That makes this the perfect time to remember the biggest algorithm updates of the past. Let’s look at what we learned from them and discuss how a site can prepare to come out a winner in the next major quake.

Google Search changes constantly. A recent post revealed that Google made more than 3,200 changes in one year alone — counting new search features as well as updates to the algorithm that ranks search results. My personal view is that Google should also publish how many of those changes fixed its own mistakes versus how many were genuinely useful updates.

Most Google algorithm changes go unnoticed. But every now and again a significant update hits, and it can greatly affect your site’s search traffic, for better or for worse.

These bigger algo shifts shake up the first search engine results page (SERP). Their seismic nature causes many people to treat algorithm changes like big, troubling, cataclysmic events. As soon as they see something change, they panic. They leap into action, trying to “get around it” or “beat the algorithm.”

The truth is … none of us is ever going to beat the algorithm.

But beating the algorithm is not the goal, as I’ll explain later. (Keep reading for my story about a bear …)

What CAN we in SEO do?

Educate ourselves about the important Google updates. Execute based on what the search engines reward. And put a plan in place to be ready for the next significant algorithm shift.

Here’s what I’ll cover in this mega post:

  • The 10 most significant algorithm updates so far.
  • How to watch for algorithm changes.
  • What to do after an update.
  • Why you don’t need to beat the algorithm.

The 10 Most Significant Google Algorithm Changes
This is my “top 10” list based on impact in recent years.

These Google algorithm updates have shaped the face of search and SEO. Here’s what you need to know about the most important Google updates in recent years, arranged in reverse chronological order, most recent first.

(For a complete list of known updates, Search Engine Journal’s History of Google Algorithm Updates is a good resource.)

June 2019 Core Update & Site Diversity Update
Launch date: June 3–8, 2019
Google pre-announced the June 2019 Core Update. This update seemed to focus on correcting the way the algorithm evaluated links. It put more weight on the domain authority and trustworthiness of a site’s incoming links.

In my opinion, the “trustworthiness” component of the E-A-T factors rose in importance. And the fix spilled beyond SEO to include sentiment detection, which places responsibility for ranking loss on marketing in general. No wonder Google has said that there is nothing sites can do specifically to “fix” their rankings after a core update runs. Mess with user sentiment, link sentiment and quality, and prepare to die.

The search engine simultaneously released a separate site diversity update. Google stated the goal with this one — to make search results more diverse. For most queries, users no longer see a single domain appear more than twice in the top-ranked results. This greater diversity in results makes it harder for a site to “saturate” a topic organically.
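The two-results-per-domain behavior can be illustrated with a toy post-ranking filter. This is only a sketch of the concept, not Google's implementation; the function name and data below are hypothetical:

```python
from collections import Counter

def diversify(results, max_per_domain=2):
    """Keep ranked order, but cap how many results any one domain can occupy."""
    seen = Counter()
    diversified = []
    for result in results:  # results are (domain, url) tuples, best first
        domain = result[0]
        if seen[domain] < max_per_domain:
            seen[domain] += 1
            diversified.append(result)
    return diversified

ranked = [("a.com", "/1"), ("a.com", "/2"), ("a.com", "/3"), ("b.com", "/1")]
print(diversify(ranked))
# → [('a.com', '/1'), ('a.com', '/2'), ('b.com', '/1')]
```

The third `a.com` result is dropped even though it outranked `b.com`, which is exactly why "saturating" a topic with one domain became harder.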

Hazards: Sites with too low a percentage of backlinks from trusted websites may have dropped. Those that used to have many pages ranking for a single query also lost some SERP real estate.

Winners: Google said that the June 2019 Core Update benefited pages that were previously under-rewarded. (Aren’t we all?)


March 2019 Core Update
Launch date: March 12, 2019
The March 2019 Core Update seemed to fine-tune broad core algorithm changes made in past updates (such as the August 2018 “Medic” update). To prevent naming confusion, Google tweeted the update’s name the same day it was released.

Research found that sites whose search traffic increased experienced higher rankings site-wide, with no increase in the number of keywords ranked.

On the flip side, the update hurt many sites that provide a poor user experience (due to excessive pop-ups, poor navigation, over-optimization and so forth).

And of course, trust was a significant signal. Sites dealing with YMYL (Your Money or Your Life) topics took ranking hits if they were participating in untrusted activities.

Hazards: Google’s March 2019 Core Update behaved like an evolution of previous algorithms, negatively affecting sites that were over-optimized.

Winners: Reputable sites ranking for queries related to particularly sensitive topics, like those about health questions, benefited. According to SearchMetrics, “websites with a strong brand profile and a broad topical focus” also advanced. See, trust matters.


Fred Update
Launch date: March 8, 2017
Fred was the name sarcastically given to Google’s quality updates. The search engine rolled out a string of these starting in 2015. Fred was wide-reaching and focused on quality across a variety of factors, not just a single process.

One specific target was sites using aggressive monetization tactics that provided a bad user experience. Poor-quality links were also targeted by Fred. Link to an untrusted site and it lowers your trust … and rankings. (Though big brands often seem to get a pass.)

Hazards: Sites with thin, affiliate-heavy, or ad-centered content were targeted.

Winners: Many websites featuring quality, high-value content with minimal ads benefited.


Possum Update
Launch date: September 1, 2016
Possum is an unconfirmed yet widely documented Google algorithm update that rolled out in 2016. This update targeted the local pack. Unlike confirmed updates, the details of the Possum update are a bit less clear. SEOs believe it sought to bring more variety into local SERPs and to help increase the visibility of local companies.

With this update, Google seemed to change the way it filtered duplicate listings. Before Possum, Google omitted results as duplicates if they shared the same phone number or website. With Possum, Google filtered listings that shared the same address. This created intense competition between neighboring or location-sharing businesses.
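The shift in how duplicates were keyed can be sketched in a few lines. Remember that Possum is unconfirmed, so this only illustrates the SEO community's theory; all field names and data here are hypothetical:

```python
def filter_duplicates(listings, key_field):
    """Keep only the first listing seen for each value of key_field."""
    seen = set()
    kept = []
    for listing in listings:
        key = listing[key_field]
        if key not in seen:
            seen.add(key)
            kept.append(listing)
    return kept

listings = [
    {"name": "Dentist A", "phone": "555-0100", "address": "1 Main St"},
    {"name": "Dentist B", "phone": "555-0199", "address": "1 Main St"},  # shares an office
]

pre_possum = filter_duplicates(listings, "phone")     # both survive (different phones)
post_possum = filter_duplicates(listings, "address")  # only one survives (shared address)
```

Keying the filter on address is what suddenly pitted businesses in the same building against each other.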

Hazards: Businesses with intense competition in their target location could be pushed out of the local results.

Winners: Businesses outside physical city limits had a chance to appear in local listings.


RankBrain
Announcement date: October 26, 2015
RankBrain was a significant update in that it introduced live AI into Google’s SERPs. RankBrain has been hailed as one of Google’s most important ranking components, although it is not truly a ranking signal.

Instead, it’s a processing mechanism that uses machine learning to help Google understand queries better. RankBrain is particularly useful for the 15% of never-before-searched queries that Google faces daily.

According to Google’s Gary Illyes on Reddit,

RankBrain is a PR-sexy machine learning ranking component that uses historical search data to predict what would a user most likely click on for a previously unseen query. It is a really cool piece of engineering that saved our butts countless times whenever traditional algos were like, e.g. “oh look a “not” in the query string! let’s ignore the hell out of it!”, but it’s generally just relying on (sometimes) months old data about what happened on the results page itself, not on the landing page. Dwell time, CTR, … those are generally made up crap. Search is much more simple than people think.

Of course, if the system changes the target, it will always look right: once results are reshaped to fit a user-intent profile, all future clicks will match that profile, because those are the only results left to click. Are all searches for a particular keyword always informational? RankBrain may think so and push ecommerce sites out of the results. Fortunately, it is often correct.
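To make the query-understanding idea concrete, here is a deliberately crude sketch of mapping a never-before-seen query onto the most similar known one. Real systems use learned representations, not raw word overlap; everything below is a toy illustration:

```python
def jaccard(a, b):
    """Word-overlap similarity between two queries (0 = nothing shared, 1 = identical)."""
    a, b = set(a.split()), set(b.split())
    return len(a & b) / len(a | b)

known_queries = ["best running shoes", "marathon training plan", "fix leaky faucet"]

def closest_known(query):
    """Map an unseen query onto the most similar previously seen query."""
    return max(known_queries, key=lambda known: jaccard(query, known))

print(closest_known("good shoes for running a marathon"))
# → best running shoes
```

Even this crude version shows the payoff: the unseen long-tail query gets treated like a query Google already knows how to rank.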

Hazards: No specific losers, although sites won’t be found relevant that have shallow content, poor UX, or unfocused subject matter.

Winners: Sites creating niche content and focusing on keyword intent have a better chance of ranking.


Mobile-Friendly Update
Launch date: April 21, 2015
This was one of the updates heard ‘round the world, dubbed Mobilegeddon. On its Webmaster Central Blog, Google said this about the update:

We’re boosting the ranking of mobile-friendly pages on mobile search results. Now searchers can more easily find high-quality and relevant results where the text is readable without tapping or zooming, tap targets are spaced appropriately, and the page avoids unplayable content or horizontal scrolling.

The update underlined mobile-friendliness as a ranking signal and laid the foundation for the way Google’s mobile-first search mechanism works today. It was fun watching ecommerce sites try to fit 700 navigation links into a mobile menu. (Side note: There are better ways to handle mobile navigation.)

Hazards: Sites without a mobile-friendly version of the page, or with poor mobile usability, suffered.

Winners: Responsive sites and pages with an existing mobile-friendly version benefited.


Pigeon Update
Launch date: July 24, 2014 (US); December 22, 2014 (other English-speaking countries)
The update dubbed Pigeon shook up the local organic results in Google Web and Map searches.

With Pigeon, Google updated local ranking signals to provide better “near me” results for users. To do this, it improved distance and location ranking parameters, and incorporated more of the ranking signals used in Google’s main web search algorithms.

Hazards: Local businesses with poor on- and off-page SEO suffered.

Winners: Local companies with accurate NAP information and other SEO factors in place gained rankings.


Hummingbird
Launch date: August 22, 2013
Hummingbird was announced in September of 2013, though it had already been active for a month at that point. It represented a “brand new engine,” a complete overhaul of Google’s ranking algorithm that was based on semantic search. Hummingbird is the name of the algo rewrite, not an added-on update.

Google needed a way to better understand the user intent behind a search query. Search terms that were similar but different, for example, often generated less-than-desirable results. Take the word “hammer” as an example. Is the searcher looking for the musician, the museum, or a tool to pound nails with?

Google’s Knowledge Graph was a first step. Released the year before Hummingbird, the Knowledge Graph mapped the relationships between different pieces of information about “entities.” It helped the search engine connect the dots and improve the logic of search results.

Hummingbird used semantic search to provide better results that matched the searcher’s intent. It helped Google understand conversational language, such as long-tail queries formed as questions. It impacted an estimated 90% of searches and introduced things like conversational language queries, voice search, and more.

Hazards: Pages with keyword stuffing or low-quality content couldn’t fool Google anymore.

Winners: Pages with natural-sounding, conversational writing and Q&A-style content benefited.


Penguin Update
Launch date: April 24, 2012
Penguin was Google’s major update of 2012. The change was also called the “webspam algorithm update.” It targeted link spam and manipulative link building practices, or “black-hat” SEO.

Before Penguin rolled out, Google paid close attention to page link volume while crawling webpages. This made it possible for low-quality pages to rank more prominently than they should have if they had a lot of incoming links.

Penguin helped with the mission to make valuable search results as visible as possible by demoting pages propped up by spammy links. Many sites cleaned up their links but stayed in Penguin jail for months, unable to regain their lost rankings until Google ran the next update.

Google made Penguin part of its real-time algorithm in September 2016, and a friendlier version emerged. Rather than demoting a site for having bad inbound links, the new Penguin tried to just do away with link spam.

Now, if a site has inbound links from known spam sites, Google just devalues (ignores) the links.
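The difference between the old demotion behavior and the newer devaluation behavior can be sketched like this (the spam list, link values, and scoring function are all hypothetical):

```python
SPAM_DOMAINS = {"spammy-links.example", "buy-pr.example"}  # hypothetical known-spam list

def link_score(backlinks):
    """Post-2016 Penguin behavior: ignore spam links instead of penalizing the site."""
    counted = [link for link in backlinks if link["domain"] not in SPAM_DOMAINS]
    return sum(link["value"] for link in counted)

backlinks = [
    {"domain": "respected-news.example", "value": 5.0},
    {"domain": "spammy-links.example", "value": 3.0},  # devalued: contributes nothing
]
print(link_score(backlinks))
# → 5.0
```

The spammy link neither helps nor hurts; the site simply keeps the value of its legitimate links. Under the pre-2016 behavior, its presence could have dragged the whole site down.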

However, if a site’s backlink profile is bad enough, Google may still apply a manual action for unnatural links to the site. Also, John Mueller said earlier this year that an algorithmic penalty can still occur if a site has “a bunch of really bad links” (h/t Marie Haynes).

Friendlier Penguin has not proven to be 100% effective. As a result, many businesses still need help cleaning up their link profile to restore lost rankings. Google has said that you should not need disavow files, yet also welcomes them. To me, that is a very clear signal that we should not rely on the algorithm alone when it comes to backlinks.
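If you do decide to submit a disavow file, it is a plain UTF-8 text file with one entry per line, either a full URL or a `domain:` rule; lines starting with `#` are comments:

```text
# Links we could not get removed manually (comment lines start with "#")
domain:spammy-links.example
http://another-bad-site.example/paid-links-page.html
```

A `domain:` rule disavows every link from that domain, while a bare URL disavows only links from that specific page.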

Hazards: Sites that had purchased links were targets, as well as those with spammy or irrelevant links, or incoming links with over-optimized anchor text.

Winners: Sites with mostly natural inbound links from relevant webpages got to rise in the SERPs.


Panda Update
Launch date: February 24, 2011
Panda was rolled out in February of 2011, aimed at placing a higher emphasis on quality content. The update reduced the amount of thin and inexpert material in the search results. The Panda filter took particular aim at content produced by so-called “content farms.”

With Panda, Google also introduced a quality classification for pages that became a ranking factor. This classification took its structure from human-generated quality ratings.

Websites that dropped in the SERPs after each iteration of Panda were forced to improve their content in order to recover. Panda was rolled into Google’s core algorithm in January 2016, which means that Panda’s ability to detect low-quality content is always on.

Hazards: Websites lost rankings if they had duplicate, plagiarized or thin content; user-generated spam; keyword stuffing.

Winners: Original, high-quality, high-relevance content often gained rankings.


How to Watch for Google Algorithm Updates
Google rarely announces its algorithm updates. And even when it does, it’s usually only after others have discovered them. (Although this may be changing, at least for their core updates.)

With so many tweaks going on daily, it is possible that even Google doesn’t anticipate which changes will prove significant enough to mention.

Often the first indication you have is your own website. If your search traffic suddenly jumps or dives, chances are good that Google made an algo update that affected your search rankings.

Where can you go for information when your online world gets rocked? Here’s what I recommend …


Have a “seismograph” in place on your website.
To detect search traffic fluctuations on your own website, you need analytics software. If you haven’t already, set up Google Analytics and verify your site in Google Search Console. They’re free, and they’re indispensable for SEO.
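As a simple “seismograph,” you could flag days where traffic deviates sharply from a trailing baseline. This is a toy sketch with an arbitrary threshold, not a substitute for proper analytics:

```python
def flag_anomalies(daily_sessions, window=7, threshold=0.3):
    """Flag days where traffic deviates more than `threshold` from the trailing mean."""
    flagged = []
    for i in range(window, len(daily_sessions)):
        baseline = sum(daily_sessions[i - window:i]) / window
        change = (daily_sessions[i] - baseline) / baseline
        if abs(change) > threshold:
            flagged.append((i, round(change, 2)))
    return flagged

# A steady week of ~1,000 sessions, then a sudden 40% drop
sessions = [1000, 980, 1010, 1020, 990, 1005, 995, 600]
print(flag_anomalies(sessions))
# → [(7, -0.4)]
```

Day 7 is flagged with a 40% drop against the trailing 7-day average, which is your cue to start checking the update trackers below.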


Watch the SERP weather reports.

Various websites and tools monitor ranking changes across categories and search markets and report on SERP volatility. Here are places you can check for early warning signs of a search ranking algorithm update:

  • MozCast
  • SEMrush Sensor
  • CognitiveSEO
  • SERPMetrics
  • Algoroo
  • Advanced Web Ranking
  • AccuRanker
  • RankRanger

Follow industry resources.
I’m always reading as an SEO. For the latest Google news, I recommend that you:

  • Read the Google Webmaster Central Blog and the official Google Search Blog. Google often uses these blogs to announce upcoming moves and explain algorithm updates after they’ve rolled out.
  • Follow Danny Sullivan, Google’s Search Liaison, on Twitter.
  • Check search-industry sites like Search Engine Land, Search Engine Roundtable and Search Engine Journal for SEO news.


What To Do After a Google Update
Think that an algorithm update has penalized your site?

Don’t panic. Remember — nobody truly understands the algorithm. Whatever you’re experiencing may or may not be due to a Google fluctuation. And Google may “fix” it tomorrow, or next week.

With this in mind, get intentional. And stay calm. Decide whether you need to act before you do.

Here’s a plan to follow after an algorithm update …

  1. Stay calm.
  2. Get into puzzle-solving mode. Do NOT react or make changes hastily. Instead, gather data. Determine whether your site was impacted by the change and not something else, such as a technical SEO issue. Or it could be that your rankings dived because your competitors moved up in the SERPs. Depending on the cause, you need to do something different in response.
  3. Learn about the update from several sources (see my suggested resources above). Find out what other SEO experts are saying and experiencing.
  4. Adjust your SEO strategy accordingly.
  5. Remember that Google’s ranking algorithms change all the time. What impacts your site today could reverse itself in a month.
  6. Change what makes sense on your website.
  7. Re-evaluate your impact.
  8. If no results have changed, now you can panic.
  9. Call us.


Last Thoughts: You Don’t Need to Beat the Algorithm
Google’s algorithm updates are constant, often unverified, and difficult to anticipate. That doesn’t mean you have to be afraid.

Don’t spend your time trying to figure out a way to beat the algorithm. You’ll waste hours chasing your tail and missing the things that truly matter, like creating a high-quality website that is worthy of ranking.

I like to tell a story to illustrate this …

Imagine you’re out camping with a friend, and a bear shows up. You both take off running, the bear in hot pursuit.

In this situation, do you have to be an Olympic runner to survive?

No — you just have to be faster than your buddy.

In the world of SEO, your mission is to be better than your competition. You don’t need to beat the bear.

So don’t let algorithm updates cause you to make knee-jerk decisions. Instead, stay informed and be strategic about how and when you respond, so you can make these decisions properly.
