When it comes to web development and SEO, keeping up with Google’s ever-shifting algorithms is essential for success.
To rank highly in relevant searches, you need to tick the right boxes and stay out of the danger zone – which means avoiding anything Google is likely to penalise, especially when it rolls out one of its sneaky algorithm updates. When these do roll out, it can be hard to know exactly what has changed, or whether you’ve been affected (and why). For the most part, websites are penalised either for sneaky, black hat SEO techniques or for a lack of quality content.
The thing is, we only hear about the major updates Google hits the ‘go’ button on. For example, in 2012 alone there were 665 updates to the ranking algorithm in total – words directly out of the mouth of the then Head of Google's Webspam Team, Matt Cutts.
In fact, each year there can be anywhere between 500 and 600 updates to the algorithm that we’re not told about. Most of these, though, are minor quality-control changes designed to keep Google running smoothly or to enhance the user experience.
What web developers and SEO techs often want to know is whether a major update will completely rewrite the criteria that determine why some sites rank higher than others. Ultimately, that would mean reshaping the exact tactics required to optimise a site – a huge task in its own right.
So, here are the major updates Google’s made over the years and how they’ve come to have an impact on your SEO results.
The major Google algorithm changes to date
Panda, February 23, 2011: The first major shot fired at low-quality and spammy content impacted a whopping 11.8 per cent of all queries at the time. You may see it called ‘Farmer’, its original name before it was renamed after the engineer who created the algorithm change. It hit the SEO industry hard, as the industry had pretty much had free rein prior to Panda: low-quality content had been ranking highly in Google searches and was becoming more and more prevalent, prompting the need for this major update.
An example of a website hit hard by Google, showing its visibility score both before and after Panda was rolled out. Source.
Page Layout Algorithm, January 19, 2012: Google’s first major update of 2012 helped shape the way we view websites to this very day. Sites had started to cram so much advertising at the top of their pages that users would have to scroll down an entire screen just to reach the content, so Google put them in its crosshairs. Any page caught stuffing absurd volumes of advertising above the fold was forced to create a better user experience.
Penguin, April 24, 2012: This update is believed to be named after the Penguin villain in the Batman comics, which is pretty cool. Penguin was an extension of the Panda update and boosted Google’s fight against spammy, low-quality websites. It was a more powerful tool against black hat link building techniques and was designed to stamp out the sites that Panda had missed.
A website hit by the Penguin algorithm update, showing the fluctuation in its visits from users. Source.
EMD (Exact Match Domain), September 28, 2012: Website developers had discovered a new trick to bamboozle the algorithm: finding key SEO phrases and using them as the domain name. This would shoot a site to the top of searches even though it was usually thin on content, letting it rank above superior websites. This update filtered out sites that used an exact match of key ranking terms as the URL.
Payday, June 2013: This was a significant update that impacted 0.3 per cent of all searches in the United States. The name ‘Payday’ was used because payday loan sites offering high-interest loans had exploded and were spamming people’s searches. It wasn’t exclusively dodgy lenders doing the spamming, though; porn, online casinos, debt consolidation sites and other shady industries were also guilty of the practice. This update targeted them all to enhance the user experience and push the dodgy sites back into the dark.
Hummingbird, August 20, 2013: While Panda and Penguin were add-ons to the existing Google architecture, Hummingbird was the first complete overhaul of the core algorithm. It moved search queries towards an intent-based model aimed at delivering more relevant results to users.
This began with the launch of the Knowledge Graph in 2012, which provided more than just URL results in searches. It was the first instance of what we see today, where a recipe search includes a popped-out popular recipe and its steps, so users can get the results they need without even clicking through.
Hummingbird introduced snippets of the essential information users were after, without them having to click through to websites at all. Source.
It also added a semantic search function, which determined the intent of the user. If you typed “chicken roast”, for example, Google would now assume that you wanted to know how to prepare a chicken roast or where you could buy one. That information would be popped out for you, enhancing the experience.
This update helped filter out irrelevant information and also provided the first stepping stone towards true voice search, which would arrive in the near future.
Pigeon, July 24, 2014: This update was designed to localise searches, rewarding local businesses that had a strong organic presence with higher rankings. It helped smaller businesses compete with the big players by grouping them together in searches, and it improved the user experience by delivering relevant business results in close geographic proximity. The update enhanced both Google Search and Google Maps to create a better experience.
RankBrain, April 2015: This was the introduction of artificial intelligence (AI) to the Google algorithm. In layman’s terms, it translated words and phrases into mathematical entities called vectors, so that words and phrases it had never seen before could be compared against known ones and a calculated guess made to improve search results. This form of machine learning is only used to “teach” the algorithm, because an AI set loose in the wild of the internet could be a very dangerous thing indeed, as well as almost impossible to fix if it broke.
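To make the vector idea a little more concrete, here is a minimal, purely illustrative Python sketch of how phrases represented as vectors can be compared, so a never-before-seen query can be matched to the closest known intent. The phrases, the tiny three-number vectors and the cosine-similarity matching are all made-up simplifications for demonstration – Google’s actual models are far larger and are not public.

```python
# Illustrative sketch only: comparing phrases as vectors, in the spirit of
# RankBrain's approach. The phrases and vectors below are invented.
import math

# Hypothetical 3-dimensional vectors for phrases the system already "knows".
known_phrases = {
    "cheap flights": [0.9, 0.1, 0.2],
    "roast chicken recipe": [0.1, 0.8, 0.7],
    "payday loans": [0.2, 0.1, 0.9],
}

def cosine_similarity(a, b):
    """Return the cosine of the angle between two vectors (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# A new, never-seen query that has already been converted to a vector by some
# embedding step (that step is the hard part, and is not shown here).
unseen_query_vector = [0.15, 0.75, 0.65]

# Make a "calculated guess": pick the known phrase with the most similar vector.
best_match = max(
    known_phrases,
    key=lambda phrase: cosine_similarity(known_phrases[phrase], unseen_query_vector),
)
print(best_match)  # -> "roast chicken recipe"
```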
Mobilegeddon, April 21, 2015: Launched at about the same time as RankBrain, this update drew a distinction between mobile-friendly websites and those that were not. It applied exclusively to searches on smart devices, where mobile-friendly sites were ranked the highest.
Google updated its algorithm to favour websites with mobile-friendly layouts and functionalities.
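As a rough illustration of what ‘mobile-friendly’ can mean in practice, the sketch below checks a page for one commonly cited signal: a responsive viewport meta tag. This is only one signal among many (tap target size, font size, content width and so on), and the check itself is a hypothetical example rather than how Google actually evaluates pages.

```python
# Illustrative only: look for a responsive viewport meta tag in an HTML page.
from html.parser import HTMLParser

class ViewportFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.has_viewport = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # e.g. <meta name="viewport" content="width=device-width, initial-scale=1">
        if tag == "meta" and attrs.get("name") == "viewport":
            self.has_viewport = True

sample_page = """
<html><head>
<meta name="viewport" content="width=device-width, initial-scale=1">
</head><body><p>Hello</p></body></html>
"""

finder = ViewportFinder()
finder.feed(sample_page)
print("Responsive viewport tag found:", finder.has_viewport)
```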
Intrusive Interstitials Update, January 10, 2017: Web developers began getting sneaky again, this time on mobile devices. They were enjoying the high rankings that Mobilegeddon gave them, but then employed tricks like pop-up ads that covered all of the content, or above-the-fold advertising designed to look like the real content below. Sites employing tricks like this were penalised in this update.
Fred, March 2017: This is a bit of a tricky one because it is not a single update to the algorithm at all. It is an in-house Google term for any quality-related algorithm update that isn’t tied to the major updates mentioned above. Google is constantly patched for quality reasons, and Fred was the name Google gave these mini updates, essentially as a bit of a joke.
What can we expect in future Google algorithm changes?
While it is important to understand the changes that have come before, peeking into the future is equally important to understand where Google might move next. The company constantly tightens up and penalises dodgy practices like keyword stuffing and spamming, and there are usually one or two fairly major updates every year.
Luckily, the search engine giant doesn’t keep you in the dark on major updates. It almost always announces its intentions months before they are rolled out and outlines what changes will be made.
Twitter is the best place to follow Google, as it is almost always the first place announcements are made.
In the meantime, here are some bits and pieces to avoid, in order to keep your website out of any penalisation trouble:
- Buying links: It can be tempting to buy links that will make your site more appealing to Google. The company cottoned on to this practice a long time ago and will take manual action against your site if you pay for low-quality, non-earned links. Keep them organic.
- Link swaps: It may sound harmless to swap URL links with another site; it’s just a case of ‘you scratch my back and I’ll scratch yours’. The problem is that some sites make a habit of this and can link off to hundreds, even thousands of other pages. This is where Google will raise a red flag, so if your site and another are going to link to each other, make sure the other website is legit and not farming links.
Don’t mislead or bombard your visitors with advertising, either, and avoid stuffing your content with masses of keywords at the expense of quality. The formula remains simple: regular, quality content that is valuable to the user will get you in the good books.
Need assistance in building a quality site for your business? Contact the team at Website Design now on 1300 367 009.