If you own a website that has seen a dramatic loss of organic search traffic from Google, you have likely spent a fair amount of time trying to figure out which Google penalty or algorithm update led to the downturn. And once you figure out which one it is, you still have to decipher what you can do to fix it.
In this post, we’ll share the basics of Google manual actions and algorithm changes, along with the easiest ways to see what has affected your website.
Google Manual Actions
Google manual actions, also referred to as manual penalties, are the easiest to figure out but not necessarily to recover from. These are the ones that you would receive a notification about in Google Webmaster Tools (GWT).
If you’ve never set up GWT for your website, you might want to do so now so you can be alerted to such penalties in the future. Be sure to set up GWT using the same Google account you use for Google Analytics and Google+ for additional benefits beyond checking for manual penalties.
To check whether your website has a manual action, go to Search Traffic > Manual Actions in GWT. Ideally, you'll see the message "No manual webspam actions found."
If you do have a manual action against your site, you will see a specific message relevant to your action in this area of GWT. Manual actions can affect your entire site (site-wide matches) and/or specific sections and pages (partial matches). Below are some of the most common manual actions and what you’ll need to get them resolved.
- Unnatural links to your site. This means Google has detected unnatural, artificial, deceptive, or manipulative links pointing to your website, including paid links and link schemes. To resolve this action, you must have these types of links removed from your backlink profile. More often than not, this will involve contacting webmasters to remove those links. Tools like CognitiveSEO can help you identify your worst links so you can remove them faster.
- Hacked site. This means Google has detected that your website has been hacked or affected by malware. To resolve this action, you must have the hacked code and/or malware removed from your website. Services like Sucuri can help with this.
- Thin content. Google has detected pages on your website that contain low-quality content or that are shallow (affiliate pages, cookie-cutter sites, doorway pages, scraped/stolen content, etc.). To resolve, you must remove these types of pages from your website or give them better content.
- Pure spam. Google has detected pages that are in clear violation of Webmaster Guidelines through the use of automatically generated content, cloaking, or scraped content. Remove these pages from your website.
- User-generated spam. This is spam created by others on your website, including spammy content on forum threads, guestbook pages, and in user profiles. To resolve, find and remove these threads, pages, and profiles.
- Cloaking. This means Google has detected pages that show different content to Googlebot than to visitors. To resolve this action, serve the same content to search bots and users.
- Hidden text and keyword stuffing. Google has detected hidden text on pages meant for search optimization or an overuse of keywords on a page to help it rank for specific keywords. To resolve this action, you must remove hidden text and excessive keywords from any pages affected.
- Spammy freehosts. This means that Google has detected a significant amount of spam on sites that allow people to create their own Web content. The spammy content/websites must be removed, and measures must be implemented to prevent the abuse from happening again.
- Spammy structured markup. There is a violation of the use of rich snippets on your website, such as marking up content invisible to users or marking up irrelevant/misleading content. To resolve this action, you must ensure that your use of rich snippets is valid and remove any markup that is not. You can learn more about suggested use of rich snippets in Google Webmaster Tools Help.
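To make the hidden-text and keyword-stuffing checks above concrete, here is a minimal Python sketch that scans a page's HTML for those two red flags. Everything in it is illustrative: the inline-style patterns and the 5% keyword-density threshold are assumptions of this sketch, not Google's actual detection criteria.

```python
import re
from collections import Counter

# Inline styles commonly used to hide text from visitors (illustrative list).
HIDDEN_STYLE = re.compile(
    r'display\s*:\s*none|visibility\s*:\s*hidden|font-size\s*:\s*0', re.I)

def audit_page(html, keyword, density_threshold=0.05):
    """Flag two common manual-action triggers: hidden text and keyword stuffing."""
    flags = []
    # Hidden text: inline styles that make text invisible to visitors.
    if HIDDEN_STYLE.search(html):
        flags.append("hidden text")
    # Keyword stuffing: the target keyword makes up an outsized share of words.
    words = re.findall(r'[a-z]+', re.sub(r'<[^>]+>', ' ', html).lower())
    if words:
        density = Counter(words)[keyword.lower()] / len(words)
        if density > density_threshold:
            flags.append("keyword stuffing")
    return flags

page = '<p style="display:none">cheap rings cheap rings</p><p>Our cheap rings</p>'
print(audit_page(page, "rings"))  # both flags fire on this sample
```

A check like this only catches the crudest cases, but it is a useful first pass before manually reviewing the pages a manual action cites.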
When it comes to manual actions, one thing to keep in mind is that resolving a manual action does not mean your organic traffic and rankings will return. In fact, a survey from Search Engine Roundtable revealed that 53% of those who worked toward getting a manual action removed did not see an improvement in rankings, even after a year. Of course, that doesn't mean it's not a worthwhile effort, as the other 47% did see improvement, some within days and some within months.
Google Panda
The first major algorithm change that rocked the Web was Google Panda (first referred to as the "Farmer Update"). This update, released in 2011, was aimed at removing low-quality content and "thin sites" from search engine results. Well-known article marketing services were hit the hardest, ending the era of article marketing as a link- and traffic-building strategy. Some sites, such as HubPages, managed to rebound by changing their policies, as seen in this search engine traffic graphic from SEMrush:
Others, like EzineArticles, were not so lucky:
Google has since updated this algorithm over 25 times, with the most recent major update confirmed in May 2014.
Google Penguin
Google Penguin, first launched in 2012, was aimed at lowering the search rankings of websites that had violated Google Webmaster Guidelines through the use of black-hat SEO strategies such as link buying, link schemes, over-optimized keyword anchor text, and keyword stuffing.
This algorithm has only been updated five times since, with the most recent update in October 2013. These updates have had a significant impact on search queries and have left many businesses cleaning up the work of SEO agencies they hired in the past to help them rank in search.
Google EMD Update
In 2012, Google announced a change in the way it would handle exact-match domains (EMDs), or domains that are based on specific keyword phrases instead of brand names. In theory, if you owned two sites, say diamondengagementrings.com and jcjewelers.com, the former would be more likely to be affected by this update than the latter.
Google Hummingbird
Google Hummingbird, first announced in 2013, was less about penalizing websites and more about rewarding them. This algorithm update was aimed at determining user intent in search and delivering results based on that intent. Instead of delivering results based specifically on keyword phrases, Google would deliver results based on semantic analysis.
Search Engine Land gave the best example of the difference in pre- and post-Hummingbird update search results.
“What’s the closest place to buy the iPhone 5s to my home?” A traditional search engine might focus on finding matches for words — finding a page that says “buy” and “iPhone 5s,” for example.
Hummingbird should better focus on the meaning behind the words. It may better understand the actual location of your home, if you’ve shared that with Google. It might understand that “place” means you want a brick-and-mortar store. It might get that “iPhone 5s” is a particular type of electronic device carried by certain stores. Knowing all these meanings may help Google go beyond just finding pages with matching words.
This update forces content creators to think more about the intent of a user’s search query when creating content aimed at ranking for a specific keyword phrase.
Google Payday Loan Update
The Google Payday Loan update, first announced in 2013, was aimed at the industries using the most black-hat and spammy tactics to rank, including payday loan services, online pharmacies, casinos, and similar. Updates to this algorithm have targeted both spammy queries and spammy sites matching these queries.
Google Pigeon
If you have a local business, then Google Pigeon could have affected you. This update, first announced in 2014, modified how Google interprets location cues in order to return closer, more relevant local search results. For some local queries, the local seven-pack (the block of seven results local to the query) was removed altogether, costing traffic for the businesses previously included in it while boosting sites that rank organically for local keywords.
Google HTTPS/SSL Update
The latest update from Google, announced in August 2014, gives a slight ranking boost to sites served over HTTPS (SSL encryption).
Determining Which Google Update Affected You
Now for the part you’ve been waiting for.
How can you determine what Google update affected your website’s organic traffic?
The two simplest ways are as follows.
1. Use Google Analytics + the Google Algorithm Change History
Moz regularly updates the Google Algorithm Change History, a page that documents each major change Google makes to its search engine. You can use this guide in conjunction with your Google Analytics organic search traffic report (Acquisition > Keywords > Organic) to match major gains or losses to a specific update, though this can be difficult to pinpoint if your search traffic fluctuates naturally.
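The matching process described above can be sketched in a few lines of Python. This is an illustrative sketch only: the update dates are hypothetical stand-ins for entries you would copy from Moz's change history, the daily session counts stand in for a Google Analytics export, and the 30% drop-versus-prior-week threshold is an arbitrary assumption, not a standard.

```python
from datetime import date, timedelta

# Hypothetical update dates, copied by hand from Moz's change history.
UPDATES = {date(2013, 5, 22): "Penguin 2.0", date(2014, 5, 20): "Panda 4.0"}

def match_drops(daily_sessions, threshold=0.3, window=3):
    """Compare each day's organic sessions to the prior week's average and
    report drops that land within a few days of a known update."""
    matches = []
    days = sorted(daily_sessions)
    for i, day in enumerate(days[7:], start=7):
        baseline = sum(daily_sessions[d] for d in days[i - 7:i]) / 7
        if daily_sessions[day] < baseline * (1 - threshold):
            for update_day, name in UPDATES.items():
                if abs((day - update_day).days) <= window:
                    matches.append((day, name))
    return matches

# Fourteen days of steady traffic, then sessions halve right after Panda 4.0.
sessions = {date(2014, 5, 10) + timedelta(days=n): 100 for n in range(14)}
sessions[date(2014, 5, 21)] = 50
print(match_drops(sessions))  # [(datetime.date(2014, 5, 21), 'Panda 4.0')]
```

A spreadsheet can do the same job; the point is simply to line up each sharp change in traffic against the dates of known updates before drawing conclusions.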
2. Try the Google Penalty Checker
The Google Penalty Checker automates the process of matching up major algorithm changes to your Google Analytics data. This tool will allow you to see the link between a specific update and whether your website was affected positively (green) or negatively (red) by it in an easy-to-read graph.
This lets you see a history of how different updates have affected your website, depending on how much data you have in Google Analytics. You can also scroll below the graph to see the timeline of updates and get more details about each.
In addition to major updates, Penalty Checker will also let you know about the smaller ones, such as Google removing authorship photos, the takedown of guest blogging networks, and other Google changes that might affect your website and the way you do SEO.
We hope you’ve enjoyed this guide to Google and all of the ways its changes can impact your website. Has yours been affected? If so, what was the outcome? Please share in the discussion below!