Google Traffic & How To Kill It!

Today, in 2016/17, your Google traffic depends on many different factors. Each element needs to meet a minimum quality threshold, otherwise your site will not rank in the SERPs as it should.

It has never been easier to inadvertently damage your traffic & rankings, and there are more ways than ever to fall foul of Google's stringent algorithmic ranking process.


These are the most commonly found errors that demote sites in Google's results & kill organic traffic, so work through them one by one and make sure that your site isn't being penalised by one or more of these ranking killers:

1. Over Optimized Link Text

Getting to the top of Google’s listings before the launch of Panda, Penguin, Hummingbird and the Payday algorithms required lots of targeted back-links that used the exact phrases that you wanted to rank for.

Today, and going forward, exact match anchor text links are harmful to your rankings and hence your organic traffic.

Google believes that the only reason you would have large volumes of exact match anchor text is because you (or someone on your behalf) put them there. Ergo, you have used SEO to manipulate their rankings, and they really don't like anyone doing that these days!

Check your back-link keyword ratios using www.majesticseo.com or www.ahrefs.com. If you have large volumes of exact match links, you need to get them removed, edited to your company name, URL or generic text such as 'click for more' or 'read more', or you need to disavow them using Google's disavow tool.

There are different opinions about using the disavow tool, with some experts claiming it did little more than give webmasters the opportunity to flag the sites that they considered to be spam to Google.

Imagine Google receiving millions of disavow requests. What would you do with that information? If it were me, I would collate all the data and de-index any pages/websites that were listed 'x' number of times.

In my opinion, it was an easy way for Google to identify low quality content without having to do much work themselves.

The other option for repairing your back-link text ratios is to dilute the existing links with new, Google-friendly links. In time you will build enough quality links that the exact match links become a small enough percentage of the whole for your overall back-link profile to fall into line with Google's expectations and requirements.

It is usually the case that Google will drop you from the rankings for the keywords that are over-optimised with exact match text (EMT) links. If you check your rankings and traffic for the keywords used in your links, you will see a pattern where the less-used keywords still rank, albeit not very highly.
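
If you want a rough feel for your ratios before reaching for a paid tool, the short sketch below totals up each anchor phrase as a percentage of your whole profile. It assumes a CSV export (from Majestic, Ahrefs or similar) with an 'anchor_text' column -- the file name and column name are assumptions, so adjust them to match your export.

```python
import csv
from collections import Counter

def anchor_text_ratios(csv_path, anchor_column="anchor_text"):
    counts = Counter()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            anchor = (row.get(anchor_column) or "").strip().lower()
            counts[anchor or "(empty)"] += 1
    if not counts:
        return []
    total = sum(counts.values())
    # Each anchor phrase as a percentage of the whole back-link profile
    return [(a, n, round(100.0 * n / total, 1)) for a, n in counts.most_common()]

if __name__ == "__main__":
    # "backlinks.csv" is a placeholder for your exported back-link file
    for anchor, n, pct in anchor_text_ratios("backlinks.csv")[:20]:
        print(f"{pct:5.1f}%  ({n})  {anchor}")
```

Any exact match phrase that dominates the list is a candidate for removal, editing or disavowal.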

2. Over Optimized Content Keywords

Overuse of keywords within your content is quite common, and while it is related to your back-link density, repeating the same words across your pages is a sure-fire way to guarantee that you don't rank for those terms.

Google introduced latent semantic indexing into their algorithms during 2012 and has since been able to relate similar words and associated terms.

It is beneficial to use a large variety of terms within your content, in a natural way, without forcing keywords where they don’t belong.

When you are writing content, don't think about where you can add a keyword; instead, consider different variations of it.

In the majority of sites we analyse, removing keywords leads to ranking & organic traffic improvements rather than the other way round.
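
As a quick sanity check, something like the sketch below will show you which words dominate a page's copy. The 'page.txt' file, the stop word list and the 3% flag are all illustrative assumptions rather than Google's rules -- use it only to spot obvious repetition.

```python
import re
from collections import Counter

STOP_WORDS = {"the", "and", "a", "an", "to", "of", "in", "for", "is", "on", "that", "with", "your", "you"}

def keyword_density(text, top_n=10):
    # Very simple tokenisation: lowercase words, common stop words removed
    words = [w for w in re.findall(r"[a-z']+", text.lower()) if w not in STOP_WORDS]
    total = len(words) or 1
    return [(w, n, round(100.0 * n / total, 1)) for w, n in Counter(words).most_common(top_n)]

page_copy = open("page.txt", encoding="utf-8").read()   # plain-text copy of the page (placeholder file)
for word, count, pct in keyword_density(page_copy):
    flag = "  <-- consider varying this term" if pct > 3 else ""   # 3% is an illustrative threshold
    print(f"{pct:4.1f}%  {word} ({count}){flag}")
```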

3. Bad Link Neighbourhoods

Google reviews all of the sites that link to you, and if you have a lot of links from websites that Google has highlighted as being ‘bad’ then you will drop in the rankings.

There are far more low quality linking opportunities than high quality ones, and it is only natural for most sites to have a back-link profile that includes many low value links. There is a difference, however, between lower quality links and bad links.

As a rule of thumb, you want your links to come from pages that are in Google's index. Search for the URL in Google, and if there are no results then you should assume that the site isn't of good enough quality for Google to index.

If you are creating a new page of content on a site (which will then link back to you) then it won’t be indexed yet. In this instance, search for the home page of the domain (or other pages that have been published for a while) and check that they are being indexed.

Google frequently reviews the indexing of sites, so a page that is indexed today might be de-indexed tomorrow. You need to stay abreast of your back-links and check that they are still indexed; Scrapebox makes this easy by reporting on the status of all your back-links.

In addition, you can use http://www.bad-neighborhood.com/text-link-tool.htm to quickly scan your site for potential bad links. While this tool is not infallible, it does offer a quick overview of the general quality of your linking profile and highlights links that may be harming your SEO efforts.
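
If you want to pre-filter a long list of linking pages before checking them manually, a rough proxy check like the one below can help: a page that is unreachable or carries a 'noindex' directive certainly won't be in the index. It is no substitute for the manual search or a Scrapebox run described above; the URLs are placeholders, and it assumes the third-party 'requests' package is installed.

```python
import requests  # third-party package, assumed installed

def quick_health_check(url):
    try:
        resp = requests.get(url, timeout=10, headers={"User-Agent": "link-audit-sketch"})
    except requests.RequestException as exc:
        return f"unreachable ({exc.__class__.__name__})"
    if resp.status_code != 200:
        return f"HTTP {resp.status_code}"
    if "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
        return "noindex set in X-Robots-Tag header"
    if 'name="robots"' in resp.text.lower() and "noindex" in resp.text.lower():
        return "possible noindex meta tag -- check the page manually"
    return "reachable, no obvious index blocker"

# Placeholder URLs -- feed this loop from your back-link export instead
for link in ["http://example.com/page-linking-to-you", "http://example.org/another-page"]:
    print(link, "->", quick_health_check(link))
```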

4. Links on Non-Google-Friendly Pages

A regular review of your back-links to see if they are still being indexed by Google will highlight possible ‘bad’ links, or sites that Google has doubts or issues with.

Google uses trust metrics to evaluate linking sites. If the site containing your link gives Google any reason for concern, it will either de-index it, give it a manual penalty or mark it as a spam source... either way, it will hit your traffic volumes.

A single bad link will have more effect on your rankings than 100 good quality links, so there is more to be gained by managing potential bad links than there is from building new links.

5. Orphan links

When you create and publish a new page of content, it starts out as an 'orphan' page: a page that links out to other sites but doesn't have any inbound links itself.

Google uses this metric to highlight pages that have been built purely to link to other content. If you are going to derive any benefit from this new page then it will require some inbound links of its own.

Once a new page is published, it takes around 3 months before it begins to carry any weight for linking purposes. During that time, creating a few inbound links will ensure that the page gives you a linking benefit.

6. No contextual internal linking

Google likes to see content that contains ‘contextual’ links to related content, both within your site, as well as on other sites.

Don't be afraid of linking out to 'authority' resources where necessary. Google would rather see your site link to a high quality resource than repeat existing content on your own site.

Wikipedia is a good example of a quality resource that uses contextual linking to good effect. Even though some of the information in Wikipedia might be wrong, it is still considered to be a high quality content site and your content needs to mirror their linking strategy where it’s suitable to do so.
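
A quick way to see whether a page is linking out contextually at all is to count its internal and external links. The sketch below uses only the Python standard library; the URL is a placeholder, and the internal/external split is simply 'same host or not'.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    """Collects the href of every <a> tag on the page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

page_url = "http://example.com/some-article"   # placeholder URL
html = urlopen(page_url, timeout=10).read().decode("utf-8", errors="ignore")

collector = LinkCollector()
collector.feed(html)

own_host = urlparse(page_url).netloc
internal = [l for l in collector.links if urlparse(l).netloc in ("", own_host)]
external = [l for l in collector.links if urlparse(l).netloc not in ("", own_host)]
print(f"{len(internal)} internal links, {len(external)} external links")
```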

7. Duplicate, weak or low quality content

When Google first released the Panda algorithm in 2011, it began the process of filtering duplicate, low quality and spun content from its index.

The days of being able to scrape content from other sites have gone. Instead, you need unique, well written, high quality content that your visitors engage with if you are going to see any benefit from Google.
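
If you suspect two pages (yours or a scraped copy) are too similar, a simple word-shingle comparison gives a rough overlap score. This is only an illustration of the idea behind duplicate detection, not Google's actual method, and the file names and 50% threshold are assumptions.

```python
import re

def shingles(text, size=5):
    """Break the copy into overlapping runs of `size` words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

def jaccard(a, b):
    # Jaccard similarity: shared shingles divided by total distinct shingles
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Placeholder files -- plain-text copies of the two pages you want to compare
original = open("your-page.txt", encoding="utf-8").read()
candidate = open("other-page.txt", encoding="utf-8").read()

score = jaccard(original, candidate)
print(f"Shingle overlap: {score:.0%}")
if score > 0.5:   # illustrative threshold, not a Google figure
    print("Large overlap -- rewrite or consolidate this content.")
```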

Google tracks how your visitors interact with your content and can easily determine if real people don’t like what they are reading.

In an ideal world, you need your visitors to arrive on your page, stay for a while, reading your content and then click through to a second page to read more information.

To Google that is a successful page, and they consider that they have delivered a good result for the original search term the visitor used to find you.

If many of the visitors Google sends you don't engage with your content, but instead hit the back button and search for another site, then Google (possibly rightly) concludes that your content isn't a good result for that search term, and guess what? You will drop in the rankings for that term.

It is all about providing content that real people will want to consume, so that in turn your site sends healthy metrics back to Google and you climb in the SERPs.

8. Slow loading pages

One of the easiest ways to be penalised in Google's SERPs is to have slow loading pages. Google measures average load times for all websites, and if yours falls outside of the fastest 20% then you will drop in the listings.

Google wants to encourage a fast internet and this is the metric that they use to make sure that slow sites don’t appear.

YSlow, a free Firefox plugin, can be used to measure the load speed of your pages. It provides a score out of 100; aim for a score in the high 80s or better to be safe in the longer term and avoid future speed-related penalties.

YSlow will list the issues that are slowing your site down and offer suggestions to improve your load speed.

The main culprits are usually image sizes and the number of requests needed to load your content. Combining multiple CSS stylesheets into one, for example, will reduce the number of HTTP requests, as one larger file loads much faster than many smaller ones.

The other easy win for improving your load speed is to GZip your site, which compresses as much data as possible so that it can be delivered faster.
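
You can quickly confirm whether compression is actually switched on by looking at the Content-Encoding header your server returns. The sketch below (placeholder URL, and it assumes the third-party 'requests' package is installed) checks that and times the response; a pass here doesn't guarantee a good YSlow score, it just confirms GZip is working.

```python
import time
import requests  # third-party package, assumed installed

url = "http://www.example.com/"   # placeholder URL

start = time.time()
resp = requests.get(url, headers={"Accept-Encoding": "gzip, deflate"}, timeout=15)
elapsed = time.time() - start

encoding = resp.headers.get("Content-Encoding", "none")
print(f"Status: {resp.status_code}")
print(f"Content-Encoding: {encoding}")          # expect 'gzip' if compression is on
print(f"Downloaded {len(resp.content)} bytes in {elapsed:.2f}s")
if "gzip" not in encoding:
    print("Response was not gzip-compressed -- enable compression on the server.")
```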

9. High Bounce Rate

Your bounce rate is one of the main visitor engagement metrics that Google uses to determine the usefulness of your content.

If visitors arriving from search don't stick around to enjoy your content, they will 'bounce' back to Google to search for another result.

The bounce rate needed does vary from keyword to keyword, and depends on the bounce rate of your competitors.

As a rule of thumb, work to improve any page with a bounce rate higher than 55%. In many instances, you can still rank well with a bounce rate of 80%+, but that is because your competitors also have high bounce rates.
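
If you export your session data, working out the bounce rate per landing page is straightforward: sessions that viewed only one page divided by total sessions. The sketch below flags anything above the 55% rule of thumb; the 'sessions.csv' layout is an assumption, and in practice you would pull these figures straight from your analytics tool.

```python
import csv
from collections import defaultdict

# page -> [bounced sessions, total sessions]
stats = defaultdict(lambda: [0, 0])

# Columns assumed: landing_page, pages_viewed (one row per session)
with open("sessions.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        bounced = int(row["pages_viewed"]) <= 1
        stats[row["landing_page"]][0] += 1 if bounced else 0
        stats[row["landing_page"]][1] += 1

for page, (bounces, total) in sorted(stats.items()):
    rate = 100.0 * bounces / total
    flag = "  <-- work on this page" if rate > 55 else ""   # the rule of thumb above
    print(f"{rate:5.1f}%  {page}{flag}")
```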

If you want to improve your site traffic in the longer term, work on your content and visitor experience so that more of your visitors stay on your pages; Google will then value your content more highly and you will receive more traffic as a result.

10. No Social Interaction

Google introduced the ‘Hummingbird’ algorithm to track social media activity. This is an important metric for Google as it demonstrates to them that real people are prepared to share your content on a regular basis.

If real users share your content with their friends then Google assumes it is good quality content & you will be rewarded with an organic traffic boost.

A lack of content sharing conversely indicates that your content isn’t very good.

To make your content more shareable, add share buttons so that it is easy for your visitors to share each page. You need buttons for all the major social platforms: Facebook, Twitter, Google+, LinkedIn, Pinterest, etc., so that your visitors can share regardless of their preferred social network.

Sharing across social media has become a significant indicator of content quality, is being given more and more weight, and will only become more important with each algorithm update.

11. Unique IP address links

When you create new back-links, Google looks at the IP addresses they originate from. Because link networks often host many sites on the same Class C IP range, links that share a Class C block have become an indicator of lower quality.

For this reason, there is a significant advantage to be gained by building links on different servers.

Shared hosting makes it easy for Google to identify and associate related websites.

If Google has any reason to doubt the origin of your back-link profile then they will err on the side of caution and penalise you rather than reward you.
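
You can get a rough picture of how clustered your back-links are by resolving each linking domain and grouping them by Class C (/24) block, i.e. the first three octets of the IP address. The domain list below is a placeholder -- feed it from your back-link export.

```python
import socket
from collections import defaultdict

# Placeholder domains -- replace with the domains that actually link to you
linking_domains = ["example.com", "example.org", "example.net"]

by_class_c = defaultdict(list)
for domain in linking_domains:
    try:
        ip = socket.gethostbyname(domain)
    except socket.gaierror:
        continue                                # domain no longer resolves
    class_c = ".".join(ip.split(".")[:3])       # first three octets = Class C block
    by_class_c[class_c].append(domain)

for block, domains in sorted(by_class_c.items()):
    marker = "  <-- multiple linking domains on one subnet" if len(domains) > 1 else ""
    print(f"{block}.x  ({len(domains)} domain(s)){marker}")
```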

12. Using Link Networks

Many link networks have been de-indexed in recent months, including several guest blogging networks. The use of organised networks has become a hazardous practice and they should be avoided at all costs.

Some networks are still working, as yet undetected, but Google's manual spam team is actively searching for them. When one is found, they not only remove it and its influence, they usually penalise the sites that were using it.

Getting Google penalties removed is a lengthy process and unravelling the back-links is time consuming and costly.

It is far better to build quality links than to seek a shortcut that is unlikely to provide any benefit beyond the short term.

Additional Reading

What is SEO?