There are several important things that you need to understand about SEO.
1/ There is nothing in SEO that you can’t do yourself if you apply enough time, analysis and research.
2/ No tutorial will provide a comprehensive SEO road map to get you to the top of SERPs.
The reason for this is that every book and white paper written about SEO is out of date before it’s published.
Dedicated SEO agencies like us carry out hundreds of ongoing tests to maintain our edge, deploying resources that most website owners don’t have access to. It is that accumulated knowledge that gives our clients top rankings for the most competitive terms.
These SEO Tutorials cover the main areas of SEO and how to set your website up to succeed online and in Google search.
It is by no means a guarantee that deploying every recommendation will put your site in the No. 1 spot for all your keywords, but it will move you up the rankings significantly.
We aim to future proof our SEO as much as possible, but that said, we live at Google’s whims and fancies. If Google decide to turn their ranking algorithm on its head tomorrow, there is little that you can do today to guard against it.
At the root of good long term SEO is a basic principle: Google wants to rank high quality websites that publish useful content and are highly valued by their visitors.
Good SEO will enable you to rank well in search & to also future proof your SEO efforts (as much as is possible to do) against any changes Google may make to their algorithm in the months and years to come.
Technical Site Structure – Device & Speed Optimisation
Google search engine history and best practice
SEO that will attract a Google ranking penalty
Firstly, watch this five minute SEO tutorial video to get an overview of what we are about to do to your website:
Our free search engine ranking guide will walk you through each on-page element to give you a successful, high ranking, search engine friendly website.
Below you will find step by step SEO tutorials, which are easy to follow.
Follow one page at a time, implementing the elements for each to make everything on your site as SEO friendly & optimized as possible.
To optimize your site correctly, you need to understand a little about how search engines choose the best sites to rank.
SEO practices can be categorized into three main areas: ‘white hat’, ‘grey hat’ and ‘black hat’ search engine optimisation (not counting negative SEO).
True ‘white hat’ SEO involves sticking to Google’s best practice and following their recommendations verbatim.
The downside to this strategy is that their overall philosophy is that you should create high quality content and rely on other webmasters to find your content, consider it valuable, share it and link to it & ultimately to promote it for you.
If your content is truly exceptional then you will be rewarded for its quality and consequently move up the search rankings.
This will in turn share your content with more visitors who will link to your content, as well as sharing it socially via Facebook likes, tweets, and shares, recommending it to their visitors.
All well and good in theory…
Unless you manage to create exceptional content with ‘viral’ properties, you may be waiting an awfully long time for those important links to be created for you organically in sufficient volumes to assist your SERPs.
The principles of ‘white hat‘ are sound and in an ideal world, where content is regularly shared by others, it can work well for you.
However, in the real world, unless you are already a household brand, or have a lot of web ‘authority‘ online, this can be a very slow and ultimately unprofitable process.
This makes it necessary to give your website a helping hand and there are several approaches to doing this.
We teach White Hat SEO, with a helping hand.
On a pure technicality, any proactive practice that you carry out in order to affect your rankings can be classified as ‘Grey hat‘.
If you create a link pointing to your site then you are using a ‘grey hat’ practice.
White hat would involve waiting for a third party to link to your content naturally. That is all well and good as a concept, but for the majority of website owners it means a long wait.
We think of grey hat as proactive rather than passive.
Google uses an algorithmic set of metrics to rank websites for each search term.
In practical terms, grey hat SEO is simply the process of proactively optimizing a website.
If you build a link in order to attempt to affect the ranking of your website, congratulations, you are using grey hat processes!
Each and every element of your on-page and off-page SEO campaign carries a weighting within each search engine’s algorithm.
Your task is to make each of those elements as relevant as possible to the search query that you want to rank for.
Grey hat involves the manipulation of these algorithms, but without using any underhand practices that might work in the short term but have no long term future.
By using grey hat techniques, all you are doing is accelerating the organic process of building your content’s popularity.
In recent years, Google have become much more adept at detecting ‘black hat‘ activities and penalising them.
‘Black hat’ used to be prolific online, with many tactics such as invisible text, cloaking and hidden redirects being used to achieve top search engine rankings.
Most of these ‘sledgehammer’ techniques are now so easy to detect that they no longer work.
Google employs some of the smartest programmers on the planet, in all probability one of the most densely concentrated collections of intelligence ever assembled.
The idea that you are going to pull the wool over their eyes and beat their algorithm in the long term is optimistic to say the least!
Stealing rankings that your content doesn’t deserve is a massive ask and we can confidently state that long term, Black hat SEO will not work for you.
If you want to rank well and receive your share of the search engine traffic available to you through top SERPs then you need to forget trying to cheat the system.
Instead, work to deliver on their core ideals, enhancing every element of your site along the way to meet Google’s expectations.
Firstly, and this is important, avoid using any blatant black hat techniques on your site.
Google has seen most things and even if you escape detection today, they are very efficient at finding and removing low quality, spammed websites from their listings.
Once your URL is labeled as a spam site, it will be all but dead as far as Google search traffic is concerned.
Unless you are building disposable ‘Churn & Burn’ sites, to hit a niche for a short time, forget about using any black hat strategies.
If you want to build a website that will work for you in the future, then forget about black hat as a viable strategy.
If, on the other hand, you opt for pure, whiter than white compliance, then you will in all likelihood be waiting a long time to get the SERP results you want.
Following Google best practice to the letter is all well and good in theory.
But it won’t allow you to compete against all the other sites in your niche that are using proactive grey hat practices to improve their rankings.
By definition, grey hat is not what they want you to be doing, but, it is necessary if you want your website to move up their listings.
Think of grey hat as giving your white hat strategy a bit of a boost, without implementing any out and out black hat SEO.
Over time, your high quality site will naturally start to attract inbound links.
If you create new, high quality links pointing to your content pages, built as search engine friendly links within a natural framework, then you are implementing what is called grey hat optimization. It is not a black hat process.
Spamming hundreds of links through a link network, and hiding content in invisible text areas are two examples of black hat activity.
These practices should be avoided at all costs.
The objective of applying search engine friendly optimization to your site is to boost and accelerate what is eventually going to happen naturally: a helping hand, if you will.
Be aware that there are still automated linking services advertising and selling their services online.
It might seem like a good idea to take a shortcut, but you are playing with fire.
While Google may not have identified a particular link network so far, that doesn’t mean that they won’t tomorrow, next week or next year.
When they do, they will penalise all the sites that are attempting to manipulate their rankings, page rank and SERP results.
If you can find a link network and hand your money over, then Google can easily find it too.
Since the introduction of the Penguin algorithm, they have declared that they will eradicate automated manipulation of their results.
Our advice is to not get involved in automated linking services.
There are plenty of ways of building high quality links.
These will have a long term positive effect on your site without risking your site’s future on someone else’s low quality linking service.
There are many different definitions of SEO (an abbreviation for search engine optimization, or search engine optimisation in the UK).
We prefer to think of SEO as the process of creating unique high quality content, presented as effectively as possible to build the online authority for the host site.
Off Site Popularity
Your content quality will make or break your site’s success.
To begin with, everything written on your site needs to be ‘original’ content.
Not just original in terms of other content on your site, but unique to your site.
The days of copy and paste are long gone, only high quality, well written, unique content will have any chance of ranking well.
It is better to have one awesome page of content covering a topic in great detail than to have ten pages repeatedly covering a similar topic more generally.
Google, Bing and Yahoo are getting much better at being able to tell if a page has been built just to rank well and steal traffic, or if it was built to give valuable content to others.
Search engines want to share high quality information with the world.
You therefore need to create a valuable resource about a topic & then you will have a head start in SERPs (search engine results pages).
The ‘Authority’ of your site is vital to your long term success.
You should aim to make your site the default online resource for your niche or topic.
This is a long term goal because you can’t create authority overnight.
Instead you need to work consistently to build the reputation, credibility and power of your site.
Authority is measured by a variety of metrics, many of which are social.
The easiest way for you to gauge how authoritative your site is considered to be is to track your Trust Flow (Majestic), or your DA (Domain Authority) & PA (Page Authority) (Moz).
These tools show trust scores based on the trust and authority of the sites that link to you.
The more authoritative the sites are that link to you, the more page trust & authority you receive.
The biggest and best niche related sites tend to have the most authority.
If they are prepared to link to you as a valuable resource then your own authority will rise.
The amount of ‘link juice’ that is passed from an authoritative page through to your page depends on several factors.
Assuming that the link to you from the authority page is a ‘DO FOLLOW’ link rather than a ‘NO FOLLOW’ link, you will receive a percentage of their page rank link juice.
The percentage you receive depends on how many sites they link to from that page and the Trust Score of the page.
For example, if you are linked to from a 50/100 (Computers / Internet / Searching Category page) and you are one of the first 100 links on that page, then you receive about 60% of that score (30/100) from that page.
Following on from this, getting links on high trust score, theme relevant pages with only a few outbound links is therefore much more beneficial to you than non-topic related links from pages with little or no trust score.
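To make the arithmetic above concrete, here is a rough, purely illustrative model of trust pass-through. The 60% ratio and the 100-link cutoff come from the example above; the formula itself (splitting the passed trust evenly among followed links) is our assumption, not Google’s published algorithm:

```python
# Illustrative model only: a sketch of how a page's trust might dilute
# across its outbound links. The 0.6 pass ratio and 100-link cutoff are
# the figures used in this guide, not documented Google values.

def passed_trust(page_trust, outbound_links, pass_ratio=0.6, max_followed=100):
    """Estimate the trust passed to each followed link on a page."""
    if outbound_links == 0:
        return 0.0
    # Links beyond ~100 are assumed not to be followed at all.
    followed = min(outbound_links, max_followed)
    return (page_trust * pass_ratio) / followed

# The example above: a 50/100 page passing ~60% of its score.
print(passed_trust(50, 1))    # a single link receives the full 30.0
print(passed_trust(50, 10))   # ten links share it: 3.0 each
```

The model makes the guide’s point visible: the same authority page is worth far less to you when its trust is shared across many outbound links.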
Most major search engines only follow the first 100 links on a page.
Not only are pages with lots of links far less valuable for building authority, but many links won’t even be followed by search engine spiders.
Where your link is positioned on the authority page is also important.
The further down a page your link appears, the less authority it will have.
Google use metrics such as these to make sure that high trust score directory pages don’t have the capability to pass much page rank.
This particular metric also ensures that blog comments pass little trust because they are always below the main body of content.
Footer and sidebar links have less value than contextual links from within the main body of text on a page.
Comments on high trust score blogs are positioned below the main content, so they are valued as less powerful than a link higher up the page.
Where possible, to get the most link juice from a link you should try to get a contextual link in the page content rather than in the comments section.
There are more linking tips and suggestions in the linking section below.
The Trust Scores of the pages that link to you give your site added authority and value.
A high trust score will help you to rank higher.
Trust Score is an indicator of the ‘link juice‘ being passed to your site from other sites.
Increasingly, your online reputation is being measured and assessed by the engagement metrics calculated using data collected from social media.
The volume of likes, shares, tweets, retweets, comments and interaction with your content provides search engines with user engagement data.
This activity gives a very precise indication of the value and worth of your content and how your visitors interact with it.
Search engines measure how your visitors interact with your site content.
This information is used to determine the relevance and suitability of your content for the search term you ranked for.
If a visitor sent via a Google search stays on your page and spends time reading your content then the engine will conclude that the content is a good resource for that particular search term.
As a result, you will rise in the rankings.
Alternatively, if Google sends visitors to your page but many of them click the back button and choose a different result, it will be assumed that your content is not applicable to that search, and you will drop in the results.
This is called your ‘bounce rate’ and is a science in itself. Find out more about how to improve your bounce rate.
In addition to your bounce rate, they also measure the social engagement between your content and your visitors.
You need your visitors to share your content, like it, tweet it, comment on it, re-tweet it, pin it, etc… or at least, you need them to do that to your content more than they do for your competitors.
Given two sites that are fairly equal in terms of content, link structure, bounce rate, load speed etc, search engines will rank the socially popular site higher every time.
Keywords used to be the driving force behind success online.
Google is determined to reduce the reliance on keywords.
The introduction of semantic search & latent semantic indexing is their first significant step in that direction.
Everyday, millions of people type billions of different search terms into search engines looking for goods and services.
In essence, search engines perform an aggregation role, collecting all the people looking for similar things in one place.
They then present what they consider to be the most relevant websites and businesses to them.
If you target the search terms that searchers are actively looking for then you can position yourself in front of streams of targeted visitors.
These visitors want to see your content, products or services.
However, it is important to point out that keywords are yesterday’s search technology.
As far as Google is concerned, keywords are dead and semantic search is where the web is heading.
More than 50% of searchers now use mobile devices.
Voice recognition such as Apple’s Siri is now capable of providing hands free computer control.
Google have identified the requirement for an advanced ‘answers’ engine, rather than a traditional keyword based search engine.
Sat in front of a PC, you might type “Eiffel Tower” if you are looking for information, history, height, opening times etc.
Today, more and more people are talking to Google, asking questions such as “How tall is the Eiffel Tower?”.
There are billions of possible questions you might ask.
Because of this, Google has developed the ‘knowledge graph‘ to move on from identifying keywords, to identifying the meaning and intent of content.
They then display related answers as appropriate.
Knowledge Graph results appear in #1 position in search results in a box.
To achieve the rankings you want, and to receive the traffic volumes that you need, you need to understand what they are looking for so that you can deliver it.
This is the ultimate key to top rankings.
Back-links are still important for rankings, but whereas they used to be responsible for 80%+ of a site’s rankings, today they are just one essential part of the ranking recipe.
Importantly, it is better to have too few links than badly built links.
Links pointing at your website pages are like votes for your site.
Search engines measure those links depending on the authority of the site that links to you, as well as the construction, position and structure of the link.
Not all links are equal, in fact they vary greatly.
A badly constructed link on an authority website can actually harm your website rankings.
It’s very important to build links with foresight and planning if you want them to promote rather than hinder your site’s performance.
The key to success is to persuade search engines that your content is the most relevant for a given search query.
It is important to understand what a search engine is trying to deliver if you are going to provide content that they like.
All that search engines are trying to do is provide the best set of relevant sites for the search term that is used.
The sites in the top ten are deemed the most suitable, and are the ones that visitors get the best user experience from.
Make your content better than the existing top ten sites & you will outrank them.
Your visitors need to love your content: they need to interact with it, read it, and follow links to other pages of your content.
Your content needs to be ‘sticky’ for your visitors so that they don’t need to look for other sites to find what they want.
If you achieve that then (in conjunction with the support of some quality links) your pages will rank at the top of SERPs.
The Panda and Penguin algorithm updates changed the requirements to rank well in SERPs drastically.
Many websites that were previously optimized to rank well within Google’s old algorithm suffered big ranking drops because of the way that Panda and Penguin filter results.
The Panda algorithm is focused on website content and specifically ‘over optimization’ of your web content.
It is quite straightforward to repair on-page ‘over optimisation’ by reducing the keyword density of all the content on your site.
Remove duplicate content to improve the quality of your content.
You will also need to make sure that your site is less focused on search engines and is instead written for the benefit of your visitors.
In April 2012 Google introduced its Penguin algorithm.
Penguin looks at your off-page linking structure and when it launched, caused many sites that had escaped their previous Panda algorithm release to drop in the rankings.
Google’s Penguin Algorithm has two major elements that are responsible for the majority of ranking issues in Google.
Firstly, the link text density limits were reduced overnight.
Previously, you used to be able to have up to 80% of your back-links using exact match link text.
With Penguin, that limit is nearer 1%.
Googlebot expects a site to have much more varied link text in order to satisfy the ‘natural’ linking test.
Think of it this way, if 100 people found your site and chose to link to it without any guidance from you, it is quite likely that they would each choose a different link text.
A percentage of them might choose your URL as the link text.
Others might link to you using a combination of different related link terms, while others will use ‘read more’, ‘click here’, ‘read full article’ etc.
If you wanted to rank for “Red Widgets”, then your old link profile to get you to the top might have looked like this:
70% = “Red Widgets”
20% = “Buy Cheap Red Widgets”
5% = “www.yoururl.com” and variations.
Up to 5% = “image links, junk links etc”
Since Penguin, a natural looking link profile needs to look more like this:
70% = Brand Links, “Your Company”, “Your Company Limited” in different variations
20% = “www.yoururl.com” and variations
1-5% = Junk links “click here”, “read more”, “read the full article” in different variations
1-5% = Longtail & Brand + Keyword links
1% = Exact match anchors
Your remaining links should be a wide variety of keyword related search terms and phrases.
You only really want to have a few exact match terms as your link text.
Your other links should be longer tail terms which include variations of your target term.
For example, “cheap red widgets”, “best red widgets”, “red widgets supplier”, “online coloured widgets” etc.
Using this SEO strategy, you will end up with a wide variety of longtail links.
The majority of these links will contain a variation of your primary keywords.
When Google looks at links structured in this way, it will see the huge variety of different link texts that it considers to be ‘natural’.
At the same time, they will see a lot of themed variations of your primary keywords, making this search term or phrase the most significant term related to your site.
If hundreds of sites all link to you using related but different phrases, then Google can safely conclude that your content is all about that phrase.
Assuming you have quality content, Google will rank you accordingly.
Think of it like this; rather than blatantly plugging your primary keywords with exact match anchor text in your links, hide the keywords you want to rank for within longer terms and phrases.
This will give you the variety of links required to look totally organic.
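As an illustration of auditing your own profile against the percentages above, the sketch below buckets a list of anchor texts into those categories. The classification rules, brand name, URL and keyword are hypothetical placeholders, not a real tool’s logic:

```python
# Illustrative sketch: bucket backlink anchor texts into the categories
# described above (brand, URL, exact match, longtail, junk) and report
# each bucket as a percentage. All matching rules here are assumptions.

from collections import Counter

def classify_anchor(anchor, brand="Your Company", url="yoururl.com",
                    exact="red widgets"):
    a = anchor.lower()
    if brand.lower() in a:
        return "brand"
    if url.lower() in a:
        return "url"
    if a == exact:
        return "exact match"
    if exact.split()[-1] in a:  # e.g. "widgets" inside a longer phrase
        return "longtail"
    return "junk"               # "click here", "read more", etc.

def anchor_distribution(anchors, **kwargs):
    counts = Counter(classify_anchor(a, **kwargs) for a in anchors)
    total = len(anchors)
    return {cat: round(100 * n / total, 1) for cat, n in counts.items()}

anchors = ["Your Company", "Your Company Limited", "www.yoururl.com",
           "red widgets", "cheap red widgets", "click here"]
print(anchor_distribution(anchors))
```

Run over a real backlink export, a report like this makes it obvious when exact match anchors are creeping past the low single-digit percentages recommended above.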
As with all things, search engines are evolving with each passing year.
Where Google once used to only measure positive ranking factors, its algorithm now has a series of trapdoors through which your website can easily fall.
In 2011 Google decided that enough was enough and that it had to confront the growing volume of low quality websites that were infiltrating its results.
It wasn’t possible to do that without introducing a range of negative ranking factors.
Life used to be so easy.
Links promote websites and the top ranked site has 3000, so I will build 3001 and outrank them.
Today, those links are still important, but they need to tick a lot more boxes in order to benefit your rankings.
It is in fact easier to build a link that harms your rankings than it is to build a good one!
Google developed a sequence of spam footprints which allow them to identify low quality link networks.
These and other undesirable sites & neighbourhoods can then be excluded from its index.
As an example, many lower grade link networks used to be populated by badly written, duplicate content (because it requires minimal effort to publish).
Once upon a time this content worked well as a linking tool and it was cheap, fast and effective.
Google now use duplicate content as a key indicator of low quality content.
They would rather you link to the original resource instead of publishing the same content yourself.
As a direct consequence of this particular spam footprint, Google were able to de-index millions of article directory sites.
This removed the link benefit they were providing to millions of websites.
In the process, they also penalised millions of blogs and websites worldwide that contained duplicated/syndicated content.
The only instance of duplicate content not harming rankings can be seen on Press Release sites.
Press releases syndicate the same content but in the main are given an exemption by Google and are not penalised.
The philosophy used to be;
“Do you have x on your site?”, “Do you have more of x than your competitors?”, “The site with the most of x must be the best, we will rank it first”.
Today the philosophy is somewhat different;
“Do you have x on your site?”, “Does the value of x on your site fall within our expectations of what is considered ‘natural’?”, “Your x value isn’t what we would expect, so we will penalise you”.
Many websites looking to improve their rankings are actually carrying one or more penalties that are holding them back in SERPs.
(That is why a full site audit is so important).
| Offence | Penalty |
| --- | --- |
| Slow page load speed | Removal from SERPs |
| Excessive on-page keyword density | SERPs dropped for affected terms |
| Excessive off-page link text density | Dropped, potentially site wide |
| Links from networks & bad neighbourhoods | Removal from SERPs |
| Duplicate content | Removed from SERPs |
| Black hat practices (cloaking, hidden text etc) | Dropped from SERPs |
| Linking out to non-trusted sites | Removal from SERPs |
| Buying links (PR) | Removed from SERPs |
| High bounce rate | Dropped from SERPs |
| Multiple websites covering same topic/niche | Removal from SERPs |
| Excessive coding errors, 404s, etc | Removed from SERPs |
In short, if your site falls foul of any one of the ‘offences’ listed above, you will either rank badly, or not rank at all.
It doesn’t make any difference if your site is the best in all other regards, it just takes one of these issues to decimate your rankings and traffic.
Each on-page element adds or detracts from your overall ranking.
Your site may score well for some metrics, while being held back in SERPs because of other weaknesses in your optimization.
Let’s assume that your site doesn’t have any manual ranking penalties imposed on it by Google (check in Google Webmaster Tools).
The only reason that you are not the No. 1 ranked site for your preferred search terms is that your site fails to meet one or more on-page or off-page metrics.
Think about it like this;
The algorithm analyses one thousand different measurable elements for every website that it can find related to a specific phrase.
For each of those one thousand elements, there is an ideal level that will get you the No. 1 spot in its results.
Over optimize an element and you trigger a ranking drop/exclusion.
Under optimize an element and your page won’t have enough significance for the term.
The difficult part is of course knowing which elements they measure and what the ideal requirements are.
Search engines don’t publish their algorithms, so we all have to deduce what is important and what doesn’t matter.
Ten years ago for example, the meta keyword tag was commonly used to indicate page content.
Then, because it was easily spammed by the black hat forces of evil, the major search engines removed it from their algorithm as a ranking factor.
Today it is only used as a spam signal.
We know this because if you take a high ranking page that has an empty keyword tag, and cram a whole bunch of keywords into the keyword metatag, the ranking of that page will plummet.
As optimization consultants, we spend much of our time testing and measuring different elements of the algorithm.
This allows us to establish minimum & maximum ideal limits for each component.
This testing process gives search engine optimisation companies an unfair advantage over the average website owner.
We have much more detailed data and site examples to analyse, allowing us to draw conclusions and confirm data.
We put all that data into our own blueprint for the perfect page.
This web page ‘blueprint’ is the framework we use to optimize each page of a website.
The basic on-page SEO factors are these:
<title>What Is The Best Title Tag For Google?</title>
The Title Tag is one of the most important elements on the page.
It is used to make up the top line of your search listing (in most instances) and is the text that appears at the top of your browser window when you view the page.
You have 512 pixels of space, which equates to roughly 65 characters.
The earlier your primary keywords or target phrase appears in your title tag, the more weight it will carry.
Ideally, you want to include your primary phrase once at or near the beginning, followed by a latent semantic variation of your term.
Do NOT keyword stuff the title tag.
As a rule of thumb, don’t exceed the 65 character limit.
Titles over 65 characters have been known to rank well in limited volume, but used site wide they just look like spam content.
Write your Title Tag for people as much as for search engines.
Bear in mind that it is your main opportunity to convince searchers to click through to visit your website.
Each page title should be unique so spend some time working on great titles.
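If you want to sanity-check your titles in bulk, a minimal script like this can flag the obvious problems. The 65-character cutoff and the ‘keyword in the first half’ heuristic are rules of thumb from this guide, not Google-documented limits:

```python
# Minimal sketch: flag title tags that break this guide's rules of thumb.
# The 65-char limit and "first half" heuristic are assumptions, not
# documented Google thresholds (the real cutoff is pixel-based).

def check_title(title, keyword, max_chars=65):
    issues = []
    if len(title) > max_chars:
        issues.append(f"title is {len(title)} chars; aim for {max_chars} or fewer")
    pos = title.lower().find(keyword.lower())
    if pos == -1:
        issues.append("primary keyword missing from title")
    elif pos > len(title) // 2:
        issues.append("keyword appears late; earlier placement carries more weight")
    return issues

print(check_title("Red Widgets | Buy Quality Widgets Online", "red widgets"))  # []
```

Looping a check like this over every page title is a quick way to spot duplicated, over-long or keyword-less titles across a whole site.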
Read more…
<meta name="Description" content="Add your captivating and compelling keyword rich content here." />
Along with the title tag, the description tag is used to make up your listing in SERPs.
The description is your opportunity to convince search traffic to visit your site.
While the page needs to be congruent with your title and meta description, you also need to use variations of your main keywords once in the description tag.
The description tag isn’t always used in search results.
Sometimes, Google can opt to use your Open Directory Project listing description if you have one.
Alternatively, they can use relevant sections of your page if they consider that it offers the user a better experience.
Do NOT keyword spam your description tag.
You have 158 characters including spaces; any additional text will not be read and can be considered spammy.
Make every Description unique on every page.
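To keep descriptions inside the limit programmatically, you can trim at a word boundary, as in this small sketch. The 158-character figure is the guideline above, and Google may rewrite your snippet regardless:

```python
# Sketch: trim a meta description to the ~158-character guideline used
# in this guide, cutting at a word boundary rather than mid-word.

import textwrap

def trim_description(text, limit=158):
    if len(text) <= limit:
        return text
    # shorten() collapses whitespace and truncates at a word boundary.
    return textwrap.shorten(text, width=limit, placeholder="…")

print(trim_description("Buy quality red widgets online."))
```

This is handy when descriptions are generated from page content, where overruns are easy to miss by eye.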
Read more…
<meta name="keywords" content="" />
Do NOT use the keyword tag for anything!
Seriously, all the keyword tag will do is draw Google’s attention to the terms and phrases that you would like to rank for, which actually makes it easier for them to not rank you for those terms.
There is no benefit to having the keywords tag in your metadata.
<meta name="robots" content="index, follow" />
The robots tag can be used to exclude pages from being visited and indexed by search engines.
Additionally it is used to stop bots from following any links on the page.
You can also give additional instructions to the robot including whether to use your ODP (Open Directory Project) description in your Google snippet, or not to archive (cache) a page.
As a standard, all search engine spiders will attempt to index and follow the links in every page that they can find.
On that premise, there is no need to include a default robots tag: <meta name="robots" content="index, follow" />.
<meta name="robots" content="noindex, follow" />
<meta name="robots" content="index, nofollow" />
<meta name="robots" content="noindex, nofollow" />
<meta name="googlebot" content="nosnippet" />
| Directive | Effect |
| --- | --- |
| noindex | page will be excluded from the index |
| nofollow | page links will not be followed |
| noarchive | cached page not shown in SERPs |
| noodp | ODP description not used in SERPs |
| nosnippet | no snippet shown in SERPs |
| none | same as “noindex, nofollow” |
A robots tag can be useful if you want to exclude certain pages from being indexed by a search engine.
For example; similar product pages, how to find us, about us page, etc.
<link rel="canonical" href="https://deehoseo.com/" />
As a standard, Google treats https://deehoseo.com and https://www.deehoseo.com as two different websites, regardless of the fact that they are actually one and the same.
Problems can occur when people find your site either with or without the www element of the domain URL.
When this happens, and people link to both variations of your domain, it can quickly cause ranking issues.
The canonical tag is used to tell Google which version you want to be used.
We always use URLs with the www included as our standard.
Google doesn’t promise to abide by your preference, but they do treat the tag as a strong indication of what you would like them to do.
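Following our own www-first standard, a canonical tag pointing both domain variations at the preferred version might look like this (an illustrative sketch; swap in your own preferred variant):

```html
<!-- Placed in the <head> of every page, including the non-www variant,
     so that Google consolidates ranking signals on the www version: -->
<link rel="canonical" href="https://www.deehoseo.com/" />
```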
<h1>Your main Page Heading Goes Here</h1>
Heading tags help your visitors to identify relevant sections and subsections of your page.
They also indicate the different topics on your pages to search engines, so write them with both these factors in mind.
While it isn’t easy to prove or disprove that heading tags contribute significantly to search results, it is a good practice to use them wisely.
One H1 tag per page
One or more H2 tags per page
One or more H3 tags per page
H4, H5 and H6 if and when required, but not essential.
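Put together, a page following that structure might be outlined like this (the topics here are purely illustrative):

```html
<h1>SEO Tutorial</h1>
  <h2>On-Page Optimisation</h2>
    <h3>Title Tags</h3>
    <h3>Description Tags</h3>
  <h2>Off-Page Optimisation</h2>
    <h3>Link Building</h3>
```

Note the single H1 describing the whole page, with H2s and H3s marking out its sections and subsections.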
<img src="man-crossing-road.gif" alt="man crossing A12 road">
Search engine crawlers can’t ‘see’ images.
In order to explain what your image is about, you have two opportunities to add a valuable description.
The first is by naming the image with a suitable phrase or term that is appropriate to the image.
The second is by adding a congruent ‘alt text’ element.
Do NOT over-optimise alt tags with excessive keywords.
Alt text is frequently spammed and is a common reason for ranking demotions in Google SERPs.
If you have used related images on your page, then natural descriptions of what they contain will work to reinforce page topic.
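To make the contrast concrete, here is the earlier image with a natural, congruent alt text alongside a keyword-stuffed version of the kind that attracts demotions (both examples are illustrative):

```html
<!-- Descriptive, congruent alt text - describes what the image shows: -->
<img src="man-crossing-road.gif" alt="man crossing A12 road">

<!-- Over-optimised, keyword-stuffed alt text - avoid this: -->
<img src="man-crossing-road.gif" alt="cheap SEO best SEO company SEO services road crossing">
```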
The quality of the content that you publish on your site is very important to the rankings you will achieve.
There are several reasons for this;
Google analyses your pages for originality, quality, value and benefit.
As well as how real people interact with your website.
There are three basic rules for successful page text that search engines will love; it must be:
Original and unique
Written to a good standard, oozing quality
Valued by your visitors
If you publish low grade content, duplicate text, badly written, or spun content that doesn’t make sense then engines can instantly filter it out of their results.
Once you get past that initial process and Google still likes you, the search engine bots then look at how your visitors interact with your pages.
Do your visitors stick around, reading, watching videos, signing up to get more from you, clicking through to deeper pages, or do they leave shortly after arriving?
Search engines are in the business of providing answers to people’s questions.
Imagine someone searches for ‘Garden Centres near Grimsby’ and you have written a page all about how to plot the exact centre of your garden in Grimsby.
You may, initially at least, receive some traffic from people looking for garden centres.
It would be reasonable to assume that most people who were looking for a local plant shop would in all probability click the back button if they landed on your site.
This would give your page a high bounce rate, which in turn will demote your site for ‘garden centre’ searches.
https://deehoseo.com/page-1.php or https://deehoseo.com/seo-tips.php
Your page URL is an easy win for optimisation purposes.
Some pages without optimised URLs do, in some instances, rank well for highly competitive search terms.
They normally have very high trust and authority scores.
For less powerful sites, you can make your life much easier and give your site a boost by using appropriate URLs for each page.
URLs are a good indication of probable content.
As with the other elements on your page, it is important to aim for congruence in all things, so stick to the topic where possible.
Contextual linking to more of your web content is a very important element to improve rankings and page quality.
Advanced internal linking is all about passing link juice to the pages that you want to help.
You can also restrict the passing of page rank to preserve as much as possible for other pages.
Google’s view seems to be that it is beneficial if you link, where relevant, to related content.
The content that you need to link to should be highly valuable (like all your content).
This content needs to offer more than is available on the original page (So don’t just duplicate your core content on a new page just to have something to link to!).
Most sites have a defined menu structure at or near the top of the page.
Contextual links allow users to find the relevant information they are looking for without having to search for it.
There are several different internal linking strategies currently being used on a large scale.
One of the most popular is to link to one specific page using contextual links from 8 or 10 of your content pages.
Testing has shown that there is a ranking benefit if the target page doesn’t have contextual links back to the 10 linking pages.
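A contextual link is simply a descriptive anchor placed naturally within your body copy. As a sketch (using one of the example URLs from earlier in this guide), it might look like this:

```html
<p>For a deeper look at optimising each on-page element, our
<a href="https://deehoseo.com/seo-tips.php">on-page SEO tips</a>
walk through titles, descriptions and headings in turn.</p>
```

The anchor text describes the target page’s topic, which is what passes the topical relevance along with the link juice.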
Where needed, it is best practice to link out to other website content, rather than regurgitating that content on your website.
For example, if your page is all about coffee importing, you might want to include information about annual consumption, tonnage grown in which countries etc.
Instead of copying that information, it is preferable to link out to the best default resource that you can find on the subject.
It might be; Wikipedia, a Government statistics site, a retail sales statistics site etc.
The key here is not to be selfish, but to link out to good quality sites and information, it is after all the key to how the web works.
Generally speaking, it isn’t ideal to have external links on your home page, so use them where applicable on your deeper content pages.
Off-page search engine optimisation is primarily all about back-links and building your websites authority.
The aim and purpose of creating back-links is to have respected, trusted on-topic websites link to your content as a recommendation of quality content.
Google is getting smarter and their methods of detecting, measuring and passing ‘authority’ status have improved massively since 2011.
As a consequence, there are several important attributes that your back-links need to exhibit. (See the off-page checklist below).
Focus your efforts into building quality links in line with these attributes and you will create strong links that will support your site and increase your rankings over time.
As a rule of thumb, when you create a new page with a link on it, it will hold little or no value for at least 90 days.
Because of this, many people start to think that either they are building the wrong kinds of links, not enough of them or that they don’t work to improve rankings.
The important thing is to keep chipping away, drip feeding new links from a variety of different sources.
Over time they will influence your rankings, positively.
If you create new links on Google indexed sites then Google will find them and start to index them.
As the 90 day period passes, you should be reinforcing those links by building links from a variety of sources to them too.
This gives them credibility and increases their value, negating their ‘orphan’ status.
In a word yes.
However, there are articles and there are articles.
The article publishing arena has been flooded in recent years with very low quality content, created on the whole by webmasters and marketing companies looking to create as many low cost links as possible for their clients.
This resulted in many article directories publishing low quality ‘spun’ content articles, which often make no sense, are not related to the links they are trying to support, and consequently have no authority.
For a while this strategy worked, and many sites benefited in the short term from the creation of these junk links.
Today however, article publication requirements are very different which is why we have a dedicated content copywriting team to create high quality articles for our clients.
Good articles offering information that readers will find educational and informative are still important, and should be a part of your link building strategy.
Your first step is to make sure that the article directory that you find is a Google friendly one.
The easiest way to do this is to Google the root domain (http://www.thearticledirectory.com); if nothing is found then it isn’t in Google’s index.
If there are results for the article directory, then you know it is at least not out of favour with Google.
Some article directories have high trust scores, making them possible linking opportunities.
As a rule of thumb, if you don’t try to manipulate your ‘follow/nofollow’ link ratios, the majority of sites will develop a natural blend of the two link types.
You don’t need to pay too much attention at this stage to whether the article directory will give you a nofollow or a do follow link.
Both are natural for longer term Google rankings.
In order to keep your back-link profile within Google’s accepted limits, the trust score of the article directory is not too important either.
There are, it goes without saying, far more low score sites in the world than there are high scoring ones.
Just targeting the highest authority sites looks unnatural and Google will penalise you.
1 – Write high quality, original copy that adds value to the article site, be it opinion, conjecture, analysis etc. Have a point and write it well.
2 – Choose where you publish your articles very carefully. Choose high quality sites. (The harder it is to get published, the better the article directory)
3 – Write about something themed to the link you want, publish the article on a themed article directory, link using themed, but varied link text.
4 – Don’t duplicate your article, 100% rewrite it for each directory.
Again, in a word, yes.
Blog comments have been hijacked over the years by certain areas of the online community as a source of easy to find links.
Many bloggers had their comments set to ‘auto approve’, so there was no policing of spam comments.
This meant that in many cases, reputable blogs were swamped with spam comments, often linking to undesirable neighbourhoods.
Penguin has taken steps to rectify this problem, and it is now more important than ever to make sure that you find good authority blogs in order to create links that have some value.
Once you have found a suitable blog, that is related to your topic, and offers good content, you should build a relationship with the blog owner.
It is good practice to initially comment on the blog, saying something that either adds to the conversation, or adds value to the blog post.
Initially don’t include any links as you are trying to make friends with someone at this stage.
Once you have made a few comments and interacted with the blogger, it is well worth adding a link from your blog, Facebook etc to their blog.
In your next blog comment to them, tell them that you have done this to share their content with your visitors. (Quite often they will reciprocate and link back to you).
Once you have created a rapport with the blog owner, they will be far more accepting of you including a related, useful link to your content.
Our link building team can manage this process for you, creating links to your pages from good quality related blog sites.
This process helps to create a ‘natural’ internet presence for your website, enhancing your site content.
Directory links have taken a hit in recent years, at least as far as Google is concerned.
Many of these directories have been removed from Google’s index, leaving any links they provide with little or no SEO value.
Some link directories still have merit however, most noticeably the Open Directory Project (ODP) which Google still values highly.
Forums have lost a great deal of their link benefit in recent years; forum linking was a heavily abused strategy up until recently.
Personally we expect forums to take a bigger hit in the coming months and years as they have the potential to pass a great deal of themed PR and authority, but have a low barrier of entry, meaning that anyone can build forum links in a few minutes.
Guest blogs were hailed as the saviour of ranking manipulation in the wake of the Google Penguin Algorithm launch.
At first glance, they offer huge potential gains with little downside, but Google soon got wise to the myriad of guest blogging networking sites that sprang up online.
While the idea was simple, namely to put content writers in contact with eager publishers, the end result was to build a network of guest blogging sites, all of which were influencing Google SERPs.
As we have already discussed, Google isn’t enthusiastic about anything that intentionally manipulates results that they produce.
The Google anti-spam team have hit several guest blogging networks in the past few years.
The most noticeable was MyGuestBlog, which was previously one of the larger and more highly regarded networks online.
Overnight, users of the service began to see penalty messages on their Google Webmaster Tools accounts.
These indicated that they had been penalised for using a blog network that Google didn’t approve of.
This was the first instance of Google penalising sites for who they link to as well as who links to them.
This represented a step change in the way Google is planning to deal with undesirable networks in the future.
Guest blogging can still be a powerful way to build quality links, but you should stay away from automated networks that make it easy for you.
There is nothing to stop you from finding a quality blog related to your niche and contacting them offering to write a post for them.
They may very well post your new article on their site and link back to you.
The danger is that if any of the sites you contact are themselves a part of a guest blogging network, then you might find yourself being penalised just because they link to you.
If you are going to use guest blogging as a link building strategy, it is wise to take a few precautions.
You can carry out a few simple checks on the blog you intend to have a link on.
Check it is indexed in Google. (This will show you that it isn’t currently penalised)
Analyse the blogs backlink profile. (Does it look natural?)
My opinion is that having a few guest blogs amongst your back-links looks natural.
However, if the majority of your linking strategy is guest blogging then you are leaving yourself open to potential penalty issues.
If you use guest blogging as a way to gain exposure on respected blogs rather than purely to attempt to manipulate Google then we believe it still has a legitimate place in your linking arsenal.
Press releases can vary greatly in their quality and influence.
Most free press release distribution services are not worth the money!
I am not about to tell you how to write a compelling press release, plenty of people have already covered that….. one of my favourites is the Huffington Post article linked below.
Press releases often have a finite lifespan, but if used on a consistent basis they can have a dramatic effect on your rankings.
They are especially effective as a means of boosting viral marketing campaigns in the shorter term.
Be aware though that press releases usually pass “News” category relevance, which will not be relevant to your niche, and you can accrue a lot of links quite quickly.
This can change your backlink profile considerably.
Social bookmarking is a Google friendly way to support the links that you create.
I’m yet to meet anyone who has ever used social bookmarks as they were originally intended.
That doesn’t stop them from being a useful tool for improving your traffic streams.
While we don’t tend to use social bookmarking to link directly to our ‘money sites’ themselves, bookmarks are a good way to get your links indexed.
If you create a spread of social bookmarking accounts, their effect on your rankings can be quite profound.
Is the page content on each page unique?
Unique page title on every page?
Is the description tag unique on every page?
Does every page html and css validate?
Do you have one relevant H1 tag on each page?
One or more relevant H2 tags on each page?
Do you have one or more relevant H3 tags on each page?
Are your pages linked internally where suitable?
Do you link out to authority references where necessary?
Is your bounce rate below 55%?
Does your site load fast enough? (Use Google’s PageSpeed tool to check load speed; 80/100 is the minimum requirement)
Do you have a variety of links from different platforms pointing at your site?
Do you use the correct ratio of link anchor text to balance your links?
Are your links from different Class C IP addresses?
Do you link to different pages of your website?
Are your links on relevant websites?
Do you have a mixture of NO FOLLOW and DO FOLLOW links?
Do you avoid link farms and networks?
Are your links on ‘Orphan’ pages, or do your links have links?
To boil the whole process down into one sentence, you need to do the following:
“Create quality pages of unique content that offer real value for readers, then link to them using a variety of natural link text, from relevant, Google indexed pages on different IP addresses, that themselves have relevant links pointing to them from a variety of Google indexed sites.”