Search engine optimisation – commonly abbreviated to SEO – is the process whereby a
web site, or more specifically a web page or document, is constructed or amended in such
a way as to improve its placement in the search engine results pages or SERPs. Search
engine optimisation should not be seen as an end in itself. It is a function that should be
undertaken to improve the overall commercial performance of a web site.
Good search engine optimisation will ensure that a page appears higher in the search
engine results for a range of relevant, specific and valuable search terms or queries. The
simple objective of SEO is to generate more valuable web site traffic. The achievement of
a higher ranking against relevant search terms has commercial value for a web site
because it will attract more traffic than a lower ranking. In an increasingly crowded
online environment, search engine optimisation is therefore a crucial online marketing discipline.
The role of SEO is to legitimately influence the process of improving rankings. There are
few genuine guarantees of a top placement, particularly for highly competitive search
terms. Good SEO will improve a web site’s ranking across a range of selected terms.
However, any process whereby a search engine is illicitly manipulated in order to
guarantee a high placement is referred to as spamming.
The successful execution of a search engine optimisation project requires skills in the
areas of analysis, research, planning, copy writing and communication. A comprehensive
search engine optimisation project is divided into four interrelated phases.
1. Pre-site activities – The research and planning activities undertaken before an existing or new site or page is actually touched or built:
   - Understanding your organisation’s online business strategy
   - Researching your market category, customers and competitors
   - Keyword research and selection
2. On-site activities – The activities directly involved in the content and design of the site itself:
   - Writing the title, description and keyword meta tags
   - Writing content – body copy, titles, image tags and outbound links that reflect and reinforce your chosen keywords
   - Building internal links – helping the search engines navigate the site
   - Site design and construction – ensuring the web page uses design and code that can be properly crawled and indexed by the search engines
3. Off-site activities – Building a portfolio of quality inbound links to your web site.
4. Post-site activities – Analysing and responding to site traffic and user feedback once a web site has been optimised. Effective SEO is a continuous activity.
Changing Industry – History of SEO
The 10-year history of search engine optimisation is closely tied to the underlying growth
of the internet and the development of its attendant search technologies. The three driving
forces have been:
1. The growth and commercial value of the internet consumer base.
2. The rising intensity of competition in online market categories.
3. The increasing sophistication of search technologies.
With the growth in value, the rewards for success in the marketing battle have risen
significantly. With an increasingly crowded internet, search has become a more important
component of commercial success. Without search, how is your site found? As a result,
attempts to legitimately manage or illegitimately manipulate search results have become
motivated by the greater rewards on offer.
The early days of search engine optimisation go back to the mid-1990s when the internet first
began to attract significant numbers of web sites and users. In those early days, emphasis
was on the submission stage – getting your site placed into as many search engines as
possible. The most important aspect of a search engine algorithm appeared to be entirely
“on-page” based and was focused almost exclusively on meta tags and their related keywords.
Search algorithms could be decoded simply by analysing the results pages. During the
late 1990s, ethical SEOs and spammers alike realised that search engine results could be
manipulated by the simple process of adjusting a site’s meta tags to match the desired
keywords. During this period there were many crude attempts by spammers to stuff meta
tags with irrelevant but popular search terms. Famous spamming keyword meta tags have
included “Britney Spears” on sites with nothing to do with Britney Spears. It just
happened to be that Britney was one of the most searched for terms.
Google’s arrival in 1998 and the introduction of its “off-page”, link based, approach
signalled the beginning of the end for the exclusively meta tag driven approach. Google
was really the first engine to establish that sites carrying similar content had a propensity
to be linked. Google’s strength appeared that the relevance of its results was less
vulnerable to the orthodox spamming techniques of its day. Search users were attracted
by its relevance to their search needs. In essence the key to success under the Google
algorithm was not what your site said about itself but what the links from other sites said.
The Google spider apparently ignores keyword meta tags entirely and only the MSN
spider apparently places any emphasis on them at all. Abuse of the keyword meta tag by
spammers led to its downfall. Google’s subsequent rise to dominance eventually
transformed the SEO industry. Google’s rise in popularity forced many competitor search
engines to fall by the wayside or to be consolidated with larger parents such as Yahoo.
Due to Google’s success, both Yahoo and Microsoft, through its newly independent and
revised MSN search engine, have had to take on board many of the features of Google’s
approach. The influence of inbound links continues to increase.
What to avoid in SEO
In recent years a number of illicit techniques have grown up to artificially manipulate a
web site’s ranking. These techniques are referred to as spamming or sometimes “Black
hat” techniques. The “black hat” description refers to the fact that in the old western
movies the bad guys always wore black hats. The core of any spamming technique is the
attempt to deceive the search engine, and ultimately the site visitor, about the true nature
of a web site’s content.
The question is whether spamming techniques actually deliver any long term benefit. In
addition, it is known that using proscribed spamming techniques can get the spammer,
their client sites and organisations delisted by the major search engines. It has happened
publicly in the past, and the search engines, particularly Google, place great emphasis on
their warnings. Google even has a page for reporting spamming offenders.
I have identified a list of nine types of illicit SEO or spamming techniques.
Keyword stuffing is the technique of excessively using keywords with the express
intention of influencing the search engines. Quite often this use appears in an
incomprehensible or ungrammatical manner. Keyword stuffing is often used in
conjunction with other spamming techniques such as cloaking, doorway sites, hidden text
and tiny text.
Cloaking is the technique whereby the web site visible to a site visitor is entirely different
from that seen by a search engine spider. The ordinary user may see one set of text and
images but underneath that image, or “cloak”, the site is “stuffed” with keywords. By
examining the cache of a cloaked site on the Google search results, we can see that the
site shows entirely different information to the spider, from that shown to the human eye.
A doorway site is a site that acts as a referring page for another site. The doorway page is
highly optimised – containing hidden links and keywords that the ordinary web user
never sees. The doorway site then climbs the search engine rankings but re-directs all of
its traffic to the target – and perhaps poorly optimised – site.
Throwaway sites are almost always doorway sites. They are web sites built by spammers
to provide a short-term and artificial boost to traffic. Once their traffic objectives are
achieved they are often switched off or left to decay – hence throwaway. Throwaway
sites are stuffed with links and keywords to attract and then re-direct traffic to a target
web site. Typically, the spammers retain ownership of the throwaway domain. The
spammers’ clients initially receive large amounts of traffic. But once the throwaway site
is switched off – or thrown away – the traffic comes to an abrupt halt and the client’s
business suffers. The clients are then effectively blackmailed into spending vast sums to
retain traffic. The target web site receives no long term ranking benefits.
Mirror sites use an alternative URL to the target site but contain identical content. With
automated page production, there may be hundreds of different URLs all with the same
content. This technique is sometimes referred to as domain duplication.
Hidden text is the technique of filling or “stuffing” a page with keywords invisible to the naked eye.
This is done by using the same colour for text as for the background page. This technique
is sometimes referred to as WOW, short for white on white.
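By way of illustration, a crude check for white-on-white text might compare inline colour declarations, as in the Python sketch below. Real spam detection is far more sophisticated (external stylesheets, near-identical shades, background images), so this is only a toy, and the sample markup is invented.

```python
import re

def has_wow_text(html):
    """Return True if any inline style sets text colour equal to background colour."""
    for m in re.finditer(r'style="([^"]*)"', html, re.IGNORECASE):
        # Parse "color: #fff; background-color: #fff" into a dictionary
        decls = [p.split(":", 1) for p in m.group(1).split(";") if ":" in p]
        style = {k.strip().lower(): v.strip().lower() for k, v in decls}
        fg = style.get("color")
        bg = style.get("background-color", style.get("background"))
        if fg and fg == bg:
            return True
    return False

snippet = '<div style="color: #ffffff; background-color: #ffffff">buy pills</div>'
print(has_wow_text(snippet))  # True
```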
Tiny text is a technique of using very small text that is barely visible to the human eye.
This text can be read by the engines. However, the engines are also likely to treat this text as spam.
In many respects, due to the increasing influence of links, it was inevitable that link
spamming would become an issue. Spamming of links has been a growing problem as
many people have realised the importance that Google, in particular, places on links. As a
significant issue it raised its head in April 2005 when Google’s new release appeared to
ban one of the leading SEO firms from its rankings. Few people outside of Google and
the SEO firm concerned are entirely sure why this is the case. But the industry consensus
is that Google are cracking down on web sites and organisations that accumulate vast
numbers of irrelevant links with the sole intention of climbing the rankings.
Related to link spamming is comment spam. Comment spam is where a spammer visits a
publicly accessible site and deposits a comment with an anchor text link back to a
designated site. Forums and blogs are typical targets. This activity became identified as a
major problem in January 2005 when Google took steps to prevent it from the blogs of
Blogger.com. The reason was that spammers working for so called PPC (Pills, Porn and
Casino) web sites were trawling legitimate blogs and posting uninvited comment
advertisements with their web site’s anchor text. Blogs were vulnerable because they
typically possess a comment section that can be accessed without the need for passwords
or even registration.
How to Optimise Your Site
This section describes the key processes undertaken to obtain a higher organic ranking
with the major search engines.
How search engines work is part of their proprietary knowledge. The exact workings of
their algorithms are closely guarded commercial secrets. However, guidance to how these
algorithms (or algos) work can be found or deduced from various sources. Some general
guidance is available free, directly from the search engines’ own web sites. Some
guidance can be found from examining the various Google and related patents. Some
general guidance can be found from authoritative articles on SEO forum sites. However,
real world applications of this knowledge can only be found by experimentation and trial and error.
There are some general rules. Applying them will provide a route to improved search
engine visibility. The guidance in this section could be broadly applied to the three main
engines – Google, Yahoo and MSN. However, given its dominance, much of the advice
is derived from my interpretation of the Google “Hilltop” patent of 2001. The patent is
believed by SEOs to have been the basis of the so-called Google “Florida” update of
November 2003.

(Figure: The Four Phases of an SEO Project)
In addition to definitive information about the workings of search engines, there is much
speculation, myth and rumour. There are many spurious ideas in circulation and applying
them may do more harm than good. In this section, I will try to stick to tried and trusted techniques.
How Search Engines Gather Information
Search engines gather information by crawling web sites. They crawl from page to page
visiting sites already known and by following the links that they find. Whilst crawling,
the robots, or spiders, gather information from the source code of each site and then send
back that information for indexing. The spiders were designed to read HTML code or
code related to it, such as XHTML or PHP. The spiders find it difficult to read pages
written in Flash and some other popular web formats. Spiders cannot directly read
JavaScript or images. They can, however, read the alt tags that may be provided with
GIF, JPEG or PNG images.
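To make this concrete, here is a rough Python sketch of what a spider can and cannot read. It collects visible text and image alt attributes while skipping script contents; the sample page, and the idea that this approximates a real crawler, are simplifying assumptions.

```python
from html.parser import HTMLParser

class SpiderView(HTMLParser):
    """A toy model of a crawler: gathers visible text and alt text, ignores scripts."""
    def __init__(self):
        super().__init__()
        self.in_script = False
        self.fragments = []

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            self.in_script = True
        elif tag == "img":
            # The image itself is unreadable, but its alt text is not
            alt = dict(attrs).get("alt")
            if alt:
                self.fragments.append(alt)

    def handle_endtag(self, tag):
        if tag == "script":
            self.in_script = False

    def handle_data(self, data):
        if not self.in_script and data.strip():
            self.fragments.append(data.strip())

page = '<p>Welcome</p><script>var x=1;</script><img src="logo.gif" alt="Acme logo">'
parser = SpiderView()
parser.feed(page)
print(parser.fragments)  # ['Welcome', 'Acme logo']
```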
SEO 1 - The Pre-Site Phase
Search engine optimisation is a marketing discipline. It is not a stand-alone function.
Before any specific optimisation activity is undertaken, it is essential that two
non-search areas are appraised:
Understanding your Organisation’s Online Business Strategy
Good SEO requires a thorough understanding of your organisation’s overall business
strategy. How does search fit in with activities such as advertising, e-mail and direct
marketing? Is there a marketing plan? What does it say about objectives, strategy and
budgets? What is the overall direction of the business and what can search contribute?
Researching your Market Category, Customers and Competitors
Good SEO also requires a thorough understanding of the market category within which
the search project and web site will compete. What is the category size and how is it
developing? What other channels to market are there? What information is available
regarding the behaviour and attitudes of customers? What role in the buying process is
played by search marketing? Who are the current and likely competitors? Once the above
is fully grasped you can proceed to the first real activity of SEO: keyword selection.
Keyword Selection - Factors
Keyword selection is the first search specific discipline. Having explained that spiders
read and index text, we find that some text is more important than others. That text is
keywords. Valuable keywords are the words or phrases that prospective customers use
when searching in your market category. Keyword selection is therefore crucial and has
implications for so much else within search. I have drawn up a list of factors that should
be taken into account when selecting keywords.
The first thing to remember is that the number of keywords you can use on any one site
or page has a finite limit. A general recommendation is that there is an overall limit of 20
individual words. In my opinion – due to other factors – the limit should be drawn much
tighter than this. Rather than a limit of words, I prefer a limit of characters – including
spaces - of no more than 64. In essence, you must be sufficiently focused to sum up the
key priorities of your business within this limit – typically no more than 6 to 8 words.
The only way around this limit is to have an endless number of pages on an endless
number of sites – all optimised, monitored and updated on a regular basis.
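As an illustration of this character budget, the short sketch below joins a candidate keyword list and checks it against the 64-character limit suggested above; the example keywords are invented.

```python
def within_keyword_budget(keywords, limit=64):
    """Join keywords with spaces and check the total length, including spaces, against the limit."""
    phrase = " ".join(keywords)
    return len(phrase) <= limit, len(phrase)

# A hypothetical keyword list for a garden furniture business
ok, length = within_keyword_budget(["garden", "furniture", "teak", "benches", "tables"])
print(ok, length)  # True 36
```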
You should use words or phrases that have sufficient search volumes for your needs.
You can find out about search volumes by checking with Word Tracker software or
Yahoo’s Overture keyword suggestion tool. Read more about these tools below.
A place to look for keywords is where you enjoy some competitive advantage. How are
your products or services differentiated? What are the real strengths of your business
compared to your closest competitors? What proprietary advantages do you enjoy? What
is it you do better that may persuade prospective purchasers to visit your site?
You may have decided on your own keyword priorities but you must also check out the
competition for those keywords. Selecting a word or phrase already prioritised by a
multitude of competitive sites will see you struggle for visibility. Try to find words or
phrases that appear ignored or underutilised by your competitors. An alternative but
higher risk approach is to see what keywords are used by competitor sites and then
attempt to outmanoeuvre them by better use of links, content and meta tags.
The keyword terms you select must be relevant, salient and part of the vocabulary used
by the audience you are seeking to attract. If that audience is a consumer one it is unlikely
to use jargon. The opposite may be true if you are seeking B2B prospects. My experience
suggests that consumers will often use entirely different vocabulary from marketing,
advertising and IT people. To avoid confusion use simpler but more specific terms.
Making your keyword choice
In essence, you must synthesise all of the above five factors in selecting and refining your
keywords. Ignoring any one of the factors could create problems. Do not rush into this
process. Test out your keywords by making trial searches on the major engines and see
what company you would be keeping. Getting it wrong may involve a large amount of reworking later.
SEO 2 - The On-Site Phase
Writing Meta Tags
There is much debate about the current value of meta tags. I still find them very effective
– both as an end in themselves and also as a guide to producing better and more search
friendly content. Although Google apparently ignore their contents, MSN and Yahoo
both still utilise the site title and description meta tags in their search algorithms. MSN’s
newly launched web site still makes reference to the value of the keywords meta tag.
Meta tags are so called because they sit above the site – in the “head” section – and are
not visible to the casual site visitor. The meta tags can be found between the <head> and
</head> tags of the HTML code, as the description suggests, at the top of the page.
Meta Tag Priorities
The Site Title Meta Tag
The site title tag is the most important meta tag. The site title meta tag is still read and
indexed by all the major engines. How do we know this? Because it appears at the top of
each organic search entry in the search engine results pages. However, some SEOs
dispute whether it is really a meta tag at all – because the information the tag contains is
clearly visible in the title bar at the top of the browser window. The
recommendation of RFC 1866, the international standard for HTML, is that the tag
should contain no more than 64 characters, including spaces. There is nothing physically
stopping you exceeding this limit. I have seen some major sites with 150 characters in
this tag. However, the typical browser can only show 70 or so characters and, with more
characters, the impact of keywords within the tag is progressively diluted.
From my experience the keywords in the early part of the tag carry more weight. I
personally prefer a limit of 50 to 55 characters. Checking the quality of the title meta tag
is the quickest way of assessing whether a site has been optimised.
A key debate, given the character limitations, is whether you should include the
organisation’s name in the title meta tag. Much depends on the name’s length and whether
it includes desired keywords. My view is that with limited space, you are wasting a
valuable resource if you use your organisation name here.
The Site Description Meta Tag
The site description is the second most important meta tag. It is read by the engines
Yahoo and MSN and still plays a significant role in their searches. The site description
should tell the engine about the nature of the web site. It is recommended that this is done
in no more than 200 characters including spaces. It should be presented using good
grammar and avoiding repetition. The site description should include relevant keywords.
Keywords Meta Tag
You will have read in the previous section on search history that the keywords meta tag
is, today, ignored by most of the search engines. So, if the spiders do not take them that
seriously, why do we still have them? In my opinion the main role they perform is one of
internal guidance and discipline for the web master. If you know what keywords you are
looking for it is easy to test your content to ensure there’s a match.
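The limits discussed above – 64 characters for the title and 200 for the description – can be checked mechanically. The sketch below audits a page’s title and description meta tag lengths using Python’s standard HTML parser; the sample page is invented for illustration.

```python
from html.parser import HTMLParser

class MetaAudit(HTMLParser):
    """Collect the title text and description meta tag content from a page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            a = dict(attrs)
            if a.get("name", "").lower() == "description":
                self.description = a.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

# A hypothetical page for a garden furniture business
page = ('<html><head><title>Teak Garden Furniture - Benches and Tables</title>'
        '<meta name="description" content="Hand-made teak garden benches and '
        'tables, delivered across the UK."></head><body></body></html>')
audit = MetaAudit()
audit.feed(page)
# Check the recommended limits: 64 characters for the title, 200 for the description
print(len(audit.title) <= 64, len(audit.description) <= 200)  # True True
```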
Writing Content
Content is deemed to be increasingly important by many in the SEO field. This is
apparently because, with spamming and other optimisation techniques becoming
increasingly sophisticated, only the body copy can give the search engines a true indication of a
site’s content. There are some general rules, among them that the more copy the better –
aim for 250 words. Secondly, look to use the keywords in two or three word phrases.
Where should content with keywords be located? High up on the first page is the general
rule. Certainly get keywords into the opening sentence or paragraph. The latest MSN
engine picks out and uses a selection of text from the opening paragraph in their site
description. This implies that the MSN algorithm is placing additional emphasis on this
text. Keywords should then be spread throughout the first page and the rest of the site.
Keywords and their surrounding copy should be relevant to one another. Certain words
and combinations of words go together and the search engine algorithms know this. So
advertising goes with marketing. Food with drink. Photographs with film. Derivative
words with different usage also go well together: so market and marketed with marketing, etc.
There is much debate about density. Too little and the keyword or phrase won’t be picked
up. Too much and your site may fail the spamming test. Some SEOs suggest repeating
keywords no more than 7 times on any one page. Density is always measured in relative
terms. A page with a lot of copy will have more word repetition than one with few words.
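As a rough illustration of density, the sketch below counts occurrences of a keyword as a share of all words on a page. Real engines weigh density in far subtler ways, and the sample copy is invented.

```python
import re

def keyword_density(text, keyword):
    """Return (occurrences, percentage of total words) for a single keyword."""
    words = re.findall(r"[a-z']+", text.lower())
    hits = words.count(keyword.lower())
    return hits, round(100 * hits / len(words), 1)

# Hypothetical body copy for a teak furniture page
copy = ("Our teak benches are hand made. Each teak bench uses sustainably "
        "sourced teak and carries a ten year guarantee.")
print(keyword_density(copy, "teak"))  # (3, 15.8)
```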
Text within title tags has a greater weighting than ordinary copy. Within HTML code,
titles are marked up using the heading tags <h1> to <h6>.