A Brief History of the Evolution of SEO Techniques
By Adam Norris, digital developer and web analyst
SEO techniques have had to change as search engine companies have constantly adapted their functionality to give users the best possible experience. To understand today’s SEO strategies, it’s worth taking a look back to where it all began.
The B.G. era (Before Google)
Back in the early 90s when the World Wide Web was beginning to flourish, it soon became clear that there needed to be a way of finding what you were looking for amongst the growing sea of content.
One solution to this problem was to keep a directory of the most important and influential sites on the web. Companies like Yahoo! turned this into a business model by allowing sites to be added to their catalogue for a yearly fee.
Yahoo! Circa 1996 courtesy of web.archive.org
The Wild World Web of SEO Techniques
With the number of web pages growing exponentially, the problems with this approach quickly became clear and a better solution was needed.
By the late 90s, a few companies began using search engine technology to help users find what they were looking for. They did this by periodically indexing all the pages on the web and attempting to offer the best suggestions based on what a user had searched for.
Unfortunately, as these early search engines relied heavily on keywords (the words a user searched for) for their rankings, they were easy to manipulate. Controversial SEO techniques such as keyword stuffing and hidden text meant that they often returned poor quality results. There needed to be a better solution.
Birth of a Giant
In 1998, with significant investment from backers, google.com was launched. Rather than relying solely on keywords to provide search results, Google’s PageRank algorithm also analysed the links between sites. Instead of just counting the number of links to a site, PageRank placed a higher value on links coming from useful and trustworthy sources.
Despite this improvement, there was still room for deception. Some unscrupulous authors found that by posting links in the comments and forum sections of popular websites they could in turn increase their own rankings, a strategy known as comment spam.
Google Circa 1998 courtesy of web.archive.org
A Game of Cat and Mouse
To combat this problem, the three big players in the search world, Google, Yahoo and MSN, worked cooperatively to create a new value for use on links called ‘nofollow’. This update, released in 2005, instructed search engines that a nofollow link was not endorsed by the owner of a site and should be ignored for ranking purposes.
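In practice, a nofollow link is marked up with a `rel` attribute on the anchor tag. A minimal illustration (the URL and link text here are purely hypothetical):

```html
<!-- A standard link, which search engines may treat as an endorsement -->
<a href="https://example.com">A site I recommend</a>

<!-- A nofollow link, telling search engines not to pass ranking credit -->
<a href="https://example.com" rel="nofollow">A user-submitted link</a>
```

Forum and blogging platforms typically add this attribute automatically to any links in user-submitted comments, which is what makes comment spam unprofitable.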
Even with the change, it wasn’t long before newer ‘black hat’ SEO techniques began to appear which skewed the rankings again. It became clear that the only way to ensure good quality search results would be to release continuous updates to the ranking algorithm and to tackle each new problem as it arose.
As part of this plan, Google began rolling out a series of updates designed to tackle the most prevalent unethical SEO strategies. These updates are continually revised and often have a significant impact on the ranking of websites in search results.
Panda Update (2011)
The Panda update was intended to combat the rise of content farms. This practice involved companies employing freelance writers to create large amounts of low-quality content to maximize the chances of appearing in search results.
Penguin Update (2012)
The Penguin update was concerned with stopping the growing number of link schemes and other negative link building tactics, which were employed by some authors to increase their search rankings.
Hummingbird Update (2013)
The Hummingbird update introduced the concept of semantic search. This meant that instead of just looking at the keywords on a site, Google’s algorithm would effectively read all the content on a page to determine if it was a high quality and relevant result.
Fast, Accessible and Secure
Following on from this, Google moved its focus to examine the quality of a site along with its content. A further series of updates were released to ensure sites which provide a better experience to users were shown higher up in the search results.
In 2014, Google announced they would begin to use HTTPS as a ranking signal. This meant that sites using a secure protocol would gain an advantage over their competitors. (You can read more about this in our post: Why HTTPS is important for your business)
Responding to the growing number of people using mobile devices to browse the web, Google updated its algorithm to favour sites which are mobile friendly. (You can read more about this in our post: Is Your Website Mobile Friendly?)
Most recently, Google started tackling the problem of ‘intrusive interstitials’. These are whole-page pop-ups such as newsletter sign-ups on mobile devices, which often get in the way of users trying to read content and can cause accessibility issues.
Google Offices: Mountain View, California
The Lure of the Dark Side
Modern search engines are continually updating their algorithms to ensure they provide high quality results (Google itself makes around 600 changes to its algorithm per year). This constant change means that companies have to follow best practices and avoid black hat SEO techniques to make sure they aren’t penalised in the future.
The best approach to ensure your site does well in search engine rankings is to make use of positive SEO techniques such as on-page optimisation, ensuring a good user experience, creating high-quality content, and building relevant links.
For advice on your SEO strategy, our digital team is ready to help you with any questions you might have.