Content providers, including webmasters, web developers, and blog writers, began to see the benefits of optimizing their content for search rankings in the early to mid-1990s. The first search engines were simple listings of web pages meant to help users find pages easily.
Early web page creators would submit their pages’ URLs to the search engines directly; a page that wasn’t submitted was hard to find. Once a site was submitted, the search engine would send out a program known as a spider, so called because it would crawl through the submitted website, indexing pages, grabbing links to other pages, and returning the information to the search engine. Once the spider returned, another program would index the pages based on keywords and where those words were located.
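The crawl-and-index process described above can be sketched in a few lines of Python. This is a toy illustration, not any real search engine's code: the in-memory "web", the URLs, and the page text are all made up, and links are extracted with a simple regex rather than a real HTML parser.

```python
import re
from collections import defaultdict

# Hypothetical in-memory "web": URL -> HTML, so the sketch runs offline.
PAGES = {
    "/home":  '<a href="/about">About</a> welcome to our widget shop',
    "/about": '<a href="/home">Home</a> we make widgets and gadgets',
}

def crawl(start, pages):
    """Breadth-first crawl: fetch a page, index its words, follow its links."""
    index = defaultdict(set)   # word -> set of URLs containing it
    seen, queue = set(), [start]
    while queue:
        url = queue.pop(0)
        if url in seen or url not in pages:
            continue
        seen.add(url)
        html = pages[url]
        # Grab outbound links and queue them for a later visit.
        queue.extend(re.findall(r'href="([^"]+)"', html))
        # Strip tags, then index each remaining word under this URL.
        text = re.sub(r"<[^>]+>", " ", html)
        for word in text.lower().split():
            index[word].add(url)
    return index

index = crawl("/home", PAGES)
print(sorted(index["widgets"]))   # pages mentioning "widgets"
```

Note that this sketch does no stemming, so "widget" and "widgets" are indexed as different words, one reason early keyword matching was so literal.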
Early in the process, web creators started to understand the spider programs and how they would react to their pages. Site owners realized that having their pages ranked highly for keywords would drive traffic to their sites, and that they could optimize for those rankings. The term search engine optimization became commonplace around that time; according to industry analysts, the phrase came about in 1997. Soon companies popped up claiming to be able to engineer search engine rankings through SEO practices.
Early SEO was based on understanding the algorithms that search engines used to rank pages. Search engines treated keyword density in meta tags as a guide to a page’s content. Unfortunately, some unscrupulous web creators exploited this by saturating their pages with keywords that didn’t accurately describe the site, driving traffic on terms that had nothing to do with their content.
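The keyword-density signal that these early engines relied on, and that stuffers abused, amounts to a simple word count. A minimal sketch (the sample phrases and the function name are invented for illustration):

```python
import re

def keyword_density(text, keyword):
    """Fraction of the words in `text` that are `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    return words.count(keyword.lower()) / len(words) if words else 0.0

honest  = "We sell handmade oak furniture and oak tables"
stuffed = "cheap cheap cheap deals cheap cheap best cheap"

print(keyword_density(honest, "oak"))     # 0.25
print(keyword_density(stuffed, "cheap"))  # 0.75
```

A naive ranker that rewards the higher density would rank the stuffed page first for "cheap" regardless of whether the page delivers anything relevant, which is exactly the abuse described above.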
Manipulated keywords produced inaccurate and inconsistent page rankings: pages stuffed with the most keywords ranked in categories that had little to do with their content. Creators manipulated HTML tags and code in the same manner to gain higher rankings on results pages unrelated to their topics. This blatant manipulation was dubbed black hat SEO.
These early manipulators skewed search engine rankings and ensured that results pages were filled with unrelated pages. Visitors were served irrelevant results because of these black hat practices.
More complex ranking algorithms were developed to combat this problem. One of the first was created by Stanford University students Sergey Brin and Larry Page. Their search engine, called BackRub, used a more sophisticated mathematical approach: it based its results on a function called PageRank, which scored a page according to whether its inbound and outbound links would bring the casual user to more pages that fulfilled their search criteria.
This type of algorithm tied a page’s ranking to the casual surfer’s ability to find other websites that would match their criteria.
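The core idea of PageRank, that a page is important if important pages link to it, can be sketched with a short iterative computation. This is a textbook simplification, not Google's implementation; the tiny link graph and the damping factor of 0.85 (the value used in the original PageRank paper) are illustrative assumptions.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively compute PageRank over a dict of page -> list of outbound links."""
    pages = list(links)
    n = len(pages)
    ranks = {p: 1.0 / n for p in pages}        # start with equal rank everywhere
    for _ in range(iterations):
        new = {}
        for p in pages:
            # Each page q passes ranks[q] / outdegree(q) to every page it links to.
            inbound = sum(ranks[q] / len(links[q]) for q in pages if p in links[q])
            new[p] = (1 - damping) / n + damping * inbound
        ranks = new
    return ranks

# A tiny hypothetical link graph: each page maps to the pages it links to.
graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
print(pagerank(graph))
```

In this graph, page "c" ends up ranked highest because it receives links from both "a" and "b", while "b" ranks lowest with only a single, shared inbound link. No amount of on-page keyword stuffing changes these scores, which is why link-based ranking blunted the black hat tactics described earlier.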
Larry Page and Sergey Brin went on to found Google in 1998. Google revolutionized search engines, page rankings, and hyperlink analysis. Soon, link farms and schemes to buy and sell links became another way to manipulate SEO.
To combat this problem, search engines began adding ranking factors without disclosing them to the public. In 2005, Google announced that search results would be personalized for each user. Over the years, Google has rolled out many updates and algorithm changes to combat link exchange schemes.
In 2011, Google announced the Panda update, which penalized websites with duplicate content. Websites would often copy content from other sites and publish the plagiarized material as their own; Panda effectively killed that practice. With every change made by the major search engines, those engaged in bad SEO and ranking abuse adapted their practices in turn.
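One standard way to detect the near-duplicate content that updates like Panda targeted is shingling: break each page into overlapping word n-grams and compare the resulting sets. The sketch below is a generic illustration of that technique, not Panda's actual method, and the sample sentences are invented.

```python
def shingles(text, k=3):
    """Set of overlapping k-word shingles (word n-grams) from text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Overlap between two shingle sets: 1.0 = identical, 0.0 = disjoint."""
    return len(a & b) / len(a | b) if a | b else 0.0

original  = "our guide to growing tomatoes in small gardens"
copied    = "our guide to growing tomatoes in tiny gardens"
unrelated = "quarterly earnings rose on strong cloud demand"

print(jaccard(shingles(original), shingles(copied)))     # 0.5
print(jaccard(shingles(original), shingles(unrelated)))  # 0.0
```

Changing a single word still leaves most shingles intact, so lightly rewritten copies score close to the original while unrelated text scores near zero.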
In 2012, Google released the Penguin update to further penalize those using manipulative link practices.