A Short History of the Rationale for Link-Based Ranking

by Mark Hess

For the best SEO you need lots of things, but links to your site from related sites are at the top of the list. Why? For a simple explanation, our friends at wilsonweb.com provide this entry:

Why You Need Links to Get High Rankings

Eric Enge, Stone Temple – Mar 16, 2010

You’ve heard that you need links to your site to get high rankings, but you may have wondered why. Just why are links to your site important?

In this first in a series on linking, I’ll give you a brief history of search engines to explain how links fit in. The problem search engines have always faced is how to point users to the best, most relevant webpages in response to their queries.

1. Relevance Based on Human Judgments

The Yahoo Directory was one of the first practical tools for searching the web. It was, and still is, a directory built by human editors who reviewed new sites and added entries for them. That model, however, does not scale. As the web grew exponentially, humans couldn’t keep up. Something else was needed.

2. Relevance Based on Keyword Density

Subsequent search engines, such as Infoseek and Altavista, did not rely on human editors. Rather, they used software programs to crawl the web and discover its pages. A computer then analyzed each webpage according to an algorithm to determine which words it used. Pages that showed a higher density for a keyword were considered more likely to be relevant for that search term than pages with lower densities for it.
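As a rough illustration only (no engine published its exact formula, and the sample page below is invented), keyword density can be computed as the share of a page’s words that match the term:

```python
import re

def keyword_density(text, keyword):
    """Fraction of the words in `text` that match `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# A stuffed page: 3 of its 11 words are "cheap", a density of about 27%.
page = "Cheap flights to Paris. Book cheap flights today. Cheap flights guaranteed."
print(round(keyword_density(page, "cheap"), 3))
```

A measure this simple is exactly why stuffing worked: repeating the keyword directly raises the score, with no regard for whether the page is actually useful.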

The approach, however, was soon abused by spammers, who stuffed hundreds of keywords into pages. They even found ways to hide those keywords from users, while making them visible to search engine spiders or crawlers. The biggest problem, however, was that keyword density is not really a good way to measure the value of a webpage. What was needed was a way to re-introduce human editorial judgment into the equation, while making it more scalable.

3. Relevance Based on Links to a Site

The problem was discussed in various academic circles. While they were students at Stanford University in 1998, Larry Page and Sergey Brin wrote their now classic paper, “The Anatomy of a Large-Scale Hypertextual Web Search Engine” (Computer Networks and ISDN Systems, volume 30, April 1998, pp. 107-117). They went on to found Google and the rest is history.

The underlying concept is that Google uses HTML links it finds while crawling the web to fill the role of human editors. Since each link requires website publishers to take the time to go in and modify the HTML of their webpages, they are only likely to do that if they believe that the page they are linking to is of potential interest for the visitors to their site. As a result, each link can be treated as an editorial judgment.

The Page-Brin paper also introduced the concept that one link might be a more valuable endorsement than another. For example, an individual consumer’s opinion on something may not count as much as the opinion of a recognized expert in the field. In fact, one link might be worth 10,000 times more than another one.
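The weighting idea at the heart of the Page-Brin paper is PageRank. The sketch below is a toy version of the iteration, not Google’s production algorithm: the three-page graph and its names are invented for illustration, and a real implementation must handle dangling pages, convergence checks, and web-scale data.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank. `links` maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal rank
    for _ in range(iterations):
        # every page keeps a small baseline share of rank
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                # a page passes its rank, split evenly, to the pages it links to
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
        rank = new
    return rank

# Hypothetical graph: two pages endorse 'expert', which links back to one of them.
graph = {"a": ["expert"], "b": ["expert"], "expert": ["a"]}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # 'expert' earns the highest score
```

Because both other pages link to it, 'expert' accumulates the most rank, which is the mechanical version of the statement above: a link from a highly endorsed page is worth far more than a link from an obscure one.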

In time, all search engines adopted this approach, determining relevance using an algorithm that combined an analysis of keywords on a webpage along with a computation of the number and quality of links to a site.

Spamming and Links

Of course, spammers got into the act with this new algorithm by creating a marketplace for buying and selling links as a means of improving search engine rankings. The reason this is a problem is that the accuracy of the algorithm depends on the concept of “citation” — links that are editorially given. A purchased link is not a true citation. The link was implemented because the publisher got paid to do it, not because the publisher liked the content.

As a result, Google does not want publishers to make use of paid links for ranking purposes. Moreover, they may even punish publishers who pursue this path, or simply discount the purchased links so they become a waste of the publisher’s money.

You’ll hear some people tell you that you can get high rankings quickly by purchasing links. It may even work for a while. But don’t do it. Over the long term, such artificial rankings will disappear. Instead, focus your energies on creating a great site with great content. That will never go out of style, and people will want to link to it.

To Get Links to Your Site …

You’ll need to do several things to succeed in obtaining links from other sites to yours:

  1. Build a website that has unique value and content. You’ll need to go beyond listing your products and services, because people don’t like to link to a purely commercial site. They do, however, like linking to sites that have great articles or resources that are helpful to others. These can even reside on a commercial site, provided the article or resource pages are not themselves commercial and are freely given.
  2. Actively promote your site to other website publishers. Let them know about the great resources you have available for visitors to your site and where to find them.
  3. Never stop getting more links to your site. Don’t implement one linking campaign and then quit; there is almost always more growth possible. Most important: your competition won’t stop adding links. You don’t want to obtain those great rankings only to lose them because your competitor keeps working after you have stopped.

Link Building as a Core Discipline

Link building is a core marketing function that all web publishers should pursue. Basically, it is the process by which you promote your site.

Now that you understand the basic theory of how this all works, you’ll be able to build links more effectively. In future articles I will outline strategies to help you keep your link building on target and running at full speed.

