10 Commandments of Search Engine Optimization (SEO)

May 25

Through Search Engine Optimization (SEO), digital marketers plan and develop web content in a way that maximizes search engine rankings and visibility.

In order for any search engine, such as Google, Yahoo, or Bing, to work, three things need to happen.

First, search engine programs known as crawlers (also called robots, bots, or spiders) crawl web content by following links. Then, the acquired data is added to the search engine’s main database, a step known as indexing. Finally, when you or I type a search term into Google or the like, another set of algorithms matches our query against the data in the search engine’s database and displays results as vertical links on a Search Engine Results Page (SERP).

These three basic search engine steps are simply called Crawl-Index-Display. And by influencing what gets crawled and how it is indexed, content marketers ultimately influence where and to whom the content ends up being displayed.
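The Crawl-Index-Display loop can be sketched with a toy in-memory "web". This is only an illustration of the concept; the page names and contents below are invented, and real crawlers fetch pages over HTTP rather than from a dictionary.

```python
# Toy sketch of Crawl-Index-Display over a hypothetical in-memory web.
# Each entry maps a URL to (page text, outgoing links).
SMALL_WEB = {
    "/home": ("Welcome to our SEO guide", ["/tips", "/about"]),
    "/tips": ("Top SEO tips and tricks", ["/home"]),
    "/about": ("About this SEO blog", []),
}

def crawl_and_index(start):
    """Crawl by following links from `start`, indexing each page's words."""
    index, queue, seen = {}, [start], set()
    while queue:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        text, links = SMALL_WEB[url]
        for word in text.lower().split():   # index: word -> pages containing it
            index.setdefault(word, set()).add(url)
        queue.extend(links)                 # crawl: follow outgoing links
    return index

def display(index, query):
    """Display: return the pages matching a query term."""
    return sorted(index.get(query.lower(), set()))

idx = crawl_and_index("/home")
results = display(idx, "SEO")  # every crawled page mentioning "seo"
```

The three functions mirror the three steps: following links is the crawl, the word-to-pages dictionary is the index, and the query lookup is the display.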

The following list of 10 factors represents a professional SEO toolkit:


PageRank

Every page on the web has a PageRank (PR). PageRank is a score out of ten that reflects the number and quality of links pointing to the content. Generally, the higher the score, the higher the page’s relevance. A lot of online content is not optimized at all and has a PR score of zero. Depending on whom you talk to, a PR score of 4 to 7 is considered reputable, while a score of 8 or more is mostly attributed to authority hub sites like Google itself. You may check the PageRank score of any page at any time here.
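The core idea, that a page’s score depends on the scores of the pages linking to it, can be sketched with a simplified iterative calculation. This is a toy version of the published PageRank formula, not Google’s actual production algorithm, and the three-page link graph is invented for illustration.

```python
# Toy PageRank: each page's score depends on the scores of its linkers.
def pagerank(links, iterations=50, d=0.85):
    """links: dict mapping each page to the list of pages it links to.
    d is the conventional damping factor from the original formula."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal scores
    for _ in range(iterations):
        new = {}
        for p in pages:
            # Each linker passes its score, split across its outgoing links.
            incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            new[p] = (1 - d) / n + d * incoming
        rank = new
    return rank

# Hypothetical graph: B and C both link to A; only B gets a link back.
ranks = pagerank({"A": ["B"], "B": ["A"], "C": ["A"]})
# "A" ends up with the highest score: it has the most inbound links.
```

Note how "C", which nobody links to, converges to the minimum score, which matches the article’s point that unlinked content sits at the bottom of the scale.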


TrustRank

TrustRank (TR) is a proprietary metric that determines how high content is displayed within SERPs. It is a confidential score that search engines do not disclose; such secrecy allows them to stay competitive and maintain the quality of search results. However, it is possible to infer a relative TrustRank level, because industry professionals confirm a positive correlation between TrustRank and PageRank.

Competitive Keywords

Keywords are words or phrases that we use to search for content of interest online. Quality content that contains properly optimized keywords shows up higher within SERPs. Google Keyword Tool, which Google offers as part of its AdWords platform, provides approximate frequency and competition data for searches. Typically, a great way to establish an organic presence online is to develop and optimize content for the top 100 to 1,000 keywords relevant to one’s niche.

Keyword Density

Keyword density refers to the ratio of keywords to the total number of words on a webpage. Search engines penalize content with keyword ratios that are too high or too low by assigning lower TrustRank scores to those pages. Ideally, keyword density should be kept around two to three per cent.
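The ratio itself is simple arithmetic: keyword occurrences divided by total words. A minimal sketch, with a hypothetical helper and an invented sample sentence:

```python
# Sketch: estimate keyword density for a page's visible text.
def keyword_density(text: str, keyword: str) -> float:
    """Return keyword occurrences as a percentage of total words."""
    words = text.lower().split()
    if not words:
        return 0.0
    # Strip trailing punctuation so "seo," still counts as a match.
    hits = sum(1 for w in words if w.strip(".,:;!?") == keyword.lower())
    return 100.0 * hits / len(words)

sample = "SEO tips: good SEO content keeps SEO density low"
density = keyword_density(sample, "SEO")
# 3 hits of "seo" out of 9 words — far above the 2-3% target
```

A real tool would also strip HTML and count multi-word phrases, but the two-to-three-per-cent guideline applies to exactly this ratio.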

Page Factors

The three page optimization factors are keywords, description, and title. Since the page title and description are the first things a spider crawls, including strong keywords in these page elements is critical for a higher PageRank. It is similarly important to avoid stop words, because stop words dilute PR and relevance when crawlers identify them. The stop words are: “And”, “A”, “The”, “In”, “Out”, “Of”, “Be”, “I”, “Me”, “Are”, and “To”.
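Filtering that stop-word list out of a page title can be sketched as follows. The helper name is hypothetical, and the stop-word set is exactly the one listed above:

```python
# Sketch: drop the stop words listed above when composing a page title.
STOP_WORDS = {"and", "a", "the", "in", "out", "of", "be", "i", "me", "are", "to"}

def optimize_title(title: str) -> str:
    """Keep only the words that carry keyword weight for crawlers."""
    kept = [w for w in title.split() if w.lower() not in STOP_WORDS]
    return " ".join(kept)

optimize_title("The 10 Commandments of Search Engine Optimization")
# -> "10 Commandments Search Engine Optimization"
```

The surviving words are the ones a crawler would weigh, which is why a keyword-dense, stop-word-free title tends to rank better.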


Backlinks

A backlink is a link that points to content from another page. Since crawlers compute a page’s PageRank score from the PageRanks of the pages linking to it, backlink quality usually outweighs quantity: the higher a page’s PageRank, the more valuable a backlink from it. Building as many high-quality backlinks as possible leads web marketers to SEO success.

HTML Tagging

Although there are many different HTML tags and attributes, the most essential to SEO is “nofollow”. The “nofollow” value is added to a link’s rel attribute in HTML when an SEO manager does not want a web crawler to follow that link. A link tagged this way still shows up to people but is ignored by search engine crawlers. Strategically, controlling which links pass PR makes it possible to conserve and channel PageRank in a powerful way.
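To see which links on a page a crawler would follow versus skip, one can scan the markup for rel="nofollow". A minimal sketch using Python’s standard-library HTML parser, with invented example links:

```python
# Sketch: split a page's links into followed vs. nofollowed,
# the way a PageRank-passing crawler would treat them.
from html.parser import HTMLParser

class NofollowFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.followed, self.nofollowed = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        d = dict(attrs)
        href = d.get("href")
        if "nofollow" in (d.get("rel") or "").split():
            self.nofollowed.append(href)  # visible to people, skipped by crawlers
        else:
            self.followed.append(href)    # passes PageRank

finder = NofollowFinder()
finder.feed('<a href="/about">About</a> <a rel="nofollow" href="/login">Log in</a>')
```

Links without the attribute leak PageRank to their targets; tagging low-value links such as login pages keeps that PR on the content that matters.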

Duplicate Content

Sometimes spiders wrongfully punish sites for duplicate content even though no duplicate content actually exists. This happens when multiple versions of a URL point to the same place. For example, both http://site.com and http://www.site.com lead to the same homepage, but a search engine may treat this scenario as two different sites that copy each other’s content. With a 301 redirect, redundant URLs are forwarded, along with their entire PageRank scores, to a unified source. In HTTP status-code jargon, 301 means “moved permanently” while 302 means “moved temporarily”. Always give duplicate content a 301!
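The canonicalization logic behind such a redirect can be sketched as a small function. The host names and helper below are hypothetical; in practice the same rule usually lives in web-server configuration rather than application code:

```python
# Sketch: unify duplicate URLs by 301-redirecting to one canonical host.
CANONICAL_HOST = "www.site.com"  # assumed canonical version of the site

def redirect_for(host: str, path: str):
    """Return (status, location) for non-canonical hosts, else None."""
    if host != CANONICAL_HOST:
        # 301 = moved permanently: passes PageRank to the canonical URL.
        # (302 = moved temporarily would NOT consolidate PR.)
        return (301, f"http://{CANONICAL_HOST}{path}")
    return None  # already canonical; serve the page normally

redirect_for("site.com", "/")      # -> (301, "http://www.site.com/")
redirect_for("www.site.com", "/")  # -> None
```

With this rule in place, a crawler arriving at http://site.com is permanently forwarded, so the search engine records one site instead of two duplicates.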


Sitemap.xml

Not to be confused with the sitemap created for users on the actual site. Within the context of SEO, sitemap.xml is a file in the root directory of the site’s hosting server. The XML file conveys three things to spiders: 1) what pages to index; 2) how often the site’s content is updated; 3) how to prioritize individual pages. In short, a sitemap makes crawling easier for the search engine, and the favour is returned when the search engine rewards the site with a higher PageRank.
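Those three signals map directly onto the loc, changefreq, and priority elements of the sitemaps.org format. A minimal generation sketch, with an invented page list:

```python
# Sketch: build a minimal sitemap.xml carrying the three signals above.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"  # standard sitemap namespace

def build_sitemap(pages):
    """pages: iterable of (url, changefreq, priority) tuples."""
    ET.register_namespace("", NS)
    urlset = ET.Element(f"{{{NS}}}urlset")
    for loc, changefreq, priority in pages:
        url = ET.SubElement(urlset, f"{{{NS}}}url")
        ET.SubElement(url, f"{{{NS}}}loc").text = loc                # 1) what to index
        ET.SubElement(url, f"{{{NS}}}changefreq").text = changefreq  # 2) update frequency
        ET.SubElement(url, f"{{{NS}}}priority").text = priority     # 3) relative priority
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap([
    ("http://www.site.com/", "daily", "1.0"),
    ("http://www.site.com/about", "monthly", "0.5"),
])
```

The resulting file is uploaded to the server root as sitemap.xml, where crawlers look for it by convention.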

Ethical Tactics Only

Search engines lower PageRank scores and ban sites that engage in unethical SEO practices. Always refrain from the following: 1) cloaking (redirecting bots to other pages after they come to index the site); 2) spamming (having too many irrelevant keywords or backlinks); and 3) keyword stuffing (hiding keywords within code so they are invisible to people and detectable only to search engines). Search engine algorithms are extremely efficient at uncovering these and similar unethical tactics. Ethical search engine optimization is simply far more sustainable.

It is widely accepted that, given Google’s market-share dominance, optimizing any web content for Google simultaneously optimizes it for other search engines as well.

Typical lead-time for a comprehensive SEO program is three to six months.
