Digital Insights

10-Point Checklist For High Visibility in Google

Published: May 4, 2010

Google is the pre-eminent search engine (SE), with no close competitor. Since inclusion is free, there is no reason your Web pages should not be in it. We’ll show you how to top the Google SERPs, that is, be found at the top of the search engine results pages. These techniques are known as search engine optimization (SEO), and they require only a small investment of your time.

I took two of my sites to the top of the SERPs in three months, so it can be done. My pages have few competitors: my challenge was mainly to get past false positives such as resumes, job vacancies, articles, and so on. If you are competing with “real” sites selling the kinds of products you read about in your spam e-mails, you can still get there within a year with some persistence.

Goal

Casual Google users use default settings, so your site must get into the top 20 results. Unfortunately, you cannot assume that visitors will use the most appropriate search terms. Real people are unpredictable.

Google SERP

You must understand that SERP positioning is dynamic – what you see depends on no single factor. It depends on the viewer’s location, the type of search used (basic, advanced, regional, filtered, and so on), the content of the page, its keyword density, the page rank (PR), the search term (words or phrase), and so on. Therefore, you need to plan your site carefully.

Ten-Point Checklist

1. Domain Name and Server

Get a .ca domain if your audience is likely to look for Canadian sites. Use a global, top-level domain (gTLD) such as .com if your business is not local. A unique, topical name such as “dentist-atlanta.com” should rank higher in the SERPs than “dentist.com” or “smithclinic.com” (if the search term is likely to be “dentist in atlanta” or similar).

It is nice, though not essential, if the web host gives your site a unique IP address; better still, host your site on your own dedicated server. With shared web hosting, a single server can host thousands of web sites, and Google’s spiders will be slowed down.

If you already have a Web site, you can find its IP address using cmd.exe or an MS-DOS prompt, e.g. “ping cnn.com”, then call up the displayed IP address in your browser. If you don’t see the expected web page, your site is on a shared IP address.

2. Page Title

An ill-planned page title is the Achilles’ heel of a Web page. This is the text that appears in the title bar at the very top of the browser window.

The Title tag text should be brief and readable, avoiding superfluous words and punctuation marks. Begin with the most valuable keywords, e.g. “Root canal specialist dentist clinic, Mayfair, London”, not something like “***** Fred Smith, BDS – 5 Stars Dental Clinic *****”, or worse, “Welcome to my home page”, or “Untitled”.
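In the markup, that keyword-first title sits in the document head. A minimal sketch, using the hypothetical dental clinic above:

```html
<head>
  <!-- Brief, readable, most valuable keywords first;
       the clinic and location are made-up examples -->
  <title>Root canal specialist dentist clinic, Mayfair, London</title>
</head>
```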

3. Style Sheet

Placing style definitions in a .css (Cascading Style Sheet) file moves the body text close to the top of the document and shrinks the page size. Many JavaScript effects can be replaced by CSS. Fast-loading pages are good for both humans and search engine crawlers.
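A minimal sketch of the idea, assuming a style sheet named styles.css (the file name is illustrative):

```html
<head>
  <!-- One external, cached file replaces style definitions
       that would otherwise be repeated above the body text -->
  <link rel="stylesheet" type="text/css" href="styles.css">
</head>
```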

4. Meta Tags

Google ignores the Keywords meta tag for ranking but other SEs use it. An extract from the Description meta tag sometimes appears in the SERP; sometimes you see a snippet from the body text. Moderation and relevance should be your benchmark for placing keywords in these tags.
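For instance, continuing the hypothetical dental clinic (the wording is illustrative, not a formula):

```html
<!-- A concise, relevant description; an extract may appear in the SERP -->
<meta name="description" content="Root canal and cosmetic dentistry clinic in Mayfair, London.">
<!-- Keywords tag: ignored by Google for ranking, read by some other SEs -->
<meta name="keywords" content="dentist, root canal, Mayfair, London">
```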

5. Content

  • Quality, well-organised content is rewarded by top placement in the search results. Group pages by theme: for example, if you sell new cars, used cars, and car service, you would have three branches, each containing pages relevant to that theme.
  • Links to popular causes, memorial ribbons, HTML validation badges, page counters, and the like can lead visitors away to other sites.
  • Optimise images and keep the page size low.
  • Place key phrases towards the top of pages and in heading tags such as H1. Don’t get hung up on a single keyword for the whole site. Pick different ones for different pages so that you have more ways to be found. Optimise for the search terms used by your paying customers, if you can identify them, not casual visitors.
  • Consider (this depends on the size and nature of your business) placing noncommercial pages such as staff pages, personal hobbies, genealogy and so on at a secondary level but not linked from the entry page. Some of my ranking success comes from hosting my hobby pages below my commercial site, because I cannot justify buying a domain for each of my interests. I legitimately link them to my resume, which has a link to my business site. This enables free placement of the secondary pages in otherwise for-fee directories.

6. Links and Folders

  • Link a site map from the home page so that crawlers can find the rest of your pages.
  • Link each page to the home page and to others in its logical group (but not to every other page in the site). The anchor text should use key phrases and words.
  • Use keywords for folders, image names and Alt text, but don’t overdo it, e.g. /hamilton/lawyer/divorce.htm, alt=“Perth plumber”. The deeper your directory structure, the less likely it will be spidered regularly.
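The folder and Alt-text examples above look like this in the markup (paths, file names, and wording are illustrative):

```html
<!-- Keywords in the folder path and in the anchor text -->
<a href="/hamilton/lawyer/divorce.htm">Divorce lawyer in Hamilton</a>

<!-- Keywords in the image name and Alt text, without stuffing -->
<img src="perth-plumber.jpg" alt="Perth plumber">
```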

7. Neighbourhood Watch

Get quality incoming links from sites that share your theme. Without referrals, it’s near impossible to be visited by Googlebot. Try to get such links from sites with PR3 (Page Rank – see below) or better, not from link farms that are clearly built to boost PR. Make it easy for other sites to use keyword-loaded phrases in their links, say, by offering a cut-and-paste slice of HTML anchor code. Here is an example you can use to link to this page:
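(The URL below is a placeholder; substitute this page’s actual address and a key phrase of your own.)

```html
<a href="http://www.example.com/seo-checklist.html">10-Point Checklist
For High Visibility in Google</a>
```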

Links from lower-ranking peers will not penalise you; they simply won’t appear in Google’s list of backward links to your page. You cannot control who links to you, but you have control over who you link to.

Add a judicious number of outbound links to topical peers of the same or better calibre. Google likes links to authoritative sites, but don’t overdo the external links. Although such sites might not overtly link to your site, their site statistics file might get crawled and constitute a link back to you.

8. Cloaking

Cloaking means showing different content to humans and to SEs, and it is generally a bad practice. Defensible reasons to cloak include hiding parts of your optimised pages from amateur competitors or showing different pages to different visitors based on their browsers.

Subscriber-only sites also manage to get into SEs. They use a cloaking practice known as “agent name delivery”: a slab of code checks whether the visitor is a crawler or a human. Crawlers get to see the whole site, but others are directed to a sign-up form.
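A hypothetical sketch of the idea using Apache mod_rewrite (the paths and the single-bot match are assumptions; a real agent-name-delivery setup keeps a fuller list of crawler names):

```apache
RewriteEngine On
# Visitors whose User-Agent does not look like a known crawler
# are redirected from member pages to the sign-up form.
RewriteCond %{HTTP_USER_AGENT} !Googlebot [NC]
RewriteRule ^members/.*$ /signup.html [R,L]
```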

9. Avoidable Practices

The following practices could get your site banned from Google at worst or lower its ranking at best:

  • Gimmicks. Pointless JavaScript effects such as cursor trails and transitions do nothing for your viewers but place a lot of code above your body text. You want your content close to the top of the page.
  • Bad HTML code. Novice hand-coders might copy some HTML tags without understanding their meaning. One webmaster, for example, copied a robots directive that told crawlers to stay out of his own site.
  • Multiple sites with duplicated content, e.g. http://www.example.net and http://www.example.com hosted on the same server or different ones, as this is considered spamming. Use a permanent redirect on all secondary sites to point to the main domain.
  • Multiple copies of the same page. This is typically an entry or “doorway” page optimised for different keywords to lure different people, e.g. crackz.htm, serials.htm, passwords.htm, and so on.
  • Hidden content. This can be repetitive text on the same colour background or a layer with coordinates that are off the visible page. It makes you wonder why the author does not put this effort into creating visible text.
  • Flash-only pages. A solution is a user-agent check that serves the Flash version to enabled browsers and plain HTML to crawlers and to visitors without Flash.
  • Frames. Googlebot will crawl links in the Noframes text, but not ones in the framed pages. Other SEs might not crawl frames at all, so it is better to use tables, and better still CSS. If you must use frames, ensure that you use the correct Doctype declaration for frames. I have noticed that Googlebot can now crawl links in frames, but not reliably.
  • Submission software or service. They could submit your site to thousands of unknown SEs. You will get a lot of spam, abuse, and possible inclusion in link farms that will ruin your reputation in Google’s eyes. After all, can you name more than five major SEs?
  • Session IDs. Sites that require session IDs from crawlers will get poor visibility because the previous session will have expired by the time Googlebot returns.
  • Over-optimisation. [Update 11/2003] Many sites that followed a strict “SEO formula” found that they could not be found at the top of the SERPs, or in the index at all. There is speculation that such tactics cause the sites to be filtered out of the search results.
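The permanent redirect for duplicated domains might be sketched in an Apache .htaccess file like this (the domain names follow the example above; the exact mechanism depends on your server):

```apache
# On the secondary site (example.net): send every request,
# with a permanent (301) redirect, to the main domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?example\.net$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```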

10. Patience

Having optimized and submitted your pages to Google, get on with growing your business, because Google takes time to rank you. Work on getting quality inbound links from high-ranking sites that feature the same subject matter. Increase your content and keep it fresh. Get free or paid listings in Google AdWords, Overture, and the Open Directory Project (dmoz.org), and in reputable engines such as MSN, Yahoo!, and Ask Jeeves.
