
Absolute Linking Vs Relative Linking

An absolute link gives the full path to a webpage, whereas a relative link points to a destination relative to the current page's location. This short definition might not be enough to tell you the difference between the two. Opinion has always been divided about which one should be used. It is better if I start off by describing each one individually and then compare the pros and cons.

What is Absolute Linking?
When the href value is a fully qualified URL, it is known as an absolute link. A fully qualified URL consists of the transfer protocol (http://), domain name (www.example.com) and filename (pagename.html). An absolute link looks like this: <a href="http://www.example.com/pagename.html">

What is Relative linking?
When the destination href value is relative to the location of the current webpage or source anchor, it's known as relative linking. Relative links can only point to pages on the same site. The address of the link is always relative to the position of the current file. An example of a relative link: <a href="links/webfiles.html">
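As an illustration (the page and file names here are hypothetical), a page at http://www.example.com/about/index.html could reach a sibling file either way:

```html
<!-- Both anchors resolve to http://www.example.com/about/team.html -->
<a href="http://www.example.com/about/team.html">Our team (absolute link)</a>
<a href="team.html">Our team (relative link)</a>
```

Note that the relative link only works as intended while the file stays in the same folder; the absolute link works from anywhere.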

Absolute Linking vs. Relative Linking:
So which one is better? Opinions differ, but when everything is taken into consideration, absolute linking is the preferred method. Here are the reasons why relative linking should be avoided. When you use relative links, search engines may index the non-www version of your site; some search engines treat that as a different page from the www version, and you might lose the search engine ranking you've earned with so much effort. The same can happen if other sites link to you without www in the link. It is, however, much easier to code and program with relative links. It is equally easy to break them: if you manage a large site with lots of folders and sub-folders, you might save a file in the wrong folder and all its links will be broken, as the structure is no longer the same.

No matter how easy it is to program with relative links, it is always recommended to use absolute linking wherever possible. When you leave comments on posts or drop your address in directories, absolute linking will do a great job for the ranking of your site. Some major search engines index sites from popular directories, and most of your backlinks will point to the www version of your address rather than the non-www version. There's a possibility that search engines might index your site under the other version, and you could lose your search engine rankings, as I've already mentioned.

There are cases where using absolute links has caused difficulty in programming, but because of the major problems relative linking causes with search engine rankings, it is highly recommended to use absolute links. It is always better to get good rankings on search engines, as they bring more traffic to your site, the primary reason you developed a site in the first place. When we compare these two linking methods, absolute linking is preferred by most SEO experts.

JULY 31, 2008 ISSUE #21

10 SEO Mistakes

Your website might have great content, but small and simple mistakes can lead to poor ranking on search engines. Here's a list of common mistakes people make on their websites that prevent them from getting better rankings.

Targeting Wrong Keywords:
Choosing appropriate keywords is essential for every website owner. A common mistake most people make is targeting the wrong keywords or not enough keywords. It is important that you know what your customers are looking for. You can make use of free tools like the Google AdWords Keyword Tool and the Overture Keyword Tool to find appropriate keywords for your website.

No robots.txt File:
The "robots.txt" protocol, also known as the Robots Exclusion Protocol, prevents search engine spiders and other web robots from accessing all or certain parts of a website. In simple words, it tells these web spiders which portions to crawl and which not to crawl. A missing file is usually harmless, but a misconfigured one could keep your entire site from being indexed, so it is worth getting it right.
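A minimal robots.txt, placed at the root of the domain, might look like the following (the folder names are only examples):

```
User-agent: *
Disallow: /admin/
Disallow: /cgi-bin/
```

This allows all robots to crawl everything except the two listed folders; an empty `Disallow:` line would allow the whole site.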

Inappropriate or Missing Title Tag <title>:
Many sites use short or inappropriate title tags. In the worst case, they omit the title tag completely. This could be one of the biggest SEO mistakes. It is much better to use a different title for each page depending on its type. While some use only their company name, it is always wise to include a few keywords in it. For example: <title>company name – keyword1 keyword2</title>.

Use of Flash:
Flash can make a website look attractive, but it is always advisable to provide an HTML alternative as well. Search engine spiders find it hard to read Flash sites, which could be the reason your site did not get indexed in the first place. If you really want to use Flash on your site, you can always optimize it.
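One common way to offer an HTML alternative is to nest plain HTML inside the Flash object, where spiders and plugin-less visitors will see it. A sketch, assuming a movie file named intro.swf and hypothetical page names:

```html
<object type="application/x-shockwave-flash" data="intro.swf" width="550" height="400">
  <param name="movie" value="intro.swf">
  <!-- Fallback content: shown when Flash is unavailable, and readable by spiders -->
  <h1>Example Widgets</h1>
  <p><a href="products.html">Browse our products</a> or <a href="contact.html">contact us</a>.</p>
</object>
```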

Lack of Keywords in the Content:
You might have listed all your keywords in the meta tags, but a lack of keywords in the content itself is another mistake people usually make. You should place your keywords at appropriate places in the content. Also make sure you do not over-stuff keywords to the point that your content becomes unreadable.

JavaScript Menus:
If your site includes a JavaScript menu, you need to use a site map or put the links inside a <noscript> tag so they can be crawled by web spiders. It is best if you do not rely on JavaScript menus alone, as search engines cannot read them.
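A sketch of the <noscript> approach (menu.js and the page names are hypothetical): the script builds the fancy menu, while the plain links inside <noscript> remain crawlable:

```html
<script type="text/javascript" src="menu.js"></script>
<noscript>
  <!-- Plain HTML links that spiders and script-less visitors can follow -->
  <a href="index.html">Home</a>
  <a href="articles.html">Articles</a>
  <a href="contact.html">Contact</a>
</noscript>
```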

Consistency and Maintenance:
Some people make the mistake of thinking site optimization is a one-time task. It is very important that you keep up with the changes that occur in search engines, keep an eye on your competitors and keep optimizing your site accordingly.

Usability should not be ignored either. Easy navigation, proper site structure and descriptive link text are appreciated by visitors as well as spiders.

Keyword Stuffing:
Keyword stuffing is a thing of the past. It is treated as spamming, and you could be banned from the search engines. So be careful not to stuff your keywords.

Backlink Spamming:
Search engines do give priority to websites with backlinks, but plastering your link all over the web will be considered spamming rather than proper SEO. You should try to get quality backlinks from websites with higher PageRank or from relevant websites.

APRIL 17, 2008 ISSUE #6

Site not indexed yet?

Top reasons why your webpage may not be indexed by search engines.
Search engine optimization is important for a website: you define the keywords, provide search-engine-friendly content, even use submission tools to submit your site, and yet your website doesn't seem to get indexed by the search engines. There might be various reasons why they seem to take forever to index your site.

Index Time:
It is important that you give your site enough time to get indexed. Search engines sometimes list on their submission pages the amount of time it takes; this usually ranges from 1–8 weeks depending on the engine. Search engines like AltaVista and Inktomi provide a paid option to get listed quickly. You can allow up to 4 months to get indexed. If you are still not indexed by then, you need to review your keywords, meta title, meta description and other aspects of SEO.

Already Indexed:
There is a possibility that you've already been indexed but do not know about it. It is up to you to find out whether you've been indexed, since major search engines do not tell you if you're listed. You cannot simply search for keywords related to your site and expect it to pop up on the first page. You can find out if your site has been indexed with a query such as site:www.yourdomainname on Google, Yahoo or Live Search.

Missing Page:
It is a must that you upload pages to your site before submitting them. Submitting a page that does not exist, or submitting with a subtle typo in the URL, is an error we might all make at one time or another. Many search engines will not notify you when a submitted URL does not exist, so it is important to make sure all the pages are actually there.

Roadmap from Home Page:
Some search engines have been known to drop pages that cannot be reached from the home page. The reasoning is that if there is no road from your home page to the page you want indexed, the engine may decide the page is unnecessary or unimportant. You can think of your site's links as a series of roads from one page to another.

External Links:
Some search engines refuse to index websites that have no other websites linking to them. They might index your home page but refuse to index other pages until you gain at least one link from another domain. The best way to solve this problem is to establish some links and then resubmit your pages as well as the pages that link to you. Having links pointing to your website makes it easier to get indexed and earn better rankings.

Frames:
If you have content inside HTML frames, it can cause problems with submissions, because the search engine may index the main content of a page but not the surrounding menu frame. Visitors then find the information but miss the associated menu. It is therefore best to create non-framed versions of your pages; optimizing a non-framed page will often achieve better results.

Free Sites:
One of the drawbacks of free websites is that many search engines do not index pages from them because of all the "junk" submissions. It is a better choice to buy your own domain, as these are preferred by search engines. Other drawbacks of free website hosts include unreliability, the forced display of banner ads, etc.

Spider Blocks:
When you design a website, it is always suggested to make it search engine friendly, meaning search engine spiders should be able to crawl and index it. These spiders cannot index pages that require any kind of registration, password authentication or form filling. For those, you need to create static pages that search engines can find and index without performing any special action on your site. There are utility programs that help you with this, depending on the database system you have.

Guilt through Association:
Ask your hosting service whether your domain name has its own unique IP address assigned to it. If your website shares an IP with other websites, there is a possibility that the IP has been banned because of something someone else did.

Dynamic Pages:
Dynamic pages are often ignored by search engine spiders. Pages generated on the fly from a database usually contain symbols like a question mark (?) or an ampersand (&) in the URL, which many search engines ignore. The simpler the page, the better it is for search engines, so avoid unnecessary fancy scripts and code that can hurt your page rankings.

Submission Limits:
Some search engines allow only a specific number of submissions per day for the same domain. If you exceed the limit, all your submissions could be ignored. Some submission consultants suggest submitting no more than one page a day per engine for a given website.

Spamming:
Spamming usually gets a site blacklisted by many search engines. Excessive use of keywords and text in the same color as the background are a few of the spamming techniques that might result in your site being ignored or rejected by search engines.

Large Pages:
If your website has pages that are very complex and take too long to load, the request might time out before the spider finishes indexing them. The solution is to limit your page size to 50 KB or less. Visitors with slow internet connections might also leave your page before it fully loads, another reason to keep pages small.

Page Limits:
Search engine spiders crawl only a certain number of pages of your website. This might range from a few dozen to three or four hundred, depending on the engine. Search engines like Google tend to crawl deeper into your site. The best way to get search engines to crawl more of your pages is to get quality backlinks.

Redirects:
If your site contains redirects or meta refresh tags, they can sometimes cause the engines trouble in indexing your site. Search engines generally index the page a redirect points to, but if they think you might be trying to trick them by using cloaking or IP redirection, there is a chance your site might not get indexed at all.
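For reference, a meta refresh tag looks like the following (the URL is illustrative); the zero-second form in particular is the one engines tend to treat with suspicion:

```html
<!-- Placed in the page <head>; redirects immediately to the given URL -->
<meta http-equiv="refresh" content="0;url=http://www.example.com/new-page.html">
```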

Unreliable Hosts:
A reliable hosting service usually guarantees 99.5% uptime. The reason for choosing a reliable host is that if your website fails to respond when the search engine spider visits, it might not get indexed at all. Worse, even if your site is already indexed, a spider visit that finds your site down might get you removed from the engine's database.

Random Errors:
There might be times when search engines simply lose submissions at random through technical errors and bugs. Therefore, it is advisable to resubmit your site once or twice a month for good measure.

It is best to stop submitting your site once it achieves a desirable ranking. The search engines might re-evaluate the page and reduce its ranking.

APRIL 11, 2008 ISSUE #5

Search Engine Optimization basics: 10 SEO Points to remember

It is quite necessary for every webmaster to know a few basics about SEO if they want good page rankings. Hiring an SEO expert can be quite expensive. If you can optimize your site yourself and follow other internet marketing rules, it might prove really beneficial for your site at a lower cost.

1. Domain Name:
You do not have to get a domain name with keywords in it. The best way to choose a domain name is to keep it simple and short, so it can be easily remembered.

2. Title Tag <title></title>:
The title tag is one of the crucial parts of site optimization. It is placed in the <head> near the top of your HTML page and tells search engines what the page is all about. It is important that this tag includes your main keywords and is 63 characters or less so it appears in full on Google. It is highly recommended that you have a title that incorporates your website name and your main keywords.

3. Meta Description Tag:
The meta description tag contains a description of your website. It can help boost your rankings on some search engines, and some engines even display it on their results page, giving readers a brief idea of the site before visiting. Here is an example of a meta description tag:
<meta name="description" content="Site summary here.">

4. Natural Keyword Integration:
Search engines simply hate keyword stuffing, the overuse of keywords in your content. If your content uses your keywords far more often than reads naturally, your site may be banned from many search engines. Chasing keyword density is a thing of the past. Keyword stuffing not only drives search engines away; if visitors do not understand what you are trying to sell, they might never want to visit your site again. So make your content sound natural, with proper usage of keywords.

5. Checking Google Cache:
You can type cache:yourURL into Google Search to check the Google cache. You will also discover when the page was last retrieved.

6. Finding Backlinks:
One of the keys to better ranking on search engines is getting quality backlinks. Search engines like Google give much preference to sites that have backlinks from higher-PR sites. If you can get backlinks from well-known directories, you have a better chance of getting indexed sooner. You can check the number of backlinks to your site by typing link:yourURL into Google Search.

7. Site Map:
Site Maps make it easy for search engines to find and index every page of your website. According to Wikipedia, “A site map is a representation of the architecture of a website. It can be either a document in any form used as a planning tool for web design, or a web page that lists the pages on a website, typically organized in hierarchical fashion.” Site maps are important if you use Flash or JavaScript menus instead of HTML.
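For search engines, the machine-readable form is an XML sitemap following the sitemaps.org protocol; a minimal sketch with illustrative URLs and dates:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2008-04-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/articles.html</loc>
  </url>
</urlset>
```

Only <loc> is required per entry; the file is typically saved as sitemap.xml at the site root.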

8. Limit Flash Usage:
It is quite difficult to optimize websites that use Macromedia Flash. While it is not impossible to optimize such a site, it is highly recommended to minimize the use of Flash on your website.

9. Duplicate Content:
It is best to avoid duplicate content, as unique content is much preferred by search engines. If you are afraid your content might be copied, you can use a free duplicate-content checking tool to find out.

10. Proper Foundation:
It is always wise to check that your site is working properly. It is necessary that you check:
- that all your pages load properly.
- that meta tags have been defined.
- that your site has no browser compatibility issues.
- that robots.txt validates and sitemap.xml works.


APRIL 03, 2008 ISSUE #4