Monthly Archives: August 2009

The New Buzz in Town: Squidoo

The strange-looking icon on the left is a squid, and the eyeball at the end denotes a lens. Together they form the logo of one of the most popular community websites, which allows users to create pages (known on Squidoo as lenses).

History of Squidoo:
Seth Godin founded Squidoo in 2006 (alright, I admit it’s not exactly new) as a platform for users to set up pages on any topic. They have a special term for these pages: “lenses”. On how Squidoo got its name, Godin explains, “Squids have large eyes, and each lens on Squidoo provides a view on the world.”

Features of Squidoo:
The users here are known as “Lensmasters”, and creating a lens is pretty easy thanks to the module system. You can include almost everything necessary on your page: text, images, RSS feeds, video, audio, recommendations, links, and opt-in boxes.

What’s so special about Squidoo?
As a Squidoo user, I love their presentation. They are very friendly to their users, and yes, they are user-friendly as well. What’s the difference? Here’s an excerpt from a recent email I got from them: “Happy Birthday! Okay, chances are it’s not really your birthday. But we wanted to give you a present anyway. So here it is. It’s another Squidoo secret. And it’s a big one. You’ve probably heard that you can make money from Squidoo. Yes, for free. We’re here to confirm the rumors.” You can see their approach is quite friendly. Ah, wait, I think I forgot to mention earning money from creating lenses there. Let’s have a whole section on that.

  • Making Money on Squidoo:
    Seth Godin states that he started Squidoo to earn money not only for the company but also for charities and for the lensmasters who provide the content. Squidoo earns money through ads and affiliate links: 5% of this goes straight to charity, 45% goes to the company, and the remaining 50% is either donated to charity or paid out to the lensmasters. How much a charity or page creator earns depends on how popular the lens is.
  • What else?
    Last but not least, creating Squidoo lenses can bring traffic to your site and your blog. Leave a link to your website and blog on your lens, and people are bound to click it, depending on how informative the lens is. The best way to enhance your lenses is to keep updating them: add modules, experiment with keywords, images, and links; the sky is the limit here!

Conclusion:
Opening an account on Squidoo is really easy, and it is equally easy to create lenses there. The reason I am talking about this site is that it has so many features, and it works like a micro search engine for lenses. More and more people are getting into Squidoo these days. If you are a regular Twitter user, Squidoo can announce your lenses there as well, and it connects to all your other social networking profiles in an easy way. Squidoo is a great platform if you have expertise in something. Share your knowledge and you shall be rewarded!

August 25, 2009 ISSUE #38

First Click Free

For years, webmasters have faced problems getting their authenticated pages indexed by search engines. While they may want Google to index those pages, they still want normal visitors to log in to view the information. Search engines like Google do not index pages that ask for authentication: the login page treats all users, including Googlebot, the same way, which prevents these pages from appearing on the SERP (search engine results page).

To address this, Google has introduced a feature known as “First Click Free”. Users who arrive at the website through a Google search result are granted permission to view the page they clicked; to view further content, they can be asked for login details. To implement this feature, we need to follow these Google guidelines:

  • Users who arrive at your website through a Google search result must be allowed to see the full text of the content they’re trying to access.
  • The content you serve to Googlebot must be identical to the content shown to users who visit from Google.
  • In the case of a multi-page article, Google states that both Googlebot and the user must be able to view the entire article. The simplest way is to display the entire content on a single page. If you cannot fit the article on one page, you can use cookies to make sure the user can visit every page of the article before being asked for registration or payment (see the sketch after this list). You are still allowed to restrict other parts of your website behind registration or payment. Alternatively, if you want to show only a portion of the content, you can make that portion visible and restrict the rest; in that case, Googlebot will only index the visible part, and the rest of the article will not be accessible through Google or to its users.
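
As an illustration of the cookie technique for multi-page articles, here is a minimal sketch in Python using Flask. The route layout, cookie name, and rendering helpers are my own assumptions for the example, not anything Google specifies; the only requirement is that a Google-referred visitor can reach every page of the article before hitting the registration wall.

```python
from flask import Flask, make_response, request
from urllib.parse import urlparse

app = Flask(__name__)

def came_from_google(referer):
    """True when the Referer header points at a Google domain."""
    host = urlparse(referer or "").netloc.lower()
    return host == "www.google.com" or host.endswith(".google.com")

def full_page(slug, page):
    return f"<h1>{slug}, page {page}</h1><p>...full article text...</p>"

def registration_wall():
    return "<p>Please register or pay to keep reading.</p>"

@app.route("/article/<slug>/page/<int:page>")
def article_page(slug, page):
    cookie = f"fcf_{slug}"  # scoped to one article, not the whole site
    if came_from_google(request.referrer) or request.cookies.get(cookie) == "1":
        resp = make_response(full_page(slug, page))
        # Grant access to the remaining pages of this article only.
        resp.set_cookie(cookie, "1", max_age=3600)
        return resp
    return registration_wall(), 401
```

The short cookie lifetime keeps the grant limited to one reading session, so the rest of the site stays behind the registration wall.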

Technical Implementation
To include the restricted content in Google’s search index, the crawler needs to be able to access that content on the site. Since Googlebot cannot get past registration or login forms, we need to configure the website to serve the full text of each document whenever the request is identified as coming from Googlebot, via its user-agent and IP address. It is equally important that the robots.txt file allows Googlebot access to these URLs.
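
The user-agent string alone is trivial to spoof, so the identification step is usually backed by a DNS check: a reverse lookup on the requesting IP must land in a Google crawl domain, and a forward lookup on that name must resolve back to the same IP. A sketch in Python (the function name is my own):

```python
import socket

def is_googlebot(user_agent, ip):
    """Verify a Googlebot claim via user-agent plus double DNS lookup."""
    if "Googlebot" not in user_agent:
        return False
    try:
        # Reverse lookup: the host must be in a Google crawl domain.
        host, _, _ = socket.gethostbyaddr(ip)
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        # Forward lookup: the name must resolve back to the same IP.
        return ip in socket.gethostbyname_ex(host)[2]
    except (socket.herror, socket.gaierror):
        return False
```

In practice the result is worth caching per IP, since two DNS lookups on every request would be costly.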

When a user clicks a Google search result to access the content, the web server needs to check the “Referer” HTTP request-header field. If the referring URL is on a Google domain such as www.google.com, the website serves the entire content of the page, even though that page is protected from other visitors. In short, content is delivered based on the Referer header for visitors and on the IP address or User-Agent HTTP header for Googlebot.
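
Putting both rules together, the serving decision this section describes boils down to a single predicate; a sketch, reusing `is_googlebot` and `came_from_google` from the snippets above:

```python
def should_serve_full_text(user_agent, ip, referer):
    """First Click Free gate: full text for the verified crawler and for
    readers arriving directly from a Google search result."""
    return is_googlebot(user_agent, ip) or came_from_google(referer)
```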

This lets Googlebot crawl the protected content, bringing you the quality traffic you seek.

Controversy
Some people refer to this practice as cloaking, but Google denies it. According to Google, “Cloaking refers to the practice of presenting different content or URLs to users and search engines. Serving up different results based on user agent may cause your site to be perceived as deceptive and removed from the Google index.” First Click Free, however, shows the same content to Googlebot and to Google users, which could fairly be called a legitimate practice.

August 10, 2009 ISSUE #37