
Five ways to inadvertently keep Google from indexing your website

You might be blocking Google without knowing it, which means that Google won’t index all the pages of your website. In this article, you’ll learn how to block Google and how to make sure that you do not block it inadvertently.


1. Errors in the robots.txt file of your website will keep Google away

The disallow directive of the robots.txt file is an easy way to exclude single files or whole directories from indexing. To exclude individual files, add this to your robots.txt file:

User-agent: *
Disallow: /directory/name-of-file.html

To exclude whole directories, use this:

User-agent: *
Disallow: /first-directory/
Disallow: /second-directory/

If your website has a robots.txt file, double-check it to make sure that you do not exclude directories that you want to see in Google’s search results.

Note that your website visitors can still see the pages that you exclude in the robots.txt file. Check your website with the website audit tool in SEOprofiler to find out if there are any issues with the robots.txt file.
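
If you want to test individual URLs against your robots.txt rules yourself, a few lines of Python will do it. Below is a minimal sketch that uses the standard urllib.robotparser module; the domain and the URL are placeholders that you would replace with your own:

# Minimal sketch, assuming Python 3; replace the domain and URL with your own.
import urllib.robotparser

parser = urllib.robotparser.RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")
parser.read()  # downloads and parses the live robots.txt file

url = "https://www.example.com/first-directory/page.html"
if parser.can_fetch("Googlebot", url):
    print("Googlebot may crawl:", url)
else:
    print("Googlebot is blocked from:", url)

If can_fetch returns False for a page that you want to see in the search results, your disallow rules are too broad.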

2. Use the meta robots noindex tag and Google will go away

The meta robots noindex tag enables you to tell search engine robots that a particular page should not be indexed. To exclude a web page from the search results, add the following code in the head section of the web page:
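
<meta name="robots" content="noindex, nofollow">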

In this case, search engines won’t index the page and they also won’t follow the links on the page. If you want search engines to follow the links on the page, use this tag:
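
<meta name="robots" content="noindex, follow">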

The page won’t appear on Google’s result pages, but the links on it will be followed. If you want to make sure that Google indexes all of your pages, remove this tag.

The meta robots noindex tag only influences search engine robots. Regular visitors of your website can still see the pages. The website audit tool in SEOprofiler will also inform you about issues with the meta robots noindex tag.

3. The wrong HTTP status code will send Google away

Pages that you want to see in the search results must return the HTTP status code 200. If a page returns an error code such as 404 or 500, or an unintended redirect, search engines won’t index it.
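
You can check the status code of a page with a short script. Below is a minimal sketch that uses Python’s standard urllib.request module; the URL is a placeholder:

# Minimal sketch, assuming Python 3; replace the URL with one of your own pages.
import urllib.request
import urllib.error

url = "https://www.example.com/some-page.html"
try:
    response = urllib.request.urlopen(url)
    print(url, "returned status", response.status)  # 200 means the page can be indexed
except urllib.error.HTTPError as error:
    print(url, "returned status", error.code)  # 4xx/5xx pages won't be indexed

Note that urlopen follows redirects automatically, so a redirecting URL can still report 200 here; check your server logs if you are unsure.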

4. Google won’t index password protected pages

If you password protect your pages, only visitors who know the password will be able to view the content.

Search engine robots won’t be able to access the pages either. Password protected pages can also have a negative influence on the user experience, so you should test this thoroughly.

5. If your pages require cookies or JavaScript, Google might not be able to index your pages

Cookies and JavaScript can also keep search engine robots away from your door. For example, you can hide content by making it only accessible to user agents that accept cookies.

You can also hide content behind complex JavaScript. Most search engine robots do not execute complex JavaScript code, so they won’t be able to read your pages.
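
A quick way to find out what a simple crawler sees is to fetch the raw HTML of a page without executing any JavaScript and then check whether your important content is present. Below is a minimal sketch in Python; the URL and the phrase to look for are placeholders:

# Minimal sketch, assuming Python 3; replace the URL and the phrase with your own.
import urllib.request

url = "https://www.example.com/some-page.html"
html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")

phrase = "your important content"
if phrase in html:
    print("The phrase is present in the raw HTML.")
else:
    print("The phrase is missing from the raw HTML; it may depend on JavaScript or cookies.")

If the text only appears after JavaScript runs or after a cookie is set, a robot that does not execute JavaScript or accept cookies will not see it.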

In general, you want Google to index your pages. The tools in SEOprofiler help you make sure that Google and other search engines can index your web pages correctly. Use the website audit tool in SEOprofiler to make sure that you do not inadvertently block Google.


Article source: http://www.free-seo-news.com/newsletter642.htm#facts


