You have a great website: it has a great design and great content,
it contains all of the important keywords, the usability is fine…
but the website does not rank well on search engines.
Many webmasters experience this problem. Fortunately, there is a way to
find the seemingly inexplicable reasons why your web pages do not get ranked.

There can be several “invisible” reasons why your web pages do not get
high rankings on Google and other search engines.
1. The robots.txt file is not correct
If your content management system offers a development mode, chances
are that the robots.txt file of your website blocked all search engines
while you developed the website. If you have not changed the robots.txt
file since then, search engine robots are still blocked. Remove the
blocking “Disallow:” lines from your robots.txt file to make sure that
your web pages can be accessed by search engine robots.
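As a hypothetical example, a robots.txt file left over from development mode, and a corrected version that allows crawling, might look like this:

```
# Left over from development mode — blocks ALL search engine robots:
User-agent: *
Disallow: /

# Corrected version — an empty Disallow rule blocks nothing:
User-agent: *
Disallow:
```

The difference is a single character: “Disallow: /” blocks the whole site, while “Disallow:” with no path allows everything.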
2. The HTTP status code of your pages is not correct
When search engine robots and normal
visitors request a page from your server, your server answers with a
so-called HTTP status code. This status code cannot be seen by the
visitor as it is targeted at the program that requests the page.
The status code for a normal page should be ‘200 OK’. All other status
codes mean that there is something special about the page. For example,
4xx status codes mean that the page is broken, and 5xx status codes
mean that there is a problem with the server.
Some servers have configuration errors and deliver the wrong HTTP
status code. This does not matter to human website visitors and you
cannot see it in your browser. Search engine robots, however, won’t
index your web pages if they get the wrong HTTP status code.
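As a minimal sketch of such a check, the snippet below classifies status codes the way the text describes and fetches them with Python’s standard library. The helper names (classify_status, fetch_status) are our own, not part of any SEO tool:

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError

def classify_status(code):
    """Map an HTTP status code to the categories described above."""
    if code == 200:
        return "ok"            # normal page, can be indexed
    if 300 <= code < 400:
        return "redirect"      # the page lives at another URL
    if 400 <= code < 500:
        return "client error"  # 4xx: the page is broken
    if 500 <= code < 600:
        return "server error"  # 5xx: problem with the server
    return "other"

def fetch_status(url):
    """Return the HTTP status code of a URL (hypothetical helper)."""
    try:
        # A HEAD request is enough; we only need the status line.
        with urlopen(Request(url, method="HEAD")) as response:
            return response.status
    except HTTPError as err:
        return err.code  # urlopen raises on 4xx/5xx responses

print(classify_status(200))  # a page that returns anything but 200
print(classify_status(503))  # will not be indexed
```

A page that looks fine in the browser can still return a 4xx or 5xx code here, which is exactly the invisible error described above.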
3. There are other technical errors

Other technical errors can also have a negative influence on your
rankings. For example, the HTTPS settings of your website might not be
correct, or the pages might load too slowly.
In addition, websites accumulate errors over time. Some links on the
pages become obsolete, old pages do not fit on the new website, etc. If
a website contains too many of these errors, it will look like a
low-quality website.
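One of these errors, obsolete links, can be found mechanically: collect every link target on a page, then request each one and check its status code. The sketch below does the collecting step with Python’s standard library (the class name is our own, not from any particular tool):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect the href targets of all <a> tags on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Keep only anchor tags that actually have a non-empty href.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

collector = LinkCollector()
collector.feed('<p><a href="/old-page">Old</a> and '
               '<a href="https://example.com/">home</a>.</p>')
print(collector.links)  # → ['/old-page', 'https://example.com/']
```

Each collected link can then be requested to see whether it still returns ‘200 OK’ or has become a broken 4xx page.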
How to find these errors
Of course, you can manually search your web pages for these errors, but
this can take quite some time. That’s the reason why we developed the
Website Audit tool in SEOprofiler.
Among many other things, the Website Audit tool in SEOprofiler checks
the robots.txt file of your site, it checks the HTTP status codes, and
it checks your web pages for other errors that can have a negative
influence on the rankings of your web pages.
The Website Audit tool also shows you
how to remove these errors so that search engines can index your web
pages as well as possible.
SEOprofiler offers many tools that help you to get more out of your
website. If you haven’t done it yet, create your SEOprofiler account
now and create an audit report for your website.
Article source: http://newsletter.seoprofiler.com/newsletter776.htm