Why do I have references to robots.txt in my web stat programs?

Many search engines look for a robots.txt file before spidering your site's content. The robots.txt file contains rules that tell a robot/spider which parts of your site it may or may not crawl, which is why requests for it appear in your web statistics.
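For reference, a minimal robots.txt placed in your site's root directory might look like the following; the paths shown are examples only and should be replaced with directories you actually want crawlers to skip:

User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/

The "User-agent: *" line applies the rules to all crawlers, and each "Disallow" line lists a path that well-behaved robots are asked not to index.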

