Why do I have references to robots.txt in my web stat programs?

Many search engines look for a robots.txt file before spidering (crawling) your site's content. The robots.txt file contains rules that a robot/spider should follow when capturing data from your site. Because crawlers request this file before anything else, those requests are recorded by your statistics software, which is why robots.txt shows up in your reports even if the file does not exist.
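As an illustration, a minimal robots.txt placed in your site's document root might look like the following (the /private/ directory is just a placeholder, not a path from your account):

    # Apply these rules to all robots/spiders
    User-agent: *
    # Ask them not to crawl anything under /private/
    Disallow: /private/

A crawler that honors the file will skip /private/ but may still request everything else on your site.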

