Why do I have references to robots.txt in my web stats programs?

Many search engines look for a robots.txt file before spidering your site's content. The robots.txt file contains rules that a robot/spider should abide by when crawling your site, so those requests for /robots.txt show up in your web statistics.
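As an illustration, a minimal robots.txt placed in your site's document root might look like the following; the paths listed here are examples, not defaults:

```
# Rules for all spiders
User-agent: *
# Ask well-behaved robots to skip these example paths
Disallow: /cgi-bin/
Disallow: /private/
```

A spider that honors the file will skip the listed paths; one that ignores it will crawl them anyway, since robots.txt is advisory, not an access control.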

Related Articles

What is the location of curl?

curl is located at /usr/bin/curl.
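You can confirm the path yourself from a shell; the fallback message here is just for illustration:

```shell
# Print the full path of the curl binary (e.g. /usr/bin/curl),
# or a note if it is not installed on this host
command -v curl || echo "curl not found"
```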

Sub-Domain times out

When you add a sub-domain and it times out, make sure you do not have redirection set from that...

SSL - Creating a CSR in Windows 2003

Creating the CSR in IIS on Windows 2003 or 2000 Server: From Administrative Tools, run the Internet...

Why can I not ping my domain?

For security purposes we have blocked ICMP, the protocol used by ping.

Change Your Website's Favorite Icon

How to change your website's icon for your internet browser. Create a 16x16 pixel icon image...