a collection of technical fixes and other random stuff

Spodworld


Robots.txt: don't forget to fill it in

The primary purpose of the robots.txt file on your website is to tell search engines and crawlers what NOT to index.

One common mistake is not filling it in correctly; among other things, this can cause Google to treat the file as invalid.

This usually happens when you think "I don't want to block anything on this site", so you don't put anything in the file at all.
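A minimal sketch of how to see what crawlers actually receive, assuming your site lives at the placeholder mywebsiteURL: it fetches robots.txt the way a crawler would and flags a missing or empty file.

import urllib.error
import urllib.request

# Placeholder address - swap in your own site.
url = "http://mywebsiteURL/robots.txt"

try:
    with urllib.request.urlopen(url) as response:
        body = response.read().decode("utf-8", errors="replace")
    if body.strip():
        print(body)
    else:
        # This is the situation described above: nothing is blocked,
        # but crawlers may treat the empty file as invalid.
        print("robots.txt is served but empty")
except urllib.error.URLError as err:
    print(f"robots.txt could not be fetched: {err}")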

 

My advice when creating a robots.txt file is to add a rule that blocks an imaginary folder or file, so the file always contains at least one directive.

For example:

User-agent: *
Disallow: /mynonexistantdirectory/
Disallow: /myloginpage.ashx
Sitemap: http://mywebsiteURL/sitemap.xml

(replace mywebsiteURL with your own site's address)

If you have paths you actually need to block, use those instead; just don't leave the file blank.
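If you want to double-check that a file like the one above does what you intend, Python's standard-library robots.txt parser can read it and report what a crawler is allowed to fetch. A minimal sketch, assuming the example rules above (mywebsiteURL and somepage.html are placeholders):

from urllib import robotparser

# The rules from the example above (the Sitemap line isn't needed for this check).
rules = """\
User-agent: *
Disallow: /mynonexistantdirectory/
Disallow: /myloginpage.ashx
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# The imaginary directory and the login page are blocked for all crawlers...
print(parser.can_fetch("*", "http://mywebsiteURL/mynonexistantdirectory/"))  # False
print(parser.can_fetch("*", "http://mywebsiteURL/myloginpage.ashx"))         # False

# ...while the rest of the site stays crawlable.
print(parser.can_fetch("*", "http://mywebsiteURL/somepage.html"))            # True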