Robots.txt is a plain-text configuration format created in 1994 by Martijn Koster.
#2086 on PLDB | 29 Years Old
A robots.txt file tells search engine crawlers which URLs the crawler can access on your site.
```
User-agent: googlebot # all Google services
Disallow: /private/ # disallow this directory

User-agent: googlebot-news # only the news service
Disallow: / # disallow everything

User-agent: * # any robot
Disallow: /something/ # disallow this directory
```
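A sketch of how a crawler might evaluate rules like these, using Python's standard-library `urllib.robotparser` (note: its user-agent matching is simpler than Google's group-matching rules, so edge cases such as `googlebot-news` may resolve differently; the example URL is hypothetical):

```python
from urllib.robotparser import RobotFileParser

# Rules mirroring the example above (comments omitted for brevity)
rules = """\
User-agent: googlebot
Disallow: /private/

User-agent: googlebot-news
Disallow: /

User-agent: *
Disallow: /something/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# googlebot is blocked from /private/ but may fetch other paths
print(rp.can_fetch("googlebot", "https://example.com/private/page"))
print(rp.can_fetch("googlebot", "https://example.com/page"))

# any other bot falls through to the * group, which blocks /something/
print(rp.can_fetch("mybot", "https://example.com/something/page"))
```

In production, `rp.set_url("https://example.com/robots.txt")` followed by `rp.read()` would fetch the live file instead of parsing an inline string.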
| Feature | Supported | Token | Example |
|---|---|---|---|
| Comments | ✓ | | `# A comment` |
| Line Comments | ✓ | `#` | `# A comment` |