
Robots.txt - Config format


Robots.txt is a config format created in 1994 by Martijn Koster.

#2086 on PLDB · 29 Years Old

A robots.txt file tells search engine crawlers which URLs the crawler can access on your site.


Example from the web:
User-agent: googlebot        # all Google services
Disallow: /private/          # disallow this directory

User-agent: googlebot-news   # only the news service
Disallow: /                  # disallow everything

User-agent: *                # any robot
Disallow: /something/        # disallow this directory
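A sketch of how a crawler might evaluate rules like the ones above, using Python's standard-library `urllib.robotparser`. The rules string mirrors the example; the host `example.com` and the bot names are just illustrative.

```python
from urllib.robotparser import RobotFileParser

# Rules mirroring the example above (example.com is a placeholder host)
rules = """\
User-agent: googlebot        # all Google services
Disallow: /private/          # disallow this directory

User-agent: googlebot-news   # only the news service
Disallow: /                  # disallow everything

User-agent: *                # any robot
Disallow: /something/        # disallow this directory
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# googlebot is barred from /private/ but may crawl other paths
print(parser.can_fetch("googlebot", "https://example.com/private/a.html"))  # False
print(parser.can_fetch("googlebot", "https://example.com/public/a.html"))   # True

# any other robot is barred only from /something/
print(parser.can_fetch("otherbot", "https://example.com/something/x"))      # False
```

Note that `RobotFileParser` matches `User-agent` groups in file order and by substring, so real crawler behavior (e.g. Google's longest-match rule) can differ in edge cases.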

Language features

Feature         Supported   Token   Example
Comments        ✓                   # A comment
Line Comments   ✓           #       # A comment

