webspace/static/robots.txt

# allow crawling everything by default
User-agent: *
Disallow:
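
Note: an empty Disallow directive blocks nothing, so this file permits every crawler to fetch every path. A minimal sketch of how one could verify that behaviour with Python's standard-library urllib.robotparser (the example URL is only an illustrative placeholder):

    from urllib.robotparser import RobotFileParser

    # The rules exactly as they appear in robots.txt above.
    rules = [
        "# allow crawling everything by default",
        "User-agent: *",
        "Disallow:",
    ]

    parser = RobotFileParser()
    parser.parse(rules)

    # With no disallowed paths, any URL is fetchable for any user agent.
    print(parser.can_fetch("*", "https://example.com/any/path"))  # expected: True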