Creating robots.txt ahead of time

Versioned docs are coming back to subdirectories soon; this makes sure they aren't indexed.
John Mulhausen 2016-10-27 12:34:58 -07:00 committed by GitHub
parent 13e79b3ec0
commit 482007ee35
1 changed file with 9 additions and 0 deletions

robots.txt Normal file

@@ -0,0 +1,9 @@
User-agent: *
Disallow: /v1.4/
Disallow: /v1.5/
Disallow: /v1.6/
Disallow: /v1.7/
Disallow: /v1.8/
Disallow: /v1.9/
Disallow: /v1.10/
Disallow: /v1.11/
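
Once deployed, the rules can be sanity-checked with Python's standard-library robotparser. This is a minimal sketch, assuming the file is served at the root of docs.docker.com (the host is an assumption here; substitute whatever host the site actually uses):

```python
from urllib import robotparser

# Parse the deployed robots.txt (host is an assumption for illustration).
rp = robotparser.RobotFileParser()
rp.set_url("https://docs.docker.com/robots.txt")
rp.read()

# Versioned doc subdirectories should be blocked for every crawler...
print(rp.can_fetch("*", "https://docs.docker.com/v1.4/some-page"))  # expected: False

# ...while unversioned paths stay crawlable.
print(rp.can_fetch("*", "https://docs.docker.com/engine/"))         # expected: True
```

robotparser matches Disallow rules by path prefix, so anything under /v1.4/ through /v1.11/ reports as blocked while other paths are allowed.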