The old robots.txt was invalid because it contained two `User-agent: *` groups. Generating robots.txt through the API produces a correct file, with each user agent declared only once.
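For illustration (the directives shown are hypothetical, not the actual file contents), a file with duplicate wildcard groups looks like this:

```
User-agent: *
Disallow: /admin/

User-agent: *
Disallow: /tmp/
```

The generated file merges all directives for the same user agent into a single group:

```
User-agent: *
Disallow: /admin/
Disallow: /tmp/
```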