Robots.txt Generator
Create a custom robots.txt file to control how search engine crawlers access your website
What is robots.txt?
A robots.txt file tells search engine crawlers which URLs they can access on your site. It is used mainly to manage crawler traffic and avoid overloading your site with requests; it is not a reliable way to keep a page out of search results.
Place the robots.txt file at the root of your website: https://example.com/robots.txt
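A minimal sketch of what a generated file might look like (the directory paths and sitemap URL here are placeholders for illustration):

```
# Rules for all crawlers ("*" matches any user agent)
User-agent: *
Disallow: /private/
Allow: /

# Tell crawlers where to find the sitemap
Sitemap: https://example.com/sitemap.xml
```

Each rule group begins with a User-agent line naming the crawler it applies to, followed by Disallow and Allow directives; the Sitemap line is independent of any group.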