
Robots.txt Generator

Create a custom robots.txt file to control which parts of your website search engine crawlers may access (robots.txt governs crawling, not indexing)

What is robots.txt?

A robots.txt file tells search engine crawlers which URLs they can access on your site. This is used mainly to avoid overloading your site with requests.

Place the robots.txt file at the root of your website: https://example.com/robots.txt
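
For example, a minimal robots.txt that lets every crawler fetch everything except one directory might look like this (the /private/ path is only an illustration):

    # Rules for all crawlers
    User-agent: *
    Disallow: /private/

An empty Disallow value (Disallow:) would instead permit access to the entire site.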

User-agent Directives

User-agent
The name of the crawler the rules that follow apply to. Use * to match all crawlers, or a specific token such as Googlebot to target one crawler.
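
As a sketch, the following rules block every crawler from a hypothetical /admin/ area while giving Googlebot a group of its own (both paths are placeholders):

    # Applies to all crawlers
    User-agent: *
    Disallow: /admin/

    # Group specific to Googlebot
    User-agent: Googlebot
    Disallow: /admin/
    Allow: /admin/help/

A crawler obeys only the most specific group that matches it, so here Googlebot follows its own group and ignores the * rules.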

Crawl-delay Directive

Crawl-delay
The number of seconds a crawler should wait between successive requests (optional). Support varies: Bing and Yandex honor this directive, while Googlebot ignores it.
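
For instance, to ask compliant crawlers to wait ten seconds between requests (the value here is purely illustrative):

    # Request a 10-second pause between fetches
    User-agent: *
    Crawl-delay: 10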

Sitemap Directive

Sitemap URL
The absolute URL of your XML sitemap. The Sitemap directive stands on its own and is not tied to any User-agent group.
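
A minimal sketch, assuming the sitemap lives at the conventional location on example.com:

    # Point crawlers at the XML sitemap
    Sitemap: https://example.com/sitemap.xml

Multiple Sitemap lines are allowed if you maintain more than one sitemap file.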

Host Directive

Host URL
The preferred domain name without the protocol (optional). Host is a non-standard directive historically recognized mainly by Yandex.
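
A minimal sketch, reusing the example.com placeholder domain:

    # Declare the preferred domain (Yandex)
    Host: example.com

Crawlers that do not recognize Host simply ignore the line.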

