Customizing The robots.txt File
The `robots.txt` file controls search engine crawlers' access to your site: it tells web crawlers which pages and sections they may visit.
If you need to customize the contents of this file when your application is deployed, you can add a command to the Build Commands section of your environment's deployment settings.
For example, if you want to disallow crawling any part of your website:
- Visit your application environment
- Open its Settings page
- In the Deployments section of the Settings page, under Build commands, add a command that writes the file.
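The exact command depends on where your application serves static files; as a minimal sketch, assuming the web root is a `public/` directory, a build command that disallows all crawling could look like:

```shell
# Ensure the web root exists (adjust "public" to your application's actual web root)
mkdir -p public
# Overwrite robots.txt to block all compliant crawlers from the entire site
printf 'User-agent: *\nDisallow: /\n' > public/robots.txt
```

Note that `Disallow: /` blocks compliant crawlers from the whole site; to block only part of it, scope the rule instead (for example, `Disallow: /admin/`).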