The robots.txt file controls how search engine crawlers access your site, telling them which pages and sections they may visit. If you need to customize the contents of this file when your application is deployed, you can add a command to the Build Commands section of your environment’s deployment settings. For example, to disallow crawling of any part of your website:
  1. Visit your application environment
  2. Open its Settings page
  3. In the Deployments section of the Settings page, under Build commands, add:
echo -e "User-agent: *\nDisallow: /" > public/robots.txt
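If the file grows beyond a single directive, a heredoc in the build command can be easier to maintain than echo -e. A sketch of the same Disallow-all rule written that way:

```shell
# Sketch: write the same Disallow-all robots.txt with a heredoc,
# which stays readable as more directives are added.
mkdir -p public
cat > public/robots.txt <<'EOF'
User-agent: *
Disallow: /
EOF
```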

Cloudflare Reserved Paths and SEO

Laravel Cloud utilizes Cloudflare’s CDN service, which automatically reserves the /cdn-cgi path on all hosted domains. On your domain, this path surfaces as endpoints such as https://www.example.com/cdn-cgi/l/email-protection. These /cdn-cgi endpoints are:
  • Automatically generated by Cloudflare’s service
  • Managed entirely by Cloudflare
  • Not modifiable or customizable
For optimal SEO performance, consider adding the following directive to your robots.txt file to exclude these paths from search engine crawling:
Disallow: /cdn-cgi/
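Putting the pieces together, a build command along these lines (a sketch; adjust the rules to your site) would keep the site crawlable while excluding the reserved path:

```shell
# Sketch: generate a robots.txt that allows crawling generally but
# excludes Cloudflare's reserved /cdn-cgi/ path from search engines.
mkdir -p public
printf 'User-agent: *\nDisallow: /cdn-cgi/\n' > public/robots.txt
```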
For additional technical details about Cloudflare’s CDN endpoints, refer to Cloudflare’s developer documentation.