This plugin adds a robots.txt file to your Winter CMS site.
A `robots.txt` file is used to manage crawler traffic to your site. It tells search engine crawlers which URLs they can access, mainly to avoid overloading your site with requests.
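For example, a robots.txt that blocks crawlers from a private area while allowing everything else might look like this (the `/admin` path is purely illustrative):

```
User-agent: *
Disallow: /admin
Allow: /
```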
Use Composer to install the plugin:

```bash
composer require webvpf/wn-robots-plugin
```
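If the settings page does not appear right away, running Winter's migration command usually registers the plugin (assuming a standard Winter CMS install):

```bash
php artisan winter:up
```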
In the backend of your application, go to Settings and select **Robots.txt**. When filling out the rules, you can use Twig.
An example of filling out robots.txt:

```twig
User-agent: *
Allow: /
Sitemap: {{ 'sitemap.xml' | app }}
```

or

```twig
Sitemap: {{ url('sitemap.xml') }}
```
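Either form resolves the path against your application's base URL, so both render the same absolute address.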
As a result, robots.txt will look like this:

```
User-agent: *
Allow: /
Sitemap: https://example.com/sitemap.xml
```
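Because the rules pass through Twig, static directives can be mixed with generated URLs. A sketch that also hides the admin area, assuming the default `/backend` path:

```twig
User-agent: *
Disallow: /backend
Allow: /

Sitemap: {{ url('sitemap.xml') }}
```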
For detailed instructions on filling out the robots.txt file, see [developers.google.com](https://developers.google.com).