
Robots plugin

This plugin adds a robots.txt file to your Winter CMS site.


A robots.txt file is used to manage crawler traffic to your site. It tells search engine crawlers which URLs they can access, mainly to avoid overloading your site with requests.

Getting started

Use composer to install the plugin:

composer require webvpf/wn-robots-plugin

Editing robots.txt

In the backend of your application, go to Settings and select Robots.txt. The rules field supports Twig.

An example robots.txt template:

User-agent: *
Allow: /
Sitemap: {{ 'sitemap.xml' | app }}

or

Sitemap: {{ url('sitemap.xml') }}

As a result, robots.txt will look like this:

User-agent: *
Allow: /
Sitemap: https://example.com/sitemap.xml
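Because the rules pass through Twig, you can mix static directives with generated URLs. A minimal sketch that also keeps crawlers out of the backend (this assumes the default /backend backend URI; adjust the path if you changed it in your configuration):

User-agent: *
Disallow: /backend
Allow: /
Sitemap: {{ url('sitemap.xml') }}

The url() function resolves the path against your site's base URL, so the same template works across environments.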

Instructions for robots.txt

For detailed guidance on writing robots.txt rules, see Google's documentation at developers.google.com.