Extend robots.txt configuration
Overview
Since Shopware 6.7.1, the platform provides full robots.txt support, covering all standard directives and user-agent blocks. The feature originated as an open-source contribution during Hacktoberfest 2024. For general configuration, refer to the user documentation.
INFO
The events and features described in this guide are available since Shopware 6.7.5.
You can extend the robots.txt functionality through events to:
- Add custom validation rules during parsing
- Modify or generate directives dynamically
- Support custom or vendor-specific directives
- Prevent warnings for known non-standard directives
Prerequisites
This guide requires you to have a basic plugin running. If you don't know how to create a plugin, head over to the plugin base guide.
You should also be familiar with Event listeners.
INFO
This guide uses event listeners, since each example listens to a single event. If you need to subscribe to multiple events in the same class, consider using an event subscriber instead, as sketched below.
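For reference, a multi-event subscriber could look like the following sketch. The `EventSubscriberInterface` part is standard Symfony; the event namespace `Shopware\Core\Content\Robots\Event` is an assumption here — verify it against the classes shipped with your Shopware version.

```php
<?php declare(strict_types=1);

namespace Swag\BasicExample\Subscriber;

use Symfony\Component\EventDispatcher\EventSubscriberInterface;
// Assumed namespace -- check the actual location in your Shopware version.
use Shopware\Core\Content\Robots\Event\RobotsDirectiveParsingEvent;
use Shopware\Core\Content\Robots\Event\RobotsUnknownDirectiveEvent;

class RobotsSubscriber implements EventSubscriberInterface
{
    public static function getSubscribedEvents(): array
    {
        return [
            RobotsDirectiveParsingEvent::class => 'onDirectiveParsing',
            RobotsUnknownDirectiveEvent::class => 'onUnknownDirective',
        ];
    }

    public function onDirectiveParsing(RobotsDirectiveParsingEvent $event): void
    {
        // Handle the parsed result here.
    }

    public function onUnknownDirective(RobotsUnknownDirectiveEvent $event): void
    {
        // Handle unknown directives here.
    }
}
```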
Modifying parsed directives
The RobotsDirectiveParsingEvent is dispatched after robots.txt content is parsed. You can modify the parsed result, add validation, or inject dynamic directives.
This example shows how to dynamically add restrictions for AI crawlers:
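A minimal listener might look like the sketch below. The event class name comes from this guide, but its namespace and the result API (`getResult()`, `addUserAgentBlock()`) are assumed names for illustration — adapt them to the actual event signature in Shopware 6.7.5.

```php
<?php declare(strict_types=1);

namespace Swag\BasicExample\Service;

use Symfony\Component\EventDispatcher\Attribute\AsEventListener;
// Assumed namespace -- verify against your Shopware installation.
use Shopware\Core\Content\Robots\Event\RobotsDirectiveParsingEvent;

class AiCrawlerRobotsListener
{
    #[AsEventListener]
    public function __invoke(RobotsDirectiveParsingEvent $event): void
    {
        // getResult() and addUserAgentBlock() are hypothetical accessor
        // names standing in for the event's real API.
        $result = $event->getResult();

        // Disallow everything for a set of known AI crawlers.
        foreach (['GPTBot', 'CCBot', 'anthropic-ai'] as $userAgent) {
            $result->addUserAgentBlock($userAgent, ['Disallow: /']);
        }
    }
}
```

Register the class as a service in your plugin's `services.xml`. Depending on your setup, the `#[AsEventListener]` attribute may require service autoconfiguration; alternatively, tag the service with `kernel.event_listener`.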
Handling custom directives
The RobotsUnknownDirectiveEvent is dispatched when an unknown directive is encountered. Use this to support vendor-specific directives or prevent warnings for known non-standard directives:
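The following sketch accepts the Yandex-specific Clean-param directive. Again, the accessor names (`getDirective()`, `markAsHandled()`) are assumptions, not the confirmed event API:

```php
<?php declare(strict_types=1);

namespace Swag\BasicExample\Service;

use Symfony\Component\EventDispatcher\Attribute\AsEventListener;
// Assumed namespace -- verify against your Shopware installation.
use Shopware\Core\Content\Robots\Event\RobotsUnknownDirectiveEvent;

class CleanParamDirectiveListener
{
    #[AsEventListener]
    public function __invoke(RobotsUnknownDirectiveEvent $event): void
    {
        // getDirective() and markAsHandled() are hypothetical names for
        // reading the directive and suppressing the "unknown" warning.
        if (strtolower($event->getDirective()) !== 'clean-param') {
            return;
        }

        // Clean-param is a Yandex-specific directive; accept it so no
        // warning is raised and the line is kept in the output.
        $event->markAsHandled();
    }
}
```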
Validation and parse issues
You can add validation warnings or errors during parsing using the ParseIssue class. This example shows common validation scenarios:
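The sketch below adds one warning and one error. `ParseIssue` comes from this guide, while its namespace, the severity constants, and the `addIssue()` and result helpers are assumed names — check the actual class before copying:

```php
<?php declare(strict_types=1);

namespace Swag\BasicExample\Service;

use Symfony\Component\EventDispatcher\Attribute\AsEventListener;
// Assumed namespaces -- verify against your Shopware installation.
use Shopware\Core\Content\Robots\Event\RobotsDirectiveParsingEvent;
use Shopware\Core\Content\Robots\ParseIssue;

class RobotsValidationListener
{
    #[AsEventListener]
    public function __invoke(RobotsDirectiveParsingEvent $event): void
    {
        $result = $event->getResult();

        // Recommendation only: a missing Sitemap directive is not fatal.
        if (!$result->hasDirective('Sitemap')) {
            $event->addIssue(new ParseIssue(
                ParseIssue::SEVERITY_WARNING, // assumed constant name
                'No Sitemap directive found; consider adding one.'
            ));
        }

        // Critical: blocking every crawler from the whole shop.
        if ($result->disallowsEverythingFor('*')) {
            $event->addIssue(new ParseIssue(
                ParseIssue::SEVERITY_ERROR, // assumed constant name
                'Disallow: / for User-agent: * blocks the entire shop.'
            ));
        }
    }
}
```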
Issues are automatically logged when the robots.txt configuration is saved in the Administration. Use WARNING for recommendations and ERROR for critical problems that prevent proper generation.