Provides Search Engine Optimization (SEO) features:
- Meta description, keywords, robots, and custom meta tags
- Canonical URL
- Open Graph metadata
- Twitter Card Tags
- Google schema
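As a purely illustrative sketch of what these features cover (tag names follow the open standards, not necessarily this module's exact output), the rendered tags in a page's `<head>` might look like:

```html
<!-- Illustrative only: standard HTML meta, Open Graph, Twitter Card,
     and schema.org JSON-LD tags; not this module's exact markup. -->
<meta name="description" content="A short page summary for search results.">
<meta name="keywords" content="seo, example">
<meta name="robots" content="index, follow">
<link rel="canonical" href="https://example.com/page">
<meta property="og:title" content="Page title">
<meta property="og:image" content="https://example.com/cover.png">
<meta name="twitter:card" content="summary_large_image">
<script type="application/ld+json">
  { "@context": "https://schema.org", "@type": "WebPage", "name": "Page title" }
</script>
```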
Starting with version 1.7, a robots.txt file can be created via site settings. This feature allows site owners to easily define directives for search engine crawlers and other web robots accessing their site. By default, the generated robots.txt contains the following settings:
- `User-agent: *`: this directive applies to all web robots.
- `Disallow: /admin`: this directive specifies that web robots should not access the /admin directory, which is commonly used for administrative purposes.
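Given the two default directives described above, the generated robots.txt would look like:

```text
User-agent: *
Disallow: /admin
```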
These defaults aim to provide a basic configuration that lets search engines access the necessary files and directories while restricting access to sensitive areas of the site. Site owners can modify these settings to fit their specific requirements from the admin dashboard under Configuration >> Settings >> SEO.
If the Sitemaps feature is enabled, all sitemap indexes and sitemaps are added to the robots.txt file by default.
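With the Sitemaps feature enabled, the generated file might then look like the following (the sitemap URL below is hypothetical; actual entries depend on the site's configured sitemaps):

```text
User-agent: *
Disallow: /admin

Sitemap: https://example.com/sitemap.xml
```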
If the site's filesystem already contains a robots.txt file, that file takes precedence and the site settings used to generate the file are ignored.