
# SEO (OrchardCore.Seo)

Provides Search Engine Optimization (SEO) features:

## robots.txt File

Starting with version 1.7, you can create a robots.txt file via site settings. This feature allows website owners to define directives for search engine crawlers and other web robots accessing their site. By default, the robots.txt file contains the following settings:

```
User-agent: *
Disallow: /Admin/
```

- `User-agent: *`: This directive applies to all web robots.
- `Disallow: /Admin/`: This directive specifies that web robots should not access the `/Admin/` directory, which is used for administrative purposes.

These default settings provide a basic configuration that lets search engines access the necessary files and directories while restricting access to sensitive areas of the site. Website owners can modify these settings to suit their specific requirements by navigating in the admin dashboard to Configuration >> Settings >> SEO.
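
For example, a site owner might extend the defaults to keep crawlers out of an additional directory and to throttle a specific bot. A minimal sketch; the `/staging/` path and the `Bingbot` rule are illustrative examples, not OrchardCore defaults:

```
# Ask Bing's crawler to wait 10 seconds between requests (non-standard but widely honored directive)
User-agent: Bingbot
Crawl-delay: 10

# All other robots: keep the defaults and also block a hypothetical staging path
User-agent: *
Disallow: /Admin/
Disallow: /staging/
```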

!!! note
    If the Sitemaps feature is enabled, all sitemap indexes and sitemaps are added to the robots.txt file by default.
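
In that case, the generated file ends with one `Sitemap` directive per sitemap or sitemap index, along these lines (host and path are hypothetical):

```
Sitemap: https://example.com/sitemap.xml
```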

!!! warning
    If the site's filesystem contains a robots.txt file, that file takes precedence and the site settings used to generate robots.txt are ignored.
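
In a typical ASP.NET Core setup, such a file would live at `wwwroot/robots.txt` and be served as a static file ahead of the generated content, though the exact location depends on how static files are configured (the `wwwroot` path is an assumption based on common ASP.NET Core conventions).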
