Robots.txt is a plain text file located at the root of a website that tells search robots which pages and files they should crawl and index.
To help you manage crawler traffic on your site, the SoloMono team developed the Editable robots.txt module, which lets you edit the robots.txt instructions that tell crawlers which URLs on your site they are allowed to process. With it, you can limit the number of crawl requests and thereby reduce the load on the site. Keep in mind that robots.txt is not intended to keep your content out of Google search results: if you don't want certain pages on your site to appear in Google, add a noindex directive to them or make them password-protected.
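For illustration, a minimal robots.txt might look like the sketch below; the paths and sitemap URL are placeholders for this example, not values supplied by the module:

    User-agent: *
    Disallow: /admin/
    Disallow: /checkout/
    Sitemap: https://www.example.com/sitemap.xml

And to keep an individual page out of search results, a noindex directive can be placed in that page's HTML head, for example: <meta name="robots" content="noindex">.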