As you know, configuring robots.txt is important for any website that cares about SEO. In particular, when you configure the sitemap to allow search engines to index your store, you also need to give web crawlers instructions in the robots.txt file so that they skip the pages you do not want indexed. The robots.txt file, which resides in the root of your Magento installation, contains directives that search engines such as Google, Yahoo, and Bing can recognize and follow easily. In this post, I will show you how to configure the robots.txt file so that it works well with your site.
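For context, here is a minimal sketch of the kind of directives this tutorial deals with; the domain and sitemap URL are placeholders, not values from a real installation:

    User-agent: *
    Disallow: /customer/
    Sitemap: https://www.example.com/sitemap.xml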
Step 1: On the Admin sidebar, go to Stores. In the Settings section, select Configuration.
Step 2: Under General in the panel on the left, choose Design.
Step 3: Expand the Search Engine Robots section, and continue with the following:
- In the Default Robots field, select one of the following options: INDEX, FOLLOW; NOINDEX, FOLLOW; INDEX, NOFOLLOW; NOINDEX, NOFOLLOW.
- In the Edit Custom instruction of robots.txt File field, enter custom instructions if needed (see the sample instructions after these steps).
- Next to the Reset to Defaults field, click the Reset to Default button if you need to restore the default instructions.
Step 4: When complete, click Save Config.
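If you enter custom instructions, the following is an illustrative sketch of what you might paste into the Edit Custom instruction of robots.txt File field; the paths and sitemap URL are placeholders for your own store, not values supplied by Magento:

    User-agent: *
    Disallow: /checkout/
    Disallow: /customer/
    Disallow: /*SID=
    Sitemap: https://www.your-store.example/sitemap.xml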
For reference, the default instructions include directives such as:

Disallow: /lib/
Disallow: /*.php$
Disallow: /pkginfo/
Disallow: /report/
Disallow: /var/
Disallow: /catalog/
Disallow: /customer/
Disallow: /sendfriend/
Disallow: /review/
Disallow: /*SID=
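As a quick illustration (the URLs below are hypothetical, for explanation only), crawlers that support wildcard matching interpret two of the rules above as follows:

    # Blocks any URL whose path ends in .php, e.g. https://example.com/index.php
    Disallow: /*.php$
    # Blocks any URL containing a session ID parameter, e.g. https://example.com/catalog/product/view?SID=abc123
    Disallow: /*SID=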
This brings us to the end of the tutorial: How to Configure Robots.txt in Magento 2.