
How to Configure Magento 2 robots.txt for SEO



If your website is like a house, the robots.txt file is its set of house rules. The first thing visitors do when they arrive is read those rules to learn which rooms the host allows them to enter.

Robots.txt is a text file that webmasters create to tell robots (search engine crawlers) how to crawl and index pages on their site. Proper configuration of the robots.txt file is therefore very important. If your website contains sensitive information you do not want made public, this is where you restrict access to it. A well-configured robots.txt also helps considerably with SEO.

Importance of Robots.txt in SEO

Robots.txt plays an important role in SEO. It helps search engines reach the pages you want them to find and index. However, most websites have directories or files that search engine robots do not need to visit, and a robots.txt file that excludes them will greatly assist your SEO.


Configuration of the Robots.txt for SEO

In essence, robots.txt is a very simple text file placed in the site's root directory. You can create it with any text editor, such as Notepad. Below is a simple robots.txt structure from a WordPress site:


  • User-agent: * — the rules below apply to all bots.
  • Allow: / — lets bots discover and index all pages and directories not blocked elsewhere.
  • Disallow: /wp-admin/ and Disallow: /wp-includes/ — block the wp-admin and wp-includes folders.
  • Sitemap: — the URL of the site's XML sitemap.
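
Putting those directives together, a minimal WordPress robots.txt might look like the following sketch (the sitemap URL is a placeholder for your own domain):

```
User-agent: *
Allow: /
Disallow: /wp-admin/
Disallow: /wp-includes/
Sitemap: https://example.com/sitemap.xml
```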

How to use robots.txt

  • Block bots from any directory you want to keep private
  • Block a single page
  • Block a specific bot
  • Remove an image from Google Images
  • Use "Allow" and "Disallow" together
  • Block the whole site from being indexed by bots
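
Each of the cases above maps to a short robots.txt rule. The snippets below are separate illustrations, one per case, with placeholder paths and a made-up bot name; they are not meant to be pasted into a single file as-is:

```
# 1. Block bots from a directory (placeholder path)
User-agent: *
Disallow: /private/

# 2. Block a single page
User-agent: *
Disallow: /thank-you.html

# 3. Block a specific bot ("BadBot" is a made-up name)
User-agent: BadBot
Disallow: /

# 4. Remove an image from Google Images
User-agent: Googlebot-Image
Disallow: /images/photo.jpg

# 5. Use Allow and Disallow together:
#    block a folder but allow one file inside it
User-agent: *
Disallow: /media/
Allow: /media/logo.png

# 6. Block the whole site from all bots
User-agent: *
Disallow: /
```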

Some mistakes to avoid when using the robots.txt file

Whether you reuse someone else's robots.txt or create your own, some mistakes are easy to make. Keep the following in mind:

  • Directives are case-sensitive, so match upper- and lowercase letters exactly.
  • Write each statement on its own line.
  • Do not add extra white space or omit required white space.
  • Do not insert any characters other than the command syntax.
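
To illustrate the first two points, here is a hypothetical correct fragment next to a malformed one (the path is a placeholder):

```
# Correct: one directive per line, exact casing
User-agent: *
Disallow: /private/

# Incorrect: directives combined on one line, wrong casing
# user-agent: * disallow: /private/
```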


If there is no robots.txt, search engines are free to crawl and index anything they find on the website. Creating a robots.txt file lets you guide how search engines crawl your site, which can improve how your website is evaluated.

The Magento 2 SEO extension from Mageplaza includes many outstanding features that are activated automatically when you install it, without any code modifications. It is also user-friendly and helps improve your SEO.
