How To Use Robots.txt For Magento 2


The Robots.txt file is a standard used by websites to communicate with web robots (also known as web crawlers or spiders). It tells robots which parts of the site they may crawl, giving site owners a way to keep crawlers out of sections they do not want visited. Note that Robots.txt controls crawling rather than indexing: a page blocked in Robots.txt can still appear in search results if other sites link to it.


If you are using Magento 2 for your online store, you can use the Robots.txt file to control how search engines and other bots crawl and index your website. This article will provide you with a guide on how to use Robots.txt for Magento 2.

What is Robots.txt

Robots.txt is a plain-text file located at the root of a website (for example, at /robots.txt). It tells web robots which parts of the site they can crawl and which parts they cannot. Site owners commonly use it to keep crawlers away from pages that add no search value, such as duplicate content, login pages, and admin pages.
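A minimal Robots.txt file illustrates the format: each record names a user agent and lists the paths it may or may not crawl. The paths and sitemap URL below are generic placeholders, not Magento-specific defaults:

```
User-agent: *
Disallow: /login/
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml
```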

Why Use Robots.txt

Using a Robots.txt file is an effective way to control how search engines and other bots crawl your website. It is particularly useful when you have pages that should not be crawled: listing them in the Robots.txt file tells well-behaved crawlers to skip them. (To reliably keep a page out of search results entirely, use a noindex robots meta tag instead, since Robots.txt only blocks crawling.)

How To Use Robots.txt For Magento 2

To generate a Robots.txt file for Magento 2, follow the steps below:

Step 1

Log in to your Magento 2 Admin panel and open the "Content" menu.

Step 2

Click on "Configuration" under the "Design" section.

Step 3

In the Design Configuration grid, click "Edit" for the website or store view whose Robots.txt settings you want to change. (These settings are scoped per website or store view, not per theme.)

Step 4

Expand the "Search Engine Robots" section.

Step 5

Under "Default Robots," select the option that applies to your website: "INDEX, FOLLOW," "NOINDEX, FOLLOW," "INDEX, NOFOLLOW," or "NOINDEX, NOFOLLOW." This setting controls the default robots meta tag rendered on your pages; "INDEX, FOLLOW" is the usual choice for a live store, while the NOINDEX options are typically reserved for staging environments.

Step 6

In the "Edit custom instruction of robots.txt File" field, enter any custom directives that you want to include in the Robots.txt file. For example, you can use this field to allow or disallow crawler access to specific parts of your website.
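As an example, Magento stores commonly block cart, checkout, account, and search-result URLs, which tend to produce duplicate or thin content. The directives below are typical community suggestions rather than official Magento defaults, so review them against your own URL structure before using them:

```
User-agent: *
Disallow: /checkout/
Disallow: /customer/
Disallow: /catalogsearch/
Disallow: /wishlist/
Allow: /
```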

Step 7

Click the "Save Configuration" button, then flush the Magento cache if prompted. Magento will serve the resulting file at yourstore.com/robots.txt.
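Once the file is live, you can sanity-check your rules before relying on them. As a sketch, Python's standard urllib.robotparser module parses the same directive format and reports whether a given URL would be crawlable (the rules and example.com domain below are hypothetical placeholders for your own):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules mirroring custom instructions saved in the admin
rules = """\
User-agent: *
Disallow: /checkout/
Disallow: /customer/
Allow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# A blocked path and an allowed path (example.com is a placeholder)
print(parser.can_fetch("*", "https://example.com/checkout/cart/"))   # False
print(parser.can_fetch("*", "https://example.com/mens-shirts.html")) # True
```

The same check works against the deployed file by calling set_url() with your live /robots.txt address and read() instead of parse().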

Final Thoughts

Using the Robots.txt file is an essential part of maintaining your website's SEO. By telling search engines and bots how to crawl your site, you make crawling more efficient and help the right content reach your audience. With the steps above, you can generate and configure a Robots.txt file for your Magento 2 store.
