
How to Edit & Optimize WordPress Robots.txt File for SEO


Have you optimized your WordPress robots.txt file for SEO?

If you haven't, you're overlooking an important part of SEO. The robots.txt file plays a critical role in your site's SEO.

Luckily, WordPress automatically creates a robots.txt file for you. Having this file is half the battle; you need to make sure it's optimized to get the full benefit.

The robots.txt file tells search engine bots which pages to crawl and which to avoid. In this post, I'll show you how to edit and optimize the robots.txt file in WordPress.

What is Robots.txt File?

Robots.txt is a plain text file that instructs search engine bots how to crawl and index a site. Whenever a search engine bot visits your site, it reads the robots.txt file and follows its instructions. Using this file, you can tell bots which parts of your site to crawl and which parts to avoid. However, the absence of a robots.txt file won't stop search engine bots from crawling and indexing your site.

Editing & Understanding Robots.txt in WordPress

As I've already mentioned, every WordPress site has a default robots.txt file in its root directory.

If you don't have a robots.txt file, you'll need to create one. It's easy to do: just create a text file on your computer, save it as robots.txt, and upload it to your root directory via FTP or the cPanel File Manager.
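As a rough sketch (assuming a Unix-like shell), you could create the file locally like this before uploading it. The rules shown are just common WordPress defaults, not requirements; use whatever rules fit your site:

```shell
# Create a minimal robots.txt in the current directory.
# Upload the resulting file to your site's root directory
# with your FTP client or the cPanel File Manager.
cat > robots.txt <<'EOF'
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
EOF
```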

Now let's see how to edit your robots.txt file.

You can edit your robots.txt file with an FTP client or the cPanel File Manager, but that's time-consuming and a bit awkward.

The easiest way to edit the robots.txt file is with a plugin. There are several WordPress robots.txt plugins out there; I prefer Yoast SEO, the best SEO plugin for WordPress. I've already shared how to set up Yoast SEO.

Yoast SEO lets you modify the robots.txt file from your WordPress admin area. However, if you'd rather not use Yoast, you can use another plugin such as WP Robots Txt.

Once you've installed and activated the Yoast SEO plugin, go to WordPress Admin Panel > SEO > Tools.
Then click on “File editor”.

Then you need to click on “Create robots.txt file”.

Then you will get the Robots.txt file editor. You can configure your robots.txt file from here.

Before editing the file, you need to understand its directives. There are three main ones.

User-agent – Names the search engine bot the rules apply to, like Googlebot or Bingbot. You can use an asterisk (*) to refer to all search engine bots.
Disallow – Instructs search engine bots not to crawl and index certain parts of your site.
Allow – Instructs search engine bots which parts of your site to crawl and index.
Here’s a sample of Robots.txt file.

User-agent: *
Disallow: /wp-admin/
Allow: /

This robots.txt file applies to all search engine bots. The second line tells them not to crawl the /wp-admin/ section, and the third line instructs them to crawl and index the rest of the site.

Designing and Optimizing Robots.txt File for SEO

A simple misconfiguration in the robots.txt file can completely deindex your site from search engines. For example, if you use the directive "Disallow: /", your site will be deindexed. So you need to be careful when configuring it.
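To make the danger concrete, this is the configuration to avoid: a single slash after Disallow blocks every URL on the site.

User-agent: *
Disallow: /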

Another important point is optimizing the robots.txt file for SEO. Before getting to best practices, I'd like to warn you about some bad ones.

  • Don't use the robots.txt file to hide low-quality content. The best practice is to use the noindex and nofollow meta tags, which you can add with the Yoast SEO plugin.
  • Don't use the robots.txt file to stop search engines from indexing your categories, tags, archives, author pages, and so on. You can add nofollow and noindex meta tags to those pages with the Yoast SEO plugin.
  • Don't use the robots.txt file to handle duplicate content. There are better ways to do that.
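For reference, the noindex approach the bullets above recommend comes down to a tag like this in a page's head section (Yoast SEO adds it for you, so you rarely need to write it by hand):

&lt;meta name="robots" content="noindex, nofollow"&gt;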

Now let's see how you can make your robots.txt file SEO-friendly.

  1. First, figure out which parts of your site you don't want search engine bots to crawl. I prefer disallowing /wp-admin/, /wp-content/plugins/, /readme.html, and /trackback/.
  2. Adding "Allow: /" isn't strictly necessary, since bots will crawl your site anyway. But you can use Allow directives to permit a specific path inside a disallowed directory.
  3. Adding your sitemaps to the robots.txt file is also good practice.

Here’s an example of an ideal Robots.txt file for WordPress.

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-content/plugins/
Disallow: /readme.html
Disallow: /trackback/
Disallow: /go/
Allow: /wp-admin/admin-ajax.php
Allow: /wp-content/uploads/
Sitemap: https://test.com/post-sitemap.xml
Sitemap: https://test.com/page-sitemap.xml

You can check RTB's robots.txt file here: https://test.com/robots.txt

Testing Robots.txt File in Google Webmaster Tools

After updating your robots.txt file, you should test it to check whether any content is affected by the change.

You can use Google Search Console to check whether there are any errors or warnings for your robots.txt file. Just log in to Google Search Console and select your site. Then go to Crawl > robots.txt Tester and click the "Submit" button.

A box will pop up. Just click the "Submit" button.

Then reload the page and check whether the file has been updated. It may take some time for Google to fetch the updated robots.txt file.

If it hasn't updated yet, you can paste your robots.txt code into the box to check for problems. Any errors and warnings will be shown there.

If you notice any errors or warnings, fix them by editing your robots.txt file.
