How to: Force Search Engines to Quickly Index Your WordPress Site


Search engine bots are clever: they read the most important files on your blog, such as robots.txt and sitemap.xml. Usually they read robots.txt first (as Google does) to decide which parts of your site to index in the search results and which to exclude. This file can also list the URLs of your HTML or XML sitemaps, which point search engines to the whole structure of your site. I wanted to find my robots.txt file and edit it to add the sitemap URLs for better crawling. Today, we'll show you how to improve SEO using the robots.txt file in WordPress so your site ranks well in search engines.

Quick Indexing in WordPress Using the robots.txt File

There is also a WordPress plugin that lets you edit your blog's robots.txt file; to learn how, read this article. Here, though, we'll create a new robots.txt file that search engines will love by following the steps below:

1. Create a robots.txt file in the root directory of your site, where the following folders are located:

  • wp-admin
  • wp-content
  • wp-includes
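If you have shell access, the step above can be sketched as follows. The commands demonstrate the layout in a temporary directory for safety; on a real server you would work in your actual WordPress root, whose path varies by host (a common location is /var/www/html, but that is an assumption):

```shell
# Simulate a WordPress root in a temporary directory for this demo.
WP_ROOT=$(mktemp -d)
mkdir -p "$WP_ROOT"/wp-admin "$WP_ROOT"/wp-content "$WP_ROOT"/wp-includes

cd "$WP_ROOT"
ls -d wp-admin wp-content wp-includes   # the three core folders mark the root
touch robots.txt                        # robots.txt goes right alongside them
```

The file must live at the top level of your domain (yoursite.com/robots.txt); crawlers do not look for it in subdirectories.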

2. Add the following code to your robots.txt file; it is written especially for a WordPress blog:

User-agent: *
Disallow: /cgi-bin
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /tag

User-agent: Mediapartners-Google
Allow: /

User-agent: AdsBot-Google
Allow: /

User-agent: Googlebot-Image
Allow: /wp-content/uploads/

User-agent: Googlebot-Mobile
Allow: /


Note: the robots.txt file tells crawlers which directories should or shouldn't be crawled. Don't change the code above unless you know what you're doing. To include your HTML sitemap in robots.txt, paste its URL on a line below the XML one, like this:


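The exact URLs depend on your site; here is a sketch using a placeholder domain (example.com and the file names are assumptions — substitute your own sitemap locations):

```
Sitemap: http://example.com/sitemap.xml
Sitemap: http://example.com/sitemap.html
```

Note that the Sitemap directive officially expects an XML (or similar machine-readable) sitemap; listing an HTML sitemap this way is a common but non-standard practice, and not every crawler will use it.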
Add this code to header.php

The purpose of adding the following code to your WordPress blog's header.php file is to tell search engine bots not to list 404 "not found" pages in search results; such pages tend to increase the bounce rate and give a bad user experience.

<?php if(is_single() || is_page() || is_category() || is_home()) { ?>
<meta name="robots" content="all,noodp" />
<?php } ?>
<?php if(is_archive()) { ?>
<meta name="robots" content="noarchive,noodp" />
<?php } ?>
<?php if(is_search() || is_404()) { ?>
<meta name="robots" content="noindex,noarchive" />
<?php } ?>
