Custom Robots Header Tags and robots.txt: Optimize Blogger for Crawlers and Indexing

In this short guide, I will explain how to optimize Blogger for crawlers and indexing using custom robots header tags and a robots.txt file.

The robots.txt file tells search engine crawlers which parts of your site they may crawl and which they should skip.


With properly defined header tags, you can tell crawlers exactly how to handle your pages, which increases or decreases your search visibility accordingly.


Before we get started with optimizing your blog, you should be familiar with the header tags and their meanings.

 



Custom robots header tags and their meaning


  • all: This tag gives crawlers complete freedom; every part of your blog can be crawled and indexed.
  • noindex: If you do not want your blog to appear publicly, this tag prevents search engines from indexing it, so no one will find it through search results.
  • nofollow: Blogger offers this tag to mark all outgoing links on your blog as nofollow, telling crawlers not to follow them.
  • none: This tag is a combination of the noindex and nofollow tags.
  • noarchive: This tag controls whether search engines may cache your pages. Search engines keep a cached copy of your website and can display it on the search engine results page (SERP); the cache is a stored copy of your site that search engines can serve during downtime.
  • nosnippet: A snippet is the short preview text on the results page that gives people an idea of a page's content. Enabling this tag prevents search engines from displaying that snippet.
  • noodp: ODP stands for the "Open Directory Project", also known as DMOZ. This tag prevented search engines from replacing your description with information from that directory; since DMOZ shut down in 2017, the tag is obsolete.
  • notranslate: This tag tells search engines not to offer translations of your blog into other languages.
  • noimageindex: This tag prevents search engines from indexing your blog's images. Since images are a major part of blogging and can bring significant organic traffic, think twice before enabling it.
  • unavailable_after: This tag de-indexes a page after a date and time you specify.
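If you want to verify which directives a page actually serves, you can check both the X-Robots-Tag response header and the robots meta tag in the HTML. Here is a minimal sketch in Python, assuming a hypothetical blog address (substitute your own):

import re
import urllib.request

# Hypothetical address: replace with your own blog's URL.
URL = "https://yourblogname.blogspot.com/"

with urllib.request.urlopen(URL) as resp:
    # Directives may arrive as an HTTP response header...
    print("X-Robots-Tag:", resp.headers.get("X-Robots-Tag", "(not set)"))
    html = resp.read().decode("utf-8", errors="replace")

# ...or be embedded as a robots meta tag in the page <head>.
for tag in re.findall(r'<meta[^>]+name=["\']robots["\'][^>]*>', html, re.IGNORECASE):
    print(tag)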

 

How do I set custom robots header tags?


Go to Settings > Crawlers and indexing. There you will find the option Enable custom robots header tags; turn it on.

 

Set everything as in the screenshots below and click Save.

[Screenshot: enabling custom robots header tags in Blogger]

[Screenshot: recommended custom robots header tags settings]

What is the purpose of a robots.txt file, and how do you add it?


A robots.txt file is a plain text file, served at the root of your site, that instructs crawlers (spiders) which pages of the site they may and may not crawl. For example, Google's crawler (Googlebot) scans the internet and indexes the websites it is allowed to reach.
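Because the file always lives at /robots.txt, you can look at exactly what crawlers see. A minimal sketch in Python, assuming a hypothetical blog domain:

import urllib.request

# Hypothetical address: substitute your own blog's domain.
url = "https://yourblogname.com/robots.txt"

# A crawler requests this file first to learn the crawl rules.
with urllib.request.urlopen(url) as resp:
    print(resp.read().decode("utf-8"))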

How to configure robots.txt

Go to Settings > Crawlers and indexing and turn on Enable custom robots.txt, then click Custom robots.txt and paste the following:

If you use a blogspot subdomain or a custom domain name (replace yourblogname.com with your blog's address):

User-agent: *
Disallow: /search

Sitemap: https://yourblogname.com/sitemap.xml

If you also want your blog's static pages to be indexed:

User-agent: *
Disallow: /search

Sitemap: https://yourblogname.com/sitemap.xml
Sitemap: https://yourblogname.com/sitemap-pages.xml
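The Disallow: /search line blocks Blogger's search result and label pages (everything under /search), which would otherwise duplicate your post content, while the Sitemap lines tell crawlers where to find your sitemaps. To see how a compliant crawler evaluates these rules, here is a minimal sketch using Python's standard robotparser, again with the hypothetical domain from above:

from urllib.robotparser import RobotFileParser

# Hypothetical address: substitute your own blog's domain.
rp = RobotFileParser("https://yourblogname.com/robots.txt")
rp.read()  # download and parse the rules

# Post URLs are allowed; search and label result pages are not:
print(rp.can_fetch("*", "https://yourblogname.com/2024/01/my-post.html"))  # True
print(rp.can_fetch("*", "https://yourblogname.com/search/label/seo"))      # False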

You can use this custom robots.txt generator to create your robots.txt file.

All you have to do is enter the complete URL of your site (https://www.yoursitename.com) and click Generate Custom Robot.txt.

[Screenshot: the Enable custom robots.txt setting in Blogger]

[Screenshot: adding the custom robots.txt to Blogger]

Now go to Google Search Console: https://www.google.com/webmasters/tools/

You'll see something like this:

[Screenshot: adding a website to Google Search Console]

Enter the address of your site under URL prefix.

Once you've done that, you'll see the following:

[Screenshot: the "Go to property" confirmation in Google Search Console]

Open the property and you'll reach the dashboard:

[Screenshot: the Sitemaps report in Google Search Console]

Now all you have to do is add your sitemaps: click Sitemaps and submit the following entries, one at a time:

sitemap.xml
sitemap-pages.xml

[Screenshot: adding sitemap.xml in Google Search Console]

[Screenshot: adding sitemap-pages.xml in Google Search Console]

After adding the sitemaps to Google Search Console, it should look like this:

[Screenshot: sitemaps successfully added in Google Search Console]
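If a sitemap shows an error such as "Couldn't fetch", you can sanity-check it yourself before resubmitting. A minimal sketch in Python, once more assuming the hypothetical yourblogname.com domain:

import urllib.request
import xml.etree.ElementTree as ET

# Hypothetical address: substitute your own blog's domain.
SITEMAP = "https://yourblogname.com/sitemap.xml"

with urllib.request.urlopen(SITEMAP) as resp:
    root = ET.fromstring(resp.read())

# Collect every <loc> entry; Blogger's sitemap.xml can be an index
# that points to sub-sitemaps rather than listing posts directly.
ns = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
locs = [el.text for el in root.iter(ns + "loc")]
print(f"{len(locs)} entries found in {SITEMAP}")
for loc in locs[:5]:
    print(loc)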

Conclusion

Crawler and indexing settings play an essential role in the SEO of your blog posts. A sitemap makes it easier for search engines, in this case Google, to crawl all the pages on your blog.

When you publish a new blog post, you want it to be available to the general public on Google as soon as possible and ranked in search results.

Read also: How to Get Google to Index Your Blogger Site Faster

I am sure that if you follow the steps described above, you won't run into any problems. If you still have questions, write them in the comments section.


Jasmin K.

SEO Blogger Tips: The main goal of this site is to provide quality tutorials, tips, courses, tools, and other resources that allow anyone to work online and master digital marketing.

4 Comments

  1. Can I use two properties in Search Console?

     Reply: Yes, but only for different websites. It is not recommended for the same website (www.yoursite.com and yoursite.com); that way you create duplicate content, which can be bad for SEO.

  2. My sitemap is not showing; it says "couldn't fetch". Why is that?

  3. I have added the sitemap and it works, but the pages count shows zero. Can you teach me how to create pages for Blogger?