How to Use a Custom Robots.txt for More Than 500 Blog Posts on Blogger? Blogger Better SEO | Organic Traffic


Hi beginners! Here are some tips on how to use a custom robots.txt for more than 500 blog posts on your Blogger (Blogspot) blog, for better SEO and organic traffic.


Custom Robots.txt for Best SEO


SEO Highlights 

  • What is Robots.txt?
  • Robots.txt Method 1: the default
  • Robots.txt Method 2: the most recent posts
  • Robots.txt Method 3: more than 500 posts

Robots.txt is a text file on the server that you can customize for search engine bots. It tells search engine bots which directories, web pages, or links should or should not be indexed in search results. This means you can restrict search engine bots from crawling certain directories, web pages, or links of your website or blog.

A custom robots.txt is now available for Blogspot. In Blogger, the search option is tied to labels, so if you are not using labels wisely, you should disallow crawling of the search result pages. By default, Blogger's search link is disallowed from crawling. In robots.txt you can also declare the location of your sitemap file. A sitemap is a file on the server that contains the permalinks of all posts on your website or blog. Most sitemaps are in XML format, e.g. sitemap.xml.
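To see how these rules behave, here is a quick sketch (not part of the original post) using Python's built-in urllib.robotparser. The blog address is the example domain used throughout this post, and the sample post URL is made up for illustration:

```python
import urllib.robotparser

# The same rules used in the methods below (sitemap line omitted,
# since robotparser only evaluates allow/disallow rules).
rules = """
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Disallow: /b
Allow: /
"""

parser = urllib.robotparser.RobotFileParser()
parser.modified()  # mark the rules as loaded so can_fetch() works
parser.parse(rules.splitlines())

# Ordinary crawlers may fetch posts but not label/search pages:
print(parser.can_fetch("*", "https://www.ramrw7.blogspot.com/2021/01/my-post.html"))  # True
print(parser.can_fetch("*", "https://www.ramrw7.blogspot.com/search/label/SEO"))      # False

# The AdSense bot (Mediapartners-Google) can crawl everything:
print(parser.can_fetch("Mediapartners-Google", "https://www.ramrw7.blogspot.com/search/label/SEO"))  # True
```

An empty Disallow line means "allow everything" for that user agent, which is why the AdSense bot can still reach the search pages that other crawlers are blocked from.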

Also check: Best 5 Ways To Optimize Your Blog Posts to get more views (Organic Traffic) | Best SEO for Blogger 2021

How does a custom robots.txt work?

Blogger has completed its work on sitemap.xml, and it now reads sitemap entries through the feed. With this method, only the most recent 25 posts are submitted to search engines. If you want search engine bots to work only on the most recent 25 posts, use Method 1 given below. If you set robots.txt like this, the Google AdSense bot can still crawl the entire blog for the best AdSense performance and page ranking.

Custom Robots.txt Method 1

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Disallow: /b
Allow: /

Sitemap: https://www.ramrw7.blogspot.com/sitemap.xml


Custom Robots.txt Method 2


User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Disallow: /b
Allow: /

Sitemap: https://www.ramrw7.blogspot.com/feeds/posts/default?orderby=updated


Note: Don’t forget to replace https://www.ramrw7.blogspot.com with your own blog address or custom domain address. If you want search engine bots to crawl up to the most recent 500 posts, use the first Sitemap line shown in Method 3 below. If your blog already has more than 500 posts, add a second Sitemap line covering the next index range, as shown in Method 3.


Custom Robots.txt Method 3 for more than 500 posts

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Disallow: /b
Allow: /

Sitemap: https://www.ramrw7.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
Sitemap: https://www.ramrw7.blogspot.com/atom.xml?redirect=false&start-index=501&max-results=1000
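Since each Sitemap line covers a 500-post "page" of the Atom feed via start-index and max-results, here is a small helper sketch (my own, not from the post) that prints all the Sitemap lines needed for a blog of any size:

```python
def sitemap_lines(blog_url, total_posts, page_size=500):
    """Build one Sitemap line per 500-post page of Blogger's Atom feed."""
    lines = []
    for start in range(1, total_posts + 1, page_size):
        lines.append(
            f"Sitemap: {blog_url}/atom.xml"
            f"?redirect=false&start-index={start}&max-results={page_size}"
        )
    return lines

# Example: a blog with 1200 posts needs three Sitemap lines
# (start-index 1, 501, and 1001).
for line in sitemap_lines("https://www.ramrw7.blogspot.com", 1200):
    print(line)
```

Paste the printed lines at the end of your robots.txt, after the Allow: / rule, just as in Method 3 above.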


Websites for making a custom robots.txt

https://www.labnol.org/blogger/sitemap/

https://www.xml-sitemaps.com/


Manage Blogger custom robots.txt

These are very important steps, so follow them carefully: Dashboard ›› Blog’s Settings ›› Search Preferences ›› Crawlers and indexing ›› Custom robots.txt ›› Edit ›› Yes. For a better understanding of where to paste the custom robots.txt, see the image below:


(Image: Custom robots.txt for better SEO)



If you have any doubts, please let me know in the comments.
