Robots.txt is a plain-text file on your server that gives instructions to search engine bots. It tells them which directories, web pages, or links of your website or blog should or should not be crawled and indexed in search results, so you can keep bots away from selected parts of your site. Custom robots.txt is now available for Blogspot blogs.
In Blogger, the search option is tied to labels. If you are not using labels wisely, you should disallow crawling of the search result pages; by default, Blogger's robots.txt already disallows the /search path. The robots.txt file can also state the location of your sitemap. A sitemap is a file on the server that lists the permalinks of every post on your website or blog, most often in XML format, i.e., sitemap.xml.
Blogger has since finished its work on sitemap.xml and now reads sitemap entries through the blog feed. With this method, only the most recent 25 posts are submitted to search engines. If you only want search engine bots to work on the most recent 25 posts, use the robots.txt given below. With this configuration, the Google AdSense bot can still crawl the entire blog for the best AdSense performance.
Robots.txt
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://www.xdflix.com/sitemap.xml
Sitemap for blogs with more than 150 posts
Internally, the XML sitemap generator counts all the posts available on your Blogger blog, splits them into batches of up to 150 posts each, and generates a separate XML feed for each batch. Search engines can therefore discover every single post on your blog, since each post is part of one of these XML sitemaps.
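The batching above can be sketched in a few lines of Python. This is a minimal illustration, assuming a fixed batch size of 150 posts and a `?page=N` style of paged sitemap URL; the exact URL scheme Blogger uses internally may differ.

```python
# Sketch: how sitemap batching could be computed for a blog with N posts.
# Assumptions (illustrative, not Blogger's documented API):
#   - posts are split into batches of up to 150
#   - each batch is exposed as a paged sitemap URL like sitemap.xml?page=N
import math

BATCH_SIZE = 150

def sitemap_pages(total_posts, base="https://www.xdflix.com/sitemap.xml"):
    """Return one paged sitemap URL per batch of posts."""
    if total_posts <= 0:
        return []
    pages = math.ceil(total_posts / BATCH_SIZE)
    return [f"{base}?page={i}" for i in range(1, pages + 1)]

# A blog with 320 posts needs 3 sitemap pages (150 + 150 + 20).
print(sitemap_pages(320))
```

So a blog with 150 posts or fewer stays within a single sitemap file, and each additional batch of 150 adds one more.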
Manage Blogger custom robots.txt
To enable it, please follow these steps carefully: Dashboard ›› Blog’s Settings ›› Search Preferences ›› Crawlers and indexing ›› Custom robots.txt ›› Edit ›› Yes. Then paste your robots.txt rules into the text box and save the changes.
And we are done. Search engines will automatically discover your XML sitemap files via the robots.txt file and you don’t have to ping them manually.
Test your robots.txt with the robots.txt Tester
The robots.txt Tester tool shows you whether your robots.txt file blocks Google web crawlers from specific URLs on your site. For example, you can use this tool to test whether the Googlebot-Image crawler can crawl the URL of an image you wish to block from Google Image Search.