
Prevent Search Engines From Indexing User Specific Sitemaps

Google XML Sitemaps helps websites get indexed by letting search engine bots crawl your content regularly. But that is not entirely a good idea: it can also have an unwanted side effect on your other web pages.

Just try searching filetype:xml inurl:sitemap on Google and look at the results: sitemap files themselves get indexed and appear in search, exposing even web pages that would otherwise be hard to find.

One of the most prominent SEO tips is the use of the robots META tag, but that tag can only be placed inside an HTML page; an XML sitemap has nowhere to put it, so you need another option.

Use of the "noindex" X-Robots-Tag
Using this HTTP header you can stop bots from indexing particular content. On your Apache web server, go to the directory that contains your sitemap files, open the ".htaccess" file there, and add the following lines to it.
<IfModule mod_headers.c>
 <Files sitemapfile.xml>
  Header set X-Robots-Tag "noindex"
 </Files>
</IfModule>

Change 'sitemapfile.xml' to the actual name of your sitemap file.
For websites that use multiple sitemaps, use the following lines instead.

<IfModule mod_headers.c>
 <Files ~ "^(sitemapfile1|sitemapfile2|sitemapfile3)\.xml$">
  Header set X-Robots-Tag "noindex"
 </Files>
</IfModule>

Once you have made the changes, save the file, then go to URI Valet, enter your XML sitemap's URL, and check the response headers. If you see an X-Robots-Tag header with the value noindex in the response, that particular sitemap is blocked from being indexed by search engines; a small script for running this check yourself is sketched below.
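Here is a minimal Python sketch of that check; the example.com sitemap URLs are placeholders for illustration, so substitute the URLs of your own sitemaps.

import urllib.request

# Placeholder sitemap URLs; replace these with your own sitemap locations.
sitemap_urls = [
    "https://example.com/sitemapfile1.xml",
    "https://example.com/sitemapfile2.xml",
]

for url in sitemap_urls:
    with urllib.request.urlopen(url) as response:
        # Look for the X-Robots-Tag header set by the .htaccess rules above.
        tag = response.headers.get("X-Robots-Tag")
        if tag and "noindex" in tag.lower():
            print(url, "-> noindex is set:", tag)
        else:
            print(url, "-> X-Robots-Tag missing or not noindex:", tag)

If each sitemap URL reports noindex, the .htaccess rules are working as intended.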
You have now successfully implemented a "noindex" X-Robots-Tag on your website.