August 5, 2019
Google has ended support for the noindex rule in robots.txt, and from September 1, 2019, the rule will stop working entirely. In this blog, we would like to share 5 alternative ways to keep particular pages or directories out of the index. Here is how you can do it:
- Meta tag: add a noindex robots meta tag to the page's HTML code
- Return a 404 or 410 HTTP status code for the page
- Password-protect the particular pages or directories
- Use a Disallow rule in robots.txt
- Use the Remove URLs tool in Google Search Console (formerly Google Webmaster Tools)
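The first option, the noindex meta tag, is a single line placed in the page's `<head>`. A minimal example might look like this (the page path is just an illustration):

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Tells crawlers not to include this page in their index -->
  <meta name="robots" content="noindex">
  <title>Private page</title>
</head>
<body>...</body>
</html>
```

For non-HTML files such as PDFs, the same effect can be achieved with the `X-Robots-Tag: noindex` HTTP response header.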
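For the Disallow approach, a minimal robots.txt blocking a hypothetical directory might look like this (note that Disallow blocks crawling, so a page could still appear in results if other sites link to it):

```
User-agent: *
Disallow: /private/
```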