Google had internal discussions about allowing the noindex

Post by samiaseo75 »

Google's John Mueller revealed in a recent LinkedIn post that Google has had internal discussions about supporting a noindex directive in robots.txt. Such a directive, which is currently non-standard, would allow publishers to block both crawling and indexing of content in one place.

Mueller explained that the idea of introducing a noindex directive was considered about 10 years ago. The goal was to make it easier for publishers to keep content out of the index without having to combine robots.txt rules with robots meta tags.
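For context, a publisher who wants a page both uncrawled and unindexed today has to juggle two separate mechanisms. A rough sketch (the paths below are illustrative, not from Mueller's post):

```
# robots.txt — blocks crawling, but NOT indexing;
# a blocked URL can still end up indexed via external links
User-agent: *
Disallow: /private/

<!-- robots meta tag in each page's HTML — blocks indexing,
     but only takes effect if crawlers are allowed to fetch the page -->
<meta name="robots" content="noindex">
```

The awkward part is that the two mechanisms conflict: the meta tag only works on pages the crawler can fetch, which is exactly what the robots.txt rule prevents. A robots.txt-level noindex would have collapsed this into one rule.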

Ultimately, Google decided not to support the directive, fearing that it could too easily lead to critical parts of a website being accidentally removed from the search index. Many people copy and paste robots.txt files without careful review and could inadvertently block important content.

Mueller's own robots.txt file has been causing a stir over the past week because of its unusual, non-standard directives. Some SEOs believe the file serves as a testing ground for experiments; others think Mueller is simply making mistakes.

Whatever the truth, this episode highlights the importance of following robots.txt standards. Using non-standard directives can lead to unpredictable results and make the job of search engine crawlers more difficult.
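To see how a non-standard directive behaves in practice, here is a minimal sketch using Python's standard urllib.robotparser. The hypothetical Noindex line and the example URLs are my own illustration; the point is that conforming parsers silently skip directives they don't recognize:

```python
from urllib.robotparser import RobotFileParser

# A robots.txt that mixes a standard Disallow rule
# with a hypothetical, non-standard Noindex directive.
robots_txt = """
User-agent: *
Noindex: /private/
Disallow: /admin/
""".strip().splitlines()

rp = RobotFileParser()
rp.parse(robots_txt)

# The standard Disallow rule is honored:
print(rp.can_fetch("*", "https://example.com/admin/page"))    # False

# The non-standard Noindex line is silently ignored,
# so the URL is treated as fully fetchable:
print(rp.can_fetch("*", "https://example.com/private/page"))  # True
```

Because unrecognized lines are dropped without any warning, a publisher relying on a directive like Noindex gets no signal that it did nothing, which is exactly the kind of unpredictability the standards are meant to avoid.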