Google limits robots.txt support to 4 fields, clarifying its stance on unsupported directives.
- Google only supports 4 specific robots.txt fields.
- Unsupported directives in robots.txt will be ignored.
- Consider auditing your robots.txt files in light of this update.

In a recent update to its Search Central documentation, Google has clarified its position on unsupported fields in robots.txt files.
Key Update
Google has stated that its crawlers don't support fields not listed in its robots.txt documentation.
This clarification is part of Google's efforts to provide unambiguous guidance to website owners and developers.
Google states:
“We sometimes get questions about fields that aren’t explicitly listed as supported, and we want to make it clear that they aren’t.”
This update should eliminate confusion and prevent websites from relying on unsupported directives.
What This Means:
- Stick to Supported Fields: Use only the fields explicitly mentioned in Google’s documentation.
- Review Existing Robots.txt Files: Audit current robots.txt files to ensure they don’t contain unsupported directives (see the sketch after this list).
- Understand Limitations: Google’s crawlers may not recognize certain third-party or custom directives.
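As a quick way to run such an audit, here is a minimal Python sketch. It is not an official tool: the supported-field list is taken from Google's documentation as described in this article, and the sample file contents are hypothetical.

```python
# Minimal sketch: flag robots.txt fields Google documents as unsupported.
# The field list reflects Google's documentation at the time of writing;
# verify against Search Central before relying on it.

GOOGLE_SUPPORTED = {"user-agent", "allow", "disallow", "sitemap"}

def audit_robots_txt(content: str) -> list[tuple[int, str]]:
    """Return (line_number, field) pairs for fields Google won't act on."""
    findings = []
    for lineno, raw in enumerate(content.splitlines(), start=1):
        line = raw.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line or ":" not in line:
            continue
        field = line.split(":", 1)[0].strip().lower()
        if field not in GOOGLE_SUPPORTED:
            findings.append((lineno, field))
    return findings

# Hypothetical robots.txt content for demonstration.
sample = """\
User-agent: *
Crawl-delay: 10
Disallow: /private/
Host: example.com
Sitemap: https://example.com/sitemap.xml
"""

for lineno, field in audit_robots_txt(sample):
    print(f"line {lineno}: '{field}' is not supported by Google's crawlers")
```

Run against the sample above, this flags “crawl-delay” and “host”, both of which other crawlers may honor but Google will ignore.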
Supported Fields:
According to the updated documentation, Google officially supports the following fields in robots.txt files:
- user-agent
- allow
- disallow
- sitemap
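For reference, a minimal robots.txt that sticks to the four supported fields might look like this (the domain and paths are placeholders):

```
User-agent: *
Disallow: /private/
Allow: /private/public-page.html

Sitemap: https://example.com/sitemap.xml
```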
Notable Omissions:
While not explicitly stated, this clarification implies that Google doesn’t support commonly used directives like “crawl-delay,” though other search engines may recognize them.
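For example, a rule like the one below is honored by some other crawlers but, per this clarification, ignored by Googlebot (the delay value shown is illustrative):

```
User-agent: *
Crawl-delay: 10
```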
Additionally, it’s worth noting that Google is phasing out support for the ‘noarchive’ directive.
Looking Ahead:
This update is a reminder to stay current with official guidelines and best practices.
It highlights the need to use documented features rather than assuming support for undocumented directives.
Consult Google’s official Search Central documentation for more detailed information on robots.txt implementation and best practices.