robots.txt for Confluence

for Confluence Server 6.0.1 - 7.12.3 and more
6 installs
  • Supported

Avoid overloading your Confluence with undesired requests

Add robots.txt for Confluence

The robots.txt file informs search engines and other crawlers which pages and resources they can or can't request.
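For example, a robots.txt file served at the root of a Confluence site might look like the sketch below. The paths are illustrative assumptions, not a recommended configuration for every install:

```
# Apply to all crawlers
User-agent: *
# Example: block crawlers from transient or login pages
Disallow: /dosearchsite.action
Disallow: /login.action
# Everything else remains crawlable
Allow: /
```

Crawlers that follow the Robots Exclusion Protocol fetch this file before requesting other pages and skip the disallowed paths.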

Avoid overloading your Confluence with requests

Do search engines overload your Confluence? Inform search engines and other crawlers which pages and resources they can or can't request.

Set the Crawl speed of robots

With the Crawl-delay property, you can tell robots to crawl no more than one page every x seconds.
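A minimal sketch of such a rule:

```
User-agent: *
# Ask compliant crawlers to wait at least 10 seconds between requests
Crawl-delay: 10
```

Note that Crawl-delay is honored by some crawlers (such as Bingbot) but ignored by others, including Googlebot, so it reduces load only from robots that respect it.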

Privacy and security

Privacy policy

Atlassian's privacy policy is not applicable to the use of this app. Please refer to the privacy policy provided by this app's vendor.

Vendor privacy policy

Security

This app is not part of the Marketplace Bug Bounty program.

Resources