Robots exclusion standard (robots.txt)
Elastic App Search web crawler
When the crawler cannot validate the SSL certificate chain of the site it is crawling, it fails to fetch robots.txt and reports an error such as:

Failed to fetch robots.txt: SSL certificate chain is invalid [unable to find valid certification path to requested target]. Make sure your SSL certificate chain is correct. For self-signed certificates or certificates signed with unknown certificate authorities, you can add your signing certificate to Enterprise Search Crawler configuration. Alternatively, you can disable SSL certificate validation (non-production environments only).
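Of the two remedies, adding the signing certificate to the trusted configuration is the safer choice outside test environments. The sketch below is illustrative Python, not the crawler's own code: it fetches robots.txt while verifying the server against an explicitly supplied CA file. The URL and the ca.pem path are placeholder values.

# Minimal sketch: fetch robots.txt over HTTPS while trusting a private or
# self-signed signing certificate, instead of disabling validation.
# "ca.pem" and the URL are hypothetical placeholders.
import ssl
import urllib.request

ROBOTS_URL = "https://intranet.example.com/robots.txt"

# Build an SSL context that verifies the server against the given CA file.
context = ssl.create_default_context(cafile="ca.pem")

with urllib.request.urlopen(ROBOTS_URL, context=context, timeout=10) as resp:
    robots_txt = resp.read().decode("utf-8", errors="replace")

print(robots_txt)

Disabling verification instead would amount to setting context.check_hostname = False and context.verify_mode = ssl.CERT_NONE, which is why that option is advisable only in non-production environments.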
Rules in robots.txt are grouped under User-agent: lines. Each group names the crawler its Disallow and Allow directives apply to, and User-agent: * matches any crawler that has no more specific group of its own.
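As an illustration of how these groups are evaluated, the following Python sketch uses the standard library's robots.txt parser; the rules and the crawler names somebot and examplebot are made up for the example.

# Minimal sketch: how User-agent groups in robots.txt decide what a given
# crawler may fetch, using Python's standard-library parser.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /private/

User-agent: examplebot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# The wildcard group applies to crawlers without a more specific group.
print(parser.can_fetch("somebot", "https://example.com/public/page"))    # True
print(parser.can_fetch("somebot", "https://example.com/private/page"))   # False

# A group naming the crawler's user agent takes precedence over "*".
print(parser.can_fetch("examplebot", "https://example.com/public/page")) # False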