We have hosted the application robotstxt so that it can be run on our online workstations, either with Wine or directly.
Quick description of robotstxt:
This is a high-performance, production-tested library for parsing and evaluating robots.txt rules against crawler user agents. It implements the core semantics of the Robots Exclusion Protocol: user-agent sections, Allow/Disallow directives, wildcard handling, and precedence rules. The code is optimized for speed and low memory use, so large crawls can evaluate millions of URLs quickly. It also focuses on correctness: edge cases such as overlapping patterns and longest-match resolution are handled consistently. Consumers integrate it to decide whether a particular bot may fetch a specific URL, and to respect Crawl-delay or Sitemap hints where applicable. The library serves both search-scale crawlers and smaller tools that need a reliable decision engine for polite crawling.
Features:
- Fast parser and matcher for Allow/Disallow rules
- Correct handling of wildcards and longest-match precedence
- User-agent specific rule sections with sensible fallbacks
- Low-overhead evaluation for high-throughput crawlers
- Support for common extensions like Sitemap hints
- Clear API to check URL fetch permissions per bot name (see the usage sketch below)
Programming Language: C++.
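Below is a minimal usage sketch in C++. It assumes the hosted library is Google's open-source robotstxt parser (google/robotstxt), which matches the description above; if a different robots.txt library is hosted here, the header and class names will differ. The robots.txt rules, bot name, and URLs are illustrative only.

```cpp
// Minimal sketch, assuming the google/robotstxt library and its robots.h header.
#include <iostream>
#include <string>

#include "robots.h"  // provides googlebot::RobotsMatcher (google/robotstxt)

int main() {
  // Example rules: for URLs under /public/, the longer "Allow: /public/"
  // pattern outranks the shorter "Disallow: /p" prefix (longest-match wins).
  const std::string robots_txt =
      "User-agent: *\n"
      "Disallow: /private/\n"
      "Disallow: /p\n"
      "Allow: /public/\n";

  googlebot::RobotsMatcher matcher;

  const std::string bot = "ExampleBot";  // hypothetical crawler name
  const std::string allowed_url = "https://example.com/public/page.html";
  const std::string blocked_url = "https://example.com/private/data.html";

  std::cout << std::boolalpha
            << matcher.OneAgentAllowedByRobots(robots_txt, bot, allowed_url)
            << "\n"   // true: "Allow: /public/" is the longest matching rule
            << matcher.OneAgentAllowedByRobots(robots_txt, bot, blocked_url)
            << "\n";  // false: "Disallow: /private/" applies
  return 0;
}
```

The same matcher object can be reused across many URL checks, which is what keeps per-URL evaluation cheap in high-throughput crawls.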