robocop
Crawler meta tag
A Rack middleware that adds a header to HTTP responses instructing search engines how to crawl and index the content.
Robocop is a simple Rack middleware that inserts the X-Robots-Tag header into all of your responses.
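As a rough illustration of the technique, the sketch below shows a minimal Rack middleware that sets an X-Robots-Tag header on every response. The class name, constructor arguments, and default directives are assumptions for this example, not Robocop's actual API.

```ruby
# Illustrative sketch only: class name, options, and defaults are
# hypothetical, not Robocop's real interface.
class RobotsTag
  # app        - the next Rack application in the chain
  # directives - the value to place in the X-Robots-Tag header
  def initialize(app, directives = "noindex, nofollow")
    @app = app
    @directives = directives
  end

  def call(env)
    status, headers, body = @app.call(env)
    headers["X-Robots-Tag"] = @directives
    [status, headers, body]
  end
end
```

In a `config.ru`, such a middleware would be mounted with `use RobotsTag, "noindex, noarchive"` ahead of the application, so the header is applied to every response the app returns.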
3 stars
1 watching
0 forks
Language: Ruby
last commit: about 12 years ago
Linked from 1 awesome list
Related projects:
| Repository | Description | Stars |
|---|---|---|
| | A flexible web crawler that follows robots.txt policies and crawl delays. | 787 |
| | Generates meta tags based on page content using Rack middleware. | 2 |
| | A Ruby on Rails plugin for generating and managing meta tags. | 440 |
| | A concurrent web crawler written in Go that allows flexible and polite crawling of websites. | 2,036 |
| | Generates standardized metadata tags for search engines and social networks to improve website indexing and display. | 1,663 |
| | A Ruby gem for web scraping and extracting metadata from web pages. | 1,038 |
| | A web crawler that obeys robots.txt rules, rate limits, and concurrency limits, with customizable content handlers for parsing and processing crawled pages. | 380 |
| | A Rust-based library for building and managing cryptocurrency crawlers. | 235 |
| | An OSINT bot that crawls pastebin sites to search for sensitive data leaks. | 634 |
| | Extracts metadata from public documents found on websites, useful for brute-force attacks. | 1,050 |
| | A Ruby-based tool for web crawling and data extraction, aiming to be a replacement for paid software in the SEO space. | 143 |
| | A Rails gem for managing tags with PostgreSQL array columns in a flexible and efficient way. | 53 |
| | A Ruby web crawling library that provides flexible and customizable methods to crawl websites. | 809 |
| | Performs web page crawling at high performance. | 51 |
| | A Pythonic framework for building high-speed web crawlers with flexible data extraction and storage options. | 188 |