Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/logocomune/botdetector
BotDetector is a golang library that detects Bot/Spider/Crawler from user agent
Last synced: 4 days ago
- Host: GitHub
- URL: https://github.com/logocomune/botdetector
- Owner: logocomune
- License: mit
- Created: 2020-03-04T22:03:44.000Z (over 4 years ago)
- Default Branch: main
- Last Pushed: 2024-10-04T20:00:50.000Z (about 1 month ago)
- Last Synced: 2024-10-05T06:54:34.539Z (about 1 month ago)
- Topics: botdetector, bots, crawler, go, golang, golang-library, spider, user-agent
- Language: Go
- Homepage:
- Size: 6.72 MB
- Stars: 7
- Watchers: 2
- Forks: 1
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
README
# BotDetector
[![Build Status](https://app.travis-ci.com/logocomune/botdetector.svg?branch=master)](https://app.travis-ci.com/logocomune/botdetector)
[![Go Report Card](https://goreportcard.com/badge/github.com/logocomune/botdetector)](https://goreportcard.com/report/github.com/logocomune/botdetector)
[![codecov](https://codecov.io/gh/logocomune/botdetector/branch/master/graph/badge.svg)](https://codecov.io/gh/logocomune/botdetector)

BotDetector is a Go library that detects bots, spiders, and crawlers from user agents.
## Installation
`go get -u github.com/logocomune/botdetector/v2`
## Usage
### Simple usage
```go
userAgent := req.Header.Get("User-Agent")

detector, _ := botdetector.New()

isBot := detector.IsBot(userAgent)
if isBot {
	log.Println("Bot, Spider or Crawler detected")
}
```
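For reference, here is a minimal self-contained sketch of the snippet above wired into `net/http`. It only relies on the `botdetector.New` and `IsBot` calls shown in this README; the route and port are placeholders for the example.

```go
package main

import (
	"fmt"
	"log"
	"net/http"

	"github.com/logocomune/botdetector/v2"
)

func main() {
	// Build the detector once at startup and reuse it for every request.
	detector, err := botdetector.New()
	if err != nil {
		log.Fatal(err)
	}

	http.HandleFunc("/", func(w http.ResponseWriter, req *http.Request) {
		// Check the incoming User-Agent header, as in the snippet above.
		if detector.IsBot(req.Header.Get("User-Agent")) {
			log.Println("Bot, Spider or Crawler detected")
		}
		fmt.Fprintln(w, "ok")
	})

	log.Fatal(http.ListenAndServe(":8080", nil))
}
```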
### Adding Custom Rules
You can add custom detection rules with the `WithRules` method. For example:
```go
userAgent := req.Header.Get("User-Agent")

detector, _ := botdetector.New(WithRules([]string{"my rule", "^test"}))

isBot := detector.IsBot(userAgent)
if isBot {
	log.Println("Bot, Spider or Crawler detected")
}
```
Custom Rule Patterns:
| pattern | description |
|---------|-----------------------------------------------------------|
| "..." | Checks if the string contains the specified pattern. |
| "^..." | Checks if the string starts with the specified pattern. |
| "...$" | Checks if the string ends with the specified pattern. |
| "^...$" | Checks if the string strictly matches the entire pattern. |In this example, the custom rules "my rule" and "^test" are added to the existing detection rules.
### Adding Cache
You can add an LRU cache with the `WithCache` method. For example:

```go
userAgent := req.Header.Get("User-Agent")

detector, _ := botdetector.New(WithCache(1000))

isBot := detector.IsBot(userAgent)
if isBot {
	log.Println("Bot, Spider or Crawler detected")
}
```
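Functional options like these can typically be combined in a single constructor call. The sketch below pairs a custom rule with the cache, again assuming the options are exported as `botdetector.WithRules` and `botdetector.WithCache`; the rule and user agent are placeholders.

```go
package main

import (
	"log"

	"github.com/logocomune/botdetector/v2"
)

func main() {
	// Combine a custom rule with a 1000-entry LRU cache. Assumes the options
	// are exported as botdetector.WithRules and botdetector.WithCache.
	detector, err := botdetector.New(
		botdetector.WithRules([]string{"^internal-healthcheck"}),
		botdetector.WithCache(1000), // cache lookups for recently seen user agents
	)
	if err != nil {
		log.Fatal(err)
	}

	log.Println(detector.IsBot("internal-healthcheck/1.2"))
}
```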
### Example
[Simple example](_example/main.go)
## Inspiration
BotDetector is inspired by [CrawlerDetect](https://github.com/JayBizzle/Crawler-Detect), an awesome PHP project.