Closes #30

Introduces the "challenge" field in bot rule definitions:

```json
{
  "name": "generic-bot-catchall",
  "user_agent_regex": "(?i:bot|crawler)",
  "action": "CHALLENGE",
  "challenge": {
    "difficulty": 16,
    "report_as": 4,
    "algorithm": "slow"
  }
}
```

This makes Anubis return a challenge page for every user agent containing "bot" or "crawler" (case-insensitively), solved at difficulty 16 using the old "slow" algorithm but reported to the client as difficulty 4. This is useful when you want to make certain clients in particular suffer.

Additional validation and testing logic has been added to make sure that users do not define "impossible" challenge settings. If no algorithm is specified, Anubis defaults to the "fast" algorithm.

Signed-off-by: Xe Iaso <me@xeiaso.net>
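The validation mentioned above can be sketched roughly as follows. This is an illustrative sketch only, not Anubis's actual code: the `Challenge` type, `Valid` method, and the difficulty bounds are assumptions for demonstration; only the JSON field names and the "fast"/"slow" algorithm names come from the rule example above.

```go
package main

import "fmt"

// Challenge mirrors the new "challenge" block in a bot rule.
// Field names follow the JSON example above; the validation
// rules below are illustrative, not Anubis's actual checks.
type Challenge struct {
	Difficulty int    `json:"difficulty"`
	ReportAs   int    `json:"report_as"`
	Algorithm  string `json:"algorithm"`
}

// Valid rejects "impossible" settings: unknown algorithms and
// difficulties outside an assumed sane range. An empty algorithm
// falls back to "fast", matching the documented default.
func (c *Challenge) Valid() error {
	if c.Algorithm == "" {
		c.Algorithm = "fast"
	}
	switch c.Algorithm {
	case "fast", "slow":
		// known algorithms
	default:
		return fmt.Errorf("unknown algorithm %q", c.Algorithm)
	}
	if c.Difficulty < 1 || c.Difficulty > 64 {
		return fmt.Errorf("difficulty %d out of range [1, 64]", c.Difficulty)
	}
	return nil
}
```

Running every rule's challenge block through a check like this at config-load time surfaces bad settings before any client ever receives an unsolvable challenge.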
---
title: Proof-of-Work Algorithm Selection
---

Anubis offers two proof-of-work algorithms:

- `"fast"`: highly optimized JavaScript that will run as fast as your computer lets it
- `"slow"`: intentionally slow JavaScript that will waste time and memory
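A rule opts into an algorithm via its `challenge` block, following the rule shape shown in the pull request description. The rule name and regex below are hypothetical; omitting `"algorithm"` selects the default, `"fast"`:

```json
{
  "name": "slow-known-scraper",
  "user_agent_regex": "(?i:examplebot)",
  "action": "CHALLENGE",
  "challenge": {
    "difficulty": 16,
    "algorithm": "slow"
  }
}
```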

The fast algorithm is used by default to limit the impact on users' computers. Administrators may configure individual bot policy rules to use the slow algorithm in order to make known-malicious clients waitloop and do nothing useful.

Generally, you should use the fast algorithm unless you have a good reason not to.