Commit 2dfcbaa (1 parent: 5f6a9b6)

Another attempt at fixing imports; rough version of block unwanted requests

Showing 5 changed files with 67 additions and 64 deletions.
---
title: Block Unwanted Requests Examples
sidebar_label: Block Unwanted Requests
---

With Traffic Policy, you can block unwanted requests to your endpoints. This page demonstrates a few examples of doing so.

### Deny traffic from Tor networks

Use connection variables available in [IP Intelligence](/docs/traffic-policy/variables/ip-intel) to block Tor exit node IPs.

```yaml traffic-policy
on_http_request:
  - expressions:
      - "'proxy.anonymous.tor' in conn.client_ip.categories"
    actions:
      - type: deny
        config:
          status_code: 403
```

### Disallow bots and crawlers with a `robots.txt` | ||

This rule returns a custom response with a [`robots.txt` file](https://developers.google.com/search/docs/crawling-indexing/robots/intro) to deny search engine or AI crawlers on all paths.

<AddRobotsTxt />
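
The full rule is rendered by the `<AddRobotsTxt />` component above. As a rough sketch only (not the component's exact contents), a policy that serves a deny-all `robots.txt` might combine a path match with the `custom-response` action:

```yaml traffic-policy
# Sketch: serve a "Disallow: /" robots.txt to every crawler that asks for it.
on_http_request:
  - expressions:
      - "req.url.path == '/robots.txt'"
    actions:
      - type: custom-response
        config:
          status_code: 200
          headers:
            content-type: "text/plain"
          body: "User-agent: *\r\nDisallow: /"
```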

You can also extend the expression above to create specific rules for crawlers based on their user agent strings, like `ChatGPT-User` and `GPTBot`.

<AddRobotsTxtSpecific />
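
Again as a sketch rather than the component's exact contents, a crawler-specific variant might return per-agent `robots.txt` entries instead of a blanket deny:

```yaml traffic-policy
# Sketch: robots.txt entries targeting named AI crawlers only.
on_http_request:
  - expressions:
      - "req.url.path == '/robots.txt'"
    actions:
      - type: custom-response
        config:
          status_code: 200
          headers:
            content-type: "text/plain"
          body: "User-agent: ChatGPT-User\r\nDisallow: /\r\nUser-agent: GPTBot\r\nDisallow: /"
```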

### Block bots and crawlers by user agent

In addition to, or instead of, denying bots and crawlers with a `robots.txt` file, you can act only on incoming requests that contain specific strings in the [`req.user_agent` request variable](/docs/http/traffic-policy/expressions/variables.mdx#requser_agent).

You can extend the expression to include additional user agents by extending `(chatgpt-user|gptbot)` like so: `(chatgpt-user|gptbot|anthropic|claude|any|other|user-agent|goes|here)`.

<BlockSpecificBots />
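
A minimal sketch of such a rule (the `<BlockSpecificBots />` component holds the actual example) could pair a case-insensitive regular-expression match on `req.user_agent` with the `deny` action:

```yaml traffic-policy
# Sketch: deny requests whose User-Agent matches known AI crawlers.
# The (?i) flag makes the match case-insensitive.
on_http_request:
  - expressions:
      - "req.user_agent.matches('(?i).*(chatgpt-user|gptbot).*')"
    actions:
      - type: deny
        config:
          status_code: 403
```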

### Deny non-GET requests

This rule denies all inbound traffic that is not a GET request.

<Deny />
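
As a sketch of what the rule in `<Deny />` might look like, a single expression on `req.method` suffices; the `405` status code below is an assumption, and a `403` would work equally well:

```yaml traffic-policy
# Sketch: reject any HTTP method other than GET.
on_http_request:
  - expressions:
      - "req.method != 'GET'"
    actions:
      - type: deny
        config:
          status_code: 405
```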

### Custom response for unauthorized requests

This rule sends a custom response with status code `401` and body `Unauthorized` for requests without an `Authorization` header.

<CustomResponse />
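
A sketch of this rule (the exact example lives in `<CustomResponse />`) might test for the header's presence in the request header map; the lowercase `authorization` key is an assumption about how header names are normalized:

```yaml traffic-policy
# Sketch: return 401 Unauthorized when no Authorization header is present.
on_http_request:
  - expressions:
      - "!('authorization' in req.headers)"
    actions:
      - type: custom-response
        config:
          status_code: 401
          body: "Unauthorized"
```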

### Block traffic from specific countries

Remain compliant with data regulations or sanctions by blocking requests originating from one or more countries using their respective [ISO country codes](https://en.wikipedia.org/wiki/List_of_ISO_3166_country_codes).

<BlockCountries />
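
As a rough sketch (the actual rule is in `<BlockCountries />`), a geo-based deny could check the connection's country code against a list; both the `conn.geo.country_code` variable name and the placeholder codes `'XX'` and `'YY'` are assumptions to replace with real ISO codes:

```yaml traffic-policy
# Sketch: deny connections originating from the listed countries.
# 'XX' and 'YY' are placeholders for real ISO 3166 country codes.
on_http_request:
  - expressions:
      - "conn.geo.country_code in ['XX', 'YY']"
    actions:
      - type: deny
        config:
          status_code: 403
```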

### Limit request sizes

Prevent excessively large user uploads, like text or images, that might cause performance or availability issues for your upstream service.

<LimitSize />
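
A sketch of such a rule (see `<LimitSize />` for the real example) might compare the request's content length against a byte threshold; the `req.content_length` variable name and the 5 MB limit are assumptions:

```yaml traffic-policy
# Sketch: reject request bodies larger than ~5 MB with 413 Content Too Large.
on_http_request:
  - expressions:
      - "req.content_length > 5000000"
    actions:
      - type: custom-response
        config:
          status_code: 413
          body: "Request body too large"
```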