Commit 2dfcbaa

Another attempt at fixing imports; rough version of block unwanted requests

S3Prototype committed Feb 7, 2025
1 parent 5f6a9b6 commit 2dfcbaa
Showing 5 changed files with 67 additions and 64 deletions.
2 changes: 1 addition & 1 deletion docs/traffic-policy/examples/add-authentication.mdx
@@ -6,7 +6,7 @@ import { JWTsRateLimiting } from "/traffic-policy/gallery/JWTsRateLimiting";
import { OAuthConditionalAccess } from "/traffic-policy/gallery/OAuthConditionalAccess";
import { OIDCIdentityToken } from "/traffic-policy/gallery/OIDCIdentityToken";

You can use Traffic Policy to add authentication to your endpoints, granting conditional access to traffic trying to reach your services.
You can use Traffic Policy to add authentication to your endpoints, granting conditional access to traffic trying to reach your services. This page demonstrates a few examples of doing so.
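
The imported gallery components render the full examples. As a quick illustration, a minimal sketch using the `basic-auth` action (the credentials below are placeholders) might look like this:

```yaml traffic-policy
on_http_request:
  - actions:
      - type: basic-auth
        config:
          credentials:
            # placeholder credentials, for illustration only
            - user:change-this-password
```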

## JWT authentication

63 changes: 63 additions & 0 deletions docs/traffic-policy/examples/block-unwanted-requests.mdx
@@ -0,0 +1,63 @@
---
title: Block Unwanted Requests Examples
sidebar_label: Block Unwanted Requests
---

With Traffic Policy, you can block unwanted requests to your endpoints. This page demonstrates a few examples of doing so.

### Deny traffic from Tor networks

Use connection variables available in [IP Intelligence](/docs/traffic-policy/variables/ip-intel) to block Tor exit node IPs.

```yaml traffic-policy
on_http_request:
  - expressions:
      - "'proxy.anonymous.tor' in conn.client_ip.categories"
    actions:
      - type: deny
        config:
          status_code: 403
```

### Disallow bots and crawlers with a `robots.txt`

This rule returns a custom response with a [`robots.txt` file](https://developers.google.com/search/docs/crawling-indexing/robots/intro) to deny search engine or AI crawlers on all paths.

<AddRobotsTxt />
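
A minimal sketch of what a rule like this might look like, assuming the `custom-response` action and the `req.url.path` variable:

```yaml traffic-policy
on_http_request:
  - expressions:
      - "req.url.path == '/robots.txt'"
    actions:
      - type: custom-response
        config:
          status_code: 200
          headers:
            content-type: text/plain
          # disallow every crawler on every path
          content: "User-agent: *\r\nDisallow: /"
```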

You can also extend the expression above to create specific rules for crawlers based on their user agent strings, like `ChatGPT-User` and `GPTBot`.

<AddRobotsTxtSpecific />
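
A sketch of that user-agent-specific variant, under the same assumptions, might serve a `robots.txt` that disallows only the named crawlers:

```yaml traffic-policy
on_http_request:
  - expressions:
      - "req.url.path == '/robots.txt'"
    actions:
      - type: custom-response
        config:
          status_code: 200
          headers:
            content-type: text/plain
          # disallow only these crawlers; all others remain allowed
          content: "User-agent: ChatGPT-User\r\nUser-agent: GPTBot\r\nDisallow: /"
```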

### Block bots and crawlers by user agent

In addition to, or instead of, denying bots and crawlers with a `robots.txt` file, you can act only on incoming requests that contain specific strings in the [`req.user_agent` request variable](/docs/http/traffic-policy/expressions/variables.mdx#requser_agent).

You can include additional user agents by extending `(chatgpt-user|gptbot)` like so: `(chatgpt-user|gptbot|anthropic|claude|any|other|user-agent|goes|here)`.

<BlockSpecificBots />
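
A minimal sketch, assuming CEL's `matches` regular-expression function is available on `req.user_agent`:

```yaml traffic-policy
on_http_request:
  - expressions:
      # case-insensitive match against known crawler user agents
      - "req.user_agent.matches('(?i)(chatgpt-user|gptbot)')"
    actions:
      - type: deny
        config:
          status_code: 403
```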

### Deny non-GET requests

This rule denies all inbound traffic that is not a GET request.

<Deny />
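
A sketch of this rule, assuming the `req.method` variable (the gallery example may use a different status code):

```yaml traffic-policy
on_http_request:
  - expressions:
      - "req.method != 'GET'"
    actions:
      - type: deny
        config:
          # 405 Method Not Allowed
          status_code: 405
```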

### Custom response for unauthorized requests

This rule sends a custom response with status code `401` and body `Unauthorized` for requests without an `Authorization` header.

<CustomResponse />
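
A minimal sketch, assuming header names in `req.headers` are normalized to lowercase:

```yaml traffic-policy
on_http_request:
  - expressions:
      - "!('authorization' in req.headers)"
    actions:
      - type: custom-response
        config:
          status_code: 401
          content: Unauthorized
```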

### Block traffic from specific countries

Remain compliant with data regulations or sanctions by blocking requests originating from one or more countries using their respective [ISO country codes](https://en.wikipedia.org/wiki/List_of_ISO_3166_country_codes).

<BlockCountries />
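
A sketch of this rule, assuming a `conn.geo.country_code` connection variable (the codes below are examples only):

```yaml traffic-policy
on_http_request:
  - expressions:
      # substitute the ISO 3166 codes you need to block
      - "conn.geo.country_code in ['CU', 'IR', 'KP']"
    actions:
      - type: deny
        config:
          status_code: 403
```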

### Limit request sizes

Prevent excessively large user uploads, like text or images, that might cause performance or availability issues for your upstream service.

<LimitSize />
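
A minimal sketch, assuming a `req.content_length` variable:

```yaml traffic-policy
on_http_request:
  - expressions:
      # reject bodies larger than roughly 1 MB
      - "req.content_length > 1000000"
    actions:
      - type: custom-response
        config:
          status_code: 413
          content: Payload too large
```
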
60 changes: 0 additions & 60 deletions docs/traffic-policy/examples/index.mdx
@@ -24,66 +24,6 @@ Explore a curated collection of examples and configuration examples spanning fro

A number of these examples come from a longer article about how ngrok [makes policy management accessible](https://ngrok.com/blog-post/api-gateway-policy-management-examples) to developers, including a simple Go-based application for testing these and other configurations.


## Block unwanted requests

### Deny traffic from Tor networks

Use connection variables available in [IP Intelligence](/docs/traffic-policy/variables/ip-intel) to block Tor exit node IPs.

```yaml traffic-policy
on_http_request:
  - expressions:
      - "'proxy.anonymous.tor' in conn.client_ip.categories"
    actions:
      - type: deny
        config:
          status_code: 403
```

### Disallow bots and crawlers with a `robots.txt`

This rule returns a custom response with a [`robots.txt` file](https://developers.google.com/search/docs/crawling-indexing/robots/intro) to deny search engine or AI crawlers on all paths.

<AddRobotsTxt />

You can also extend the expression above to create specific rules for crawlers based on their user agent strings, like `ChatGPT-User` and `GPTBot`.

<AddRobotsTxtSpecific />

### Block bots and crawlers by user agent

In addition to, or instead of, denying bots and crawlers with a `robots.txt` file, you can act only on incoming requests that contain specific strings in the [`req.user_agent` request variable](/docs/http/traffic-policy/expressions/variables.mdx#requser_agent).

You can include additional user agents by extending `(chatgpt-user|gptbot)` like so: `(chatgpt-user|gptbot|anthropic|claude|any|other|user-agent|goes|here)`.

<BlockSpecificBots />

### Deny non-GET requests

This rule denies all inbound traffic that is not a GET request.

<Deny />

### Custom response for unauthorized requests

This rule sends a custom response with status code `401` and body `Unauthorized` for requests without an `Authorization` header.

<CustomResponse />

### Block traffic from specific countries

Remain compliant with data regulations or sanctions by blocking requests originating from one or more countries using their respective [ISO country codes](https://en.wikipedia.org/wiki/List_of_ISO_3166_country_codes).

<BlockCountries />

### Limit request sizes

Prevent excessively large user uploads, like text or images, that might cause performance or availability issues for your upstream service.

<LimitSize />

## Manipulate headers

### Enrich your upstream service
2 changes: 1 addition & 1 deletion docs/traffic-policy/examples/rate-limit-requests.mdx
@@ -7,7 +7,7 @@ import { RateLimit } from "/traffic-policy/gallery/RateLimit.tsx";
import { RateLimitAuthentication } from "/traffic-policy/gallery/RateLimitAuthentication.tsx";
import { RateLimitPricing } from "/traffic-policy/gallery/RateLimitPricing.tsx";

With Traffic Policy, you can rate limit requests to your endpoints based on a variety of criteria.
With Traffic Policy, you can rate limit requests to your endpoints based on a variety of criteria. This page demonstrates a few examples of doing so.
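
As a quick illustration, a minimal sketch using the `rate-limit` action (the config keys shown here are an assumption, not the gallery's exact example) might look like this:

```yaml traffic-policy
on_http_request:
  - actions:
      - type: rate-limit
        config:
          name: Allow 30 requests per minute per client IP
          algorithm: sliding_window
          capacity: 30
          rate: 60s
          bucket_key:
            - conn.client_ip
```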

## By endpoint

4 changes: 2 additions & 2 deletions docs/traffic-policy/examples/route-requests.mdx
@@ -5,8 +5,8 @@ sidebar_label: Route requests

import TabItem from "@theme/TabItem";
import Tabs from "@theme/Tabs";
import { BasedOnURL } from "@site/traffic-policy/gallery/route-requests/BasedOnURL";
import { BasedOnHeaders } from "@site/traffic-policy/gallery/route-requests/BasedOnHeaders";
import { BasedOnURL } from "../../../traffic-policy/gallery/route-requests/BasedOnURL";
import { BasedOnHeaders } from "../../../traffic-policy/gallery/route-requests/BasedOnHeaders";

You can use [CEL
interpolation](/docs/traffic-policy/concepts/cel-interpolation) to
