# Proxy Rate Limiting
Proxy rate limiting allows rate limiting tasks to be performed at a centralized location. With typical rate limiting, each runtime has its own rate limiter and in-memory index, which can lead to inconsistent results when multiple runtimes access a shared resource. Proxy rate limiting is useful for scenarios that involve more than one runtime, such as:
- Serverless computing.
- Multi-processing.
- Sharding and replication.
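As a rough sketch of the difference (assuming pulsefire's `RiotAPIRateLimiter`, whose `proxy` argument is covered in the Configuration section below; the target URL is a placeholder):

```python
from pulsefire.ratelimiters import RiotAPIRateLimiter

# Typical: the index lives in this runtime's memory, invisible to others.
local_limiter = RiotAPIRateLimiter()

# Proxy: rate limiting decisions are deferred to a centralized server,
# so every runtime shares one consistent index. URL is a placeholder.
proxy_limiter = RiotAPIRateLimiter(proxy="http://<TARGET>:12227")
```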
The following flowchart is based on a project that contains:
- Application APIs on serverless computing.
- Cronjobs on compute instances.
```mermaid
flowchart TB
    subgraph compute1 ["Compute"]
        A{"RateLimiter"}
    end
    B["DataSource"]
    subgraph "Serverless"
        C["AppAPI:1"]
        D["AppAPI:2"]
    end
    subgraph compute2 ["Compute"]
        E["CronJob"]
    end
    C --> A
    D --> A
    E --> A
    A --> B
```
!!! info
    The centralized rate limiter is hosted on a compute instance.
For comparison, the following flowchart is based on the same project with typical (per-runtime) rate limiting:
- Application APIs on serverless computing.
- Cronjobs on compute instances.
```mermaid
flowchart TB
    B["DataSource"]
    subgraph "Serverless"
        P{"RateLimiter:1"}
        Q{"RateLimiter:2"}
        C["AppAPI:1"] --> P
        D["AppAPI:2"] --> Q
    end
    subgraph "Compute"
        R{"RateLimiter:3"}
        E["CronJob"] --> R
    end
    P --> B
    Q --> B
    R --> B
```
!!! warning
    Each runtime has its own rate limiter and is not aware of the other runtimes.
## Configuration
The following instructions show how to configure a centralized `RiotAPIRateLimiter`.
On your hardware that will host the centralized rate limiter:
- Set up python and install pulsefire.
- Create a python file `ratelimiter.py` with code along the lines of the sketch after this list.
- Manage the execution with a service manager such as systemd.
- (Public only) A reverse proxy (e.g. nginx) may be set up if HTTPS is needed.
- (Public only) Ensure that the firewall is configured correctly to allow incoming traffic.
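Below is a minimal sketch of `ratelimiter.py`. It assumes pulsefire's `RiotAPIRateLimiter` exposes a `serve()` method that starts the proxy server; the bind address and port shown here are illustrative, so check the pulsefire reference for the exact signature and defaults.

```python
from pulsefire.ratelimiters import RiotAPIRateLimiter

# Illustrative bind address and port: 0.0.0.0 accepts traffic from other
# machines (needed for the public setup); 12227 is a placeholder port.
RiotAPIRateLimiter().serve("0.0.0.0", 12227)
```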
On each of your runtimes:
- Provide the proxy target to the rate limiter, as sketched below.
- Run a test invocation to ensure the proxy is reachable.
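The following sketch shows one way to wire the proxy target into a runtime, assuming `RiotAPIRateLimiter` accepts a `proxy` argument and that the client is assembled from pulsefire's middleware helpers; the target URL and API key are placeholders.

```python
from pulsefire.clients import RiotAPIClient
from pulsefire.middlewares import (
    http_error_middleware,
    json_response_middleware,
    rate_limiter_middleware,
)
from pulsefire.ratelimiters import RiotAPIRateLimiter

client = RiotAPIClient(
    default_headers={"X-Riot-Token": "<API_KEY>"},  # placeholder token
    middlewares=[
        json_response_middleware(),
        http_error_middleware(),
        # Defer rate limiting to the centralized proxy instead of an
        # in-memory index local to this runtime. Placeholder target URL.
        rate_limiter_middleware(
            RiotAPIRateLimiter(proxy="http://<TARGET>:12227")
        ),
    ],
)
```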
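For the test invocation, any inexpensive call through the configured client will exercise the proxy. This example reuses the `client` from the previous sketch; the region and summoner name are placeholders.

```python
import asyncio

async def main():
    # The rate limiter middleware contacts the proxy before the request is
    # sent, so a successful response confirms the proxy is reachable.
    async with client:
        summoner = await client.get_lol_summoner_v4_by_name(
            region="na1", name="Not a Whale"  # placeholder region and name
        )
        print(summoner["summonerLevel"])

asyncio.run(main())
```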