Documentation
Everything you need to integrate Request Proxy into your application
Core Headers
These headers are required for every request through Request Proxy.
Request Headers
| Header | Required | Description |
|---|---|---|
| Request-Proxy-Key | Yes | Your API key from the dashboard |
| Request-Proxy-Host | Yes | Target API hostname (e.g., api.example.com) |
Response Headers
| Header | Description |
|---|---|
| X-Request-Proxy-Request-Id | UUID for request tracing and debugging |
| X-Request-Proxy-Latency-Ms | Total request duration in milliseconds |
Error Responses
| Status | Error | Description |
|---|---|---|
| 400 | Missing Request-Proxy-Host | Target host header is required |
| 401 | Invalid API key | API key not found or malformed |
| 403 | API key disabled | Your API key has been disabled |
| 403 | Free tier exhausted | Upgrade your plan or wait for reset |
| 429 | Rate limit exceeded | Too many requests; slow down |
Code Examples
curl -X GET "https://api.requestproxy.com/v1/users" \
-H "Request-Proxy-Key: al_live_abc123" \
-H "Request-Proxy-Host: api.example.com" \
-H "Authorization: Bearer your-target-api-token"
import requests
response = requests.get(
"https://api.requestproxy.com/v1/users",
headers={
"Request-Proxy-Key": "al_live_abc123",
"Request-Proxy-Host": "api.example.com",
"Authorization": "Bearer your-target-api-token",
}
)
print(f"Request ID: {response.headers['X-Request-Proxy-Request-Id']}")
print(response.json())
const response = await fetch("https://api.requestproxy.com/v1/users", {
headers: {
"Request-Proxy-Key": "al_live_abc123",
"Request-Proxy-Host": "api.example.com",
"Authorization": "Bearer your-target-api-token",
},
});
console.log("Request ID:", response.headers.get("X-Request-Proxy-Request-Id"));
const data = await response.json();
HttpClient client = HttpClient.newHttpClient();
HttpRequest request = HttpRequest.newBuilder()
.uri(URI.create("https://api.requestproxy.com/v1/users"))
.header("Request-Proxy-Key", "al_live_abc123")
.header("Request-Proxy-Host", "api.example.com")
.header("Authorization", "Bearer your-target-api-token")
.GET()
.build();
HttpResponse<String> response = client.send(
request, HttpResponse.BodyHandlers.ofString());
System.out.println("Request ID: " +
response.headers().firstValue("X-Request-Proxy-Request-Id").orElse(""));
{-# LANGUAGE OverloadedStrings #-}
import Network.HTTP.Simple
main :: IO ()
main = do
let request = setRequestHeader "Request-Proxy-Key" ["al_live_abc123"]
$ setRequestHeader "Request-Proxy-Host" ["api.example.com"]
$ setRequestHeader "Authorization" ["Bearer your-target-api-token"]
$ "GET https://api.requestproxy.com/v1/users"
response <- httpBS request
let requestId = getResponseHeader "X-Request-Proxy-Request-Id" response
print requestId
print (getResponseBody response)
Retries
Request Proxy automatically retries failed requests with configurable backoff strategies.
Request Headers
| Header | Default | Description |
|---|---|---|
| Request-Proxy-Max-Retries | 3 | Maximum retry attempts (0-10) |
| Request-Proxy-Retry-Backoff | exponential | Backoff strategy: exponential, linear, or fixed |
| Request-Proxy-Timeout | 30000 | Request timeout in milliseconds |
- exponential (default): Delay doubles after each retry (1s, 2s, 4s, 8s...)
- linear: Delay increases by a fixed step each retry (1s, 2s, 3s, 4s...)
- fixed: Same delay between every retry (1s, 1s, 1s...)
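The three strategies amount to different delay schedules. The helper below is an illustrative sketch only (Request Proxy computes these delays server-side); it assumes a 1-second base delay, which matches the examples above but is not documented here:

```python
# Illustrative only: Request Proxy applies these delays server-side.
# A 1-second base delay is assumed, matching the examples above.
def backoff_delays(strategy: str, base: float = 1.0, retries: int = 4) -> list[float]:
    if strategy == "exponential":
        return [base * 2**i for i in range(retries)]     # 1s, 2s, 4s, 8s
    if strategy == "linear":
        return [base * (i + 1) for i in range(retries)]  # 1s, 2s, 3s, 4s
    if strategy == "fixed":
        return [base] * retries                          # 1s, 1s, 1s, 1s
    raise ValueError(f"unknown strategy: {strategy!r}")

print(backoff_delays("exponential"))  # [1.0, 2.0, 4.0, 8.0]
```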
Response Headers
| Header | Description |
|---|---|
| X-Request-Proxy-Retries | Number of retry attempts performed |
Error Responses
| Status | Error | Description |
|---|---|---|
| 503 | Circuit breaker open | Target API is failing repeatedly; try again later |
Code Examples
curl -X GET "https://api.requestproxy.com/v1/users" \
-H "Request-Proxy-Key: al_live_abc123" \
-H "Request-Proxy-Host: api.example.com" \
-H "Request-Proxy-Max-Retries: 5" \
-H "Request-Proxy-Retry-Backoff: exponential" \
-H "Request-Proxy-Timeout: 60000"
import requests
response = requests.get(
"https://api.requestproxy.com/v1/users",
headers={
"Request-Proxy-Key": "al_live_abc123",
"Request-Proxy-Host": "api.example.com",
"Request-Proxy-Max-Retries": "5",
"Request-Proxy-Retry-Backoff": "exponential",
"Request-Proxy-Timeout": "60000",
}
)
retries = response.headers.get("X-Request-Proxy-Retries", "0")
print(f"Request completed after {retries} retries")
const response = await fetch("https://api.requestproxy.com/v1/users", {
headers: {
"Request-Proxy-Key": "al_live_abc123",
"Request-Proxy-Host": "api.example.com",
"Request-Proxy-Max-Retries": "5",
"Request-Proxy-Retry-Backoff": "exponential",
"Request-Proxy-Timeout": "60000",
},
});
const retries = response.headers.get("X-Request-Proxy-Retries") ?? "0";
console.log(`Request completed after ${retries} retries`);
HttpClient client = HttpClient.newHttpClient();
HttpRequest request = HttpRequest.newBuilder()
.uri(URI.create("https://api.requestproxy.com/v1/users"))
.header("Request-Proxy-Key", "al_live_abc123")
.header("Request-Proxy-Host", "api.example.com")
.header("Request-Proxy-Max-Retries", "5")
.header("Request-Proxy-Retry-Backoff", "exponential")
.header("Request-Proxy-Timeout", "60000")
.GET()
.build();
HttpResponse<String> response = client.send(
request, HttpResponse.BodyHandlers.ofString());
String retries = response.headers()
.firstValue("X-Request-Proxy-Retries").orElse("0");
System.out.println("Completed after " + retries + " retries");
{-# LANGUAGE OverloadedStrings #-}
import Network.HTTP.Simple
main :: IO ()
main = do
let request = setRequestHeader "Request-Proxy-Key" ["al_live_abc123"]
$ setRequestHeader "Request-Proxy-Host" ["api.example.com"]
$ setRequestHeader "Request-Proxy-Max-Retries" ["5"]
$ setRequestHeader "Request-Proxy-Retry-Backoff" ["exponential"]
$ setRequestHeader "Request-Proxy-Timeout" ["60000"]
$ "GET https://api.requestproxy.com/v1/users"
response <- httpBS request
let retries = getResponseHeader "X-Request-Proxy-Retries" response
putStrLn $ "Completed after " ++ show retries ++ " retries"
Circuit Breaker
Request Proxy protects your systems by detecting repeated failures to a target host and temporarily blocking requests until the host recovers.
- Closed (normal): Requests pass through normally.
- Open (blocking): After reaching the failure threshold, the circuit opens and blocks all requests with a 503 error.
- HalfOpen (testing): After the reset timeout, a single test request is allowed. If successful, the circuit closes. If it fails, the circuit reopens.
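The transitions above form a small state machine, sketched below. The `failure_threshold` and `reset_timeout` values are assumptions for illustration, not Request Proxy's actual configuration:

```python
import time

class CircuitBreaker:
    # Illustrative sketch of the Closed/Open/HalfOpen model described above.
    # failure_threshold and reset_timeout are assumed values.
    def __init__(self, failure_threshold=5, reset_timeout=60.0):
        self.failure_threshold = failure_threshold
        self.reset_timeout = reset_timeout
        self.failures = 0
        self.state = "Closed"
        self.opened_at = 0.0

    def allow_request(self) -> bool:
        if self.state == "Open":
            if time.monotonic() - self.opened_at >= self.reset_timeout:
                self.state = "HalfOpen"  # let one test request through
                return True
            return False  # still blocking: would surface as a 503
        return True  # Closed or HalfOpen

    def record_success(self):
        self.failures = 0
        self.state = "Closed"  # recovery confirmed, resume normal traffic

    def record_failure(self):
        self.failures += 1
        if self.state == "HalfOpen" or self.failures >= self.failure_threshold:
            self.state = "Open"  # start (or restart) the reset timer
            self.opened_at = time.monotonic()
```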
Response Headers
| Header | Description |
|---|---|
| X-Request-Proxy-Circuit-State | Current circuit state: Closed, Open, or HalfOpen |
Error Responses
| Status | Error | Description |
|---|---|---|
| 503 | Circuit breaker open | Target API is failing repeatedly; try again later |
Code Examples
# Check circuit breaker state in response headers
curl -X GET "https://api.requestproxy.com/v1/users" \
-H "Request-Proxy-Key: al_live_abc123" \
-H "Request-Proxy-Host: api.example.com" \
-v # Verbose mode to see response headers
# Handle 503 Circuit Breaker Open error
# X-Request-Proxy-Circuit-State: Open
# Retry after the reset timeout (typically 60 seconds)
import requests
import time
response = requests.get(
"https://api.requestproxy.com/v1/users",
headers={
"Request-Proxy-Key": "al_live_abc123",
"Request-Proxy-Host": "api.example.com",
}
)
circuit_state = response.headers.get("X-Request-Proxy-Circuit-State")
print(f"Circuit state: {circuit_state}")
if response.status_code == 503:
print("Circuit breaker is open. Retrying after 60 seconds...")
time.sleep(60)
# Retry request
const response = await fetch("https://api.requestproxy.com/v1/users", {
headers: {
"Request-Proxy-Key": "al_live_abc123",
"Request-Proxy-Host": "api.example.com",
},
});
const circuitState = response.headers.get("X-Request-Proxy-Circuit-State");
console.log(`Circuit state: ${circuitState}`);
if (response.status === 503) {
console.log("Circuit breaker is open. Retrying after 60 seconds...");
await new Promise(resolve => setTimeout(resolve, 60000));
// Retry request
}
HttpClient client = HttpClient.newHttpClient();
HttpRequest request = HttpRequest.newBuilder()
.uri(URI.create("https://api.requestproxy.com/v1/users"))
.header("Request-Proxy-Key", "al_live_abc123")
.header("Request-Proxy-Host", "api.example.com")
.GET()
.build();
HttpResponse<String> response = client.send(
request, HttpResponse.BodyHandlers.ofString());
String circuitState = response.headers()
.firstValue("X-Request-Proxy-Circuit-State").orElse("Unknown");
System.out.println("Circuit state: " + circuitState);
if (response.statusCode() == 503) {
System.out.println("Circuit breaker is open. Retrying after 60 seconds...");
Thread.sleep(60000);
// Retry request
}
{-# LANGUAGE OverloadedStrings #-}
import Network.HTTP.Simple
import Control.Monad (when)
import Control.Concurrent (threadDelay)
main :: IO ()
main = do
let request = setRequestHeader "Request-Proxy-Key" ["al_live_abc123"]
$ setRequestHeader "Request-Proxy-Host" ["api.example.com"]
$ "GET https://api.requestproxy.com/v1/users"
response <- httpBS request
let circuitState = getResponseHeader "X-Request-Proxy-Circuit-State" response
putStrLn $ "Circuit state: " ++ show circuitState
when (getResponseStatusCode response == 503) $ do
putStrLn "Circuit breaker is open. Retrying after 60 seconds..."
threadDelay 60000000 -- 60 seconds in microseconds
-- Retry request
Caching
GET requests are cached by default based on the target's Cache-Control headers.
Request Headers
| Header | Default | Description |
|---|---|---|
| Request-Proxy-Cache | default | Cache behavior control |
- default: Respect the target API's Cache-Control headers
- bypass: Skip cache, always fetch fresh data
- force: Return cached response if available, even if stale
Response Headers
| Header | Description |
|---|---|
| X-Request-Proxy-From-Cache | "true" if the response was served from cache |
Code Examples
# Use cached response if available
curl -X GET "https://api.requestproxy.com/v1/users" \
-H "Request-Proxy-Key: al_live_abc123" \
-H "Request-Proxy-Host: api.example.com" \
-H "Request-Proxy-Cache: force"
# Always fetch fresh data
curl -X GET "https://api.requestproxy.com/v1/users" \
-H "Request-Proxy-Key: al_live_abc123" \
-H "Request-Proxy-Host: api.example.com" \
-H "Request-Proxy-Cache: bypass"
import requests
# Force use of cached response
response = requests.get(
"https://api.requestproxy.com/v1/users",
headers={
"Request-Proxy-Key": "al_live_abc123",
"Request-Proxy-Host": "api.example.com",
"Request-Proxy-Cache": "force",
}
)
from_cache = response.headers.get("X-Request-Proxy-From-Cache") == "true"
print(f"From cache: {from_cache}")
// Force use of cached response
const response = await fetch("https://api.requestproxy.com/v1/users", {
headers: {
"Request-Proxy-Key": "al_live_abc123",
"Request-Proxy-Host": "api.example.com",
"Request-Proxy-Cache": "force",
},
});
const fromCache = response.headers.get("X-Request-Proxy-From-Cache") === "true";
console.log(`From cache: ${fromCache}`);
// Force use of cached response
// Force use of cached response
HttpClient client = HttpClient.newHttpClient();
HttpRequest request = HttpRequest.newBuilder()
.uri(URI.create("https://api.requestproxy.com/v1/users"))
.header("Request-Proxy-Key", "al_live_abc123")
.header("Request-Proxy-Host", "api.example.com")
.header("Request-Proxy-Cache", "force")
.GET()
.build();
HttpResponse<String> response = client.send(
request, HttpResponse.BodyHandlers.ofString());
boolean fromCache = "true".equals(
response.headers().firstValue("X-Request-Proxy-From-Cache").orElse(""));
System.out.println("From cache: " + fromCache);
{-# LANGUAGE OverloadedStrings #-}
import Network.HTTP.Simple
main :: IO ()
main = do
-- Force use of cached response
let request = setRequestHeader "Request-Proxy-Key" ["al_live_abc123"]
$ setRequestHeader "Request-Proxy-Host" ["api.example.com"]
$ setRequestHeader "Request-Proxy-Cache" ["force"]
$ "GET https://api.requestproxy.com/v1/users"
response <- httpBS request
let fromCache = getResponseHeader "X-Request-Proxy-From-Cache" response == ["true"]
putStrLn $ "From cache: " ++ show fromCache
Concurrency Limits
Request Proxy limits the number of simultaneous in-flight requests per account to protect backend services from overload.
- Default limit: 10 concurrent requests per account
- Scope: Per account, across all API keys
- Release: Slots are released immediately when a request completes (success or failure)
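Because the limit is shared across all API keys on an account, a client-side throttle can avoid 429s before they happen. The sketch below wraps calls in a semaphore; the limit value mirrors the default above, and the class name is ours, not part of any SDK:

```python
import threading

class ProxyThrottle:
    # Client-side guard against the per-account concurrency limit.
    # Share one instance per account: the limit spans all API keys.
    def __init__(self, max_in_flight: int = 10):
        self._slots = threading.Semaphore(max_in_flight)

    def run(self, fn, *args, **kwargs):
        # The slot is released when the call completes, success or
        # failure, mirroring how the proxy releases its own slots.
        with self._slots:
            return fn(*args, **kwargs)
```

Note this only helps within a single process; across multiple workers you would still need to handle the proxy's own 429 responses.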
Error Responses
| Status | Error | Description |
|---|---|---|
| 429 | Concurrency limit exceeded | Too many simultaneous requests; wait for one to complete |