Resilience
Retry Strategy
Build resilient integrations with exponential backoff, jitter, and circuit breakers. Handle transient failures gracefully without overwhelming our servers.
Exponential backoff
Full jitter
Circuit breakers
Backoff Visualization
Retry Demo: watch exponential backoff in action. Attempt 1 fails; attempt 2 runs after waiting ~1232ms, attempt 3 after ~2410ms, and attempt 4 after ~4444ms. The delays grow exponentially and vary between runs because of jitter.
When to Retry
| Status | Name | Retry? | Strategy |
|---|---|---|---|
| 429 | Rate Limited | Yes | Use Retry-After header, then exponential backoff |
| 500 | Internal Server Error | Yes | Exponential backoff with jitter |
| 502 | Bad Gateway | Yes | Exponential backoff with jitter |
| 503 | Service Unavailable | Yes | Use Retry-After if present, else backoff |
| 504 | Gateway Timeout | Yes | Exponential backoff with jitter |
| 400 | Bad Request | No | Fix request payload, do not retry |
| 401 | Unauthorized | No | Check API key, do not retry |
| 403 | Forbidden | No | Check permissions, do not retry |
| 404 | Not Found | No | Check URL, do not retry |
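The table's decision logic reduces to a small check. A minimal sketch (the status set simply mirrors the table above):

// Statuses from the table above that are safe to retry
const RETRYABLE_STATUSES = new Set([429, 500, 502, 503, 504]);

// Returns true when a failed response is worth retrying
function isRetryable(response) {
  return RETRYABLE_STATUSES.has(response.status);
}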
Implementation Examples
Basic Retry Logic
// Resolve after the given number of milliseconds
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function fetchWithRetry(url, options, maxRetries = 5) {
  for (let attempt = 1; attempt <= maxRetries; attempt++) {
    let waitMs = calculateBackoff(attempt);
    try {
      const response = await fetch(url, options);
      if (response.ok) return response;
      // Don't retry client errors (4xx except 429)
      if (response.status >= 400 && response.status < 500 && response.status !== 429) {
        throw Object.assign(new Error(`Client error: ${response.status}`), { retryable: false });
      }
      // Honor Retry-After (seconds) when present, otherwise use exponential backoff
      const retryAfter = response.headers.get('Retry-After');
      if (retryAfter) waitMs = parseInt(retryAfter, 10) * 1000;
      throw new Error(`Server error: ${response.status}`);
    } catch (error) {
      // Never retry client errors, and give up once the retry budget is spent
      if (error.retryable === false || attempt === maxRetries) throw error;
      console.log(`Attempt ${attempt} failed, retrying in ${waitMs}ms`);
      await sleep(waitMs);
    }
  }
}

Exponential Backoff
// Exponential backoff with full jitter (AWS recommended)
function calculateBackoff(attempt, baseMs = 1000, maxMs = 32000) {
  // Exponential: 1s, 2s, 4s, 8s, 16s, 32s (capped)
  const exponential = Math.min(baseMs * Math.pow(2, attempt - 1), maxMs);
  // Full jitter: random between 0 and exponential
  const jitter = Math.random() * exponential;
  return Math.round(jitter);
}

// Alternative: Equal jitter (50% base + 50% random)
function equalJitter(attempt, baseMs = 1000, maxMs = 32000) {
  const exponential = Math.min(baseMs * Math.pow(2, attempt - 1), maxMs);
  const half = exponential / 2;
  return Math.round(half + Math.random() * half);
}

SDK Retry Config
import { JustKalm } from '@justkalm/sdk';

// SDK handles retries automatically
const client = new JustKalm({
  apiKey: process.env.JUSTKALM_API_KEY,
  // Customize retry behavior
  retry: {
    maxRetries: 5,          // Max retry attempts
    initialDelayMs: 1000,   // First retry delay
    maxDelayMs: 32000,      // Cap on backoff
    jitter: 'full',         // 'full', 'equal', or 'none'
    retryableStatuses: [429, 500, 502, 503, 504],
  },
});

// SDK automatically retries transient failures
const result = await client.valuate('https://example.com/product');

// Disable retries for specific call
const noRetry = await client.valuate(url, { retry: false });

Circuit Breaker
// Circuit breaker pattern for resilience
class CircuitBreaker {
  constructor(threshold = 5, resetTimeMs = 30000) {
    this.failures = 0;
    this.threshold = threshold;
    this.resetTime = resetTimeMs;
    this.state = 'CLOSED'; // CLOSED, OPEN, HALF_OPEN
    this.lastFailure = null;
  }

  async call(fn) {
    if (this.state === 'OPEN') {
      if (Date.now() - this.lastFailure > this.resetTime) {
        this.state = 'HALF_OPEN';
      } else {
        throw new Error('Circuit breaker is OPEN');
      }
    }
    try {
      const result = await fn();
      this.onSuccess();
      return result;
    } catch (error) {
      this.onFailure();
      throw error;
    }
  }

  onSuccess() {
    this.failures = 0;
    this.state = 'CLOSED';
  }

  onFailure() {
    this.failures++;
    this.lastFailure = Date.now();
    if (this.failures >= this.threshold) {
      this.state = 'OPEN';
    }
  }
}
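A brief usage sketch, assuming the fetchWithRetry helper above; getListing and the one-breaker-per-dependency setup are illustrative, not part of the SDK:

// Route calls to a given upstream through a single breaker instance
const breaker = new CircuitBreaker(5, 30000);

// Illustrative wrapper: while the breaker is OPEN, calls fail fast instead of retrying
async function getListing(url) {
  return breaker.call(() => fetchWithRetry(url, { method: 'GET' }));
}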
Best Practices
Use Jitter
Add random jitter to prevent thundering herd when many clients retry at the same time after an outage.
Honor Retry-After
Always check and respect the Retry-After header on 429 and 503 responses before calculating your own backoff.
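Retry-After can carry either a delay in seconds or an HTTP date. A small sketch that converts both forms into a wait in milliseconds (retryAfterToMs is an illustrative helper, not part of the SDK):

// Convert a Retry-After value ("120" or an HTTP date) into milliseconds to wait
function retryAfterToMs(headerValue, fallbackMs) {
  if (!headerValue) return fallbackMs;
  const seconds = Number(headerValue);
  if (!Number.isNaN(seconds)) return seconds * 1000;
  const retryAt = Date.parse(headerValue);
  return Number.isNaN(retryAt) ? fallbackMs : Math.max(0, retryAt - Date.now());
}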
Cap Max Retries
Set a maximum retry limit (we recommend 5). Infinite retries can cause cascading failures and resource exhaustion.
Use Idempotency Keys
For mutating operations, always include an idempotency key to prevent duplicate operations on retry.
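For example, here is a sketch of a retried POST that stays idempotent; the Idempotency-Key header name and the request path are illustrative, so check the API reference for the exact convention:

// Generate one key per logical operation and reuse it on every retry
// crypto.randomUUID() is available in modern browsers and Node.js 19+
const idempotencyKey = crypto.randomUUID();

const response = await fetchWithRetry('https://api.example.com/valuations', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Idempotency-Key': idempotencyKey, // same key on each retry of this operation
  },
  body: JSON.stringify({ url: 'https://example.com/product' }),
});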
Build Resilient Integrations
Our SDKs handle retry logic automatically with sensible defaults.