# Retry & memory cache

## Retry — createRetryMiddleware

The factory returns a middleware that calls `next()` repeatedly on retryable failures, with exponential backoff and jitter.
### Resolved options (defaults + `ctx.request.retry`)

| Option | Default | Description |
|---|---|---|
| `maxAttempts` | `3` | Total tries, including the first |
| `baseDelayMs` | `300` | Base delay before the first retry |
| `maxDelayMs` | `30000` | Cap on the delay |
| `factor` | `2` | Exponential multiplier |
| `retryOnStatus` | `[408, 429, 500, 502, 503, 504]` | HTTP statuses to retry when `validateStatus` failed |
| `retryOnNetworkError` | `true` | Retry `ERR_NETWORK` / `ERR_PARSE` when the rules allow |
| `retryNonIdempotentMethods` | `false` | If `false`, only GET, HEAD, OPTIONS, and TRACE are retried for network/parse/retryable-status failures |
| `shouldRetry` | — | Optional `(error, attempt) => boolean`, applied after the built-in checks |
| `timeoutTotalMs` | — | Monotonic budget for the entire retry sequence, including backoff. Uses `performance.now()` when available. On expiry: `OpenFetchError` with code `ERR_RETRY_TIMEOUT`. |
| `enforceTotalTimeout` | `true` | When `true` (default), each attempt merges a deadline into `signal` so an in-flight fetch aborts when the total budget is exhausted (a user abort still wins). When `false`, the budget is checked between attempts only, so a single slow response can exceed `timeoutTotalMs`. |
| `timeoutPerAttemptMs` | — | If set, overrides `request.timeout` for each attempt inside the retry middleware (per-attempt fetch timeout). |
Canceled requests (`ERR_CANCELED`) are not retried, and neither is `ERR_RETRY_TIMEOUT`.
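The backoff schedule implied by the options above can be sketched as a pure function. Note this is an illustrative stand-in, not the library's internal code, and the exact jitter strategy is an assumption (full jitter shown):

```typescript
// Illustrative sketch of capped exponential backoff with full jitter.
// The real createRetryMiddleware's jitter strategy may differ.
function backoffDelayMs(
  attempt: number, // 1 = first retry
  baseDelayMs = 300,
  factor = 2,
  maxDelayMs = 30_000,
): number {
  // Exponential growth, capped at maxDelayMs.
  const exp = Math.min(maxDelayMs, baseDelayMs * factor ** (attempt - 1));
  // Full jitter: uniform in [0, exp).
  return Math.random() * exp;
}
```

Without jitter the caps grow as 300, 600, 1200, … until they hit `maxDelayMs`.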
### Plugin shortcut

```typescript
import { retry } from "@hamdymohamedak/openfetch/plugins";

client.use(retry({ attempts: 5, timeoutTotalMs: 20_000 }));
```

`attempts` is an alias for `maxAttempts`. See Plugins & fluent API.
### POST / PUT / side effects

By default, mutating methods are not retried on network or retryable HTTP errors, to avoid duplicate side effects. Opt in with `retry: { retryNonIdempotentMethods: true }` (per client defaults or per request).
### Automatic Idempotency-Key for POST

When `retryNonIdempotentMethods` is `true`, `maxAttempts > 1`, and the method is POST without an existing idempotency header, the middleware adds a stable `Idempotency-Key` header (unless `autoIdempotencyKey: false`). All retries then share the same key, so the server can deduplicate (Stripe-style APIs).

The package exports `generateIdempotencyKey`, `hasIdempotencyKeyHeader`, and `ensureIdempotencyKeyHeader` if you build your own logic.
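A minimal sketch of what such helpers typically do. These standalone versions are illustrative only; the package's actual exports may have different signatures and key formats:

```typescript
// Illustrative stand-ins for the package helpers, not the real implementations.
function generateIdempotencyKey(): string {
  // crypto.randomUUID is available in modern browsers and Node 19+.
  return (
    globalThis.crypto?.randomUUID?.() ??
    `${Date.now()}-${Math.random().toString(16).slice(2)}`
  );
}

function hasIdempotencyKeyHeader(headers: Record<string, string>): boolean {
  // Header names are case-insensitive.
  return Object.keys(headers).some((h) => h.toLowerCase() === "idempotency-key");
}

function ensureIdempotencyKeyHeader(
  headers: Record<string, string>,
): Record<string, string> {
  // Keep an existing key so all retries share it; add one otherwise.
  return hasIdempotencyKeyHeader(headers)
    ? headers
    : { ...headers, "Idempotency-Key": generateIdempotencyKey() };
}
```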
## Memory cache — MemoryCacheStore + createCacheMiddleware

### Store

```typescript
const store = new MemoryCacheStore({ maxEntries: 500 });
```

`maxEntries` — when the store is full, the oldest entries are evicted (Map insertion order). Default: `500`.
### Middleware

```typescript
createCacheMiddleware(store, {
  ttlMs: 60_000,
  staleWhileRevalidateMs: 0,
  methods: ["GET", "HEAD"],
  key: ({ request, url }) => `…`,
  varyHeaderNames: ["authorization", "cookie"],
});
```

Behavior:

- Only the configured methods are cached (default `GET`, `HEAD`).
- Reads and writes are skipped when `memoryCache.skip` is `true` on the request (also used internally for background refresh).
- Per-request `memoryCache.ttlMs` and `memoryCache.staleWhileRevalidateMs` override the factory defaults when set.
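For example, a per-request override might look like this (a hypothetical request; assumes a `client` created elsewhere with the cache middleware installed):

```typescript
// Assumes `client` exists and createCacheMiddleware is installed on it.
const user = await client.get("/users/42", {
  memoryCache: { ttlMs: 5_000, staleWhileRevalidateMs: 30_000 },
});
```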
### Cache key

The default key is `METHOD fullUrl` (the full URL after `buildURL`).

For authenticated or personalized responses, include headers in the key so entries do not leak across users:

```typescript
varyHeaderNames: ["authorization", "cookie"]
```

Or use a custom `key` together with `appendCacheKeyVaryHeaders(baseKey, headers, names)` from the package.
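A standalone sketch of what vary-header key composition looks like. This is an illustrative stand-in; the package's actual `appendCacheKeyVaryHeaders` may use a different signature or separator:

```typescript
// Illustrative stand-in for the package's appendCacheKeyVaryHeaders.
function appendCacheKeyVaryHeaders(
  baseKey: string,
  headers: Record<string, string>,
  names: string[],
): string {
  // Normalize header names to lowercase for case-insensitive lookup.
  const lower = Object.fromEntries(
    Object.entries(headers).map(([k, v]) => [k.toLowerCase(), v]),
  );
  // Append each named header's value (empty when absent) to the base key,
  // so two users with different Authorization headers get distinct entries.
  const parts = names.map((n) => `${n.toLowerCase()}=${lower[n.toLowerCase()] ?? ""}`);
  return `${baseKey}|${parts.join("|")}`;
}
```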
### Stale-while-revalidate

If `staleWhileRevalidateMs > 0`, the middleware can still serve a stale entry after the TTL expires, triggering a background dispatch with `memoryCache.skip: true` to refresh it. If the refresh fails, the stale entry keeps being served until `expireAt`.

Implementation note: the background refresh calls `dispatch` directly, not the full client `use()` stack, so custom middleware on the instance (logging, auth refresh) does not run for that background fetch. Shared behavior should live in `transformRequest` / defaults or a documented workaround. See Architecture & internals.
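The freshness states above can be sketched as a pure decision function. The field names here are hypothetical; the middleware's internal entry shape is not documented in this section:

```typescript
type CacheState = "fresh" | "stale-revalidate" | "expired";

// Hypothetical sketch of the freshness decision, for illustration only.
function cacheState(
  now: number,      // current monotonic time (ms)
  storedAt: number, // when the entry was written (ms)
  ttlMs: number,
  staleWhileRevalidateMs: number,
): CacheState {
  const age = now - storedAt;
  if (age <= ttlMs) return "fresh"; // serve directly
  if (age <= ttlMs + staleWhileRevalidateMs) return "stale-revalidate"; // serve stale, refresh in background
  return "expired"; // fetch fresh, blocking
}
```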
## Next

- Errors & security for cache-related security notes and the SSRF guard.
