
Retry & memory cache

Retry — createRetryMiddleware

Factory returns a middleware that calls next() repeatedly on retryable failures with exponential backoff and jitter.
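The exact jitter strategy is an implementation detail, but the delay schedule can be pictured with a common "full jitter" formulation (a sketch using the defaults from the option table below, not the middleware's actual code):

```ts
// Sketch of exponential backoff with "full jitter" (assumed strategy;
// the middleware's exact jitter formula may differ).
// Defaults mirror the option table: base 300 ms, factor 2, cap 30 s.
function backoffDelayMs(
  attempt: number, // 1 = first retry
  baseDelayMs = 300,
  factor = 2,
  maxDelayMs = 30_000,
  random: () => number = Math.random,
): number {
  const exp = Math.min(baseDelayMs * factor ** (attempt - 1), maxDelayMs);
  return random() * exp; // uniform in [0, exp)
}
```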

Resolved options (defaults + ctx.request.retry)

| Option | Default | Description |
| --- | --- | --- |
| `maxAttempts` | `3` | Total tries, including the first. |
| `baseDelayMs` | `300` | Base delay before the first retry. |
| `maxDelayMs` | `30000` | Cap on the backoff delay. |
| `factor` | `2` | Exponential multiplier. |
| `retryOnStatus` | `[408, 429, 500, 502, 503, 504]` | HTTP statuses to retry when `validateStatus` failed. |
| `retryOnNetworkError` | `true` | Retry `ERR_NETWORK` / `ERR_PARSE` when the other rules allow. |
| `retryNonIdempotentMethods` | `false` | If `false`, only GET, HEAD, OPTIONS, and TRACE are retried on network/parse/retryable-status failures. |
| `shouldRetry` | — | Optional `(error, attempt) => boolean`, consulted after the built-in checks. |
| `timeoutTotalMs` | — | Monotonic budget for the entire retry sequence, including backoff. Uses `performance.now()` when available. On expiry: `OpenFetchError` with code `ERR_RETRY_TIMEOUT`. |
| `enforceTotalTimeout` | `true` | When `true` (the default), each attempt merges a deadline into `signal` so an in-flight fetch aborts when the total budget is exhausted (a user abort still wins). When `false`, the budget is checked between attempts only, so a single slow response can exceed `timeoutTotalMs`. |
| `timeoutPerAttemptMs` | — | If set, overrides `request.timeout` for each attempt inside the retry middleware (per-attempt fetch timeout). |

Canceled requests (ERR_CANCELED) are not retried. ERR_RETRY_TIMEOUT is not retried.
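The built-in retry decision described by the table can be sketched as follows (an illustration of the documented rules, not the middleware's actual code):

```ts
// Sketch of the built-in retryability check (assumed shape; the
// middleware's real implementation may differ).
const IDEMPOTENT = new Set(["GET", "HEAD", "OPTIONS", "TRACE"]);

function isRetryable(
  method: string,
  status: number | undefined, // undefined => network/parse error
  opts = {
    retryOnStatus: [408, 429, 500, 502, 503, 504],
    retryOnNetworkError: true,
    retryNonIdempotentMethods: false,
  },
): boolean {
  // Mutating methods are excluded unless explicitly opted in.
  if (!opts.retryNonIdempotentMethods && !IDEMPOTENT.has(method.toUpperCase())) {
    return false;
  }
  if (status === undefined) return opts.retryOnNetworkError;
  return opts.retryOnStatus.includes(status);
}
```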

Plugin shortcut

```ts
import { retry } from "@hamdymohamedak/openfetch/plugins";

client.use(retry({ attempts: 5, timeoutTotalMs: 20_000 }));
```

attempts is an alias for maxAttempts. See Plugins & fluent API.

POST / PUT / side effects

By default, mutating methods are not retried for network or retryable HTTP errors to avoid duplicate side effects. Opt in with:

```ts
retry: { retryNonIdempotentMethods: true }
```

Set this in the client's default options or on an individual request.

Automatic Idempotency-Key for POST

When retryNonIdempotentMethods is true, maxAttempts > 1, and the method is POST without an existing idempotency header, the middleware adds a stable Idempotency-Key (unless autoIdempotencyKey: false). Retries then share the same key for server-side deduplication (Stripe-style APIs).

The package exports `generateIdempotencyKey`, `hasIdempotencyKeyHeader`, and `ensureIdempotencyKeyHeader` if you want to build your own logic.

Memory cache — MemoryCacheStore + createCacheMiddleware

Store

```ts
const store = new MemoryCacheStore({ maxEntries: 500 });
```
  • maxEntries — when the store is full, the oldest entries are evicted first (Map insertion order). Default: 500.
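The eviction rule can be pictured like this (a sketch of the documented behavior, not MemoryCacheStore's actual implementation):

```ts
// FIFO eviction by Map insertion order, as described above (a sketch,
// not MemoryCacheStore's actual code).
class TinyStore<V> {
  private entries = new Map<string, V>();
  constructor(private maxEntries = 500) {}

  set(key: string, value: V): void {
    this.entries.delete(key); // re-inserting moves the key to the end (an assumption)
    this.entries.set(key, value);
    while (this.entries.size > this.maxEntries) {
      // Map iteration order is insertion order, so this is the oldest key.
      const oldest = this.entries.keys().next().value as string;
      this.entries.delete(oldest);
    }
  }

  get(key: string): V | undefined {
    return this.entries.get(key);
  }
}
```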

Middleware

```ts
createCacheMiddleware(store, {
  ttlMs: 60_000,
  staleWhileRevalidateMs: 0,
  methods: ["GET", "HEAD"],
  key: ({ request, url }) => `…`,
  varyHeaderNames: ["authorization", "cookie"],
});
```

Behavior:

  • Only the configured methods are cached (default: GET, HEAD).
  • Skips read/write when memoryCache.skip is true on the request (also used internally for background refresh).
  • Per-request memoryCache.ttlMs and memoryCache.staleWhileRevalidateMs override factory defaults when set.

Cache key

Default key: METHOD fullUrl (full URL after buildURL).

For authenticated or personalized responses, include headers in the key so entries do not leak across users:

```ts
varyHeaderNames: ["authorization", "cookie"]
```

Or use a custom key and appendCacheKeyVaryHeaders(baseKey, headers, names) from the package.
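One plausible shape for such a vary-aware key (an assumption for illustration — the exported appendCacheKeyVaryHeaders may format its keys differently):

```ts
// Sketch of appending vary headers to a cache key so entries are
// partitioned per user (assumed format; appendCacheKeyVaryHeaders may differ).
function appendVaryHeaders(
  baseKey: string,
  headers: Record<string, string>,
  names: string[],
): string {
  // Header names are case-insensitive, so normalize before lookup.
  const lower = Object.fromEntries(
    Object.entries(headers).map(([k, v]) => [k.toLowerCase(), v]),
  );
  const parts = names.map((n) => `${n.toLowerCase()}=${lower[n.toLowerCase()] ?? ""}`);
  return `${baseKey}|${parts.join("|")}`;
}
```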

Stale-while-revalidate

If staleWhileRevalidateMs > 0, then after the TTL expires the middleware can still serve the stale entry while triggering a background dispatch (with memoryCache.skip: true) to refresh it. If the refresh fails, the stale entry keeps being served until expireAt.
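The freshness windows implied by ttlMs and staleWhileRevalidateMs can be sketched as (an illustration of the described semantics, not the middleware's code):

```ts
// Freshness states implied by ttlMs and staleWhileRevalidateMs (a sketch
// of the described semantics, not the middleware's actual implementation).
type Freshness = "fresh" | "stale" | "expired";

function freshness(
  now: number,
  storedAt: number,
  ttlMs: number,
  staleWhileRevalidateMs: number,
): Freshness {
  const age = now - storedAt;
  if (age < ttlMs) return "fresh"; // serve from cache
  if (age < ttlMs + staleWhileRevalidateMs) return "stale"; // serve + background refresh
  return "expired"; // fetch synchronously
}
```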

Implementation note: background refresh calls dispatch directly, not the full client use() stack. Custom middleware on the instance (logging, auth refresh) does not run for that background fetch. Shared behavior should live in transformRequest / defaults or a documented workaround. See Architecture & internals.


MIT · @hamdymohamedak/openfetch