Hey everyone, I’m wiring up a small fetch wrapper for a UI that does optimistic updates, and I keep hitting a weird failure mode: retries + cache hits cause the same “success” to apply twice (duplicate items + flicker). I want retries for flaky networks, but I also don’t want to leak memory by keeping lots of in-flight promises around. Here’s my current wrapper, with a stripped-down call site after it:
// Module-level state: `cache` maps url -> { v: value, t: timestamp },
// `inflight` maps url -> promise so concurrent callers share one request.
const cache = new Map();
const inflight = new Map();

export async function getJSON(url, { retries = 2, ttl = 5000 } = {}) {
  const now = Date.now();
  const cached = cache.get(url);
  // Fresh cache hit: serve it without touching the network.
  if (cached && now - cached.t < ttl) return cached.v;
  // Someone is already fetching this url: piggyback on their promise.
  if (inflight.has(url)) return inflight.get(url);
  const p = (async () => {
    let lastErr;
    // `retries` extra attempts after the first, i.e. retries + 1 tries total.
    for (let i = 0; i <= retries; i++) {
      try {
        const res = await fetch(url, { cache: "no-store" });
        if (!res.ok) throw new Error(`HTTP ${res.status}`);
        const data = await res.json();
        cache.set(url, { v: data, t: Date.now() });
        return data;
      } catch (e) {
        // Any failure (network error or non-2xx, 4xx included) falls
        // through to another attempt.
        lastErr = e;
      }
    }
    throw lastErr;
  })().finally(() => inflight.delete(url)); // always release the inflight slot
  inflight.set(url, p);
  return p;
}
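
For context, here’s a stripped-down version of the call site where the duplicate shows up. The names (confirmAdd, state.items) are invented for illustration, but the shape matches my app: the success path appends to state, and when two overlapping triggers end up sharing one result, both append.

const state = { items: [] };

async function confirmAdd(url) {
  const saved = await getJSON(url);
  // Two overlapping confirmAdd(url) calls share one inflight promise (or the
  // second one lands on a fresh cache entry), both resolve with the same
  // `saved`, and both push: duplicate item plus a visible flicker.
  state.items.push(saved);
}

// e.g. a click handler and a focus-revalidate firing close together:
confirmAdd("/api/items/42");
confirmAdd("/api/items/42");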
What’s a solid pattern to ensure only one “result” ever commits to state per request key even when cache + retries + shared inflight promises overlap?
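
The closest I’ve gotten is a per-key commit token: every logical request bumps a counter for its key, and a result only commits if its token is still the newest one issued for that key. Rough sketch (nextToken / fetchAndCommit are my own naming, nothing from a library):

const commitVersions = new Map();

// Issue a fresh token for a key; any older token for that key is now stale.
function nextToken(key) {
  const v = (commitVersions.get(key) ?? 0) + 1;
  commitVersions.set(key, v);
  return v;
}

async function fetchAndCommit(key, url, apply) {
  const token = nextToken(key);
  const data = await getJSON(url);
  // Only the newest issuer for this key may commit; stale callers no-op,
  // so overlapping retries / cache hits / shared promises commit at most once.
  if (commitVersions.get(key) === token) apply(data);
}

It seems right on paper, but I haven’t convinced myself it covers every interleaving, so better patterns are very welcome.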
BayMax