How do you prevent cache retries from duplicating UI state updates?

Hey everyone, I’m wiring up a small fetch wrapper for a UI that does optimistic updates, and I keep hitting a weird failure mode where retries + cache hits cause the same “success” to apply twice (duplicate items + flicker). I want retries for flaky networks, but I also don’t want to leak memory by keeping tons of in-flight promises around.

const cache = new Map();    // url -> { v: data, t: timestamp }
const inflight = new Map(); // url -> shared in-flight promise

export async function getJSON(url, { retries = 2, ttl = 5000 } = {}) {
  const now = Date.now();
  const cached = cache.get(url);
  if (cached && now - cached.t < ttl) return cached.v;

  // share one promise per url so concurrent callers don't refetch
  if (inflight.has(url)) return inflight.get(url);

  const p = (async () => {
    let lastErr;
    for (let i = 0; i <= retries; i++) {
      try {
        const res = await fetch(url, { cache: "no-store" });
        if (!res.ok) throw new Error(`HTTP ${res.status}`);
        const data = await res.json();
        cache.set(url, { v: data, t: Date.now() });
        return data;
      } catch (e) {
        lastErr = e; // keep retrying; rethrow the last error if all attempts fail
      }
    }
    throw lastErr;
  })().finally(() => inflight.delete(url)); // always drop the entry so it can't leak

  inflight.set(url, p);
  return p;
}

What’s a solid pattern to ensure only one “result” ever commits to state per request key even when cache + retries + shared inflight promises overlap?

BayMax

your cache/inflight dedupe solves one problem: not firing duplicate requests. the UI commit needs a separate guard.

i’d use a per-key version and only commit if it’s still current. that keeps retries from double-applying even when the promise gets reused or a cached value comes back fast.

const version = new Map(); // key -> latest request version for that key

function nextVersion(key) {
  const v = (version.get(key) ?? 0) + 1;
  version.set(key, v);
  return v;
}

// grab a version *before* kicking off the request;
// commit only if it's still the latest when the result arrives
const v = nextVersion(url);

getJSON(url).then(data => {
  if (version.get(url) !== v) return; // a newer request owns this key now
  setState(data);
});

that’s usually enough for optimistic UI stuff. stale result just gets ignored, no extra promise bookkeeping.
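to see the guard in isolation, here's a minimal runnable sketch. `commitIfCurrent` and `commits` are hypothetical stand-ins for your setState call, and timers fake an old slow request overlapped by a newer fast one:

```javascript
const version = new Map(); // key -> latest request version

function nextVersion(key) {
  const v = (version.get(key) ?? 0) + 1;
  version.set(key, v);
  return v;
}

const commits = []; // stand-in for UI state updates
function commitIfCurrent(key, v, data) {
  if (version.get(key) !== v) return; // a newer request superseded this one
  commits.push(data);
}

const key = "/api/items";

// old request: resolves late
const slow = new Promise(res => setTimeout(() => res("old"), 50));
const vSlow = nextVersion(key);

// new request: resolves immediately, and bumping the version makes vSlow stale
const fast = Promise.resolve("new");
const vFast = nextVersion(key);

Promise.all([
  slow.then(d => commitIfCurrent(key, vSlow, d)),
  fast.then(d => commitIfCurrent(key, vFast, d)),
]).then(() => {
  console.log(commits); // only the newer result commits
});
```

the slow promise still resolves, but its version check fails, so it never touches state. no cancellation or promise bookkeeping needed.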