Why is my cursor pagination merging duplicate items after a retry?

Hey everyone, I'm wiring up a feed loader and trying to keep retries and caching simple, but after a failed request the next page sometimes appends duplicate rows. I added a quick merge by id so the UI stays fast, but now I can't tell whether I'm breaking page order or cursor handling.

type Item = { id: string; createdAt: number };

let nextCursor: string | null = null;
const cache = new Map<string, Item[]>();
let items: Item[] = [];

async function loadPage(cursor: string | null, retry = false) {
  const key = cursor ?? "first";

  if (!retry && cache.has(key)) {
    items = [...items, ...cache.get(key)!];
    return;
  }

  const res = await fetch(`/api/feed?cursor=${cursor ?? ""}`);
  const data: { items: Item[]; nextCursor: string | null } = await res.json();

  cache.set(key, data.items);
  nextCursor = data.nextCursor;

  const seen = new Set(items.map(x => x.id));
  items = [...items, ...data.items.filter(x => !seen.has(x.id))];
}

What is the safest way to structure the cache and merge logic so retries do not create duplicates or silently scramble cursor order?

BayMax

@BayMax yeah, the bug is twofold: the cache-hit branch appends the cached page with no dedupe at all, and items is being built in response order, not cursor order. A retry can return a page you already have, and deduping against the current flat array hides the duplicate row but can still leave the overall sequence wrong.
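You can see the duplicate with just the cache-hit branch isolated (loadCached here is a made-up stand-in for that path, not a function from your snippet):

```typescript
// Minimal repro: the cache-hit branch appends the cached page again
// with nothing filtering out rows that are already present.
type Item = { id: string };

const cache = new Map<string, Item[]>();
let items: Item[] = [];

function loadCached(key: string, fetched: Item[]): void {
  if (cache.has(key)) {
    // naive append: no dedupe against what is already in items
    items = [...items, ...cache.get(key)!];
    return;
  }
  cache.set(key, fetched);
  items = [...items, ...fetched];
}

loadCached("first", [{ id: "a" }, { id: "b" }]); // miss: caches and appends
loadCached("first", [{ id: "a" }, { id: "b" }]); // hit: appends again
// items is now [a, b, a, b]
```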

I’d cache whole pages by the cursor that fetched them, then rebuild the flat list by walking the cursor chain from "first" every time:


type Item = { id: string; createdAt: number };
type Page = { items: Item[]; nextCursor: string | null };

const cache = new Map<string, Page>();
let items: Item[] = [];
let nextCursor: string | null = null;

function keyFor(cursor: string | null) {
  return cursor ?? "first";
}

function rebuildItems() {
  const out: Item[] = [];
  const seen = new Set<string>();
  let cursor: string | null = null;

  while (true) {
    const page = cache.get(keyFor(cursor));
    if (!page) break;

    for (const item of page.items) {
      if (!seen.has(item.id)) {
        seen.add(item.id);
        out.push(item);
      }
    }

    cursor = page.nextCursor;
    if (cursor == null) break;
  }

  return out;
}

async function loadPage(cursor: string | null, retry = false) {
  const key = keyFor(cursor);

  if (!retry && cache.has(key)) {
    nextCursor = cache.get(key)!.nextCursor;
    items = rebuildItems();
    return;
  }

  const res = await fetch(`/api/feed?cursor=${cursor ?? ""}`);
  if (!res.ok) throw new Error(`feed request failed: ${res.status}`);
  const data: Page = await res.json();

  // key was computed before the await, so the response is always stored
  // under the cursor we actually requested; a retry overwrites in place
  cache.set(key, data);
  nextCursor = data.nextCursor;
  items = rebuildItems();
}

That way retries are idempotent: page "c2" always replaces page "c2", and dedupe happens after walking first -> nextCursor -> nextCursor. I’d also track in-flight requests per cursor so a slower stale retry does not overwrite a newer page.
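A sketch of the in-flight part (guardedFetch, latest, and commit are names I'm making up here, not anything from the code above): tag each request with a token per cursor key and let only the newest request for that key commit its response:

```typescript
type Page = { items: { id: string }[]; nextCursor: string | null };

// latest token per cursor key; only the request still holding the
// current token is allowed to write its response
const latest = new Map<string, symbol>();

async function guardedFetch(
  key: string,
  doFetch: () => Promise<Page>,
  commit: (page: Page) => void,
): Promise<void> {
  const token = Symbol(key);
  latest.set(key, token);

  const page = await doFetch();

  // a newer request for this key started while we were waiting: drop ours
  if (latest.get(key) !== token) return;
  commit(page);
}
```

A slow retry that resolves after a newer request for the same cursor just returns without committing, so the newer page wins and the cache never goes backwards.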

Quelly