What's up everyone? I'm building a web app that queues analytics events in memory and flushes them with fetch when the network comes back. Memory climbs if someone stays offline for a while, and I'm also worried about duplicate sends if I retry too aggressively. Here's my current flush loop:
const queue = [];
let flushing = false;

export function enqueue(evt) {
  queue.push({ evt, tries: 0, t: Date.now() });
  flush();
}

async function flush() {
  if (flushing || !navigator.onLine) return;
  flushing = true;
  while (queue.length) {
    const item = queue[0];
    try {
      const res = await fetch('/api/events', {
        method: 'POST',
        body: JSON.stringify(item.evt),
        headers: { 'content-type': 'application/json' }
      });
      // fetch only rejects on network failure, so treat non-2xx as a retry case too
      if (!res.ok) throw new Error(`HTTP ${res.status}`);
      queue.shift(); // dequeue only after a confirmed success
    } catch (e) {
      item.tries++; // counted, but nothing caps it yet
      break; // stop and wait for the next enqueue (no backoff or 'online' listener yet)
    }
  }
  flushing = false;
}
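For context, the rough direction I've been sketching for the memory side is a bounded buffer: cap the queue length (drop-oldest) and evict anything past a TTL before flushing. `MAX_QUEUE` and `TTL_MS` are just placeholder knobs I made up, not tuned values:

```javascript
// Sketch of a bounded buffer with drop-oldest and TTL eviction.
// MAX_QUEUE and TTL_MS are placeholder constants, not tuned values.
const MAX_QUEUE = 500;
const TTL_MS = 10 * 60 * 1000; // drop events older than 10 minutes

function boundedPush(queue, evt, now = Date.now()) {
  queue.push({ evt, tries: 0, t: now });
  // Once over the cap, discard the oldest entries first.
  while (queue.length > MAX_QUEUE) queue.shift();
  return queue;
}

function evictExpired(queue, now = Date.now()) {
  // Keep only entries younger than the TTL.
  return queue.filter(item => now - item.t <= TTL_MS);
}
```

Drop-oldest feels right for analytics (recent events are usually worth more than stale ones), but I'm open to drop-newest if that's the saner default.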
What's a solid backpressure pattern here (max queue size, TTL, batching, persistence) that avoids the memory growth without dropping too much data or spamming retries?
Hari