I was recently trying to stream some NDJSON to a service worker and also into a comment feed.
It was surprisingly hard to find a plain NDJSON reader: everything I could find was a reader → new ReadableStream pipe, i.e. something that wraps the reader and returns a new readable stream.
There was Jake Archibald's XMLHttpRequest → NDJSON example, but I wanted to use fetch, so I redid it with fetch and an async generator (I like generators).
[streaming-html/xhr-ndjson.js at master · jakearchibald/streaming-html · GitHub](https://github.com/jakearchibald/streaming-html/blob/master/xhr-ndjson.js)
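For context, the "returns a new readable stream" style I kept running into looks roughly like this (a sketch of the pattern, not any particular library's code):

```js
// Sketch of the reader → new ReadableStream pipe pattern:
// wraps the body reader and hands back a stream of parsed objects.
function ndjsonStream(body) {
  const reader = body.getReader();
  const decoder = new TextDecoder();
  let buff = '';
  return new ReadableStream({
    async pull(controller) {
      const { done, value } = await reader.read();
      if (done) {
        buff += decoder.decode(); // flush the decoder
        if (buff.trim()) controller.enqueue(JSON.parse(buff));
        controller.close();
        return;
      }
      buff += decoder.decode(value, { stream: true });
      const lines = buff.split('\n');
      buff = lines.pop(); // keep any partial trailing line
      for (const line of lines) {
        if (line) controller.enqueue(JSON.parse(line));
      }
    }
  });
}
```

That's fine if you want to pipe the objects onward, but I just wanted to parse and render them.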
The reader seems to work fine. After the response arrives, the whole stream → decode → JSON.parse → template → insert run takes 13–21 ms (performance.now()) for 1000 basic, separate JSON objects (each line is a new JSON object):
{"id":1,"name":"Alice"}
{"id":2,"name":"Bob"}
{"id":3,"name":"Carol"}
The browser then takes 100–300 ms to lay out and paint the 1000 basic objects.
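For what it's worth, a double requestAnimationFrame is one rough way to observe that layout/paint cost (an approximation, and the selector and markup below are just placeholders):

```js
// Rough approximation of paint cost: the second rAF callback runs after
// the browser has produced the frame that follows the DOM insert, so the
// delta roughly covers layout + paint for the new nodes.
// ('#comment-feed' and the markup are placeholders for illustration.)
const el = document.querySelector('#comment-feed');
const t0 = performance.now();
for (let i = 0; i < 1000; i++) {
  el.insertAdjacentHTML('beforeend', `<h3>id: ${i}, name: test</h3>`);
}
requestAnimationFrame(() => {
  requestAnimationFrame(() => {
    console.log(`layout/paint ≈ ${Math.round(performance.now() - t0)} ms`);
  });
});
```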
If anyone is in the same boat and needs to read an NDJSON stream and insert it into the DOM, this will work:
```js
const readNDJSONStream = async function(url, element) {
  const decoder = new TextDecoder();
  const el = document.querySelector(element);
  let buff = ''; // carries any partial line between chunks

  const render = function(obj) {
    el.insertAdjacentHTML('beforeend', `<h3>id: ${obj.id}, name: ${obj.name}</h3>`);
  };

  const response = await fetch(url);
  const reader = response.body.getReader();

  // Async generator: pull raw Uint8Array chunks off the reader until the stream ends.
  const pump = async function*() {
    while (true) {
      const { done, value } = await reader.read();
      if (done) return;
      yield value;
    }
  };

  for await (const chunk of pump()) {
    // {stream: true} so multi-byte characters split across chunks decode correctly
    buff += decoder.decode(chunk, { stream: true });
    const lines = buff.split('\n');
    buff = lines.pop(); // the last piece may be an incomplete line; keep it for the next chunk
    for (const line of lines) {
      if (!line) continue; // skip blank lines
      try {
        render(JSON.parse(line));
      } catch (error) {
        throw new Error(`threw at ${line}`);
      }
    }
  }

  // flush the decoder and anything still buffered after the final chunk
  buff += decoder.decode();
  if (buff.trim()) render(JSON.parse(buff));
};
```
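Usage is just (with a hypothetical endpoint and selector):

```js
// '/api/comments.ndjson' and '#comment-feed' are placeholders for illustration
readNDJSONStream('/api/comments.ndjson', '#comment-feed')
  .then(() => console.log('stream finished'))
  .catch(err => console.error(err));
```

Making the function itself async avoids hand-rolling a new Promise; a throw inside the loop rejects the returned promise the same way reject did.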
Obviously this is all inspired by Jake and Thomas Steiner.
If anyone has suggestions on how to make this better while keeping it performant, that would be much appreciated.