
javascript - Parse streamed chunk data into JSON - Stack Overflow


Hi, I'm trying to display data chunk by chunk, since that is how I receive it.

For example, let us assume the full data looks like this:

data: {
user: [
    {
        name: 'a',
        bankAccounts: ['123', '234', '567'],
        address: ['some address', 'some other address', 'some more addres']
    },
    {
        name: 'b',
        bankAccounts: ['1233', '2334', '5637'],
        address: ['some address1', 'some other address1', 'some more addres1']
    },
    {
        name: 'c',
        bankAccounts: ['123355', '233455', '563700'],
        address: ['some address12', 'some other address12', 'some more addres12']
    },
]    
}

but the chunks I'm receiving look something like this:

1st chunk: "data: user: [ {name: a"
2nd chunk: "bankAccounts: ['123', '234', '567'],"
3rd chunk: "address: ['some address', 'some other address', 'some more addres']"

and so on.

I'm receiving the chunked data in such a way that it can't be converted into JSON, since each chunk is incomplete.

How can I stream this data to the UI?

Any ideas?

My code for fetching the streaming data:

fetch('some url which stream data')
    // Retrieve its body as a ReadableStream
    .then(response => {
        const reader = response.body.getReader();
        const decoder = new TextDecoder();
        return new ReadableStream({
            start(controller) {
                return pump();
                function pump() {
                    return reader.read().then(({ done, value }) => {
                        const newData = decoder.decode(value, { stream: !done });
                        console.log(newData);
                        // When no more data needs to be consumed, close the stream
                        if (done) {
                            controller.close();
                            return;
                        }
                        // Enqueue the next data chunk into our target stream
                        controller.enqueue(value);
                        return pump();
                    });
                }
            }
        });
    })
    .then(stream => new Response(stream))
    .then(response => {
        console.log('response', response);
    });

asked Aug 13, 2019 at 10:10 by shaan · edited Dec 6, 2023 at 13:23 by ᄂ ᄀ

3 Answers

I know that generators are not very commonly used, but I feel they would be perfect for streaming the data in this task:

async function* streamAsyncIterator(stream) {
  const reader = stream.getReader();
  const decoder = new TextDecoder();
  try {
    while (true) {
      const { done, value } = await reader.read();
      if (done) break;
      yield decoder.decode(value, { stream: true });
    }
    // Flush any incomplete multi-byte sequence still buffered by the decoder
    const tail = decoder.decode();
    if (tail) yield tail;
  } finally {
    reader.releaseLock();
  }
}

fetch('https://httpbin.org/stream/1')
    .then(async response => {
        let str="";
        for await (const value of streamAsyncIterator(response.body))
            str+=value;
        return JSON.parse(str);
    })
    .then(response => {
        console.log('response', response)
    })
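As an aside, the reason the generator above passes `{ stream: true }` to `TextDecoder.decode` is that a multi-byte UTF-8 character can be split across two network chunks. A small self-contained demonstration (no assumptions beyond the standard `TextEncoder`/`TextDecoder` APIs):

```javascript
// Demonstrates why `{ stream: true }` matters: a multi-byte UTF-8
// character split across two chunks decodes correctly only when the
// decoder is allowed to buffer the partial byte sequence.
const bytes = new TextEncoder().encode('é'); // two bytes: 0xC3 0xA9

// Decoding each half with a fresh decoder yields replacement characters:
const broken =
  new TextDecoder().decode(bytes.slice(0, 1)) +
  new TextDecoder().decode(bytes.slice(1));

// Streaming mode carries the partial byte over to the next call:
const decoder = new TextDecoder();
const streamed =
  decoder.decode(bytes.slice(0, 1), { stream: true }) +
  decoder.decode(bytes.slice(1));

console.log(broken, streamed); // the first is garbled, the second is "é"
```

This is exactly the failure mode you would hit if you called `JSON.parse` (or even just concatenated text) on chunks decoded independently.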

However, it seems what you want is to parse partially complete JSON, which can be achieved in a variety of ways, for instance by using the npm library partial-json-parser:

import partialParse from 'partial-json-parser';

fetch('https://httpbin.org/stream/1')
    .then(async response => {
        let str="";
        for await (const value of streamAsyncIterator(response.body)){
            str+=value;
            functionUpdatingYourUi(partialParse(str));
        }
        return JSON.parse(str);
    })
    .then(response => {
        console.log('response', response)
    })
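If pulling in a dependency is overkill, a cruder sketch of the same idea is to attempt a full `JSON.parse` after every chunk and simply ignore failures until the document completes (here `updateUi` is a hypothetical callback standing in for your rendering code):

```javascript
// Accumulate chunks in a buffer and try to parse after each one.
// Until the JSON document is complete, JSON.parse throws and we
// keep buffering; once it succeeds, the UI callback fires.
function makeChunkCollector(updateUi) {
  let buffer = '';
  return function onChunk(chunk) {
    buffer += chunk;
    try {
      updateUi(JSON.parse(buffer)); // succeeds only once the JSON is whole
    } catch (e) {
      // Incomplete JSON so far -- wait for more chunks.
    }
  };
}
```

Unlike partial-json-parser this only updates the UI once at the end, so it trades progressive rendering for zero dependencies.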

You can pass a string (starting with an empty string) into your pump function and keep appending each chunk to it. At the end, when terminating the recursion, return the parsed data.


fetch('some url which stream data')
    // Retrieve its body as ReadableStream
    .then(response => {
        const reader = response.body.getReader();
        let decoder = new TextDecoder();
        return new ReadableStream({
            start(controller) {
                return pump('');
                function pump(str) {
                    return reader.read().then(({ done, value }) => {
                        // When no more data needs to be consumed, close the stream
                        str += decoder.decode(value, { stream: !done });
                        console.log(str);
                        if (done) {
                            controller.close();
                            return JSON.parse(str);
                        }
                        // Enqueue the next data chunk into our target stream
                        controller.enqueue(value);
                        return pump(str);
                    });
                }
            }
        })
    })
    .then(stream => new Response(stream))
    .then(response => {
        console.log('response', response)
    })

See this thread for a more complete discussion and more complete examples from @Damian Nadales. If you are expecting your chunks to be complete JSON, which is not guaranteed at all, you may decode your chunked value (of type Uint8Array) into UTF-8 using TextDecoder.decode, then parse the JSON using JSON.parse. E.g.,

var num = JSON.parse(
      new TextDecoder("utf-8").decode(result.value)
);
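If you control the server, the usual way to make "one chunk = one JSON value" reliable is to emit newline-delimited JSON (NDJSON: one complete object per line) and split the decoded text on newlines, keeping any partial trailing line until the next chunk. A minimal sketch:

```javascript
// Buffers decoded text and parses each complete newline-terminated
// line as JSON. A partial trailing line stays in the buffer until
// the next push() (or until flush() is called at end of stream).
function makeNdjsonParser(onObject) {
  let buffer = '';
  return {
    push(text) {
      buffer += text;
      const lines = buffer.split('\n');
      buffer = lines.pop(); // the last element may be an incomplete line
      for (const line of lines) {
        if (line.trim()) onObject(JSON.parse(line));
      }
    },
    flush() {
      if (buffer.trim()) onObject(JSON.parse(buffer));
      buffer = '';
    },
  };
}
```

Feed it the strings produced by `TextDecoder.decode(value, { stream: true })` and it will surface objects as soon as their closing newline arrives, regardless of where the HTTP chunk boundaries fall.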