So I am trying to read the Chat GPT API response from my NodeJS server. The response is piped from my backend to the client (in Angular).
const chatStream = await fetch("https://api.openai.com/v1/chat/completions", {
  method: "POST",
  body: JSON.stringify({
    model: process.env.CHAT_GPT_MODEL,
    messages,
    n: 1,
    max_tokens: parseInt(process.env.CHAT_GPT_TOKEN_LIMIT ?? "2000"),
    stream: true,
  }),
  headers: {
    "Content-Type": "application/json",
    Authorization: "Bearer " + process.env.CHAT_GPT_API_KEY,
  },
});

res.setHeader("Content-Type", "application/octet-stream");
res.setHeader(
  "Content-Disposition",
  'attachment; filename="generated_text.txt"'
);

// Pipe the GPT API response directly to the client's response object
chatStream.body.pipe(res);
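(Aside, not part of the original snippet: chatStream.body only has a .pipe method when fetch comes from a package such as node-fetch, whose body is a Node stream. With Node 18+'s built-in fetch, body is a web ReadableStream, and one way to bridge it, assuming an Express-style res, is sketched here.)

import { Readable } from "stream";

// Convert the web ReadableStream returned by the built-in fetch into a
// Node Readable so it can be piped into the Express response object.
Readable.fromWeb(chatStream.body as any).pipe(res);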
This is how I handle the response in my front end (Angular). This is my httpPostStream function:
const res = this.http
  .post(this.getUrl(url), body, {
    headers: {
      'Content-Type': 'application/json',
      // eslint-disable-next-line quote-props
      Authorization: 'Bearer ' + this.auth?.loginToken,
    },
    responseType: 'blob',
    observe: 'body',
  })
  .pipe(catchError(this.catchError.bind(this)));
And it is called like this:
const res = this.http
  .httpPostStream('/create-new-chat-gpt-room', { title })
  .subscribe((result: any) => {
    const reader = new FileReader();
    reader.onload = () => {
      const text = reader.result as string;
      const parts = text.split('\n'); // Split text into parts based on newline character
      for (const part of parts) {
        // Display each part as needed
        console.log(part);
      }
    };
    reader.readAsText(result);
  });
The problem is that console.log(part) only runs after the result has been fully returned. I want to log each part as soon as it arrives, part by part; I do not want to wait until the Blob is completed.
How can I do this?
Comments:
- use fetch: Provide first class HTTP streaming API as part of @angular/common/http package – traynor, Mar 21, 2023
- @traynor is this still not supported using the FetchBackend maybe (angular.io/api/common/http/HttpHandler)? – Juarrow, Sep 6, 2023
- Looks like it only works for responseType === 'text' (github.com/angular/angular/blob/…) – Juarrow, Sep 6, 2023
- @Juarrow good catch, I got some results with streaming text from the server, and with { observe: 'events', responseType: 'text', reportProgress: true } settings – traynor, Sep 7, 2023
- To use fetch with provideHttpClient you can do provideHttpClient(withFetch()). If I get the whole thing working I will provide a full answer. – Mark Lagendijk, Oct 2, 2023
1 Answer
This can be done with the normal HttpClient
of Angular. You need the following approach to implement this.
In your main.ts, where you have your bootstrapApplication call, use withFetch so that HttpClient uses the FetchBackend and fetch is used for all HTTP requests.
import { bootstrapApplication } from '@angular/platform-browser';
import { provideHttpClient, withFetch } from '@angular/common/http';
bootstrapApplication(AppComponent, {
  providers: [
    provideHttpClient(withFetch()),
    // ...
  ],
});
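If your application still bootstraps through an NgModule rather than bootstrapApplication, the same provider should, as far as I know, also work in the module's providers array. A minimal sketch (module name and component path are assumptions):

import { NgModule } from '@angular/core';
import { BrowserModule } from '@angular/platform-browser';
import { provideHttpClient, withFetch } from '@angular/common/http';
import { AppComponent } from './app.component'; // assumed path

@NgModule({
  imports: [BrowserModule],
  // provideHttpClient() can be listed in an NgModule's providers as well
  providers: [provideHttpClient(withFetch())],
  bootstrap: [AppComponent],
})
export class AppModule {}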
You can now use the HttpClient to do streaming responses, but you need to use these options:
{
  observe: "events",
  responseType: "text",
  reportProgress: true
}
You then get HttpEvent objects in your response stream, which you can check the type of and process.
import {
  HttpClient,
  HttpDownloadProgressEvent,
  HttpEvent,
  HttpEventType,
} from "@angular/common/http";

export default class ChatComponent {
  currentUserMessage = "";
  chatMessages: ChatMessage[] = [];
  loadingResponse = false;

  constructor(private http: HttpClient) {}

  sendMessage() {
    if (this.currentUserMessage.trim() === "") return;

    this.loadingResponse = true;
    this.chatMessages.push({ role: "user", content: this.currentUserMessage });
    this.currentUserMessage = "";

    const responseMessage = {
      role: "assistant",
      content: "…",
    };
    this.chatMessages.push(responseMessage);

    this.http
      .post("/api/chat-gpt", this.chatMessages, {
        observe: "events",
        responseType: "text",
        reportProgress: true,
      })
      .subscribe({
        next: (event: HttpEvent<string>) => {
          if (event.type === HttpEventType.DownloadProgress) {
            // partialText contains everything received so far
            responseMessage.content =
              (event as HttpDownloadProgressEvent).partialText + "…";
          } else if (event.type === HttpEventType.Response) {
            responseMessage.content = event.body ?? "";
            this.loadingResponse = false;
          }
        },
        error: () => {
          this.loadingResponse = false;
        },
      });
  }
}
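For completeness, a note on the server side (this is not part of the original answer, and the endpoint shape is an assumption): the OpenAI API streams Server-Sent Events, so if the backend pipes the upstream body through unchanged, partialText on the client will contain the raw "data: {...}" framing. A rough sketch of an Express handler for the assumed /api/chat-gpt route that forwards only the extracted text deltas could look like this:

// Hypothetical /api/chat-gpt handler; assumes app.use(express.json()) so that
// req.body is the ChatMessage[] array posted by the component above.
app.post("/api/chat-gpt", async (req, res) => {
  const upstream = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: "Bearer " + process.env.CHAT_GPT_API_KEY,
    },
    body: JSON.stringify({
      model: process.env.CHAT_GPT_MODEL,
      messages: req.body,
      stream: true,
    }),
  });

  if (!upstream.body) {
    res.status(502).end();
    return;
  }

  res.setHeader("Content-Type", "text/plain; charset=utf-8");

  let buffer = "";
  for await (const chunk of upstream.body) {
    buffer += Buffer.from(chunk).toString("utf8");
    // SSE events are separated by blank lines; keep any trailing partial event in the buffer
    const events = buffer.split("\n\n");
    buffer = events.pop() ?? "";
    for (const event of events) {
      const data = event.replace(/^data: /, "").trim();
      if (!data || data === "[DONE]") continue;
      const delta = JSON.parse(data).choices?.[0]?.delta?.content;
      if (delta) res.write(delta); // flush each text fragment as soon as it arrives
    }
  }
  res.end();
});

With a plain-text body like this, the DownloadProgress events on the Angular side deliver the growing assistant message directly, without any SSE parsing in the component.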