
node.js - Returning a ReadableStream from NextJs (express) via the ChatGPT API stream

Reposted · Author: 行者123 · Updated: 2023-12-02 22:47:57

I'm very new to Node.js, but I'm experimenting with some of the new ChatGPT features.

I have some code that takes a subject and generates a joke. It uses the streaming version of https://api.openai.com/v1/chat/completions.

I can see the stream coming back and yielding each part, but the client isn't receiving the stream correctly.

The console.log({done, value}); on the client is only hit twice, yet the stream has many more chunks when I debug the server:

// the value decoded here is '{}'
home-page.tsx:46 {done: false, value: Uint8Array(2)}
home-page.tsx:46 {done: true, value: undefined}

What am I missing to wire this stream up correctly from the server?

OpenAI helper

import { createParser, ParsedEvent, ReconnectInterval } from "eventsource-parser";

export const config = {
  runtime: "edge",
};

export async function OpenAIStream(payload) {
  const encoder = new TextEncoder();
  const decoder = new TextDecoder();

  let counter = 0;

  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    method: "POST",
    body: JSON.stringify(payload),
  });

  const stream = new ReadableStream({
    async start(controller) {
      function onParse(event: ParsedEvent | ReconnectInterval) {
        if (event.type === "event") {
          const data = event.data;
          // "[DONE]" is OpenAI's sentinel marking the end of the stream
          if (data === "[DONE]") {
            controller.close();
            return;
          }
          try {
            const json = JSON.parse(data);
            const text = json.choices[0].delta?.content || "";
            // skip the leading newlines emitted before the first real tokens
            if (counter < 2 && (text.match(/\n/) || []).length) {
              return;
            }
            console.log(text);
            const queue = encoder.encode(text);
            controller.enqueue(queue);
            counter++;
          } catch (e) {
            controller.error(e);
          }
        }
      }

      // the SSE response from OpenAI may be fragmented into multiple chunks;
      // the parser reassembles them and invokes onParse once per SSE event
      const parser = createParser(onParse);

      // https://web.dev/streams/#asynchronous-iteration
      for await (const chunk of res.body as any) {
        parser.feed(decoder.decode(chunk));
      }
    },
  });

  return stream;
}

Nest Controller

import { Body, Controller, Post } from '@nestjs/common';
import { AppService } from './app.service';
import { OpenAIStream } from './helpers/openai';
import { ChatCompletionRequestMessage } from 'openai';

const MAX_RESPONSE_TOKENS = 200; // 1024;

@Controller()
export class AppController {
  constructor(private readonly appService: AppService) { }

  @Post("joke")
  async generate(@Body() message: JokeTemplate) {
    const messages: Array<ChatCompletionRequestMessage> = [
      { "role": "system", "content": "You are a joke engine." },
      { "role": "user", "content": `Tell me a joke about ${message.subject}` },
    ];

    const payload = {
      model: 'gpt-3.5-turbo',
      max_tokens: MAX_RESPONSE_TOKENS,
      temperature: 0,
      messages,
      stream: true,
    };

    const stream = await OpenAIStream(payload);
    return new Response(stream);
  }
}

interface JokeTemplate {
  subject: string;
}

Client-side button trigger that calls the server

const triggerGPTRequest = async (e: any) => {
  setGptResponse('');
  setLoading(true);

  const response = await fetch("/api/joke", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ 'subject': promptText }),
  });

  if (!response.ok) {
    throw new Error(response.statusText);
  }

  const data = response.body;
  if (!data) {
    return;
  }
  const reader = data.getReader();
  const decoder = new TextDecoder();
  let done = false;

  while (!done) {
    const { value, done: doneReading } = await reader.read();
    done = doneReading;
    const chunkValue = decoder.decode(value);
    console.log({ done, value });
    setGptResponse((prev) => prev + chunkValue);
  }

  setLoading(false);
};
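As an aside, calling decoder.decode(value) without { stream: true } can corrupt a multi-byte character that happens to be split across two chunks. A minimal, standalone sketch of the difference (not tied to the code above):

```typescript
// "é" (U+00E9) encodes to two UTF-8 bytes. If those bytes arrive in
// separate chunks, the decoder must be told more input is coming
// via { stream: true }, or the first half decodes to U+FFFD.
const bytes = new TextEncoder().encode("é"); // Uint8Array [0xc3, 0xa9]
const decoder = new TextDecoder();

const streamed =
  decoder.decode(bytes.slice(0, 1), { stream: true }) + // holds the partial byte
  decoder.decode(bytes.slice(1));                       // completes the character

console.log(streamed); // "é"
```

In the loop above this would mean `decoder.decode(value, { stream: true })` for every chunk except the final one.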

Best Answer

Nest doesn't know about fetch's Response type; it just treats it as a class and tries to serialize it, which causes the error. Try returning a StreamableFile instance instead, which Nest handles correctly by calling stream.pipe(res).

import { Body, Controller, Post, StreamableFile } from '@nestjs/common';
import { AppService } from './app.service';
import { OpenAIStream } from './helpers/openai';
import { ChatCompletionRequestMessage } from 'openai';

const MAX_RESPONSE_TOKENS = 200; // 1024;

@Controller()
export class AppController {
  constructor(private readonly appService: AppService) { }

  @Post("joke")
  async generate(@Body() message: JokeTemplate) {
    const messages: Array<ChatCompletionRequestMessage> = [
      { "role": "system", "content": "You are a joke engine." },
      { "role": "user", "content": `Tell me a joke about ${message.subject}` },
    ];

    const payload = {
      model: 'gpt-3.5-turbo',
      max_tokens: MAX_RESPONSE_TOKENS,
      temperature: 0,
      messages,
      stream: true,
    };

    const stream = await OpenAIStream(payload);
    return new StreamableFile(stream);
  }
}

interface JokeTemplate {
  subject: string;
}
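One caveat worth noting: StreamableFile pipes a Node.js Readable (or a Buffer), while the helper above builds a web ReadableStream. If Nest rejects the stream type, the two flavours can be bridged with Readable.fromWeb (available in Node.js 17+). A minimal sketch, with the web stream stubbed in place of the OpenAI response:

```typescript
import { Readable } from "node:stream";

// Stand-in for what OpenAIStream returns: a web ReadableStream of bytes.
const webStream = new ReadableStream({
  start(controller) {
    controller.enqueue(new TextEncoder().encode("hello"));
    controller.close();
  },
});

// Convert to a Node.js Readable so StreamableFile can pipe it.
const nodeStream = Readable.fromWeb(webStream as any);
// ...then in the controller: return new StreamableFile(nodeStream);
```

The `as any` cast papers over the mismatch between the DOM and node:stream/web ReadableStream typings; at runtime the conversion is direct.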

For node.js - Returning a ReadableStream from NextJs (express) via the ChatGPT API stream, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/76582684/
