Streaming Avatar Response Using OpenAI LLM
2 months ago by Arthur
Hi,
I'm trying to generate a response with an OpenAI LLM and send it to HeyGen. Can I send the OpenAI stream result word by word to HeyGen, or do I have to concatenate the whole response (or skip streaming entirely) and send the complete result to HeyGen?
I just need to cut down the waiting time before the avatar responds to the user's question.
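One common middle ground between word-by-word and full-response is buffering the stream into complete sentences and forwarding each sentence as it finishes. Below is a minimal sketch of that idea; `send_to_heygen` is a placeholder (check HeyGen's streaming-avatar docs for the actual task/speak endpoint), and the OpenAI call is shown only in a comment, with a simulated token stream used instead:

```python
# Sketch: buffer an LLM token stream into sentences before forwarding
# each one to the avatar, so speech can start before the full answer
# is generated. send_to_heygen is a hypothetical placeholder.
import re

def buffer_sentences(token_stream):
    """Accumulate streamed tokens and yield complete sentences."""
    buf = ""
    for token in token_stream:
        buf += token
        # Emit a sentence once we see terminal punctuation + whitespace.
        while True:
            m = re.search(r'[.!?]["\')\]]*\s', buf)
            if not m:
                break
            yield buf[:m.end()].strip()
            buf = buf[m.end():]
    if buf.strip():  # flush whatever remains at end of stream
        yield buf.strip()

# With the real OpenAI client the tokens would come from e.g.:
#   stream = client.chat.completions.create(model=..., messages=..., stream=True)
#   tokens = (c.choices[0].delta.content or "" for c in stream)
# Simulated here:
fake_tokens = ["Hel", "lo! ", "How can", " I help", " you today", "?"]
for sentence in buffer_sentences(fake_tokens):
    print(sentence)
    # send_to_heygen(sentence)  # placeholder for the HeyGen call
```

Sentence-level chunks tend to work better than single words for text-to-speech, since the avatar gets natural prosody units while still starting to speak after the first sentence instead of waiting for the entire completion.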
Thank you!