Streaming Avatar Response Using OpenAI LLM

Hi,

I'm trying to generate a response using an OpenAI LLM and send it to HeyGen. Can I send the OpenAI stream result word by word to HeyGen, or do I need to concatenate the whole response (or skip streaming entirely) and send the final result to HeyGen?

I just need to cut down the waiting time before the avatar responds to the user's question.
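For context, here is a minimal sketch of the middle-ground approach I'm considering: buffer the streamed tokens and forward one complete sentence at a time, so the avatar can start speaking early without getting choppy word-by-word input. The `send_to_heygen` call below is a placeholder, not the real HeyGen API, and the token list simulates what the deltas from `chat.completions.create(..., stream=True)` would yield:

```python
import re

def sentence_chunks(token_stream):
    """Buffer streamed tokens and yield complete sentences.

    Sending a sentence at a time (instead of single words or the
    full concatenated response) lets the avatar start talking sooner
    while still giving it natural phrases to voice.
    """
    buffer = ""
    for token in token_stream:
        buffer += token
        # Split off any complete sentence ending in . ! or ?
        while True:
            match = re.search(r"[.!?](\s|$)", buffer)
            if not match:
                break
            sentence = buffer[:match.end()].strip()
            buffer = buffer[match.end():]
            if sentence:
                yield sentence
    # Flush whatever remains when the stream ends
    if buffer.strip():
        yield buffer.strip()

def send_to_heygen(text):
    # Placeholder: would call the HeyGen speak/task endpoint here
    print("sending:", text)

# Simulated OpenAI stream deltas (real code would iterate the
# chunks from chat.completions.create(..., stream=True) instead)
tokens = ["Hello", " there", ".", " How", " can", " I", " help", "?"]
for sentence in sentence_chunks(tokens):
    send_to_heygen(sentence)
```

With this simulated stream the avatar would receive "Hello there." as soon as the first period arrives, instead of waiting for the entire completion.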

thank you