Discussions

Connect interactive Avatar with LLM

I am trying to set up the interactive avatar using the GitHub demo (InteractiveAvatarNextJSDemo). Unfortunately, I can't get it to work even though I followed the instructions. Even after adding an OpenAI key to the .env file, there is no way to use Whisper in the browser, nor can the avatar access GPT-4. In general, the instructions seem imprecise or incorrect to me. I have only been able to interact with the avatar at all after modifying the code in "packages". In the dev script, "node " must be prepended to the command, otherwise an error is thrown saying the path to "node_modules" does not exist. It would be great if someone who has managed to link the demo to their own LLM could point out where changes are needed.
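
For reference, this is roughly the wiring I am aiming for — a minimal sketch assuming the @heygen/streaming-avatar SDK together with the official openai npm package; the exact shape of the speak() payload differs between SDK versions, so the field names are assumptions to check against the version pinned in the demo.

```ts
// Minimal sketch (not working demo code): take the user's text, ask the LLM for
// a reply, then have the interactive avatar repeat it.
// Assumptions: @heygen/streaming-avatar and the official "openai" npm package;
// verify the speak() payload shape against the SDK version used by the demo.
import StreamingAvatar from "@heygen/streaming-avatar";
import OpenAI from "openai";

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

export async function replyThroughAvatar(avatar: StreamingAvatar, userText: string) {
  // Ask the LLM for a reply to the user's message.
  const completion = await openai.chat.completions.create({
    model: "gpt-4", // assumption: any chat-capable model works here
    messages: [{ role: "user", content: userText }],
  });
  const reply = completion.choices[0]?.message?.content ?? "";

  // Have the interactive avatar speak the LLM's reply.
  await avatar.speak({ text: reply }); // field names depend on the SDK version
}
```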

Answered

Not able to use the Monica_insuit_20220819 avatar on the Creator plan

It is giving the error {'code': 10013, 'message': 'avatar not found'}. How can I use this particular avatar?
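
In case it helps, the check I was planning to run is whether that avatar ID shows up at all for my API key — a rough sketch; the list endpoint path below is an assumption to verify against the HeyGen API reference.

```ts
// Rough sketch: list the interactive avatars this API key can use and look for
// "Monica_insuit_20220819" in the response. The endpoint path is an assumption
// to verify against the HeyGen API reference.
async function listInteractiveAvatars() {
  const res = await fetch("https://api.heygen.com/v1/streaming/avatar.list", {
    headers: { "X-Api-Key": process.env.HEYGEN_API_KEY ?? "" },
  });
  const body = await res.json();
  console.log(JSON.stringify(body, null, 2)); // check whether the avatar_id appears here
}
```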

Answered

Cost of using HeyGen's audio-only mode

What is the cost of using HeyGen's audio-only mode?

Answered

How to detect when the avatar has stopped talking?

Hello,
I'm using the streaming-avatar npm package, and I wanted to know if it's possible to detect when the Interactive Avatar has stopped talking each time it "reads" something.
Thanks in advance!
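
For context, this is what I was hoping would work — a small sketch assuming the package exposes start/stop-talking events via its StreamingEvents export (event names to be verified against the installed version).

```ts
// Sketch: subscribe to the SDK's talking events to learn when a speak task ends.
// Assumption: @heygen/streaming-avatar exports StreamingEvents with
// AVATAR_START_TALKING / AVATAR_STOP_TALKING members.
import StreamingAvatar, { StreamingEvents } from "@heygen/streaming-avatar";

export function watchTalking(avatar: StreamingAvatar) {
  avatar.on(StreamingEvents.AVATAR_START_TALKING, () => {
    console.log("Avatar started talking");
  });
  avatar.on(StreamingEvents.AVATAR_STOP_TALKING, () => {
    console.log("Avatar stopped talking"); // e.g. re-enable the microphone or text input here
  });
}
```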

Dropping words when Avatar starts speaking

Has anyone encountered HeyGen dropping the first few words when an avatar starts speaking? The mouth moves but no sound comes out. I've tried adding spaces, delays, etc., but nothing seems to work.
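
One timing issue I still want to rule out is sending the first speak call before the media stream is actually attached and playing — a rough sketch of that check, assuming the SDK's STREAM_READY event delivers the MediaStream in event.detail (an assumption to verify against the package).

```ts
// Rough sketch: only call speak() once the stream is attached to the <video>
// element and playing, to rule out the opening words being cut off by late
// audio playback. Assumption: StreamingEvents.STREAM_READY exists and its
// event.detail is the MediaStream.
import StreamingAvatar, { StreamingEvents } from "@heygen/streaming-avatar";

export function speakWhenReady(avatar: StreamingAvatar, videoEl: HTMLVideoElement, text: string) {
  avatar.on(StreamingEvents.STREAM_READY, (event: any) => {
    videoEl.srcObject = event.detail; // attach the MediaStream to the video element
    videoEl.onloadedmetadata = async () => {
      await videoEl.play();
      await avatar.speak({ text }); // field names depend on the SDK version
    };
  });
}
```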

Answered

Upload portrait videos and create videos that speak

How do I call the API to create a video from a selfie/portrait and an audio file uploaded by a user?
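
For anyone with the same question, the flow I am trying to confirm is: upload the portrait and the audio as assets, then reference both in a generate call — a rough sketch; the endpoint paths and body fields below are assumptions to check against the HeyGen API reference, and it assumes the portrait is used as a talking photo rather than a video avatar.

```ts
// Rough sketch of the flow to confirm: upload the user's portrait and audio,
// then generate a video that uses both. Endpoint paths and body fields are
// assumptions to verify against the HeyGen API reference.
const API_KEY = process.env.HEYGEN_API_KEY ?? "";

// 1. Upload an asset (portrait image or audio file) and get back its id.
async function uploadAsset(bytes: ArrayBuffer, contentType: string) {
  const res = await fetch("https://upload.heygen.com/v1/asset", {
    method: "POST",
    headers: { "X-Api-Key": API_KEY, "Content-Type": contentType },
    body: bytes,
  });
  return res.json();
}

// 2. Generate a video using the uploaded portrait as the character and the
//    uploaded audio as the voice track.
async function generateTalkingVideo(talkingPhotoId: string, audioAssetId: string) {
  const res = await fetch("https://api.heygen.com/v2/video/generate", {
    method: "POST",
    headers: { "X-Api-Key": API_KEY, "Content-Type": "application/json" },
    body: JSON.stringify({
      video_inputs: [
        {
          character: { type: "talking_photo", talking_photo_id: talkingPhotoId },
          voice: { type: "audio", audio_asset_id: audioAssetId },
        },
      ],
      dimension: { width: 720, height: 1280 }, // portrait orientation
    }),
  });
  return res.json(); // expect a video_id to poll for completion
}
```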

Answered

Can the HeyGen interactive avatar support two-way conversation in Urdu?

I want to use the HeyGen interactive avatar for real-time conversation in Urdu. Is this supported?