How to integrate an interactive avatar with my LLM?

I am using the Streaming Avatar SDK. With the HeyGen knowledge base, my interactive avatar works perfectly.

Instead of the HeyGen knowledge base, I need my own LLM to respond to the interactions.

For example, my LLM is hosted on a backend server: 'https://myllm.com'.

I have trained my LLM on question-answer pairs like this:

Question: What is Docketry?

Answer: Docketry is a platform to extract data from different files.

So when I ask my avatar this question, it should fetch the response from my LLM and speak the answer.
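What I think I need is a small helper that sends the transcribed question to my backend and returns the answer text, so the avatar can then speak it verbatim. A minimal sketch, assuming a POST endpoint at https://myllm.com/chat that returns JSON like { "answer": "..." } (both the route and the response shape are placeholders for my actual backend):

```typescript
// Sketch: ask the backend LLM and return its answer text.
// The /chat route and the { answer: string } response shape are assumptions --
// adjust to whatever the real backend exposes.
async function askMyLLM(question: string): Promise<string> {
  const res = await fetch("https://myllm.com/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ question }),
  });
  if (!res.ok) throw new Error(`LLM backend error: ${res.status}`);
  const data = (await res.json()) as { answer: string };
  return data.answer;
}
```

The returned answer would then be passed to avatar.speak with taskType: TaskType.REPEAT (as in my greeting below), so HeyGen just voices the text instead of generating its own reply.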

This is my current code:

const startVoiceChat = async () => {
    const avatar = avatarRef.current;
    if (!avatar) return;

    try {
        await avatar.startVoiceChat({
            useSilencePrompt: false,
        });
        // REPEAT makes the avatar speak the given text verbatim
        await avatar.speak({
            text: "Hi There! Welcome! How can I assist you?",
            taskType: TaskType.REPEAT,
            taskMode: TaskMode.SYNC,
        });
        setVoiceStatus("Waiting for you to speak...");
    } catch (error) {
        console.error("Error starting voice chat:", error);
        setVoiceStatus("Error starting voice chat");
    }
};
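Is something like the following the right direction for wiring the transcript to my backend? This is a self-contained sketch using plain string event names and structural types: as far as I can tell the SDK's StreamingEvents uses "user_talking_message" / "user_end_message", but those names and the enum string values ("repeat", "sync") are my assumptions and should be checked against the installed SDK version; in real code I would use the StreamingEvents / TaskType / TaskMode enums.

```typescript
// Structural stand-ins for the SDK types, so this sketch is self-contained;
// in the app I would use the real StreamingAvatar / TaskType / TaskMode imports.
type SpeakRequest = { text: string; taskType: string; taskMode: string };
interface AvatarLike {
  on(event: string, handler: (e: { detail: { message: string } }) => void): void;
  speak(req: SpeakRequest): Promise<unknown>;
}

// Collects the user's partial transcripts and, when the user stops talking,
// sends the full question to the LLM and has the avatar repeat the answer.
function wireAvatarToLLM(
  avatar: AvatarLike,
  askLLM: (question: string) => Promise<string>,
) {
  let buffer = "";
  // Event names assumed from @heygen/streaming-avatar's StreamingEvents --
  // verify them for the installed SDK version.
  avatar.on("user_talking_message", (e) => {
    buffer += e.detail.message;
  });
  avatar.on("user_end_message", async () => {
    const question = buffer.trim();
    buffer = "";
    if (!question) return;
    const answer = await askLLM(question);
    // "repeat" / "sync" assumed to match TaskType.REPEAT / TaskMode.SYNC
    await avatar.speak({ text: answer, taskType: "repeat", taskMode: "sync" });
  });
}
```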