How to use my local custom LLM with an interactive avatar?

Following the API documentation, it is not clear to me how I can prompt the custom interactive avatar with the text and answers I receive from my custom local LLM. I'd appreciate it if someone could point me in the right direction. Thanks!
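To make the question concrete, here is a minimal sketch of the flow I have in mind. All names here are hypothetical placeholders (`ask_local_llm`, `build_avatar_task`, the payload field names) since I don't know the actual endpoint or request shape the avatar API expects:

```python
def ask_local_llm(prompt: str) -> str:
    """Placeholder for a call to my locally hosted LLM (e.g. an
    OpenAI-compatible chat completions server). Returns the model's
    text reply; stubbed out here for illustration."""
    return f"Echo: {prompt}"  # stand-in reply, not a real model call

def build_avatar_task(session_id: str, text: str) -> dict:
    """Build the request body I would send to the avatar's
    speak/task endpoint so it voices the LLM's reply.
    Field names are assumptions, not the documented schema."""
    return {"session_id": session_id, "text": text}

# Intended pipeline: user prompt -> local LLM -> avatar task
reply = ask_local_llm("Hello")
payload = build_avatar_task("session-123", reply)
print(payload["text"])
```

Essentially I want step 2: how do I forward `reply` to the interactive avatar so it speaks that text instead of generating its own response?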