Discussions
How do I use my local custom LLM with the interactive avatar?
3 months ago
From the API documentation it is not clear to me how I can prompt the custom interactive avatar with the text and answers I receive from my custom local LLM. I'd appreciate it if someone could point me in the right direction. Thanks!
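To make it concrete, something like the sketch below is what I'm imagining: my local LLM generates the answer, and then I push that text to the avatar session so it speaks it. All of the endpoint names, fields, and the local LLM server here are placeholders/assumptions on my part, not taken from the docs.

```python
# Rough sketch: get an answer from a local LLM, then hand it to the avatar.
# The avatar base URL, the "/streaming.task" route, and the "repeat" task type
# are guesses at what such an API might expose; the real names may differ.
import requests

AVATAR_API_BASE = "https://api.example-avatar.com/v1"  # placeholder base URL
AVATAR_API_KEY = "YOUR_API_KEY"                          # placeholder key
LOCAL_LLM_URL = "http://localhost:11434/api/generate"   # assumes an Ollama-style local server


def ask_local_llm(prompt: str) -> str:
    """Send the user's prompt to the local LLM and return its text answer."""
    resp = requests.post(
        LOCAL_LLM_URL,
        json={"model": "llama3", "prompt": prompt, "stream": False},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["response"]


def send_text_to_avatar(session_id: str, text: str) -> None:
    """Push the LLM's answer to the avatar session so the avatar speaks it verbatim."""
    resp = requests.post(
        f"{AVATAR_API_BASE}/streaming.task",
        headers={"X-Api-Key": AVATAR_API_KEY},
        json={"session_id": session_id, "text": text, "task_type": "repeat"},
        timeout=30,
    )
    resp.raise_for_status()


if __name__ == "__main__":
    answer = ask_local_llm("Hello, how can you help me today?")
    send_text_to_avatar("my-session-id", answer)
```

Is this the intended pattern, i.e. run my own LLM outside the platform and only send the final text to the avatar for speech, or is there a different mechanism for plugging in a custom LLM?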