anything-llm/server/utils/AiProviders
Timothy Carambat 4bb99ab4bf
Support LocalAi as LLM provider by @tlandenberger (#373)
* feature: add LocalAI as LLM provider

* update Onboarding/mgmt settings
Grab available models from the LocalAI models endpoint (sketched below)
merge with master

* update streaming so complete chunks are streamed
update the LocalAI LLM connector to support streaming (sketched below)

* force a schema (http/https) on the base URL (sketched below)

---------
Co-authored-by: timothycarambat <rambat1010@gmail.com>
Co-authored-by: tlandenberger <tobiaslandenberger@gmail.com>
2023-11-14 12:31:44 -08:00
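The "grab models from the models endpoint" step relies on LocalAI exposing an OpenAI-compatible /v1/models route. A minimal sketch of that lookup, assuming Node's built-in fetch and an illustrative LOCAL_AI_BASE_PATH setting (not necessarily the names used in this directory):

```js
// Minimal sketch: list models from a LocalAI instance via its
// OpenAI-compatible /v1/models endpoint. The env var name and the response
// shape ({ data: [{ id }] }) follow the OpenAI API convention; the exact
// names inside AnythingLLM may differ.
async function localAIModels(basePath = process.env.LOCAL_AI_BASE_PATH) {
  const res = await fetch(`${basePath}/models`, {
    headers: { "Content-Type": "application/json" },
  });
  if (!res.ok) throw new Error(`LocalAI /models request failed: ${res.status}`);
  const { data = [] } = await res.json();
  // Each entry's `id` is the model name a chat completion call would expect.
  return data.map((model) => ({ id: model.id, name: model.id }));
}

// Example usage (assumes a LocalAI server at this base path):
// localAIModels("http://localhost:8080/v1").then(console.log);
```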
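For "complete chunk streaming", the provider reads the OpenAI-style server-sent-event stream and only parses complete `data:` chunks, buffering any partial line until the next read. A hedged sketch of that buffering, with the endpoint and field names assumed from the OpenAI convention rather than copied from the codebase:

```js
// Minimal sketch of chunk-wise streaming against an OpenAI-compatible
// chat completions endpoint (which LocalAI exposes). Partial SSE lines are
// buffered so only complete "data: {...}" chunks are JSON-parsed.
async function* streamChatCompletion(basePath, model, messages) {
  const res = await fetch(`${basePath}/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, messages, stream: true }),
  });

  const decoder = new TextDecoder();
  let buffer = "";
  for await (const part of res.body) {
    buffer += decoder.decode(part, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop(); // keep the trailing partial line for the next read
    for (const line of lines) {
      const payload = line.replace(/^data: /, "").trim();
      if (!payload || payload === "[DONE]") continue;
      const token = JSON.parse(payload)?.choices?.[0]?.delta?.content;
      if (token) yield token; // emit only complete, parsed chunks
    }
  }
}
```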
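"Force schema on URL" means a user-supplied base path must always carry an explicit http:// or https:// scheme before it is stored and used. An illustrative helper (the name and exact behavior are assumptions, not the code in this directory):

```js
// Minimal sketch: prepend http:// when the user omits the scheme, leave
// already-schemed URLs untouched.
function forceSchemaOnURL(url = "") {
  const trimmed = url.trim();
  if (!trimmed) return trimmed;
  return /^https?:\/\//i.test(trimmed) ? trimmed : `http://${trimmed}`;
}

// forceSchemaOnURL("localhost:8080/v1")                 -> "http://localhost:8080/v1"
// forceSchemaOnURL("https://my-localai.example.com/v1") -> unchanged
```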
anthropic assume default model where appropriate (#366) 2023-11-13 15:17:22 -08:00
azureOpenAi Enable chat streaming for LLMs (#354) 2023-11-13 15:07:30 -08:00
lmStudio assume default model where appropriate (#366) 2023-11-13 15:17:22 -08:00
localAi Support LocalAi as LLM provider by @tlandenberger (#373) 2023-11-14 12:31:44 -08:00
openAi assume default model where appropriate (#366) 2023-11-13 15:17:22 -08:00