anything-llm/server
Timothy Carambat 4bb99ab4bf
Support LocalAi as LLM provider by @tlandenberger (#373)
* feature: add LocalAI as llm provider

* update Onboarding/mgmt settings
Grab models from models endpoint for localai
merge with master

* update streaming for complete chunk streaming
update localAI LLM to be able to stream

* force schema on URL

---------
Co-authored-by: timothycarambat <rambat1010@gmail.com>
Co-authored-by: tlandenberger <tobiaslandenberger@gmail.com>
2023-11-14 12:31:44 -08:00
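The commit above describes two concrete behaviors for the LocalAI provider: listing available models from LocalAI's OpenAI-compatible /v1/models endpoint, and forcing a schema onto the configured base URL before requests are made. The sketch below only illustrates that idea; it assumes Node 18+ (global fetch) and a reachable LocalAI server, and the helper names (withSchema, localAIModels) are hypothetical, not the actual AnythingLLM code.

// Prepend a schema when the configured base path omits one,
// so it can be parsed as an absolute URL.
function withSchema(basePath) {
  return /^https?:\/\//i.test(basePath) ? basePath : `http://${basePath}`;
}

// Query the LocalAI server's OpenAI-compatible /v1/models endpoint
// and return the model ids it reports.
async function localAIModels(basePath) {
  const url = new URL("/v1/models", withSchema(basePath));
  const res = await fetch(url);
  if (!res.ok) throw new Error(`LocalAI /v1/models request failed: ${res.status}`);
  const { data = [] } = await res.json();
  return data.map((model) => model.id);
}

// Example usage:
// localAIModels("localhost:8080").then(console.log);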
Name          Last commit                                                  Date
endpoints     Support LocalAi as LLM provider by @tlandenberger (#373)     2023-11-14 12:31:44 -08:00
models        Support LocalAi as LLM provider by @tlandenberger (#373)     2023-11-14 12:31:44 -08:00
prisma        315 show citations based on relevancy score (#316)           2023-11-06 16:49:29 -08:00
storage       remove bakup db                                              2023-10-24 18:05:18 -07:00
swagger       AnythingLLM UI overhaul (#278)                               2023-10-23 13:10:34 -07:00
utils         Support LocalAi as LLM provider by @tlandenberger (#373)     2023-11-14 12:31:44 -08:00
.env.example  Support LocalAi as LLM provider by @tlandenberger (#373)     2023-11-14 12:31:44 -08:00
.gitignore    AnythingLLM UI overhaul (#278)                               2023-10-23 13:10:34 -07:00
.nvmrc        Implement Chroma Support (#1)                                2023-06-07 21:31:35 -07:00
index.js      Robots.txt (#369)                                            2023-11-13 15:22:24 -08:00
nodemon.json  Full developer api (#221)                                    2023-08-23 19:15:07 -07:00
package.json  Infinite prompt input and compression implementation (#332)  2023-11-06 13:13:53 -08:00
yarn.lock     Infinite prompt input and compression implementation (#332)  2023-11-06 13:13:53 -08:00