anything-llm/server

Latest commit: e0a0a8976d, "Add Ollama as LLM provider option (#494)" by Timothy Carambat, 2023-12-27 17:21:47 -08:00

* Add support for Ollama as LLM provider
* resolves #493
| Name         | Last commit                                                              | Date                      |
|--------------|--------------------------------------------------------------------------|---------------------------|
| endpoints    | fix success is not defined error (#484)                                  | 2023-12-21 10:31:14 -08:00 |
| models       | Add Ollama as LLM provider option (#494)                                 | 2023-12-27 17:21:47 -08:00 |
| prisma       | Add user PFP support and context to logo (#408)                          | 2023-12-07 14:11:51 -08:00 |
| storage      | feat: Embed on-instance Whisper model for audio/mp4 transcribing (#449)  | 2023-12-15 11:20:13 -08:00 |
| swagger      | AnythingLLM UI overhaul (#278)                                           | 2023-10-23 13:10:34 -07:00 |
| utils        | Add Ollama as LLM provider option (#494)                                 | 2023-12-27 17:21:47 -08:00 |
| .env.example | Add Ollama as LLM provider option (#494)                                 | 2023-12-27 17:21:47 -08:00 |
| .gitignore   | AnythingLLM UI overhaul (#278)                                           | 2023-10-23 13:10:34 -07:00 |
| .nvmrc       | Implement Chroma Support (#1)                                            | 2023-06-07 21:31:35 -07:00 |
| index.js     | GitHub loader extension + extension support v1 (#469)                    | 2023-12-18 15:48:02 -08:00 |
| nodemon.json | Full developer api (#221)                                                | 2023-08-23 19:15:07 -07:00 |
| package.json | Add LLM support for Google Gemini-Pro (#492)                             | 2023-12-27 17:08:03 -08:00 |
| yarn.lock    | Add LLM support for Google Gemini-Pro (#492)                             | 2023-12-27 17:08:03 -08:00 |