anything-llm/server
Timothy Carambat 446164d7b9
Add Groq vision preview support (#2511)
Adds support for only the Llama 3.2 vision models on Groq. This comes with many conditionals and nuances to handle, as Groq's vision implementation is quite rough right now.
2024-10-21 12:37:39 -07:00
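The commit above gates image input to Groq's Llama 3.2 vision preview models. Below is a minimal sketch of what such a guard could look like; the model IDs, function names, and attachment shape are illustrative assumptions, not the repository's actual code.

```js
// Illustrative sketch only: restrict image attachments to the Groq models
// that accept vision input, and fall back to plain text for everything else.
// Model IDs, function names, and the attachment shape are assumptions, not
// the actual AnythingLLM implementation.
const GROQ_VISION_MODELS = [
  "llama-3.2-11b-vision-preview",
  "llama-3.2-90b-vision-preview",
];

function groqModelSupportsVision(modelName = "") {
  return GROQ_VISION_MODELS.includes(modelName);
}

function buildGroqMessageContent(modelName, prompt, attachments = []) {
  // Non-vision models, or requests without images, send plain text only.
  if (!groqModelSupportsVision(modelName) || attachments.length === 0) {
    return prompt;
  }

  // Vision preview models take OpenAI-style multi-part content with a
  // data URL (or hosted URL) per attached image.
  return [
    { type: "text", text: prompt },
    ...attachments.map((image) => ({
      type: "image_url",
      image_url: { url: image.contentString },
    })),
  ];
}

module.exports = { groqModelSupportsVision, buildGroqMessageContent };
```

The commit description notes that the real change carries many more conditionals and nuances than this, since Groq's vision support is still limited; the utils entry in the listing below shows the same commit (#2511).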
Name | Last commit | Last commit date
endpoints | Add backfilling on query for chat widget to improve UX (#2482) | 2024-10-15 14:37:44 -07:00
jobs | Update bgworker to use fork instead of worker_thread (#1808) | 2024-07-03 11:44:34 -07:00
models | Tts open ai compatible endpoints (#2487) | 2024-10-15 21:39:31 -07:00
prisma | Daily message limit per user (#2417) | 2024-10-15 14:01:29 -07:00
storage | Integrate Apipie support directly (#2470) | 2024-10-15 12:36:06 -07:00
swagger | Daily message limit per user (#2417) | 2024-10-15 14:01:29 -07:00
utils | Add Groq vision preview support (#2511) | 2024-10-21 12:37:39 -07:00
.env.example | Tts open ai compatible endpoints (#2487) | 2024-10-15 21:39:31 -07:00
.flowconfig | devcontainer v1 (#297) | 2024-01-08 15:31:06 -08:00
.gitignore | Add support for custom agent skills via plugins (#2202) | 2024-09-10 17:06:02 -07:00
.nvmrc | Implement Chroma Support (#1) | 2023-06-07 21:31:35 -07:00
index.js | AnythingLLM Chrome Extension (#2066) | 2024-08-27 14:58:47 -07:00
jsconfig.json | devcontainer v1 (#297) | 2024-01-08 15:31:06 -08:00
nodemon.json | Full developer api (#221) | 2023-08-23 19:15:07 -07:00
package.json | bump jsonwebtoken version (#1971) | 2024-07-25 11:03:39 -07:00
yarn.lock | bump jsonwebtoken version (#1971) | 2024-07-25 11:03:39 -07:00