anything-llm/server
Latest commit 20135835d0 by Timothy Carambat, 2024-09-06 10:06:46 -07:00

Ollama sequential embedding (#2230)

* ollama: Switch from parallel to sequential chunk embedding
* throw error on empty embeddings

Co-authored-by: John Blomberg <john.jb.blomberg@gmail.com>
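The commit above replaces parallel chunk embedding with a one-chunk-at-a-time loop and throws when Ollama returns an empty embedding. Below is a minimal sketch of that pattern, assuming Ollama's /api/embeddings endpoint and hypothetical helper names (embedChunk, embedChunks); it is not the actual AnythingLLM implementation.

```js
// Minimal sketch: sequential chunk embedding against Ollama, with an empty-result guard.
// Hypothetical helpers and model name; not the real anything-llm/server code.

async function embedChunk(chunk, baseUrl = "http://localhost:11434", model = "nomic-embed-text") {
  // Ollama's embeddings endpoint takes a model name and a single prompt string
  // and returns { embedding: [...] }.
  const res = await fetch(`${baseUrl}/api/embeddings`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, prompt: chunk }),
  });
  const data = await res.json();
  return data.embedding;
}

async function embedChunks(chunks) {
  const results = [];
  // One request at a time instead of a Promise.all(...) fan-out over all chunks.
  for (const chunk of chunks) {
    const embedding = await embedChunk(chunk);
    if (!Array.isArray(embedding) || embedding.length === 0) {
      // Fail fast rather than silently storing an unusable vector.
      throw new Error("Ollama returned an empty embedding for a text chunk");
    }
    results.push(embedding);
  }
  return results;
}
```

Embedding sequentially presumably keeps load on a locally hosted Ollama instance predictable, and the empty-embedding check stops the run before bad vectors can reach the vector database.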
Name | Last commit | Last updated
endpoints | Bug/make swagger json output openapi 3 compliant (#2219) | 2024-09-04 15:40:24 -07:00
jobs | Update bgworker to use fork instead of worker_thread (#1808) | 2024-07-03 11:44:34 -07:00
models | Feature/add searchapi web browsing (#2224) | 2024-09-05 10:36:46 -07:00
prisma | AnythingLLM Chrome Extension (#2066) | 2024-08-27 14:58:47 -07:00
storage | 1173 dynamic cache openrouter (#1176) | 2024-04-23 11:10:54 -07:00
swagger | Bug/make swagger json output openapi 3 compliant (#2219) | 2024-09-04 15:40:24 -07:00
utils | Ollama sequential embedding (#2230) | 2024-09-06 10:06:46 -07:00
.env.example | Feature/add searchapi web browsing (#2224) | 2024-09-05 10:36:46 -07:00
.flowconfig | devcontainer v1 (#297) | 2024-01-08 15:31:06 -08:00
.gitignore | Patch WSS upgrade for manual HTTPS certs (#1429) | 2024-05-17 14:03:25 -07:00
.nvmrc | Implement Chroma Support (#1) | 2023-06-07 21:31:35 -07:00
index.js | AnythingLLM Chrome Extension (#2066) | 2024-08-27 14:58:47 -07:00
jsconfig.json | devcontainer v1 (#297) | 2024-01-08 15:31:06 -08:00
nodemon.json | Full developer api (#221) | 2023-08-23 19:15:07 -07:00
package.json | bump jsonwebtoken version (#1971) | 2024-07-25 11:03:39 -07:00
yarn.lock | bump jsonwebtoken version (#1971) | 2024-07-25 11:03:39 -07:00