anything-llm/server
Latest commit 8cc1455b72 by Timothy Carambat, 2023-12-07 16:27:36 -08:00:
feat: add support for variable chunk length (#415)
fix: cleanup code for embedding length clarify
resolves #388
endpoints Add user PFP support and context to logo (#408) 2023-12-07 14:11:51 -08:00
models feat: add support for variable chunk length (#415) 2023-12-07 16:27:36 -08:00
prisma Add user PFP support and context to logo (#408) 2023-12-07 14:11:51 -08:00
storage [Feature] AnythingLLM use locally hosted Llama.cpp and GGUF files for inferencing (#413) 2023-12-07 14:48:27 -08:00
swagger AnythingLLM UI overhaul (#278) 2023-10-23 13:10:34 -07:00
utils feat: add support for variable chunk length (#415) 2023-12-07 16:27:36 -08:00
.env.example feat: add support for variable chunk length (#415) 2023-12-07 16:27:36 -08:00
.gitignore AnythingLLM UI overhaul (#278) 2023-10-23 13:10:34 -07:00
.nvmrc Implement Chroma Support (#1) 2023-06-07 21:31:35 -07:00
index.js Robots.txt (#369) 2023-11-13 15:22:24 -08:00
nodemon.json Full developer api (#221) 2023-08-23 19:15:07 -07:00
package.json [Feature] AnythingLLM use locally hosted Llama.cpp and GGUF files for inferencing (#413) 2023-12-07 14:48:27 -08:00
yarn.lock [Feature] AnythingLLM use locally hosted Llama.cpp and GGUF files for inferencing (#413) 2023-12-07 14:48:27 -08:00