anything-llm/server/utils
Latest commit 452582489e by Timothy Carambat:
GitHub loader extension + extension support v1 (#469)
* feat: implement GitHub repo loading
* fix: purge of folders
* fix: rendering of sub-files
* Hide delete action on custom-documents
* Add API key support because of rate limits
* Add frontend form for the GitHub repo data connector
* Remove console.logs; block custom-documents from being deleted
* Remove unused _meta arg
* Add support for ignore paths in the request (ignore-path input via tagging)
* Update hint

Committed 2023-12-18 15:48:02 -08:00
Subdirectories (last commit):
AiProviders: [Feature] AnythingLLM use locally hosted Llama.cpp and GGUF files for inferencing (#413), 2023-12-07 14:48:27 -08:00
chats: patch: implement @lunamidori hotfix for LocalAI streaming chunk overflows (#433), 2023-12-12 16:20:06 -08:00
database: Full developer api (#221), 2023-08-23 19:15:07 -07:00
EmbeddingEngines: patch: API key to localai service calls (#421), 2023-12-11 14:18:28 -08:00
files: GitHub loader extension + extension support v1 (#469), 2023-12-18 15:48:02 -08:00
helpers: fix: patch api key not persisting when setting LLM/Embedder (#458), 2023-12-16 10:21:36 -08:00
http: Add API key option to LocalAI (#407), 2023-12-04 08:38:15 -08:00
middleware: Create manager role and limit default role (#351), 2023-11-13 14:51:16 -08:00
prisma: Add built-in embedding engine into AnythingLLM (#411), 2023-12-06 10:36:22 -08:00
telemetry: Replace custom sqlite dbms with prisma (#239), 2023-09-28 14:00:03 -07:00
vectorDbProviders: feat: add support for variable chunk length (#415), 2023-12-07 16:27:36 -08:00