anything-llm/server/endpoints
Timothy Carambat a8ec0d9584
Compensate for upper OpenAI embedding limit chunk size (#292)
The limit is due to the POST body max size; sufficiently large requests will abort automatically.
We should report that error back to the frontend during embedding.
Update vectordb providers to return on failure.
2023-10-26 10:57:37 -07:00
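The commit above describes keeping embedding requests under the POST body size cap so large requests do not abort. A minimal sketch of that batching idea is below; the function name, the byte cap, and the use of `Buffer.byteLength` are illustrative assumptions, not the actual anything-llm implementation.

```javascript
// Sketch only: batch text chunks so each embedding request's serialized
// body stays under an assumed POST-body byte cap. Names and the cap value
// are hypothetical, not taken from the anything-llm source.

const MAX_POST_BODY_BYTES = 500_000; // assumed cap, not the real limit

// Greedily group chunks into batches whose combined serialized size
// stays at or below maxBytes. An oversized single chunk still gets
// its own batch rather than being dropped.
function batchChunksForEmbedding(chunks, maxBytes = MAX_POST_BODY_BYTES) {
  const batches = [];
  let current = [];
  let currentBytes = 0;

  for (const chunk of chunks) {
    const chunkBytes = Buffer.byteLength(JSON.stringify(chunk), "utf8");
    if (current.length > 0 && currentBytes + chunkBytes > maxBytes) {
      batches.push(current);
      current = [];
      currentBytes = 0;
    }
    current.push(chunk);
    currentBytes += chunkBytes;
  }
  if (current.length > 0) batches.push(current);
  return batches;
}
```

Each batch can then be sent as its own embedding request, and a failed batch can surface its error to the frontend instead of aborting silently.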
api Add support for chatting via the API (#261) 2023-09-29 13:45:35 -07:00
admin.js Replace custom sqlite dbms with prisma (#239) 2023-09-28 14:00:03 -07:00
chat.js Replace custom sqlite dbms with prisma (#239) 2023-09-28 14:00:03 -07:00
invite.js Replace custom sqlite dbms with prisma (#239) 2023-09-28 14:00:03 -07:00
system.js AnythingLLM UI overhaul (#278) 2023-10-23 13:10:34 -07:00
utils.js How to: run docker on remote IP 2023-08-15 11:36:07 -07:00
workspaces.js Compensate for upper OpenAI embedding limit chunk size (#292) 2023-10-26 10:57:37 -07:00