anything-llm/frontend

Latest commit: a8ec0d9584 by Timothy Carambat, 2023-10-26 10:57:37 -07:00

Compensate for upper OpenAI emedding limit chunk size (#292)

- Limit is due to POST body max size. Sufficiently large requests will abort automatically.
- We should report that error back on the frontend during embedding.
- Update vectordb providers to return on failed.
| Name               | Last commit message                                                     | Last commit date          |
|--------------------|-------------------------------------------------------------------------|---------------------------|
| public             | AnythingLLM UI overhaul (#278)                                          | 2023-10-23 13:10:34 -07:00 |
| src                | Compensate for upper OpenAI emedding limit chunk size (#292)            | 2023-10-26 10:57:37 -07:00 |
| .env.example       | update local dev docs                                                   | 2023-08-16 15:34:03 -07:00 |
| .eslintrc.cjs      | [Fork] Additions on franzbischoff resolution on #122 (#152)             | 2023-07-20 11:14:23 -07:00 |
| .gitignore         | untrack frontend production env                                         | 2023-08-16 12:16:02 -07:00 |
| .nvmrc             | bump node version requirement                                           | 2023-06-08 10:29:17 -07:00 |
| index.html         | add feedback form, hosting link, update readme, show promo image        | 2023-08-11 17:28:30 -07:00 |
| jsconfig.json      | inital commit                                                           | 2023-06-03 19:28:07 -07:00 |
| package-lock.json  | add codeblock support for prompt replies and historical messages (#55)  | 2023-06-14 13:35:55 -07:00 |
| package.json       | AnythingLLM UI overhaul (#278)                                          | 2023-10-23 13:10:34 -07:00 |
| postcss.config.js  | inital commit                                                           | 2023-06-03 19:28:07 -07:00 |
| tailwind.config.js | AnythingLLM UI overhaul (#278)                                          | 2023-10-23 13:10:34 -07:00 |
| vite.config.js     | inital commit                                                           | 2023-06-03 19:28:07 -07:00 |
| yarn.lock          | AnythingLLM UI overhaul (#278)                                          | 2023-10-23 13:10:34 -07:00 |