anything-llm/server
Latest commit: 42e1d8e8ce by Timothy Carambat, 2024-04-30 16:14:30 -07:00
Customize refusal response for query mode (#1243)
* Customize refusal response for `query` mode
* remove border for desktop
Name          | Last commit                                                           | Last commit date
endpoints     | [FEAT] Confluence data connector (#1181)                              | 2024-04-25 17:53:38 -07:00
models        | Customize refusal response for query mode (#1243)                     | 2024-04-30 16:14:30 -07:00
prisma        | Customize refusal response for query mode (#1243)                     | 2024-04-30 16:14:30 -07:00
storage       | 1173 dynamic cache openrouter (#1176)                                 | 2024-04-23 11:10:54 -07:00
swagger       | Add ability to add invitee to workspaces automatically (#975)         | 2024-03-26 16:38:32 -07:00
utils         | Customize refusal response for query mode (#1243)                     | 2024-04-30 16:14:30 -07:00
.env.example  | Bump openai package to latest (#1234)                                 | 2024-04-30 12:33:42 -07:00
.flowconfig   | devcontainer v1 (#297)                                                | 2024-01-08 15:31:06 -08:00
.gitignore    | RSA-Signing on server<->collector communication via API (#1005)      | 2024-04-01 13:56:35 -07:00
.nvmrc        | Implement Chroma Support (#1)                                         | 2023-06-07 21:31:35 -07:00
index.js      | Agent support for @agent default agent inside workspace chat (#1093)  | 2024-04-16 10:50:10 -07:00
jsconfig.json | devcontainer v1 (#297)                                                | 2024-01-08 15:31:06 -08:00
nodemon.json  | Full developer api (#221)                                             | 2023-08-23 19:15:07 -07:00
package.json  | Bump openai package to latest (#1234)                                 | 2024-04-30 12:33:42 -07:00
yarn.lock     | Bump openai package to latest (#1234)                                 | 2024-04-30 12:33:42 -07:00