anything-llm/server/utils/AiProviders

Latest commit: 7273c892a1, "Ollama performance mode option (#2014)" by Sean Hatfield, 2024-08-02 13:29:17 -07:00

* ollama performance mode option
* Change ENV prop; move perf setting to advanced

Co-authored-by: timothycarambat <rambat1010@gmail.com>
| Directory     | Last commit                                                                     | Date                       |
| ------------- | ------------------------------------------------------------------------------- | -------------------------- |
| anthropic     | Add multimodality support (#2001)                                               | 2024-07-31 10:47:49 -07:00 |
| azureOpenAi   | Deprecate .isSafe moderation (#1790)                                            | 2024-06-28 15:32:30 -07:00 |
| bedrock       | Add multimodality support (#2001)                                               | 2024-07-31 10:47:49 -07:00 |
| cohere        | Deprecate .isSafe moderation (#1790)                                            | 2024-06-28 15:32:30 -07:00 |
| gemini        | Gemini Pro 1.5, API support for 2M context and new experimental model (#2031)   | 2024-08-02 10:24:31 -07:00 |
| genericOpenAi | Deprecate .isSafe moderation (#1790)                                            | 2024-06-28 15:32:30 -07:00 |
| groq          | Patch Groq preview models maxed to 8K tokens due to warning                     | 2024-08-01 09:24:57 -07:00 |
| huggingface   | Deprecate .isSafe moderation (#1790)                                            | 2024-06-28 15:32:30 -07:00 |
| koboldCPP     | Add multimodality support (#2001)                                               | 2024-07-31 10:47:49 -07:00 |
| liteLLM       | Add multimodality support (#2001)                                               | 2024-07-31 10:47:49 -07:00 |
| lmStudio      | Add multimodality support (#2001)                                               | 2024-07-31 10:47:49 -07:00 |
| localAi       | Add multimodality support (#2001)                                               | 2024-07-31 10:47:49 -07:00 |
| mistral       | Deprecate .isSafe moderation (#1790)                                            | 2024-06-28 15:32:30 -07:00 |
| native        | Deprecate .isSafe moderation (#1790)                                            | 2024-06-28 15:32:30 -07:00 |
| ollama        | Ollama performance mode option (#2014)                                          | 2024-08-02 13:29:17 -07:00 |
| openAi        | Add multimodality support (#2001)                                               | 2024-07-31 10:47:49 -07:00 |
| openRouter    | handle OpenRouter exceptions on streaming (#2033)                               | 2024-08-02 12:23:39 -07:00 |
| perplexity    | Bump Perplexity and Together AI static model list                               | 2024-07-31 10:58:34 -07:00 |
| textGenWebUI  | Add multimodality support (#2001)                                               | 2024-07-31 10:47:49 -07:00 |
| togetherAi    | Bump Perplexity and Together AI static model list                               | 2024-07-31 10:58:34 -07:00 |