anything-llm/server/utils/AiProviders
Latest commit 3f78ef413b by Sean Hatfield (2024-05-29 08:17:35 +08:00): [FEAT] Support for gemini-1.0-pro model and fixes to prompt window limit (#1557)
Directory | Last commit | Date
anthropic | Make native embedder the fallback for all LLMs (#1427) | 2024-05-16 17:25:05 -07:00
azureOpenAi | Make native embedder the fallback for all LLMs (#1427) | 2024-05-16 17:25:05 -07:00
cohere | Make native embedder the fallback for all LLMs (#1427) | 2024-05-16 17:25:05 -07:00
gemini | [FEAT] Support for gemini-1.0-pro model and fixes to prompt window limit (#1557) | 2024-05-29 08:17:35 +08:00
genericOpenAi | update error handling for OpenAI providers | 2024-05-22 09:58:10 -05:00
groq | update error handling for OpenAI providers | 2024-05-22 09:58:10 -05:00
huggingface | Make native embedder the fallback for all LLMs (#1427) | 2024-05-16 17:25:05 -07:00
koboldCPP | update error handling for OpenAI providers | 2024-05-22 09:58:10 -05:00
liteLLM | Patch handling of end chunk stream events for OpenAI endpoints (#1487) | 2024-05-23 10:20:40 -07:00
lmStudio | Make native embedder the fallback for all LLMs (#1427) | 2024-05-16 17:25:05 -07:00
localAi | Make native embedder the fallback for all LLMs (#1427) | 2024-05-16 17:25:05 -07:00
mistral | Make native embedder the fallback for all LLMs (#1427) | 2024-05-16 17:25:05 -07:00
native | Make native embedder the fallback for all LLMs (#1427) | 2024-05-16 17:25:05 -07:00
ollama | Make native embedder the fallback for all LLMs (#1427) | 2024-05-16 17:25:05 -07:00
openAi | update error handling for OpenAI providers | 2024-05-22 09:58:10 -05:00
openRouter | update error handling for OpenAI providers | 2024-05-22 09:58:10 -05:00
perplexity | update error handling for OpenAI providers | 2024-05-22 09:58:10 -05:00
textGenWebUI | update error handling for OpenAI providers | 2024-05-22 09:58:10 -05:00
togetherAi | Make native embedder the fallback for all LLMs (#1427) | 2024-05-16 17:25:05 -07:00