anything-llm/server/utils/AiProviders
Latest commit 3fe7a25759 by Sean Hatfield (co-authored by timothycarambat <rambat1010@gmail.com>): add token context limit for native llm settings (#614), 2024-01-17 16:25:30 -08:00
Directory     Last commit message                                       Date
anthropic     add support for mistral api (#610)                        2024-01-17 14:42:05 -08:00
azureOpenAi   add support for mistral api (#610)                        2024-01-17 14:42:05 -08:00
gemini        add support for mistral api (#610)                        2024-01-17 14:42:05 -08:00
lmStudio      add support for mistral api (#610)                        2024-01-17 14:42:05 -08:00
localAi       add support for mistral api (#610)                        2024-01-17 14:42:05 -08:00
mistral       add support for mistral api (#610)                        2024-01-17 14:42:05 -08:00
native        add token context limit for native llm settings (#614)    2024-01-17 16:25:30 -08:00
ollama        add support for mistral api (#610)                        2024-01-17 14:42:05 -08:00
openAi        add support for mistral api (#610)                        2024-01-17 14:42:05 -08:00
togetherAi    add support for mistral api (#610)                        2024-01-17 14:42:05 -08:00