anything-llm/server/utils/AiProviders
Directory      Latest commit                                                 Date
anthropic      [FEAT] Anthropic Haiku model support (#901)                   2024-03-13 17:32:02 -07:00
azureOpenAi    Stop generation button during stream-response (#892)          2024-03-12 15:21:27 -07:00
gemini         Stop generation button during stream-response (#892)          2024-03-12 15:21:27 -07:00
groq           [FEAT] Groq LLM support (#865)                                2024-03-06 14:48:38 -08:00
huggingface    Stop generation button during stream-response (#892)          2024-03-12 15:21:27 -07:00
lmStudio       Patch LMStudio Inference server bug integration (#957)        2024-03-22 14:39:30 -07:00
localAi        Refactor LLM chat backend (#717)                              2024-02-14 12:32:07 -08:00
mistral        Refactor LLM chat backend (#717)                              2024-02-14 12:32:07 -08:00
native         Stop generation button during stream-response (#892)          2024-03-12 15:21:27 -07:00
ollama         Stop generation button during stream-response (#892)          2024-03-12 15:21:27 -07:00
openAi         Enable ability to do full-text query on documents (#758)      2024-02-21 13:15:45 -08:00
openRouter     Stop generation button during stream-response (#892)          2024-03-12 15:21:27 -07:00
perplexity     CHORE: bump pplx model support (#791)                         2024-02-23 17:33:16 -08:00
togetherAi     Stop generation button during stream-response (#892)          2024-03-12 15:21:27 -07:00