anything-llm/server/utils/chats

Latest commit: e0a0a8976d by Timothy Carambat (2023-12-27 17:21:47 -08:00)
Add Ollama as LLM provider option (#494)
* Add support for Ollama as LLM provider
resolves #493
commands    [FEATURE] Enable the ability to have multi user instances (#158)    2023-07-25 10:37:04 -07:00
index.js    Enable chat streaming for LLMs (#354)                                2023-11-13 15:07:30 -08:00
stream.js   Add Ollama as LLM provider option (#494)                             2023-12-27 17:21:47 -08:00