Timothy Carambat | 9ace0e67e6 | Validate max_tokens is number (#1445) | 2024-05-17 21:44:55 -07:00

Timothy Carambat | 01cf2fed17 | Make native embedder the fallback for all LLMs (#1427) | 2024-05-16 17:25:05 -07:00

Sean Hatfield | 0a6a9e40c1 | [FIX] Add max tokens field to generic OpenAI LLM connector (#1345) | 2024-05-10 14:49:02 -07:00
* add max tokens field to generic openai llm connector
* add max_tokens property to generic openai agent provider

Sean Hatfield | 9feaad79cc | [CHORE] Remove sendChat and streamChat in all LLM providers (#1260) | 2024-05-01 16:52:28 -07:00
* remove sendChat and streamChat functions/references in all LLM providers
* remove unused imports
Co-authored-by: timothycarambat <rambat1010@gmail.com>

Timothy Carambat | 547d4859ef | Bump openai package to latest (#1234) | 2024-04-30 12:33:42 -07:00
* Bump `openai` package to latest (tested all except localai)
* bump LocalAI support with latest image
* add deprecation notice
* linting

Timothy Carambat | df17fbda36 | Add generic OpenAI endpoint support (#1178) | 2024-04-23 13:06:07 -07:00
* Add generic OpenAI endpoint support
* allow any input for model in case provider does not support models endpoint