Sean Hatfield
1d39b8a2ce
add Together AI LLM support ( #560 )
...
* add Together AI LLM support
* update README for Together AI support
* Patch TogetherAI implementation
* add model sorting/option labels by organization for model selection
* linting + add data handling for TogetherAI
* change truthy statement
patch validLLMSelection method
---------
Co-authored-by: timothycarambat <rambat1010@gmail.com>
2024-01-10 12:35:30 -08:00
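One bullet above adds model sorting and option labels by organization for the Together AI model picker; a minimal sketch of that grouping idea, with illustrative type and field names rather than the project's actual code:

```ts
// Illustrative only: group model options by organization for a labeled dropdown.
type ModelOption = { id: string; name: string; organization: string };

function groupByOrganization(models: ModelOption[]): Record<string, ModelOption[]> {
  return models.reduce<Record<string, ModelOption[]>>((groups, model) => {
    (groups[model.organization] ||= []).push(model); // start a bucket on first sight
    return groups;
  }, {});
}
```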
Timothy Carambat
8cd3a92c66
[BUG] Fixed mass_assignment vuln ( #566 )
...
Fixed mass_assignment vuln
Co-authored-by: dastaj <78434825+dastaj@users.noreply.github.com>
2024-01-10 08:42:03 -08:00
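Mass-assignment fixes of this kind typically stop writing the raw request body to the database and instead allowlist the fields a caller may set; a hedged sketch of the pattern (field names and the Prisma call in the comment are assumptions, not the patched code):

```ts
// Only known-writable keys survive; anything else in the request body is dropped.
const WRITABLE_FIELDS = ["username", "password"]; // illustrative field names

function pickWritable(body: Record<string, unknown>): Record<string, unknown> {
  return Object.fromEntries(
    Object.entries(body).filter(([key]) => WRITABLE_FIELDS.includes(key))
  );
}

// e.g. prisma.users.update({ where: { id }, data: pickWritable(req.body) })
```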
Timothy Carambat
259079ac58
561 relative API docs URL ( #564 )
...
* patch swagger host to be relative
* change tag on feature request template
2024-01-09 21:52:50 -08:00
Timothy Carambat
755c10b8ca
[API] patch swagger host to be relative ( #563 )
...
patch swagger host to be relative
2024-01-09 19:49:51 -08:00
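Both of the commits above point the generated API docs at a relative server URL so they resolve against whatever host serves them; a small sketch of the idea (title and path are placeholders, not the repository's actual Swagger definition):

```ts
// With a relative `servers` URL, OpenAPI tooling resolves requests against the page's own
// origin, so the docs work locally, in Docker, or behind a proxy without a hard-coded host.
const apiDocDefinition = {
  openapi: "3.0.0",
  info: { title: "Developer API", version: "1.0.0" }, // placeholder metadata
  servers: [{ url: "/api/v1" }], // relative, instead of e.g. "http://localhost:3001/api/v1"
};
```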
timothycarambat
5b2c0ca782
add OnboardAI link
2024-01-09 18:07:41 -08:00
timothycarambat
b8192883c2
fix auto-tag on bug issue yaml
2024-01-09 18:06:01 -08:00
timothycarambat
4801df08c5
remove broken config link
2024-01-09 14:31:28 -08:00
timothycarambat
6a4c99affe
unquote email in config
2024-01-09 14:31:02 -08:00
timothycarambat
964d9a7137
update build ignore
2024-01-09 14:28:56 -08:00
Timothy Carambat
fd4a230669
Set up issue and PR templates ( #559 )
...
* Set up issue templates
Allow blank issues to still be filed
resolves #557
todo: PR template
* update templates + add PR template
* newlines
2024-01-09 14:25:53 -08:00
Sean Hatfield
5c3bb4b8cc
532 UI/UX add slash command modal ( #555 )
...
* WIP slash commands
* add slash command image
* WIP slash commands
* slash command menu feature complete
* move icons to slash command local
* update how slash command component works
* relint with new linter
* Finalize slash command input
Change empty workspace text layout
Patch dev unmount issues on Chatworkspace/index.jsx
---------
Co-authored-by: timothycarambat <rambat1010@gmail.com>
2024-01-09 13:07:09 -08:00
Timothy Carambat
58971e8b30
Build & Publish AnythingLLM for ARM64 and x86 ( #549 )
...
* Update build process to support multi-platform builds
Bump @lancedb/vectordb to 0.1.19 for ARM & AMD compatibility
Patch puppeteer on ARM builds because of broken chromium
resolves #539
resolves #548
---------
Co-authored-by: shatfield4 <seanhatfield5@gmail.com>
2024-01-08 16:15:01 -08:00
Timothy Carambat
cce48c2163
[Automated Builds] Patch lockfiles with new devdeps ( #553 )
...
patch lockfiles with new devdeps
2024-01-08 15:48:03 -08:00
Francisco Bischoff
990a2e85bf
devcontainer v1 ( #297 )
...
Implement support for GitHub codespaces and VSCode devcontainers
---------
Co-authored-by: timothycarambat <rambat1010@gmail.com>
Co-authored-by: Sean Hatfield <seanhatfield5@gmail.com>
2024-01-08 15:31:06 -08:00
Sagar Jaglan
5172bceec3
#520 fixed unauthorized token error ( #535 )
2024-01-08 13:16:00 -08:00
timothycarambat
7d584713a3
workflow: ignore env.example updates
2024-01-06 15:40:03 -08:00
timothycarambat
19e3889f29
update envs to not display https localhost
2024-01-06 15:38:44 -08:00
Bhargav Kowshik
76f733f902
Remove explicit Prisma step instruction in the development setup ( #538 )
2024-01-05 09:42:16 -08:00
timothycarambat
3e088f22b1
fix: patch missing tiktoken method
...
resolves #541
2024-01-05 09:39:19 -08:00
Sean Hatfield
d95d1a9dfd
529 UI: update LLM, embedder, and vectordb selection pages ( #533 )
...
* move llm, embedder, vectordb items to components folder
* add backdrop blur to search in llm, embedder, vectordb preferences
* implement searchable llm preference in settings
* implement searchable embedder in settings
* remove unused useState from embedder preferences
* implement searchable vector database in settings
* fix save changes button not appearing on change for llm, embedder, and vectordb settings pages
* sort selected items in all settings and put selected item at top of list
* no auto-top for selection
---------
Co-authored-by: timothycarambat <rambat1010@gmail.com>
2024-01-04 18:20:58 -08:00
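Several bullets above make the LLM, embedder, and vector-database option lists searchable; the filtering step is presumably a simple case-insensitive match along these lines (names are illustrative):

```ts
// Illustrative search filter over a provider option list.
type ProviderOption = { name: string; value: string };

function filterProviders(options: ProviderOption[], query: string): ProviderOption[] {
  const q = query.trim().toLowerCase();
  if (!q) return options; // empty query shows everything
  return options.filter((option) => option.name.toLowerCase().includes(q));
}
```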
Timothy Carambat
e9f7b9b79e
Handle undefined stream chunk for native LLM ( #534 )
2024-01-04 18:05:06 -08:00
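A hedged sketch of the kind of guard this fix implies: skip chunks that arrive undefined or without text instead of letting the streaming loop throw (the chunk shape is an assumption):

```ts
// Accumulate a streamed response while tolerating undefined or malformed chunks.
async function collectStream(stream: AsyncIterable<{ token?: string } | undefined>): Promise<string> {
  let text = "";
  for await (const chunk of stream) {
    if (!chunk || typeof chunk.token !== "string") continue; // skip bad chunks quietly
    text += chunk.token;
  }
  return text;
}
```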
pritchey
74d2711d80
523-Added support for HTTPS to Server. ( #524 )
...
* Added support for HTTPS to server.
* Move boot scripts to helper file
catch bad SSL boot config
fall back to HTTP when SSL boot fails
---------
Co-authored-by: timothycarambat <rambat1010@gmail.com>
2024-01-04 17:22:15 -08:00
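The commit body describes a boot helper that catches a bad SSL configuration and falls back to HTTP; a self-contained sketch of that behavior, with assumed environment variable names:

```ts
// Try HTTPS when cert paths are configured; fall back to plain HTTP on any SSL boot problem.
import fs from "fs";
import http from "http";
import https from "https";
import type { Express } from "express";

function bootServer(app: Express, port: number) {
  const { HTTPS_CERT_PATH, HTTPS_KEY_PATH } = process.env; // env names are assumptions
  try {
    if (HTTPS_CERT_PATH && HTTPS_KEY_PATH) {
      const options = {
        cert: fs.readFileSync(HTTPS_CERT_PATH),
        key: fs.readFileSync(HTTPS_KEY_PATH),
      };
      return https.createServer(options, app).listen(port);
    }
  } catch (e) {
    console.error("Bad SSL boot config, falling back to HTTP", e);
  }
  return http.createServer(app).listen(port);
}
```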
Sayan Gupta
b7d2756754
Issue #204 Added a check to ensure that 'chunk.payload' exists and contains the 'id' property ( #526 )
...
* Issue #204 Added a check to ensure that 'chunk.payload' exists and contains the 'id' property before attempting to destructure it
* run linter
* simplify condition and comment
---------
Co-authored-by: timothycarambat <rambat1010@gmail.com>
2024-01-04 16:39:43 -08:00
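The guard described above amounts to checking that the payload and its `id` exist before destructuring; roughly (types are illustrative):

```ts
// Only destructure `id` when `chunk.payload` exists and actually carries the property.
function extractId(chunk?: { payload?: { id?: string } } | null): string | null {
  if (!chunk?.payload || !("id" in chunk.payload)) return null;
  const { id } = chunk.payload;
  return id ?? null;
}
```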
timothycarambat
a1b4ed43ba
allow native when on embedder
2024-01-04 16:12:20 -08:00
Timothy Carambat
d8ca92df88
Onboarding V2 ( #502 )
...
* WIP onboarding v2
* Welcome screen for onboarding complete
* fix home page and WIP create skeleton for llm preference search/options
* render llms as options
* add search functionality to llm preference & add survey step
* fix openai settings undefined & create custom logo onboarding page
* add user setup UI
* add data handling & privacy onboarding screen
* add create workspace onboarding screen
* fix survey width in onboarding
* create vector database connection onboarding page
* add workspace image & all skeleton ui complete
* fix navigation buttons and ui tweaks to fit on screen
* WIP LLMPreference
* LLM Preference screen fully functional
* create components for vector db options and fix styling of azure options
* remove unneeded comment
* vector db connection onboarding screen complete
* minor ui tweak to searchbar
* user setup page fully working
* create workspace onboarding page fully working
* useNavigate for navigation between pages
* mobile layout, cleanup old files, survey functionality implemented
* fix default logo appearing when it should be blank & fix password setup bug
* Modify flow of onboarding
todo: embedding set up
* Add embedder setup screen & insert into flow
* update embedding back button
auto-dismiss toasts on each step
* move page defs under imports
fix bg color on mobile styling
---------
Co-authored-by: shatfield4 <seanhatfield5@gmail.com>
2024-01-04 15:54:31 -08:00
Timothy Carambat
92da23e963
Handle special token in TikToken ( #528 )
...
* Handle special token in TikToken
resolves #525
* remove duplicate method
add clarification comment on implementation
2024-01-04 15:47:00 -08:00
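This fix concerns text containing tiktoken special-token markers (for example <|endoftext|>), which encoders reject by default. One generic way to keep token counting from throwing, shown only to illustrate the problem rather than the project's actual handling, is to strip the literal markers before encoding:

```ts
// Strip literal special-token markers so a default-configured encoder will not throw on them.
const SPECIAL_TOKEN_MARKERS = ["<|endoftext|>", "<|im_start|>", "<|im_end|>"]; // illustrative list

function sanitizeForTokenCount(text: string): string {
  return SPECIAL_TOKEN_MARKERS.reduce(
    (cleaned, marker) => cleaned.split(marker).join(""),
    text
  );
}
```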
timothycarambat
a2a903741d
replace stored GIF with GitHub CDN-hosted image
2024-01-04 10:59:24 -08:00
Timothy Carambat
75dd86967c
Implement AzureOpenAI model chat streaming ( #518 )
...
resolves #492
2024-01-03 16:25:39 -08:00
timothycarambat
8001f69454
Merge branch 'master' of github.com:Mintplex-Labs/anything-llm
2024-01-03 15:44:37 -08:00
timothycarambat
dc23961231
drop feedback path in paths
2024-01-03 15:44:26 -08:00
Timothy Carambat
ceadc8d467
patch gpt-4-turbo token allowance for Azure model ( #514 )
2024-01-02 12:49:48 -08:00
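Token-allowance patches like this usually reduce to a per-model context-window lookup; a sketch with commonly cited window sizes (not the project's actual table):

```ts
// Illustrative context-window lookup; values reflect commonly published limits.
const MODEL_TOKEN_LIMITS: Record<string, number> = {
  "gpt-3.5-turbo": 4096,
  "gpt-4": 8192,
  "gpt-4-32k": 32768,
  "gpt-4-turbo": 128000,
};

function tokenAllowance(model: string, fallback = 4096): number {
  return MODEL_TOKEN_LIMITS[model] ?? fallback;
}
```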
timothycarambat
237c544ebc
Merge branch 'master' of github.com:Mintplex-Labs/anything-llm
2024-01-02 12:44:26 -08:00
timothycarambat
d99b87e7d8
patch workspace-chats API endpoint to be generally available instead of forced multi-user
2024-01-02 12:44:17 -08:00
Timothy Carambat
6d5968bf7e
LLM chore cleanup ( #501 )
...
* move internal functions to private in class
simplify LangChain message converter
* Fix hanging Context text when none is present
2023-12-28 14:42:34 -08:00
Timothy Carambat
2a1202de54
Patch Ollama Streaming chunk issues ( #500 )
...
Replace stream/sync chats with LangChain interface for now
connect #499
ref: https://github.com/Mintplex-Labs/anything-llm/issues/495#issuecomment-1871476091
2023-12-28 13:59:47 -08:00
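A minimal sketch of the swap this commit describes, driving Ollama through LangChain's chat wrapper instead of the raw streaming endpoint; the import path and options vary across LangChain JS versions, so treat the details as assumptions:

```ts
import { ChatOllama } from "@langchain/community/chat_models/ollama";

async function streamOllamaReply(prompt: string): Promise<void> {
  const chat = new ChatOllama({ baseUrl: "http://127.0.0.1:11434", model: "llama2" });
  const stream = await chat.stream(prompt); // async iterable of message chunks
  for await (const chunk of stream) {
    process.stdout.write(String(chunk.content)); // tokens arrive incrementally
  }
}
```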
Timothy Carambat
d7481671ba
Prevent external service localhost question ( #497 )
...
* Prevent external service localhost question
* add 0.0.0.0 to docker-invalid URL
* clarify hint
2023-12-28 10:47:02 -08:00
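The hint being clarified is about endpoints that are unreachable from inside a Docker container; a hedged sketch of the validation idea:

```ts
// Inside Docker, localhost/0.0.0.0 points at the container itself, not the host machine,
// so such URLs should be flagged before saving an external service connection.
const INVALID_DOCKER_HOSTS = ["localhost", "127.0.0.1", "0.0.0.0"];

function isReachableFromDocker(serviceUrl: string): boolean {
  try {
    const { hostname } = new URL(serviceUrl);
    return !INVALID_DOCKER_HOSTS.includes(hostname);
  } catch {
    return false; // not a parseable URL
  }
}
```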
Timothy Carambat
e0a0a8976d
Add Ollama as LLM provider option ( #494 )
...
* Add support for Ollama as LLM provider
resolves #493
2023-12-27 17:21:47 -08:00
Timothy Carambat
24227e48a7
Add LLM support for Google Gemini-Pro ( #492 )
...
resolves #489
2023-12-27 17:08:03 -08:00
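For reference, a bare-bones Gemini-Pro call through Google's SDK looks roughly like this; the project's own provider wrapper (key handling, streaming, prompt formatting) is not reproduced here:

```ts
import { GoogleGenerativeAI } from "@google/generative-ai";

async function askGemini(apiKey: string, prompt: string): Promise<string> {
  const genAI = new GoogleGenerativeAI(apiKey);
  const model = genAI.getGenerativeModel({ model: "gemini-pro" });
  const result = await model.generateContent(prompt); // single-shot, non-streaming call
  return result.response.text();
}
```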
timothycarambat
26549df6a9
touchup linting
2023-12-27 13:28:37 -08:00
timothycarambat
1a9a519afb
Merge branch 'master' of github.com:Mintplex-Labs/anything-llm
2023-12-27 10:25:20 -08:00
timothycarambat
5931b60202
add FUNDING.yml
2023-12-27 10:25:12 -08:00
Sean Hatfield
1d9ba76b92
fix 'success is not defined' error ( #484 )
2023-12-21 10:31:14 -08:00
timothycarambat
daadad3859
hoist var in extensions
2023-12-20 19:41:16 -08:00
timothycarambat
31ff4f0832
docs: chain Windows Docker commands for a single-enter run
2023-12-20 11:40:04 -08:00
Timothy Carambat
c613eff31c
[Docker] Windows Docker command in PowerShell ( #480 )
...
* wip
* side by side test
* patch syntax highlighting
* remove spacing formatting
* swap powershell command
2023-12-20 11:33:00 -08:00
timothycarambat
049bfa14cb
fix: fully separate chunk concurrency from chunk length
2023-12-20 11:20:40 -08:00
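The distinction this fix draws is between how large each embedding chunk is and how many chunks are processed at once; a sketch with assumed names and defaults:

```ts
// Two independent knobs: chunk size (characters per embedding chunk) vs. concurrency
// (how many chunks are embedded per batch). Defaults here are illustrative.
const MAX_CHUNK_LENGTH = 1000;
const EMBED_CONCURRENCY = 25;

function toChunks(text: string, size: number = MAX_CHUNK_LENGTH): string[] {
  const chunks: string[] = [];
  for (let i = 0; i < text.length; i += size) chunks.push(text.slice(i, i + size));
  return chunks;
}
```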
timothycarambat
7bee849c65
chore: Force VectorCache to always be on
...
update file picker spacing for attributes
2023-12-20 10:45:03 -08:00
timothycarambat
67725e807a
fix: broken build file
2023-12-19 16:22:25 -08:00
timothycarambat
a7f6003277
fix: set lower maxChunk limit on native embedder to stay within resource constraints
...
chore: update comment for what embedding chunk means
2023-12-19 16:20:34 -08:00
timothycarambat
b40cfead88
Merge branch 'master' of github.com:Mintplex-Labs/anything-llm
2023-12-19 13:30:25 -08:00