# AnythingLLM Embedded Chat Widget

> [!WARNING]
> The AnythingLLM embed is currently in beta. Please request a feature or report a bug via a GitHub Issue if you run into any problems.

> [!WARNING]
> The core AnythingLLM team publishes a pre-built version of the script that is bundled with the main application. You can find it at the frontend URL `/embed/anythingllm-chat-widget.min.js`. You should only be working in this repo if you want to build your own custom embed.
This folder of AnythingLLM contains the source code for the embedded version of AnythingLLM, which provides a public-facing chat interface for your workspace. The AnythingLLM embedded chat widget allows you to expose a workspace and its embedded knowledge base as a chat bubble via a `<script>` or `<iframe>` element that you can embed in any website or HTML page.
## Security

- Users will not be able to view or read context snippets as they can in the core AnythingLLM application.
- Users are assigned a random session ID that is used to persist their chat session.
- **Recommended:** You can limit both the total number of chats an embed can process and the number of chats per session.

By using the AnythingLLM embedded chat widget, you are responsible for securing and configuring the embed so that it does not allow excessive chat-model abuse of your instance.
## Developer Setup

- `cd embed` from the root of the repo
- `yarn` to install all dev and script dependencies
- `yarn dev` to boot up an example HTML page to use the chat embed widget

While in development mode (`yarn dev`), the script will rebuild on any changes to files in the `src` directory. Ensure that the required keys for the development embed are accurate and set.

`yarn build` will compile and minify your build of the script. You can then host and link your built script wherever you like.
## Integrations & Embed Types

### `<script>` tag HTML embed

The primary way of embedding a workspace as a chat widget is via a simple `<script>` tag.

```html
<!--
  An example of a script tag embed
  REQUIRED data attributes:
    data-embed-id     // The unique id of your embed with its default settings
    data-base-api-url // The URL of your AnythingLLM instance backend
-->
<script
  data-embed-id="5fc05aaf-2f2c-4c84-87a3-367a4692c1ee"
  data-base-api-url="http://localhost:3001/api/embed"
  src="http://localhost:3000/embed/anythingllm-chat-widget.min.js">
</script>
```
## Customization Options

### LLM Overrides

- `data-prompt` — Override the chat window with a custom system prompt. This is not visible to the user. If undefined, the embed's attached workspace system prompt will be used.
- `data-model` — Override the chat model used for responses. This must be a valid model string for your AnythingLLM LLM provider. If unset, the embed's attached workspace model selection or the system setting will be used.
- `data-temperature` — Override the chat model temperature. This must be a valid value for your AnythingLLM LLM provider. If unset, the embed's attached workspace model temperature or the system setting will be used.
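For instance, the LLM overrides above can be added alongside the required attributes. The prompt, model string, and temperature values below are purely illustrative — the model string must be valid for whatever LLM provider your instance uses:

```html
<script
  data-embed-id="5fc05aaf-2f2c-4c84-87a3-367a4692c1ee"
  data-base-api-url="http://localhost:3001/api/embed"
  data-prompt="You are a helpful support assistant for Acme Co."
  data-model="gpt-4o"
  data-temperature="0.7"
  src="http://localhost:3000/embed/anythingllm-chat-widget.min.js">
</script>
```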
### Style Overrides

- `data-chat-icon` — The chat bubble icon shown when the chat is closed. Options are `plus`, `chatCircle`, `support`, `search2`, `search`, `magic`.
- `data-button-color` — The chat bubble background color shown when the chat is closed. Value must be a hex color code.
- `data-user-bg-color` — The background color of the user chat bubbles when chatting. Value must be a hex color code.
- `data-assistant-bg-color` — The background color of the assistant response chat bubbles when chatting. Value must be a hex color code.
- `data-brand-image-url` — URL to an image that will be shown at the top of the chat when the chat is open.
- `data-greeting` — Default text message shown when the chat is opened and no previous message history is found.
- `data-no-sponsor` — Setting this attribute to anything will hide the custom or default sponsor at the bottom of an open chat window.
- `data-sponsor-link` — A clickable link for the sponsor section in the footer of an open chat window.
- `data-sponsor-text` — The text displayed in the sponsor section in the footer of an open chat window.
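A sketch of a styled embed using some of the attributes above — the icon choice, hex colors, and greeting text here are example values, not defaults:

```html
<script
  data-embed-id="5fc05aaf-2f2c-4c84-87a3-367a4692c1ee"
  data-base-api-url="http://localhost:3001/api/embed"
  data-chat-icon="support"
  data-button-color="#4F46E5"
  data-user-bg-color="#1F2937"
  data-assistant-bg-color="#F3F4F6"
  data-greeting="Hi! How can we help you today?"
  src="http://localhost:3000/embed/anythingllm-chat-widget.min.js">
</script>
```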
### Behavior Overrides

- `data-open-on-load` — Once loaded, open the chat by default. It can still be closed by the user.
- `data-support-email` — Shows a support email that the user can use to draft an email via the "three dot" menu in the top right. The option will not appear if it is not set.
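An example combining both behavior overrides. Assuming `data-open-on-load` is presence-based like `data-no-sponsor`, any value should enable it; the email address below is a placeholder:

```html
<script
  data-embed-id="5fc05aaf-2f2c-4c84-87a3-367a4692c1ee"
  data-base-api-url="http://localhost:3001/api/embed"
  data-open-on-load="on"
  data-support-email="support@example.com"
  src="http://localhost:3000/embed/anythingllm-chat-widget.min.js">
</script>
```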
### `<iframe>` tag HTML embed

Work in progress.

### `<iframe>` Customization Options

Work in progress.