Add an AI Chatbot to the Remix Supabase SaaS Starter kit
Add an AI chatbot to your Remix Supabase SaaS Starter kit to provide instant support to your users.
NB: this is currently a work in progress and not yet available.
This plugin allows you to create a chatbot for your app. Users can interact with the chatbot to get information about your app.
The chatbot will index your app's content and provide answers to user queries.
Installation
Install the chatbot plugin from your main app:
npx @makerkit/cli plugins install chatbot
Now, install the plugin package in your application:
pnpm add --filter web @kit/chatbot
Import the Chatbot component
You can now import the Chatbot component in your app:
```tsx
import { Chatbot } from '@kit/chatbot';

<Chatbot siteName={'Makerkit'} />
```
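Typically, you mount the chatbot once so it is available across the app. The following is a minimal sketch, assuming you render it from your root layout at app/root.tsx (simplified here; keep your existing document shell and adjust the placement to your own layout):
```tsx
import { Outlet } from '@remix-run/react';

import { Chatbot } from '@kit/chatbot';

export default function App() {
  return (
    <>
      <Outlet />

      {/* Rendered once at the root so the widget appears on every page */}
      <Chatbot siteName={'Makerkit'} />
    </>
  );
}
```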
Create the API route handler
You can place this handler anywhere since the path is configurable, but we suggest the api.chat._index directory. Create a new file called route.ts at api.chat._index/route.ts:
```ts
import { ActionFunctionArgs } from '@remix-run/server-runtime';

import { handleChatBotRequest } from '@kit/chatbot/server';

export const action = async ({ request }: ActionFunctionArgs) => {
  return handleChatBotRequest(request);
};
```
If you use a different path, please set the VITE_CHATBOT_API_URL environment variable:
VITE_CHATBOT_API_URL=/api/some-other-path
Add the Migration file
Copy the migration file at packages/plugins/chatbot/migrations into your app's migrations directory at apps/web/supabase/migrations.
After copying the migration file, reset your local database so the new schema is applied, and push the migration to your remote instance as well (see the commands below).
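If you use the Supabase CLI directly, the following commands are a minimal sketch of this step (your kit may expose equivalent package scripts, so adjust to your setup):
supabase db reset
supabase db push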
Add your OpenAI API Key
In your hosting provider, add the OPENAI_API_KEY environment variable with your OpenAI API key.
OPENAI_API_KEY=sk-1234567890
Locally, you can set it in your .env.local file, which is never committed to the repository.
Configuration
The Chatbot component accepts the following props:
```tsx
interface ChatBotProps {
  siteName: string;
  conversationId?: string;
  defaultPrompts?: string[];
  isOpen?: boolean;
  isDisabled?: boolean;
  settings?: ChatbotSettings;
  storageKey?: string;
  onClear?: () => void;
  onMessage?: (message: string) => void;
}
```
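For example, the sketch below wires up a few of these props; the prompts and handlers are illustrative placeholders, not values shipped with the kit:
```tsx
import { Chatbot } from '@kit/chatbot';

<Chatbot
  siteName={'Makerkit'}
  defaultPrompts={[
    'How do I get started?',
    'What does the starter kit include?',
  ]}
  onMessage={(message) => console.log('User message:', message)}
  onClear={() => console.log('Conversation cleared')}
/>;
```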
Customizing the Chatbot
You can customize the chatbot by providing a settings object:
```tsx
type Position = 'bottom-left' | 'bottom-right';

interface Branding {
  primaryColor: string;
  accentColor: string;
  textColor: string;
}

export interface ChatbotSettings {
  title: string;
  order: Position;
  branding: Branding;
}
```
For example:
```tsx
<Chatbot
  settings={{
    title: 'Chat with us',
    order: 'bottom-right',
    branding: {
      primaryColor: '#000',
      accentColor: '#000',
      textColor: '#fff',
    },
  }}
/>
```
Indexing Content
At this time, the Chatbot crawls and indexes your app's content by reading its sitemap.xml file.
The crawler will infer the sitemap.xml from the website's robots.txt file. Otherwise, you can provide the sitemap.xml URL using the CHATBOT_WEBSITE_SITEMAP_URL environment variable:
CHATBOT_WEBSITE_SITEMAP_URL=https://example.com/sitemap.xml
Then, you will need to provide your Supabase remote instance URL and key using the .env.local file at packages/plugins/chatbot/.env.local:
SUPABASE_SERVICE_ROLE_KEY=*****
VITE_SUPABASE_URL=https://*****.supabase.co
OPENAI_API_KEY=sk-*****
Please make sure you never commit these keys to your repository.
By default, the chatbot reads the sitemap.xml file to index your website's content.
To launch the chatbot crawler, run the command:
pnpm --filter chatbot indexer -- --url <url> --include <include-paths> --exclude <excluded-paths>
For example:
pnpm --filter chatbot indexer -- --url https://example.com --include /docs,/blog --exclude /blog/secret
The --url flag is required and should point to the website's root URL. The --include and --exclude flags are optional and accept comma-separated lists of paths.
It's highly recommended to use the --exclude flag to prevent the chatbot from indexing far more pages than you need, or pages you don't want indexed at all.
Environment Variables
You can customize the chatbot model by providing the following environment variables:
LLM_MODEL_NAME=gpt-3.5-turbo
LLM_BASE_URL=
LLM_API_KEY=
The model needs to be compatible with the OpenAI API.
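For example, a hedged sketch pointing the chatbot at a locally running OpenAI-compatible server such as Ollama (the model name, URL, and key below are illustrative and depend on your setup):
LLM_MODEL_NAME=llama3.1
LLM_BASE_URL=http://localhost:11434/v1
LLM_API_KEY=ollama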