The advent of OpenAI's API has empowered countless developers to create sophisticated chatbots without breaking a sweat 🧑💻.
In this article, we'll build a Slack chatbot named Fl0Bot that can answer questions about FL0. 💬
We'll use Node.js for our backend and Postgres as our database, then deploy the application effortlessly with the help of FL0 🚀.
Here's our docker-compose.yaml file for local development 🐳
version: "3"
services:
app:
build:
context: .
target: development
env_file: .env
volumes:
- ./src:/usr/src/app/src
ports:
- 8081:80
depends_on:
- db
db:
image: postgres:14
restart: always
environment:
POSTGRES_USER: admin
POSTGRES_PASSWORD: admin
POSTGRES_DB: my-startup-db
volumes:
- postgres-data:/var/lib/postgresql/data
ports:
- 5432:5432
volumes:
postgres-data:
And here's a high-level overview of what we are gonna build 👀
Now, let's delve into the code 🧑💻
npm install express sequelize pg pg-hstore axios @slack/bolt openai uuid
We'll create a .env.example file to list the environment variables, just for reference 👇
NODE_ENV=development
DATABASE_URL=postgres_url
BOT_SYSTEM=system_prompt
OPENAI_API_KEY=open_api_key
SLACK_WEBHOOK=slack_webhook
Next, src/config/index.js maps each environment to the names of the environment variables the app should read.
src/config/index.js
module.exports = {
  "local": {
    "use_env_variable": "DATABASE_URL",
    "openai_api_key": "OPENAI_API_KEY",
    "bot_system": "BOT_SYSTEM",
    "slack_webhook": "SLACK_WEBHOOK",
    synchronize: true
  },
  "development": {
    "use_env_variable": "DATABASE_URL",
    "openai_api_key": "OPENAI_API_KEY",
    "bot_system": "BOT_SYSTEM",
    "slack_webhook": "SLACK_WEBHOOK",
    synchronize: true
  },
  "production": {
    "use_env_variable": "DATABASE_URL",
    "openai_api_key": "OPENAI_API_KEY",
    "bot_system": "BOT_SYSTEM",
    "slack_webhook": "SLACK_WEBHOOK",
    synchronize: true
  }
}
Now, let's set up our database. Since we are using the Sequelize ORM, we need to create models for our Postgres database 🐘.
Here, we create a Chat model in which we store all the communication between the FL0Bot and the User.
Every time a new request comes in, we SELECT the user's recent chats from this table and send them to the FL0Bot as context. 💬
src/models/chat.js
'use strict';
const { Sequelize, DataTypes } = require('sequelize');

module.exports = (sequelize) => {
  const Chat = sequelize.define(
    'Chat',
    {
      chat_id: {
        type: DataTypes.UUID,
        primaryKey: true,
        defaultValue: Sequelize.UUIDV4,
      },
      person_id: {
        type: DataTypes.STRING,
        allowNull: false,
      },
      role: {
        type: DataTypes.STRING,
      },
      content: {
        type: DataTypes.STRING(10000),
      },
      time_created: {
        type: DataTypes.DATE,
        defaultValue: DataTypes.NOW,
      },
      time_updated: {
        type: DataTypes.DATE,
        defaultValue: DataTypes.NOW,
      },
    },
    {
      tableName: 'chats', // Specify the table name explicitly if different from the model name
      timestamps: false, // Disable timestamps (createdAt, updatedAt)
      hooks: {
        beforeValidate: (chat, options) => {
          // Update the time_updated field to the current timestamp before saving the record
          chat.time_updated = new Date();
        },
      },
    }
  );
  return Chat;
};
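Our application code later imports { sequelize, Chat } from ./models, but that index file isn't shown in this article. Here's a minimal sketch of what src/models/index.js could look like, assuming the Postgres connection string lives in the environment variable named by config.use_env_variable:
// src/models/index.js (a minimal sketch; the real file may differ)
'use strict';
const { Sequelize } = require('sequelize');
const env = process.env.NODE_ENV || 'development';
const config = require(__dirname + '/../config/index.js')[env];

// config.use_env_variable holds the *name* of the env var containing the connection string
const sequelize = new Sequelize(process.env[config.use_env_variable], {
  dialect: 'postgres',
  logging: false,
});

// Register the Chat model against this connection
const Chat = require('./chat')(sequelize);

module.exports = { sequelize, Chat };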
First, we create our handleAppMention function.
We also add a system message to the conversation, read from the environment variable named by config.bot_system. This gives GPT the context about FL0.
Example GPT System Prompt
You are a bot that answers queries only around a specific product: fl0 and you will tell nothing about any other product or tools. FL0 is a platform for easily deploying your code as containers. Just push code to your repo and FL0 will build and deploy your app to a fully managed infrastructure complete with databases, logging, multiple environments and lots more!
src/index.js
async function handleAppMention({ event }) {
  const mentionRegex = /<@[\w\d]+>/g; // Regex pattern to match the bot mention, e.g. <@U12345>
  const msg = event.text.replace(mentionRegex, '');
  const person_id = event.user;
  const query = msg;
  try {
    // First message from this user: seed the conversation with the system prompt
    const userExists = await Chat.findOne({ where: { person_id: person_id }, raw: true });
    if (!userExists) {
      await Chat.create({ person_id: person_id, role: 'system', content: process.env[config.bot_system] });
    }
    // Fetch the five most recent messages, then put them back in chronological order for GPT
    const chats = await Chat.findAll({ where: { person_id }, order: [['time_created', 'DESC']], limit: 5, raw: true });
    const chatsGpt = chats.reverse().map((item) => ({ role: item.role, content: item.content }));
    chatsGpt.push({ role: 'user', content: query });
    const response = await openai.createChatCompletion({
      model: 'gpt-3.5-turbo',
      messages: chatsGpt,
    });
    // Persist both sides of the exchange so future requests have context
    await Chat.bulkCreate([
      { person_id, role: 'user', content: query },
      { person_id, role: 'assistant', content: response.data.choices[0].message.content },
    ]);
    // Post the answer back to Slack via the incoming webhook
    await axios.post(process.env[config.slack_webhook], { text: response.data.choices[0].message.content });
    return response.data.choices[0].message.content;
  } catch (error) {
    console.log('ERROR', error);
    return 'Failed to process chat';
  }
}
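For context, the body Slack POSTs for an app_mention event looks roughly like the object below (the values here are purely illustrative). event.text contains the raw mention, which is why we strip the <@...> token before sending the query to GPT:
// Illustrative shape of a Slack app_mention callback (field values are made up)
const exampleBody = {
  type: 'event_callback',
  event: {
    type: 'app_mention',
    user: 'U0123456789',                // becomes person_id
    text: '<@U0BOTBOT01> what is FL0?', // mention is stripped by mentionRegex
    channel: 'C0123456789',
    ts: '1689999999.000100',
  },
};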
🚗 Coming to our routes, we've set up an endpoint (/slack/action-endpoint) to receive Slack's event callbacks and handle app_mention events.
When an app_mention arrives, we hand it off to the handleAppMention function and acknowledge the request right away; the answer itself is posted back to Slack through the webhook.
src/index.js
const express = require('express');
const axios = require('axios');
const { sequelize, Chat } = require('./models');

const env = process.env.NODE_ENV || 'development';
const config = require(__dirname + '/config/index.js')[env];

const app = express();
app.use(express.json());

const { Configuration, OpenAIApi } = require('openai');
const configuration = new Configuration({
  apiKey: process.env[config.openai_api_key],
});
const openai = new OpenAIApi(configuration);

const port = process.env.PORT ?? 3000;

app.post('/slack/action-endpoint', async (req, res) => {
  const { challenge } = req.body;
  // Slack sends a one-time challenge when verifying the request URL; echo it back
  if (challenge) {
    res.status(200).send(challenge);
  } else {
    try {
      switch (req.body.event.type) {
        case 'app_mention':
          // Fire and forget: acknowledge quickly, the reply is posted via the webhook
          handleAppMention(req.body);
          res.status(200).json({ message: 'Success' });
          break;
        default:
          res.status(400).json({ message: 'Bad Request' });
          break;
      }
    } catch (error) {
      console.error(`Error processing Slack event: ${error}`);
      res.status(500).json({ message: error });
    }
  }
});

app.listen(port, async () => {
  console.log(`Example app listening on port ${port}`);
  try {
    await sequelize.authenticate();
    // Create tables if they don't exist; force: false keeps existing chat history intact
    await sequelize.sync({ force: false });
    console.log('Connection has been established successfully.');
  } catch (error) {
    console.error('Unable to connect to the database:', error);
  }
});
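To sanity-check the endpoint locally before wiring it up to Slack, you can simulate Slack's URL-verification handshake. A quick sketch (the file name is just for illustration, and it assumes the app is running on the default port 3000):
// test-challenge.js (hypothetical helper): verify the endpoint echoes Slack's challenge
const axios = require('axios');

async function main() {
  const { data } = await axios.post('http://localhost:3000/slack/action-endpoint', {
    type: 'url_verification',
    challenge: 'test-challenge-token',
  });
  // The server should echo the challenge straight back
  console.log(data); // "test-challenge-token"
}

main().catch(console.error);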
In this tutorial, we're using FL0, a platform designed for straightforward deployment of dockerized Node.js applications, fully integrated with a database.
We just need to push our repo to GitHub. 🫸
Now we deploy the project simply by connecting our GitHub account and selecting the repository.
Then we add the environment variables listed in the .env.example file.
In Slack's Event Subscriptions section, we enable events, set the request URL to our deployed /slack/action-endpoint, and subscribe to the app_mention bot event.
So, there we have it: a fully operational chatbot tailored to answer questions about FL0 and its features, built with Node.js, Postgres, and OpenAI's GPT, and seamlessly deployed with FL0!
Here's the link to our repository for reference ➡️