_“Within three to eight years, we will have a machine with the general intelligence of an average human being… if we’re lucky, they might decide to keep us as pets.”_ — **Marvin Minsky, 1970**, founder of the MIT AI Lab, Turing Award winner, and advisor to Stanley Kubrick on *2001: A Space Odyssey*.
However, we have reached a point where we can teach machines to learn and generate text and images based on what they learn.
Memorizing statistical patterns isn’t intelligence, of course, but with the advent of Large Language Models (LLMs) that approach near-human understanding of language, we can even use AI to aid human accessibility!
Think chatbots/helpers that don’t need to tell you “Press 3 if you’re having trouble connecting to our servers”, but understand that you want to solve a connectivity issue when you tell them, “My game freezes at the login screen! Pls help!!”
But enough with these thought experiments. Let’s try our hand at building just such a frontend integration — a chat helper that can use OpenAI to answer a potential student’s questions, without them having to tab out of the course!
We’ll use the unofficial `chatgpt` npm package for our purposes.
Using WunderGraph as a backend-for-frontend decouples the frontend (for any set of clients) from the backend, simplifying both maintenance and the two-way communication between the two. It uses GraphQL at build time only, turning this whole operation into simple queries and mutations with complete end-to-end type safety, with data from all your data sources consolidated into a single, unified virtual graph, served as JSON-over-RPC.
Step 0: Dependencies
```bash
npm install express dotenv
npm install chatgpt
npm install puppeteer
```
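The `chatgpt` package here automates a real ChatGPT login, so you’ll want your OpenAI credentials in a `.env` file at the backend’s root, loaded via `dotenv`. The variable names below are the ones the server code reads from `process.env`; the values are placeholders, and this file should never be committed:

```ini
OPENAI_EMAIL=you@example.com
OPENAI_PASSWORD=your-openai-password
```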
Step 1: The Server
```js
import { ChatGPTAPI, getOpenAIAuth } from 'chatgpt'
import * as dotenv from 'dotenv'
dotenv.config()
import express from 'express'

const app = express()
app.use(express.json())
const port = 3001

async function getAnswer(question) {
  // use puppeteer to bypass cloudflare (headful because of captchas)
  const openAIAuth = await getOpenAIAuth({
    email: process.env.OPENAI_EMAIL,
    password: process.env.OPENAI_PASSWORD,
    // isGoogleLogin: true // uncomment this if using google auth
  })
  const api = new ChatGPTAPI({ ...openAIAuth })
  await api.initSession()
  // send a message and wait for the response
  const response = await api.sendMessage(question)
  // response is a markdown-formatted string
  return response
}

// GET
app.get('/api', async (req, res) => {
  res.send({
    question: 'What is the answer to life, the universe, and everything?',
    answer: '42!',
  })
})

// POST
app.post('/api', async (req, res) => {
  // Get the question from the request body
  const { body } = req
  console.log(body.question) // debug
  res.send({
    question: body.question,
    answer: await getAnswer(body.question),
  })
})

app.listen(port, () => {
  console.log(`Example app listening on port ${port}`)
})
```
A POST to this server with a question in the request body returns JSON shaped like:

```json
{ "question": "somequestion", "answer": "someanswer" }
```
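Every call to `getAnswer` drives a full browser-based ChatGPT round-trip, so it’s worth rejecting malformed bodies before paying that cost. Here’s a small guard you could wire into the POST handler (a sketch; `isValidBody` is a hypothetical helper, not part of any package):

```javascript
// Hypothetical guard: accept only bodies with a non-empty string `question`.
function isValidBody(body) {
  return (
    typeof body === 'object' &&
    body !== null &&
    typeof body.question === 'string' &&
    body.question.trim().length > 0
  )
}

// In the POST handler, before calling getAnswer:
// if (!isValidBody(req.body)) {
//   return res.status(400).send({ error: 'A non-empty "question" string is required.' })
// }
```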
Step 2: The OpenAPI Spec
WunderGraph works by introspecting your data sources and consolidating them all into a single, unified virtual graph, that you can then define operations on, and serve the results via JSON-over-RPC.
```json
{
  "openapi": "3.0.0",
  "info": {
    "title": "express-chatgpt",
    "version": "1.0.0",
    "license": {
      "name": "ISC"
    },
    "description": "OpenAPI v3 spec for our API."
  },
  "servers": [
    {
      "url": "http://localhost:3001"
    }
  ],
  "paths": {
    "/api": {
      "get": {
        "summary": "/api",
        "responses": {
          "200": {
            "description": "OK",
            "content": {
              "application/json": {
                "schema": {
                  "type": "object",
                  "properties": {
                    "question": {
                      "type": "string",
                      "description": "The question to be asked",
                      "example": "What is the answer to life, the universe, and everything?"
                    },
                    "answer": {
                      "type": "string",
                      "description": "The answer",
                      "example": "42!"
                    }
                  }
                }
              }
            }
          }
        },
        "tags": []
      },
      "post": {
        "summary": "/api",
        "requestBody": {
          "required": true,
          "content": {
            "application/json": {
              "schema": {
                "type": "object",
                "properties": {
                  "question": {
                    "type": "string",
                    "description": "The question to be asked",
                    "example": "What is the answer to life, the universe, and everything?"
                  }
                }
              }
            }
          }
        },
        "responses": {
          "200": {
            "description": "OK",
            "content": {
              "application/json": {
                "schema": {
                  "type": "object",
                  "properties": {
                    "question": {
                      "type": "string",
                      "description": "The question to be asked",
                      "example": "What is the answer to life, the universe, and everything?"
                    },
                    "answer": {
                      "type": "string",
                      "description": "The answer",
                      "example": "42!"
                    }
                  }
                }
              }
            }
          }
        }
      }
    }
  }
}
```
Step 0: The Quickstart
We can set up our Next.js client and the WunderGraph BFF using the `create-wundergraph-app` CLI. `cd` into the project root (and out of your Express backend directory), and run:
```bash
npx create-wundergraph-app frontend -E nextjs
cd frontend
npm i && npm start
```
That’ll boot up both the WunderGraph and Next.js servers (leveraging the `npm-run-all` package), giving you a Next.js splash page at `localhost:3000` with an example query. If you see that, everything’s working.
Step 1: Setting up WunderGraph
WunderGraph can introspect pretty much any data source you can think of (microservices, databases, APIs) into a secure, typesafe JSON-over-RPC API: OpenAPI REST, GraphQL, PlanetScale, Fauna, MongoDB, and more, plus any Postgres/SQLite/MySQL database.
So, let’s get right to it. Open `wundergraph.config.ts` in the `.wundergraph` directory, and add our REST endpoint as a data source our app depends on, one that WunderGraph should introspect.
```ts
const chatgpt = introspect.openApi({
  apiNamespace: 'chatgpt',
  source: {
    kind: 'file',
    filePath: './chatgpt-spec.json', // path to your OpenAPI spec file
  },
  requestTimeoutSeconds: 30, // optional
})

// add this data source to your config like a dependency
configureWunderGraphApplication({
  apis: [chatgpt],
})
//...
```
Once you’ve run npm start, WunderGraph monitors the necessary files in your project directory automatically; just hitting save here will kick off the code generator, which produces a schema you can inspect (if you want): the `wundergraph.app.schema.graphql` file within `/.wundergraph/generated`.
Step 2: Defining your Operations using GraphQL
This is the part where we write queries/mutations in GraphQL to operate on WunderGraph’s generated virtual graph layer and get us the data we want.
So go to `.wundergraph/operations` and create a new GraphQL file. We’ll call it `GetAnswer.graphql`.
```graphql
mutation ($question: String!) {
  result: chatgpt_postApi(postApiInput: { question: $question }) {
    question
    answer
  }
}
```
Mind the namespacing! The field name combines our `chatgpt` API namespace with the `postApi` operation WunderGraph generated from the spec. Also, notice how we’ve aliased the `chatgpt_postApi` field as `result`.
Each time you’ve hit save throughout this process, WunderGraph’s code generation has been working in the background (and it will, as long as its server is running), generating typesafe, client-specific data-fetching React hooks (`useQuery`, `useMutation`, etc.) on the fly for you (using Vercel’s SWR under the hood).
Step 3: Building the UI
Our UI really needs just two things for a minimum viable product: a content area where you’d show your courses, tutorials, or any kind of content you offer, and a collapsible chat assistant/chatbot interface that uses one of the hooks we just talked about, `useMutation`.
`./pages/_app.tsx`
```tsx
import Head from 'next/head'

function MyApp({ Component, pageProps }) {
  return (
    <>
      <Head>
        <meta charSet="UTF-8" />
        <meta name="viewport" content="width=device-width, initial-scale=1.0" />
        <script src="//cdn.tailwindcss.com"></script>
      </Head>
      <main>
        <Component {...pageProps} />
      </main>
    </>
  )
}

export default MyApp
```
`./pages/index.tsx`

The index page pulls the pieces together: the navbar, a content area, and the chat helper pinned to a corner. A minimal version (the placeholder content is illustrative; adjust to taste):

```tsx
import NavBar from '../components/NavBar'
import ChatHelper from '../components/ChatHelper'

export default function Home() {
  return (
    <>
      <NavBar />
      {/* your courses, tutorials, or other content go here */}
      <div className="container mx-auto p-4">
        <h1 className="text-2xl font-bold">My Courses</h1>
      </div>
      {/* collapsible chat assistant, pinned to the bottom-right */}
      <div className="fixed bottom-4 right-4">
        <ChatHelper />
      </div>
    </>
  )
}
```
`./components/NavBar.tsx`
```tsx
const NavBar = () => {
  return (
    <header className="bg-gray-400 p-4 shadow-md">
      <div className="container mx-auto flex items-center justify-between">
        <a href="#" className="text-xl font-bold">
          My Website
        </a>
        <nav>
          <a href="#" className="px-4 hover:underline">
            Home
          </a>
          <a href="#" className="px-4 hover:underline">
            FAQ
          </a>
        </nav>
      </div>
    </header>
  )
}

export default NavBar
```
`./components/ChatHelper.tsx`
```tsx
import { useState } from 'react'
import { useMutation } from '../components/generated/nextjs'

const ChatHelper = () => {
  const placeholderAnswer = 'Hi! What did you want to learn about today?'
  const [isExpanded, setIsExpanded] = useState(false)
  const [input, setInput] = useState('')
  const { data, isMutating, trigger } = useMutation({
    operationName: 'GetAnswer',
  })
  return (
    <div
      className={`relative rounded-lg bg-white shadow-md ${
        isExpanded ? 'expanded' : 'collapsed'
      }`}
    >
      <button
        className="absolute top-0 right-0 mr-2 p-4"
        onClick={() => setIsExpanded(!isExpanded)}
      >
        <span>{isExpanded ? '❌' : '💬'}</span>
      </button>
      <div className="max-h-lg max-w-lg p-4">
        {isExpanded && (
          <>
            <h2 className="text-lg font-bold">Helper</h2>
            <p id="answer" className="font-bold text-blue-500">
              {/* remember: we aliased chatgpt_postApi as `result` */}
              {data ? data.result?.answer : placeholderAnswer}
            </p>
            <p id="ifLoading" className="font-italics font-bold text-green-500">
              {isMutating ? 'ChatGPT is thinking...' : ''}
            </p>
            <form
              onSubmit={(event) => {
                event.preventDefault()
                if (input) {
                  trigger({
                    question: input,
                  })
                }
              }}
            >
              <input
                className="w-full rounded-md border p-2"
                type="text"
                placeholder="Your question here."
                onChange={(event) => setInput(event.target.value)}
              />
              <button className="mt-2 rounded-md bg-blue-500 p-2 text-white">
                Help me, Obi-Wan Kenobi.
              </button>
            </form>
          </>
        )}
      </div>
    </div>
  )
}

export default ChatHelper
```
The `useMutation` hook fires only when you call `trigger` on form submit with an input (i.e., the question, which ends up as the request body in the Express backend). This is pretty intuitive, but for further questions about `trigger`, check out SWR’s documentation.
Going forward, you’ll probably want to add a `<ul>` list of canned/pre-selected questions (based on the current course) that, when clicked, are passed to the `<ChatHelper>` component as questions, so your students have a list of suggestions for where to start.
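One way to sketch that mapping (the course slugs, questions, and helper name below are all placeholders, not part of the tutorial code):

```javascript
// Hypothetical mapping from course slug to starter questions for the <ul>.
const suggestionsByCourse = {
  'react-basics': ['What is a hook?', 'Why did my component re-render?'],
  'node-backend': ['What does express.json() do?'],
}

// Questions to render as clickable suggestions; empty for unknown courses.
function suggestionsFor(courseSlug) {
  return suggestionsByCourse[courseSlug] || []
}
```

Each suggestion, when clicked, would simply call the same `trigger` the form uses.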
Other than that, you could also use the `conversationId` and `messageId` in the result object, passing them to `sendMessage` as `conversationId` and `parentMessageId` respectively, to track the conversation with the bot and give it awareness of the questions asked immediately before, so your students can ask follow-up questions, get more relevant information, and make the conversation flow more naturally.
```js
// assumes `res` holds the result of a previous api.sendMessage() call

// send a follow-up
res = await api.sendMessage('Can you expand on that?', {
  conversationId: res.conversationId,
  parentMessageId: res.messageId,
})
console.log(res.response)

// send another follow-up
res = await api.sendMessage('What were we talking about?', {
  conversationId: res.conversationId,
  parentMessageId: res.messageId,
})
```
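Since each HTTP request to the Express server is stateless, you’d need to stash the latest `conversationId`/`messageId` pair between requests, keyed by student or session. A minimal in-memory sketch (`rememberTurn` and `optionsFor` are hypothetical helpers, not part of the `chatgpt` package):

```javascript
// Hypothetical in-memory store: map a per-student session id to the
// ids needed to thread the next ChatGPT message as a follow-up.
const conversations = new Map()

// Record the latest turn after each api.sendMessage() call resolves.
function rememberTurn(sessionId, res) {
  conversations.set(sessionId, {
    conversationId: res.conversationId,
    parentMessageId: res.messageId,
  })
}

// Options to spread into the next api.sendMessage(question, options) call;
// an empty object starts a fresh conversation.
function optionsFor(sessionId) {
  return conversations.get(sessionId) || {}
}
```

A real deployment would want this in Redis or a database instead, but the shape of the data is the same.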
Additionally, keep an eye on the `chatgpt` library itself: OpenAI frequently changes how ChatGPT’s research preview works, so you’ll want to make sure your code keeps up with the unofficial API as it’s updated.
Finally, if you want to know more about WunderGraph’s many use cases, check out their Discord community!