How To Make an AI Online Chatbot Using LangChain
In this post, we'll build an online AI chatbot capable of retrieving the latest data via Google search. For this example, we'll use the OpenAI API to get responses from the gpt-3.5-turbo chat model.
Essentially, we'll build a ChatGPT with access to the internet. But that's not all: since we'll receive the responses in code, we'll be able to further expand on the bot's functionality. To achieve this, we'll use the LangChain library, which is constantly evolving and gives us the ability to create wonders with it.
Prerequisites
Before we can start coding, we'll need to set up our API keys for the OpenAI and SerpApi services. Additionally, because OpenAI's API usage isn't free, you'll need to be on a paid plan for this to work. On the other hand, SerpApi does offer a free plan with 100 calls per month.
Okay, once you have your keys, we can start with the project. First, let's import all the necessary libraries and tools we'll need.
import json
import os
from langchain.agents import initialize_agent, AgentType, Tool
from langchain.memory import ConversationBufferWindowMemory
from langchain.chat_models import ChatOpenAI
from langchain.utilities import SerpAPIWrapper
I stored my API keys inside an auth.json file, from which I'll retrieve them with a function and set them as environment variables.
ROOT = os.path.dirname(__file__)

def get_token(token_name):
    # Read the requested key from the auth.json file next to this script
    with open(os.path.join(ROOT, 'auth.json')) as auth_file:
        auth_data = json.load(auth_file)
    return auth_data[token_name]
os.environ['OPENAI_API_KEY'] = get_token('openai-token')
os.environ['SERPAPI_API_KEY'] = get_token('serper-token')
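As an aside, if you'd rather export the keys in your shell instead of keeping an auth.json file around, a small helper like the one below can fail fast when a key is missing. Note that `require_env` is a name of my own invention, not part of LangChain or OpenAI; this is just a minimal sketch.

```python
import os

def require_env(name):
    """Return the value of an environment variable, raising a clear
    error if it is unset or empty."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f'Missing required environment variable: {name}')
    return value
```

This way, a forgotten key produces an obvious error at startup instead of a confusing authentication failure later on.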
Defining & putting to use the AI online Chatbot
Now that we've set up our environment, we can start defining all the components of our chatbot. These components include a large language model (LLM) instance, memory, tools, and an agent instance to tie it all together.
You may already be familiar with LLM and memory instances, which we worked with in another chatbot tutorial. However, tools and agents are something we've yet to encounter. In short, tools are plugins our LLM can interact with, and an agent is what uses them.
There are various types of agents, and since we're building a chatbot, we want it to retrieve data in a conversational manner. Therefore, we'll use the CHAT_CONVERSATIONAL_REACT_DESCRIPTION type.
search = SerpAPIWrapper()
tools = [
    Tool(
        name='Current Search',
        func=search.run,
        description='Useful for answering questions about current events or the current state of the world. The input should be a single search term.'
    )
]
memory = ConversationBufferWindowMemory(memory_key='chat_history', return_messages=True)
llm = ChatOpenAI(temperature=0.9)
agent_chain = initialize_agent(
    tools=tools,
    llm=llm,
    agent=AgentType.CHAT_CONVERSATIONAL_REACT_DESCRIPTION,
    memory=memory
)
Now, we’re ready to put it to use.
while True:
    prompt = input('You: ')
    if prompt:
        response = agent_chain.run(input=prompt)
        print('AI:', response)
Keep in mind that you have a limited number of API calls when using the SerpApi service. You may therefore encounter errors or unexpected behaviour from your bot once you exceed that limit.
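To keep the chat loop from crashing when a quota or rate limit is hit, you could wrap the agent call in a small error handler. The `safe_run` function and its fallback message below are my own sketch, not part of LangChain; in the loop you would call `safe_run(agent_chain.run, prompt)` instead of `agent_chain.run(input=prompt)`.

```python
def safe_run(agent_run, prompt, fallback='Sorry, something went wrong. Please try again later.'):
    """Call the agent and return a fallback message instead of crashing
    if the underlying API (OpenAI or SerpApi) raises an error."""
    try:
        return agent_run(input=prompt)
    except Exception as err:  # e.g. rate-limit or quota errors
        print('Agent error:', err)
        return fallback
```

Catching the broad `Exception` here is a pragmatic choice for a demo; in a real application you'd likely catch the specific exception types raised by the libraries you use.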
Conclusion
To conclude, we made a simple AI chatbot capable of retrieving data by searching online. I learned a lot while working on this project, and I hope it proves helpful to you as well.