Building an AI Chatbot with Essential Python Libraries
In the competitive field of data science and analysis, showcasing relevant projects is a key factor in landing the perfect job. Building a project that blends several facets of the field – AI, Natural Language Processing (NLP), Web Scraping, and Data Visualization – demonstrates a comprehensive understanding of the skills that potential employers look for. One such project is the development of an AI chatbot. This not only emphasizes your command over the above-mentioned areas but also portrays your ability to integrate various technologies to create an impactful end product. Today, we’ll delve into a sample code that can serve as a fantastic foundation for such a project, utilizing several essential Python libraries.
Streamlit is a fast, easy, and powerful way to create web applications in Python. It’s perfect for building data applications because of its simplicity and focus on Python’s strengths. Streamlit is being used here to create the user interface for our chatbot.
The Langchain library is a framework for incorporating tools with large language models. The language model used here is OpenAI's gpt-3.5-turbo.
# Bring in the essential libraries
import streamlit as st
from langchain.chat_models import ChatOpenAI
from langchain.agents import initialize_agent
from langchain.agents import Tool
from langchain.tools import DuckDuckGoSearchRun
from langchain.tools import YouTubeSearchTool
Large Language Model
OpenAI’s large language models, such as GPT-3 or GPT-4, are advanced “AI” models trained on a wide variety of internet text. These models use statistical patterns to predict the next words based on prior input, generating human-like text. They don’t understand text as humans do, yet they excel in tasks like drafting emails, writing code, or answering questions due to their ability to produce contextually relevant responses. So the “AI” isn’t anything to be worried about.
# Build your large language model by instantiating the model from OpenAI
llm = ChatOpenAI(model_name='gpt-3.5-turbo',
                 openai_api_key="your api key",
                 temperature=0.3)
The ‘temperature’ parameter controls the randomness of the model’s output. A low value like 0.3 will make the responses more focused and deterministic, while higher values produce more random outputs.
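To build intuition for what temperature does, here is a small, self-contained sketch of temperature-scaled sampling. This is purely illustrative: the `logits` values and the `sample_with_temperature` helper are made up for the demo and are not OpenAI's actual implementation.

```python
import math
import random

def sample_with_temperature(logits, temperature, rng):
    # Divide logits by the temperature, apply softmax, then sample an index.
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    return rng.choices(range(len(logits)), weights=probs, k=1)[0]

logits = [2.0, 1.0, 0.1]  # hypothetical scores for three candidate tokens
rng = random.Random(0)

low = [sample_with_temperature(logits, 0.3, rng) for _ in range(1000)]
high = [sample_with_temperature(logits, 2.0, rng) for _ in range(1000)]

# A low temperature concentrates almost all picks on the top-scoring token,
# while a high temperature spreads the picks across all candidates.
print("top token share at T=0.3:", low.count(0) / 1000)
print("top token share at T=2.0:", high.count(0) / 1000)
```

Running this shows the top token dominating at temperature 0.3 and the distribution flattening out at 2.0, which is exactly the focused-versus-random trade-off described above.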
DuckDuckGo and YouTube Search
The Langchain library also provides a DuckDuckGo search function and a YouTube search function. DuckDuckGo is a search engine that respects user privacy, and it’s being used to find information on the internet. The YouTube search function, on the other hand, helps us search for relevant videos on YouTube.
# Let's build a few tools to use with our LLM
search = DuckDuckGoSearchRun()
search_tool = Tool(name="search_tool", description="search the net", func=search.run)
yt = YouTubeSearchTool()
yt_tool = Tool(name='youtube', description="youtube search for video", func=yt.run)
tools = [search_tool, yt_tool]
The Tool class is used to encapsulate these functions into tools that can be used by the AI agent. These tools are then passed to the agent during its initialization.
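Conceptually, a tool is just three things bundled together: a name, a description the agent reads to decide when to call it, and a callable. The following stripped-down sketch shows the idea in plain Python; `make_tool` and `fake_search` are invented for illustration and are not langchain's actual internals.

```python
def make_tool(name, description, func):
    # A tool, at its core: an identifier, a hint for the agent, and a callable.
    return {"name": name, "description": description, "func": func}

def fake_search(query):
    # Stand-in for DuckDuckGoSearchRun().run; the real tool queries the web.
    return f"results for: {query}"

demo_tool = make_tool("search_tool", "search the net", fake_search)
print(demo_tool["func"]("python streamlit"))  # → results for: python streamlit
```

The agent works the same way with the real tools: it matches the user's prompt against each tool's description, then invokes the corresponding function.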
The agent uses these tools and the language model to interpret the user’s prompts and provide intelligent responses.
agent = initialize_agent(tools=tools, llm=llm, agent='zero-shot-react-description', verbose=True)
Using Streamlit for the User Interface
After initializing the AI agent and setting up the tools, the next step is to create the user interface for our chatbot using Streamlit.
st.title('🔴🦜🔗😀 AI Search Chat')
prompt = st.text_input('Input your prompt here')
Streamlit is used to create a title for our application and a text input box for the user to enter their prompt. When the user enters a prompt, the following code gets executed:
if prompt:
    response = agent.run(prompt)
    st.write(response)
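To try the chatbot, save all of the code above in a single file and launch it with the Streamlit CLI. The file name `app.py` is just an example, and the package list assumes the tools shown here (adjust if your setup differs).

```shell
# Install the dependencies (versions not pinned; adjust as needed)
pip install streamlit langchain openai duckduckgo-search youtube_search

# Run the app; Streamlit serves it at http://localhost:8501 by default
streamlit run app.py
```

Streamlit reruns the script from the top on every interaction, so entering a new prompt in the text box triggers a fresh `agent.run` call and renders the response below the input.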