Large Language Models (LLMs) are machine learning models that use deep learning to process and understand natural language. These models are trained on extensive text data to learn language patterns and relationships, enabling them to perform tasks like translation, sentiment analysis, and chatbot interactions. They are able to understand complex text, recognize connections between entities, and produce coherent, grammatically correct text.

LLMs are great technologies but, as of now, they come with a few limitations. One of their main limitations is that these models are trained on data up to a particular date, so they have no information about anything after that. The second, related to the first, is that they don't have access to the internet, which again limits the information available to them to what they were trained with. That's where LangChain comes in. LangChain can extend an LLM's capabilities by allowing it to access the internet, for instance by performing Google searches to retrieve up-to-date information.

What is LangChain?

LangChain is a framework for developing applications powered by language models. It lets you connect a language model like GPT-4 to other sources of data, such as Google search results. It also allows the language model to interact with its environment: agents break the initially proposed task into sequential subtasks and work through them to solve it.

Let’s first install the libraries that we will use in this article.

Python
!pip install langchain openai google-search-results

In addition to that, we need to add two API keys to the environment: one for the OpenAI API, which we use to process the text, and one for Serper.dev, which scrapes Google search results for additional, up-to-date information.

Python
import os

# OpenAI API key
os.environ["OPENAI_API_KEY"] = "input your API key here"

# Serper API key
os.environ["SERPER_API_KEY"] = "input your API key here"

Extending LLM capabilities through LangChain

Before moving forward, what is an agent? An agent is a reasoning engine that determines which actions or tasks to take and in which order. In other words, it works out the subtasks needed to achieve the specified outcome.
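
To make that idea concrete, the loop an agent runs can be sketched roughly as below. This is purely illustrative pseudocode of the concept, not LangChain's actual implementation; llm_decide and run_tool are hypothetical helpers standing in for the reasoning engine and the available tools.

Python
# Purely illustrative sketch of an agent loop (not LangChain's real code).
# `llm_decide` and `run_tool` are hypothetical helpers.
def agent_loop(task, llm_decide, run_tool, max_steps=5):
    history = [f"Task: {task}"]
    for _ in range(max_steps):
        decision = llm_decide(history)          # the LLM picks the next action
        if decision["action"] == "finish":
            return decision["answer"]           # the LLM judges the task solved
        observation = run_tool(decision["action"], decision["input"])
        history.append(f"Observation: {observation}")  # feed the result back
    return "No final answer within the step limit"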

To use agents, we need to initialize them and specify the type of agent suited to our task. For that, we have to import initialize_agent and AgentType.

Python
from langchain.agents import initialize_agent
from langchain.agents import AgentType

This will allow us to work with text and access the information contained in the data the LLM was trained with, but still without access to the internet.

Let’s define the template for the prompt and its input variables. In this case, we will pass two variables: “question” and “how”.

  • question: it refers to the main question we want to answer.
  • how: it refers to how we want the answer to look. For example, whether we want just a brief answer, a number, or a whole explanation.

Python
# Import functions
from langchain import PromptTemplate, LLMChain

# Define the template of the question
template = """Question: {question}
Answer: Let's think step by step {how}."""

# Create the prompt
prompt = PromptTemplate(template=template, 
                        input_variables=["question", "how"])

If we want to use an OpenAI model like GPT-4, we need to import the following:

Python
from langchain.llms import OpenAI

Now we initialize the LLM, in our case using the OpenAI API. We set the temperature to 0 so that it gives us more focused and deterministic results. Higher values (up to 2) would produce more random responses.

Python
llm = OpenAI(temperature=0)
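
Just to illustrate the contrast (this line is not needed for the rest of the article), a higher-temperature model could be created like this:

Python
# For illustration only: a higher temperature gives more varied,
# less deterministic responses on repeated runs of the same prompt
llm_creative = OpenAI(temperature=1.2)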

And we add it to the language chain together with the prompt:

Python
llm_chain = LLMChain(prompt=prompt, llm=llm)

To get an answer to our question we need to define the question and, additionally, how we want the answer. In this first example, we just want a number: the number of Grand Slams that Carlos Alcaraz (a Spanish tennis player) has at the moment.

Python
question = "How many Grand Slams does Carlos Alcaraz have?"
how = "as a single number, without any text"
print(llm_chain.run({'question':question, 'how':how}))
0

If we change how we want our answer to look (how), for example asking for an explanation rather than just a number, we get the following:

Python
question = "How many Grand Slams does Carlos Alcaraz have?"
how = "as an explanation, not just a number"
print(llm_chain.run({'question':question, 'how':how}))
Carlos Alcaraz is a professional tennis player from Spain. He is currently ranked as the world's No. 128 in the ATP rankings. As of April 2021, he has not yet won any Grand Slam titles. He has, however, reached the third round of the 2021 Australian Open and the second round of the 2021 French Open. He has also won two ATP Challenger Tour titles and one ITF Futures title.

We can see that without access to the internet, it fails to give the right answer (he has 2 Grand Slams at the moment).

Provide the LLM with internet access

To gain access to the internet we need tools, which are interfaces that an agent can use to interact with the world. These are the tools the agent will use to complete its task; in our case, that means access to Google search results.

For that, we will use the Serper API, a low-cost Google Search API that returns answer box, knowledge graph, and organic results data from Google Search. We can import its wrapper like this:

Python
from langchain.utilities import GoogleSerperAPIWrapper

Remember that you need to add the API key to your environment as we did at the beginning of the article. Let’s see how it works on its own, before adding it to LangChain:

Python
search = GoogleSerperAPIWrapper()
results = search.results("Carlos Alcaraz")
display(results)
{
    "searchParameters": {
        "q": "Carlos Alcaraz",
        "gl": "us",
        "hl": "en",
        "num": 10,
        "type": "search",
    },
    "knowledgeGraph": {
        "title": "Carlos Alcaraz",
        "type": "Spanish tennis player",
        "imageUrl": "https://encrypted-tbn0.gstatic.com/images?q=tbn:ANd9GcRYk7d777Z22vWKPBiYXVetAI3GI11-C0vf1Xv-Xg&s=0",
        "description": "Carlos Alcaraz Garfia is a Spanish professional tennis player. He is ranked as the world No. 1 in men's singles by the Association of Tennis Professionals. Alcaraz has won twelve ATP Tour-level singles titles, including two Grand Slams and four...",
        "descriptionSource": "Wikipedia",
        "descriptionLink": "https://en.wikipedia.org/wiki/Carlos_Alcaraz",
        "attributes": {
            "Born": "May 5, 2003 (age 20\xa0years), El Palmar, Spain",
            "Grand slams won (singles)": "2",
            "Height": "6′ 1″",
            "Nationality": "Spanish",
            "Siblings": "Alvaro Alcaraz Garfia, Jaime Alcaraz Garfia, and Sergio Alcaraz Garfia",
            "Weight": "179\xa0lbs",
            "Coach": "Juan Carlos Ferrero",
        },
    },
    "organic": [
        {
            "title": "Carlos Alcaraz - Wikipedia",
            "link": "https://en.wikipedia.org/wiki/Carlos_Alcaraz",
            "snippet": "Carlos Alcaraz Garfia is a Spanish professional tennis player. He is ranked as the world No. 1 in men's singles by the Association of Tennis Professionals ...",
            "attributes": {
                "Current ranking": "No. 1 (26 June 2023)",
                "Country (sports)": "Spain",
            },
            "imageUrl": "https://encrypted-tbn0.gstatic.com/images?q=tbn:ANd9GcQr4j3dQH81f76-fJevN1W7UyCui7i90GeBtEzNk38sLGodG76M55n1phA&s",
            "position": 1,
        },
...

This is the kind of information the Serper API can provide us with. It will then be analyzed by the LLM, in our case via the OpenAI API, which will either define further tasks or deliver the final result.
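
Since the response is just a Python dictionary, we could also read specific fields from it directly. For instance, the Grand Slam count already appears in the knowledge graph attributes shown above (a minimal sketch; the .get() calls are there because these keys won't exist for every query):

Python
# Pull the Grand Slam count straight out of the knowledge graph attributes
kg = results.get("knowledgeGraph", {})
slams = kg.get("attributes", {}).get("Grand slams won (singles)")
print(slams)  # 2 (returned as a string)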

Let’s see how the two work together.

First, let’s use it as a utility. This consists of wrapping the Serper API within a Tool. We need to do the following:

Python
# Import the function Tool
from langchain.agents import Tool

# Initialize Serper API wrapper
search = GoogleSerperAPIWrapper()

# Load the tool
tools = [
    Tool(
        name="Intermediate Answer",
        func=search.run,
        description="useful for when you need to ask with search"
    )
]

# Now we initialize the agent by passing the Serper API wrapper tool, 
# the LLM and the type of agent
as_utility = initialize_agent(tools, 
                              llm,
                              agent=AgentType.SELF_ASK_WITH_SEARCH,
                              verbose=True,
                              handle_parsing_errors=True)

# Finally we ask the question
as_utility.run(question + ' ' + how)
> Entering new AgentExecutor chain...
 Yes.
Follow up: Who is Carlos Alcaraz?
Intermediate answer: Carlos Alcaraz Garfia (born 5 May 2003) is a Spanish professional tennis player. He is ranked as the world No. 1 in men's singles by the Association of Tennis Professionals (ATP).

Follow up: How many Grand Slams has Carlos Alcaraz won?
Intermediate answer: 2
So the final answer is: Carlos Alcaraz has won two Grand Slams.

> Finished chain.
'Carlos Alcaraz has won two Grand Slams.'

Let’s see what it did. First, it needs information about who Carlos Alcaraz is, so it accesses the internet to find that out. Once it knows that, it can look up the number of Grand Slams he has won. Finally, it formats the answer as indicated in the question; in this case, we didn’t just want a number.

We can also load it as a tool and use it with an agent:

Python
# Import the function load_tools
from langchain.agents import load_tools

# Define the tools
tools = load_tools(["google-serper"])

# Initialize the agent with the LLM and the tools
as_tool = initialize_agent(tools, llm, verbose=True)

# Answer the question
as_tool.run(question)
> Entering new AgentExecutor chain...
 I need to find out how many Grand Slams Carlos Alcaraz has won
Action: google_serper
Action Input: "Carlos Alcaraz Grand Slams"
Observation: 2
Thought: I now know the final answer
Final Answer: Carlos Alcaraz has won 2 Grand Slams.

> Finished chain.
'Carlos Alcaraz has won 2 Grand Slams.'

We observe that the output is very similar. We will use this second approach from now on, as it allows for more flexibility.

Provide the LLM with multiple tools

This approach also allows us to add additional tools, for example llm-math, which enables us to make mathematical calculations accurately.

Python
tools = load_tools(["google-serper", "llm-math"], llm=llm)
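
As a quick sanity check, the llm-math tool can also be called on its own, outside of any agent. The index below assumes the order in which the tools were loaded above, and the exact output format may vary:

Python
# Run the llm-math (Calculator) tool directly on an expression
print(tools[1].run("2 + (35 - 20) * 0.5"))  # should print something like: Answer: 9.5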

We can add more instructions to force it to use the maths tool. For example, continuing with Carlos Alcaraz, let’s try to forecast, based on his achievements, the number of Grand Slams he will have by the time he is 35 years old. In this case, we are not really interested in the result, as we know this is far more complex than a sum or a multiplication. We are just interested in seeing how LangChain handles the additional mathematical tasks needed to achieve the desired outcome.

Python
# Define additional question and additional indications
additional_question = '''Based on that and in his number of active years left, 
if he follows this progression, 
how many Grand Slams will he have when he will be 35?'''
additional_indications = 'Round it to the nearest integer'

# Initialize the agent with the loaded tools
as_tool = initialize_agent(tools, llm, verbose=True)

# Answer the query
as_tool.run(question + ' ' + additional_question + ' ' + how + ' ' + additional_indications)
> Entering new AgentExecutor chain...
 I need to find out how many Grand Slams Carlos Alcaraz has already won and then use that information to make a prediction about how many he will have when he is 35.
Action: google_serper
Action Input: "Carlos Alcaraz Grand Slams"
Observation: 2
Thought: I need to find out how many years Carlos Alcaraz has left in his active career.
Action: google_serper
Action Input: "Carlos Alcaraz active years"
Observation: Carlos Alcaraz Garfia (born 5 May 2003) is a Spanish professional tennis player. ... At 18 years of age, Alcaraz became the youngest man in a major fourth ... Current ranking: No. 1 (26 June 2023). Country (sports): Spain. Height: 1.83 m (6 ft 0 in). Turned pro: 2018. Carlos Alcaraz has spent in total 30 weeks as ATP world No. 1. GeneralEdit. Year, 2018, 2019, 2020, 2021, 2022 ... Carlos turned professional in 2018 when he was 15, playing Futures and Challenger tournaments. However, he landed his first shot on the ATP Tour after receiving ... Age: 20. Turned Pro: 2018. Country: Spain. Height: 6′ 0″ (1.83m). Carlos Alcaraz is a 19-year-old tennis player from Spain, who became the new world No 1 on September 12, 2022. He won his first Grand Slam title at the US ... Carlos Alcaraz Garfia is a Spanish professional tennis player known for his performance on the ... Years Active: 2018-Present; Nicknames: Charly, Carlitos. Carlos Alcaraz ; Birthday May 5 ; Birth Sign Taurus ; Birthplace Murcia ; Age 20 years old. Family Life: He was born to Carlos and Virginia Alcaraz, and he was one of four chil.... He has already won two Masters 1000 tournaments in 2022 (Miami and Madrid) and is sixth in the world ranking - all at 19 years old. He ... The 20-year-old Spaniard beat Novak Djokovic in five sets to steal the 2023 men's title, becoming the second winner outside of the Big Three ... Checking out his career earnings to date on the ATP tour, purse winnings stand at $16,660,732. So far in 2023, Alcaraz has won $4,831,089 with ...
Thought: I need to calculate how many Grand Slams Carlos Alcaraz will have when he is 35.
Action: Calculator
Action Input: (2 + (35 - 20) * 0.5)
Observation: Answer: 9.5
Thought: I now know the final answer
Final Answer: Carlos Alcaraz will have approximately 10 Grand Slams when he is 35.

> Finished chain.
'Carlos Alcaraz will have approximately 10 Grand Slams when he is 35.'

We can see that it first finds information about the number of Grand Slams Alcaraz has won so far. For this, it needs access to the internet, which it gets through the Serper API. After that, it needs to find out how many active years he has left, again using Google search results. Finally, it calls the maths tool to do a calculation (whether it makes sense is beside the point here) to estimate the number of Grand Slams he will have when he is 35. The last step is outputting the result as explanatory text, as instructed in the prompt.

Conclusion

This was a very simple example of what LangChain together with an LLM can achieve. As you can see, it can give superpowers to ChatGPT (or the LLM you prefer to use)! In future articles, we will deal with more complex architectures!
