Chapter 1

The document provides an overview of developing applications using LangChain, a framework for building LLM apps. It includes examples of integrating various models, such as OpenAI and Hugging Face, and explains the use of prompt templates and few-shot prompting techniques. The instructor, Jonathan Bennion, is an AI engineer with experience at major tech companies and has contributed to LangChain and DeepEval.


The LangChain ecosystem
DEVELOPING LLM APPLICATIONS WITH LANGCHAIN

Jonathan Bennion
AI Engineer & LangChain Contributor
Meet your instructor...

Jonathan Bennion, AI Engineer
ML & AI at Facebook, Google, Amazon, Disney, EA
Created the Logical Fallacy chain in LangChain
Contributor to DeepEval



Build LLM Apps with LangChain

LangChain integrations



Building LLM apps the LangChain way...


Prompting OpenAI models
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="gpt-4o-mini",
    api_key='...'
)

[Link]("What is LangChain?")

LangChain is a framework designed for developing applications...

Additional parameters: max_completion_tokens, temperature, etc.
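
As a minimal sketch of passing these parameters (the values chosen here are purely illustrative, not recommendations):

from langchain_openai import ChatOpenAI

# Same model as above, with generation parameters set explicitly
llm = ChatOpenAI(
    model="gpt-4o-mini",
    api_key='...',
    temperature=0,               # 0 = most deterministic output
    max_completion_tokens=100    # cap the length of the completion
)

llm.invoke("What is LangChain?")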



Prompting Hugging Face models
from langchain_huggingface import HuggingFacePipeline

llm = HuggingFacePipeline.from_model_id(
    model_id="meta-llama/Llama-3.2-3B-Instruct",
    task="text-generation",
    pipeline_kwargs={"max_new_tokens": 100}
)

[Link]("What is Hugging Face?")

Hugging Face is a popular open-source artificial intelligence (AI) library...



Let's practice!
DEVELOPING LLM APPLICATIONS WITH LANGCHAIN

Prompt templates
DEVELOPING LLM APPLICATIONS WITH LANGCHAIN

Jonathan Bennion
AI Engineer & LangChain Contributor
Prompt templates
Recipes for defining prompts for LLMs
Can contain: instructions, examples, and additional context
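
As a quick sketch of that idea (the template wording and variable names below are hypothetical), a single template can combine an instruction, additional context, and the user's question:

from langchain_core.prompts import PromptTemplate

# Illustrative template bundling an instruction, a context placeholder, and a question
template = PromptTemplate.from_template(
    "Answer the question using only the context provided.\n"
    "Context: {context}\n"
    "Question: {question}"
)

prompt = template.invoke({"context": "LangChain is a framework for building LLM applications.",
                          "question": "What is LangChain?"})
print(prompt.text)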



Prompt templates
from langchain_core.prompts import PromptTemplate

template = "Expain this concept simply and concisely: {concept}"


prompt_template = PromptTemplate.from_template(
    template=template
)

prompt = prompt_template.invoke({"concept": "Prompting LLMs"})


print(prompt)

text='Explain this concept simply and concisely: Prompting LLMs'



llm = HuggingFacePipeline.from_model_id(
    model_id="meta-llama/Llama-3.3-70B-Instruct",
    task="text-generation"
)
llm_chain = prompt_template | llm

concept = "Prompting LLMs"


print(llm_chain.invoke({"concept": concept}))

Prompting LLMs (Large Language Models) refers to the process of giving a model a
specific input or question to generate a response.

LangChain Expression Language (LCEL): | (pipe) operator

Chain: connect calls to different components
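
To make the pipe operator concrete, here is a minimal sketch chaining three components; StrOutputParser isn't covered on these slides, but it is a standard langchain_core component used here purely to show that chains can connect more than two steps:

from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import PromptTemplate
from langchain_openai import ChatOpenAI

prompt_template = PromptTemplate.from_template(
    "Explain this concept simply and concisely: {concept}"
)
llm = ChatOpenAI(model="gpt-4o-mini", api_key='...')

# Each | pipes one component's output into the next:
# prompt template -> chat model -> output parser (returns a plain string)
llm_chain = prompt_template | llm | StrOutputParser()

print(llm_chain.invoke({"concept": "Prompting LLMs"}))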



Chat models
Chat roles: system, human, ai

from langchain_core.prompts import ChatPromptTemplate

template = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a calculator that responds with math."),
        ("human", "Answer this math question: What is two plus two?"),
        ("ai", "2+2=4"),
        ("human", "Answer this math question: {math}")
    ]
)



Integrating ChatPromptTemplate
llm = ChatOpenAI(model="gpt-4o-mini", api_key='<OPENAI_API_TOKEN>')

llm_chain = template | llm


math = "What is five times five?"

response = llm_chain.invoke({"math": math})


print(response.content)

5x5=25



Let's practice!
DEVELOPING LLM APPLICATIONS WITH LANGCHAIN

Few-shot prompting
DEVELOPING LLM APPLICATIONS WITH LANGCHAIN

Jonathan Bennion
AI Engineer & LangChain Contributor
Limitations of standard prompt templates

PromptTemplate + ChatPromptTemplate
Handle small numbers of examples
Don't scale to many examples
FewShotPromptTemplate

examples = [
    {
        "question": "...",
        "answer": "..."
    },
    ...
]


Building an example set
examples = [
    {
        "question": "Does Henry Campbell have any pets?",
        "answer": "Henry Campbell has a dog called Pluto."
    },
    ...
]

# Convert pandas DataFrame to list of dicts


examples = df.to_dict(orient="records")
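
As a minimal sketch of that conversion (the DataFrame below is hypothetical, built from the example above):

import pandas as pd

# Hypothetical DataFrame with one row per question/answer example
df = pd.DataFrame({
    "question": ["Does Henry Campbell have any pets?"],
    "answer": ["Henry Campbell has a dog called Pluto."]
})

# Each row becomes a dict: [{"question": "...", "answer": "..."}]
examples = df.to_dict(orient="records")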



Formatting the examples
from langchain_core.prompts import FewShotPromptTemplate, PromptTemplate

example_prompt = PromptTemplate.from_template("Question: {question}\n{answer}")

prompt = example_prompt.invoke({"question": "What is the capital of Italy?",
                                "answer": "Rome"})
print(prompt.text)

Question: What is the capital of Italy?
Rome



FewShotPromptTemplate

prompt_template = FewShotPromptTemplate(
    examples=examples,
    example_prompt=example_prompt,
    suffix="Question: {input}",
    input_variables=["input"]
)

examples: the list of dicts
example_prompt: formatted template
suffix: suffix to add to the input
input_variables: variables to pass into the suffix


Invoking the few-shot prompt template
prompt = prompt_template.invoke({"input": "What is the name of Henry Campbell's dog?"})
print(prompt.text)

Question: Does Henry Campbell have any pets?
Henry Campbell has a dog called Pluto.

...

Question: What is the name of Henry Campbell's dog?



Integration with a chain
llm = ChatOpenAI(model="gpt-4o-mini", api_key="...")

llm_chain = prompt_template | llm


response = llm_chain.invoke({"input": "What is the name of Henry Campbell's dog?"})
print(response.content)

The name of Henry Campbell's dog is Pluto.



Let's practice!
DEVELOPING LLM APPLICATIONS WITH LANGCHAIN
