LangChain

Introduction

Build context-aware reasoning applications

LangChain is a framework for developing applications powered by large language models (LLMs).

User Guide

Install

pip install langchain

# optional extras used in the examples below
pip install langchain-community langchain-ollama langchain-chroma
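
To confirm the install, a minimal sanity check (assuming all four packages above were installed) is to print each distribution's version:

# Print the installed version of each LangChain distribution.
from importlib.metadata import version

for pkg in ("langchain", "langchain-community", "langchain-ollama", "langchain-chroma"):
    print(pkg, version(pkg))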

Use Ollama

There are two ways to use a local Ollama model:

  • Method 1: use langchain_community
# Method 1: Using LangChain's base classes and components directly
from langchain_community.llms import Ollama
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

output_parser = StrOutputParser()

llm = Ollama(model="llama3.1")
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are world class technical documentation writer."),
    ("user", "{input}")
])
chain = prompt | llm | output_parser

print(chain.invoke({"input": "how can langsmith help with testing?"}))
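
Because the pipe operator composes the prompt, model, and parser into a single Runnable, the same chain can also stream its output. A minimal sketch, reusing the chain defined above (stream() is part of the standard Runnable interface):

# Stream the response chunk by chunk instead of waiting for the full string.
for chunk in chain.stream({"input": "how can langsmith help with testing?"}):
    print(chunk, end="", flush=True)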
  • Method 2: use langchain_ollama
# Method 2: Using LangChain's Ollama wrapper
from langchain_ollama import OllamaLLM

model = OllamaLLM(model="llama3.1")
res = model.invoke("Come up with 10 names for a song about parrots")
print(res)
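
OllamaLLM is also a Runnable, so it drops into the same prompt-and-parser chain used in Method 1. A small sketch, assuming the llama3.1 model has already been pulled with Ollama:

# Compose the langchain_ollama wrapper with a prompt and output parser.
from langchain_ollama import OllamaLLM
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a world-class technical documentation writer."),
    ("user", "{input}"),
])
chain = prompt | OllamaLLM(model="llama3.1") | StrOutputParser()

print(chain.invoke({"input": "how can langsmith help with testing?"}))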

