LLMs#

This page covers the LangChain interfaces for LLMs.

from langchain_ollama import ChatOllama

Structured output#

Some providers support structured output: the model returns its answer in a format you specify.

To make the model follow a given format, use the with_structured_output method. It returns a modified chat object whose responses conform to the specified schema.

Check whether a provider supports structured output in the JSON mode column of the provider features section.
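The schema's docstring and field descriptions are included in the instructions the model receives, so descriptive metadata tends to improve extraction quality. A minimal sketch of such a schema (the class and field names here are illustrative):

```python
from pydantic import BaseModel, Field

class User(BaseModel):
    """Information about a user mentioned in the text."""
    id: str = Field(description="The user's numeric identifier")
    name: str = Field(description="The user's login name")

# The docstring and descriptions become part of the JSON schema
# that with_structured_output passes to the model
schema = User.model_json_schema()
print(schema["description"])
print(schema["properties"]["id"]["description"])
```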


The following cell illustrates how user characteristics are extracted from the given text.

from pydantic import BaseModel

# Schema describing the data to extract
class MyModel(BaseModel):
    id: str
    name: str

# temperature=0 makes the extraction deterministic
model = ChatOllama(model="llama3.1", temperature=0)
structured_model = model.with_structured_output(MyModel)
response = structured_model.invoke(
    "Extract data: 'User llm_lover with id 777 tries to access the database.'"
)
response
MyModel(id='777', name='llm_lover')
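The response is an ordinary Pydantic instance, so the extracted fields are plain attributes and can be serialized back to a dict. A sketch reproducing the validation step without a model call (the raw dict here stands in for the JSON the model would return):

```python
from pydantic import BaseModel

class MyModel(BaseModel):
    id: str
    name: str

# with_structured_output validates the model's JSON output against the
# schema; the same validation can be applied to a raw dict directly
parsed = MyModel.model_validate({"id": "777", "name": "llm_lover"})
print(parsed.id, parsed.name)  # fields are plain attributes
print(parsed.model_dump())     # convert back to a dict
```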