core

API to get structured output from a string using different LLM providers

source

structured_output


def structured_output(
    model:str, # Model name, see examples here or LiteLLM docs for complete list
    system_prompt:str, # Instructions for LLM to process the input string
    response_format:BaseModel, # User-defined Pydantic model class (not an instance) that defines the output schema
    user_prompt:str, # Input string that will be processed
)->BaseModel:

Get structured output from the model by combining the system and user prompts and making the appropriate API call. See here for the full list of supported APIs.
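To make the prompt-combining step concrete, here is a minimal sketch of how the two prompts and the response could be handled internally. This is an illustration only: `build_messages` and `parse_response` are hypothetical helper names, and the real implementation may differ.

```python
from pydantic import BaseModel

def build_messages(system_prompt: str, user_prompt: str) -> list[dict]:
    # OpenAI-style chat format: system instructions first, then the
    # user input string to be processed.
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]

def parse_response(raw_json: str, response_format: type[BaseModel]) -> BaseModel:
    # Validate the provider's JSON reply against the user-supplied
    # Pydantic model, returning a typed instance.
    return response_format.model_validate_json(raw_json)
```

The provider call itself sits between these two steps: the messages list goes out, and the JSON that comes back is validated into the `response_format` class.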

You can control which model is used for the call simply by adjusting the string passed as `model`.

model="azure/gpt-4o-2024-08-06" # e.g. "openai/gpt-4o-2024-08-06" would use the standard OpenAI API instead of Azure
system_prompt = "Extract the event information."

from pydantic import BaseModel

class CalendarEvent(BaseModel):
    name: str
    date: str
    participants: list[str]

user_prompt = "Alice and Bob are going to Carmen's birthday party on 22nd March 2025"
r = structured_output(model=model,
                      system_prompt=system_prompt,
                      response_format=CalendarEvent, # Note: the class itself, without the `()`
                      user_prompt=user_prompt,
                      )

r.model_dump()
{'name': "Carmen's Birthday Party",
 'date': '2025-03-22',
 'participants': ['Alice', 'Bob']}
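Because the return value is an instance of your `response_format` class, you get typed attribute access and Pydantic validation for free. The example below constructs the same payload locally (no API call) to show how such an object behaves; it is a sketch, not part of the library itself.

```python
from pydantic import BaseModel, ValidationError

class CalendarEvent(BaseModel):
    name: str
    date: str
    participants: list[str]

# Build the same object the call above returned, to illustrate its behaviour.
r = CalendarEvent(name="Carmen's Birthday Party",
                  date="2025-03-22",
                  participants=["Alice", "Bob"])

r.name               # typed attribute access
r.model_dump_json()  # serialise back to a JSON string if needed

# Validation also catches malformed model output:
try:
    CalendarEvent.model_validate({"name": "Party"})  # missing fields
except ValidationError:
    pass  # `date` and `participants` are reported as missing
```

This means a response that does not match the schema raises a `ValidationError` rather than silently producing a malformed object.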