A Coding Guide to Build an Agentic AI‑Powered Asynchronous Ticketing Assistant Using PydanticAI Agents, Pydantic v2, and SQLite Database


In this tutorial, we’ll build an end‑to‑end ticketing assistant powered by Agentic AI using the PydanticAI library. We’ll define our data rules with Pydantic v2 models, store tickets in an in‑memory SQLite database, and generate unique identifiers with Python’s uuid module. Behind the scenes, two agents, one for creating tickets and one for checking status, leverage Google Gemini (via PydanticAI’s google-gla provider) to interpret your natural‑language prompts and call our custom database functions. The result is a clean, type‑safe workflow you can run immediately in Colab.

!pip install --upgrade pip
!pip install pydantic-ai

First, these two commands upgrade pip to the latest version, picking up recent features and security patches, and then install PydanticAI, the library that lets us define type-safe AI agents and connect Pydantic models to LLMs.

import os
from getpass import getpass


if "GEMINI_API_KEY" not in os.environ:
    os.environ["GEMINI_API_KEY"] = getpass("Enter your Google Gemini API key: ")

We check whether the GEMINI_API_KEY environment variable is already set. If not, we securely prompt you (without echoing) to enter your Google Gemini API key at runtime, then store it in os.environ so that your Agentic AI calls can authenticate automatically.

!pip install nest_asyncio

We install the nest_asyncio package, which patches the already-running asyncio event loop so that blocking entry points such as .run_sync() can be used inside environments like Colab without raising “event loop already running” errors.
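Note that installing the package alone changes nothing at runtime; the patch has to be applied explicitly. A minimal sketch (only needed if you later call .run_sync() inside the notebook, since Colab already supports top-level await):

import nest_asyncio

# Patch the notebook's already-running event loop so synchronous
# entry points like Agent.run_sync() work without raising
# "event loop already running" errors.
nest_asyncio.apply()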

import sqlite3
import uuid
from dataclasses import dataclass
from typing import Literal


from pydantic import BaseModel, Field
from pydantic_ai import Agent, RunContext

We bring in Python’s sqlite3 for our in‑memory database and uuid to generate unique ticket IDs, along with dataclass and Literal for clear dependency and type definitions. We also load Pydantic’s BaseModel/​Field for enforcing data schemas, alongside Agent and RunContext from PydanticAI to wire up and run our conversational agents.

conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE tickets (
    ticket_id TEXT PRIMARY KEY,
    summary   TEXT NOT NULL,
    severity  TEXT NOT NULL,
    department TEXT NOT NULL,
    status    TEXT NOT NULL
)
""")
conn.commit()

We set up an in‑memory SQLite database and define a tickets table with columns for ticket_id, summary, severity, department, and status, then commit the schema so you have a lightweight, transient store for managing your ticket records.
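If you want to confirm the schema took effect before wiring up the agents, a quick optional check (our own addition, not part of the original flow) is to read it back with PRAGMA table_info:

# Optional sanity check: print each column's name and declared type
for cid, name, col_type, notnull, default, pk in conn.execute("PRAGMA table_info(tickets)"):
    print(name, col_type)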

@dataclass
class TicketingDependencies:
    """Carries our DB connection into system prompts and tools."""
    db: sqlite3.Connection


class CreateTicketOutput(BaseModel):
    ticket_id: str = Field(..., description="Unique ticket identifier")
    summary: str   = Field(..., description="Text summary of the issue")
    severity: Literal["low","medium","high"] = Field(..., description="Urgency level")
    department: str = Field(..., description="Responsible department")
    status: Literal["open"] = Field("open", description="Initial ticket status")


class TicketStatusOutput(BaseModel):
    ticket_id: str = Field(..., description="Unique ticket identifier")
    status: Literal["open","in_progress","resolved"] = Field(..., description="Current ticket status")

Here, we define a simple TicketingDependencies dataclass to pass our SQLite connection into each agent call, and then declare two Pydantic models: CreateTicketOutput (with fields for ticket ID, summary, severity, department, and default status “open”) and TicketStatusOutput (with ticket ID and its current status). These models enforce a clear, validated structure on everything our agents return, ensuring you always receive well-formed data.
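To see that validation in action, try constructing a CreateTicketOutput by hand with an out-of-range severity; Pydantic rejects it before anything touches the database (a quick illustration using a made-up ticket ID):

from pydantic import ValidationError

try:
    CreateTicketOutput(
        ticket_id="demo-123",   # hypothetical ID, just for this demo
        summary="Test issue",
        severity="urgent",      # invalid: must be "low", "medium", or "high"
        department="IT"
    )
except ValidationError as e:
    print(e)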

create_agent = Agent(
    "google-gla:gemini-2.0-flash",
    deps_type=TicketingDependencies,
    output_type=CreateTicketOutput,
    system_prompt="You are a ticketing assistant. Use the `create_ticket` tool to log new issues."
)


@create_agent.tool
async def create_ticket(
    ctx: RunContext[TicketingDependencies],
    summary: str,
    severity: Literal["low","medium","high"],
    department: str
) -> CreateTicketOutput:
    """
    Logs a new ticket in the database.
    """
    tid = str(uuid.uuid4())
    ctx.deps.db.execute(
        "INSERT INTO tickets VALUES (?,?,?,?,?)",
        (tid, summary, severity, department, "open")
    )
    ctx.deps.db.commit()
    return CreateTicketOutput(
        ticket_id=tid,
        summary=summary,
        severity=severity,
        department=department,
        status="open"
    )

We create a PydanticAI Agent named create_agent that’s wired to Google Gemini and is aware of our SQLite connection (deps_type=TicketingDependencies) and output schema (CreateTicketOutput). The @create_agent.tool decorator then registers an async create_ticket function, which generates a UUID, inserts a new row into the tickets table, and returns a validated CreateTicketOutput object.
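If you are working in a plain Python script rather than a notebook (or have applied nest_asyncio as above), the same agent can also be driven synchronously; a minimal sketch, with an example prompt of our own:

# Synchronous alternative to `await create_agent.run(...)`
sync_result = create_agent.run_sync(
    "The office Wi-Fi keeps dropping every few minutes.",
    deps=TicketingDependencies(db=conn)
)
print(sync_result.output.ticket_id)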

status_agent = Agent(
    "google-gla:gemini-2.0-flash",
    deps_type=TicketingDependencies,
    output_type=TicketStatusOutput,
    system_prompt="You are a ticketing assistant. Use the `get_ticket_status` tool to retrieve current status."
)


@status_agent.tool
async def get_ticket_status(
    ctx: RunContext[TicketingDependencies],
    ticket_id: str
) -> TicketStatusOutput:
    """
    Fetches the ticket status from the database.
    """
    cur = ctx.deps.db.execute(
        "SELECT status FROM tickets WHERE ticket_id = ?", (ticket_id,)
    )
    row = cur.fetchone()
    if not row:
        raise ValueError(f"No ticket found for ID {ticket_id!r}")
    return TicketStatusOutput(ticket_id=ticket_id, status=row[0])

We set up a second PydanticAI Agent, status_agent, also using the Google Gemini provider and our shared TicketingDependencies. It registers an async get_ticket_status tool that looks up a given ticket_id in the SQLite database and returns a validated TicketStatusOutput, or raises an error if the ticket isn’t found.
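Nothing in the tutorial updates a ticket yet, so every lookup will report “open”. To exercise the other statuses, you can update a row directly through the connection; a hedged sketch, assuming existing_ticket_id holds an ID returned by create_ticket earlier:

# Demo only: manually advance a ticket so get_ticket_status
# has something other than "open" to report.
conn.execute(
    "UPDATE tickets SET status = ? WHERE ticket_id = ?",
    ("in_progress", existing_ticket_id)
)
conn.commit()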

deps = TicketingDependencies(db=conn)


create_result = await create_agent.run(
    "My printer on 3rd floor shows a paper jam error.", deps=deps
)


print("Created Ticket →")
print(create_result.output.model_dump_json(indent=2))


tid = create_result.output.ticket_id
status_result = await status_agent.run(
    f"What's the status of ticket {tid}?", deps=deps
)


print("Ticket Status →")
print(status_result.output.model_dump_json(indent=2))

Finally, we package the SQLite connection into deps, then ask the create_agent to log a new ticket via a natural‑language prompt and print the validated ticket data as JSON. We then take the returned ticket_id, query the status_agent for that ticket’s current state, and print the status in JSON form.
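Top-level await works here because Colab’s notebook already runs an event loop. In a standalone script, wrap the same calls in a coroutine and drive it with asyncio.run; a minimal sketch of that pattern:

import asyncio

async def main():
    deps = TicketingDependencies(db=conn)
    created = await create_agent.run(
        "My printer on 3rd floor shows a paper jam error.", deps=deps
    )
    checked = await status_agent.run(
        f"What's the status of ticket {created.output.ticket_id}?", deps=deps
    )
    print(checked.output.model_dump_json(indent=2))

asyncio.run(main())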

In conclusion, you have seen how Agentic AI and PydanticAI work together to automate a complete service process, from logging a new issue to retrieving its live status, all managed through conversational prompts. Our use of Pydantic v2 ensures every ticket matches the schema you define, while SQLite provides a lightweight backend that’s easy to replace with any database. With these tools in place, you can expand the assistant, adding new agent functions, integrating other AI models like openai:gpt-4o, or connecting real‑world APIs, confident that your data remains structured and reliable throughout.
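For instance, retargeting the creation agent at OpenAI is just a change to the model string; a sketch assuming an OPENAI_API_KEY environment variable is set (tools are registered per agent, so create_ticket would be re-attached with the new agent’s @tool decorator):

create_agent_oai = Agent(
    "openai:gpt-4o",   # swap providers; deps and output schema stay the same
    deps_type=TicketingDependencies,
    output_type=CreateTicketOutput,
    system_prompt="You are a ticketing assistant. Use the `create_ticket` tool to log new issues."
)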


Here is the Colab Notebook.


