├── README.md
├── LICENSE
└── pydantic_get_started.ipynb

/README.md:
--------------------------------------------------------------------------------
1 | # pydantic-tutorials
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 | MIT License
2 | 
3 | Copyright (c) 2024 sugarforever
4 | 
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 | 
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 | 
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
22 | 
--------------------------------------------------------------------------------
/pydantic_get_started.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "nbformat": 4,
3 | "nbformat_minor": 0,
4 | "metadata": {
5 | "colab": {
6 | "provenance": [],
7 | "authorship_tag": "ABX9TyMdLVGi+A9kDSK2fvr4ZDFK",
8 | "include_colab_link": true
9 | },
10 | "kernelspec": {
11 | "name": "python3",
12 | "display_name": "Python 3"
13 | },
14 | "language_info": {
15 | "name": "python"
16 | }
17 | },
18 | "cells": [
19 | {
20 | "cell_type": "markdown",
21 | "metadata": {
22 | "id": "view-in-github",
23 | "colab_type": "text"
24 | },
25 | "source": [
26 | "\"Open"
27 | ]
28 | },
29 | {
30 | "cell_type": "markdown",
31 | "source": [
32 | "# Get Started with Pydantic AI\n",
33 | "\n",
34 | "Ever used FastAPI, LangChain, or the OpenAI Python SDK? Then you've already used [Pydantic](https://pydantic.dev/) under the hood. Pydantic is Python's go-to library for data validation, used by thousands of packages to ensure data matches expected types and formats.\n",
35 | "What makes Pydantic special is its use of standard Python type hints. Instead of learning a new syntax, you just write regular Python code with type annotations, and Pydantic handles the validation:\n",
36 | "\n",
37 | "```python\n",
38 | "from pydantic import BaseModel\n",
39 | "\n",
40 | "class User(BaseModel):\n",
41 | "    name: str\n",
42 | "    age: int\n",
43 | "    email: str\n",
44 | "\n",
45 | "json_string = '{\"name\": \"John Doe\", \"age\": 30, \"email\": \"john@example.com\"}'\n",
46 | "user = User.model_validate_json(json_string)\n",
47 | "\n",
48 | "```\n",
49 | "\n",
50 | "[Pydantic AI](https://ai.pydantic.dev/) builds on this foundation to make building AI applications just as straightforward. 
Created by the Pydantic team, it provides a framework for working with large language models (LLMs) that feels natural to Python developers.\n",
51 | "\n",
52 | "Key features of Pydantic AI:\n",
53 | "- Works with major LLM providers (OpenAI, Anthropic, Gemini, Groq)\n",
54 | "- Uses standard Python for control flow and composition\n",
55 | "- Validates AI responses using Pydantic models\n",
56 | "- Supports streaming responses with validation\n",
57 | "- Includes built-in debugging and monitoring\n",
58 | "\n",
59 | "In this tutorial, we'll explore how to use Pydantic AI to build reliable AI applications using familiar Python patterns. Let's get started!"
60 | ],
61 | "metadata": {
62 | "id": "DbzujyM2Pc9O"
63 | }
64 | },
65 | {
66 | "cell_type": "markdown",
67 | "source": [
68 | "## Installation\n",
69 | "\n",
70 | "The only Python package you need for now is `pydantic_ai`."
71 | ],
72 | "metadata": {
73 | "id": "fjI3ajM8fc5q"
74 | }
75 | },
76 | {
77 | "cell_type": "code",
78 | "execution_count": 1,
79 | "metadata": {
80 | "colab": {
81 | "base_uri": "https://localhost:8080/"
82 | },
83 | "id": "xxQbhxKTypJg",
84 | "outputId": "93a11690-898e-40c9-b7c4-628c44d0f8ca"
85 | },
86 | "outputs": [
87 | {
88 | "output_type": "stream",
89 | "name": "stdout",
90 | "text": [
91 | "\u001b[?25l \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m0.0/60.7 kB\u001b[0m \u001b[31m?\u001b[0m eta \u001b[36m-:--:--\u001b[0m\r\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m60.7/60.7 kB\u001b[0m \u001b[31m1.7 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
92 | "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m209.8/209.8 kB\u001b[0m \u001b[31m5.5 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
93 | "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m127.1/127.1 kB\u001b[0m \u001b[31m5.0 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
94 | "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m108.8/108.8 kB\u001b[0m \u001b[31m4.3 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
95 | "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m71.1/71.1 kB\u001b[0m \u001b[31m2.6 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
96 | "\u001b[?25h\u001b[31mERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.\n",
97 | "google-colab 1.0.0 requires google-auth==2.27.0, but you have google-auth 2.37.0 which is incompatible.\u001b[0m\u001b[31m\n",
98 | "\u001b[0m"
99 | ]
100 | }
101 | ],
102 | "source": [
103 | "!pip install pydantic_ai -qU"
104 | ]
105 | },
106 | {
107 | "cell_type": "markdown",
108 | "source": [
109 | "## Get Colab Environment Ready\n",
110 | "\n",
111 | "Because Colab already runs an asyncio event loop, we will also need `nest-asyncio` to make the demo application run.\n",
112 | "\n",
113 | "The next step is to set the `OPENAI_API_KEY` environment variable so that the Pydantic AI agents can pick it up when using OpenAI models.\n",
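"\n",
"If you are not running this notebook in Colab, `google.colab.userdata` (used below) will not be available; a minimal alternative is to read the key with the standard-library `getpass` module and export it yourself:\n",
"\n",
"```python\n",
"import getpass, os\n",
"\n",
"# Prompt for the key so it is never hard-coded in the notebook.\n",
"os.environ['OPENAI_API_KEY'] = getpass.getpass('OpenAI API key: ')\n",
"```"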
114 | ], 115 | "metadata": { 116 | "id": "spdjK3wrfmFR" 117 | } 118 | }, 119 | { 120 | "cell_type": "code", 121 | "source": [ 122 | "!pip install nest-asyncio -qU" 123 | ], 124 | "metadata": { 125 | "id": "4eNx6IKlzZgB" 126 | }, 127 | "execution_count": 2, 128 | "outputs": [] 129 | }, 130 | { 131 | "cell_type": "code", 132 | "source": [ 133 | "import nest_asyncio\n", 134 | "nest_asyncio.apply()" 135 | ], 136 | "metadata": { 137 | "id": "BZtoNwxczdux" 138 | }, 139 | "execution_count": 3, 140 | "outputs": [] 141 | }, 142 | { 143 | "cell_type": "code", 144 | "source": [ 145 | "from google.colab import userdata\n", 146 | "OPENAI_API_KEY = userdata.get('OPENAI_API_KEY')\n", 147 | "\n", 148 | "import os\n", 149 | "os.environ['OPENAI_API_KEY'] = OPENAI_API_KEY" 150 | ], 151 | "metadata": { 152 | "id": "lQUPHTYgzGiM" 153 | }, 154 | "execution_count": 4, 155 | "outputs": [] 156 | }, 157 | { 158 | "cell_type": "markdown", 159 | "source": [ 160 | "## Pydantic AI Agents\n", 161 | "\n", 162 | "Let's start looking into some cool examples of Pydantic AI agents." 163 | ], 164 | "metadata": { 165 | "id": "XRmerIQOgBE2" 166 | } 167 | }, 168 | { 169 | "cell_type": "markdown", 170 | "source": [ 171 | "### The Simplest One\n", 172 | "\n", 173 | "Chat with OpenAI `gpt-4o` straight away." 174 | ], 175 | "metadata": { 176 | "id": "z-gg8vLMgKwa" 177 | } 178 | }, 179 | { 180 | "cell_type": "code", 181 | "source": [ 182 | "from pydantic_ai import Agent" 183 | ], 184 | "metadata": { 185 | "id": "Ywyz_i_9y4SC" 186 | }, 187 | "execution_count": 5, 188 | "outputs": [] 189 | }, 190 | { 191 | "cell_type": "code", 192 | "source": [ 193 | "agent = Agent(\"openai:gpt-4o\")\n", 194 | "response = agent.run_sync(\"Hey, dude!\")\n", 195 | "print(response.data)" 196 | ], 197 | "metadata": { 198 | "colab": { 199 | "base_uri": "https://localhost:8080/" 200 | }, 201 | "id": "muzkRuKj867U", 202 | "outputId": "48780d98-44b8-452d-f882-e216a1a8b9f4" 203 | }, 204 | "execution_count": 6, 205 | "outputs": [ 206 | { 207 | "output_type": "stream", 208 | "name": "stdout", 209 | "text": [ 210 | "Hey there! 
How can I assist you today?\n" 211 | ] 212 | } 213 | ] 214 | }, 215 | { 216 | "cell_type": "markdown", 217 | "source": [ 218 | "### Agent with Static Prompt" 219 | ], 220 | "metadata": { 221 | "id": "5xfsoAoAgdOU" 222 | } 223 | }, 224 | { 225 | "cell_type": "code", 226 | "source": [ 227 | "agent = Agent(\"openai:gpt-4o\", system_prompt=\"You can only speak Chinese\")\n", 228 | "response = agent.run_sync(\"Hey, dude!\")\n", 229 | "print(response.data)" 230 | ], 231 | "metadata": { 232 | "colab": { 233 | "base_uri": "https://localhost:8080/" 234 | }, 235 | "id": "cNuOO7a69Z_U", 236 | "outputId": "d9fb84ad-96df-44d0-ee35-a878352142f8" 237 | }, 238 | "execution_count": 7, 239 | "outputs": [ 240 | { 241 | "output_type": "stream", 242 | "name": "stdout", 243 | "text": [ 244 | "你好!有什么我可以帮助你的吗?\n" 245 | ] 246 | } 247 | ] 248 | }, 249 | { 250 | "cell_type": "markdown", 251 | "source": [ 252 | "### Agent with Dynamic Prompt" 253 | ], 254 | "metadata": { 255 | "id": "IfMRzGDngpPV" 256 | } 257 | }, 258 | { 259 | "cell_type": "code", 260 | "source": [ 261 | "from pydantic_ai import Agent, RunContext" 262 | ], 263 | "metadata": { 264 | "id": "DXr9G0SC96Em" 265 | }, 266 | "execution_count": 9, 267 | "outputs": [] 268 | }, 269 | { 270 | "cell_type": "code", 271 | "source": [ 272 | "dynamic_prompt_agent = Agent(\"openai:gpt-4o\")\n", 273 | "\n", 274 | "@dynamic_prompt_agent.system_prompt\n", 275 | "def set_agent_name(ctx: RunContext[str]) -> str:\n", 276 | " return f\"Your name is {ctx.deps}.\"\n", 277 | "\n", 278 | "response = dynamic_prompt_agent.run_sync(\"Hey, dude! Who are you?\", deps=\"Jarvis\")\n", 279 | "print(response.data)" 280 | ], 281 | "metadata": { 282 | "colab": { 283 | "base_uri": "https://localhost:8080/" 284 | }, 285 | "id": "P2XHmZke9spd", 286 | "outputId": "5bc52c96-1ef5-4ea4-c298-72f221338b09" 287 | }, 288 | "execution_count": 10, 289 | "outputs": [ 290 | { 291 | "output_type": "stream", 292 | "name": "stdout", 293 | "text": [ 294 | "Hey there! I'm Jarvis, your AI assistant. How can I help you today?\n" 295 | ] 296 | } 297 | ] 298 | }, 299 | { 300 | "cell_type": "markdown", 301 | "source": [ 302 | "### Agent with Dependency Type" 303 | ], 304 | "metadata": { 305 | "id": "Q2XEz8rlhAs0" 306 | } 307 | }, 308 | { 309 | "cell_type": "code", 310 | "source": [ 311 | "from dataclasses import dataclass\n", 312 | "\n", 313 | "@dataclass\n", 314 | "class Player:\n", 315 | " name: str\n", 316 | " goals: int\n", 317 | "\n", 318 | "\n", 319 | "agent = Agent(\n", 320 | " 'openai:gpt-4o',\n", 321 | " deps_type=Player,\n", 322 | " result_type=bool,\n", 323 | ")\n", 324 | "\n", 325 | "@agent.system_prompt\n", 326 | "def add_player_name(ctx: RunContext[Player]) -> str:\n", 327 | " player_name = ctx.deps.name\n", 328 | " return f\"The player's name is {player_name}.\"\n", 329 | "\n", 330 | "@agent.system_prompt\n", 331 | "def add_player_goals(ctx: RunContext[Player]) -> str:\n", 332 | " goals = ctx.deps.goals\n", 333 | " return f\"The player's goals so far is {goals}.\"\n", 334 | "\n", 335 | "response = agent.run_sync(\"Hey, dude! Does the player ever score a goal?\", deps=Player(name=\"Messi\", goals=2))\n", 336 | "print(response.data)\n", 337 | "\n", 338 | "response = agent.run_sync(\"Hey, dude! 
Does the player ever score a goal?\", deps=Player(name=\"Ronaldo\", goals=0))\n", 339 | "print(response.data)" 340 | ], 341 | "metadata": { 342 | "colab": { 343 | "base_uri": "https://localhost:8080/" 344 | }, 345 | "id": "8M9B-U5IKGV6", 346 | "outputId": "0dd8014a-44b9-42c0-8614-116c8c1450c8" 347 | }, 348 | "execution_count": 11, 349 | "outputs": [ 350 | { 351 | "output_type": "stream", 352 | "name": "stdout", 353 | "text": [ 354 | "True\n", 355 | "False\n" 356 | ] 357 | } 358 | ] 359 | }, 360 | { 361 | "cell_type": "markdown", 362 | "source": [ 363 | "### Agent with Function Tools\n", 364 | "\n", 365 | "Function tools provide a mechanism for models to retrieve extra information to help them generate a response.\n", 366 | "\n", 367 | "Developers use decorators `@agent.tool_plain` or `@agent.tool` to define tools." 368 | ], 369 | "metadata": { 370 | "id": "ITkIu7KKhLdr" 371 | } 372 | }, 373 | { 374 | "cell_type": "code", 375 | "source": [ 376 | "agent = Agent('openai:gpt-4o')\n", 377 | "\n", 378 | "@agent.tool\n", 379 | "def get_player_goals(ctx: RunContext[str], player_name: str) -> str:\n", 380 | " print(f\"Getting the goals of player {player_name} so far\")\n", 381 | " if player_name == 'Messi':\n", 382 | " return '2'\n", 383 | " elif player_name == 'Ronaldo':\n", 384 | " return '100'\n", 385 | " else:\n", 386 | " return '0'\n", 387 | "\n", 388 | "response = agent.run_sync(\"Let me know if Ronaldo scored so far\")\n", 389 | "print(response.data)" 390 | ], 391 | "metadata": { 392 | "colab": { 393 | "base_uri": "https://localhost:8080/" 394 | }, 395 | "id": "3MvdJo3TNZWm", 396 | "outputId": "7d0e8ac0-c2c3-42db-d5af-48ffc23f9a75" 397 | }, 398 | "execution_count": 13, 399 | "outputs": [ 400 | { 401 | "output_type": "stream", 402 | "name": "stdout", 403 | "text": [ 404 | "Getting the goals of player Ronaldo so far\n", 405 | "Ronaldo has scored 100 goals so far.\n" 406 | ] 407 | } 408 | ] 409 | }, 410 | { 411 | "cell_type": "code", 412 | "source": [ 413 | "response.all_messages()" 414 | ], 415 | "metadata": { 416 | "colab": { 417 | "base_uri": "https://localhost:8080/" 418 | }, 419 | "id": "DvRxFW8r7LsG", 420 | "outputId": "4e5ecef7-06b9-4c90-9c79-46eaa4c03477" 421 | }, 422 | "execution_count": 14, 423 | "outputs": [ 424 | { 425 | "output_type": "execute_result", 426 | "data": { 427 | "text/plain": [ 428 | "[UserPrompt(content='Let me know if Ronaldo scored so far', timestamp=datetime.datetime(2024, 12, 13, 21, 28, 29, 504173, tzinfo=datetime.timezone.utc), role='user'),\n", 429 | " ModelStructuredResponse(calls=[ToolCall(tool_name='get_player_goals', args=ArgsJson(args_json='{\"player_name\":\"Ronaldo\"}'), tool_id='call_31XYZmv6dW8mOsEGatfpP7Zi')], timestamp=datetime.datetime(2024, 12, 13, 21, 28, 29, tzinfo=datetime.timezone.utc), role='model-structured-response'),\n", 430 | " ToolReturn(tool_name='get_player_goals', content='100', tool_id='call_31XYZmv6dW8mOsEGatfpP7Zi', timestamp=datetime.datetime(2024, 12, 13, 21, 28, 30, 253595, tzinfo=datetime.timezone.utc), role='tool-return'),\n", 431 | " ModelTextResponse(content='Ronaldo has scored 100 goals so far.', timestamp=datetime.datetime(2024, 12, 13, 21, 28, 30, tzinfo=datetime.timezone.utc), role='model-text-response')]" 432 | ] 433 | }, 434 | "metadata": {}, 435 | "execution_count": 14 436 | } 437 | ] 438 | }, 439 | { 440 | "cell_type": "code", 441 | "source": [ 442 | "response = agent.run_sync(\"Let me know if Saka scored so far\")\n", 443 | "print(response.data)" 444 | ], 445 | "metadata": { 446 | "colab": { 447 | "base_uri": 
"https://localhost:8080/" 448 | }, 449 | "id": "9fGvNFz6PPJv", 450 | "outputId": "461418a0-38ef-4a95-c3a0-de685b696b78" 451 | }, 452 | "execution_count": 15, 453 | "outputs": [ 454 | { 455 | "output_type": "stream", 456 | "name": "stdout", 457 | "text": [ 458 | "Getting the goals of player Saka so far\n", 459 | "Saka has not scored any goals so far.\n" 460 | ] 461 | } 462 | ] 463 | }, 464 | { 465 | "cell_type": "code", 466 | "source": [ 467 | "response.all_messages()" 468 | ], 469 | "metadata": { 470 | "colab": { 471 | "base_uri": "https://localhost:8080/" 472 | }, 473 | "id": "PjURG5FIOqsY", 474 | "outputId": "0b9c26ef-9e97-4293-eb82-43972e65f1e7" 475 | }, 476 | "execution_count": 16, 477 | "outputs": [ 478 | { 479 | "output_type": "execute_result", 480 | "data": { 481 | "text/plain": [ 482 | "[UserPrompt(content='Let me know if Saka scored so far', timestamp=datetime.datetime(2024, 12, 13, 21, 30, 3, 734955, tzinfo=datetime.timezone.utc), role='user'),\n", 483 | " ModelStructuredResponse(calls=[ToolCall(tool_name='get_player_goals', args=ArgsJson(args_json='{\"player_name\":\"Saka\"}'), tool_id='call_gsWKZgcj67OIi2D7F1UfgMMx')], timestamp=datetime.datetime(2024, 12, 13, 21, 30, 3, tzinfo=datetime.timezone.utc), role='model-structured-response'),\n", 484 | " ToolReturn(tool_name='get_player_goals', content='0', tool_id='call_gsWKZgcj67OIi2D7F1UfgMMx', timestamp=datetime.datetime(2024, 12, 13, 21, 30, 4, 370336, tzinfo=datetime.timezone.utc), role='tool-return'),\n", 485 | " ModelTextResponse(content='Saka has not scored any goals so far.', timestamp=datetime.datetime(2024, 12, 13, 21, 30, 4, tzinfo=datetime.timezone.utc), role='model-text-response')]" 486 | ] 487 | }, 488 | "metadata": {}, 489 | "execution_count": 16 490 | } 491 | ] 492 | } 493 | ] 494 | } --------------------------------------------------------------------------------