├── README.md └── BuildingLLMSystems.ipynb /README.md: -------------------------------------------------------------------------------- 1 | # GENAI for STEM Education Workshop 2 | -------------------------------------------------------------------------------- /BuildingLLMSystems.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "metadata": { 3 | "kernelspec": { 4 | "name": "python3", 5 | "display_name": "Python 3 (ipykernel)", 6 | "language": "python" 7 | }, 8 | "language_info": { 9 | "name": "python", 10 | "version": "3.10.16", 11 | "mimetype": "text/x-python", 12 | "codemirror_mode": { 13 | "name": "ipython", 14 | "version": 3 15 | }, 16 | "pygments_lexer": "ipython3", 17 | "nbconvert_exporter": "python", 18 | "file_extension": ".py" 19 | }, 20 | "colab": { 21 | "provenance": [] 22 | } 23 | }, 24 | "nbformat_minor": 0, 25 | "nbformat": 4, 26 | "cells": [ 27 | { 28 | "cell_type": "markdown", 29 | "source": [ 30 | "# Lab 3: Building AI Systems" 31 | ], 32 | "metadata": { 33 | "id": "qIOnMSlQcjSc" 34 | } 35 | }, 36 | { 37 | "cell_type": "markdown", 38 | "source": [ 39 | "## Teaching Students to Build AI Systems (Making LLM calls)" 40 | ], 41 | "metadata": { 42 | "id": "UMsdfLnXhiIU" 43 | } 44 | }, 45 | { 46 | "cell_type": "markdown", 47 | "source": [ 48 | "### Installation" 49 | ], 50 | "metadata": { 51 | "id": "quRua816celk" 52 | } 53 | }, 54 | { 55 | "cell_type": "code", 56 | "source": [ 57 | "!pip install -q smartfunc\n" 58 | ], 59 | "metadata": { 60 | "id": "_tZoii-ocpNc", 61 | "trusted": true 62 | }, 63 | "outputs": [], 64 | "execution_count": 1 65 | }, 66 | { 67 | "cell_type": "markdown", 68 | "source": [ 69 | "### Environment Variables" 70 | ], 71 | "metadata": { 72 | "id": "GQZDoUbBegRq" 73 | } 74 | }, 75 | { 76 | "cell_type": "code", 77 | "source": [ 78 | "import os\n", 79 | "\n", 80 | "# Set environment variable\n", 81 | "os.environ['OPENAI_API_KEY'] = '' # Enter API key here\n" 82 | ], 83 | "metadata": { 
"id": "CbGweciDej28", 85 | "trusted": true 86 | }, 87 | "outputs": [], 88 | "execution_count": 3 89 | }, 90 | { 91 | "cell_type": "markdown", 92 | "source": [ 93 | "### Simple Calls\n", 94 | "\n", 95 | "- **Import statement**: \n", 96 | " `from smartfunc import backend` \n", 97 | " - Imports the `backend` decorator from the `smartfunc` module.\n", 98 | "\n", 99 | "- **Function declaration**: \n", 100 | " `@backend(\"gpt-4o\")` \n", 101 | " - Decorates the `generate_summary` function to indicate that it should be executed using the GPT-4o backend, likely delegating execution to an AI model.\n", 102 | "\n", 103 | "- **Function definition**: \n", 104 | " `def generate_summary(text: str):` \n", 105 | " - Defines a function named `generate_summary` that takes a single string argument `text`.\n", 106 | "\n", 107 | "- **Docstring prompt template**: \n", 108 | " `\"\"\"Generate a summary of the following text: {{ text }}\"\"\"` \n", 109 | " - Provides a prompt template using double curly braces (`{{ text }}`) for inserting the input text dynamically when calling the backend model.\n", 110 | "\n", 111 | "- **Function body**: \n", 112 | " `pass` \n", 113 | " - The function body is empty; the logic is presumably handled by the `@backend` decorator and the docstring prompt.\n", 114 | "\n", 115 | "- **Function call**: \n", 116 | " `generate_summary(\"Django ORM\")` \n", 117 | " - Calls the function with the input `\"Django ORM\"`, triggering the GPT-4o backend to generate a summary of that topic." 118 | ], 119 | "metadata": { 120 | "id": "vk5IS0c5ct-o" 121 | } 122 | }, 123 | { 124 | "cell_type": "code", 125 | "source": [ 126 | "from smartfunc import backend\n", 127 | "from openai import OpenAI\n", 128 | "from IPython.display import Markdown, display\n", 129 | "\n", 130 | "client = OpenAI()\n", 131 | "\n", 132 | "@backend(client, model=\"gpt-4o\")\n", 133 | "def generate_answer(text: str):\n", 134 | " return f\"You are friendly buddy bot. Explain in detail for school student. 
Use lots of emojis and reply in Markdown. Answer this question: { text }\"\n", 135 | "\n", 136 | "Markdown(generate_answer(\"Why is the sky blue?\"))" 137 | ], 138 | "metadata": { 139 | "colab": { 140 | "base_uri": "https://localhost:8080/", 141 | "height": 646 142 | }, 143 | "id": "F68rnibwODqm", 144 | "outputId": "3c010c18-572a-4b5c-da74-e3c151ce0728", 145 | "trusted": true 146 | }, 147 | "outputs": [ 148 | { 149 | "output_type": "execute_result", 150 | "data": { 151 | "text/plain": [ 152 | "" 153 | ], 154 | "text/markdown": "Hello there, young explorer! 🌟 I'm so excited to dive into the colorful mystery of why the sky is blue! Let's explore this cosmic question together! 🛸✨\n\n### The Magic of Sunlight 🌞\n\nFirst, let's start with sunlight. Although it seems white to our eyes, sunlight is actually composed of many different colors. It's like a rainbow all mixed together! 🌈 When all these colors combine, they create white light. Imagine white light as being made up of seven colors: red, orange, yellow, green, blue, indigo, and violet.\n\n### Earth's Atmosphere: A Colorful Playground 🎨\n\nOur planet is wrapped in a layer of air called the atmosphere. It's like a big blanket made of gases, mostly oxygen and nitrogen. This atmosphere is where the magic happens! 🪄\n\n### Scattering: The Key to the Blue Sky 🔹\n\nAs sunlight travels through the atmosphere, it bumps into all the tiny molecules of gas and tiny particles. This causes the light to scatter in all directions. But here's the interesting part: not all colors of light scatter equally!\n\n- **Blue light** has a shorter wavelength, which means it gets scattered much more than other colors. It bounces around in the sky more, which is why we see blue light coming from every direction. That's why the sky looks blue to us! 🌌✨\n\n- **Red and yellow light** have longer wavelengths, so they pass through the atmosphere more directly and scatter less. 
That's why we don't see these colors as much in the daytime sky.\n\n### Sunrise and Sunset: A Colorful Twist 🌅🌇\n\nNow, have you ever noticed how the sky looks orange or red during sunrise and sunset? This happens because the sun is lower in the sky at these times, meaning its light has to travel through more of the Earth's atmosphere to reach us. The blue light scatters away, so we're left with the reds and oranges. It's like nature's very own artwork! 🎨🌄\n\n### In Summary 📚\n\nSo, next time you look up at the sky during the day, remember that you're seeing blue because of:\n\n1. **Sunlight** being made of different colors.\n2. **The atmosphere** acting like a giant scattering screen.\n3. **Blue light** being scattered more than other colors due to its short wavelength.\n\nI hope you enjoyed this colorful journey through the sky! If you've got more questions, you know where to find me! 😊☁️🌈\n\n---\n\nKeep wondering and exploring! 🔭✈️" 155 | }, 156 | "metadata": {}, 157 | "execution_count": 4 158 | } 159 | ], 160 | "execution_count": 4 161 | }, 162 | { 163 | "cell_type": "code", 164 | "source": [ 165 | "from smartfunc import backend\n", 166 | "from openai import OpenAI\n", 167 | "from IPython.display import display, HTML\n", 168 | "\n", 169 | "client = OpenAI()\n", 170 | "\n", 171 | "@backend(client, model=\"gpt-4o\")\n", 172 | "def draw_ascii(text: str):\n", 173 | " return f\"You draw amazing ASCII diagrams. Draw this in ASCII: { text }. 
Only return the ASCII.\"\n", 174 | "\n", 175 | "ascii_pic = draw_ascii(\"Side view of a Car\")\n", 176 | "print(ascii_pic)" 177 | ], 178 | "metadata": { 179 | "id": "kFx00Ar1Pq4v", 180 | "colab": { 181 | "base_uri": "https://localhost:8080/" 182 | }, 183 | "outputId": "819a3ee5-2fa4-48ee-9d60-5d1542bee9b1", 184 | "trusted": true 185 | }, 186 | "outputs": [ 187 | { 188 | "output_type": "stream", 189 | "name": "stdout", 190 | "text": [ 191 | "```\n", 192 | " ______\n", 193 | " // ||\\ \\\n", 194 | " ___//___||_\\ \\___\n", 195 | " ) _ _ \\\n", 196 | " |_/ \\________/ \\___|\n", 197 | "___\\_/________\\_/____\n", 198 | "```\n" 199 | ] 200 | } 201 | ], 202 | "execution_count": 5 203 | }, 204 | { 205 | "cell_type": "code", 206 | "source": [ 207 | "from smartfunc import backend\n", 208 | "from openai import OpenAI\n", 209 | "\n", 210 | "@backend(client, model=\"gpt-4o\")\n", 211 | "def generate_summary(text: str):\n", 212 | " return f\"Generate a summary of the following text: { text }. Reply using Markdown.\"\n", 213 | "\n", 214 | "\n", 215 | "Markdown(generate_summary(\"Photosynthesis\"))" 216 | ], 217 | "metadata": { 218 | "colab": { 219 | "base_uri": "https://localhost:8080/", 220 | "height": 296 221 | }, 222 | "id": "ZYtl38RmdEbj", 223 | "outputId": "296892f9-3500-4491-be43-e3d9bdf4998b", 224 | "trusted": true 225 | }, 226 | "outputs": [ 227 | { 228 | "output_type": "execute_result", 229 | "data": { 230 | "text/plain": [ 231 | "" 232 | ], 233 | "text/markdown": "# Summary of Photosynthesis\n\nPhotosynthesis is a biochemical process by which green plants, algae, and some bacteria convert light energy into chemical energy. During this process, these organisms harness sunlight and use it to transform carbon dioxide and water into glucose and oxygen. 
The primary pigment involved in capturing light energy is chlorophyll, which gives plants their green color.\n\nThe process of photosynthesis can be divided into two main stages: the light-dependent reactions and the light-independent reactions, also known as the Calvin cycle. \n\n1. **Light-dependent Reactions**: These occur in the thylakoid membranes of the chloroplasts where sunlight is absorbed by chlorophyll. The energy captured is used to split water molecules, releasing oxygen and producing energy-rich compounds ATP and NADPH.\n\n2. **Calvin Cycle (Light-independent Reactions)**: These take place in the stroma of chloroplasts. ATP and NADPH generated from the light-dependent reactions are utilized to convert carbon dioxide into glucose through a series of chemical reactions.\n\nPhotosynthesis is fundamental for life on Earth as it provides the organic compounds and oxygen necessary for most living organisms to survive." 234 | }, 235 | "metadata": {}, 236 | "execution_count": 6 237 | } 238 | ], 239 | "execution_count": 6 240 | }, 241 | { 242 | "cell_type": "code", 243 | "source": [ 244 | "from smartfunc import backend\n", 245 | "from openai import OpenAI\n", 246 | "\n", 247 | "@backend(client, model=\"gpt-4o\")\n", 248 | "def generate_joke(text: str):\n", 249 | " return f\"Generate a funny joke about: { text }\"\n", 250 | "\n", 251 | "generate_joke(\"Calculus\")" 252 | ], 253 | "metadata": { 254 | "colab": { 255 | "base_uri": "https://localhost:8080/", 256 | "height": 35 257 | }, 258 | "id": "oeEfksfAf3JY", 259 | "outputId": "0a3ba5a4-443b-4d78-9884-8785b0764afb", 260 | "trusted": true 261 | }, 262 | "outputs": [ 263 | { 264 | "output_type": "execute_result", 265 | "data": { 266 | "text/plain": [ 267 | "\"Why did the calculus teacher break up with the geometry teacher?\\n\\nBecause she couldn't handle his derivation from the norm!\"" 268 | ], 269 | "application/vnd.google.colaboratory.intrinsic+json": { 270 | "type": "string" 271 | } 272 | }, 273 | 
"metadata": {}, 274 | "execution_count": 7 275 | } 276 | ], 277 | "execution_count": 7 278 | }, 279 | { 280 | "cell_type": "markdown", 281 | "source": [ 282 | "## Making a UI with Gradio" 283 | ], 284 | "metadata": { 285 | "id": "W2Z4TtRUyOQz" 286 | } 287 | }, 288 | { 289 | "cell_type": "code", 290 | "source": [ 291 | "!pip install -q gradio smartfunc" 292 | ], 293 | "metadata": { 294 | "id": "LgerLxsjyUIu" 295 | }, 296 | "execution_count": 14, 297 | "outputs": [] 298 | }, 299 | { 300 | "cell_type": "code", 301 | "source": [ 302 | "# Make a UI to display Jokes\n", 303 | "\n", 304 | "import os\n", 305 | "\n", 306 | "# Set environment variable\n", 307 | "os.environ['OPENAI_API_KEY'] = '' #Enter API Key Here\n", 308 | "\n", 309 | "from smartfunc import backend\n", 310 | "from openai import OpenAI\n", 311 | "\n", 312 | "@backend(client, model=\"gpt-4o\")\n", 313 | "def generate_joke(text: str):\n", 314 | " return f\"Generate a funny joke about: { text }\"\n", 315 | "\n", 316 | "import gradio as gr\n", 317 | "\n", 318 | "demo = gr.Interface(\n", 319 | " fn=generate_joke,\n", 320 | " examples=[[\"Dogs\"], [\"Sport\"], [\"Planes\"], [\"Cars\"]],\n", 321 | " inputs=gr.Textbox(\n", 322 | " label=\"Your Topic:\",\n", 323 | " placeholder=\"Type your topic…\"\n", 324 | " ),\n", 325 | " outputs=gr.Textbox(\n", 326 | " label=\"Joke:\",\n", 327 | " lines=3,\n", 328 | " max_lines=5\n", 329 | " ),\n", 330 | " title=\"Jolly Joking App\",\n", 331 | " description=\"Enter your topic for a joke and press the Submit button 😊\"\n", 332 | ")\n", 333 | "\n", 334 | "\n", 335 | "if __name__ == \"__main__\":\n", 336 | " demo.launch()" 337 | ], 338 | "metadata": { 339 | "id": "OBEtFRAJyrcl", 340 | "outputId": "327fa8e8-4676-4f9d-cb49-4faed1fdfad3", 341 | "colab": { 342 | "base_uri": "https://localhost:8080/", 343 | "height": 648 344 | } 345 | }, 346 | "execution_count": 16, 347 | "outputs": [ 348 | { 349 | "output_type": "stream", 350 | "name": "stdout", 351 | "text": [ 352 | "It looks like you 
are running Gradio on a hosted Jupyter notebook, which requires `share=True`. Automatically setting `share=True` (you can turn this off by setting `share=False` in `launch()` explicitly).\n", 353 | "\n", 354 | "Colab notebook detected. To show errors in colab notebook, set debug=True in launch()\n", 355 | "* Running on public URL: https://72cb514e77f5aa91c5.gradio.live\n", 356 | "\n", 357 | "This share link expires in 1 week. For free permanent hosting and GPU upgrades, run `gradio deploy` from the terminal in the working directory to deploy to Hugging Face Spaces (https://huggingface.co/spaces)\n" 358 | ] 359 | }, 360 | { 361 | "output_type": "display_data", 362 | "data": { 363 | "text/plain": [ 364 | "" 365 | ], 366 | "text/html": [ 367 | "
" 368 | ] 369 | }, 370 | "metadata": {} 371 | } 372 | ] 373 | }, 374 | { 375 | "cell_type": "markdown", 376 | "source": [ 377 | "## Teaching Students to Build A Chatbot" 378 | ], 379 | "metadata": { 380 | "id": "13sMP5Xrik5x" 381 | } 382 | }, 383 | { 384 | "cell_type": "markdown", 385 | "source": [ 386 | "### Simple Chatbot with a Gradio UI" 387 | ], 388 | "metadata": { 389 | "id": "pkHpHyz0i1AW" 390 | } 391 | }, 392 | { 393 | "cell_type": "code", 394 | "source": [ 395 | "from IPython.display import Markdown, display" 396 | ], 397 | "metadata": { 398 | "id": "E3O9L8cWipqR", 399 | "trusted": true 400 | }, 401 | "outputs": [], 402 | "execution_count": 9 403 | }, 404 | { 405 | "cell_type": "code", 406 | "source": [ 407 | "import os\n", 408 | "\n", 409 | "# Set environment variable\n", 410 | "os.environ['OPENAI_API_KEY'] = '' #Enter API Key Here" 411 | ], 412 | "metadata": { 413 | "id": "TXaIyQqtiqct", 414 | "trusted": true 415 | }, 416 | "outputs": [], 417 | "execution_count": 10 418 | }, 419 | { 420 | "cell_type": "code", 421 | "source": [ 422 | "!pip install -q --upgrade langchain langchain-openai langchain-core gradio pydantic\n", 423 | "\n", 424 | "# After you run this please Restart the Kernel." 
425 | ], 426 | "metadata": { 427 | "colab": { 428 | "base_uri": "https://localhost:8080/" 429 | }, 430 | "id": "bgHRYrTfi3VZ", 431 | "outputId": "3cf16dad-f0af-4ff9-8b10-a8a77c6063e8", 432 | "trusted": true 433 | }, 434 | "outputs": [ 435 | { 436 | "output_type": "stream", 437 | "name": "stdout", 438 | "text": [ 439 | "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m90.6/90.6 kB\u001b[0m \u001b[31m6.8 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", 440 | "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m89.9/89.9 kB\u001b[0m \u001b[31m5.1 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", 441 | "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m102.8/102.8 kB\u001b[0m \u001b[31m6.8 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", 442 | "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m476.0/476.0 kB\u001b[0m \u001b[31m31.9 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", 443 | "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m23.0/23.0 MB\u001b[0m \u001b[31m25.8 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", 444 | "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m55.4/55.4 kB\u001b[0m \u001b[31m3.8 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", 445 | "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m463.4/463.4 kB\u001b[0m \u001b[31m29.2 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", 446 | "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m2.1/2.1 MB\u001b[0m \u001b[31m75.0 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", 447 | "\u001b[?25h" 448 | ] 449 | } 450 | ], 451 | "execution_count": 11 452 | }, 453 | { 454 | "cell_type": "code", 455 | "source": [ 456 | "import gradio as gr\n", 457 | "from langchain_openai import ChatOpenAI\n", 458 | "from langchain_core.messages import AIMessage, HumanMessage, 
SystemMessage\n", 459 | "\n", 460 | "llm = ChatOpenAI(temperature=0.8, model='gpt-4o')\n", 461 | "\n", 462 | "# Define the System Prompt\n", 463 | "system_prompt_text = \"You are an HTML tutor. You only answer questions about HTML Web programming and nothing else.\"\n", 464 | "\n", 465 | "def chat(message, history):\n", 466 | " # Initialize chat history with the System Message\n", 467 | " chat_history = [SystemMessage(content=system_prompt_text)]\n", 468 | "\n", 469 | " # Append the conversation history\n", 470 | " for human, ai in history:\n", 471 | " chat_history.append(HumanMessage(content=human))\n", 472 | " chat_history.append(AIMessage(content=ai))\n", 473 | "\n", 474 | " # Append the latest user message\n", 475 | " chat_history.append(HumanMessage(content=message))\n", 476 | "\n", 477 | " response = llm.invoke(chat_history)\n", 478 | "\n", 479 | " return response.content\n", 480 | "\n", 481 | "# Launch Gradio Chat UI\n", 482 | "gr.ChatInterface(chat).launch(server_name=\"0.0.0.0\", share=True, debug=True)" 483 | ], 484 | "metadata": { 485 | "colab": { 486 | "base_uri": "https://localhost:8080/", 487 | "height": 648 488 | }, 489 | "id": "faJwmzn7i8oJ", 490 | "outputId": "8a928b8d-dee1-49ea-d722-06d34e5e1586", 491 | "trusted": true 492 | }, 493 | "outputs": [ 494 | { 495 | "output_type": "stream", 496 | "name": "stdout", 497 | "text": [ 498 | "Colab notebook detected. This cell will run indefinitely so that you can see errors and logs. To turn off, set debug=False in launch().\n", 499 | "* Running on public URL: https://3f10e588a68bddf308.gradio.live\n", 500 | "\n", 501 | "This share link expires in 1 week. For free permanent hosting and GPU upgrades, run `gradio deploy` from the terminal in the working directory to deploy to Hugging Face Spaces (https://huggingface.co/spaces)\n" 502 | ] 503 | }, 504 | { 505 | "output_type": "display_data", 506 | "data": { 507 | "text/plain": [ 508 | "" 509 | ], 510 | "text/html": [ 511 | "
" 512 | ] 513 | }, 514 | "metadata": {} 515 | }, 516 | { 517 | "output_type": "stream", 518 | "name": "stdout", 519 | "text": [ 520 | "Keyboard interruption in main thread... closing server.\n", 521 | "Killing tunnel 0.0.0.0:7860 <> https://3f10e588a68bddf308.gradio.live\n" 522 | ] 523 | }, 524 | { 525 | "output_type": "execute_result", 526 | "data": { 527 | "text/plain": [] 528 | }, 529 | "metadata": {}, 530 | "execution_count": 12 531 | } 532 | ], 533 | "execution_count": 12 534 | }, 535 | { 536 | "cell_type": "code", 537 | "source": [], 538 | "metadata": { 539 | "trusted": true, 540 | "id": "xTYzPUnHxI9b" 541 | }, 542 | "outputs": [], 543 | "execution_count": null 544 | } 545 | ] 546 | } --------------------------------------------------------------------------------