├── .devcontainer ├── Dockerfile ├── devcontainer.json ├── jbang.py ├── noop.txt └── setup.sh ├── .gitignore ├── CODE_OF_CONDUCT.md ├── LICENSE ├── README.md ├── README.zh-cn.md ├── SECURITY.md ├── SUPPORT.md ├── docs ├── cn │ ├── 00.IntroduceLLM.md │ ├── 01.UsingAzureOpenAIServiceWithSDK.md │ ├── 02.IntroduceSemanticKernel.md │ ├── 03.Plugins.md │ ├── 04.Planner.md │ └── 05.Embeddings.md └── en │ ├── 00.IntroduceLLM.md │ ├── 01.UsingAzureOpenAIServiceWithSDK.md │ ├── 02.IntroduceSemanticKernel.md │ ├── 03.Plugins.md │ ├── 04.Planner.md │ └── 05.Embeddings.md ├── imgs ├── 00 │ ├── aoaistudio.png │ ├── aoaistudio_huggingface.png │ └── msandopenai.png ├── 001 │ ├── logo.png │ └── skpatternlarge.png ├── 002 │ └── cross-platform-plugins.png ├── 003 │ └── the-planner.png ├── 01 │ ├── aoaicreate.png │ ├── aoaidone.png │ ├── aoaigo.png │ ├── aoaikey.png │ ├── aoaimodel.png │ ├── aoaimodeldeploy.png │ ├── aoaimodels.png │ └── aoairesource.png ├── 02 │ └── kernel.png ├── 03 │ └── introplugins.png ├── 04 │ └── the-planner.png ├── 05 │ └── GPT.png └── cover.png ├── notebooks ├── dotNET │ ├── 01 │ │ └── dotNETSDKAOAIDemo.ipynb │ ├── 02 │ │ └── LearnSK.ipynb │ ├── 03 │ │ ├── FunctionCallWithSK.ipynb │ │ └── PluginWithSK.ipynb │ ├── 04 │ │ └── PlannerWithSK.ipynb │ └── 05 │ │ └── EmbeddingsWithSK.ipynb ├── java │ ├── 01 │ │ └── JavaSDKAOAIDemo.ipynb │ ├── 02 │ │ └── LearnSK.ipynb │ ├── 03 │ │ └── PluginWithSK.ipynb │ ├── 04 │ │ └── PlannerWithSK.ipynb │ └── 05 │ │ └── EmbeddingsWithSK.ipynb └── python │ ├── 01 │ ├── .env │ ├── PythonSDKAOAIDemo.ipynb │ └── images │ │ └── generated_image.png │ ├── 02 │ ├── .env │ └── LearnSK.ipynb │ ├── 03 │ ├── .env │ ├── APIPlugin │ │ └── CustomPlugin.py │ ├── FunctionCallWithSK.ipynb │ └── PluginWithSK.ipynb │ ├── 04 │ ├── .env │ ├── APIPlugin │ │ └── CustomPlugin.py │ └── PlannerWithSK.ipynb │ └── 05 │ ├── .env │ └── EmbeddingsWithSK.ipynb ├── pdf ├── cn │ └── SemanticKernelCookBook.cn.pdf └── en │ └── SemanticKernelCookBook.en.pdf ├── 
plugins ├── APIPlugin │ └── CustomPlugin.py ├── CustomPlugin │ └── CompanySearchPlugin.cs ├── EmailPlugin │ └── WeatherMail │ │ ├── config.json │ │ └── skprompt.txt ├── TranslatePlugin │ ├── Basic │ │ ├── config.json │ │ └── skprompt.txt │ └── MultiLanguage │ │ ├── config.json │ │ └── skprompt.txt └── WriterPlugin │ ├── Classification │ ├── config.json │ └── skprompt.txt │ └── Tips │ ├── config.json │ └── skprompt.txt └── workshop ├── dotNET ├── workshop1 │ ├── data │ │ ├── .env │ │ ├── notebook.ipynb │ │ ├── notes │ │ │ ├── 01-introduction.md │ │ │ ├── 02-history-of-ml.md │ │ │ └── 03-techniques-ml.md │ │ └── transcripts │ │ │ ├── 01-introduction.txt │ │ │ ├── 02-history-of-ml.txt │ │ │ └── 03-techniques-ml.txt │ ├── notebook.ipynb │ └── plugins │ │ ├── AnswerPlugin │ │ └── Summary │ │ │ ├── config.json │ │ │ └── skprompt.txt │ │ └── FilePlugin │ │ ├── Notes │ │ ├── config.json │ │ └── skprompt.txt │ │ └── Transcrips │ │ ├── config.json │ │ └── skprompt.txt ├── workshop2 │ ├── docs │ │ └── session1.pdf │ ├── genppt_notebook.ipynb │ └── plugins │ │ └── ToolsPlugin │ │ └── PPT │ │ ├── config.json │ │ └── skprompt.txt └── workshop3 │ └── dotNETAgent │ ├── SKAIAgent.ipynb │ ├── Utils │ └── Settings.cs │ └── plugins │ ├── CodePlugin │ └── dotNET │ │ ├── config.json │ │ └── skprompt.txt │ ├── CustomPlugin │ ├── GenCLIPlugin.cs │ └── RunPlugin.cs │ └── QAPlugin │ └── qa │ ├── config.json │ └── skprompt.txt ├── java └── workshop1 │ ├── data │ ├── notes │ │ ├── 01-introduction.md │ │ ├── 02-history-of-ml.md │ │ └── 03-techniques-ml.md │ └── transcripts │ │ ├── 01-introduction.txt │ │ ├── 02-history-of-ml.txt │ │ └── 03-techniques-ml.txt │ ├── notebook.ipynb │ └── plugins │ ├── AnswerPlugin │ └── Summary │ │ ├── config.json │ │ └── skprompt.txt │ └── FilePlugin │ ├── Notes │ ├── config.json │ └── skprompt.txt │ └── Transcrips │ ├── config.json │ └── skprompt.txt └── python ├── workshop1 ├── .env ├── data │ ├── notes │ │ ├── 01-introduction.md │ │ ├── 02-history-of-ml.md │ │ 
└── 03-techniques-ml.md │ └── transcripts │ │ ├── 01-introduction.txt │ │ ├── 02-history-of-ml.txt │ │ └── 03-techniques-ml.txt ├── notebook.ipynb └── plugins │ ├── AnswerPlugin │ └── Summary │ │ ├── config.json │ │ └── skprompt.txt │ └── FilePlugin │ ├── Notes │ ├── config.json │ └── skprompt.txt │ └── Transcrips │ ├── config.json │ └── skprompt.txt └── workshop2 ├── .env ├── docs └── session1.pdf ├── genppt_notebook.ipynb ├── myfile.md ├── myfile.pptx └── plugins └── ToolsPlugin └── PPT ├── config.json └── skprompt.txt /.devcontainer/Dockerfile: -------------------------------------------------------------------------------- 1 | FROM mcr.microsoft.com/devcontainers/miniconda:0-3 2 | 3 | 4 | ENV JAVA_HOME=/usr/local/sdkman/candidates/java/21.0.2-tem/ 5 | ENV PATH=$PATH:$JAVA_HOME/bin 6 | ENV PATH=$PATH:/usr/share/dotnet 7 | 8 | # Copy environment.yml (if found) to a temp location so we update the environment. Also 9 | # copy "noop.txt" so the COPY instruction does not fail if no environment.yml exists. 10 | COPY environment.yml* .devcontainer/noop.txt /tmp/conda-tmp/ 11 | RUN if [ -f "/tmp/conda-tmp/environment.yml" ]; then umask 0002 && /opt/conda/bin/conda env update -n base -f /tmp/conda-tmp/environment.yml; fi \ 12 | && rm -rf /tmp/conda-tmp 13 | 14 | # Install jbang, used by jbang.py to set up the Java Jupyter kernel 15 | RUN pip install jbang 16 | 17 | CMD [ "python", "./jbang.py"] 18 | 19 | # RUN wget https://github.com/padreati/rapaio-jupyter-kernel/releases/download/1.3.0/rapaio-jupyter-kernel-1.3.0.jar \ 20 | # && java -jar ./rapaio-jupyter-kernel-1.3.0.jar -i -auto 21 | 22 | # [Optional] Uncomment this section to install additional OS packages.
23 | # RUN apt-get update && export DEBIAN_FRONTEND=noninteractive \ 24 | # && apt-get -y install --no-install-recommends 25 | -------------------------------------------------------------------------------- /.devcontainer/devcontainer.json: -------------------------------------------------------------------------------- 1 | // For format details, see https://aka.ms/devcontainer.json. For config options, see the 2 | // README at: https://github.com/devcontainers/templates/tree/main/src/miniconda 3 | { 4 | "name": "Miniconda (Python 3)", 5 | "build": { 6 | "context": "..", 7 | "dockerfile": "Dockerfile" 8 | }, 9 | "postCreateCommand": "bash ./.devcontainer/setup.sh", 10 | "features": { 11 | "ghcr.io/devcontainers/features/dotnet:2": {} , 12 | "ghcr.io/rocker-org/devcontainer-features/miniforge:1": {}, 13 | "ghcr.io/devcontainers/features/docker-in-docker:2": {}, 14 | "ghcr.io/devcontainers/features/java:1": { 15 | "version": "21", 16 | "installMaven": "true", 17 | "installGradle": "true" 18 | }, 19 | "ghcr.io/devcontainers-contrib/features/ant-sdkman:2": {} 20 | }, 21 | "customizations": { 22 | "vscode": { 23 | "extensions": [ 24 | "ms-python.python", 25 | "ms-dotnettools.csdevkit", 26 | "ms-dotnettools.vscode-dotnet-pack", 27 | "ms-azuretools.vscode-docker", 28 | "vscjava.vscode-java-pack" 29 | ] 30 | } 31 | }, 32 | "remoteEnv": { 33 | "JAVA_HOME": "/usr/local/sdkman/candidates/java/21.0.2-tem/", 34 | "PATH": "${containerEnv:PATH}:/usr/share/dotnet" 35 | } 36 | 37 | // Features to add to the dev container. More info: https://containers.dev/features. 38 | // "features": {}, 39 | 40 | // Use 'forwardPorts' to make a list of ports inside the container available locally. 41 | // "forwardPorts": [], 42 | 43 | // Use 'postCreateCommand' to run commands after the container is created. 44 | // "postCreateCommand": "python --version", 45 | 46 | // Configure tool-specific properties. 47 | // "customizations": {}, 48 | 49 | // Uncomment to connect as root instead. 
More info: https://aka.ms/dev-containers-non-root. 50 | // "remoteUser": "root" 51 | } 52 | -------------------------------------------------------------------------------- /.devcontainer/jbang.py: -------------------------------------------------------------------------------- 1 | import jbang 2 | jbang.exec("trust add https://github.com/jupyter-java") 3 | jbang.exec("install-kernel@jupyter-java") -------------------------------------------------------------------------------- /.devcontainer/noop.txt: -------------------------------------------------------------------------------- 1 | This file is copied into the container along with environment.yml* from the 2 | parent folder. This is done to prevent the Dockerfile COPY instruction from 3 | failing if no environment.yml is found. -------------------------------------------------------------------------------- /.devcontainer/setup.sh: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | 3 | ENV_NAME="pydev" 4 | 5 | PYTHON_VERSION="3.10.12" 6 | 7 | # Check if the environment already exists 8 | if (conda env list | grep $ENV_NAME); then 9 | echo "$ENV_NAME environment already exists" 10 | else 11 | echo "Creating $ENV_NAME environment" 12 | conda create -n $ENV_NAME python=$PYTHON_VERSION -y 13 | fi 14 | 15 | source /opt/conda/etc/profile.d/conda.sh 16 | conda init 17 | conda activate $ENV_NAME 18 | 19 | wget https://github.com/padreati/rapaio-jupyter-kernel/releases/download/1.3.0/rapaio-jupyter-kernel-1.3.0.jar 20 | java -jar ./rapaio-jupyter-kernel-1.3.0.jar -i -auto 21 | rm -r ./rapaio-jupyter-kernel-1.3.0.jar 22 | 23 | 24 | -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | ## Ignore Visual Studio temporary files, build results, and 2 | ## files generated by popular Visual Studio add-ons. 
3 | ## 4 | ## Get latest from https://github.com/github/gitignore/blob/main/VisualStudio.gitignore 5 | 6 | # User-specific files 7 | *.rsuser 8 | *.suo 9 | *.user 10 | *.userosscache 11 | *.sln.docstates 12 | 13 | # User-specific files (MonoDevelop/Xamarin Studio) 14 | *.userprefs 15 | 16 | # Mono auto generated files 17 | mono_crash.* 18 | 19 | # Build results 20 | [Dd]ebug/ 21 | [Dd]ebugPublic/ 22 | [Rr]elease/ 23 | [Rr]eleases/ 24 | x64/ 25 | x86/ 26 | [Ww][Ii][Nn]32/ 27 | [Aa][Rr][Mm]/ 28 | [Aa][Rr][Mm]64/ 29 | bld/ 30 | [Bb]in/ 31 | [Oo]bj/ 32 | [Ll]og/ 33 | [Ll]ogs/ 34 | 35 | # Visual Studio 2015/2017 cache/options directory 36 | .vs/ 37 | # Uncomment if you have tasks that create the project's static files in wwwroot 38 | #wwwroot/ 39 | 40 | # Visual Studio 2017 auto generated files 41 | Generated\ Files/ 42 | 43 | # MSTest test Results 44 | [Tt]est[Rr]esult*/ 45 | [Bb]uild[Ll]og.* 46 | 47 | # NUnit 48 | *.VisualState.xml 49 | TestResult.xml 50 | nunit-*.xml 51 | 52 | # Build Results of an ATL Project 53 | [Dd]ebugPS/ 54 | [Rr]eleasePS/ 55 | dlldata.c 56 | 57 | # Benchmark Results 58 | BenchmarkDotNet.Artifacts/ 59 | 60 | # .NET Core 61 | project.lock.json 62 | project.fragment.lock.json 63 | artifacts/ 64 | 65 | # ASP.NET Scaffolding 66 | ScaffoldingReadMe.txt 67 | 68 | # StyleCop 69 | StyleCopReport.xml 70 | 71 | # Files built by Visual Studio 72 | *_i.c 73 | *_p.c 74 | *_h.h 75 | *.ilk 76 | *.meta 77 | *.obj 78 | *.iobj 79 | *.pch 80 | *.pdb 81 | *.ipdb 82 | *.pgc 83 | *.pgd 84 | *.rsp 85 | *.sbr 86 | *.tlb 87 | *.tli 88 | *.tlh 89 | *.tmp 90 | *.tmp_proj 91 | *_wpftmp.csproj 92 | *.log 93 | *.tlog 94 | *.vspscc 95 | *.vssscc 96 | .builds 97 | *.pidb 98 | *.svclog 99 | *.scc 100 | 101 | # Chutzpah Test files 102 | _Chutzpah* 103 | 104 | # Visual C++ cache files 105 | ipch/ 106 | *.aps 107 | *.ncb 108 | *.opendb 109 | *.opensdf 110 | *.sdf 111 | *.cachefile 112 | *.VC.db 113 | *.VC.VC.opendb 114 | 115 | # Visual Studio profiler 116 | *.psess 117 | *.vsp 
118 | *.vspx 119 | *.sap 120 | 121 | # Visual Studio Trace Files 122 | *.e2e 123 | 124 | # TFS 2012 Local Workspace 125 | $tf/ 126 | 127 | # Guidance Automation Toolkit 128 | *.gpState 129 | 130 | # ReSharper is a .NET coding add-in 131 | _ReSharper*/ 132 | *.[Rr]e[Ss]harper 133 | *.DotSettings.user 134 | 135 | # TeamCity is a build add-in 136 | _TeamCity* 137 | 138 | # DotCover is a Code Coverage Tool 139 | *.dotCover 140 | 141 | # AxoCover is a Code Coverage Tool 142 | .axoCover/* 143 | !.axoCover/settings.json 144 | 145 | # Coverlet is a free, cross platform Code Coverage Tool 146 | coverage*.json 147 | coverage*.xml 148 | coverage*.info 149 | 150 | # Visual Studio code coverage results 151 | *.coverage 152 | *.coveragexml 153 | 154 | # NCrunch 155 | _NCrunch_* 156 | .*crunch*.local.xml 157 | nCrunchTemp_* 158 | 159 | # MightyMoose 160 | *.mm.* 161 | AutoTest.Net/ 162 | 163 | # Web workbench (sass) 164 | .sass-cache/ 165 | 166 | # Installshield output folder 167 | [Ee]xpress/ 168 | 169 | # DocProject is a documentation generator add-in 170 | DocProject/buildhelp/ 171 | DocProject/Help/*.HxT 172 | DocProject/Help/*.HxC 173 | DocProject/Help/*.hhc 174 | DocProject/Help/*.hhk 175 | DocProject/Help/*.hhp 176 | DocProject/Help/Html2 177 | DocProject/Help/html 178 | 179 | # Click-Once directory 180 | publish/ 181 | 182 | # Publish Web Output 183 | *.[Pp]ublish.xml 184 | *.azurePubxml 185 | # Note: Comment the next line if you want to checkin your web deploy settings, 186 | # but database connection strings (with potential passwords) will be unencrypted 187 | *.pubxml 188 | *.publishproj 189 | 190 | # Microsoft Azure Web App publish settings. 
Comment the next line if you want to 191 | # checkin your Azure Web App publish settings, but sensitive information contained 192 | # in these scripts will be unencrypted 193 | PublishScripts/ 194 | 195 | # NuGet Packages 196 | *.nupkg 197 | # NuGet Symbol Packages 198 | *.snupkg 199 | # The packages folder can be ignored because of Package Restore 200 | **/[Pp]ackages/* 201 | # except build/, which is used as an MSBuild target. 202 | !**/[Pp]ackages/build/ 203 | # Uncomment if necessary however generally it will be regenerated when needed 204 | #!**/[Pp]ackages/repositories.config 205 | # NuGet v3's project.json files produces more ignorable files 206 | *.nuget.props 207 | *.nuget.targets 208 | 209 | # Microsoft Azure Build Output 210 | csx/ 211 | *.build.csdef 212 | 213 | # Microsoft Azure Emulator 214 | ecf/ 215 | rcf/ 216 | 217 | # Windows Store app package directories and files 218 | AppPackages/ 219 | BundleArtifacts/ 220 | Package.StoreAssociation.xml 221 | _pkginfo.txt 222 | *.appx 223 | *.appxbundle 224 | *.appxupload 225 | 226 | # Visual Studio cache files 227 | # files ending in .cache can be ignored 228 | *.[Cc]ache 229 | # but keep track of directories ending in .cache 230 | !?*.[Cc]ache/ 231 | 232 | # Others 233 | ClientBin/ 234 | ~$* 235 | *~ 236 | *.dbmdl 237 | *.dbproj.schemaview 238 | *.jfm 239 | *.pfx 240 | *.publishsettings 241 | orleans.codegen.cs 242 | 243 | # Including strong name files can present a security risk 244 | # (https://github.com/github/gitignore/pull/2483#issue-259490424) 245 | #*.snk 246 | 247 | # Since there are multiple workflows, uncomment next line to ignore bower_components 248 | # (https://github.com/github/gitignore/pull/1529#issuecomment-104372622) 249 | #bower_components/ 250 | 251 | # RIA/Silverlight projects 252 | Generated_Code/ 253 | 254 | # Backup & report files from converting an old project file 255 | # to a newer Visual Studio version. 
Backup files are not needed, 256 | # because we have git ;-) 257 | _UpgradeReport_Files/ 258 | Backup*/ 259 | UpgradeLog*.XML 260 | UpgradeLog*.htm 261 | ServiceFabricBackup/ 262 | *.rptproj.bak 263 | 264 | # SQL Server files 265 | *.mdf 266 | *.ldf 267 | *.ndf 268 | 269 | # Business Intelligence projects 270 | *.rdl.data 271 | *.bim.layout 272 | *.bim_*.settings 273 | *.rptproj.rsuser 274 | *- [Bb]ackup.rdl 275 | *- [Bb]ackup ([0-9]).rdl 276 | *- [Bb]ackup ([0-9][0-9]).rdl 277 | 278 | # Microsoft Fakes 279 | FakesAssemblies/ 280 | 281 | # GhostDoc plugin setting file 282 | *.GhostDoc.xml 283 | 284 | # Node.js Tools for Visual Studio 285 | .ntvs_analysis.dat 286 | node_modules/ 287 | 288 | # Visual Studio 6 build log 289 | *.plg 290 | 291 | # Visual Studio 6 workspace options file 292 | *.opt 293 | 294 | # Visual Studio 6 auto-generated workspace file (contains which files were open etc.) 295 | *.vbw 296 | 297 | # Visual Studio 6 auto-generated project file (contains which files were open etc.) 
298 | *.vbp 299 | 300 | # Visual Studio 6 workspace and project file (working project files containing files to include in project) 301 | *.dsw 302 | *.dsp 303 | 304 | # Visual Studio 6 technical files 305 | *.ncb 306 | *.aps 307 | 308 | # Visual Studio LightSwitch build output 309 | **/*.HTMLClient/GeneratedArtifacts 310 | **/*.DesktopClient/GeneratedArtifacts 311 | **/*.DesktopClient/ModelManifest.xml 312 | **/*.Server/GeneratedArtifacts 313 | **/*.Server/ModelManifest.xml 314 | _Pvt_Extensions 315 | 316 | # Paket dependency manager 317 | .paket/paket.exe 318 | paket-files/ 319 | 320 | # FAKE - F# Make 321 | .fake/ 322 | 323 | # CodeRush personal settings 324 | .cr/personal 325 | 326 | # Python Tools for Visual Studio (PTVS) 327 | __pycache__/ 328 | *.pyc 329 | 330 | # Cake - Uncomment if you are using it 331 | # tools/** 332 | # !tools/packages.config 333 | 334 | # Tabs Studio 335 | *.tss 336 | 337 | # Telerik's JustMock configuration file 338 | *.jmconfig 339 | 340 | # BizTalk build output 341 | *.btp.cs 342 | *.btm.cs 343 | *.odx.cs 344 | *.xsd.cs 345 | 346 | # OpenCover UI analysis results 347 | OpenCover/ 348 | 349 | # Azure Stream Analytics local run output 350 | ASALocalRun/ 351 | 352 | # MSBuild Binary and Structured Log 353 | *.binlog 354 | 355 | # NVidia Nsight GPU debugger configuration file 356 | *.nvuser 357 | 358 | # MFractors (Xamarin productivity tool) working folder 359 | .mfractor/ 360 | 361 | # Local History for Visual Studio 362 | .localhistory/ 363 | 364 | # Visual Studio History (VSHistory) files 365 | .vshistory/ 366 | 367 | # BeatPulse healthcheck temp database 368 | healthchecksdb 369 | 370 | # Backup folder for Package Reference Convert tool in Visual Studio 2017 371 | MigrationBackup/ 372 | 373 | # Ionide (cross platform F# VS Code tools) working folder 374 | .ionide/ 375 | 376 | # Fody - auto-generated XML schema 377 | FodyWeavers.xsd 378 | 379 | # VS Code files for those working on multiple tools 380 | .vscode/* 381 | 
!.vscode/settings.json 382 | !.vscode/tasks.json 383 | !.vscode/launch.json 384 | !.vscode/extensions.json 385 | *.code-workspace 386 | 387 | # Local History for Visual Studio Code 388 | .history/ 389 | 390 | # Windows Installer files from build outputs 391 | *.cab 392 | *.msi 393 | *.msix 394 | *.msm 395 | *.msp 396 | 397 | # JetBrains Rider 398 | *.sln.iml 399 | 400 | .DS_Store 401 | -------------------------------------------------------------------------------- /CODE_OF_CONDUCT.md: -------------------------------------------------------------------------------- 1 | # Microsoft Open Source Code of Conduct 2 | 3 | This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). 4 | 5 | Resources: 6 | 7 | - [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/) 8 | - [Microsoft Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) 9 | - Contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with questions or concerns 10 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) Microsoft Corporation. 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 
14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE 22 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # **Semantic Kernel Cookbook** 2 | 3 | ***Note:*** The content of this book is based on Semantic Kernel dotnet-1.16.2, python-1.3.0 and java-1.2.0 4 | 5 | ![cover](imgs/cover.png) 6 | 7 | With the rise of LLMs, AI has entered a 2.0 era. Compared with previous AI technologies, the barrier to entry is lower and the applicability is broader. AI is no longer limited to the field of data science, and people in many different jobs and roles now take part in building applications around LLMs. For traditional engineering projects and enterprise applications, a framework is the key to entering the LLM field; in particular, companies must think about how to adopt LLMs quickly and at low cost. In 2023, the first year of LLMs, the open source community produced a great many frameworks and solutions for LLM applications. I personally like LangChain, BentoML, prompt flow, autogen and Semantic Kernel. Overall, though, Semantic Kernel is better suited to traditional engineering and multi-language engineering teams, LangChain suits data scientists, and BentoML suits multi-model deployment scenarios.
With the official 1.0.1 release of Semantic Kernel for .NET in December 2023, I hope this manual gives everyone a way to learn and get started. Although Semantic Kernel still has many imperfections, that should not stop anyone from learning and using it. 8 | 9 | This manual centers on the .NET, Java and Python implementations of Semantic Kernel to get everyone started, and combines them with Azure OpenAI Service to guide anyone who needs to master LLM application development. It will be updated as closely as possible alongside Semantic Kernel releases so that everyone can keep up with the latest Semantic Kernel techniques. The following are the chapters of this manual and their corresponding code; please study as needed: 10 | 11 | | Session | Intro |
.NET
Samples
|
Python
Samples
|
Java
Samples
| 12 | |----------|:----------|:-------------:|:-------------:|:-------------:| 13 | | [Getting started with LLM](https://github.com/kinfey/SemanticKernelCookBook/blob/main/docs/en/00.IntroduceLLM.md) | Get to know LLMs, including OpenAI, Azure OpenAI Service and LLMs on Hugging Face | | | | 14 | | [Using Azure OpenAI Service With SDK](https://github.com/kinfey/SemanticKernelCookBook/blob/main/docs/en/01.UsingAzureOpenAIServiceWithSDK.md) | Learn how to use Azure OpenAI Service with an SDK |<br/>
[Click](https://github.com/kinfey/SemanticKernelCookBook/blob/main/notebooks/dotNET/01/dotNETSDKAOAIDemo.ipynb)
|
[Click](https://github.com/kinfey/SemanticKernelCookBook/blob/main/notebooks/python/01/PythonSDKAOAIDemo.ipynb)
|
[Click](https://github.com/kinfey/SemanticKernelCookBook/blob/main/notebooks/java/01/JavaSDKAOAIDemo.ipynb)
| 15 | | [Foundations of Semantic Kernel](https://github.com/kinfey/SemanticKernelCookBook/blob/main/docs/en/02.IntroduceSemanticKernel.md) | What is Semantic Kernel? What are its advantages and disadvantages? Core Semantic Kernel concepts, and more |<br/>
[Click](https://github.com/kinfey/SemanticKernelCookBook/blob/main/notebooks/dotNET/02/LearnSK.ipynb)
|
[Click](https://github.com/kinfey/SemanticKernelCookBook/blob/main/notebooks/python/02/LearnSK.ipynb)
|
[Click](https://github.com/kinfey/SemanticKernelCookBook/blob/main/notebooks/java/02/LearnSK.ipynb)
| 16 | | [The skills of LLM - Plugins](https://github.com/kinfey/SemanticKernelCookBook/blob/main/docs/en/03.Plugins.md) | We know that communicating with an LLM requires prompt engineering. Enterprise applications rely on a great deal of business-oriented prompt engineering, which Semantic Kernel calls Plugins. In this session we will introduce how to use Semantic Kernel Plugins and how to define your own Plugins |<br/>
[Click](https://github.com/kinfey/SemanticKernelCookBook/blob/main/notebooks/dotNET/03/PluginWithSK.ipynb)
|
[Click](https://github.com/kinfey/SemanticKernelCookBook/blob/main/notebooks/python/03/PluginWithSK.ipynb)
|
[Click](https://github.com/kinfey/SemanticKernelCookBook/blob/main/notebooks/java/03/PluginWithSK.ipynb)
| 17 | | [Planner - Let LLM have planning work](https://github.com/kinfey/SemanticKernelCookBook/blob/main/docs/en/04.Planner.md) | Human beings complete a job step by step, and the same goes for LLMs. Semantic Kernel has a very powerful task-planning capability, the Planner. In this session we will explain in detail how to define and use the Planner to make your application more intelligent |<br/>
[Click](https://github.com/kinfey/SemanticKernelCookBook/blob/main/notebooks/dotNET/04/PlannerWithSK.ipynb)
|
[Click](https://github.com/kinfey/SemanticKernelCookBook/blob/main/notebooks/python/04/PlannerWithSK.ipynb)
|
[Click](https://github.com/kinfey/SemanticKernelCookBook/blob/main/notebooks/java/04/PlannerWithSK.ipynb)
| 18 | | [Embedding Skills](https://github.com/kinfey/SemanticKernelCookBook/blob/main/docs/en/05.Embeddings.md) | Building RAG applications is the most common LLM solution at this stage, and Semantic Kernel makes them very convenient to build. This session will show you how to use Semantic Kernel Embeddings |<br/>
[Click](https://github.com/kinfey/SemanticKernelCookBook/blob/main/notebooks/dotNET/05/EmbeddingsWithSK.ipynb)
|
[Click](https://github.com/kinfey/SemanticKernelCookBook/blob/main/notebooks/python/05/EmbeddingsWithSK.ipynb)
|
[Click](https://github.com/kinfey/SemanticKernelCookBook/blob/main/notebooks/java/05/EmbeddingsWithSK.ipynb)
| 19 | | HandsOnLab | Through three hands-on lab projects, everyone can gain a real understanding of how Semantic Kernel is applied |<br/>
[Click](https://github.com/kinfey/SemanticKernelCookBook/tree/main/workshop/dotNET)
|
[Click](https://github.com/kinfey/SemanticKernelCookBook/tree/main/workshop/python)
| [Click](https://github.com/kinfey/SemanticKernelCookBook/tree/main/workshop/java) | 20 | 21 | 22 | ***如果你需要中文,请 [点击这里](https://github.com/kinfey/SemanticKernelCookBook/blob/main/README.zh-cn.md)*** 23 | 24 | 25 | -------------------------------------------------------------------------------- /README.zh-cn.md: -------------------------------------------------------------------------------- 1 | # **Semantic Kernel 入门手册** 2 | 3 | ***注意:*** 本书的内容基于 Semantic Kernel dotnet-1.16.2、python-1.3.0 以及 java-1.2.0 4 | 5 | ![cover](imgs/cover.png) 6 | 7 | 随着大模型兴起,人工智能进入到 2.0 时代,与过往人工智能技术相比门槛降低了,可应用性增强,而且不再局限于数据科学的领域,有更多不同的工种和人群参与到大模型的应用场景中。对于传统工程项目或者企业应用如何进入到大模型的领域当中,框架是必须的。特别是对于传统项目,如何更快、更低成本地接入大模型是企业所必须思考的。在大模型的元年 2023 年,开源社区有非常多基于大模型应用的框架和解决方案,我本人比较喜欢 LangChain、BentoML、prompt flow、autogen 以及 Semantic Kernel。但纵观来说,Semantic Kernel 更适合传统工程以及多语言体系的工程团队使用,LangChain 适合数据科学人员使用,至于 BentoML 则适合多模型部署的场景。在 2023 年 12 月,Semantic Kernel 正式发布基于 .NET 的 1.0.1 版本之际,也希望通过本手册给大家一些学习入门的方法。虽然 Semantic Kernel 还有很多不完善的地方,但这不妨碍大家学习和使用。 8 | 9 | 本手册主要围绕 .NET、Java 以及 Python 三个版本的 Semantic Kernel 实现带大家入门,并结合 Azure OpenAI Service 给需要掌握大模型应用开发的各位进行指导。本手册会尽量跟随 Semantic Kernel 的发行版本同步更新,让大家可以掌握最新的 Semantic Kernel 技巧。以下是本手册对应的章节以及对应代码,请根据需要进行学习: 10 | 11 | 12 | 13 | | 课程名 | 介绍 |<br/>
.NET
示例
|
Python
示例
|
Java
示例
| 14 | |----------|:----------|:-------------:|:-------------:|:-------------:| 15 | | [了解大型语言模型](https://github.com/kinfey/SemanticKernelCookBook/tree/main/docs/cn/00.IntroduceLLM.md) | 认识大模型,包括 OpenAI, Azure OpenAI Service 以及 Hugging Face 上的大模型 | | | | 16 | | [用 SDK 访问 Azure OpenAI Service](https://github.com/kinfey/SemanticKernelCookBook/tree/main/docs/cn/01.UsingAzureOpenAIServiceWithSDK.md) | 使用 SDK 用最熟悉的编程语言访问 Azure OpenAI Service |<br/>
[进入](https://github.com/kinfey/SemanticKernelCookBook/blob/main/notebooks/dotNET/01/dotNETSDKAOAIDemo.ipynb)
|
[进入](https://github.com/kinfey/SemanticKernelCookBook/blob/main/notebooks/python/01/PythonSDKAOAIDemo.ipynb)
|
[进入](https://github.com/kinfey/SemanticKernelCookBook/blob/main/notebooks/java/01/JavaSDKAOAIDemo.ipynb)
| 17 | | [Semantic Kernel 基础](https://github.com/kinfey/SemanticKernelCookBook/tree/main/docs/cn/02.IntroduceSemanticKernel.md) | 什么是 Semantic Kernel ? 它的优点和缺点是什么? Semantic Kernel 相关概念等 |
[进入](https://github.com/kinfey/SemanticKernelCookBook/blob/main/notebooks/dotNET/02/LearnSK.ipynb)
|
[进入](https://github.com/kinfey/SemanticKernelCookBook/blob/main/notebooks/python/02/LearnSK.ipynb)
|
[进入](https://github.com/kinfey/SemanticKernelCookBook/blob/main/notebooks/java/02/LearnSK.ipynb)
| 18 | | [开启大模型的技能之门 - Plugins](https://github.com/kinfey/SemanticKernelCookBook/tree/main/docs/cn/03.Plugins.md) | 我们知道和大模型交流需要使用提示工程。对于企业的应用,都有不少针对业务的提示工程和大模型交流,在 Semantic Kernel 中我们把它称为 Plugins。本节我们会介绍如何使用 Semantic Kernel 的 Plugins 以及如何定义属于自己的 Plugins |<br/>
[进入](https://github.com/kinfey/SemanticKernelCookBook/blob/main/notebooks/dotNET/03/PluginWithSK.ipynb)
|
[进入](https://github.com/kinfey/SemanticKernelCookBook/blob/main/notebooks/python/03/PluginWithSK.ipynb)
|
[进入](https://github.com/kinfey/SemanticKernelCookBook/blob/main/notebooks/java/03/PluginWithSK.ipynb)
| 19 | | [Planner - 让大模型有规划地工作](https://github.com/kinfey/SemanticKernelCookBook/tree/main/docs/cn/04.Planner.md) | 人类完成一个工作需要按步就班,大模型也一样。 Semantic Kernel 有非常强大的计划任务规划能力 - Planner,本章我们会和大家一一细说如何定义和使用 Planner 让您的应用更具智能化 |
[进入](https://github.com/kinfey/SemanticKernelCookBook/blob/main/notebooks/dotNET/04/PlannerWithSK.ipynb)
|
[进入](https://github.com/kinfey/SemanticKernelCookBook/blob/main/notebooks/python/04/PlannerWithSK.ipynb)
|
[进入](https://github.com/kinfey/SemanticKernelCookBook/blob/main/notebooks/java/04/PlannerWithSK.ipynb)
| 20 | | [嵌入式的技巧 - Embeddings](https://github.com/kinfey/SemanticKernelCookBook/tree/main/docs/cn/05.Embeddings.md) | 构建 RAG 应用是现阶段最多人使用的大模型解决方案,通过 Semantic Kernel 可以非常方便地构建 RAG 应用,本章会从最基础部分开始让大家通过 Semantic Kernel 完成 Embeddings 的工作 |
[进入](https://github.com/kinfey/SemanticKernelCookBook/blob/main/notebooks/dotNET/05/EmbeddingsWithSK.ipynb)
|
[进入](https://github.com/kinfey/SemanticKernelCookBook/blob/main/notebooks/python/05/EmbeddingsWithSK.ipynb)
|
[进入](https://github.com/kinfey/SemanticKernelCookBook/blob/main/notebooks/java/05/EmbeddingsWithSK.ipynb)
| 21 | | 项目实战 | 通过三个项目实战,让大家动手真正了解 Semantic Kernel 的应用 |
[进入](https://github.com/kinfey/SemanticKernelCookBook/tree/main/workshop/dotNET)
|
[进入](https://github.com/kinfey/SemanticKernelCookBook/tree/main/workshop/python)
|
[进入](https://github.com/kinfey/SemanticKernelCookBook/tree/main/workshop/java)
| 22 | 23 | 24 | 25 | 26 | 27 | -------------------------------------------------------------------------------- /SECURITY.md: -------------------------------------------------------------------------------- 1 | 2 | 3 | ## Security 4 | 5 | Microsoft takes the security of our software products and services seriously, which includes all source code repositories managed through our GitHub organizations, which include [Microsoft](https://github.com/Microsoft), [Azure](https://github.com/Azure), [DotNet](https://github.com/dotnet), [AspNet](https://github.com/aspnet) and [Xamarin](https://github.com/xamarin). 6 | 7 | If you believe you have found a security vulnerability in any Microsoft-owned repository that meets [Microsoft's definition of a security vulnerability](https://aka.ms/security.md/definition), please report it to us as described below. 8 | 9 | ## Reporting Security Issues 10 | 11 | **Please do not report security vulnerabilities through public GitHub issues.** 12 | 13 | Instead, please report them to the Microsoft Security Response Center (MSRC) at [https://msrc.microsoft.com/create-report](https://aka.ms/security.md/msrc/create-report). 14 | 15 | If you prefer to submit without logging in, send email to [secure@microsoft.com](mailto:secure@microsoft.com). If possible, encrypt your message with our PGP key; please download it from the [Microsoft Security Response Center PGP Key page](https://aka.ms/security.md/msrc/pgp). 16 | 17 | You should receive a response within 24 hours. If for some reason you do not, please follow up via email to ensure we received your original message. Additional information can be found at [microsoft.com/msrc](https://www.microsoft.com/msrc). 18 | 19 | Please include the requested information listed below (as much as you can provide) to help us better understand the nature and scope of the possible issue: 20 | 21 | * Type of issue (e.g. buffer overflow, SQL injection, cross-site scripting, etc.) 
22 | * Full paths of source file(s) related to the manifestation of the issue 23 | * The location of the affected source code (tag/branch/commit or direct URL) 24 | * Any special configuration required to reproduce the issue 25 | * Step-by-step instructions to reproduce the issue 26 | * Proof-of-concept or exploit code (if possible) 27 | * Impact of the issue, including how an attacker might exploit the issue 28 | 29 | This information will help us triage your report more quickly. 30 | 31 | If you are reporting for a bug bounty, more complete reports can contribute to a higher bounty award. Please visit our [Microsoft Bug Bounty Program](https://aka.ms/security.md/msrc/bounty) page for more details about our active programs. 32 | 33 | ## Preferred Languages 34 | 35 | We prefer all communications to be in English. 36 | 37 | ## Policy 38 | 39 | Microsoft follows the principle of [Coordinated Vulnerability Disclosure](https://aka.ms/security.md/cvd). 40 | 41 | 42 | -------------------------------------------------------------------------------- /SUPPORT.md: -------------------------------------------------------------------------------- 1 | # TODO: The maintainer of this repo has not yet edited this file 2 | 3 | **REPO OWNER**: Do you want Customer Service & Support (CSS) support for this product/project? 4 | 5 | - **No CSS support:** Fill out this template with information about how to file issues and get help. 6 | - **Yes CSS support:** Fill out an intake form at [aka.ms/onboardsupport](https://aka.ms/onboardsupport). CSS will work with/help you to determine next steps. 7 | - **Not sure?** Fill out an intake as though the answer were "Yes". CSS will help you decide. 8 | 9 | *Then remove this first heading from this SUPPORT.MD file before publishing your repo.* 10 | 11 | # Support 12 | 13 | ## How to file issues and get help 14 | 15 | This project uses GitHub Issues to track bugs and feature requests. 
Please search the existing 16 | issues before filing new issues to avoid duplicates. For new issues, file your bug or 17 | feature request as a new Issue. 18 | 19 | For help and questions about using this project, please **REPO MAINTAINER: INSERT INSTRUCTIONS HERE 20 | FOR HOW TO ENGAGE REPO OWNERS OR COMMUNITY FOR HELP. COULD BE A STACK OVERFLOW TAG OR OTHER 21 | CHANNEL. WHERE WILL YOU HELP PEOPLE?**. 22 | 23 | ## Microsoft Support Policy 24 | 25 | Support for this **PROJECT or PRODUCT** is limited to the resources listed above. 26 | -------------------------------------------------------------------------------- /docs/cn/00.IntroduceLLM.md: -------------------------------------------------------------------------------- 1 | # **前言:了解大型语言模型** 2 | 3 | 从上世纪 50 年代开始,人类对人工智能就开始探索。人工智能给人的印象一直是一个划时代的技术,需要有专业技能的数据科学领域从业人员才能使用。但从 2022 年年底开始,人工智能的领域发生了重大改变。自 OpenAI 发布 ChatGPT 开始,人工智能不再是单一领域或者说单一场景的解决方案。全新的多模态大型语言模型改变了游戏规则,无论你从事什么样的工作,都可以用自然语言和大型语言模型进行交流,大型语言模型反馈给你的是生成式的内容,这就包括文本、图片、视频等,大大增强了可用性。在进入 Semantic Kernel 的内容前,希望在前言部分和大家说说与大型语言模型相关的故事,让各位可以更好地理解大型语言模型。 4 | 5 | ## **大型语言模型基础** 6 | 7 | 大型语言模型(英文:Large Language Model,缩写 LLM),也称大语言模型,是一种人工智能模型,旨在理解和生成人类语言。它们在大量的文本数据上进行训练,可以执行广泛的任务,包括文本总结、翻译、情感分析等等。大型语言模型的特点是规模庞大,包含数十亿的参数,帮助它们学习语言数据中的复杂模式。这些模型通常基于深度学习架构以及 Transformer 算法。大型语言模型的训练方式是自我监督学习,即给定之前的词,通过预测序列中的下一个词或标记,为输入的数据生成自己的标签。训练过程包括两个主要步骤:预训练和微调。在预训练阶段,模型从一个巨大的、多样化的数据集中学习,通常包含来自不同来源的数十亿词汇,如网站、书籍和文章。在微调阶段,模型在与目标任务或领域相关的更具体、更小的数据集上进一步训练,这有助于模型微调其理解,并适应任务的特殊要求。现在很多公司都在开发大型语言模型,为世人所熟知的就包括 OpenAI 的 GPT-X 和 DALL·E 系列,Meta 的 LLaMA,Google 的 Gemini,以及百度的文心一言等。OpenAI 的 GPT-4 是现阶段最好的大型语言模型,有非常大的优势。但随着时间推移,也有不少好的行业垂直领域模型诞生。 8 | 9 | ### **Transformer 算法** 10 | 11 | Transformer 算法是一种基于自注意力机制的深度学习模型,它可以用于处理自然语言和其他序列数据。它由编码器和解码器两部分组成,每部分包含多个层,每层又包含多头自注意力和前馈神经网络。Transformer 算法的优点是可以并行处理整个序列,捕捉长距离的依赖关系,而不需要使用循环神经网络或卷积神经网络。Transformer 算法最初是在论文《Attention Is All You Need》中提出的,用于机器翻译任务。后来,它被广泛应用于其他自然语言处理任务,如文本摘要、问答、语音识别等。一些著名的基于 Transformer 的模型有 BERT、GPT-3 / 4、T5
等。 12 | 13 | 本次课程主要围绕 Azure OpenAI Services 的 GPT-3/3.5/4 以及 DALL·E 3 展开。至于其他的模型会在日后更多的进阶内容上提及,大家可以关注我的 GitHub Repo . 14 | 15 | ## **OpenAI 以及相关模型介绍** 16 | 17 | 虽然 Transformer 的算法来自 Google ,但真正让大型语言模型进入公众视野的是 OpenAI。OpenAI 是一个人工智能研究和部署的公司,它的使命是确保人工通用智能(AGI)能够造福全人类。它的愿景是创造一个能够与人类合作和竞争的 AGI,同时遵循人类的价值观和道德。OpenAI 最初是一个非营利性的组织,由埃隆·马斯克、萨姆·奥特曼等一些科技界的知名人士创立于 2015 年。它的目标是推进数字智能的发展,使其能够最大程度地惠及人类,而不受金钱利益的束缚。它的研究是开放和透明的,任何人都可以访问和使用。OpenAI 在人工智能领域有着开创性的研究,尤其是在生成模型和安全性方面。它开发了一些强大的大语言模型,如 GPT-3/3.5/4、ChatGPT、DALL·E、Whisper 等,它们可以理解和生成文本、图像、声音等多种形式的数据。它也致力于探索 AGI 的潜在风险和影响,以及如何使其与人类的目标和利益保持一致。 18 | 19 | ### **GPT 模型** 20 | 21 | GPT(Generative Pre-trained Transformer,生成预训练变换器)是一种基于深度学习的自然语言处理(NLP)模型,由 OpenAI 开发。GPT 系列模型以其能力强大和灵活性而闻名,在多种语言任务中表现出色。下面是 GPT 模型的一些关键特点和发展历程: 22 | 23 | GPT的核心特点 24 | 25 | 1. 基于Transformer架构:GPT模型基于Transformer架构,这是一种特别适合处理序列数据(如文本)的深度学习模型。 26 | 27 | 2. 大规模预训练:GPT通过在大量文本数据上进行预训练来学习语言的通用模式和结构。这包括书籍、网页、新闻文章等各种来源的文本。 28 | 29 | 3. 微调应用:预训练后,GPT模型可以针对特定任务进行微调(fine-tuning),如问答、文本生成、翻译等。 30 | 31 | 4. 
上下文敏感:GPT模型能够理解和生成与上下文相关的文本,这使得它在生成连贯和相关内容方面表现突出。 32 | 33 | GPT 发展历程 34 | 35 | GPT-1:第一个版本,展示了大规模未标记数据预训练的潜力,以及在多种任务上微调的有效性。 36 | 37 | GPT-2:增大了模型规模和训练数据量,显著提高了文本生成的质量和准确性。GPT-2因其能够生成连贯且有时难以区分于人类编写的文本而引起广泛关注。 38 | 39 | GPT-3:进一步扩大了模型规模,达到了前所未有的1750亿个参数。GPT-3在多项NLP任务上取得了革命性的表现,尤其是在仅需少量示例或无需微调的情况下。 40 | 41 | GPT-4及以后:随着技术的不断发展,后续的GPT模型可能会在模型规模、理解能力、多模态能力等方面继续进步。 42 | 43 | GPT 应用领域 44 | GPT 模型在众多领域都有广泛应用,包括但不限于: 45 | 46 | 文本生成:如文章撰写、创意写作、代码生成等。 47 | 48 | 聊天机器人:提供流畅的对话体验。 49 | 50 | 自然语言理解:如情感分析、文本分类等。 51 | 52 | 翻译和多语言任务:自动翻译不同语言。 53 | 54 | 知识提取和问答:从大量文本中提取信息,回答特定问题。 55 | 56 | 总的来说,GPT模型代表了当前人工智能和自然语言处理领域的一个重要里程碑,它的强大能力和多样的应用前景持续引领着技术发展的潮流。 57 | 58 | #### **变革者 - GPT-3** 59 | 60 | GPT-3 是一种大型语言模型,由 OpenAI 开发,可以理解和生成自然语言。它是目前最大的大型语言模型之一,拥有 1750 亿个参数,可以完成文本摘要、机器翻译、对话系统、代码生成等工作。GPT-3 的特点是可以通过简单的文本提示,即“少示例学习”(few-shot learning),来适应不同的任务和领域,而不需要额外的微调或标注数据。 GPT-3 打开了潘多拉魔盒,改变了行业规则。GPT-3 已经被应用于许多产品和服务中,如 OpenAI API、OpenAI Codex、早期的 GitHub Copilot 等,它们可以让开发者、创作者和学者更容易地使用和学习人工智能。GPT-3 也引发了一些关于人工智能的伦理、社会和安全的讨论和思考,如人工智能的偏见、可解释性、责任、影响等。 61 | 62 | #### **进化 - GPT-3.5 和 ChatGPT** 63 | 64 | GPT-3.5 和 ChatGPT 都是基于 GPT-3 架构的大语言模型,它们可以理解和生成自然语言。它们都有 1750 亿个参数,可以在多种语言处理任务上表现出惊人的能力,如文本摘要、机器翻译、对话系统、代码生成等。 65 | 66 | GPT-3.5 和 ChatGPT 的主要区别在于它们的范围和目的。GPT-3.5 是一种通用语言模型,可以处理各种语言处理任务。另一方面,ChatGPT 是一个专用模型,专为聊天应用程序设计。它强调了与用户的互动和沟通,可以根据用户的设定扮演不同的角色,如猫娘、明星、政治家等。 67 | 68 | GPT-3.5 和 ChatGPT 的另一个区别在于它们的训练数据和训练方式。GPT-3.5 是在 570 GB 的文本数据上进行了预训练,这些数据来自不同的来源,如网站、书籍、文章等。它的训练方式是通过自我监督学习,即给定之前的词,通过预测序列中的下一个词或标记,为输入的数据生成自己的标签。ChatGPT 则是在 GPT-3.5 的基础上,使用了更多的对话数据,如社交媒体、聊天记录、电影剧本等,进行了进一步的微调。它的训练还引入了基于人类反馈的强化学习(RLHF),即通过人类标注者对回答的偏好反馈来进一步优化模型,使其输出更符合对话场景的期望。 69 | 70 | 71 | #### **绝对领导者 - GPT-4** 72 | 73 | GPT-4(第四代生成预训练变换器)是由OpenAI开发的最新一代人工智能语言模型。它是GPT-3的继任者,拥有更先进和精细的功能。下面是 GPT-4 的一些主要特点: 74 | 75 | 更大的知识库和数据处理能力:GPT-4 可以处理更大量的数据,它的知识库比GPT-3更加广泛和深入。 76 | 77 | 更高的语言理解和生成能力:GPT-4 在理解和生成自然语言方面有显著提升,能更准确地理解复杂的语言结构和含义。 78 | 79 | 多模态能力:GPT-4 不仅可以处理文本,还可以理解和生成图像,提供多模态的交互体验。 80 | 81 | 更好的上下文理解:GPT-4 
能更好地理解和维持长篇对话中的上下文,提供更连贯和一致的回答。 82 | 83 | 更高的安全性和可靠性:OpenAI在 GPT-4 中加强了对不当内容的过滤和控制,以提供更安全可靠的用户体验。 84 | 85 | 广泛的应用领域:GPT-4 可应用于各种领域,包括但不限于聊天机器人、内容创作、教育辅助、语言翻译、数据分析等。 86 | 87 | 总的来说,GPT-4 在其前代模型的基础上做出了显著的改进和提升,能够提供更加强大和多样化的功能。 GPT-4 在现阶段有绝对的领导地位,也是很多公司大模型的追赶目标。 88 | 89 | 90 | #### **进击!睁开眼睛看世界 - GPT-4V** 91 | 92 | GPT-4V 全称是 GPT-4 with Vision,它可以理解图片,为用户解析图片并回答图片相关的问题。 GPT-4V 可以准确理解图像的内容,识别图像中的物体、计算物体的数量、提供图片相关的洞察和信息、提取文本等。可以说,GPT-4V 是大型语言模型的皇者,也让大型模型更好地理解世界。GPT-4V 的主要视觉能力和应用方向如下: 93 | 94 | **物体检测**:GPT-4V 能够识别并检测图像中的各种常见物体,例如汽车、动物和家居用品等。其识别能力已在标准图像数据集上进行了评估。 95 | 96 | **文本识别**:此模型具备光学字符识别(OCR)技术,可在图像中发现打印或手写文字,并将其转换为机器可读的文本。这项功能在文档、标志和标题等图像中得到了验证。 97 | 98 | **人脸识别**:GPT-4V 能够找出并识别图像中的人脸。它还具有一定程度的能力,可以通过面部特征来判断性别、年龄和种族属性。该模型的面部分析能力已在 FairFace 和 LFW 等数据集上进行了测试。 99 | 100 | **验证码解决方案**:GPT-4V 在解决基于文本和图像的验证码时展示了视觉推理能力。这表明模型具有高级的解谜技巧。 101 | 102 | **地理定位**:GPT-4V 能够识别风景图片中所呈现的城市或地理位置。这说明模型掌握了关于现实世界的知识,但也意味着存在泄露隐私的风险。 103 | 104 | **复杂图像**:处理复杂的科学图表、医学扫描或具有多个重叠文本组件的图像时,该模型表现较差。它无法把握上下文细节。 105 | 106 | 107 | ### **DALL·E 模型** 108 | 109 | DALL·E 是由 OpenAI 开发的一个先进的人工智能程序,专门用于生成图像。它是一个基于 GPT-3 架构的神经网络模型,但与 GPT-3 主要处理文本不同,DALL·E 的专长在于根据文本描述生成相应的图像。这个模型的名称是对著名艺术家萨尔瓦多·达利(Salvador Dalí)和流行动画角色沃利(WALL·E)的致敬。 110 | 111 | DALL·E的关键特点 112 | 文本到图像的转换:DALL·E能根据用户提供的文本描述生成图像。这些描述可以非常具体或富有创造性,模型会尽力生成符合描述的图像。 113 | 114 | 创造性和灵活性:DALL·E在生成图像时展现出惊人的创造性,能够组合不同的概念和元素,创建出独特和创新的视觉作品。 115 | 116 | 多样性和细节:该模型能生成多种风格和类型的图像,并能处理复杂的、具有细节的描述。 117 | 118 | 应用潜力:DALL·E在艺术创作、广告、设计等多个领域都有广泛的应用潜力。 119 | 120 | DALL·E 应用场景包括 121 | 122 | 艺术创作:艺术家和设计师可以使用DALL·E来探索新的创意和视觉表现。 123 | 124 | 广告和媒体:生成符合特定主题或概念的图像。 125 | 126 | 教育和娱乐:用于教学材料的制作或创造娱乐内容。 127 | 128 | 研究和探索:探索人工智能在视觉艺术领域的可能性。 129 | 130 | DALL·E 的出现标志着人工智能在创造性任务中的一个重要进步,显示了 AI 在视觉艺术领域的巨大潜力。现在最新的 DALL·E 模型是 DALL·E 3。 131 | 132 | 133 | ### **Whisper 模型** 134 | 135 | Whisper 是由OpenAI开发的一种先进的自动语音识别(ASR)模型。这个模型专注于转录语音为文本,且在多种语言和不同的环境中都表现出了卓越的性能。以下是关于Whisper模型的一些关键特点: 136 | 137 | 特点 138 | 多语言支持:Whisper模型能够处理多种不同的语言和方言,使其在全球范围内具有广泛的适用性。 139 | 140 | 
高精度识别:它能准确地识别和转录语音,即使在背景噪音较多的环境中也能保持较高的准确率。 141 | 142 | 自适应不同语境:Whisper不仅能识别标准的语音输入,还能适应各种口语化和非正式的对话风格。 143 | 144 | 易于集成和使用:作为一个机器学习模型,Whisper可以集成到各种应用和服务中,提供语音识别功能。 145 | 146 | 应用 147 | 148 | 自动字幕和转录:为视频和音频内容自动生成字幕或文本。 149 | 150 | 语音助手和聊天机器人:提高语音助手和聊天机器人对语音指令的识别能力。 151 | 152 | 无障碍服务:帮助有听力障碍的人士更好地理解音频内容。 153 | 154 | 会议和讲座记录:自动记录和转录会议或讲座的内容。 155 | 156 | 总体来说,Whisper模型代表了自动语音识别领域的一个重要进步,其多语言和高精度识别能力使其在各种应用场景中都极具价值。 157 | 158 | 159 | 160 | ## **微软 与 OpenAI** 161 | 162 | 163 | ![msandopenai](./../../imgs/00/msandopenai.png) 164 | 165 | 166 | 微软和OpenAI之间的合作关系是当代人工智能领域的一个重要发展。自从OpenAI成立以来,微软就一直是其重要的合作伙伴和支持者。以下是他们合作的一些关键方面和影响: 167 | 168 | **投资和合作** 169 | 170 | 资金支持:微软在OpenAI的早期就进行了重大投资,包括数亿美元的资金。这些投资帮助OpenAI发展其研究项目和技术。 171 | 172 | 云计算资源:微软向OpenAI提供了其Azure云计算平台的资源,这对于训练和运行大型AI模型,如GPT和DALL·E系列模型,至关重要。 173 | 174 | **技术合作** 175 | 176 | 共同研发:两家公司在多个AI项目和技术上展开了合作,共同推动人工智能的发展。 177 | 178 | 产品集成:OpenAI的某些技术,如GPT-3,已被集成到微软的产品和服务中,例如Microsoft Azure和其它企业级解决方案。 179 | 180 | **战略合作** 181 | 182 | 可持续和安全的AI:双方致力于发展既可持续又安全的AI技术,重视AI伦理和安全性问题。 183 | 184 | 拓展AI应用:通过合作,两家公司致力于将AI技术应用到更广泛的领域,例如健康保健、教育和环境保护等。 185 | 186 | **影响** 187 | 188 | 加速AI技术的发展:这种合作促进了人工智能技术的快速发展和创新。 189 | 190 | 商业应用和服务:微软通过将OpenAI的技术应用于其产品和服务,推动了人工智能在商业领域的广泛应用。 191 | 192 | 推动AI民主化:这种合作有助于使更多的企业和开发者能够访问和使用先进的AI技术。 193 | 194 | 总体来说,微软与OpenAI之间的合作是技术创新和商业应用相结合的典范,这种合作对人工智能技术的发展和普及产生了深远的影响。随着双方合作的不断深化,可以预期他们将继续在人工智能领域发挥重要作用。 195 | 196 | ### **Azure OpenAI Service** 197 | 198 | Azure OpenAI Service 是 Microsoft Azure 与领先的人工智能研究组织 OpenAI 之间的合作。 Azure OpenAI Service 是一个基于云的平台,使开发人员和数据科学家能够快速轻松地构建和部署人工智能模型。 借助 Azure OpenAI,用户可以访问各种 AI 工具和技术来创建智能应用程序,包括自然语言处理、计算机视觉和深度学习。Azure OpenAI Service 旨在加速 AI 应用程序的开发,使用户能够专注于创建为其组织和客户创造价值的创新解决方案。 199 | 200 | Azure OpenAI Service 提供对 OpenAI 强大语言模型的 REST API 访问,这些模型包括 GPT-4、GPT-4 Turbo with Vision、GPT-3.5-Turbo 和嵌入模型系列。 此外,新的 GPT-4 和 GPT-3.5-Turbo 模型系列现已正式发布。 这些模型可以轻松适应特定的任务,包括但不限于内容生成、汇总、图像理解、语义搜索和自然语言到代码的转换。 用户可以通过 REST API、Python SDK 或 Azure OpenAI Studio 中基于 Web 的界面访问该服务。 201 | 202 | 使用 
Azure OpenAI Service 你需要有 Azure 账号,然后通过该链接进行申请,等待 1-3 个工作日即可使用 Azure OpenAI Service。 203 | 204 | **Azure OpenAI Studio** 205 | 206 | 我们可以通过 Azure OpenAI Studio 管理我们的模型,以及在 Playground 中测试我们的模型 207 | 208 | ![cover](./../../imgs/00/aoaistudio.png) 209 | 210 | ***注意:*** 所有例子都基于 Azure OpenAI Services 211 | 212 | 213 | ## **Hugging Face** 214 | 215 | Hugging Face 是一家专注于自然语言处理(NLP)的人工智能研究公司,以其开源项目和在NLP领域的创新而闻名。公司成立于 2016 年,总部位于纽约,并在全球范围内有影响力和活动。 216 | 217 | ### **主要贡献和产品** 218 | 219 | **Transformers库**:Hugging Face 最著名的贡献是其开发的 “Transformers” 库,这是一个广泛使用的Python库,包含了多种预训练的NLP模型,如BERT、GPT、T5等。这个库使得访问和使用这些复杂模型变得更加容易,对促进NLP领域的研究和应用有重大影响。 220 | 221 | **模型共享和社区**:Hugging Face建立了一个强大的社区,促进了研究者和开发者之间的模型和知识共享。通过其平台,任何人都可以上传、分享和使用预训练模型。 222 | 223 | **研究和合作**:Hugging Face在人工智能领域进行积极的研究,与学术界和工业界的众多团队合作。 224 | 225 | **教育和资源**:Hugging Face还提供各种教育资源,包括教程、文档和研究论文,帮助人们更好地了解和使用NLP技术。 226 | 227 | ### **影响** 228 | 229 | 技术创新:Hugging Face在推动NLP领域的技术创新方面发挥了重要作用,特别是在预训练模型的发展和应用方面。 230 | 231 | 降低技术门槛:通过提供易于使用的工具和资源,Hugging Face 降低了在NLP领域工作的技术门槛,使得更多的研究者和开发者能够参与到这一领域中。 232 | 233 | 社区建设:其强大的社区和开源文化促进了知识共享和协作,加速了 NLP 技术的发展和创新。 234 | 235 | 虽然 Hugging Face 最初是作为一个面向消费者的聊天机器人应用开始的,但它迅速转变为专注于提供NLP技术和资源的公司。现在,它不仅支持研究和教育,还为企业提供商业解决方案,如定制模型训练、数据处理和机器学习咨询服务。 236 | 237 | 综上所述,Hugging Face 是NLP领域的一个关键参与者,其开源精神和对社区的贡献在促进人工智能技术的民主化和创新方面发挥了重要作用。 238 | 239 | Azure AI Studio 也支持 Hugging Face 模型引入,可以让企业更好地结合业务场景使用不同模型解决不同应用场景的问题。 240 | 241 | 242 | ![hugging face](./../../imgs/00/aoaistudio_huggingface.png) 243 | 244 | 245 | 246 | ## **小结** 247 | 248 | 本章为大家介绍了现阶段和大型语言模型相关的知识,特别是 OpenAI、Microsoft、Hugging Face 等主流大型语言模型平台的相关知识,以及不同模型的应用场景和性能。对于应用场景来说,我们不可能只使用一个模型。在 AI 2.0 时代,我们需要有不同模型的支持,完成更多的智能化应用场景。无论在云端还是本地,大型语言模型的应用场景都是未来几年所关注的热点。作为初学者你需要做的是了解不同模型,结合实际场景来完成应用搭建。 249 | 250 | -------------------------------------------------------------------------------- /docs/cn/01.UsingAzureOpenAIServiceWithSDK.md: -------------------------------------------------------------------------------- 1 | # **第一章:用 SDK 访问 Azure OpenAI 
Service** 2 | 3 | 在前言部分,我们了解了大型语言模型的相关知识,下面我想谈谈如何使用大型语言模型,在未进入 Semantic Kernel 之前,我更希望大家看看如何正确通过 SDK 去访问 Azure OpenAI Service 上的 Azure OpenAI 模型。 4 | 5 | ## **准备工作:在 Azure OpenAI Studio 部署模型** 6 | 7 | 部署 Azure OpenAI 模型很简单,在申请成功 Azure OpenAI Service 后,通过在 Azure Portal 创建资源进行部署。以下是相关步骤: 8 | 9 | 1. 在 [Azure Portal](https://portal.azure.com/) 选择 Azure OpenAI 创建资源 10 | 11 | ![aoairesource](../../imgs/01/aoairesource.png) 12 | 13 | 选择 'Create' 后,配置好 Azure OpenAI 所在区域,需要注意:因为资源分布不同,不同区域所拥有的 OpenAI 模型不尽相同,在使用前一定要了解清楚。 14 | 15 | 16 | ![aoaicreate](../../imgs/01/aoaicreate.png) 17 | 18 | 等待片刻,即创建成功 19 | 20 | 21 | ![aoaidone](../../imgs/01/aoaidone.png) 22 | 23 | 2. 进入创建好的资源,你可以部署模型,以及获取 SDK 调用时需要的 Key,以及 Endpoint 24 | 25 | ![aoaikey](../../imgs/01/aoaikey.png) 26 | 27 | 3. 进入 'Model Deployment' 选择 'Management Deployment' 进入 Azure OpenAI Studio 28 | 29 | ![aoaigo](../../imgs/01/aoaigo.png) 30 | 31 | 4. 在 Azure OpenAI Studio 部署您的模型 32 | 33 | ![aoaimodeldeploy](../../imgs/01/aoaimodeldeploy.png) 34 | 35 | 选择你需要的模型 36 | 37 | ![aoaimodel](../../imgs/01/aoaimodel.png) 38 | 39 | 可以看到您的模型列表 40 | 41 | ![aoaimodels](../../imgs/01/aoaimodels.png) 42 | 43 | 44 | 恭喜你,成功部署了模型,接下来就可以用 SDK 对接了。 45 | 46 | 47 | ## **使用 SDK 链接 Azure OpenAI Service** 48 | 49 | 和 Azure OpenAI Service 对接的 SDK,针对 Python 版本有 OpenAI 发布的 SDK, 针对 .NET 也有 Microsoft 发布的 SDK。作为初学者建议都在 Notebook 环境下使用,以便更容易理解执行的关键步骤。 50 | 51 | 52 | ### **关于 Python SDK** 53 | 54 | OpenAI 发布的官方 Python SDK, 支持链接 OpenAI 和 Azure OpenAI Service。现在 OpenAI SDK 发布了 1.x 版本,但在市面上很多都在用 0.2x 版本。***本次课程内容都会基于 OpenAI SDK 1.x 版本,并使用 Python 3.10.x。*** 55 | 56 | 安装方式 57 | 58 | 59 | ```python 60 | 61 | ! 
pip install openai -U 62 | 63 | ``` 64 | 65 | 66 | ### **关于 .NET SDK** 67 | 68 | Microsoft 发布基于 Azure OpenAI Service 的 SDK,你可以通过 Nuget 获取最新的包来完成 .NET 生成式 AI 应用。***本次课程内容都会基于 .NET 8 以及最新的 Azure.AI.OpenAI SDK 来展示例子,当然也会使用 Polyglot Notebook 作为环境*** 69 | 70 | 安装方式 71 | 72 | ```csharp 73 | 74 | #r "nuget: Azure.AI.OpenAI, *-*" 75 | 76 | ``` 77 | 78 | 在上面我们已经配置好基于 .NET / Python 的 SDK 环境,接下来我们需要创建好链接的类以完成相关的初始化工作 79 | 80 | 针对 .NET 环境的初始化 81 | 82 | 83 | ```csharp 84 | 85 | string endpoint = "Your Azure OpenAI Service Endpoint"; 86 | string key = "Your Azure OpenAI Service Key"; 87 | 88 | OpenAIClient client = new(new Uri(endpoint), new AzureKeyCredential(key)); 89 | 90 | ``` 91 | 92 | 针对 Python 环境的初始化 93 | 94 | 95 | 96 | ```python 97 | 98 | from openai import AzureOpenAI 99 | client = AzureOpenAI( 100 | azure_endpoint = 'Your Azure OpenAI Service Endpoint', 101 | api_key='Your Azure OpenAI Service Key', 102 | api_version="Your Azure OpenAI API version" 103 | ) 104 | ``` 105 | 106 | 107 | ## **Azure OpenAI Service 的常用 API 以及 SDK 调用方法** 108 | 109 | ### ***1. 文本补全 API*** 110 | 111 | 这是基于 gpt-35-turbo-instruct 模型,对于文本补全非常重要的接口 112 | 113 | **.NET 场景下的文本补全** 114 | 115 | ```csharp 116 | 117 | CompletionsOptions completionsOptions = new() 118 | { 119 | DeploymentName = "gpt-35-turbo-instruct", 120 | Prompts = { "Can you introduce what is generative AI ?" }, 121 | }; 122 | 123 | Response<Completions> completionsResponse = client.GetCompletions(completionsOptions); 124 | 125 | string completion = completionsResponse.Value.Choices[0].Text; 126 | 127 | ``` 128 | 129 | **Python 场景下的文本补全** 130 | 131 | 132 | ```python 133 | 134 | start_phrase = 'Can you introduce what is generative AI ?' 135 | 136 | response = client.completions.create(model=deployment_name, prompt=start_phrase, max_tokens=1000) 137 | 138 | text = response.choices[0].text.replace('\n', '').replace(' .', '.').strip() 139 | 140 | ``` 141 | 142 | ### ***2. 
Chat API*** 143 | 144 | 这是基于 gpt-35-turbo 和 gpt-4 模型、面向聊天场景的接口 145 | 146 | **.NET 场景下的 Chat** 147 | 148 | ```csharp 149 | 150 | var chatCompletionsOptions = new ChatCompletionsOptions() 151 | { 152 | DeploymentName = "gpt-4", 153 | Messages = 154 | { 155 | new ChatRequestSystemMessage("You are my coding assistant."), 156 | new ChatRequestUserMessage("Can you tell me how to write python flask application?"), 157 | }, 158 | MaxTokens = 10000 159 | }; 160 | 161 | Response<ChatCompletions> response = client.GetChatCompletions(chatCompletionsOptions); 162 | 163 | ``` 164 | 165 | **Python 场景下的 Chat** 166 | 167 | ```python 168 | 169 | response = client.chat.completions.create( 170 | model="gpt-35-turbo", # model = "deployment_name". 171 | messages=[ 172 | {"role": "system", "content": "You are my coding assistant."}, 173 | {"role": "user", "content": "Can you tell me how to write python flask application?"} 174 | ] 175 | ) 176 | 177 | print(response.choices[0].message.content) 178 | 179 | ``` 180 | 181 | ### ***3. 文生图 API*** 182 | 183 | 基于 DALL·E 3 模型的文生图场景 184 | 185 | **.NET 场景下的文生图** 186 | 187 | 188 | ```csharp 189 | 190 | Response<ImageGenerations> imageGenerations = await client.GetImageGenerationsAsync( 191 | new ImageGenerationOptions() 192 | { 193 | DeploymentName = "Your Azure OpenAI Service Dall-E 3 model Deployment Name", 194 | Prompt = "Chinese New Year picture for the Year of the Dragon", 195 | Size = ImageSize.Size1024x1024, 196 | }); 197 | 198 | 199 | ``` 200 | 201 | 202 | **Python 场景下的文生图** 203 | 204 | 205 | ```python 206 | import json 207 | result = client.images.generate( 208 | model="dalle3", # 你的 DALL·E 3 模型 Deployment Name 209 | prompt="Chinese New Year picture for the Year of the Dragon", 210 | n=1 211 | ) 212 | 213 | json_response = json.loads(result.model_dump_json()) 214 | 215 | ``` 216 | 217 | 218 | ### ***4. 
Embeddings API*** 219 | 220 | 基于 text-embedding-ada-002 模型,实现文本到向量的转换 221 | 222 | 223 | **.NET 场景下的 Embeddings** 224 | 225 | 226 | ```csharp 227 | 228 | EmbeddingsOptions embeddingOptions = new() 229 | { 230 | DeploymentName = "text-embedding-ada-002", 231 | Input = { "Kinfey is Microsoft Cloud Advocate" }, 232 | }; 233 | 234 | var returnValue = client.GetEmbeddings(embeddingOptions); 235 | 236 | foreach (float item in returnValue.Value.Data[0].Embedding.ToArray()) 237 | { 238 | Console.WriteLine(item); 239 | } 240 | 241 | ``` 242 | 243 | 244 | **Python 场景下的 Embeddings** 245 | 246 | ```python 247 | 248 | client.embeddings.create(input = ['Kinfey is Microsoft Cloud Advocate'], model='text-embedding-ada-002').data[0].embedding 249 | 250 | ``` 251 | 252 | 253 | ## **例子** 254 | 255 | 以下列出了和上述接口相关的例子,请根据您的语言环境进行学习 256 | 257 | ### ***Python 例子*** 请[点击访问这里](https://github.com/microsoft/SemanticKernelCookBook/blob/main/notebooks/python/01/PythonSDKAOAIDemo.ipynb) 258 | 259 | ### ***.NET 例子*** 请[点击访问这里](https://github.com/microsoft/SemanticKernelCookBook/blob/main/notebooks/dotNET/01/dotNETSDKAOAIDemo.ipynb) 260 | 261 | ## **小结** 262 | 263 | 我们用最原始,也是最基础的 SDK 与 Azure OpenAI Service 打交道,这也是我们面向生成式人工智能编程的第一步,在没有使用框架的情况下可以更快速地理解不同接口,也为我们进入 Semantic Kernel 打下基础。 264 | 265 | 266 | 267 | 268 | 269 | 270 | 271 | 272 | 273 | 274 | 275 | 276 | 277 | -------------------------------------------------------------------------------- /docs/cn/02.IntroduceSemanticKernel.md: -------------------------------------------------------------------------------- 1 | # **第二章:Semantic Kernel 基础** 2 | 3 | 我们已经和大家讲述了 LLMs 以及利用 .NET 和 Python SDK 链接 Azure OpenAI Service 的方法。接下来我们就将进入 Semantic Kernel 的世界。在 2023 年我们有非常多的基于 LLMs 的框架诞生,我们为什么要选择 Semantic Kernel ? Semantic Kernel 的优缺点是什么?还有作为传统开发者如何使用 Semantic Kernel ? 
我都会在本章内容中和大家一一细说。 4 | 5 | ## **什么是 Semantic Kernel** 6 | 7 | Semantic Kernel 是一个轻量级的开源框架,通过 Semantic Kernel 您可以快速使用不同编程语言(C#/Python/Java)结合 LLMs(OpenAI、Azure OpenAI、Hugging Face 等模型) 构建智能应用。在我们进入生成式人工智能后,人机对话的方式有了很大的改变,我们用自然语言就可以完成与机器的对话,门槛降低了非常多。结合提示工程和大型语言模型,我们可以用更低的成本完成不同的业务。但如何把提示工程以及大模型引入到工程上?我们就需要一个像 Semantic Kernel 的框架作为开启智能大门的基础。在 2023 年 5 月,微软 CTO Kevin Scott 就提出了 Copilot Stack 的概念,人工智能编排就是核心。 Semantic Kernel 具备和 LLMs 以及各种提示工程/软件组成的插件组合的能力,因此也被看作 Copilot Stack 的最佳实践。通过 Semantic Kernel,你可以非常方便地构建基于 Copilot Stack 的解决方案,而且对于传统工程,也可以无缝对接。 8 | 9 | ### **Semantic Kernel vs LangChain** 10 | 11 | 我们没法不去作出一些客观的比较,毕竟 LangChain 拥有更多的使用群体。无可否认在落地场景上,LangChain 现在比 Semantic Kernel 更多,特别在入门参考例子上。我们来做个更为全面的比较: 12 | 13 | ***LangChain*** 是基于 Python 和 JavaScript 的开源框架,包含了众多预制组件,开发者可以无需多写提示工程就可以完成智能应用的开发。特别在复杂的应用场景,开发人员可以快速整合多个预定义的组件来综合完成。从开发角度看,更适合具备数据科学或者人工智能基础的开发人员。 14 | 15 | ***Semantic Kernel*** 是基于 C#、Python、Java 的开源框架,更大的优势在于工程化。毕竟它更像一个编程范式,传统开发人员可以很快掌握该框架进行应用开发,而且可以更好结合自定义的插件和提示工程完成企业的定制化业务智能化工作。 16 | 17 | 两者有很多共通点,都还在版本迭代,我们需要基于团队结构,技术栈,应用场景作出选择。 18 | 19 | ### **Semantic Kernel 的特点** 20 | 21 | 1. **强大的插件** - 你可以通过结合自定义/预定义的插件解决智能业务的问题。让传统的代码和智能插件一起工作,灵活地接入到应用场景,简化传统应用向智能化转型的过程。 22 | 23 | 2. **多模型支持** - 为您的智能应用配置“大脑”,可以是来自 Azure OpenAI Service , 也可以是 OpenAI ,以及来自 Hugging Face 上的各种离线模型。通过链接器你可以快速接入不同的“大脑”,让您的应用更智能更聪明。 24 | 25 | 3. **各式各样的链接器** - 链接器除了链接“大脑”外,还可以链接如向量数据库,各种商业软件,不同的业务中间件,让更多的业务场景进入智能成为可能 26 | 27 | 4. 
**开发便捷** - 简单易用,开发人员零成本入门 28 | 29 | ### **Semantic Kernel 的缺点** 30 | 31 | 毕竟 LLMs 还在不停发展,有很多新模型的加入,也有很多新的功能,以及新的概念引入。Semantic Kernel, LangChain 等开源框架都在努力适应这个新的摩尔定律,但版本的迭代会有不确定的更改。所以在使用的时候,开发者需要多留意对应 GitHub Repo 上的变更日志。 32 | 33 | 此外,Semantic Kernel 需要兼顾多个编程语言,各语言版本的进度并不一致,这也会影响不同技术栈人群对 Semantic Kernel 的选择。 34 | 35 | 36 | ### **Semantic Kernel 中的 Kernel** 37 | 38 | 如果把 Semantic Kernel 看作是 Copilot Stack 最佳实践,那 Kernel 就是 AI 编排的中心,在官方文档中也有所提及。通过 Kernel 可以和不同插件,服务,日志以及不同的模型链接在一起。所有 Semantic Kernel 的应用都从 Kernel 开始。 39 | 40 | ![kernel](../../imgs/02/kernel.png) 41 | 42 | 43 | ## **用 Semantic Kernel 构建一个简单的翻译项目** 44 | 45 | 说了一些基础知识后,我们开始学习如何把 Semantic Kernel 引入到 .NET 和 Python 工程项目,并用四步完成一个翻译功能的实现。 46 | 47 | ### **第一步:引入 Semantic Kernel 库** 48 | 49 | 50 | **.NET 开发者** 51 | 52 | ***注意:我们在这里使用的是最新的 Semantic Kernel 1.3.0 的版本, 并使用 Polyglot Notebook 来完成相关学习*** 53 | 54 | 55 | ```csharp 56 | 57 | #r "nuget: Microsoft.SemanticKernel, *-*" 58 | 59 | ``` 60 | 61 | 62 | **Python 开发者** 63 | 64 | 65 | ***注意:我们在这里使用的是最新的 Semantic Kernel 0.5.1.dev 的版本, 并使用 Notebook 来完成相关学习*** 66 | 67 | 68 | ```python 69 | 70 | ! 
pip install semantic-kernel -U 71 | 72 | ``` 73 | 74 | 75 | ### **第二步:创建 Kernel 对象** 76 | 77 | 78 | 这里注意,我们需要把 Azure OpenAI Service 相关的 Endpoint、Key 以及模型的 Deployment Name 都放在一个固定文件内,方便调用以及设置。在 .NET 环境中我设置在 Settings.cs 文件内,在 Python 环境中我设置在 .env 文件内 79 | 80 | **.NET 开发者** 81 | 82 | ```csharp 83 | 84 | using Microsoft.SemanticKernel; 85 | using Microsoft.SemanticKernel.Connectors.OpenAI; 86 | 87 | 88 | Kernel kernel = Kernel.CreateBuilder() 89 | .AddAzureOpenAIChatCompletion(Settings.AOAIModel , Settings.AOAIEndpoint, Settings.AOAIKey) 90 | .Build(); 91 | 92 | ``` 93 | 94 | **Python 开发者** 95 | 96 | ```python 97 | 98 | import semantic_kernel as sk 99 | import semantic_kernel.connectors.ai.open_ai as skaoai 100 | 101 | kernel = sk.Kernel() 102 | deployment, api_key, endpoint = sk.azure_openai_settings_from_dot_env() 103 | kernel.add_chat_service("azure_chat_competion_service", skaoai.AzureChatCompletion(deployment,endpoint,api_key=api_key,api_version = "2023-12-01-preview")) 104 | 105 | 106 | ``` 107 | ### **第三步:引入插件** 108 | 109 | 在 Semantic Kernel 中,我们有不同的插件,用户可以使用预定义的插件,也可以使用自定义的插件。想了解更多可以关注下一章的内容,我们会详细讲述插件的使用。该例子,我们使用的是自定义插件,已经在 plugins 目录下了。 110 | 111 | 112 | **.NET 开发者** 113 | 114 | ```csharp 115 | 116 | var plugin = kernel.CreatePluginFromPromptDirectory(Path.Combine(pluginDirectory, "TranslatePlugin")); 117 | 118 | 119 | ``` 120 | 121 | 122 | **Python 开发者** 123 | 124 | ```python 125 | 126 | pluginFunc = kernel.import_semantic_plugin_from_directory(base_plugin,"TranslatePlugin") 127 | 128 | ``` 129 | 130 | 131 | ### **第四步:执行** 132 | 133 | 134 | **.NET 开发者** 135 | 136 | ```csharp 137 | 138 | var translateFunc = plugin["Basic"]; 139 | 140 | var result = await kernel.InvokeAsync(translateFunc, new() { ["input"] = "你好,我是你的 AI 编排助手 - Semantic Kernel" }); 141 | 142 | 143 | ``` 144 | 145 | 146 | **Python 开发者** 147 | 148 | ```python 149 | 150 | translateFunc = pluginFunc["Basic"] 151 | 152 | result = translateFunc("你好,我是你的 AI 编排助手 - Semantic Kernel") 153 | 154 | 155 | ``` 156 | 157 | 具体实现,您可以访问 158 | 159 | 160 | ***.NET 
例子*** 请[点击访问这里](https://github.com/microsoft/SemanticKernelCookBook/blob/main/notebooks/dotNET/02/LearnSK.ipynb) 161 | 162 | ***Python 例子*** 请[点击访问这里](https://github.com/microsoft/SemanticKernelCookBook/blob/main/notebooks/python/02/LearnSK.ipynb) 163 | 164 | 165 | ## **小结** 166 | 167 | 您第一次接触到 Semantic Kernel 的感觉如何呢?您在本章中学习到了 Semantic Kernel 的基础知识,并了解到 Semantic Kernel 和 LangChain 的对比以及各自的优缺点,这对您在工程落地的过程中会有所帮助。我们也通过 Semantic Kernel 四步完成了一个翻译,算是完成了一个大模型时代下的 hello world,给你的入门带来信心。接下来您可以打开下一章的进阶学习,了解更多的 Semantic Kernel 知识。 168 | 169 | 170 | 171 | 172 | 173 | 174 | 175 | 176 | 177 | -------------------------------------------------------------------------------- /docs/cn/03.Plugins.md: -------------------------------------------------------------------------------- 1 | # **第三章:开启大模型的技能之门 - Plugins** 2 | 3 | 在上一章,我们了解了 Semantic Kernel 的基本知识。其中 Semantic Kernel 的一大特点是拥有强大的插件,通过结合自定义/预定义的插件解决智能业务的问题,让传统的代码和智能插件一起工作,灵活地接入应用场景,简化传统应用向智能化转型的过程。 本章,我们主要介绍如何使用插件。 4 | 5 | ## **什么是插件** 6 | 7 | ![introplugins](../../imgs/03/introplugins.png) 8 | 9 | 我们知道 LLMs 的训练数据存在时间截点,在获取实时内容或者企业内部知识方面有相当大的缺陷。OpenAI 通过插件将 ChatGPT 连接到第三方应用程序。 这些插件使 ChatGPT 能够与开发人员定义的 API 进行交互,从而增强 ChatGPT 的功能并允许有更广泛的操作,如: 10 | 11 | 1. 检索实时信息,例如,体育赛事比分、股票价格、最新新闻等。 12 | 13 | 2. 检索知识库信息, 例如,公司文档、个人笔记等。 14 | 15 | 3. 协助用户进行相关操作,例如,预订航班、订餐等。 16 | 17 | Semantic Kernel 遵循 OpenAI 的插件规范,可以很方便地接入和导出插件(如基于 Bing, Microsoft 365, OpenAI 的插件),这样可以让开发人员很简单地调用不同的插件服务。除了兼容 OpenAI 的插件外,Semantic Kernel 内也有属于自己插件定义的方式。不仅可以在规定模版格式上定义 Plugins, 更可以在函数内定义 Plugins. 
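为了便于理解“插件”这一概念,下面用纯 Python 写一个极简的示意(注意:这并不是 Semantic Kernel 的真实 API,这里的 `Plugin` 类和 `TranslatePlugin`、`Basic` 等名称都是为说明概念而假设的):插件本质上就是一组带有名称和描述的功能,描述供大模型或编排器挑选合适的功能来完成任务。

```python
# 极简示意:用纯 Python 模拟插件的组织方式(并非 Semantic Kernel 的真实 API)
# 每个插件是一组带名称和描述的函数,描述供编排器(或 LLM)挑选合适的功能

class Plugin:
    def __init__(self, name):
        self.name = name
        self.functions = {}  # 函数名 -> (描述, 可调用对象)

    def add_function(self, name, description, func):
        self.functions[name] = (description, func)

    def invoke(self, func_name, **kwargs):
        # 按名称找到函数并调用
        _, func = self.functions[func_name]
        return func(**kwargs)

# 定义一个假设的 TranslatePlugin,其中 Basic 功能做演示用的“翻译”
translate_plugin = Plugin("TranslatePlugin")
translate_plugin.add_function(
    "Basic",
    "Translate input into English",
    lambda input: f"[EN] {input}",  # 真实场景中这里会调用 LLM
)

result = translate_plugin.invoke("Basic", input="你好")
print(result)  # 输出: [EN] 你好
```

真实的 Semantic Kernel 插件(无论是模版形式还是函数形式)也是同样的思路:名称 + 描述 + 可执行逻辑,只是执行逻辑可能是提示工程,也可能是原生代码。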
18 | 19 | ### **认知 Plugins** 20 | 21 | 在早期的 Semantic Kernel 预览版本中,Plugins 被定义为 Skills。正如上面提及的,和 OpenAI 看齐。但具体的目标没有变化。你可以把 Semantic Kernel 的插件理解为供开发人员完成智能业务需求的各种功能模块。你可以把 Semantic Kernel 看成是乐高的底座,要完成一个长城的堆砌就需要有各种不同的功能模块。而这些功能模块就是我们所说的插件。 22 | 23 | 我们可能有基于业务的插件集,如 HRPlugins 下就包含了不同的功能,如: 24 | 25 | 26 | | 插件 | Intro | 27 | |----------|:----------| 28 | | Holidays | 关于假期内容的插件 | 29 | | Contracts | 关于员工合同 | 30 | | Departments | 关于组织结构相关的 | 31 | 32 | 这些子功能可以根据不同的要求进行有效组合来完成不同的计划任务。 例如以下结构: 33 | 34 | 35 | ```txt 36 | 37 | |-plugins 38 | |-HRPlugins 39 | |-Holidays 40 | |-Contract 41 | |-Departments 42 | |-EmailsPlugins 43 | |-HRMail 44 | |-CustomMail 45 | |-PeoplesPlugins 46 | |-Managers 47 | |-Workers 48 | 49 | ``` 50 | 51 | 我们向大模型发出一条指令“请向合同到期的管理人员,发送一个 Email”,实际就是从不同的组件中寻找组合,最后依据 Contract + Managers + HRMail 来完成相关的工作 52 | 53 | 54 | ## **Semantic Kernel 插件** 55 | 56 | 57 | 58 | ### **通过模版定义插件** 59 | 60 | 我们知道通过提示工程可以和 LLMs 进行对话。对于一个企业或者创业公司,我们在处理业务时,可能不止一个提示工程,而是需要一个提示工程的合集。我们可以把这些针对业务能力的提示工程集放到 Semantic Kernel 的插件集合内。对于结合提示工程的插件,Semantic Kernel 有固定的模版,提示工程都放在 skprompt.txt 文件内,而相关参数设置都放在 config.json 文件内。最后的文件结构是这样的 61 | 62 | 63 | ```txt 64 | 65 | |-plugins 66 | |-HRPlugins 67 | |-Holidays 68 | |-skprompt.txt 69 | |-config.json 70 | |-Contract 71 | |-skprompt.txt 72 | |-config.json 73 | |-Departments 74 | |-skprompt.txt 75 | |-config.json 76 | |-EmailsPlugins 77 | |-HRMail 78 | |-skprompt.txt 79 | |-config.json 80 | |-CustomMail 81 | |-skprompt.txt 82 | |-config.json 83 | |-PeoplesPlugins 84 | |-Managers 85 | |-skprompt.txt 86 | |-config.json 87 | |-Workers 88 | |-skprompt.txt 89 | |-config.json 90 | 91 | ``` 92 | 93 | 我们先来看看 **skprompt.txt** 的定义,这里一般是放置和业务相关的 prompt,可以支持多个参数,每个参数都放置在 {{$参数名}} 内,如以下格式: 94 | 95 | 96 | ```txt 97 | 98 | Translate {{$input}} into {{$language}} 99 | 100 | 101 | ``` 102 | 103 | 我们这里的工作是把输入内容翻译成特定的语言,input 和 language 是两个参数,也就是说你可以给这两个参数传入任意的值。 104 | 105 | **config.json** 中是相关的配置内容,除了设置和 LLMs 相关的参数外,你也可以设定输入参数以及相关描述 106 | 107 | ```json 108 | 
109 | { 110 | "schema": 1, 111 | "description": "Translate sentences into a language of your choice", 112 | "execution_settings": { 113 | "default": { 114 | "max_tokens": 2000, 115 | "temperature": 0.7, 116 | "top_p": 0.0, 117 | "presence_penalty": 0.0, 118 | "frequency_penalty": 0.0, 119 | "stop_sequences": [ 120 | "[done]" 121 | ] 122 | } 123 | }, 124 | "input_variables": [ 125 | { 126 | "name": "input", 127 | "description": "sentence to translate", 128 | "default": "" 129 | }, 130 | { 131 | "name": "language", 132 | "description": "Language to translate to", 133 | "default": "" 134 | } 135 | ] 136 | } 137 | 138 | ``` 139 | 140 | 141 | ### **通过函数定义插件** 142 | 143 | 我们也可以通过函数定义不同的插件,这有点像 gpt-3.5-turbo 在 2023 年 6 月 13 日发布的 Function Calling,即通过增加外部函数调用来增强 OpenAI 模型的能力。正如本章开篇时所提及的。 144 | 145 | #### **Function Calling** 146 | 147 | 通过对函数的描述,LLMs 可以生成调用这些函数所需参数的 JSON 对象。这是一种把 GPT 的能力与外部工具和 API 连接起来的方法。支持 Azure OpenAI Service 或 OpenAI 上的: 148 | 149 | - gpt-4 150 | - gpt-4-1106-preview 151 | - gpt-4-0613 152 | - gpt-3.5-turbo 153 | - gpt-3.5-turbo-1106 154 | - gpt-3.5-turbo-0613 155 | 156 | Semantic Kernel 也支持 Function Calling 的方式调用,但一般很少采用,除非你本来有为业务定制好的 Function Calling 方法 157 | 158 | 159 | #### **Semantic Kernel 定义函数插件** 160 | 161 | Semantic Kernel 定义函数插件,比起用 Function Calling 更为简单,而且在 Function Calling 诞生之前就有了相关的定义方法,不受模型限制,你可以在早期模型中就用上这种方式对 LLMs 进行知识和数据扩充。所有的自定义函数建议放置在 plugins 的同一文件夹下方便管理。 162 | 163 | **.NET 应用场景** 164 | 165 | 定义函数扩展,需要加上 KernelFunction 特性(Attribute)标注 166 | 167 | ```csharp 168 | 169 | [KernelFunction,Description("search weather")] 170 | public string WeatherSearch(string text) 171 | { 172 | return "Guangzhou, 2 degree,rainy"; 173 | } 174 | 175 | ``` 176 | 177 | ***注意:*** 建议用业务类封装的方式定义好不同的扩展函数,方便调用时更加简单,如 178 | 179 | 180 | ```csharp 181 | 182 | using Microsoft.SemanticKernel; 183 | using System.ComponentModel; 184 | using System.Globalization; 185 | 186 | 187 | public class CompanySearchPlugin 188 | { 189 | [KernelFunction,Description("search employee information")] 190 | public string 
EmployeeSearch(string input) 191 | { 192 | return "欢迎了解社保相关内容"; 193 | } 194 | 195 | [KernelFunction,Description("search weather")] 196 | public string WeatherSearch(string text) 197 | { 198 | return "Guangzhou, 2 degree,rainy"; 199 | } 200 | } 201 | 202 | ``` 203 | 204 | 调用方法如下: 205 | 206 | 207 | ```csharp 208 | 209 | var companySearchPluginObj = new CompanySearchPlugin(); 210 | 211 | var companySearchPlugin = kernel.ImportPluginFromObject(companySearchPluginObj, "CompanySearchPlugin"); 212 | 213 | var weatherContent = await kernel.InvokeAsync( companySearchPlugin["WeatherSearch"],new(){["text"] = "广州天气"}); 214 | 215 | weatherContent.GetValue<string>() 216 | 217 | 218 | ``` 219 | 220 | **Python 应用场景** 221 | 222 | 223 | 定义函数扩展,需要加上装饰器(decorator) 224 | 225 | ```python 226 | @kernel_function( 227 | description="Search Weather in a city", 228 | name="WeatherFunction" 229 | ) 230 | @kernel_function_context_parameter(name="city", description="city string") 231 | def ask_weather_function(self, context: SKContext) -> str: 232 | return context["city"] + "’s weather is 30 celsius degree , and very hot." 233 | 234 | ``` 235 | 236 | 把自定义函数都放置在 **native_function.py** 中,并通过类来定义,如 237 | 238 | 239 | ```python 240 | 241 | from semantic_kernel import SKContext 242 | from semantic_kernel.plugin_definition import kernel_function,kernel_function_context_parameter 243 | 244 | 245 | class CustomPlugin: 246 | @kernel_function( 247 | description = "Get news from the web", 248 | name = "NewsPlugin" 249 | ) 250 | @kernel_function_context_parameter(name="location", description="location name") 251 | def get_news_api(self, context: SKContext) -> str: 252 | return """Get news from the """ + context["location"] + """.""" 253 | 254 | @kernel_function( 255 | description="Search Weather in a city", 256 | name="WeatherFunction" 257 | ) 258 | @kernel_function_context_parameter(name="city", description="city string") 259 | def ask_weather_function(self, context: SKContext) -> str: 260 | return "Guangzhou’s weather is 30 celsius degree , and very hot." 
258 | 259 | @kernel_function( 260 | description="Search Docs", 261 | name="DocsFunction" 262 | ) 263 | @kernel_function_context_parameter(name="docs", description="docs string") 264 | def ask_docs_function(self, context: SKContext) -> str: 265 | return "ask docs :" + context["docs"] 266 | 267 | ``` 268 | 269 | 调用方法如下: 270 | 271 | 272 | ```python 273 | 274 | import APIPlugin.CustomPlugin as custom_plugin 275 | 276 | api_plugin = kernel.import_plugin(custom_plugin.CustomPlugin(), plugin_name="CustomPlugin") 277 | 278 | context_variables = sk.ContextVariables(variables={"city": "Guangzhou"}) 279 | 280 | news_result = await api_plugin["WeatherFunction"].invoke(variables=context_variables) 281 | 282 | print(news_result) 283 | 284 | ``` 285 | 286 | 287 | ### **内置插件** 288 | 289 | Semantic Kernel 有非常多的预定义插件,作为解决一般业务的相关能力,当然随着 Semantic Kernel 的成熟,会有更多的内置插件融入进来。 290 | 291 | ## **例子** 292 | 293 | ***.NET 例子*** 请[点击访问这里](https://github.com/microsoft/SemanticKernelCookBook/blob/main/notebooks/dotNET/03/PluginWithSK.ipynb) 294 | 295 | ***Python 例子*** 请[点击访问这里](https://github.com/microsoft/SemanticKernelCookBook/blob/main/notebooks/python/03/FunctionCallWithSK.ipynb) 296 | 297 | 298 | ## **小结** 299 | 300 | 通过插件你可以基于业务场景完成更多的工作,通过本章的学习,你已经初步掌握插件的定义以及如何使用插件进行工作。希望你在真正的业务工作中,基于业务场景,打造属于企业的插件库 301 | 302 | 303 | 304 | 305 | 306 | -------------------------------------------------------------------------------- /docs/cn/04.Planner.md: -------------------------------------------------------------------------------- 1 | # **第四章:Planner 组件 - 让大模型有规划地工作** 2 | 3 | 我们在第三章学习了 Semantic Kernel 非常重要的功能 - 插件,通过插件可以完成针对不同领域的工作。LLMs 改变了人机交互的方式,通过自然语言去与大模型对话,让大模型完成工作。但往往我们给出的指令不只是完成单一的工作,如“请根据西雅图的天气给出差的人群发一个穿衣提醒的邮件”。我们一直希望人工智能和人类作对比,那我们把上述指令用人的思维去思考,就会有如下拆分: 4 | 5 | 1. 查询西雅图天气 6 | 7 | 2. 从公司系统查询出差的人以及其联系方式 8 | 9 | 3. 穿衣提醒邮件模版 10 | 11 | 4. 
发送邮件 12 | 13 | LLMs 其实都有同样的思维,在 Semantic Kernel 中就有强大的 Planner 功能来做任务拆分的事情。本章就会和大家讲述相关的内容。 14 | 15 | ## **什么是 Planner** 16 | 17 | Planner 是 Semantic Kernel 的一个重要组件,通过它可以接收任务指令,然后与已经在 Kernel 中定义好的内置插件或者自定义插件做对应,让任务指令按部就班地工作。就如开篇所提及的内容,针对“请根据西雅图的天气给出差的人群发一个穿衣提醒的邮件”的指令,会先在 plugins 中定义相关的插件,并通过 Kernel 注册,之后 Semantic Kernel 就会协助您匹配执行步骤。 18 | 19 | ![planner](../../imgs/04/the-planner.png) 20 | 21 | ## **如何使用 Planner** 22 | 23 | 现在 Planner 还在进化阶段。在最新的 .NET 版本中,我们发现 Semantic Kernel 的 Planner 组件 Microsoft.SemanticKernel.Planners.Handlebars 的版本号与核心的 Microsoft.SemanticKernel 有所不同,有用户因此质疑 Semantic Kernel 的版本号是否混乱。你可以这样理解:2023 年 Semantic Kernel 团队完成了核心部分,至于组件化的功能如 Planner、Memory 以及部分 Connector 等都还在进化,毕竟这些功能和 LLMs 的发展有关。 24 | 25 | 如果您需要使用 Planner,需要考虑您的业务场景。例如在一些业务流程中加入 Planner,以及在工具链中加入 Planner,都是非常有用的,毕竟人类对工作自动化有很多的思考。 26 | 27 | **.NET 场景下** 28 | 29 | 需要添加关于 Planner 的相关组件库: 30 | 31 | ```csharp 32 | 33 | #r "nuget: Microsoft.SemanticKernel.Planners.Handlebars, *-*" 34 | 35 | ``` 36 | 37 | 引用库: 38 | 39 | 40 | ```csharp 41 | 42 | using Microsoft.SemanticKernel.Planning; 43 | 44 | ``` 45 | 46 | 47 | ***注意:*** 在采用 Handlebars Planner 时,你需要注意这些功能还在演变,使用时请忽略 SKEXP0060 警告: 48 | 49 | 50 | ```csharp 51 | 52 | #pragma warning disable SKEXP0060 53 | 54 | ``` 55 | 56 | 57 | 58 | **Python 场景下** 59 | 60 | 61 | 直接引用库: 62 | 63 | ```python 64 | 65 | from semantic_kernel.planning.basic_planner import BasicPlanner 66 | 67 | ``` 68 | 69 | 或 70 | 71 | 72 | ```python 73 | 74 | from semantic_kernel.planning.sequential_planner import SequentialPlanner 75 | 76 | 77 | ``` 78 | 79 | ***注意:*** 这里 Python 的 Planner 和 .NET 的 Planner 设定有所不同。Python 应该会和 .NET 同步,所以大家在 Python 中使用时,需要为以后的变更做好准备。 80 | 81 | 82 | ### **改变中的 Planner** 83 | 84 | 官方博客中提及了 Planner 的变化:https://devblogs.microsoft.com/semantic-kernel/migrating-from-the-sequential-and-stepwise-planners-to-the-new-handlebars-and-stepwise-planner/ 。该文结合 Function Calling 的调用方式,重新整理了预览版中不同 Planner 的整合,大家可以关注该内容来了解。 85 | 86 | 如果你希望了解 Planner 实现的原理,请参考: 87 | 88 |
https://github.com/microsoft/semantic-kernel/blob/main/dotnet/src/Planners/Planners.Handlebars/Handlebars/CreatePlanPrompt.handlebars 89 | 90 | 91 | ## **例子** 92 | 93 | ***.NET 例子*** 请[点击访问这里](https://github.com/microsoft/SemanticKernelCookBook/blob/main/notebooks/dotNET/04/PlannerWithSK.ipynb) 94 | 95 | ***Python 例子*** 请[点击访问这里](https://github.com/microsoft/SemanticKernelCookBook/blob/main/notebooks/python/04/PlannerWithSK.ipynb) 96 | 97 | 98 | ## **小结** 99 | 100 | Planner 的加入让 Semantic Kernel 的可用性大大提高,特别是针对业务和工具类的场景。构建企业级的插件库对于 Planner 的落地也非常重要,毕竟我们要通过插件组合出不同的任务来完成工作。 101 | 102 | 103 | -------------------------------------------------------------------------------- /docs/en/01.UsingAzureOpenAIServiceWithSDK.md: -------------------------------------------------------------------------------- 1 | # **Using Azure OpenAI Service With SDK** 2 | 3 | In the preface, we learned about LLMs. Now I want to talk about how to use them. Before learning Semantic Kernel, I would like you to see how to correctly access Azure OpenAI Service through the SDK. 4 | 5 | ## **Deploy the model in Azure OpenAI Studio** 6 | 7 | Deploying an Azure OpenAI model is very easy. After successfully applying for Azure OpenAI Service, you can deploy it by creating resources in the Azure Portal. Here are the steps: 8 | 9 | 1. Select Azure OpenAI to create resources in [Azure Portal](https://portal.azure.com/) 10 | 11 | ![aoairesource](../../imgs/01/aoairesource.png) 12 | 13 | After clicking 'Create', configure the region where Azure OpenAI is located. Please note: because resources are distributed differently, different regions offer different OpenAI models. Make sure you understand this clearly before choosing a region. 14 | 15 | ![aoaicreate](../../imgs/01/aoaicreate.png) 16 | 17 | Wait for a moment 18 | 19 | ![aoaidone](../../imgs/01/aoaidone.png) 20 | 21 | 2. 
Go to the created resources, you can deploy the model, and obtain the Key and Endpoint required when calling the SDK 22 | 23 | ![aoaikey](../../imgs/01/aoaikey.png) 24 | 25 | 3. Enter 'Model Deployment' and select 'Management Deployment' to enter Azure OpenAI Studio 26 | 27 | ![aoaigo](../../imgs/01/aoaigo.png) 28 | 29 | 4. Deploy your model in Azure OpenAI Studio 30 | 31 | ![aoaimodeldeploy](../../imgs/01/aoaimodeldeploy.png) 32 | 33 | Choose the model you need 34 | 35 | ![aoaimodel](../../imgs/01/aoaimodel.png) 36 | 37 | this is your model list 38 | 39 | ![aoaimodels](../../imgs/01/aoaimodels.png) 40 | 41 | 42 | Congratulations, you have successfully deployed the model. Now you can use the SDK to connect it. 43 | 44 | 45 | ## **Using SDK with Azure OpenAI Service** 46 | 47 | The SDK that interfaces with Azure OpenAI Service includes the SDK released by OpenAI for the Python version, and the SDK released by Microsoft for .NET. As a beginner, it is recommended to use it in a Notebook environment so that it is easier to understand the key steps of execution. 48 | 49 | 50 | ### **Python SDK** 51 | 52 | The official Python SDK released by OpenAI supports linking OpenAI and Azure OpenAI Service. Now OpenAI SDK has released version 1.x, but many people on the market are using version 0.2x. ***The content of this course will be based on OpenAI SDK version 1.x and use Python 3.10.x. *** 53 | 54 | 55 | ```python 56 | 57 | ! pip install openai -U 58 | 59 | ``` 60 | 61 | 62 | ### **.NET SDK** 63 | 64 | Microsoft releases an SDK based on Azure OpenAI Service. You can get the latest package through Nuget to complete .NET generative AI applications. ***The content of this course will be based on .NET 8 and the latest Azure.AI.OpenAI SDK to demonstrate examples. 
Of course, Polyglot Notebook will also be used as the environment*** 65 | 66 | 67 | ```csharp 68 | 69 | #r "nuget: Azure.AI.OpenAI, *-*" 70 | 71 | ``` 72 | 73 | We have configured the SDK environment for .NET / Python above. Next, we need to create the client and complete the related initialization work. 74 | 75 | Getting started with the .NET environment 76 | 77 | 78 | ```csharp 79 | 80 | string endpoint = "Your Azure OpenAI Service Endpoint"; 81 | string key = "Your Azure OpenAI Service Key"; 82 | 83 | OpenAIClient client = new(new Uri(endpoint), new AzureKeyCredential(key)); 84 | 85 | ``` 86 | 87 | Getting started with the Python environment 88 | 89 | 90 | ```python 91 | from openai import AzureOpenAI 92 | client = AzureOpenAI( 93 | azure_endpoint = 'Your Azure OpenAI Service Endpoint', 94 | api_key='Your Azure OpenAI Service Key', 95 | api_version="Your Azure OpenAI API version" 96 | ) 97 | 98 | ``` 99 | 100 | 101 | ## **Using SDK to call Azure OpenAI Service API** 102 | 103 | ### ***1. Completion API*** 104 | 105 | This is based on the gpt-35-turbo-instruct model and is a very important API for text completion. 106 | 107 | **Completion API with .NET** 108 | 109 | ```csharp 110 | 111 | CompletionsOptions completionsOptions = new() 112 | { 113 | DeploymentName = "gpt-35-turbo-instruct", 114 | Prompts = { "Can you introduce what is generative AI ?" }, 115 | }; 116 | 117 | Response<Completions> completionsResponse = client.GetCompletions(completionsOptions); 118 | 119 | string completion = completionsResponse.Value.Choices[0].Text; 120 | 121 | ``` 122 | 123 | **Completion API with Python** 124 | 125 | 126 | ```python 127 | 128 | start_phrase = 'Can you introduce what is generative AI ?' 129 | 130 | response = client.completions.create(model=deployment_name, prompt=start_phrase, max_tokens=1000) 131 | 132 | text = response.choices[0].text.strip() 133 | 134 | ``` 135 | 136 | ### ***2. 
Chat API*** 137 | 138 | This is an API based on the gpt-35-turbo and gpt-4 models for chat scenarios. 139 | 140 | **Chat with .NET** 141 | 142 | ```csharp 143 | 144 | var chatCompletionsOptions = new ChatCompletionsOptions() 145 | { 146 | DeploymentName = "gpt-4", 147 | Messages = 148 | { 149 | new ChatRequestSystemMessage("You are my coding assistant."), 150 | new ChatRequestUserMessage("Can you tell me how to write python flask application?"), 151 | }, 152 | MaxTokens = 10000 153 | }; 154 | 155 | Response<ChatCompletions> response = client.GetChatCompletions(chatCompletionsOptions); 156 | 157 | ``` 158 | 159 | **Chat with Python** 160 | 161 | ```python 162 | 163 | response = client.chat.completions.create( 164 | model="gpt-35-turbo", # model = "deployment_name". 165 | messages=[ 166 | {"role": "system", "content": "You are my coding assistant."}, 167 | {"role": "user", "content": "Can you tell me how to write python flask application?"} 168 | ] 169 | ) 170 | 171 | print(response.choices[0].message.content) 172 | 173 | ``` 174 | 175 | ### ***3. Generate images API*** 176 | 177 | Generating images based on the DALL·E 3 model 178 | 179 | **Generate images with .NET** 180 | 181 | 182 | ```csharp 183 | 184 | Response<ImageGenerations> imageGenerations = await client.GetImageGenerationsAsync( 185 | new ImageGenerationOptions() 186 | { 187 | DeploymentName = "Your Azure OpenAI Service Dall-E 3 model Deployment Name", 188 | Prompt = "Chinese New Year picture for the Year of the Dragon", 189 | Size = ImageSize.Size1024x1024, 190 | }); 191 | 192 | 193 | ``` 194 | 195 | 196 | **Generate images with Python** 197 | 198 | 199 | ```python 200 | import json 201 | result = client.images.generate( 202 | model="dalle3", 203 | prompt="Chinese New Year picture for the Year of the Dragon", 204 | n=1 205 | ) 206 | 207 | json_response = json.loads(result.model_dump_json()) 208 | 209 | ``` 210 | 211 | 212 | ### ***4. 
Embeddings API*** 213 | 214 | Based on the text-embedding-ada-002 model, this API converts text into vectors. 215 | 216 | 217 | **Embeddings with .NET** 218 | 219 | 220 | ```csharp 221 | 222 | EmbeddingsOptions embeddingOptions = new() 223 | { 224 | DeploymentName = "text-embedding-ada-002", 225 | Input = { "Kinfey is Microsoft Cloud Advocate" }, 226 | }; 227 | 228 | var returnValue = client.GetEmbeddings(embeddingOptions); 229 | 230 | foreach (float item in returnValue.Value.Data[0].Embedding.ToArray()) 231 | { 232 | Console.WriteLine(item); 233 | } 234 | 235 | ``` 236 | 237 | 238 | **Embeddings with Python** 239 | 240 | ```python 241 | 242 | client.embeddings.create(input = ['Kinfey is Microsoft Cloud Advocate'], model='text-embedding-ada-002').data[0].embedding 243 | 244 | ``` 245 | 246 | 247 | ## **Samples** 248 | 249 | Examples related to the above APIs are listed below: 250 | 251 | ### ***Python examples*** Please visit [Click here](https://github.com/microsoft/SemanticKernelCookBook/blob/main/notebooks/python/01/PythonSDKAOAIDemo.ipynb) 252 | 253 | ### ***.NET examples*** Please visit [Click here](https://github.com/microsoft/SemanticKernelCookBook/blob/main/notebooks/dotNET/01/dotNETSDKAOAIDemo.ipynb) 254 | 255 | ## **Summary** 256 | 257 | We used the most basic SDK to work with Azure OpenAI Service. This is also our first step towards generative AI programming. 
We can understand different interfaces more quickly without using a framework, and it also lays the foundation for us to enter Semantic Kernel. 258 | 259 | 260 | 261 | 262 | 263 | 264 | 265 | 266 | 267 | 268 | 269 | 270 | 271 | 272 | 273 | 274 | 275 | 276 | 277 | 278 | 279 | 280 | 281 | 282 | 283 | -------------------------------------------------------------------------------- /docs/en/02.IntroduceSemanticKernel.md: -------------------------------------------------------------------------------- 1 | # **Foundations of Semantic Kernel** 2 | 3 | We have already told you about LLMs and how to connect to Azure OpenAI Service using the .NET and Python SDKs. Next we will enter the world of Semantic Kernel. In 2023, many frameworks based on LLMs were born. Why should we choose Semantic Kernel? What are its advantages and disadvantages? And how do you use Semantic Kernel as a traditional developer? I will explain it in detail in this chapter. 4 | 5 | ## **What is Semantic Kernel** 6 | 7 | Semantic Kernel is a lightweight open source framework. Through Semantic Kernel, you can quickly use different programming languages (C#/Python/Java) combined with LLMs (OpenAI, Azure OpenAI, Hugging Face and other models) to build intelligent applications. Since the arrival of generative AI, the way humans and machines communicate has changed a lot: we can use natural language to converse with machines, and the barrier to entry has been lowered considerably. Combining prompt engineering and LLMs, we can complete different business tasks at a lower cost. But how do we introduce prompt engineering and LLMs into projects? We need a framework like Semantic Kernel as the basis for opening the door to intelligence. In May 2023, Microsoft CTO Kevin Scott proposed the concept of Copilot Stack, with artificial intelligence orchestration at its core. 
Semantic Kernel has the ability to combine LLMs with plug-ins composed of various prompts and software, so it is also regarded as a best practice of Copilot Stack. Through Semantic Kernel, you can easily build solutions based on Copilot Stack, and it can also be seamlessly connected to traditional projects. 8 | 9 | ### **Semantic Kernel vs LangChain** 10 | 11 | We cannot avoid making some objective comparisons. After all, LangChain has a larger user base. It is undeniable that LangChain currently has more features than Semantic Kernel in practical scenarios, especially in entry-level reference examples. Let’s make a more comprehensive comparison: 12 | 13 | ***LangChain*** is an open source framework based on Python and JavaScript that contains many prefabricated components. Developers can complete the development of intelligent applications without writing additional prompts. Especially in complex application scenarios, developers can quickly integrate multiple predefined components to complete the composition. From a development perspective, it is more suitable for developers with a foundation in data science or artificial intelligence. 14 | 15 | ***Semantic Kernel*** is an open source framework based on C#, Python, and Java. Its greater advantage lies in engineering; after all, it is more like a programming paradigm. Traditional developers can quickly master the framework for application development, and can better combine customized plug-ins and prompt engineering to complete the enterprise's customized business intelligence work. 16 | 17 | The two have a lot in common, and both are still iterating through versions. We need to make a choice based on team structure, technology stack, and application scenarios. 18 | 19 | ### **Features of Semantic Kernel** 20 | 21 | 1. **Powerful Plugins** - You can solve intelligent business problems by combining custom/predefined plugins. 
Let traditional code and intelligent plugins work together to flexibly connect to application scenarios, simplifying the process of transforming traditional applications into intelligent ones. 22 | 23 | 2. **Multi-model support** - Configure the "brain" for your intelligent application, which can come from Azure OpenAI Service, OpenAI, or various offline models on Hugging Face. Through connectors, you can quickly connect to different "brains" to make your application smarter. 24 | 25 | 3. **Various connectors** - In addition to linking the "brain", connectors can also link to vector databases, various business software, and different business middleware, making it possible for more business scenarios to enter the intelligent world. 26 | 27 | 4. **Convenient Development** - Simple and easy to use; developers can get started at almost no cost. 28 | 29 | ### **Disadvantages of Semantic Kernel** 30 | 31 | After all, LLMs are still developing, with many new models, new functions, and new concepts being introduced. Open source frameworks such as Semantic Kernel and LangChain are working hard to adapt to this new Moore's Law, but there will be uncertain changes across version iterations. Therefore, developers need to pay close attention to the change log on the corresponding GitHub repo. 32 | 33 | In addition, Semantic Kernel needs to take into account multiple programming languages, so progress is inconsistent across them, which also affects how people with different technology stacks choose Semantic Kernel. 34 | 35 | 36 | ### **Semantic Kernel's Kernel** 37 | 38 | If Semantic Kernel is regarded as a Copilot Stack best practice, then the Kernel is the center of AI orchestration, as mentioned in the official documentation. The Kernel can be linked with different plug-ins, services, logs and models. All Semantic Kernel applications start with the Kernel. 
39 | 40 | ![kernel](../../imgs/02/kernel.png) 41 | 42 | 43 | ## **Build a simple translation project with Semantic Kernel** 44 | 45 | After covering some basic knowledge, let's learn how to introduce Semantic Kernel into .NET and Python projects. We will complete a translation task in four steps. 46 | 47 | ### **Step 1: Import the Semantic Kernel library** 48 | 49 | 50 | **.NET** 51 | 52 | ***Note: We are using the latest Semantic Kernel 1.3.0 version here, and using Polyglot Notebook to complete related learning*** 53 | 54 | 55 | ```csharp 56 | 57 | #r "nuget: Microsoft.SemanticKernel, *-*" 58 | 59 | ``` 60 | 61 | 62 | **Python** 63 | 64 | 65 | ***Note: We are using the latest version of Semantic Kernel 0.5.1.dev here, and using Notebook to complete related learning*** 66 | 67 | 68 | ```python 69 | 70 | ! pip install semantic-kernel -U 71 | 72 | ``` 73 | 74 | 75 | ### **Step 2: Create Kernel object** 76 | 77 | Note here that we need to put the Endpoint, Key and Deployment Name of the Azure OpenAI Service model in a configuration file to facilitate calling and setup. 
In .NET environment I set it in Settings.cs environment, in Python environment I set it in .env environment 78 | 79 | **.NET** 80 | 81 | ```csharp 82 | 83 | using Microsoft.SemanticKernel; 84 | using Microsoft.SemanticKernel.Connectors.OpenAI; 85 | 86 | 87 | Kernel kernel = Kernel.CreateBuilder() 88 | .AddAzureOpenAIChatCompletion("Your Azure OpenAI Service Deployment Name" , "Your Azure OpenAI Service Endpoint", "Your Azure OpenAI Service API Key") 89 | .Build(); 90 | 91 | ``` 92 | 93 | **Python** 94 | 95 | ```python 96 | 97 | import semantic_kernel as sk 98 | import semantic_kernel.connectors.ai.open_ai as skaoai 99 | 100 | kernel = sk.Kernel() 101 | deployment, api_key, endpoint = sk.azure_openai_settings_from_dot_env() 102 | kernel.add_chat_service("azure_chat_competion_service", skaoai.AzureChatCompletion(deployment,endpoint,api_key=api_key,api_version = "2023-12-01-preview")) 103 | 104 | 105 | ``` 106 | ### **Step 3: Import plugins** 107 | 108 | In Semantic Kernel, we have different plugins. Users can use predefined plugins or custom plugins. If you want to know more, you can pay attention to the next chapter, where we will explain the use of plugins in detail. In this example, we are using a custom plug-in, which is already in the plugins directory. 
109 | 110 | 111 | **.NET** 112 | 113 | ```csharp 114 | 115 | var plugin = kernel.CreatePluginFromPromptDirectory(Path.Combine(pluginDirectory, "TranslatePlugin")); 116 | 117 | 118 | ``` 119 | 120 | 121 | **Python** 122 | 123 | ```python 124 | 125 | pluginFunc = kernel.import_semantic_plugin_from_directory(base_plugin,"TranslatePlugin") 126 | 127 | ``` 128 | 129 | 130 | ### **Step 4: Running** 131 | 132 | 133 | **.NET** 134 | 135 | ```csharp 136 | 137 | var transalteContent = await kernel.InvokeAsync( plugin["Basic"],new(){["input"] = "你好,我是你的 AI 编排助手 - Semantic Kernel"}); 138 | 139 | transalteContent.GetValue(); 140 | 141 | 142 | ``` 143 | 144 | 145 | **Python** 146 | 147 | ```python 148 | 149 | translateFunc = pluginFunc["Basic"] 150 | 151 | result = await translateFunc("你好,我是你的 AI 编排助手 - Semantic Kernel") 152 | 153 | 154 | ``` 155 | 156 | you can visit 157 | 158 | 159 | ***.NET Samples*** [Click here](https://github.com/microsoft/SemanticKernelCookBook/blob/main/notebooks/dotNET/02/LearnSK.ipynb) 160 | 161 | ***Python Samples*** [Click here](https://github.com/microsoft/SemanticKernelCookBook/blob/main/notebooks/python/02/LearnSK.ipynb) 162 | 163 | 164 | ## **Summary** 165 | 166 | How did you feel when you first came into contact with Semantic Kernel? In this chapter, you learned the basics of Semantic Kernel and related knowledge. And understanding the comparison between Semantic Kernel and LangChain, as well as the related advantages and disadvantages, will be helpful to you in the process of project implementation. We have also completed a translation through Semantic Kernel in four steps, which is a hello world in the era of large models, giving you confidence when getting started. Next, you can open the next chapter for advanced learning to learn more about Semantic Kernel. 
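To make the mechanics of Steps 3 and 4 concrete, here is a minimal, self-contained sketch of what a prompt-directory plugin conceptually does. This is not the real Semantic Kernel API: `render_prompt` and `fake_llm` are illustrative stand-ins, and the template string is an assumed, simplified version of what a `skprompt.txt` file for the Basic translate function might contain.

```python
# Illustrative sketch only -- NOT the real Semantic Kernel API.
# It mimics what a prompt-directory plugin (skprompt.txt + config.json)
# conceptually does: render a template, then hand the prompt to a model.

def render_prompt(template: str, variables: dict) -> str:
    """Substitute {{$name}} placeholders, as skprompt.txt templates do."""
    for name, value in variables.items():
        template = template.replace("{{$" + name + "}}", value)
    return template

# A stand-in for the LLM call the kernel would make (assumption: any
# chat-completion backend could sit behind this -- Azure OpenAI, OpenAI, etc.).
def fake_llm(prompt: str) -> str:
    return "LLM OUTPUT for: " + prompt

# Assumed skprompt.txt content of a minimal "Basic" translate function
basic_template = "Translate the following text into English: {{$input}}"

prompt = render_prompt(basic_template, {"input": "你好,我是你的 AI 编排助手 - Semantic Kernel"})
result = fake_llm(prompt)
print(prompt)
```

The real kernel does the same two things for you: `CreatePluginFromPromptDirectory` / `import_semantic_plugin_from_directory` load and render the template, and `InvokeAsync` sends the rendered prompt to the configured model.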
167 | 168 | 169 | 170 | 171 | 172 | 173 | 174 | 175 | 176 | -------------------------------------------------------------------------------- /docs/en/04.Planner.md: -------------------------------------------------------------------------------- 1 | # **Planner - Let LLMs Work with a Plan** 2 | 3 | In the last chapter, we learned about a very important function of Semantic Kernel - plug-ins, through which work in different fields can be completed. LLMs have changed human-computer interaction: we use natural language to talk to them and let them complete the work. But often the instructions we give are not about completing a single task, such as "Please send a dressing reminder email to people on business trips based on the weather in Seattle." We have always wanted to compare artificial intelligence with humans. If we think about the above instruction the way a human would, we get the following split: 4 | 5 | 1. Check the weather in Seattle 6 | 7 | 2. Query the business travelers and their contact information from the company system 8 | 9 | 3. Prepare the dressing reminder email template 10 | 11 | 4. Send the email 12 | 13 | LLMs can actually follow the same thinking. Semantic Kernel provides a powerful Planner component to split tasks like this. This chapter will tell you the relevant content. 14 | 15 | ## **What's Planner** 16 | 17 | Planner is an important component of Semantic Kernel. It receives a task instruction and then maps it to the built-in or custom plug-ins that have been defined in the Kernel, so that the instruction can be carried out step by step. As mentioned at the beginning, for the instruction "please send a dressing reminder email to business travelers based on the weather in Seattle", the relevant plugins are first defined in the plugins directory and registered through the Kernel; Semantic Kernel then assists you in matching the steps. 
18 | 19 | ![planner](../../imgs/04/the-planner.png) 20 | 21 | ## **How to use Planner** 22 | 23 | Planner is still evolving. In the latest .NET version, we found that the version of the Planner component Microsoft.SemanticKernel.Planners.Handlebars differs from that of the core Microsoft.SemanticKernel package. Some users have questioned whether the Semantic Kernel version numbering is confusing. You can understand it this way: the Semantic Kernel team completed the core part in 2023, while component-based functions such as Planner, Memory, and some Connectors are still evolving. After all, these features are tied to the development of LLMs. 24 | 25 | If you need to use Planner, you need to consider your business scenario. For example, adding Planner to some business processes and to tool chains is very useful; after all, humans think a lot about work automation. 26 | 27 | **.NET** 28 | 29 | Add the related component library for Planner: 30 | 31 | ```csharp 32 | 33 | #r "nuget: Microsoft.SemanticKernel.Planners.Handlebars, *-*" 34 | 35 | ``` 36 | 37 | Import the library: 38 | 39 | 40 | ```csharp 41 | 42 | using Microsoft.SemanticKernel.Planning; 43 | 44 | ``` 45 | 46 | ***Note:*** When using the Handlebars Planner, keep in mind that these features are still evolving, and suppress the SKEXP0060 warning when using them: 47 | 48 | 49 | ```csharp 50 | 51 | #pragma warning disable SKEXP0060 52 | 53 | ``` 54 | 55 | 56 | 57 | **Python** 58 | 59 | 60 | Import the library: 61 | 62 | ```python 63 | 64 | from semantic_kernel.planning.basic_planner import BasicPlanner 65 | 66 | ``` 67 | 68 | or 69 | 70 | 71 | ```python 72 | 73 | from semantic_kernel.planning.sequential_planner import SequentialPlanner 74 | 75 | 76 | ``` 77 | 78 | ***Note:*** Python's Planner API currently differs from .NET's. Python should eventually be synchronized with .NET, so be prepared for future changes when using Python. 
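Conceptually, a planner matches a goal against the descriptions of the registered plugin functions and returns an ordered plan that the kernel then executes. The toy sketch below imitates that idea in plain Python. It is not the real Planner API: the real Planner asks the LLM to do the matching via a prompt (such as CreatePlanPrompt.handlebars), while here a naive keyword match stands in for the model, and all plugin names are made up for illustration.

```python
# Conceptual sketch, NOT the real Semantic Kernel Planner API.
# A planner maps a natural-language goal onto registered plugin functions.

# Hypothetical catalog of registered functions: name -> description.
plugins = {
    "WeatherPlugin.Check": "check the weather in a city",
    "HRPlugin.Travelers":  "query business travelers and contacts",
    "MailPlugin.Send":     "send a reminder email",
}

def make_plan(goal: str, catalog: dict) -> list:
    """Toy 'planner': pick functions whose description keywords appear in the goal.
    The real Planner delegates this matching to the LLM via a plan prompt."""
    steps = []
    for name, description in catalog.items():
        if any(word in goal.lower() for word in description.split() if len(word) > 4):
            steps.append(name)
    return steps

plan = make_plan(
    "Check the Seattle weather and send a reminder email to business travelers",
    plugins,
)
print(plan)  # an ordered list of plugin functions to invoke
```

The key design point survives the simplification: the quality of a plan depends on how well each plugin function is described, which is why the `Description` attributes and `config.json` descriptions shown in the previous chapter matter so much.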
79 | 80 | 81 | ### **The Changing Planner** 82 | 83 | The official blog post https://devblogs.microsoft.com/semantic-kernel/migrating-from-the-sequential-and-stepwise-planners-to-the-new-handlebars-and-stepwise-planner/ describes the changes in Planner: combined with Function Calling, the different Planner integrations from the preview versions have been reorganized. You can read it to learn more. 84 | 85 | If you want to understand the principles behind the Planner implementation, please refer to: 86 | 87 | https://github.com/microsoft/semantic-kernel/blob/main/dotnet/src/Planners/Planners.Handlebars/Handlebars/CreatePlanPrompt.handlebars 88 | 89 | ## **Sample** 90 | 91 | ***.NET Sample*** Please [click here](https://github.com/microsoft/SemanticKernelCookBook/blob/main/notebooks/dotNET/04/PlannerWithSK.ipynb) 92 | 93 | ***Python Sample*** Please [click here](https://github.com/microsoft/SemanticKernelCookBook/blob/main/notebooks/python/04/PlannerWithSK.ipynb) 94 | 95 | 96 | ## **Summary** 97 | 98 | The addition of Planner greatly improves the usability of Semantic Kernel, especially for business and tooling scenarios. Building an enterprise-level plugin library is also very important for putting Planner into practice; after all, we use plug-ins to compose different tasks to complete the work. 
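Once a plan exists, executing it means invoking the chosen plugin functions in order and feeding each output into the next step. The sketch below walks the chapter's Seattle example through that pipeline. All three functions are hypothetical stubs invented for illustration (including the fake email address); in a real application they would be registered kernel plugins backed by a weather API, an HR system, and a mail service.

```python
# Hypothetical plugin functions for the chapter's example goal; in a real
# kernel these would be registered plugins, and Planner would chain them.

def check_weather(city: str) -> str:
    return f"{city}: 2 degrees, rainy"        # stub weather lookup

def find_travelers() -> list:
    return ["traveler@example.com"]           # stub HR lookup (fake address)

def build_mail(weather: str) -> str:
    return f"Dressing reminder: {weather}. Please pack warm clothes."

# Executing the plan step by step, feeding each output into the next step:
weather = check_weather("Seattle")
recipients = find_travelers()
mail_body = build_mail(weather)
print(recipients, mail_body)
```

The point of the sketch is the data flow: each plan step consumes the previous step's output, which is exactly what Planner orchestrates for you instead of you writing this chaining by hand.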
99 | 100 | -------------------------------------------------------------------------------- /imgs/00/aoaistudio.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/microsoft/SemanticKernelCookBook/53a5029acc071db8a45e9b0c3df8e4ad06139300/imgs/00/aoaistudio.png -------------------------------------------------------------------------------- /imgs/00/aoaistudio_huggingface.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/microsoft/SemanticKernelCookBook/53a5029acc071db8a45e9b0c3df8e4ad06139300/imgs/00/aoaistudio_huggingface.png -------------------------------------------------------------------------------- /imgs/00/msandopenai.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/microsoft/SemanticKernelCookBook/53a5029acc071db8a45e9b0c3df8e4ad06139300/imgs/00/msandopenai.png -------------------------------------------------------------------------------- /imgs/001/logo.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/microsoft/SemanticKernelCookBook/53a5029acc071db8a45e9b0c3df8e4ad06139300/imgs/001/logo.png -------------------------------------------------------------------------------- /imgs/001/skpatternlarge.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/microsoft/SemanticKernelCookBook/53a5029acc071db8a45e9b0c3df8e4ad06139300/imgs/001/skpatternlarge.png -------------------------------------------------------------------------------- /imgs/002/cross-platform-plugins.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/microsoft/SemanticKernelCookBook/53a5029acc071db8a45e9b0c3df8e4ad06139300/imgs/002/cross-platform-plugins.png 
-------------------------------------------------------------------------------- /imgs/003/the-planner.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/microsoft/SemanticKernelCookBook/53a5029acc071db8a45e9b0c3df8e4ad06139300/imgs/003/the-planner.png -------------------------------------------------------------------------------- /imgs/01/aoaicreate.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/microsoft/SemanticKernelCookBook/53a5029acc071db8a45e9b0c3df8e4ad06139300/imgs/01/aoaicreate.png -------------------------------------------------------------------------------- /imgs/01/aoaidone.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/microsoft/SemanticKernelCookBook/53a5029acc071db8a45e9b0c3df8e4ad06139300/imgs/01/aoaidone.png -------------------------------------------------------------------------------- /imgs/01/aoaigo.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/microsoft/SemanticKernelCookBook/53a5029acc071db8a45e9b0c3df8e4ad06139300/imgs/01/aoaigo.png -------------------------------------------------------------------------------- /imgs/01/aoaikey.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/microsoft/SemanticKernelCookBook/53a5029acc071db8a45e9b0c3df8e4ad06139300/imgs/01/aoaikey.png -------------------------------------------------------------------------------- /imgs/01/aoaimodel.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/microsoft/SemanticKernelCookBook/53a5029acc071db8a45e9b0c3df8e4ad06139300/imgs/01/aoaimodel.png -------------------------------------------------------------------------------- 
/imgs/01/aoaimodeldeploy.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/microsoft/SemanticKernelCookBook/53a5029acc071db8a45e9b0c3df8e4ad06139300/imgs/01/aoaimodeldeploy.png -------------------------------------------------------------------------------- /imgs/01/aoaimodels.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/microsoft/SemanticKernelCookBook/53a5029acc071db8a45e9b0c3df8e4ad06139300/imgs/01/aoaimodels.png -------------------------------------------------------------------------------- /imgs/01/aoairesource.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/microsoft/SemanticKernelCookBook/53a5029acc071db8a45e9b0c3df8e4ad06139300/imgs/01/aoairesource.png -------------------------------------------------------------------------------- /imgs/02/kernel.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/microsoft/SemanticKernelCookBook/53a5029acc071db8a45e9b0c3df8e4ad06139300/imgs/02/kernel.png -------------------------------------------------------------------------------- /imgs/03/introplugins.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/microsoft/SemanticKernelCookBook/53a5029acc071db8a45e9b0c3df8e4ad06139300/imgs/03/introplugins.png -------------------------------------------------------------------------------- /imgs/04/the-planner.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/microsoft/SemanticKernelCookBook/53a5029acc071db8a45e9b0c3df8e4ad06139300/imgs/04/the-planner.png -------------------------------------------------------------------------------- /imgs/05/GPT.png: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/microsoft/SemanticKernelCookBook/53a5029acc071db8a45e9b0c3df8e4ad06139300/imgs/05/GPT.png -------------------------------------------------------------------------------- /imgs/cover.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/microsoft/SemanticKernelCookBook/53a5029acc071db8a45e9b0c3df8e4ad06139300/imgs/cover.png -------------------------------------------------------------------------------- /notebooks/dotNET/02/LearnSK.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 1, 6 | "metadata": { 7 | "dotnet_interactive": { 8 | "language": "csharp" 9 | }, 10 | "polyglot_notebook": { 11 | "kernelName": "csharp" 12 | }, 13 | "vscode": { 14 | "languageId": "polyglot-notebook" 15 | } 16 | }, 17 | "outputs": [ 18 | { 19 | "data": { 20 | "text/html": [ 21 | "
Installed Packages
  • Microsoft.SemanticKernel, 1.16.2
" 22 | ] 23 | }, 24 | "metadata": {}, 25 | "output_type": "display_data" 26 | } 27 | ], 28 | "source": [ 29 | "#r \"nuget: Microsoft.SemanticKernel, *-*\"" 30 | ] 31 | }, 32 | { 33 | "cell_type": "code", 34 | "execution_count": 3, 35 | "metadata": { 36 | "dotnet_interactive": { 37 | "language": "csharp" 38 | }, 39 | "polyglot_notebook": { 40 | "kernelName": "csharp" 41 | }, 42 | "vscode": { 43 | "languageId": "polyglot-notebook" 44 | } 45 | }, 46 | "outputs": [], 47 | "source": [ 48 | "using Microsoft.SemanticKernel;\n", 49 | "using Microsoft.SemanticKernel.Connectors.OpenAI;" 50 | ] 51 | }, 52 | { 53 | "cell_type": "code", 54 | "execution_count": 4, 55 | "metadata": { 56 | "dotnet_interactive": { 57 | "language": "csharp" 58 | }, 59 | "polyglot_notebook": { 60 | "kernelName": "csharp" 61 | }, 62 | "vscode": { 63 | "languageId": "polyglot-notebook" 64 | } 65 | }, 66 | "outputs": [], 67 | "source": [ 68 | "using System;\n", 69 | "using System.Collections.Generic;\n", 70 | "using System.Diagnostics;\n", 71 | "using System.IO;\n", 72 | "using System.Net.Http;\n", 73 | "using System.Text.Json;\n", 74 | "using System.Threading.Tasks;" 75 | ] 76 | }, 77 | { 78 | "cell_type": "code", 79 | "execution_count": 5, 80 | "metadata": { 81 | "dotnet_interactive": { 82 | "language": "csharp" 83 | }, 84 | "polyglot_notebook": { 85 | "kernelName": "csharp" 86 | }, 87 | "vscode": { 88 | "languageId": "polyglot-notebook" 89 | } 90 | }, 91 | "outputs": [], 92 | "source": [ 93 | "Kernel kernel = Kernel.CreateBuilder()\n", 94 | " .AddAzureOpenAIChatCompletion(\"Your Azure OpenAI Service gpt-35-turbo Deployment Name\" , \"Your Azure OpenAI Service Endpoint\", \"Your Azure OpenAI Service API Key\")\n", 95 | " .Build();" 96 | ] 97 | }, 98 | { 99 | "cell_type": "code", 100 | "execution_count": 6, 101 | "metadata": { 102 | "dotnet_interactive": { 103 | "language": "csharp" 104 | }, 105 | "polyglot_notebook": { 106 | "kernelName": "csharp" 107 | }, 108 | "vscode": { 109 | "languageId": 
"polyglot-notebook" 110 | } 111 | }, 112 | "outputs": [], 113 | "source": [ 114 | "var pluginDirectory = Path.Combine(\"../../..\", \"plugins\");" 115 | ] 116 | }, 117 | { 118 | "cell_type": "code", 119 | "execution_count": 7, 120 | "metadata": { 121 | "dotnet_interactive": { 122 | "language": "csharp" 123 | }, 124 | "polyglot_notebook": { 125 | "kernelName": "csharp" 126 | }, 127 | "vscode": { 128 | "languageId": "polyglot-notebook" 129 | } 130 | }, 131 | "outputs": [], 132 | "source": [ 133 | "var plugin = kernel.CreatePluginFromPromptDirectory(Path.Combine(pluginDirectory, \"TranslatePlugin\"));" 134 | ] 135 | }, 136 | { 137 | "cell_type": "code", 138 | "execution_count": 8, 139 | "metadata": { 140 | "dotnet_interactive": { 141 | "language": "csharp" 142 | }, 143 | "polyglot_notebook": { 144 | "kernelName": "csharp" 145 | }, 146 | "vscode": { 147 | "languageId": "polyglot-notebook" 148 | } 149 | }, 150 | "outputs": [], 151 | "source": [ 152 | "var translateContent = await kernel.InvokeAsync(plugin[\"Basic\"], new(){[\"input\"] = \"你好,我是你的 AI 编排助手 - Semantic Kernel\"});" 153 | ] 154 | }, 155 | { 156 | "cell_type": "code", 157 | "execution_count": 9, 158 | "metadata": { 159 | "dotnet_interactive": { 160 | "language": "csharp" 161 | }, 162 | "polyglot_notebook": { 163 | "kernelName": "csharp" 164 | }, 165 | "vscode": { 166 | "languageId": "polyglot-notebook" 167 | } 168 | }, 169 | "outputs": [ 170 | { 171 | "data": { 172 | "text/plain": [ 173 | "Hello, I am your AI orchestration assistant - Semantic Kernel."
174 | ] 175 | }, 176 | "metadata": {}, 177 | "output_type": "display_data" 178 | } 179 | ], 180 | "source": [ 181 | "translateContent.GetValue()" 182 | ] 183 | } 184 | ], 185 | "metadata": { 186 | "kernelspec": { 187 | "display_name": ".NET (C#)", 188 | "language": "C#", 189 | "name": ".net-csharp" 190 | }, 191 | "language_info": { 192 | "name": "python" 193 | }, 194 | "polyglot_notebook": { 195 | "kernelInfo": { 196 | "defaultKernelName": "csharp", 197 | "items": [ 198 | { 199 | "aliases": [], 200 | "name": "csharp" 201 | } 202 | ] 203 | } 204 | } 205 | }, 206 | "nbformat": 4, 207 | "nbformat_minor": 2 208 | } 209 | -------------------------------------------------------------------------------- /notebooks/dotNET/04/PlannerWithSK.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": null, 6 | "metadata": { 7 | "dotnet_interactive": { 8 | "language": "csharp" 9 | }, 10 | "polyglot_notebook": { 11 | "kernelName": "csharp" 12 | }, 13 | "vscode": { 14 | "languageId": "polyglot-notebook" 15 | } 16 | }, 17 | "outputs": [], 18 | "source": [ 19 | "#r \"nuget: Microsoft.SemanticKernel, *-*\"\n", 20 | "#r \"nuget: Microsoft.SemanticKernel.Planners.Handlebars, *-*\"" 21 | ] 22 | }, 23 | { 24 | "cell_type": "code", 25 | "execution_count": null, 26 | "metadata": { 27 | "dotnet_interactive": { 28 | "language": "csharp" 29 | }, 30 | "polyglot_notebook": { 31 | "kernelName": "csharp" 32 | }, 33 | "vscode": { 34 | "languageId": "polyglot-notebook" 35 | } 36 | }, 37 | "outputs": [], 38 | "source": [ 39 | "using Microsoft.SemanticKernel;\n", 40 | "using Microsoft.SemanticKernel.Memory;\n", 41 | "using Microsoft.SemanticKernel.Connectors.OpenAI;\n", 42 | "using Microsoft.SemanticKernel.Planning;\n" 43 | ] 44 | }, 45 | { 46 | "cell_type": "code", 47 | "execution_count": null, 48 | "metadata": { 49 | "dotnet_interactive": { 50 | "language": "csharp" 51 | }, 52 |
"polyglot_notebook": { 53 | "kernelName": "csharp" 54 | }, 55 | "vscode": { 56 | "languageId": "polyglot-notebook" 57 | } 58 | }, 59 | "outputs": [], 60 | "source": [ 61 | "#!import ../../../plugins/CustomPlugin/CompanySearchPlugin.cs" 62 | ] 63 | }, 64 | { 65 | "cell_type": "code", 66 | "execution_count": null, 67 | "metadata": { 68 | "dotnet_interactive": { 69 | "language": "csharp" 70 | }, 71 | "polyglot_notebook": { 72 | "kernelName": "csharp" 73 | }, 74 | "vscode": { 75 | "languageId": "polyglot-notebook" 76 | } 77 | }, 78 | "outputs": [], 79 | "source": [ 80 | "using System;\n", 81 | "using System.Collections.Generic;\n", 82 | "using System.Diagnostics;\n", 83 | "using System.IO;\n", 84 | "using System.Net.Http;\n", 85 | "using System.Text.Json;\n", 86 | "using System.Threading.Tasks;" 87 | ] 88 | }, 89 | { 90 | "cell_type": "code", 91 | "execution_count": null, 92 | "metadata": { 93 | "dotnet_interactive": { 94 | "language": "csharp" 95 | }, 96 | "polyglot_notebook": { 97 | "kernelName": "csharp" 98 | }, 99 | "vscode": { 100 | "languageId": "polyglot-notebook" 101 | } 102 | }, 103 | "outputs": [], 104 | "source": [ 105 | "Kernel kernel = Kernel.CreateBuilder()\n", 106 | " .AddAzureOpenAIChatCompletion(\"Your Azure OpenAI Service gpt-35-turbo-16k Deployment Name\", \"Your Azure OpenAI Service Endpoint\", \"Your Azure OpenAI Service API Key\")\n", 107 | " .Build();" 108 | ] 109 | }, 110 | { 111 | "cell_type": "code", 112 | "execution_count": null, 113 | "metadata": { 114 | "dotnet_interactive": { 115 | "language": "csharp" 116 | }, 117 | "polyglot_notebook": { 118 | "kernelName": "csharp" 119 | }, 120 | "vscode": { 121 | "languageId": "polyglot-notebook" 122 | } 123 | }, 124 | "outputs": [], 125 | "source": [ 126 | "var companySearchPluginObj = new CompanySearchPlugin();\n", 127 | "var companySearchPlugin = kernel.ImportPluginFromObject(companySearchPluginObj, \"CompanySearchPlugin\");" 128 | ] 129 | }, 130 | { 131 | "cell_type": "code", 132 |
"execution_count": null, 133 | "metadata": { 134 | "dotnet_interactive": { 135 | "language": "csharp" 136 | }, 137 | "polyglot_notebook": { 138 | "kernelName": "csharp" 139 | }, 140 | "vscode": { 141 | "languageId": "polyglot-notebook" 142 | } 143 | }, 144 | "outputs": [], 145 | "source": [ 146 | "var pluginDirectory = Path.Combine(\"../../..\", \"plugins\");" 147 | ] 148 | }, 149 | { 150 | "cell_type": "code", 151 | "execution_count": null, 152 | "metadata": { 153 | "dotnet_interactive": { 154 | "language": "csharp" 155 | }, 156 | "polyglot_notebook": { 157 | "kernelName": "csharp" 158 | }, 159 | "vscode": { 160 | "languageId": "polyglot-notebook" 161 | } 162 | }, 163 | "outputs": [], 164 | "source": [ 165 | "var writerPlugin = kernel.ImportPluginFromPromptDirectory(Path.Combine(pluginDirectory, \"WriterPlugin\"));\n", 166 | "var emailPlugin = kernel.ImportPluginFromPromptDirectory(Path.Combine(pluginDirectory, \"EmailPlugin\"));\n", 167 | "var translatePlugin = kernel.ImportPluginFromPromptDirectory(Path.Combine(pluginDirectory, \"TranslatePlugin\"));" 168 | ] 169 | }, 170 | { 171 | "cell_type": "code", 172 | "execution_count": null, 173 | "metadata": { 174 | "dotnet_interactive": { 175 | "language": "csharp" 176 | }, 177 | "polyglot_notebook": { 178 | "kernelName": "csharp" 179 | }, 180 | "vscode": { 181 | "languageId": "polyglot-notebook" 182 | } 183 | }, 184 | "outputs": [], 185 | "source": [ 186 | "string goal = \"\"\"\n", 187 | "Check the weather in Guangzhou, then use Spanish to write an email about dressing tips based on the results\n", 188 | "\"\"\";" 189 | ] 190 | }, 191 | { 192 | "cell_type": "code", 193 | "execution_count": null, 194 | "metadata": { 195 | "dotnet_interactive": { 196 | "language": "csharp" 197 | }, 198 | "polyglot_notebook": { 199 | "kernelName": "csharp" 200 | }, 201 | "vscode": { 202 | "languageId": "polyglot-notebook" 203 | } 204 | }, 205 | "outputs": [], 206 | "source": [ 207 | "using Microsoft.SemanticKernel.Planning.Handlebars;\n", 208
| "\n", 209 | "#pragma warning disable SKEXP0060\n", 210 | "\n", 211 | "var planner = new HandlebarsPlanner();" 212 | ] 213 | }, 214 | { 215 | "cell_type": "code", 216 | "execution_count": null, 217 | "metadata": { 218 | "dotnet_interactive": { 219 | "language": "csharp" 220 | }, 221 | "polyglot_notebook": { 222 | "kernelName": "csharp" 223 | }, 224 | "vscode": { 225 | "languageId": "polyglot-notebook" 226 | } 227 | }, 228 | "outputs": [], 229 | "source": [ 230 | "#pragma warning disable SKEXP0060\n", 231 | "\n", 232 | "var originalPlan = await planner.CreatePlanAsync(kernel, goal);" 233 | ] 234 | }, 235 | { 236 | "cell_type": "code", 237 | "execution_count": null, 238 | "metadata": { 239 | "dotnet_interactive": { 240 | "language": "csharp" 241 | }, 242 | "polyglot_notebook": { 243 | "kernelName": "csharp" 244 | }, 245 | "vscode": { 246 | "languageId": "polyglot-notebook" 247 | } 248 | }, 249 | "outputs": [], 250 | "source": [ 251 | "Console.WriteLine(originalPlan);" 252 | ] 253 | }, 254 | { 255 | "cell_type": "code", 256 | "execution_count": null, 257 | "metadata": { 258 | "dotnet_interactive": { 259 | "language": "csharp" 260 | }, 261 | "polyglot_notebook": { 262 | "kernelName": "csharp" 263 | }, 264 | "vscode": { 265 | "languageId": "polyglot-notebook" 266 | } 267 | }, 268 | "outputs": [], 269 | "source": [ 270 | "originalPlan.ToString()" 271 | ] 272 | }, 273 | { 274 | "cell_type": "code", 275 | "execution_count": null, 276 | "metadata": { 277 | "dotnet_interactive": { 278 | "language": "csharp" 279 | }, 280 | "polyglot_notebook": { 281 | "kernelName": "csharp" 282 | }, 283 | "vscode": { 284 | "languageId": "polyglot-notebook" 285 | } 286 | }, 287 | "outputs": [], 288 | "source": [ 289 | "#pragma warning disable SKEXP0060\n", 290 | "var originalPlanResult = await originalPlan.InvokeAsync(kernel, new KernelArguments());" 291 | ] 292 | }, 293 | { 294 | "cell_type": "code", 295 | "execution_count": null, 296 | "metadata": { 297 | "dotnet_interactive": { 298 | 
"language": "csharp" 299 | }, 300 | "polyglot_notebook": { 301 | "kernelName": "csharp" 302 | }, 303 | "vscode": { 304 | "languageId": "polyglot-notebook" 305 | } 306 | }, 307 | "outputs": [], 308 | "source": [ 309 | "Console.WriteLine(originalPlanResult);" 310 | ] 311 | } 312 | ], 313 | "metadata": { 314 | "kernelspec": { 315 | "display_name": ".NET (C#)", 316 | "language": "C#", 317 | "name": ".net-csharp" 318 | }, 319 | "language_info": { 320 | "codemirror_mode": { 321 | "name": "ipython", 322 | "version": 3 323 | }, 324 | "file_extension": ".py", 325 | "mimetype": "text/x-python", 326 | "name": "python", 327 | "nbconvert_exporter": "python", 328 | "pygments_lexer": "ipython3", 329 | "version": "3.10.12" 330 | }, 331 | "polyglot_notebook": { 332 | "kernelInfo": { 333 | "defaultKernelName": "csharp", 334 | "items": [ 335 | { 336 | "aliases": [], 337 | "name": "csharp" 338 | } 339 | ] 340 | } 341 | } 342 | }, 343 | "nbformat": 4, 344 | "nbformat_minor": 2 345 | } 346 | -------------------------------------------------------------------------------- /notebooks/dotNET/05/EmbeddingsWithSK.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 1, 6 | "metadata": { 7 | "dotnet_interactive": { 8 | "language": "csharp" 9 | }, 10 | "polyglot_notebook": { 11 | "kernelName": "csharp" 12 | }, 13 | "vscode": { 14 | "languageId": "polyglot-notebook" 15 | } 16 | }, 17 | "outputs": [ 18 | { 19 | "data": { 20 | "text/html": [ 21 | "
Installed Packages
  • Microsoft.SemanticKernel, 1.16.2
  • Microsoft.SemanticKernel.Connectors.Qdrant, 1.16.2-alpha
" 22 | ] 23 | }, 24 | "metadata": {}, 25 | "output_type": "display_data" 26 | } 27 | ], 28 | "source": [ 29 | "#r \"nuget: Microsoft.SemanticKernel, *-*\"\n", 30 | "#r \"nuget: Microsoft.SemanticKernel.Connectors.Qdrant, *-*\"" 31 | ] 32 | }, 33 | { 34 | "cell_type": "code", 35 | "execution_count": 2, 36 | "metadata": { 37 | "dotnet_interactive": { 38 | "language": "csharp" 39 | }, 40 | "polyglot_notebook": { 41 | "kernelName": "csharp" 42 | }, 43 | "vscode": { 44 | "languageId": "polyglot-notebook" 45 | } 46 | }, 47 | "outputs": [], 48 | "source": [ 49 | "using Microsoft.SemanticKernel;\n", 50 | "using Microsoft.SemanticKernel.Embeddings;\n", 51 | "using Microsoft.SemanticKernel.Memory;\n", 52 | "using Microsoft.SemanticKernel.Connectors.Qdrant;\n", 53 | "using Microsoft.SemanticKernel.Connectors.OpenAI;" 54 | ] 55 | }, 56 | { 57 | "cell_type": "code", 58 | "execution_count": 3, 59 | "metadata": { 60 | "dotnet_interactive": { 61 | "language": "csharp" 62 | }, 63 | "polyglot_notebook": { 64 | "kernelName": "csharp" 65 | }, 66 | "vscode": { 67 | "languageId": "polyglot-notebook" 68 | } 69 | }, 70 | "outputs": [], 71 | "source": [ 72 | "using System;\n", 73 | "using System.Collections.Generic;\n", 74 | "using System.Diagnostics;\n", 75 | "using System.IO;\n", 76 | "using System.Net.Http;\n", 77 | "using System.Text.Json;\n", 78 | "using System.Threading.Tasks;" 79 | ] 80 | }, 81 | { 82 | "cell_type": "code", 83 | "execution_count": 4, 84 | "metadata": { 85 | "dotnet_interactive": { 86 | "language": "csharp" 87 | }, 88 | "polyglot_notebook": { 89 | "kernelName": "csharp" 90 | }, 91 | "vscode": { 92 | "languageId": "polyglot-notebook" 93 | } 94 | }, 95 | "outputs": [], 96 | "source": [ 97 | "string conceptCollectionName = \"cookbookvdb\";" 98 | ] 99 | }, 100 | { 101 | "cell_type": "code", 102 | "execution_count": 18, 103 | "metadata": { 104 | "dotnet_interactive": { 105 | "language": "csharp" 106 | }, 107 | "polyglot_notebook": { 108 | "kernelName": "csharp" 109 | }, 
110 | "vscode": { 111 | "languageId": "polyglot-notebook" 112 | } 113 | }, 114 | "outputs": [ 115 | { 116 | "ename": "Error", 117 | "evalue": "(11,1): error SKEXP0020: 'Microsoft.SemanticKernel.Connectors.Qdrant.QdrantMemoryBuilderExtensions.WithQdrantMemoryStore(Microsoft.SemanticKernel.Memory.MemoryBuilder, string, int)' is for evaluation purposes only and is subject to change or removal in future updates. Suppress this diagnostic to proceed.", 118 | "output_type": "error", 119 | "traceback": [ 120 | "(11,1): error SKEXP0020: 'Microsoft.SemanticKernel.Connectors.Qdrant.QdrantMemoryBuilderExtensions.WithQdrantMemoryStore(Microsoft.SemanticKernel.Memory.MemoryBuilder, string, int)' is for evaluation purposes only and is subject to change or removal in future updates. Suppress this diagnostic to proceed." 121 | ] 122 | } 123 | ], 124 | "source": [ 125 | "#pragma warning disable SKEXP0001\n", 126 | "#pragma warning disable SKEXP0010\n", 127 | "#pragma warning disable SKEXP0020\n", 128 | "\n", 129 | "\n", 130 | "var textEmbedding = new AzureOpenAITextEmbeddingGenerationService(\"Your Azure OpenAI Service text-embedding-ada-002 deployment name\", \"Your Azure OpenAI Service Endpoint\", \"Your Azure OpenAI Service API Key\");\n", 131 | "\n", 132 | "\n", 133 | "var qdrantMemoryBuilder = new MemoryBuilder();\n", 134 | "qdrantMemoryBuilder.WithTextEmbeddingGeneration(textEmbedding);\n", 135 | "qdrantMemoryBuilder.WithQdrantMemoryStore(\"http://localhost:6333\", 1536);\n", 136 | "\n", 137 | "var qdrantBuilder = qdrantMemoryBuilder.Build();" 138 | ] 139 | }, 140 | { 141 | "cell_type": "code", 142 | "execution_count": 13, 143 | "metadata": { 144 | "dotnet_interactive": { 145 | "language": "csharp" 146 | }, 147 | "polyglot_notebook": { 148 | "kernelName": "csharp" 149 | }, 150 | "vscode": { 151 | "languageId": "polyglot-notebook" 152 | } 153 | }, 154 | "outputs": [], 155 | "source": [ 156 | "await qdrantBuilder.SaveInformationAsync(conceptCollectionName, id:
\"info1\", text: \"Kinfey is Microsoft Cloud Advocate\");\n", 157 | "await qdrantBuilder.SaveInformationAsync(conceptCollectionName, id: \"info2\", text: \"Kinfey is ex-Microsoft MVP\");\n", 158 | "await qdrantBuilder.SaveInformationAsync(conceptCollectionName, id: \"info3\", text: \"Kinfey is AI Expert\");\n", 159 | "await qdrantBuilder.SaveInformationAsync(conceptCollectionName, id: \"info4\", text: \"OpenAI is a company that is developing artificial general intelligence (AGI) with widely distributed economic benefits.\");" 160 | ] 161 | }, 162 | { 163 | "cell_type": "code", 164 | "execution_count": 14, 165 | "metadata": { 166 | "dotnet_interactive": { 167 | "language": "csharp" 168 | }, 169 | "polyglot_notebook": { 170 | "kernelName": "csharp" 171 | }, 172 | "vscode": { 173 | "languageId": "polyglot-notebook" 174 | } 175 | }, 176 | "outputs": [], 177 | "source": [ 178 | "string questionText = \"Do you know kinfey ?\"; " 179 | ] 180 | }, 181 | { 182 | "cell_type": "code", 183 | "execution_count": 15, 184 | "metadata": { 185 | "dotnet_interactive": { 186 | "language": "csharp" 187 | }, 188 | "polyglot_notebook": { 189 | "kernelName": "csharp" 190 | }, 191 | "vscode": { 192 | "languageId": "polyglot-notebook" 193 | } 194 | }, 195 | "outputs": [], 196 | "source": [ 197 | "var searchResults = qdrantBuilder.SearchAsync(conceptCollectionName, questionText, limit: 3, minRelevanceScore: 0.7);\n" 198 | ] 199 | }, 200 | { 201 | "cell_type": "code", 202 | "execution_count": 16, 203 | "metadata": { 204 | "dotnet_interactive": { 205 | "language": "csharp" 206 | }, 207 | "polyglot_notebook": { 208 | "kernelName": "csharp" 209 | }, 210 | "vscode": { 211 | "languageId": "polyglot-notebook" 212 | } 213 | }, 214 | "outputs": [ 215 | { 216 | "name": "stdout", 217 | "output_type": "stream", 218 | "text": [ 219 | "Kinfey is AI Expert : 0.88983846\n", 220 | "Kinfey is ex-Microsoft MVP : 0.87771213\n", 221 | "Kinfey is Microsoft Cloud Advocate : 0.8547776\n" 222 | ] 223 | } 224 | ], 
225 | "source": [ 226 | "\n", 227 | "await foreach (var item in searchResults)\n", 228 | "{\n", 229 | " Console.WriteLine(item.Metadata.Text + \" : \" + item.Relevance);\n", 230 | "}" 231 | ] 232 | } 233 | ], 234 | "metadata": { 235 | "kernelspec": { 236 | "display_name": ".NET (C#)", 237 | "language": "C#", 238 | "name": ".net-csharp" 239 | }, 240 | "language_info": { 241 | "name": "python" 242 | }, 243 | "polyglot_notebook": { 244 | "kernelInfo": { 245 | "defaultKernelName": "csharp", 246 | "items": [ 247 | { 248 | "aliases": [], 249 | "name": "csharp" 250 | } 251 | ] 252 | } 253 | } 254 | }, 255 | "nbformat": 4, 256 | "nbformat_minor": 2 257 | } 258 | -------------------------------------------------------------------------------- /notebooks/python/01/.env: -------------------------------------------------------------------------------- 1 | AZURE_OPENAI_ENDPOINT = 'Your Azure OpenAI Service Endpoint' 2 | AZURE_OPENAI_API_KEY = 'Your Azure OpenAI Service API Key' 3 | AZURE_OPENAI_VERSION = 'Your Azure OpenAI Service Version, e.g. 
2023-12-01-preview' -------------------------------------------------------------------------------- /notebooks/python/01/images/generated_image.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/microsoft/SemanticKernelCookBook/53a5029acc071db8a45e9b0c3df8e4ad06139300/notebooks/python/01/images/generated_image.png -------------------------------------------------------------------------------- /notebooks/python/02/.env: -------------------------------------------------------------------------------- 1 | GLOBAL_LLM_SERVICE="AzureOpenAI" 2 | AZURE_OPENAI_ENDPOINT = 'Your Azure OpenAI Service Endpoint' 3 | AZURE_OPENAI_API_KEY = 'Your Azure OpenAI Service API Key' 4 | AZURE_OPENAI_VERSION = '2024-05-01-preview' 5 | AZURE_OPENAI_CHAT_DEPLOYMENT_NAME="Your Azure OpenAI GPT-3.5 turbo deployment Name" -------------------------------------------------------------------------------- /notebooks/python/03/.env: -------------------------------------------------------------------------------- 1 | GLOBAL_LLM_SERVICE="AzureOpenAI" 2 | AZURE_OPENAI_ENDPOINT = 'Your Azure OpenAI Service Endpoint' 3 | AZURE_OPENAI_API_KEY = 'Your Azure OpenAI Service API Key' 4 | AZURE_OPENAI_VERSION = '2024-05-01-preview' 5 | AZURE_OPENAI_CHAT_DEPLOYMENT_NAME="Your Azure OpenAI GPT-3.5 turbo deployment Name" -------------------------------------------------------------------------------- /notebooks/python/03/APIPlugin/CustomPlugin.py: -------------------------------------------------------------------------------- 1 | # from semantic_kernel.skill_definition import sk_function, sk_function_context_parameter 2 | from semantic_kernel.functions import kernel_function 3 | 4 | 5 | class CustomPlugin: 6 | @kernel_function( 7 | description = "Get news from the web", 8 | name = "NewsPlugin" 9 | ) 10 | def get_news_api(self, location: str) -> str: 11 | return """Get news from the """ + location + """.""" 12 | 13 | @kernel_function( 14 |
description="Search Weather in a city", 15 | name="WeatherFunction" 16 | ) 17 | def ask_weather_function(self, city: str) -> str: 18 | return city + "’s weather is 30 celsius degree , and very hot." 19 | 20 | @kernel_function( 21 | description="Search Docs", 22 | name="DocsFunction" 23 | ) 24 | def ask_docs_function(self, doc: str) -> str: 25 | return "ask docs :" + doc -------------------------------------------------------------------------------- /notebooks/python/03/PluginWithSK.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": null, 6 | "metadata": { 7 | "dotnet_interactive": { 8 | "language": "csharp" 9 | }, 10 | "polyglot_notebook": { 11 | "kernelName": "csharp" 12 | }, 13 | "vscode": { 14 | "languageId": "polyglot-notebook" 15 | } 16 | }, 17 | "outputs": [], 18 | "source": [ 19 | "! pip install semantic-kernel -U" 20 | ] 21 | }, 22 | { 23 | "cell_type": "code", 24 | "execution_count": null, 25 | "metadata": { 26 | "vscode": { 27 | "languageId": "polyglot-notebook" 28 | } 29 | }, 30 | "outputs": [], 31 | "source": [ 32 | "import semantic_kernel as sk\n", 33 | "import semantic_kernel.connectors.ai.open_ai as skaoai\n", 34 | "from semantic_kernel.connectors.ai.open_ai import AzureChatPromptExecutionSettings, OpenAIChatPromptExecutionSettings\n", 35 | "from semantic_kernel.prompt_template import InputVariable, PromptTemplateConfig" 36 | ] 37 | }, 38 | { 39 | "cell_type": "code", 40 | "execution_count": 3, 41 | "metadata": {}, 42 | "outputs": [], 43 | "source": [ 44 | "service_id = \"default\"\n", 45 | "\n", 46 | "kernel = sk.Kernel()\n", 47 | "\n", 48 | "kernel.add_service(\n", 49 | " skaoai.AzureChatCompletion(\n", 50 | " service_id=service_id,\n", 51 | " env_file_path=\".env\"\n", 52 | " ),\n", 53 | ")" 54 | ] 55 | }, 56 | { 57 | "cell_type": "code", 58 | "execution_count": 4, 59 | "metadata": {}, 60 | "outputs": [], 61 | "source": [ 62 | 
"base_plugin = \"../../../plugins\"" 63 | ] 64 | }, 65 | { 66 | "cell_type": "code", 67 | "execution_count": 5, 68 | "metadata": {}, 69 | "outputs": [], 70 | "source": [ 71 | "prompt = \"System: You are a Python developer. User: {{$input}}\"" 72 | ] 73 | }, 74 | { 75 | "cell_type": "code", 76 | "execution_count": 6, 77 | "metadata": {}, 78 | "outputs": [], 79 | "source": [ 80 | "execution_settings = AzureChatPromptExecutionSettings(\n", 81 | " service_id=service_id,\n", 82 | " ai_model_id=\"GPT35TModel\",\n", 83 | " max_tokens=2000,\n", 84 | " temperature=0.2,\n", 85 | ")" 86 | ] 87 | }, 88 | { 89 | "cell_type": "code", 90 | "execution_count": 7, 91 | "metadata": {}, 92 | "outputs": [], 93 | "source": [ 94 | "prompt_template_config = PromptTemplateConfig(\n", 95 | " template=prompt,\n", 96 | " name=\"summarize\",\n", 97 | " template_format=\"semantic-kernel\",\n", 98 | " input_variables=[\n", 99 | " InputVariable(name=\"input\", description=\"The user input\", is_required=True),\n", 100 | " ],\n", 101 | " execution_settings=execution_settings,\n", 102 | ")" 103 | ] 104 | }, 105 | { 106 | "cell_type": "code", 107 | "execution_count": 8, 108 | "metadata": {}, 109 | "outputs": [], 110 | "source": [ 111 | "summarize = kernel.add_function(\n", 112 | " function_name=\"codeFunc\",\n", 113 | " plugin_name=\"genCodePlugin\",\n", 114 | " prompt_template_config=prompt_template_config,\n", 115 | ")" 116 | ] 117 | }, 118 | { 119 | "cell_type": "code", 120 | "execution_count": 9, 121 | "metadata": {}, 122 | "outputs": [], 123 | "source": [ 124 | "input_text = \"\"\"\n", 125 | "Generate a bubble sort method with Python\n", 126 | "\"\"\"" 127 | ] 128 | }, 129 | { 130 | "cell_type": "code", 131 | "execution_count": 10, 132 | "metadata": {}, 133 | "outputs": [], 134 | "source": [ 135 | "gen = await kernel.invoke(summarize, input=input_text)" 136 | ] 137 | }, 138 | { 139 | "cell_type": "code", 140 | "execution_count": 11, 141 | "metadata": {}, 142 | "outputs": [ 143 | { 144 |
"name": "stdout", 145 | "output_type": "stream", 146 | "text": [ 147 | "Sure! Here's an example of a bubble sort algorithm implemented in Python:\n", 148 | "\n", 149 | "```python\n", 150 | "def bubble_sort(arr):\n", 151 | " n = len(arr)\n", 152 | " \n", 153 | " for i in range(n-1):\n", 154 | " for j in range(0, n-i-1):\n", 155 | " if arr[j] > arr[j+1]:\n", 156 | " arr[j], arr[j+1] = arr[j+1], arr[j]\n", 157 | " \n", 158 | " return arr\n", 159 | "\n", 160 | "# Example usage:\n", 161 | "arr = [64, 34, 25, 12, 22, 11, 90]\n", 162 | "sorted_arr = bubble_sort(arr)\n", 163 | "print(\"Sorted array:\", sorted_arr)\n", 164 | "```\n", 165 | "\n", 166 | "In this implementation, the `bubble_sort` function takes an array as input and sorts it using the bubble sort algorithm. The outer loop runs `n-1` times, where `n` is the length of the array. The inner loop compares adjacent elements and swaps them if they are in the wrong order. This process is repeated until the array is fully sorted.\n", 167 | "\n", 168 | "The example usage demonstrates how to use the `bubble_sort` function to sort an array. 
The sorted array is then printed to the console.\n" 169 | ] 170 | } 171 | ], 172 | "source": [ 173 | "print(gen)" 174 | ] 175 | }, 176 | { 177 | "cell_type": "code", 178 | "execution_count": 12, 179 | "metadata": {}, 180 | "outputs": [], 181 | "source": [ 182 | "\n", 183 | "translate_plugin = kernel.add_plugin(parent_directory=base_plugin, plugin_name=\"TranslatePlugin\")" 184 | ] 185 | }, 186 | { 187 | "cell_type": "code", 188 | "execution_count": 13, 189 | "metadata": {}, 190 | "outputs": [], 191 | "source": [ 192 | "multilingual = translate_plugin[\"MultiLanguage\"]" 193 | ] 194 | }, 195 | { 196 | "cell_type": "code", 197 | "execution_count": 14, 198 | "metadata": {}, 199 | "outputs": [], 200 | "source": [ 201 | "result = await kernel.invoke(multilingual, input=\"hello world\", language=\"fr\")" 202 | ] 203 | }, 204 | { 205 | "cell_type": "code", 206 | "execution_count": 15, 207 | "metadata": {}, 208 | "outputs": [ 209 | { 210 | "name": "stdout", 211 | "output_type": "stream", 212 | "text": [ 213 | "Bonjour tout le monde\n" 214 | ] 215 | } 216 | ], 217 | "source": [ 218 | "print(result)" 219 | ] 220 | }, 221 | { 222 | "cell_type": "code", 223 | "execution_count": 16, 224 | "metadata": {}, 225 | "outputs": [], 226 | "source": [ 227 | "import APIPlugin.CustomPlugin as custom_plugin" 228 | ] 229 | }, 230 | { 231 | "cell_type": "code", 232 | "execution_count": 17, 233 | "metadata": {}, 234 | "outputs": [], 235 | "source": [ 236 | "# api_plugin = kernel.import_plugin(custom_plugin.CustomPlugin(), plugin_name=\"CustomPlugin\") \n", 237 | "\n", 238 | "api_plugin = kernel.add_plugin(custom_plugin.CustomPlugin(), \"CustomPlugin\")" 239 | ] 240 | }, 241 | { 242 | "cell_type": "code", 243 | "execution_count": 18, 244 | "metadata": {}, 245 | "outputs": [ 246 | { 247 | "name": "stdout", 248 | "output_type": "stream", 249 | "text": [ 250 | "Guangzhou’s weather is 30 celsius degree , and very hot.\n" 251 | ] 252 | } 253 | ], 254 | "source": [ 255 | "\n", 256 |
"ask_weather_function = api_plugin[\"WeatherFunction\"]\n", 257 | "weather_result = await ask_weather_function(kernel, city=\"Guangzhou\")\n", 258 | "print(weather_result)" 259 | ] 260 | }, 261 | { 262 | "cell_type": "code", 263 | "execution_count": 19, 264 | "metadata": {}, 265 | "outputs": [ 266 | { 267 | "name": "stdout", 268 | "output_type": "stream", 269 | "text": [ 270 | "Get news from the Seattle.\n" 271 | ] 272 | } 273 | ], 274 | "source": [ 275 | "ask_news_function = api_plugin[\"NewsPlugin\"]\n", 276 | "news_result = await ask_news_function(kernel, location=\"Seattle\")\n", 277 | "print(news_result)" 278 | ] 279 | } 280 | ], 281 | "metadata": { 282 | "kernelspec": { 283 | "display_name": "pydev", 284 | "language": "python", 285 | "name": "python3" 286 | }, 287 | "language_info": { 288 | "codemirror_mode": { 289 | "name": "ipython", 290 | "version": 3 291 | }, 292 | "file_extension": ".py", 293 | "mimetype": "text/x-python", 294 | "name": "python", 295 | "nbconvert_exporter": "python", 296 | "pygments_lexer": "ipython3", 297 | "version": "3.10.12" 298 | } 299 | }, 300 | "nbformat": 4, 301 | "nbformat_minor": 2 302 | } 303 | -------------------------------------------------------------------------------- /notebooks/python/04/.env: -------------------------------------------------------------------------------- 1 | GLOBAL_LLM_SERVICE="AzureOpenAI" 2 | AZURE_OPENAI_ENDPOINT = 'Your Azure OpenAI Service Endpoint' 3 | AZURE_OPENAI_API_KEY = 'Your Azure OpenAI Service API Key' 4 | AZURE_OPENAI_VERSION = '2024-05-01-preview' 5 | AZURE_OPENAI_CHAT_DEPLOYMENT_NAME="Your Azure OpenAI GPT-3.5 turbo deployment Name" -------------------------------------------------------------------------------- /notebooks/python/04/APIPlugin/CustomPlugin.py: -------------------------------------------------------------------------------- 1 | # from semantic_kernel.skill_definition import sk_function, sk_function_context_parameter 2 | from semantic_kernel.functions import
kernel_function 3 | 4 | 5 | class CustomPlugin: 6 | @kernel_function( 7 | description = "Get news from the web", 8 | name = "NewsPlugin" 9 | ) 10 | def get_news_api(self, location: str) -> str: 11 | return "Get news from the " + location + "." 12 | 13 | @kernel_function( 14 | description="Search Weather in a city", 15 | name="WeatherFunction" 16 | ) 17 | def ask_weather_function(self, city: str) -> str: 18 | return city + "’s weather is 30 degrees Celsius and very hot." 19 | 20 | @kernel_function( 21 | description="Search Docs", 22 | name="DocsFunction" 23 | ) 24 | def ask_docs_function(self, doc: str) -> str: 25 | return "ask docs: " + doc -------------------------------------------------------------------------------- /notebooks/python/04/PlannerWithSK.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 1, 6 | "metadata": {}, 7 | "outputs": [], 8 | "source": [ 9 | "import os\n", 10 | "import sys\n", 11 | "\n", 12 | "import semantic_kernel as sk\n", 13 | "import semantic_kernel.connectors.ai.open_ai as skaoai\n", 14 | "from semantic_kernel.connectors.ai.open_ai import AzureChatPromptExecutionSettings, OpenAIChatPromptExecutionSettings\n", 15 | "from semantic_kernel.prompt_template import InputVariable, PromptTemplateConfig" 16 | ] 17 | }, 18 | { 19 | "cell_type": "code", 20 | "execution_count": 2, 21 | "metadata": {}, 22 | "outputs": [], 23 | "source": [ 24 | "\n", 25 | "kernel = sk.Kernel()" 26 | ] 27 | }, 28 | { 29 | "cell_type": "code", 30 | "execution_count": 3, 31 | "metadata": {}, 32 | "outputs": [], 33 | "source": [ 34 | "service_id = \"default\"\n", 35 | "\n", 36 | "kernel = sk.Kernel()\n", 37 | "\n", 38 | "kernel.add_service(\n", 39 | " skaoai.AzureChatCompletion(\n", 40 | " service_id=service_id,\n", 41 | " env_file_path=\".env\"\n", 42 | " ),\n", 43 | ")" 44 | ] 45 | }, 46 | { 47 | "cell_type": "code", 48 | "execution_count": 4, 49 |
"metadata": {}, 50 | "outputs": [], 51 | "source": [ 52 | "from semantic_kernel.planners import SequentialPlanner\n", 53 | "\n", 54 | "planner = SequentialPlanner(kernel, service_id)" 55 | ] 56 | }, 57 | { 58 | "cell_type": "code", 59 | "execution_count": 5, 60 | "metadata": {}, 61 | "outputs": [], 62 | "source": [ 63 | "base_skills_directory = '../../../plugins'" 64 | ] 65 | }, 66 | { 67 | "cell_type": "code", 68 | "execution_count": 6, 69 | "metadata": {}, 70 | "outputs": [], 71 | "source": [ 72 | "import APIPlugin.CustomPlugin as custom_plugin" 73 | ] 74 | }, 75 | { 76 | "cell_type": "code", 77 | "execution_count": 7, 78 | "metadata": {}, 79 | "outputs": [], 80 | "source": [ 81 | "#custom_plugin = kernel.import_native_skill_from_directory(base_skills_directory , \"APIPlugin\")\n", 82 | "# custom_plugin = kernel.import_plugin(custom_plugin.CustomPlugin(), plugin_name=\"CustomPlugin\") #.import_native_skill_from_directory(base_plugin , \"APIPlugin\")\n", 83 | "# writer_plugin = kernel.import_semantic_plugin_from_directory(base_skills_directory, \"WriterPlugin\")\n", 84 | "# email_plugin = kernel.import_semantic_plugin_from_directory(base_skills_directory, \"EmailPlugin\")\n", 85 | "# translate_plugin = kernel.import_semantic_plugin_from_directory(base_skills_directory, \"TranslatePlugin\")\n", 86 | "\n", 87 | "\n", 88 | "custom_plugin = kernel.add_plugin(custom_plugin.CustomPlugin(), \"CustomPlugin\")\n", 89 | "writer_plugin = kernel.add_plugin(parent_directory=base_skills_directory, plugin_name=\"WriterPlugin\")\n", 90 | "email_plugin = kernel.add_plugin(parent_directory=base_skills_directory, plugin_name=\"EmailPlugin\")\n", 91 | "translate_plugin = kernel.add_plugin(parent_directory=base_skills_directory, plugin_name=\"TranslatePlugin\")" 92 | ] 93 | }, 94 | { 95 | "cell_type": "code", 96 | "execution_count": 8, 97 | "metadata": {}, 98 | "outputs": [], 99 | "source": [ 100 | "ask = \"\"\"\n", 101 | "Write an email about travel tips based on getting current
weather in Guangzhou \n", 102 | "\"\"\"\n", 103 | "# original_plan = await planner.create_plan(ask, kernel)\n", 104 | "\n", 105 | "original_plan = await planner.create_plan(goal=ask)" 106 | ] 107 | }, 108 | { 109 | "cell_type": "code", 110 | "execution_count": 9, 111 | "metadata": {}, 112 | "outputs": [ 113 | { 114 | "name": "stdout", 115 | "output_type": "stream", 116 | "text": [ 117 | "- Search Weather in a city using CustomPlugin-WeatherFunction with parameters: {'city': 'Guangzhou'}\n", 118 | "- tips using WriterPlugin-Tips with parameters: {'input': 'The current weather in Guangzhou is $WEATHER_OUTPUT', 'language': 'English'}\n", 119 | "- Write an email about weather using EmailPlugin-WeatherMail with parameters: {'input': 'Here are some travel tips based on the current weather in Guangzhou: $RESULT__TRAVEL_TIPS'}\n" 120 | ] 121 | } 122 | ], 123 | "source": [ 124 | "\n", 125 | "for step in original_plan._steps:\n", 126 | " print(\n", 127 | " f\"- {step.description.replace('.', '') if step.description else 'No description'} using {step.metadata.fully_qualified_name} with parameters: {step.parameters}\"\n", 128 | " )" 129 | ] 130 | }, 131 | { 132 | "cell_type": "code", 133 | "execution_count": 10, 134 | "metadata": {}, 135 | "outputs": [], 136 | "source": [ 137 | "results = await original_plan.invoke(kernel)" 138 | ] 139 | }, 140 | { 141 | "cell_type": "code", 142 | "execution_count": 11, 143 | "metadata": {}, 144 | "outputs": [ 145 | { 146 | "name": "stdout", 147 | "output_type": "stream", 148 | "text": [ 149 | "Subject: Dressing Tips for Guangzhou's Hot Weather - Travel Advisory\n", 150 | "\n", 151 | "Dear [Colleagues],\n", 152 | "\n", 153 | "As we prepare for our upcoming trip to Guangzhou, I wanted to share some dressing tips to help us stay cool and comfortable in the hot weather of 30 degrees Celsius. Please pay attention to these recommendations and adjust your travel attire accordingly:\n", 154 | "\n", 155 | "1. 
Choose lightweight and breathable fabrics like cotton, linen, or chiffon. These materials allow air circulation and keep you cool. (选择轻薄透气的面料,如棉,麻或雪纺。这些材料有助于空气流通,保持凉爽。)\n", 156 | "\n", 157 | "2. Opt for loose-fitting and flowy clothing to allow better ventilation and ease of movement. (选择宽松飘逸的服装,有助于通风,并提供更自由的动作。)\n", 158 | "\n", 159 | "3. Wear light-colored outfits as they reflect sunlight and help to keep you cooler. (穿浅色的衣服,因为它们反射阳光,有助于保持凉爽。)\n", 160 | "\n", 161 | "4. Avoid tight or fitted clothing that may trap heat and cause discomfort. (避免穿着紧身或合身的衣物,这可能会困住热量,并引起不适。)\n", 162 | "\n", 163 | "5. It's advisable to wear wide-brimmed hats or caps to protect your face and neck from the direct sun. (建议戴宽檐帽或帽子,以保护脸部和颈部不受直接阳光照射。)\n", 164 | "\n", 165 | "6. Use sunscreen with a high SPF to shield your skin from harmful UV rays. (使用高SPF值的防晒霜,以防止皮肤受到有害的紫外线辐射。)\n", 166 | "\n", 167 | "7. Carry a lightweight and breathable umbrella for sun protection, especially during peak hours. (随身携带轻便透气的遮阳伞,尤其是在高峰时段以防紫外线。)\n", 168 | "\n", 169 | "8. Opt for open-toed shoes or sandals to keep your feet cool and comfortable. (选择露趾鞋或凉鞋,使脚部保持凉爽和舒适。)\n", 170 | "\n", 171 | "9. Stay hydrated by drinking plenty of water throughout the day to combat the heat. (一天中要喝足够的水,保持身体水分,对抗炎热。)\n", 172 | "\n", 173 | "Please remember to prioritize sun protection and dress in lightweight and comfortable clothing to ensure a pleasant experience during our trip to Guangzhou. Additionally, it is important for all of us to pay attention to the travel conditions at all times, especially regarding any changes in weather or local advisories. 
Let's stay updated and make necessary adjustments to our plans if needed.\n", 174 | "\n", 175 | "If you have any questions or concerns regarding the trip or the dressing tips, please feel free to reach out to me.\n", 176 | "\n", 177 | "Looking forward to our trip!\n", 178 | "\n", 179 | "Best regards,\n", 180 | "\n", 181 | "[Your Name]\n" 182 | ] 183 | } 184 | ], 185 | "source": [ 186 | "print(results)" 187 | ] 188 | } 189 | ], 190 | "metadata": { 191 | "kernelspec": { 192 | "display_name": "pydev", 193 | "language": "python", 194 | "name": "python3" 195 | }, 196 | "language_info": { 197 | "codemirror_mode": { 198 | "name": "ipython", 199 | "version": 3 200 | }, 201 | "file_extension": ".py", 202 | "mimetype": "text/x-python", 203 | "name": "python", 204 | "nbconvert_exporter": "python", 205 | "pygments_lexer": "ipython3", 206 | "version": "3.10.12" 207 | }, 208 | "orig_nbformat": 4 209 | }, 210 | "nbformat": 4, 211 | "nbformat_minor": 2 212 | } 213 | -------------------------------------------------------------------------------- /notebooks/python/05/.env: -------------------------------------------------------------------------------- 1 | GLOBAL_LLM_SERVICE="AzureOpenAI" 2 | AZURE_OPENAI_ENDPOINT = 'Your Azure OpenAI Service Endpoint' 3 | AZURE_OPENAI_API_KEY = 'Your Azure OpenAI Service API Key' 4 | AZURE_OPENAI_VERSION = '2024-05-01-preview' 5 | AZURE_OPENAI_CHAT_DEPLOYMENT_NAME="Your Azure OpenAI GPT-3.5 turbo deployment Name" 6 | AZURE_OPENAI_EMBEDDING_DEPLOYMENT_NAME="Your Azure OpenAI Embedding Model deployment Name" -------------------------------------------------------------------------------- /pdf/cn/SemanticKernelCookBook.cn.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/microsoft/SemanticKernelCookBook/53a5029acc071db8a45e9b0c3df8e4ad06139300/pdf/cn/SemanticKernelCookBook.cn.pdf --------------------------------------------------------------------------------
/pdf/en/SemanticKernelCookBook.en.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/microsoft/SemanticKernelCookBook/53a5029acc071db8a45e9b0c3df8e4ad06139300/pdf/en/SemanticKernelCookBook.en.pdf -------------------------------------------------------------------------------- /plugins/APIPlugin/CustomPlugin.py: -------------------------------------------------------------------------------- 1 | from semantic_kernel.functions import kernel_function 2 | 3 | 4 | class CustomPlugin: 5 | @kernel_function( 6 | description = "Get news from the web", 7 | name = "NewsPlugin" 8 | ) 9 | def get_news_api(self, location: str) -> str: 10 | return "Get news from the " + location + "." 11 | 12 | @kernel_function( 13 | description="Search Weather in a city", 14 | name="WeatherFunction" 15 | ) 16 | def ask_weather_function(self, city: str) -> str: 17 | return city + "’s weather is 30 degrees Celsius and very hot." 18 | 19 | @kernel_function( 20 | description="Search Docs", 21 | name="DocsFunction" 22 | ) 23 | def ask_docs_function(self, docs: str) -> str: 24 | return "ask docs: " + docs -------------------------------------------------------------------------------- /plugins/CustomPlugin/CompanySearchPlugin.cs: -------------------------------------------------------------------------------- 1 | using Microsoft.SemanticKernel; 2 | using System.ComponentModel; 3 | using System.Globalization; 4 | 5 | 6 | public class CompanySearchPlugin 7 | { 8 | [KernelFunction, Description("search employee information")] 9 | public string EmployeeSearch(string input) 10 | { 11 | return "欢迎了解社保相关内容"; 12 | } 13 | 14 | [KernelFunction, Description("search weather")] 15 | public string WeatherSearch(string city) 16 | { 17 | return city + ", 2 degrees, rainy"; 18 | } 19 | } -------------------------------------------------------------------------------- /plugins/EmailPlugin/WeatherMail/config.json:
-------------------------------------------------------------------------------- 1 | { 2 | "schema": 1, 3 | "description": "Write an email about weather", 4 | "execution_settings": { 5 | "default": { 6 | "max_tokens": 1024, 7 | "temperature": 0.0, 8 | "top_p": 0.0, 9 | "presence_penalty": 0.0, 10 | "frequency_penalty": 0.0 11 | } 12 | } 13 | } -------------------------------------------------------------------------------- /plugins/EmailPlugin/WeatherMail/skprompt.txt: -------------------------------------------------------------------------------- 1 | Based on the weather and dressing tips given by {{$input}}, write an email to your traveling colleagues asking them to pay attention to the travel conditions at all times. 2 | -------------------------------------------------------------------------------- /plugins/TranslatePlugin/Basic/config.json: -------------------------------------------------------------------------------- 1 | { 2 | "schema": 1, 3 | "description": "Translation", 4 | "execution_settings": { 5 | "default": { 6 | "max_tokens": 1024, 7 | "temperature": 0.6, 8 | "top_p": 0.0, 9 | "presence_penalty": 0.0, 10 | "frequency_penalty": 0.0 11 | } 12 | } 13 | } -------------------------------------------------------------------------------- /plugins/TranslatePlugin/Basic/skprompt.txt: -------------------------------------------------------------------------------- 1 | Translate {{$input}} to English -------------------------------------------------------------------------------- /plugins/TranslatePlugin/MultiLanguage/config.json: -------------------------------------------------------------------------------- 1 | { 2 | "schema": 1, 3 | "description": "Translate sentences into a language of your choice", 4 | "execution_settings": { 5 | "default": { 6 | "max_tokens": 2000, 7 | "temperature": 0.7, 8 | "top_p": 0.0, 9 | "presence_penalty": 0.0, 10 | "frequency_penalty": 0.0, 11 | "stop_sequences": [ 12 | "[done]" 13 | ] 14 | } 15 | }, 16
| "input_variables": [ 17 | { 18 | "name": "input", 19 | "description": "sentence to translate", 20 | "default": "" 21 | }, 22 | { 23 | "name": "language", 24 | "description": "Language to translate to", 25 | "default": "" 26 | } 27 | ] 28 | } -------------------------------------------------------------------------------- /plugins/TranslatePlugin/MultiLanguage/skprompt.txt: -------------------------------------------------------------------------------- 1 | Translate the input below into {{$language}} 2 | 3 | {{$input}} -------------------------------------------------------------------------------- /plugins/WriterPlugin/Classification/config.json: -------------------------------------------------------------------------------- 1 | { 2 | "schema": 1, 3 | "description": "text classification", 4 | "execution_settings": { 5 | "default": { 6 | "max_tokens": 60, 7 | "top_p": 0.0, 8 | "presence_penalty": 0.0, 9 | "frequency_penalty": 0.0 10 | } 11 | } 12 | } -------------------------------------------------------------------------------- /plugins/WriterPlugin/Classification/skprompt.txt: -------------------------------------------------------------------------------- 1 | 请帮我把 {{$input}} 进行类别确认,类别包括天气,课程,生成式,如果不太清楚,请回答没法确认,分类参考如下: 2 | 3 | 问: 会下雨吗? 类别:天气 4 | 问: 今天温度? 类别:天气 5 | 问: 湿度多少? 类别:天气 6 | 问: 什么是新能源车? 类别: 课程 7 | 问: 电动车的特点 类别: 课程 8 | 问: 概念是什么? 类别: 课程 9 | 问: 课程相关的内容有哪些? 类别: 课程 10 | 问: 写一首诗歌?
类别: 生成式 11 | 问: 翻译一下 类别: 生成式 12 | 问: 计算结果 类别: 生成式 13 | 14 | 如果能确认类别,天气相关请只输出 1 , 课程相关请只输出 2 , 生成式相关请只输出 3 ,没法确认相关请只输出 0,并把{{$input}}和它的类别参考以下 json 格式输出 15 | 16 | {"question":"{{$input}}","label":"{{$label}}"} -------------------------------------------------------------------------------- /plugins/WriterPlugin/Tips/config.json: -------------------------------------------------------------------------------- 1 | { 2 | "schema": 1, 3 | "description": "tips", 4 | "execution_settings": { 5 | "default": { 6 | "max_tokens": 1024, 7 | "temperature": 0.7, 8 | "presence_penalty": 0.0, 9 | "frequency_penalty": 0.0 10 | } 11 | }, 12 | "input_variables": [ 13 | { 14 | "name": "input", 15 | "description": "tips", 16 | "default": "" 17 | }, 18 | { 19 | "name": "language", 20 | "description": "Language to translate to", 21 | "default": "Spanish" 22 | } 23 | ] 24 | } -------------------------------------------------------------------------------- /plugins/WriterPlugin/Tips/skprompt.txt: -------------------------------------------------------------------------------- 1 | Get dressing tips based on {{$input}} and translate them into the corresponding {{$language}} -------------------------------------------------------------------------------- /workshop/dotNET/workshop1/data/.env: -------------------------------------------------------------------------------- 1 | AZURE_OPENAI_ENDPOINT = "" 2 | AZURE_OPENAI_API_KEY = "" 3 | AZURE_OPENAI_API_VERSION = "" 4 | AZURE_OPENAI_DEPLOYMENT_NAME = "" -------------------------------------------------------------------------------- /workshop/dotNET/workshop1/data/transcripts/01-introduction.txt: -------------------------------------------------------------------------------- 1 | Intro 2 | 3 | 0:08 4 | hello and welcome to this course on 5 | 0:11 6 | classical machine learning for beginners 7 | 0:13 8 | whether you're completely new to the 9 | 0:15 10 | topic or an experienced ml practitioner 11 | 0:17 12 | looking to brush up on an area
we're 13 | 0:19 14 | happy to have you join us this course is 15 | 16 | Introducing ML for Beginners 17 | 18 | 0:22 19 | based on the free open source 26 lesson 20 | 0:25 21 | ml for beginners curriculum from 22 | 0:27 23 | Microsoft which can be found at AKA dot 24 | 0:30 25 | Ms slash ml-beginners machine learning 26 | 0:34 27 | is one of the most popular Technologies 28 | 0:36 29 | these days I'm sure you've heard this 30 | 0:38 31 | term if you have any sort of familiarity 32 | 0:41 33 | with technology no matter what domain 34 | 0:42 35 | you work in however the mechanics of 36 | 0:45 37 | machine learning are a mystery to most 38 | 0:47 39 | people and the subject can sometimes 40 | 0:49 41 | feel overwhelming in this course you'll 42 | 0:51 43 | start right from the beginning and 44 | 0:53 45 | you'll learn about it step by step to 46 | 0:56 47 | practical Hands-On coding examples let's 48 | 49 | The difference between AI and ML 50 | 51 | 0:58 52 | start by talking about the difference 53 | 1:00 54 | between artificial intelligence and 55 | 1:03 56 | machine learning AI is a science of 57 | 1:05 58 | getting machines to accomplish tasks 59 | 1:07 60 | that typically require human level 61 | 1:09 62 | intelligence many different techniques 63 | 1:12 64 | have been proposed for AI but the most 65 | 1:14 66 | successful and popular approach these 67 | 1:16 68 | days is machine learning 69 | 1:19 70 | unlike other AI techniques ml uses 71 | 1:22 72 | specialized algorithms to make decisions 73 | 1:24 74 | by learning from data so machine 75 | 1:28 76 | learning is really a subset of 77 | 1:30 78 | artificial intelligence 79 | 1:32 80 | you've also probably heard of deep 81 | 1:33 82 | learning which is a subset of machine 83 | 1:35 84 | learning that relies on neural networks 85 | 1:38 86 | to learn from data in this course we're 87 | 88 | What you'll learn in this course 89 | 90 | 1:40 91 | going to cover what we call classical 92 | 1:42 93 | machine learning 
you'll learn some Core 94 | 1:45 95 | Concepts of ml a bit of History 96 | 1:48 97 | statistical techniques like regression 98 | 1:50 99 | classification clustering and more 100 | 1:54 101 | the concepts you'll learn here will 102 | 1:55 103 | serve you well as you progress to more 104 | 1:57 105 | 106 | Advanced Techniques 107 | 108 | What you won't learn in this course 109 | 1:59 110 | keep in mind that this course won't 111 | 2:00 112 | cover data science deep learning neural 113 | 2:03 114 | networks and AI techniques other than ml 115 | 2:06 116 | Microsoft offers two additional courses 117 | 2:09 118 | for you to learn more about these areas 119 | 2:10 120 | data science for beginners available at 121 | 2:13 122 | AKA dot Ms slash data science beginners 123 | 2:16 124 | and AI for beginners available at aka.ms 125 | 2:20 126 | Ai and beginners machine learning is a 127 | 128 | Why study Machine Learning 129 | 130 | 2:23 131 | Hot Topic because it's solving complex 132 | 2:25 133 | real-world problems in so many areas 134 | 2:28 135 | Finance earth science space exploration 136 | 2:31 137 | cognitive science and many more Fields 138 | 2:33 139 | have adopted machine learning to solve 140 | 2:35 141 | problems specific to their domains for 142 | 2:39 143 | example you can use machine learning to 144 | 2:41 145 | predict the likelihood of disease from a 146 | 2:43 147 | patient's medical history 148 | 2:45 149 | to anticipate weather events to 150 | 2:48 151 | understand the sentiment of a text and 152 | 2:50 153 | to detect fake news and stop the spread 154 | 2:52 155 | of propaganda applications of machine 156 | 2:54 157 | learning are almost everywhere and are 158 | 2:57 159 | as ubiquitous as the data that is 160 | 2:59 161 | Flowing from our devices and systems 162 | 3:01 163 | because of how useful it is 164 | 3:03 165 | understanding the basics of machine 166 | 3:04 167 | learning is going to help you no matter 168 | 3:06 169 | what domain you're coming 
from in the 170 | 3:09 171 | next video in the series I'll give an 172 | 3:10 173 | overview of the history of ml I'll see 174 | 3:13 175 | you there -------------------------------------------------------------------------------- /workshop/dotNET/workshop1/data/transcripts/02-history-of-ml.txt: -------------------------------------------------------------------------------- 1 | Intro 2 | 3 | 0:08 4 | in this video I want to give you a bit 5 | 0:11 6 | of Background by walking through the 7 | 0:13 8 | major milestones in the history of 9 | 0:14 10 | machine learning and artificial 11 | 0:16 12 | intelligence 13 | 0:17 14 | the modern world of artificial 15 | 16 | Alan Turing and the Turing test 17 | 18 | 0:19 19 | intelligence really began in the 1950s 20 | 0:21 21 | though it's based on mathematical and 22 | 0:24 23 | statistical developments over many 24 | 0:26 25 | centuries Alan Turing is credited with 26 | 0:28 27 | helping to lay the foundation for the 28 | 0:30 29 | concept of a machine that can think 30 | 0:33 31 | in his quest to Define machine 32 | 0:35 33 | intelligence he achieved a crucial 34 | 0:37 35 | Milestone by creating the Turing test in 36 | 0:40 37 |
1950. in this test an interrogator 38 | 0:43 39 | questions both a human and a computer 40 | 0:45 41 | and tries to determine which one is 42 | 0:47 43 | which 44 | 0:48 45 | if the interrogator cannot tell the 46 | 0:50 47 | difference then the computer can be 48 | 0:52 49 | considered intelligent in 1956 the term 50 | 51 | The Dartmouth Summer Research Project on AI 52 | 53 | 0:55 54 | artificial intelligence was coined with 55 | 0:57 56 | a small group of scientists gathered at 57 | 1:00 58 | Dartmouth College in the U.S for an 59 | 1:02 60 | event called the summer research project 61 | 1:04 62 | on artificial intelligence 63 | 1:06 64 | this conference was the birth of the 65 | 1:09 66 | field of research we know as AI the 67 | 1:11 68 | years from 1956 to 1974 are known as The 69 | 70 | The golden years of AI 71 | 72 | 1:15 73 | golden years of AI optimism ran high in 74 | 1:19 75 | the hope that AI could solve many 76 | 1:21 77 | problems in 1967 Marvin Minsky the 78 | 1:24 79 | co-founder of the MIT AI lab stated 80 | 1:27 81 | confidently and incorrectly that within 82 | 1:30 83 | a generation the problem of creating 84 | 1:32 85 | artificial intelligence will 86 | 1:34 87 | substantially be solved 88 | 1:36 89 | natural language processing research 90 | 1:39 91 | flourished search was refined and made 92 | 1:41 93 | more powerful and the concept of micro 94 | 1:44 95 | worlds was created where simple tasks 96 | 1:46 97 | were completed using plain language 98 | 1:48 99 | instructions 100 | 1:50 101 | research was well funded by government 102 | 1:51 103 | agencies advances were made in 104 | 1:54 105 | computation and algorithms and 106 | 1:56 107 | prototypes of intelligent machines were 108 | 1:58 109 | built 110 | 1:59 111 | some of these machines include Shakey the 112 | 2:01 113 | robot who could maneuver and decide how 114 | 2:04 115 | to perform tasks intelligently Eliza an 116 | 2:07 117 | early chatbot that could converse with 118 | 2:10 119 | people
and act as a primitive therapist 120 | 2:12 121 | Blocksworld an example of a micro world 122 | 2:15 123 | where blocks could be stacked and 124 | 2:17 125 | sorted and decision-making experiments 126 | 2:19 127 | could be tested by the mid-1970s it had 128 | 129 | The AI winter 130 | 131 | 2:22 132 | become apparent that the complexity of 133 | 2:24 134 | making intelligent machines had been 135 | 2:26 136 | understated and that its promise had 137 | 2:28 138 | been overblown compute power was too 139 | 2:31 140 | limited there was a lack of data to 141 | 2:33 142 | train and test AIs and there were 143 | 2:35 144 | questions around the ethics of 145 | 2:37 146 | introducing AI systems like the 147 | 2:39 148 | therapist Eliza into society 149 | 2:42 150 | funding dried up and confidence in the 151 | 2:44 152 | field slowed marking the beginning of 153 | 2:46 154 | what is called an AI winter in the 1980s 155 | 2:49 156 | as computers became more powerful expert 157 | 158 | Resurgence and fall of AI for expert systems 159 | 160 | 2:52 161 | systems became more successful there was 162 | 2:55 163 | a Resurgence in optimism about AI as 164 | 2:58 165 | businesses found practical applications 166 | 3:00 167 | of these rule-based inference systems by 168 | 3:03 169 | the late 80s it was becoming apparent 170 | 3:05 171 | that expert systems had become too 172 | 3:07 173 | specialized and were unlikely to achieve 174 | 3:10 175 | machine intelligence the rise of 176 | 3:12 177 | personal computers also competed with 178 | 3:15 179 | these large specialized centralized 180 | 3:16 181 | systems this led to another chill in the 182 | 3:19 183 | AI field things began to change in the 184 | 3:22 185 | mid-1990s as compute and storage 186 | 3:24 187 | capabilities grew exponentially making 188 | 3:27 189 | it possible to process much larger data 190 | 191 | Growth in AI driven by more data and more powerful hardware 192 | 193 | 3:30 194 | sets than ever before 195 | 3:32
196 | the rise of the internet and the 197 | 3:34 198 | popularity of smartphones both 199 | 3:36 200 | contributed to increasing amounts of 201 | 3:38 202 | data and new experiments in machine 203 | 3:39 204 | learning became possible 205 | 3:42 206 | throughout the 2000s significant 207 | 3:44 208 | advancements were made in computer 209 | 3:45 210 | vision and natural language processing 211 | 3:47 212 | by training machine learning models on 213 | 3:50 214 | 215 | Big Data 216 | 217 | 3:52 218 | in the past decade compute power and the 219 | 3:55 220 | size of data sets have continued to grow 221 | 3:57 222 | and machine learning has become capable 223 | 4:00 224 | of solving even more problems as a 225 | 4:02 226 | result today machine learning touches 227 | 4:04 228 | almost every part of our Lives sometimes 229 | 4:07 230 | we're well aware of it like when we 231 | 4:09 232 | interact with ChatGPT in the browser or 233 | 4:12 234 | see a self-driving car go by 235 | 4:14 236 | but most of the time it's seamlessly 237 | 4:16 238 | woven into familiar experiences of our 239 | 4:19 240 | everyday life such as when we're 241 | 4:21 242 | approved for a new loan or get a catalog 243 | 4:23 244 | at home 245 | 246 | Increased awareness of ethical and responsible AI 247 | 248 | 4:25 249 | this era has also been marked by an 250 | 4:27 251 | increased awareness of potential ethical 252 | 4:29 253 | issues in machine learning and by 254 | 4:31 255 | significant research in the field of 256 | 4:33 257 | responsible AI 258 | 4:34 259 | we want the benefits of AI but we also 260 | 4:37 261 | want AI that is responsible and doesn't 262 | 4:39 263 | amplify human bias 264 | 4:42 265 | in the next video we will introduce 266 | 4:44 267 | techniques for building using and 268 | 4:46 269 | maintaining machine learning models I'll 270 | 4:48 271 | see you there --------------------------------------------------------------------------------
/workshop/dotNET/workshop1/data/transcripts/03-techniques-ml.txt: -------------------------------------------------------------------------------- 1 | Intro 2 | 3 | 0:08 4 | the process of building machine learning 5 | 0:10 6 | models is very different from any other 7 | 0:13 8 | development workflow in this video 9 | 0:15 10 | you'll learn about that process more 11 | 0:18 12 | specifically you learn about deciding 13 | 0:20 14 | whether AI is the right approach for 15 | 0:22 16 | your problem collecting and preparing 17 | 0:24 18 | your data 19 | 0:25 20 | training your model evaluating your 21 | 0:28 22 | model tuning the hyper parameters and 23 | 0:30 24 | testing the trained model in the real 25 | 0:32 26 | world 27 | 28 | Decide if AI is the right approach 29 | 30 | 0:34 31 | traditional software is well suited to 32 | 0:36 33 | solve problems where the solution can be 34 | 0:39 35 | described as a formal set of rules in 36 | 0:42 37 | contrast AI shines in solving problems 38 | 0:44 39 | where the solution can be extracted from 40 | 0:46 41 | data many of the problems we encountered 42 | 0:49 43 | in our daily life can be efficiently 44 | 0:51 45 | solved with traditional programming if 46 | 0:53 47 | an engineer can break up the solution of 48 | 0:55 49 | a problem and Define it using precise 50 | 0:57 51 | rules then traditional programming is a 52 | 0:59 53 | great tool to use but many of the 54 | 1:01 55 | problems we encounter in our day-to-day 56 | 1:03 57 | aren't quite as easy to Define as a set 58 | 1:05 59 | of rules thankfully for many of those 60 | 1:08 61 | problems we have access to plenty of 62 | 1:10 63 | real life data containing useful 64 | 1:12 65 | information which means that AI can help 66 | 1:14 67 | us find a solution one good example is 68 | 1:17 69 | translating from one language to another 70 | 1:19 71 | writing a set of rules that full encodes 72 | 1:22 73 | all the parallels between two languages 74 | 1:24 75 | is not easy but 
there are many examples 76 | 1:26 77 | of translation online so AI has been 78 | 1:29 79 | able to do a much better job of 80 | 1:30 81 | translation than previous attempts so 82 | 1:33 83 | our first step when we're starting a new 84 | 1:35 85 | project should be to analyze the problem 86 | 1:37 87 | and determine which technique is best to 88 | 1:39 89 | solve it if you're able to obtain plenty 90 | 1:41 91 | of data that contains useful information 92 | 1:43 93 | about your solution then AI is a 94 | 1:46 95 | promising approach once you decided that 96 | 97 | Collect and prepare data 98 | 99 | 1:48 100 | AI is the right method for you you need 101 | 1:50 102 | to collect and prepare your data for 103 | 1:52 104 | example you may need to normalize it or 105 | 1:54 106 | convert it to a different form or remove 107 | 1:56 108 | rows that are missing certain Fields 109 | 1:58 110 | once your data is clean you need to 111 | 2:00 112 | decide about which aspects of your data 113 | 2:03 114 | or features you're going to use as input 115 | 2:05 116 | to your prediction and which feature you 117 | 2:07 118 | want to predict for example if you have 119 | 2:10 120 | medical data you may decide to use 121 | 2:12 122 | features that describe the patient's 123 | 2:14 124 | medical history as input and a chance of 125 | 2:16 126 | a particular disease as the output 127 | 2:18 128 | feature you want to predict 129 | 2:21 130 | and finally you need to split your data 131 | 2:23 132 | into training and test sets a usual 133 | 2:25 134 | split is 80 for your training data and 135 | 2:29 136 | 20 for test 137 | 138 | Train your model 139 | 140 | 2:31 141 | next you need to choose a machine 142 | 2:33 143 | learning algorithm which you'll learn a 144 | 2:35 145 | lot about in the coming videos if you're 146 | 2:37 147 | undecided between a few good algorithms 148 | 2:39 149 | you may want to try them all and see 150 | 2:41 151 | which one performs best 152 | 2:43 153 | then you 
need to train your model using 154 | 2:45 155 | the training set you collected earlier 156 | 2:47 157 | and the algorithm you chose training a 158 | 2:50 159 | model may take a while especially if the 160 | 2:52 161 | model is large 162 | 163 | Evaluate your model 164 | 165 | 2:53 166 | once the model is trained you can test 167 | 2:55 168 | it using the test data set that you 169 | 2:57 170 | split earlier it's important that you 171 | 3:00 172 | test the algorithm with data that it 173 | 3:01 174 | hasn't seen during training to ensure 175 | 3:04 176 | that it generalizes well to new 177 | 3:06 178 | scenarios 179 | 180 | Tune the hyperparameters 181 | 182 | 3:07 183 | some algorithms contain hyper parameters 184 | 3:10 185 | which are settings that control key 186 | 3:11 187 | aspects of their inner workings choosing 188 | 3:14 189 | good hyper parameters is important 190 | 3:16 191 | because they can make a big difference 192 | 3:17 193 | in your results if you want to be 194 | 3:19 195 | systematic about your hyper parameter 196 | 3:21 197 | search you can write code that tries 198 | 3:23 199 | lots of different combinations and helps 200 | 3:26 201 | you discover the best values for your 202 | 3:27 203 | data once you get good test results it's 204 | Test the model in the real world 205 | 3:30 206 | time to see how well your model performs 207 | 3:32 208 | within the context of its intended use 209 | 3:34 210 | for example this could involve 211 | 3:36 212 | collecting live data from a sensor and 213 | 3:38 214 | using it to make predictions or 215 | 3:40 216 | deploying a model to a few users of your 217 | 3:42 218 | application if it all looks good then 219 | 3:45 220 | you're ready to release it to production 221 | 3:46 222 | and enjoy its benefits 223 | 3:49 224 | make sure you watch the next video where 225 | 3:51 226 | we'll start getting Hands-On with 227 | 3:53 228 | machine learning by configuring all the 229 | 3:55 230 | tools we'll use in the rest 
of the 231 | 3:57 232 | series I'll see you there -------------------------------------------------------------------------------- /workshop/dotNET/workshop1/plugins/AnswerPlugin/Summary/config.json: -------------------------------------------------------------------------------- 1 | { 2 | "schema": 1, 3 | "type": "completion", 4 | "description": "Summarize knowledge", 5 | "completion": { 6 | "max_tokens": 3000, 7 | "temperature": 0.6, 8 | "top_p": 0.0, 9 | "presence_penalty": 0.0, 10 | "frequency_penalty": 0.0 11 | } 12 | } -------------------------------------------------------------------------------- /workshop/dotNET/workshop1/plugins/AnswerPlugin/Summary/skprompt.txt: -------------------------------------------------------------------------------- 1 | system: Based on the input content, output a summary that meets the following requirements. 2 | 3 | 1. Summarize the content in no more than 1000 words 4 | 2. Please use plain, understandable language 5 | 6 | user: {{$input}} -------------------------------------------------------------------------------- /workshop/dotNET/workshop1/plugins/FilePlugin/Notes/config.json: -------------------------------------------------------------------------------- 1 | { 2 | "schema": 1, 3 | "description": "Get knowledge from notes", 4 | "execution_settings": { 5 | "default": { 6 | "max_tokens": 13000, 7 | "temperature": 0.6, 8 | "top_p": 0.0, 9 | "presence_penalty": 0.0, 10 | "frequency_penalty": 0.0 11 | } 12 | } 13 | } -------------------------------------------------------------------------------- /workshop/dotNET/workshop1/plugins/FilePlugin/Notes/skprompt.txt: -------------------------------------------------------------------------------- 1 | system: You are an AI teacher who understands Markdown format. Please summarize the input content as required. 2 | 3 | 1. Ignore content related to Pre-lecture, Review & Self Study, and Assignment 4 | 5 | 2.
Use # as the content title; ## or ### are not content titles 6 | 7 | 3. The format of a knowledge point is ---\n## xxx, such as 8 | 9 | --- 10 | ## xxx 11 | 12 | this is a knowledge point; don't use knowledge points as titles 13 | 14 | 4. Summarize each knowledge point. The summary should not exceed 100 words. 15 | 16 | 5. After obtaining and summarizing all knowledge points, synthesize them and summarize the result in no more than 200 words. The output should contain nothing except JSON like this 17 | {"kb": "title", "content": "summarize content"} 18 | 19 | user: {{$input}} 20 | 21 | -------------------------------------------------------------------------------- /workshop/dotNET/workshop1/plugins/FilePlugin/Transcrips/config.json: -------------------------------------------------------------------------------- 1 | { 2 | "schema": 1, 3 | "description": "Get knowledge from a transcript", 4 | "execution_settings": { 5 | "default": { 6 | "max_tokens": 13000, 7 | "temperature": 0.6, 8 | "top_p": 0.0, 9 | "presence_penalty": 0.0, 10 | "frequency_penalty": 0.0 11 | } 12 | } 13 | } -------------------------------------------------------------------------------- /workshop/dotNET/workshop1/plugins/FilePlugin/Transcrips/skprompt.txt: -------------------------------------------------------------------------------- 1 | System: You are an artificial intelligence teacher. Please follow the transcript content. 2 | 3 | Please perform content extraction according to the following conditions 4 | 5 | 6 | 1. There is a different time stamp after the title of each knowledge point. Remove the time-related content and merge the content. 7 | 8 | 2. Ignore the Intro knowledge point and its related content 9 | 10 | 3. Get the title and merged content of each knowledge point, and summarize the content of each knowledge point in 5 tokens 11 | 12 | {"kb": "title", "content": "summarize content of knowledge point in 5 tokens"} 13 | 14 | 4.
The output is a JSON array 15 | 16 | User: {{$input}} -------------------------------------------------------------------------------- /workshop/dotNET/workshop2/docs/session1.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/microsoft/SemanticKernelCookBook/53a5029acc071db8a45e9b0c3df8e4ad06139300/workshop/dotNET/workshop2/docs/session1.pdf -------------------------------------------------------------------------------- /workshop/dotNET/workshop2/genppt_notebook.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": null, 6 | "metadata": {}, 7 | "outputs": [], 8 | "source": [] 9 | } 10 | ], 11 | "metadata": { 12 | "language_info": { 13 | "name": "python" 14 | } 15 | }, 16 | "nbformat": 4, 17 | "nbformat_minor": 2 18 | } 19 | -------------------------------------------------------------------------------- /workshop/dotNET/workshop2/plugins/ToolsPlugin/PPT/config.json: -------------------------------------------------------------------------------- 1 | { 2 | "schema": 1, 3 | "description": "Content Convert", 4 | "execution_settings": { 5 | "default": { 6 | "max_tokens": 12000, 7 | "temperature": 0.7, 8 | "top_p": 0.0, 9 | "presence_penalty": 0.0, 10 | "frequency_penalty": 0.0 11 | } 12 | } 13 | } -------------------------------------------------------------------------------- /workshop/dotNET/workshop2/plugins/ToolsPlugin/PPT/skprompt.txt: -------------------------------------------------------------------------------- 1 | Generate the content of each PPT slide from the input according to the following conditions, using --- as the separator. The content of each slide is independent; no content spans slides. 2 | 3 | 1. Extract each section name and output it as a standalone slide, e.g. 4 | 5 | ## **第一节 xxxxx** 6 | 7 | --- 8 | 9 | 2.
Within each section, knowledge points are separated by Chinese numerals followed by the enumeration mark 、. Output each knowledge point as one slide, and summarize the knowledge content corresponding to each knowledge point as another slide, e.g. 10 | 11 | ### **一、xxxxx** 12 | 13 | --- 14 | 15 | ### **二、xxxxx** 16 | 17 | --- 18 | 19 | 20 | If the knowledge content corresponding to a knowledge point contains sub-points such as (一), (二), (三), (四), output each sub-point as its own slide, and the content corresponding to that sub-point must stay on the same slide, e.g. 21 | 22 | (一) xxxxx 23 | 24 | ....... 25 | 26 | 27 | 28 | (二) xxxxx 29 | 30 | ....... 31 | 32 | 33 | (三) xxxxx 34 | 35 | ....... 36 | 37 | 38 | (四) xxxxx 39 | 40 | ....... 41 | 42 | 43 | The output is 44 | 45 | ### **(一) xxxxx** 46 | 47 | ....... 48 | 49 | --- 50 | 51 | ### **(二) xxxxx** 52 | 53 | ....... 54 | 55 | --- 56 | 57 | ### **(三) xxxxx** 58 | 59 | ....... 60 | 61 | --- 62 | 63 | ### **(四) xxxxx** 64 | 65 | ....... 66 | 67 | --- 68 | 69 | 70 | 71 | 3. If there are no sub-points, output the content on a single slide, e.g. 72 | 73 | xxxxx 74 | 75 | ....... 76 | 77 | The output is 78 | 79 | 80 | xxxxx 81 | 82 | ....... 83 | 84 | --- 85 | 86 | 87 | user: {{$input}} -------------------------------------------------------------------------------- /workshop/dotNET/workshop3/dotNETAgent/SKAIAgent.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": null, 6 | "metadata": { 7 | "dotnet_interactive": { 8 | "language": "csharp" 9 | }, 10 | "polyglot_notebook": { 11 | "kernelName": "csharp" 12 | }, 13 | "vscode": { 14 | "languageId": "polyglot-notebook" 15 | } 16 | }, 17 | "outputs": [], 18 | "source": [ 19 | "#r \"nuget: Microsoft.SemanticKernel, *-*\"\n", 20 | "#r \"nuget: Microsoft.SemanticKernel.Experimental.Agents, *-*\"" 21 | ] 22 | }, 23 | { 24 | "cell_type": "code", 25 | "execution_count": null, 26 | "metadata": { 27 | "dotnet_interactive": { 28 | "language": "csharp" 29 | }, 30 | "polyglot_notebook": { 31 | "kernelName": "csharp" 32 | }, 33 | "vscode": { 34 | "languageId": "polyglot-notebook" 35 | } 36 | }, 37 | "outputs": [], 38 | "source": [ 39 | "using Microsoft.SemanticKernel;\n", 40 | "using Microsoft.SemanticKernel.Experimental.Agents;\n", 41 | "using
Microsoft.SemanticKernel.Connectors.OpenAI;" 42 | ] 43 | }, 44 | { 45 | "cell_type": "code", 46 | "execution_count": null, 47 | "metadata": { 48 | "dotnet_interactive": { 49 | "language": "csharp" 50 | }, 51 | "polyglot_notebook": { 52 | "kernelName": "csharp" 53 | }, 54 | "vscode": { 55 | "languageId": "polyglot-notebook" 56 | } 57 | }, 58 | "outputs": [], 59 | "source": [ 60 | "#!import ./Utils/Settings.cs" 61 | ] 62 | }, 63 | { 64 | "cell_type": "code", 65 | "execution_count": null, 66 | "metadata": { 67 | "dotnet_interactive": { 68 | "language": "csharp" 69 | }, 70 | "polyglot_notebook": { 71 | "kernelName": "csharp" 72 | }, 73 | "vscode": { 74 | "languageId": "polyglot-notebook" 75 | } 76 | }, 77 | "outputs": [], 78 | "source": [ 79 | "using System;\n", 80 | "using System.Collections.Generic;\n", 81 | "using System.Diagnostics;\n", 82 | "using System.IO;\n", 83 | "using System.Net.Http;\n", 84 | "using System.Text.Json;\n", 85 | "using System.Threading.Tasks;" 86 | ] 87 | }, 88 | { 89 | "cell_type": "code", 90 | "execution_count": null, 91 | "metadata": { 92 | "dotnet_interactive": { 93 | "language": "csharp" 94 | }, 95 | "polyglot_notebook": { 96 | "kernelName": "csharp" 97 | }, 98 | "vscode": { 99 | "languageId": "polyglot-notebook" 100 | } 101 | }, 102 | "outputs": [], 103 | "source": [ 104 | "#!import ./plugins/CustomPlugin/GenCLIPlugin.cs\n", 105 | "#!import ./plugins/CustomPlugin/RunPlugin.cs" 106 | ] 107 | }, 108 | { 109 | "cell_type": "code", 110 | "execution_count": null, 111 | "metadata": { 112 | "dotnet_interactive": { 113 | "language": "csharp" 114 | }, 115 | "polyglot_notebook": { 116 | "kernelName": "csharp" 117 | }, 118 | "vscode": { 119 | "languageId": "polyglot-notebook" 120 | } 121 | }, 122 | "outputs": [], 123 | "source": [ 124 | "using System;\n", 125 | "using System.Collections.Generic;\n", 126 | "using System.Linq;\n", 127 | "using System.Threading.Tasks;" 128 | ] 129 | }, 130 | { 131 | "cell_type": "code", 132 | "execution_count": null, 
133 | "metadata": { 134 | "dotnet_interactive": { 135 | "language": "csharp" 136 | }, 137 | "polyglot_notebook": { 138 | "kernelName": "csharp" 139 | }, 140 | "vscode": { 141 | "languageId": "polyglot-notebook" 142 | } 143 | }, 144 | "outputs": [], 145 | "source": [ 146 | "#pragma warning disable SKEXP0101\n", 147 | "\n", 148 | "List<IAgent> s_agents = new();" 149 | ] 150 | }, 151 | { 152 | "cell_type": "code", 153 | "execution_count": null, 154 | "metadata": { 155 | "dotnet_interactive": { 156 | "language": "csharp" 157 | }, 158 | "polyglot_notebook": { 159 | "kernelName": "csharp" 160 | }, 161 | "vscode": { 162 | "languageId": "polyglot-notebook" 163 | } 164 | }, 165 | "outputs": [], 166 | "source": [ 167 | "var cli_plugin = KernelPluginFactory.CreateFromType<GenCLIPlugin>();\n", 168 | "var run_plugin = KernelPluginFactory.CreateFromType<RunPlugin>();" 169 | ] 170 | }, 171 | { 172 | "cell_type": "code", 173 | "execution_count": null, 174 | "metadata": { 175 | "dotnet_interactive": { 176 | "language": "csharp" 177 | }, 178 | "polyglot_notebook": { 179 | "kernelName": "csharp" 180 | }, 181 | "vscode": { 182 | "languageId": "polyglot-notebook" 183 | } 184 | }, 185 | "outputs": [], 186 | "source": [ 187 | "#pragma warning disable SKEXP0101\n", 188 | "\n", 189 | "var cliAgent = await new AgentBuilder()\n", 190 | " .WithOpenAIChatCompletion(\"gpt-4-turbo-preview\", Settings.OpenAIKey)\n", 191 | " .WithInstructions(\"You are a cliAgent. You can generate a .NET CLI command from a request
and pass the generated .NET CLI command to runPlugin\")\n", 192 | " .WithName(\"cliAgent\")\n", 193 | " .WithDescription(\"cliAgent\")\n", 194 | " .WithPlugin(cli_plugin)\n", 195 | " .BuildAsync();" 196 | ] 197 | }, 198 | { 199 | "cell_type": "code", 200 | "execution_count": null, 201 | "metadata": { 202 | "dotnet_interactive": { 203 | "language": "csharp" 204 | }, 205 | "polyglot_notebook": { 206 | "kernelName": "csharp" 207 | }, 208 | "vscode": { 209 | "languageId": "polyglot-notebook" 210 | } 211 | }, 212 | "outputs": [], 213 | "source": [ 214 | "#pragma warning disable SKEXP0101\n", 215 | "\n", 216 | "var runAgent = await new AgentBuilder()\n", 217 | " .WithOpenAIChatCompletion(\"gpt-4-turbo-preview\", Settings.OpenAIKey)\n", 218 | " .WithInstructions(\"You are a runAgent. Execute the .NET CLI script generated by cliAgent\")\n", 219 | " .WithName(\"runAgent\")\n", 220 | " .WithDescription(\"runAgent\")\n", 221 | " .WithPlugin(run_plugin)\n", 222 | " .BuildAsync();" 223 | ] 224 | }, 225 | { 226 | "cell_type": "code", 227 | "execution_count": null, 228 | "metadata": { 229 | "dotnet_interactive": { 230 | "language": "csharp" 231 | }, 232 | "polyglot_notebook": { 233 | "kernelName": "csharp" 234 | }, 235 | "vscode": { 236 | "languageId": "polyglot-notebook" 237 | } 238 | }, 239 | "outputs": [], 240 | "source": [ 241 | "#pragma warning disable SKEXP0101\n", 242 | "\n", 243 | "var dotNETAgent = await new AgentBuilder()\n", 244 | " .WithOpenAIChatCompletion(\"gpt-4-turbo-preview\", Settings.OpenAIKey)\n", 245 | " .WithInstructions(\"You are a dotNETAgent.
Ask cliAgent to generate a .NET CLI script from the request, then use runAgent to execute the .NET CLI script generated by cliAgent\")\n", 246 | " .WithPlugin(cliAgent.AsPlugin())\n", 247 | " .WithPlugin(runAgent.AsPlugin())\n", 248 | " .BuildAsync();" 249 | ] 250 | }, 251 | { 252 | "cell_type": "code", 253 | "execution_count": null, 254 | "metadata": { 255 | "dotnet_interactive": { 256 | "language": "csharp" 257 | }, 258 | "polyglot_notebook": { 259 | "kernelName": "csharp" 260 | }, 261 | "vscode": { 262 | "languageId": "polyglot-notebook" 263 | } 264 | }, 265 | "outputs": [], 266 | "source": [ 267 | "var messages = new string[]\n", 268 | " {\n", 269 | " \"setup a new project - HelloDemoApp\",\n", 270 | " \"compile this project - HelloDemoApp\",\n", 271 | " \"run this project - HelloDemoApp\",\n", 272 | " };" 273 | ] 274 | }, 275 | { 276 | "cell_type": "code", 277 | "execution_count": null, 278 | "metadata": { 279 | "dotnet_interactive": { 280 | "language": "csharp" 281 | }, 282 | "polyglot_notebook": { 283 | "kernelName": "csharp" 284 | }, 285 | "vscode": { 286 | "languageId": "polyglot-notebook" 287 | } 288 | }, 289 | "outputs": [], 290 | "source": [ 291 | "#pragma warning disable SKEXP0101\n", 292 | "\n", 293 | "var thread = await dotNETAgent.NewThreadAsync();" 294 | ] 295 | }, 296 | { 297 | "cell_type": "code", 298 | "execution_count": null, 299 | "metadata": { 300 | "dotnet_interactive": { 301 | "language": "csharp" 302 | }, 303 | "polyglot_notebook": { 304 | "kernelName": "csharp" 305 | }, 306 | "vscode": { 307 | "languageId": "polyglot-notebook" 308 | } 309 | }, 310 | "outputs": [], 311 | "source": [ 312 | "#pragma warning disable SKEXP0101\n", 313 | "\n", 314 | "foreach (var response in messages.Select(m => thread.InvokeAsync(dotNETAgent, m)))\n", 315 | "{\n", 316 | " await foreach (var message in response)\n", 317 | " {\n", 318 | " Console.WriteLine($\"[{message.Id}]\");\n", 319 | " Console.WriteLine($\"# {message.Role}: {message.Content}\");\n", 320 | "
}\n", 321 | "}" 322 | ] 323 | } 324 | ], 325 | "metadata": { 326 | "language_info": { 327 | "name": "python" 328 | } 329 | }, 330 | "nbformat": 4, 331 | "nbformat_minor": 2 332 | } 333 | -------------------------------------------------------------------------------- /workshop/dotNET/workshop3/dotNETAgent/Utils/Settings.cs: -------------------------------------------------------------------------------- 1 | public class Settings 2 | { 3 | public static string AOAIEndpoint = "Your Azure OpenAI Service Endpoint"; 4 | public static string AOAIKey = "Your Azure OpenAI Service Key"; 5 | public static string AOAIModel = "gpt-4-32k"; 6 | 7 | public static string OpenAIKey = "Your OpenAI Key"; 8 | } -------------------------------------------------------------------------------- /workshop/dotNET/workshop3/dotNETAgent/plugins/CodePlugin/dotNET/config.json: -------------------------------------------------------------------------------- 1 | { 2 | "schema": 1, 3 | "type": "completion", 4 | "description": "Generate dotNET CLI", 5 | "completion": { 6 | "max_tokens": 5000, 7 | "temperature": 0.0, 8 | "top_p": 1.0, 9 | "presence_penalty": 0.0, 10 | "frequency_penalty": 0.0 11 | }, 12 | "input": { 13 | "parameters": [ 14 | { 15 | "name": "request", 16 | "description": "request", 17 | "defaultValue": "" 18 | }, 19 | { 20 | "name": "project", 21 | "description": "project name", 22 | "defaultValue": "workspace/helloworld" 23 | } 24 | ] 25 | }, 26 | "output": { 27 | "parameters": [ 28 | { 29 | "name": "cli", 30 | "description": "cli", 31 | "defaultValue": "" 32 | } 33 | ] 34 | } 35 | } -------------------------------------------------------------------------------- /workshop/dotNET/workshop3/dotNETAgent/plugins/CodePlugin/dotNET/skprompt.txt: -------------------------------------------------------------------------------- 1 | System: 2 | 3 | You are a .NET expert and help me generate .NET command-line instructions based on {{$request}}. 4 | 5 | 1.
The {{$request}} format is xxx - bbb, where bbb is the project name and xxx is the requirement 6 | 7 | 2. The output contains only commands, with no explanatory text, in markdown format, such as 8 | 9 | ```bash 10 | {{$cli}} 11 | ``` 12 | 13 | 3. Only generate .NET CLI scripts for creating, compiling, and running projects. Do not generate code-related scripts or anything else. 14 | 15 | 4. The compile path format is ./project/project.csproj; for compiling, output only: 16 | 17 | ```bash 18 | dotnet build ./project/project.csproj 19 | ``` 20 | 21 | 5. For setting up a new project, output only: 22 | 23 | ```bash 24 | dotnet new console -o project 25 | ``` 26 | 27 | 6. For running a project, output only: 28 | 29 | ```bash 30 | dotnet run --project ./project 31 | ``` 32 | 33 | Here are some samples: 34 | 35 | 1. If {{$request}} is a setup requirement, such as "set up a new project - WorksDemoApp", the output is only 36 | 37 | ```bash 38 | dotnet new console -o WorksDemoApp 39 | ``` 40 | 41 | 2. When {{$request}} is a compile requirement, such as "compile this project - WorksDemoApp", the output is only: 42 | 43 | ```bash 44 | dotnet build ./WorksDemoApp/WorksDemoApp.csproj 45 | ``` 46 | 47 | Relevant project paths must be added 48 | 49 | 3.
When {{$request}} is a run requirement, such as "run this project - WorksDemoApp", the output is only: 50 | 51 | ```bash 52 | dotnet run --project ./WorksDemoApp 53 | ``` 54 | 55 | user: 56 | {{$request}} -------------------------------------------------------------------------------- /workshop/dotNET/workshop3/dotNETAgent/plugins/CustomPlugin/GenCLIPlugin.cs: -------------------------------------------------------------------------------- 1 | using System.ComponentModel; 2 | using System.Globalization; 3 | using System.Threading.Tasks; 4 | using System.IO; 5 | using Microsoft.SemanticKernel; 6 | using Microsoft.SemanticKernel.Connectors.OpenAI; 7 | 8 | 9 | public class GenCLIPlugin 10 | { 11 | [KernelFunction, Description("cli generation")] 12 | public async Task<string> GenDotNETCLI(string request) 13 | { 14 | Kernel kernel = Kernel.CreateBuilder() 15 | .AddAzureOpenAIChatCompletion("gpt-4-32k", Settings.AOAIEndpoint, Settings.AOAIKey) 16 | .Build(); 17 | 18 | var pluginDirectory = Path.Combine("./", "plugins"); 19 | 20 | // Load the prompt-based CodePlugin that turns a request into a .NET CLI command 21 | var code_plugin = kernel.CreatePluginFromPromptDirectory(Path.Combine(pluginDirectory, "CodePlugin")); 22 | 23 | 24 | 25 | 26 | 27 | var code = await kernel.InvokeAsync(code_plugin["dotNET"], new() { ["request"] = request }); 28 | 29 | 30 | Console.WriteLine("--------------------------------------------------------"); 31 | Console.WriteLine("---------------------Gen dotNET CLI---------------------"); 32 | Console.WriteLine(code); 33 | Console.WriteLine("--------------------------------------------------------"); 34 | 35 | return code.ToString(); 36 | } 37 | } -------------------------------------------------------------------------------- /workshop/dotNET/workshop3/dotNETAgent/plugins/CustomPlugin/RunPlugin.cs: --------------------------------------------------------------------------------
1 | using System.ComponentModel; 2 | using System.Globalization; 3 | using System.Threading.Tasks; 4 | using System.Diagnostics; 5 | 6 | using Microsoft.SemanticKernel; 7 | 8 | 9 | public class RunPlugin 10 | { 11 | [KernelFunction, Description("running code")] 12 | public string RunDotNETCLI(string cli) 13 | { 14 | 15 | Console.WriteLine("--------------------------------------------------------"); 16 | Console.WriteLine("----------------Running dotNET CLI---------------------"); 17 | Console.WriteLine(cli); 18 | Console.WriteLine("--------------------------------------------------------"); 19 | 20 | using (Process process = new Process()) 21 | { 22 | process.StartInfo.FileName = "dotnet"; 23 | process.StartInfo.Arguments = cli.Replace("dotnet", "").Replace("```bash", "").Replace("```", "").Trim(); // e.g. "run --project ./workspace/helloworld" 24 | process.StartInfo.UseShellExecute = false; 25 | process.StartInfo.RedirectStandardOutput = true; 26 | process.StartInfo.RedirectStandardError = true; 27 | process.Start(); 28 | 29 | string output = process.StandardOutput.ReadToEnd(); 30 | process.WaitForExit(); 31 | return output; 32 | } 33 | } 34 | 35 | } -------------------------------------------------------------------------------- /workshop/dotNET/workshop3/dotNETAgent/plugins/QAPlugin/qa/config.json: -------------------------------------------------------------------------------- 1 | { 2 | "schema": 1, 3 | "type": "completion", 4 | "description": "read qa", 5 | "completion": { 6 | "max_tokens": 5000, 7 | "temperature": 0.2, 8 | "top_p": 1.0, 9 | "presence_penalty": 0.0, 10 | "frequency_penalty": 0.0 11 | }, 12 | "input": { 13 | "parameters": [ 14 | { 15 | "name": "input", 16 | "description": "request", 17 | "defaultValue": "" 18 | } 19 | ] 20 | } 21 | } -------------------------------------------------------------------------------- /workshop/dotNET/workshop3/dotNETAgent/plugins/QAPlugin/qa/skprompt.txt: -------------------------------------------------------------------------------- 1 |
System: 2 | 3 | You are a QA worker who reads the requirements in full. You need to separate the goal from the project name and pass the parameters to the command-line generation tool, such as 4 | 5 | 1. create new project - XXXXX, the parameter value of request is "create new project", the parameter value of project is "XXXXX" 6 | 2. compile project - XXXXX, the parameter value of request is "compile project ", the parameter value of project is "XXXXX" 7 | 3. run project - XXXXX, the parameter value of request is "run project ", the parameter value of project is "XXXXX" 8 | 9 | user: 10 | {{$input}} -------------------------------------------------------------------------------- /workshop/java/workshop1/data/transcripts/01-introduction.txt: -------------------------------------------------------------------------------- 1 | Intro 2 | 3 | 0:08 4 | hello and welcome to this course on 5 | 0:11 6 | classical machine learning for beginners 7 | 0:13 8 | whether you're completely new to the 9 | 0:15 10 | topic or an experienced ml practitioner 11 | 0:17 12 | looking to brush up on an area we're 13 | 0:19 14 | happy to have you join us this course is 15 | 16 | Introducing ML for Beginners 17 | 18 | 0:22 19 | based on the free open source 26 lesson 20 | 0:25 21 | ml for beginners curriculum from 22 | 0:27 23 | Microsoft which can be found at AKA dot 24 | 0:30 25 | Ms slash ml-beginners machine learning 26 | 0:34 27 | is one of the most popular Technologies 28 | 0:36 29 | these days I'm sure you've heard this 30 | 0:38 31 | term if you have any sort of familiarity 32 | 0:41 33 | with technology no matter what domain 34 | 0:42 35 | you work in however the mechanics of 36 | 0:45 37 | machine learning are a mystery to most 38 | 0:47 39 | people and the subject can sometimes 40 | 0:49 41 | feel overwhelming in this course you'll 42 | 0:51 43 | start right from the beginning and 44 | 0:53 45 | you'll learn about it step by step to 46 | 0:56 47 | practical Hands-On coding examples
let's 48 | 49 | The difference between AI and ML 50 | 51 | 0:58 52 | start by talking about the difference 53 | 1:00 54 | between artificial intelligence and 55 | 1:03 56 | machine learning AI is a science of 57 | 1:05 58 | getting machines to accomplish tasks 59 | 1:07 60 | that typically require human level 61 | 1:09 62 | intelligence many different techniques 63 | 1:12 64 | have been proposed for AI but the most 65 | 1:14 66 | successful and popular approach these 67 | 1:16 68 | days is machine learning 69 | 1:19 70 | unlike other AI techniques ml uses 71 | 1:22 72 | specialized algorithms to make decisions 73 | 1:24 74 | by learning from data so machine 75 | 1:28 76 | learning is really a subset of 77 | 1:30 78 | artificial intelligence 79 | 1:32 80 | you've also probably heard of deep 81 | 1:33 82 | learning which is a subset of machine 83 | 1:35 84 | learning that relies on neural networks 85 | 1:38 86 | to learn from data in this course we're 87 | 88 | What you'll learn in this course 89 | 90 | 1:40 91 | going to cover what we call classical 92 | 1:42 93 | machine learning you'll learn some Core 94 | 1:45 95 | Concepts of ml a bit of History 96 | 1:48 97 | statistical techniques like regression 98 | 1:50 99 | classification clustering and more 100 | 1:54 101 | the concepts you'll learn here will 102 | 1:55 103 | serve you well as you progress to more 104 | 1:57 105 | 106 | Advanced Techniques 107 | 108 | What you won't learn in this course 109 | 1:59 110 | keep in mind that this course won't 111 | 2:00 112 | cover data science deep learning neural 113 | 2:03 114 | networks and AI techniques other than ml 115 | 2:06 116 | Microsoft offers two additional courses 117 | 2:09 118 | for you to learn more about these areas 119 | 2:10 120 | data science for beginners available at 121 | 2:13 122 | AKA dot Ms slash data science beginners 123 | 2:16 124 | and AI for beginners available at aka.ms 125 | 2:20 126 | Ai and beginners machine learning is a 127 | 128 | Why 
study Machine Learning 129 | 130 | 2:23 131 | Hot Topic because it's solving complex 132 | 2:25 133 | real-world problems in so many areas 134 | 2:28 135 | Finance earth science space exploration 136 | 2:31 137 | cognitive science and many more Fields 138 | 2:33 139 | have adopted machine learning to solve 140 | 2:35 141 | problems specific to their domains for 142 | 2:39 143 | example you can use machine learning to 144 | 2:41 145 | predict the likelihood of disease from a 146 | 2:43 147 | patient's medical history 148 | 2:45 149 | to anticipate weather events to 150 | 2:48 151 | understand the sentiment of a text and 152 | 2:50 153 | to detect fake news and stop the spread 154 | 2:52 155 | of propaganda applications of machine 156 | 2:54 157 | learning are almost everywhere and are 158 | 2:57 159 | as ubiquitous as the data that is 160 | 2:59 161 | Flowing from our devices and systems 162 | 3:01 163 | because of how useful it is 164 | 3:03 165 | understanding the basics of machine 166 | 3:04 167 | learning is going to help you no matter 168 | 3:06 169 | what domain you're coming from in the 170 | 3:09 171 | next video in the series I'll give an 172 | 3:10 173 | overview of the history of ml I'll see 174 | 3:13 175 | you there -------------------------------------------------------------------------------- /workshop/java/workshop1/data/transcripts/02-history-of-ml.txt: -------------------------------------------------------------------------------- 1 | Intro 2 | 3 | 0:08 4 | in this video I want to give you a bit 5 | 0:11 6 | of Background by walking through the 7 | 0:13 8 | major milestones in the history of 9 | 0:14 10 | machine learning and artificial 11 | 0:16 12 | intelligence 13 | 0:17 14 | the modern world of artificial 15 | 16 | Alan Turing and the Turing test 17 | 18 | 0:19 19 | intelligence really begun in 1950s 20 | 0:21 21 | though it's based on mathematical and 22 | 0:24 23 | statistical developments over many 24 | 0:26 25 | centuries Alan Turing is 
credited with 26 | 0:28 27 | helping to lay the foundation for the 28 | 0:30 29 | concept of a machine they can think 30 | 0:33 31 | in his quest to Define machine 32 | 0:35 33 | intelligence he achieved a crucial 34 | 0:37 35 | Milestone by creating the Turing test in 36 | 0:40 37 | 1950. in this test an interrogator 38 | 0:43 39 | questions both a human and a computer 40 | 0:45 41 | and tries to determine which one is 42 | 0:47 43 | which 44 | 0:48 45 | if the interrogator cannot tell the 46 | 0:50 47 | difference then the computer can be 48 | 0:52 49 | considered intelligent in 1956 the term 50 | 51 | The Dartmouth Summer Research Project on AI 52 | 53 | 0:55 54 | artificial intelligence was coined with 55 | 0:57 56 | a small group of scientists gathered at 57 | 1:00 58 | Dartmouth College in the U.S for an 59 | 1:02 60 | event called the summer research project 61 | 1:04 62 | on artificial intelligence 63 | 1:06 64 | this conference was the birth of the 65 | 1:09 66 | field of research we know as AI the 67 | 1:11 68 | years from 1956 to 1974 are known as The 69 | 70 | The golden years of AI 71 | 72 | 1:15 73 | Golden Ears of AI optimism ran high in 74 | 1:19 75 | the hope that AI could solve many 76 | 1:21 77 | problems in 1967 Marvin Minsky the 78 | 1:24 79 | co-founder of the mitai lab stated 80 | 1:27 81 | confidently and incorrectly that within 82 | 1:30 83 | a generation the problem of creating 84 | 1:32 85 | artificial intelligence will 86 | 1:34 87 | substantially be solved 88 | 1:36 89 | natural language processing research 90 | 1:39 91 | flourished search was refined and made 92 | 1:41 93 | more powerful and the concept of micro 94 | 1:44 95 | worlds was created where simple tasks 96 | 1:46 97 | were completed using plain language 98 | 1:48 99 | instructions 100 | 1:50 101 | research was well funded by government 102 | 1:51 103 | agencies advances were made in 104 | 1:54 105 | computation and algorithms and 106 | 1:56 107 | prototypes of intelligent 
machines were 108 | 1:58 109 | built 110 | 1:59 111 | some of these machines include shaky the 112 | 2:01 113 | robot who could maneuver and decide how 114 | 2:04 115 | to perform tasks intelligently Eliza and 116 | 2:07 117 | gnarly charabot that could converse with 118 | 2:10 119 | people and act as a primitive therapist 120 | 2:12 121 | Blocksworld an example of a micro world 122 | 2:15 123 | where blogs get could be stacked and 124 | 2:17 125 | sorted and decision-making experiments 126 | 2:19 127 | could be tested by the mid-1970s it had 128 | 129 | The AI winter 130 | 131 | 2:22 132 | become apparent that the complexity of 133 | 2:24 134 | making intelligent machines had been 135 | 2:26 136 | understated and that its promise had 137 | 2:28 138 | been overblown compute power was too 139 | 2:31 140 | limited there was a lack of data to 141 | 2:33 142 | train and test AIS and there were 143 | 2:35 144 | questions around the ethics of 145 | 2:37 146 | introducing AI systems like the 147 | 2:39 148 | therapist Eliza into society 149 | 2:42 150 | funding dried up and confidence in the 151 | 2:44 152 | field slowed marking the beginning of 153 | 2:46 154 | what is called an AI winter in the 1980s 155 | 2:49 156 | as computers became more powerful expert 157 | 158 | Resurgence and fall of AI for expert systems 159 | 160 | 2:52 161 | systems became more successful there was 162 | 2:55 163 | a Resurgence in optimism about AI as 164 | 2:58 165 | businesses found practical applications 166 | 3:00 167 | of these rule-based inference systems by 168 | 3:03 169 | the late 80s it was becoming apparent 170 | 3:05 171 | that expert systems had become too 172 | 3:07 173 | specialized and were unlikely to achieve 174 | 3:10 175 | machine intelligence the rise of 176 | 3:12 177 | personal computers also competed with 178 | 3:15 179 | these large specialized centralized 180 | 3:16 181 | systems this led to another chill in the 182 | 3:19 183 | AI field things began to change in the 
184 | 3:22 185 | mid-1990s as compute and storage 186 | 3:24 187 | capabilities grew exponentially making 188 | 3:27 189 | it possible to process much larger data 190 | 191 | Growth in AI driven by more data and more powerful hardware 192 | 193 | 3:30 194 | sets than ever before 195 | 3:32 196 | the rise of the internet and the 197 | 3:34 198 | popularity of smartphones both 199 | 3:36 200 | contributed to increasing amounts of 201 | 3:38 202 | data and new experiments in machine 203 | 3:39 204 | learning became possible 205 | 3:42 206 | throughout the 2000s significant 207 | 3:44 208 | advancements were made in computer 209 | 3:45 210 | vision and natural language processing 211 | 3:47 212 | by training machine learning models on 213 | 3:50 214 | 215 | Big Data 216 | 217 | 3:52 218 | in the past decade compute power and the 219 | 3:55 220 | size of data sets have continued to grow 221 | 3:57 222 | and machine learning has become capable 223 | 4:00 224 | of solving even more problems as a 225 | 4:02 226 | result today machine learning touches 227 | 4:04 228 | almost every part of our Lives sometimes 229 | 4:07 230 | we're well aware of it like when we 231 | 4:09 232 | interact with ChatGPT in the browser or 233 | 4:12 234 | see a self-driving car go by 235 | 4:14 236 | but most of the time it's seamlessly 237 | 4:16 238 | woven into familiar experiences of our 239 | 4:19 240 | everyday life such as when we're 241 | 4:21 242 | approved for a new loan or get a catalog 243 | 4:23 244 | at home 245 | 246 | Increased awareness of ethical and responsible AI 247 | 248 | 4:25 249 | this era has also been marked by an 250 | 4:27 251 | increased awareness of potential ethical 252 | 4:29 253 | issues in machine learning and by 254 | 4:31 255 | significant research in the field of 256 | 4:33 257 | responsible AI 258 | 4:34 259 | we want the benefits of AI but we also 260 | 4:37 261 | want AI that is responsible and doesn't 262 | 4:39 263 | amplify human bias 264 | 4:42 265 | in 
the next video we will introduce 266 | 4:44 267 | techniques for building using and 268 | 4:46 269 | maintaining machine learning models I'll 270 | 4:48 271 | see you there -------------------------------------------------------------------------------- /workshop/java/workshop1/data/transcripts/03-techniques-ml.txt: -------------------------------------------------------------------------------- 1 | Intro 2 | 3 | 0:08 4 | the process of building machine learning 5 | 0:10 6 | models is very different from any other 7 | 0:13 8 | development workflow in this video 9 | 0:15 10 | you'll learn about that process more 11 | 0:18 12 | specifically you learn about deciding 13 | 0:20 14 | whether AI is the right approach for 15 | 0:22 16 | your problem collecting and preparing 17 | 0:24 18 | your data 19 | 0:25 20 | training your model evaluating your 21 | 0:28 22 | model tuning the hyper parameters and 23 | 0:30 24 | testing the trained model in the real 25 | 0:32 26 | world 27 | 28 | Decide if AI is the right approach 29 | 30 | 0:34 31 | traditional software is well suited to 32 | 0:36 33 | solve problems where the solution can be 34 | 0:39 35 | described as a formal set of rules in 36 | 0:42 37 | contrast AI shines in solving problems 38 | 0:44 39 | where the solution can be extracted from 40 | 0:46 41 | data many of the problems we encountered 42 | 0:49 43 | in our daily life can be efficiently 44 | 0:51 45 | solved with traditional programming if 46 | 0:53 47 | an engineer can break up the solution of 48 | 0:55 49 | a problem and Define it using precise 50 | 0:57 51 | rules then traditional programming is a 52 | 0:59 53 | great tool to use but many of the 54 | 1:01 55 | problems we encounter in our day-to-day 56 | 1:03 57 | aren't quite as easy to Define as a set 58 | 1:05 59 | of rules thankfully for many of those 60 | 1:08 61 | problems we have access to plenty of 62 | 1:10 63 | real life data containing useful 64 | 1:12 65 | information which means that AI can help 66 
| 1:14 67 | us find a solution one good example is 68 | 1:17 69 | translating from one language to another 70 | 1:19 71 | writing a set of rules that fully encodes 72 | 1:22 73 | all the parallels between two languages 74 | 1:24 75 | is not easy but there are many examples 76 | 1:26 77 | of translation online so AI has been 78 | 1:29 79 | able to do a much better job of 80 | 1:30 81 | translation than previous attempts so 82 | 1:33 83 | our first step when we're starting a new 84 | 1:35 85 | project should be to analyze the problem 86 | 1:37 87 | and determine which technique is best to 88 | 1:39 89 | solve it if you're able to obtain plenty 90 | 1:41 91 | of data that contains useful information 92 | 1:43 93 | about your solution then AI is a 94 | 1:46 95 | promising approach once you've decided that 96 | 97 | Collect and prepare data 98 | 99 | 1:48 100 | AI is the right method for you you need 101 | 1:50 102 | to collect and prepare your data for 103 | 1:52 104 | example you may need to normalize it or 105 | 1:54 106 | convert it to a different form or remove 107 | 1:56 108 | rows that are missing certain Fields 109 | 1:58 110 | once your data is clean you need to 111 | 2:00 112 | decide about which aspects of your data 113 | 2:03 114 | or features you're going to use as input 115 | 2:05 116 | to your prediction and which feature you 117 | 2:07 118 | want to predict for example if you have 119 | 2:10 120 | medical data you may decide to use 121 | 2:12 122 | features that describe the patient's 123 | 2:14 124 | medical history as input and a chance of 125 | 2:16 126 | a particular disease as the output 127 | 2:18 128 | feature you want to predict 129 | 2:21 130 | and finally you need to split your data 131 | 2:23 132 | into training and test sets a usual 133 | 2:25 134 | split is 80% for your training data and 135 | 2:29 136 | 20% for test 137 | 138 | Train your model 139 | 140 | 2:31 141 | next you need to choose a machine 142 | 2:33 143 | learning algorithm which 
you'll learn a 144 | 2:35 145 | lot about in the coming videos if you're 146 | 2:37 147 | undecided between a few good algorithms 148 | 2:39 149 | you may want to try them all and see 150 | 2:41 151 | which one performs best 152 | 2:43 153 | then you need to train your model using 154 | 2:45 155 | the training set you collected earlier 156 | 2:47 157 | and the algorithm you chose training a 158 | 2:50 159 | model may take a while especially if the 160 | 2:52 161 | model is large 162 | 163 | Evaluate your model 164 | 165 | 2:53 166 | once the model is trained you can test 167 | 2:55 168 | it using the test data set that you 169 | 2:57 170 | split earlier it's important that you 171 | 3:00 172 | test the algorithm with data that it 173 | 3:01 174 | hasn't seen during training to ensure 175 | 3:04 176 | that it generalizes well to new 177 | 3:06 178 | scenarios 179 | 180 | Tune the hyperparameters 181 | 182 | 3:07 183 | some algorithms contain hyper parameters 184 | 3:10 185 | which are settings that control key 186 | 3:11 187 | aspects of their inner workings choosing 188 | 3:14 189 | good hyper parameters is important 190 | 3:16 191 | because they can make a big difference 192 | 3:17 193 | in your results if you want to be 194 | 3:19 195 | systematic about your hyper parameter 196 | 3:21 197 | search you can write code that tries 198 | 3:23 199 | lots of different combinations and helps 200 | 3:26 201 | you discover the best values for your 202 | 3:27 203 | data once you get good test results it's 204 | Test the model in the real world 205 | 3:30 206 | time to see how well your model performs 207 | 3:32 208 | within the context of its intended use 209 | 3:34 210 | for example this could involve 211 | 3:36 212 | collecting live data from a sensor and 213 | 3:38 214 | using it to make predictions or 215 | 3:40 216 | deploying a model to a few users of your 217 | 3:42 218 | application if it all looks good then 219 | 3:45 220 | you're ready to release it to production 
221 | 3:46 222 | and enjoy its benefits 223 | 3:49 224 | make sure you watch the next video where 225 | 3:51 226 | we'll start getting Hands-On with 227 | 3:53 228 | machine learning by configuring all the 229 | 3:55 230 | tools we'll use in the rest of the 231 | 3:57 232 | series I'll see you there -------------------------------------------------------------------------------- /workshop/java/workshop1/plugins/AnswerPlugin/Summary/config.json: -------------------------------------------------------------------------------- 1 | { 2 | "schema": 1, 3 | "type": "completion", 4 | "description": "Summary knowledge", 5 | "completion": { 6 | "max_tokens": 3000, 7 | "temperature": 0.6, 8 | "top_p": 0.0, 9 | "presence_penalty": 0.0, 10 | "frequency_penalty": 0.0 11 | } 12 | } -------------------------------------------------------------------------------- /workshop/java/workshop1/plugins/AnswerPlugin/Summary/skprompt.txt: -------------------------------------------------------------------------------- 1 | system : Please output the summary content according to the following requirements based on the input content. 2 | 3 | 1. Summarize the content within 100 words 4 | 2. 
Please use plain and understandable language 5 | 6 | user: {{$input}} -------------------------------------------------------------------------------- /workshop/java/workshop1/plugins/FilePlugin/Notes/config.json: -------------------------------------------------------------------------------- 1 | { 2 | "schema": 1, 3 | "description": "Get knowledge from Notes", 4 | "execution_settings": { 5 | "default": { 6 | "max_tokens": 13000, 7 | "temperature": 0.6, 8 | "top_p": 0.0, 9 | "presence_penalty": 0.0, 10 | "frequency_penalty": 0.0 11 | } 12 | } 13 | } -------------------------------------------------------------------------------- /workshop/java/workshop1/plugins/FilePlugin/Notes/skprompt.txt: -------------------------------------------------------------------------------- 1 | system : You are an AI teacher who understands markdown format. Please summarize the input content as required. 2 | 3 | 1. Ignore content related to Pre-lecture, Review & Self Study, and Assignment 4 | 5 | 2. Use # as the content title; ## or ### is not the content title 6 | 7 | 3. The format of knowledge points is ---\n## xxx, such as 8 | 9 | --- 10 | ## xxx 11 | 12 | these are knowledge points; don't use knowledge points as the title 13 | 14 | 4. Summarize each knowledge point. The summary should not exceed 100 words. 15 | 16 | 5. After obtaining and summarizing all knowledge points, synthesize the content of all knowledge points and summarize it into a passage of no more than 400 words as the content output. 17 | 18 | 6. 
The output must be in JSON format "{\"kb\":\"title\",\"content\": \"Summarize all the knowledge points\"}" 19 | 20 | user: {{$input}} -------------------------------------------------------------------------------- /workshop/java/workshop1/plugins/FilePlugin/Transcrips/config.json: -------------------------------------------------------------------------------- 1 | { 2 | "schema": 1, 3 | "description": "Get knowledge from Transcript", 4 | "execution_settings": { 5 | "default": { 6 | "max_tokens": 30000, 7 | "temperature": 0.6, 8 | "top_p": 0.0, 9 | "presence_penalty": 0.0, 10 | "frequency_penalty": 0.0 11 | } 12 | } 13 | } -------------------------------------------------------------------------------- /workshop/java/workshop1/plugins/FilePlugin/Transcrips/skprompt.txt: -------------------------------------------------------------------------------- 1 | System: You are an artificial intelligence teacher. Please follow the transcript content. 2 | 3 | Please perform content extraction according to the following conditions 4 | 5 | 6 | 1. There is a timestamp after the title of each knowledge point. Remove the time-related content and merge the content. 7 | 8 | 2. Ignore the knowledge point and related content about Intro 9 | 10 | 3. Get the title and merged content of each knowledge point, and summarize the content of each knowledge point in 5 tokens, such as 11 | 12 | {"kb": "title", "content": "summarize content of knowledge point in 5 tokens"} 13 | 14 | 4. The output is a JSON array 15 | 16 | User: {{$input}} -------------------------------------------------------------------------------- /workshop/python/workshop1/.env: -------------------------------------------------------------------------------- 1 | AZURE_OPENAI_ENDPOINT = 'Your Azure OpenAI Service Endpoint' 2 | AZURE_OPENAI_API_KEY = 'Your Azure OpenAI Service API Key' 3 | AZURE_OPENAI_VERSION = 'Your Azure OpenAI Service Version, e.g. 
2023-12-01-preview' 4 | AZURE_OPENAI_DEPLOYMENT_NAME = "Your Azure OpenAI Service gpt-4-32k Model Deployment Name" -------------------------------------------------------------------------------- /workshop/python/workshop1/data/transcripts/01-introduction.txt: -------------------------------------------------------------------------------- 1 | Intro 2 | 3 | 0:08 4 | hello and welcome to this course on 5 | 0:11 6 | classical machine learning for beginners 7 | 0:13 8 | whether you're completely new to the 9 | 0:15 10 | topic or an experienced ml practitioner 11 | 0:17 12 | looking to brush up on an area we're 13 | 0:19 14 | happy to have you join us this course is 15 | 16 | Introducing ML for Beginners 17 | 18 | 0:22 19 | based on the free open source 26 lesson 20 | 0:25 21 | ml for beginners curriculum from 22 | 0:27 23 | Microsoft which can be found at 24 | 0:30 25 | aka.ms/ml-beginners machine learning 26 | 0:34 27 | is one of the most popular Technologies 28 | 0:36 29 | these days I'm sure you've heard this 30 | 0:38 31 | term if you have any sort of familiarity 32 | 0:41 33 | with technology no matter what domain 34 | 0:42 35 | you work in however the mechanics of 36 | 0:45 37 | machine learning are a mystery to most 38 | 0:47 39 | people and the subject can sometimes 40 | 0:49 41 | feel overwhelming in this course you'll 42 | 0:51 43 | start right from the beginning and 44 | 0:53 45 | you'll learn about it step by step with 46 | 0:56 47 | practical Hands-On coding examples let's 48 | 49 | The difference between AI and ML 50 | 51 | 0:58 52 | start by talking about the difference 53 | 1:00 54 | between artificial intelligence and 55 | 1:03 56 | machine learning AI is the science of 57 | 1:05 58 | getting machines to accomplish tasks 59 | 1:07 60 | that typically require human level 61 | 1:09 62 | intelligence many different techniques 63 | 1:12 64 | have been proposed for AI but the most 65 | 1:14 66 | 
days is machine learning 69 | 1:19 70 | unlike other AI techniques ml uses 71 | 1:22 72 | specialized algorithms to make decisions 73 | 1:24 74 | by learning from data so machine 75 | 1:28 76 | learning is really a subset of 77 | 1:30 78 | artificial intelligence 79 | 1:32 80 | you've also probably heard of deep 81 | 1:33 82 | learning which is a subset of machine 83 | 1:35 84 | learning that relies on neural networks 85 | 1:38 86 | to learn from data in this course we're 87 | 88 | What you'll learn in this course 89 | 90 | 1:40 91 | going to cover what we call classical 92 | 1:42 93 | machine learning you'll learn some Core 94 | 1:45 95 | Concepts of ml a bit of History 96 | 1:48 97 | statistical techniques like regression 98 | 1:50 99 | classification clustering and more 100 | 1:54 101 | the concepts you'll learn here will 102 | 1:55 103 | serve you well as you progress to more 104 | 1:57 105 | 106 | Advanced Techniques 107 | 108 | What you won't learn in this course 109 | 1:59 110 | keep in mind that this course won't 111 | 2:00 112 | cover data science deep learning neural 113 | 2:03 114 | networks and AI techniques other than ml 115 | 2:06 116 | Microsoft offers two additional courses 117 | 2:09 118 | for you to learn more about these areas 119 | 2:10 120 | data science for beginners available at 121 | 2:13 122 | aka.ms/datascience-beginners 123 | 2:16 124 | and AI for beginners available at 125 | 2:20 126 | aka.ms/ai-beginners machine learning is a 127 | 128 | Why study Machine Learning 129 | 130 | 2:23 131 | Hot Topic because it's solving complex 132 | 2:25 133 | real-world problems in so many areas 134 | 2:28 135 | Finance earth science space exploration 136 | 2:31 137 | cognitive science and many more Fields 138 | 2:33 139 | have adopted machine learning to solve 140 | 2:35 141 | problems specific to their domains for 142 | 2:39 143 | example you can use machine learning to 144 | 2:41 145 | predict the likelihood of disease from a 146 | 2:43 
147 | patient's medical history 148 | 2:45 149 | to anticipate weather events to 150 | 2:48 151 | understand the sentiment of a text and 152 | 2:50 153 | to detect fake news and stop the spread 154 | 2:52 155 | of propaganda applications of machine 156 | 2:54 157 | learning are almost everywhere and are 158 | 2:57 159 | as ubiquitous as the data that is 160 | 2:59 161 | Flowing from our devices and systems 162 | 3:01 163 | because of how useful it is 164 | 3:03 165 | understanding the basics of machine 166 | 3:04 167 | learning is going to help you no matter 168 | 3:06 169 | what domain you're coming from in the 170 | 3:09 171 | next video in the series I'll give an 172 | 3:10 173 | overview of the history of ml I'll see 174 | 3:13 175 | you there -------------------------------------------------------------------------------- /workshop/python/workshop1/data/transcripts/02-history-of-ml.txt: -------------------------------------------------------------------------------- 1 | Intro 2 | 3 | 0:08 4 | in this video I want to give you a bit 5 | 0:11 6 | of Background by walking through the 7 | 0:13 8 | major milestones in the history of 9 | 0:14 10 | machine learning and artificial 11 | 0:16 12 | intelligence 13 | 0:17 14 | the modern world of artificial 15 | 16 | Alan Turing and the Turing test 17 | 18 | 0:19 19 | intelligence really began in the 1950s 20 | 0:21 21 | though it's based on mathematical and 22 | 0:24 23 | statistical developments over many 24 | 0:26 25 | centuries Alan Turing is credited with 26 | 0:28 27 | helping to lay the foundation for the 28 | 0:30 29 | concept of a machine that can think 30 | 0:33 31 | in his quest to Define machine 32 | 0:35 33 | intelligence he achieved a crucial 34 | 0:37 35 | Milestone by creating the Turing test in 36 | 0:40 37 | 1950. 
in this test an interrogator 38 | 0:43 39 | questions both a human and a computer 40 | 0:45 41 | and tries to determine which one is 42 | 0:47 43 | which 44 | 0:48 45 | if the interrogator cannot tell the 46 | 0:50 47 | difference then the computer can be 48 | 0:52 49 | considered intelligent in 1956 the term 50 | 51 | The Dartmouth Summer Research Project on AI 52 | 53 | 0:55 54 | artificial intelligence was coined when 55 | 0:57 56 | a small group of scientists gathered at 57 | 1:00 58 | Dartmouth College in the U.S for an 59 | 1:02 60 | event called the summer research project 61 | 1:04 62 | on artificial intelligence 63 | 1:06 64 | this conference was the birth of the 65 | 1:09 66 | field of research we know as AI the 67 | 1:11 68 | years from 1956 to 1974 are known as The 69 | 70 | The golden years of AI 71 | 72 | 1:15 73 | Golden Years of AI optimism ran high in 74 | 1:19 75 | the hope that AI could solve many 76 | 1:21 77 | problems in 1967 Marvin Minsky the 78 | 1:24 79 | co-founder of the MIT AI Lab stated 80 | 1:27 81 | confidently and incorrectly that within 82 | 1:30 83 | a generation the problem of creating 84 | 1:32 85 | artificial intelligence will 86 | 1:34 87 | substantially be solved 88 | 1:36 89 | natural language processing research 90 | 1:39 91 | flourished search was refined and made 92 | 1:41 93 | more powerful and the concept of micro 94 | 1:44 95 | worlds was created where simple tasks 96 | 1:46 97 | were completed using plain language 98 | 1:48 99 | instructions 100 | 1:50 101 | research was well funded by government 102 | 1:51 103 | agencies advances were made in 104 | 1:54 105 | computation and algorithms and 106 | 1:56 107 | prototypes of intelligent machines were 108 | 1:58 109 | built 110 | 1:59 111 | some of these machines include Shakey the 112 | 2:01 113 | robot who could maneuver and decide how 114 | 2:04 115 | to perform tasks intelligently Eliza an 116 | 2:07 117 | early chatbot that could converse with 118 | 2:10 119 | people 
and act as a primitive therapist 120 | 2:12 121 | Blocksworld an example of a micro world 122 | 2:15 123 | where blocks could be stacked and 124 | 2:17 125 | sorted and decision-making experiments 126 | 2:19 127 | could be tested by the mid-1970s it had 128 | 129 | The AI winter 130 | 131 | 2:22 132 | become apparent that the complexity of 133 | 2:24 134 | making intelligent machines had been 135 | 2:26 136 | understated and that its promise had 137 | 2:28 138 | been overblown compute power was too 139 | 2:31 140 | limited there was a lack of data to 141 | 2:33 142 | train and test AIs and there were 143 | 2:35 144 | questions around the ethics of 145 | 2:37 146 | introducing AI systems like the 147 | 2:39 148 | therapist Eliza into society 149 | 2:42 150 | funding dried up and confidence in the 151 | 2:44 152 | field slowed marking the beginning of 153 | 2:46 154 | what is called an AI winter in the 1980s 155 | 2:49 156 | as computers became more powerful expert 157 | 158 | Resurgence and fall of AI for expert systems 159 | 160 | 2:52 161 | systems became more successful there was 162 | 2:55 163 | a Resurgence in optimism about AI as 164 | 2:58 165 | businesses found practical applications 166 | 3:00 167 | of these rule-based inference systems by 168 | 3:03 169 | the late 80s it was becoming apparent 170 | 3:05 171 | that expert systems had become too 172 | 3:07 173 | specialized and were unlikely to achieve 174 | 3:10 175 | machine intelligence the rise of 176 | 3:12 177 | personal computers also competed with 178 | 3:15 179 | these large specialized centralized 180 | 3:16 181 | systems this led to another chill in the 182 | 3:19 183 | AI field things began to change in the 184 | 3:22 185 | mid-1990s as compute and storage 186 | 3:24 187 | capabilities grew exponentially making 188 | 3:27 189 | it possible to process much larger data 190 | 191 | Growth in AI driven by more data and more powerful hardware 192 | 193 | 3:30 194 | sets than ever before 195 | 3:32 
196 | the rise of the internet and the 197 | 3:34 198 | popularity of smartphones both 199 | 3:36 200 | contributed to increasing amounts of 201 | 3:38 202 | data and new experiments in machine 203 | 3:39 204 | learning became possible 205 | 3:42 206 | throughout the 2000s significant 207 | 3:44 208 | advancements were made in computer 209 | 3:45 210 | vision and natural language processing 211 | 3:47 212 | by training machine learning models on 213 | 3:50 214 | 215 | Big Data 216 | 217 | 3:52 218 | in the past decade compute power and the 219 | 3:55 220 | size of data sets have continued to grow 221 | 3:57 222 | and machine learning has become capable 223 | 4:00 224 | of solving even more problems as a 225 | 4:02 226 | result today machine learning touches 227 | 4:04 228 | almost every part of our Lives sometimes 229 | 4:07 230 | we're well aware of it like when we 231 | 4:09 232 | interact with ChatGPT in the browser or 233 | 4:12 234 | see a self-driving car go by 235 | 4:14 236 | but most of the time it's seamlessly 237 | 4:16 238 | woven into familiar experiences of our 239 | 4:19 240 | everyday life such as when we're 241 | 4:21 242 | approved for a new loan or get a catalog 243 | 4:23 244 | at home 245 | 246 | Increased awareness of ethical and responsible AI 247 | 248 | 4:25 249 | this era has also been marked by an 250 | 4:27 251 | increased awareness of potential ethical 252 | 4:29 253 | issues in machine learning and by 254 | 4:31 255 | significant research in the field of 256 | 4:33 257 | responsible AI 258 | 4:34 259 | we want the benefits of AI but we also 260 | 4:37 261 | want AI that is responsible and doesn't 262 | 4:39 263 | amplify human bias 264 | 4:42 265 | in the next video we will introduce 266 | 4:44 267 | techniques for building using and 268 | 4:46 269 | maintaining machine learning models I'll 270 | 4:48 271 | see you there -------------------------------------------------------------------------------- 
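The 03-techniques-ml transcript in this workshop's data walks through a concrete workflow: split your data roughly 80/20 into training and test sets, train, evaluate on held-out data, and search over hyperparameters. A minimal self-contained Python sketch of that loop — the toy data set and the threshold "model" are illustrative assumptions, not part of the course materials:

```python
import random

def train_test_split(rows, test_ratio=0.2, seed=42):
    """Shuffle and split rows -- the 80/20 split the transcript describes."""
    rows = rows[:]                      # don't mutate the caller's list
    random.Random(seed).shuffle(rows)
    cut = int(len(rows) * (1 - test_ratio))
    return rows[:cut], rows[cut:]

def evaluate(model, data):
    """Accuracy: fraction of (x, y) examples the model labels correctly."""
    return sum(model(x) == y for x, y in data) / len(data)

# Toy data set: the label is 1 exactly when x is above 50.
data = [(x, int(x > 50)) for x in range(100)]
train, test = train_test_split(data)    # 80 training rows, 20 test rows

# "Hyperparameter" search: try candidate thresholds, keep the best on train.
best_acc, best_threshold = max(
    (evaluate(lambda x, t=t: int(x > t), train), t) for t in range(0, 100, 10)
)

# Final check on the test set -- data the chosen model never saw in training.
test_acc = evaluate(lambda x: int(x > best_threshold), test)
```

With this toy data the search recovers the true threshold of 50 and scores perfectly on the 20 held-out examples; with real data you would swap in an actual model and a library such as scikit-learn, but the split/train/tune/evaluate shape stays the same.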
/workshop/python/workshop1/data/transcripts/03-techniques-ml.txt: -------------------------------------------------------------------------------- 1 | Intro 2 | 3 | 0:08 4 | the process of building machine learning 5 | 0:10 6 | models is very different from any other 7 | 0:13 8 | development workflow in this video 9 | 0:15 10 | you'll learn about that process more 11 | 0:18 12 | specifically you'll learn about deciding 13 | 0:20 14 | whether AI is the right approach for 15 | 0:22 16 | your problem collecting and preparing 17 | 0:24 18 | your data 19 | 0:25 20 | training your model evaluating your 21 | 0:28 22 | model tuning the hyper parameters and 23 | 0:30 24 | testing the trained model in the real 25 | 0:32 26 | world 27 | 28 | Decide if AI is the right approach 29 | 30 | 0:34 31 | traditional software is well suited to 32 | 0:36 33 | solve problems where the solution can be 34 | 0:39 35 | described as a formal set of rules in 36 | 0:42 37 | contrast AI shines in solving problems 38 | 0:44 39 | where the solution can be extracted from 40 | 0:46 41 | data many of the problems we encountered 42 | 0:49 43 | in our daily life can be efficiently 44 | 0:51 45 | solved with traditional programming if 46 | 0:53 47 | an engineer can break up the solution of 48 | 0:55 49 | a problem and Define it using precise 50 | 0:57 51 | rules then traditional programming is a 52 | 0:59 53 | great tool to use but many of the 54 | 1:01 55 | problems we encounter in our day-to-day 56 | 1:03 57 | aren't quite as easy to Define as a set 58 | 1:05 59 | of rules thankfully for many of those 60 | 1:08 61 | problems we have access to plenty of 62 | 1:10 63 | real life data containing useful 64 | 1:12 65 | information which means that AI can help 66 | 1:14 67 | us find a solution one good example is 68 | 1:17 69 | translating from one language to another 70 | 1:19 71 | writing a set of rules that fully encodes 72 | 1:22 73 | all the parallels between two languages 74 | 1:24 75 | is not easy but 
there are many examples 76 | 1:26 77 | of translation online so AI has been 78 | 1:29 79 | able to do a much better job of 80 | 1:30 81 | translation than previous attempts so 82 | 1:33 83 | our first step when we're starting a new 84 | 1:35 85 | project should be to analyze the problem 86 | 1:37 87 | and determine which technique is best to 88 | 1:39 89 | solve it if you're able to obtain plenty 90 | 1:41 91 | of data that contains useful information 92 | 1:43 93 | about your solution then AI is a 94 | 1:46 95 | promising approach once you've decided that 96 | 97 | Collect and prepare data 98 | 99 | 1:48 100 | AI is the right method for you you need 101 | 1:50 102 | to collect and prepare your data for 103 | 1:52 104 | example you may need to normalize it or 105 | 1:54 106 | convert it to a different form or remove 107 | 1:56 108 | rows that are missing certain Fields 109 | 1:58 110 | once your data is clean you need to 111 | 2:00 112 | decide about which aspects of your data 113 | 2:03 114 | or features you're going to use as input 115 | 2:05 116 | to your prediction and which feature you 117 | 2:07 118 | want to predict for example if you have 119 | 2:10 120 | medical data you may decide to use 121 | 2:12 122 | features that describe the patient's 123 | 2:14 124 | medical history as input and a chance of 125 | 2:16 126 | a particular disease as the output 127 | 2:18 128 | feature you want to predict 129 | 2:21 130 | and finally you need to split your data 131 | 2:23 132 | into training and test sets a usual 133 | 2:25 134 | split is 80% for your training data and 135 | 2:29 136 | 20% for test 137 | 138 | Train your model 139 | 140 | 2:31 141 | next you need to choose a machine 142 | 2:33 143 | learning algorithm which you'll learn a 144 | 2:35 145 | lot about in the coming videos if you're 146 | 2:37 147 | undecided between a few good algorithms 148 | 2:39 149 | you may want to try them all and see 150 | 2:41 151 | which one performs best 152 | 2:43 153 | then you 
need to train your model using 154 | 2:45 155 | the training set you collected earlier 156 | 2:47 157 | and the algorithm you chose training a 158 | 2:50 159 | model may take a while especially if the 160 | 2:52 161 | model is large 162 | 163 | Evaluate your model 164 | 165 | 2:53 166 | once the model is trained you can test 167 | 2:55 168 | it using the test data set that you 169 | 2:57 170 | split earlier it's important that you 171 | 3:00 172 | test the algorithm with data that it 173 | 3:01 174 | hasn't seen during training to ensure 175 | 3:04 176 | that it generalizes well to new 177 | 3:06 178 | scenarios 179 | 180 | Tune the hyperparameters 181 | 182 | 3:07 183 | some algorithms contain hyper parameters 184 | 3:10 185 | which are settings that control key 186 | 3:11 187 | aspects of their inner workings choosing 188 | 3:14 189 | good hyper parameters is important 190 | 3:16 191 | because they can make a big difference 192 | 3:17 193 | in your results if you want to be 194 | 3:19 195 | systematic about your hyper parameter 196 | 3:21 197 | search you can write code that tries 198 | 3:23 199 | lots of different combinations and helps 200 | 3:26 201 | you discover the best values for your 202 | 3:27 203 | data once you get good test results it's 204 | Test the model in the real world 205 | 3:30 206 | time to see how well your model performs 207 | 3:32 208 | within the context of its intended use 209 | 3:34 210 | for example this could involve 211 | 3:36 212 | collecting live data from a sensor and 213 | 3:38 214 | using it to make predictions or 215 | 3:40 216 | deploying a model to a few users of your 217 | 3:42 218 | application if it all looks good then 219 | 3:45 220 | you're ready to release it to production 221 | 3:46 222 | and enjoy its benefits 223 | 3:49 224 | make sure you watch the next video where 225 | 3:51 226 | we'll start getting Hands-On with 227 | 3:53 228 | machine learning by configuring all the 229 | 3:55 230 | tools we'll use in the rest 
of the 231 | 3:57 232 | series I'll see you there -------------------------------------------------------------------------------- /workshop/python/workshop1/plugins/AnswerPlugin/Summary/config.json: -------------------------------------------------------------------------------- 1 | { 2 | "schema": 1, 3 | "description": "Summary knowledge", 4 | "execution_settings": { 5 | "default": { 6 | "max_tokens": 3000, 7 | "temperature": 0.6, 8 | "top_p": 0.0, 9 | "presence_penalty": 0.0, 10 | "frequency_penalty": 0.0 11 | } 12 | } 13 | } -------------------------------------------------------------------------------- /workshop/python/workshop1/plugins/AnswerPlugin/Summary/skprompt.txt: -------------------------------------------------------------------------------- 1 | system : Please output the summary content according to the following requirements based on the input content. 2 | 3 | 1. Summarize the content within 100 words 4 | 2. Please use plain and understandable language 5 | 6 | user: {{$input}} -------------------------------------------------------------------------------- /workshop/python/workshop1/plugins/FilePlugin/Notes/config.json: -------------------------------------------------------------------------------- 1 | { 2 | "schema": 1, 3 | "description": "Get knowledge from Notes", 4 | "execution_settings": { 5 | "default": { 6 | "max_tokens": 13000, 7 | "temperature": 0.6, 8 | "top_p": 0.0, 9 | "presence_penalty": 0.0, 10 | "frequency_penalty": 0.0 11 | } 12 | } 13 | } -------------------------------------------------------------------------------- /workshop/python/workshop1/plugins/FilePlugin/Notes/skprompt.txt: -------------------------------------------------------------------------------- 1 | system : You are an AI teacher who understands markdown format. Please summarize the input content as required. 2 | 3 | 1. Ignore content related to Pre-lecture, Review & Self Study, and Assignment 4 | 5 | 2. 
Use # as the content title; ## or ### are not content titles 6 | 7 | 3. The format of a knowledge point is ---\n## xxx, such as 8 | 9 | --- 10 | ## xxx 11 | 12 | these are knowledge points; do not use a knowledge point as the title 13 | 14 | 4. Summarize each knowledge point. The summary should not exceed 100 words. 15 | 16 | 5. After obtaining and summarizing all knowledge points, synthesize them into a passage of no more than 400 words as the content output. 17 | 18 | 6. The output must be in JSON format "{\"kb\":\"title\",\"content\": \"Summarize all the knowledge points\"}" 19 | 20 | user: {{$input}} 21 | 22 | -------------------------------------------------------------------------------- /workshop/python/workshop1/plugins/FilePlugin/Transcrips/config.json: -------------------------------------------------------------------------------- 1 | { 2 | "schema": 1, 3 | "description": "Get knowledge from Transcript", 4 | "execution_settings": { 5 | "default": { 6 | "max_tokens": 30000, 7 | "temperature": 0.6, 8 | "top_p": 0.0, 9 | "presence_penalty": 0.0, 10 | "frequency_penalty": 0.0 11 | } 12 | } 13 | } -------------------------------------------------------------------------------- /workshop/python/workshop1/plugins/FilePlugin/Transcrips/skprompt.txt: -------------------------------------------------------------------------------- 1 | System: You are an artificial intelligence teacher. Please work from the transcript content. 2 | 3 | Please perform content extraction according to the following conditions 4 | 5 | 6 | 1. There is a timestamp after the title of each knowledge point. Remove the time-related content and merge the content. 7 | 8 | 2. Ignore the Intro knowledge point and its related content 9 | 10 | 3. Get the title and merged content of each knowledge point, and summarize the content of each knowledge point in 5 tokens 11 | 12 | {"kb": "title", "content": "summarize content of knowledge point in 5 tokens"} 13 | 14 | 4. 
The output is a JSON array 15 | 16 | User: {{$input}} -------------------------------------------------------------------------------- /workshop/python/workshop2/.env: -------------------------------------------------------------------------------- 1 | AZURE_OPENAI_ENDPOINT = 'Your Azure OpenAI Service Endpoint' 2 | AZURE_OPENAI_API_KEY = 'Your Azure OpenAI Service API Key' 3 | AZURE_OPENAI_VERSION = 'Your Azure OpenAI Service Version, e.g. 2023-12-01-preview' 4 | AZURE_OPENAI_DEPLOYMENT_NAME = "Your Azure OpenAI Service gpt-35-turbo-16k Model Deployment Name" -------------------------------------------------------------------------------- /workshop/python/workshop2/docs/session1.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/microsoft/SemanticKernelCookBook/53a5029acc071db8a45e9b0c3df8e4ad06139300/workshop/python/workshop2/docs/session1.pdf -------------------------------------------------------------------------------- /workshop/python/workshop2/myfile.md: -------------------------------------------------------------------------------- 1 | --- 2 | marp: true 3 | --- 4 | ### **第一节 From Non-professionalization to Professionalization: Teachers at Different Historical Stages of Educational Development** 5 | 6 | --- 7 | 8 | ### **一、Teachers in Non-formalized Education** 9 | 10 | --- 11 | 12 | ### **(一)Non-formalized Education** 13 | 14 | Simply put, non-formalized education is the education of the next generation carried out through social interaction in ordinary social life. Its most typical form is the care, instruction, and guidance that elders in a family or clan give to the young. 15 | 16 | --- 17 | 18 | ### **(二)Characteristics of the Educators** 19 | 20 | - The "teacher" had not yet separated from his or her general social identity (such as parent) to become a specialized, independent social function; in other words, teaching had not yet become a profession. In non-formalized education, teachers did not make a living by educating others; they had occupations that sustained them (such as farming or handicrafts) and acted as educators of the next generation only incidentally. 21 | - Before teaching became a profession, the relationship between educator and learner was usually established through blood ties: adults generally educated only minors related to them by blood (their own children and grandchildren), and minors could learn only from adults related to them by blood. On the one hand this kept educators and learners in close personal relationships; on the other, it greatly limited the scope of education. 22 | - In such educational activity, the adult educators needed no special training or preparation for teaching; they relied mainly on the life experience accumulated in social life, which inevitably narrowed the content and process of education. In non-formalized education, educators usually did not design courses or organize teaching specifically for the next generation, so educational activity lacked organization, which also limited its efficiency. 23 | 24 | --- 25 | 26 | ### **二、Teachers in Formalized Education** 27 | 28 | --- 29 | 30 | ### **(一)Characteristics of Formalized Education** 31 | 32 | - There was a relatively stable place of education (such as Confucius's private school or Plato's Academy); 33 | - relatively stable educational content; 34 | - and relatively stable teacher-student relationships. 35 | 36 | --- 37 | 38 | ### **(二)Characteristics of the Educators** 39 | 40 | - First, although teaching had become partly professionalized, it was not yet a full-time occupation. 41 | - Second, teachers' students were no longer limited to their blood relatives; the range of learners expanded greatly. 42 | - Third, what teachers taught had become rich and complex, so teachers needed specialized accumulation and preparation of knowledge to be competent; personal life experience alone was no longer enough. 43 | 44 | --- 45 | 46 | ### **三、Teachers in Institutionalized Education** 47 | 48 | --- 49 | 50 | ### **(一)Characteristics of the Educators** 51 | 52 | - First, teaching became not only professionalized but also a full-time occupation. 53 | - Second, the teaching relationship between teachers and students was established mainly through "institutional arrangement" rather than individual effort. 54 | - Third, in institutionalized schooling, the content teachers taught was complex, systematic, and constantly changing. 55 | 56 | --- 57 | 58 | ### **(二)Corresponding Limitations** 59 | 60 | - First, full-time specialization may narrow teachers' range of life experience, easily detaching their teaching from a rapidly developing society. 61 | - Second, because modern society develops ever faster, educational content is often unstable and must be adjusted constantly to the needs of social development and of the relevant field of study. 62 | - Third, teacher-student relationships arranged by institutions easily become cold "working relationships" that lack the humane care grounded in close personal relationships; such institutionalized working relationships also tend to weaken teachers' influence on students, especially their moral influence. 63 | 64 | --- 65 | 66 | ### **(三)Summary: the Nature of the Teacher's Role and of Educational Work** 67 | 68 | - Educators gradually moved from engaging in education only occasionally to engaging in it specially and professionally. 69 | - Over the course of history, the arrangement of educators' work evolved from casual arrangement to conscious organization. 70 | - Educational work evolved from simple to complex, and since the second half of the 20th century its complexity has grown ever faster. 71 | - The growing complexity of educational work requires teachers not only to undergo a long period of professional preparation before entering the profession, but also to keep renewing themselves afterwards to meet its constantly changing demands. 72 | 73 | --- 74 | 75 | ### **Thank You** 76 | 77 | --- -------------------------------------------------------------------------------- /workshop/python/workshop2/myfile.pptx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/microsoft/SemanticKernelCookBook/53a5029acc071db8a45e9b0c3df8e4ad06139300/workshop/python/workshop2/myfile.pptx -------------------------------------------------------------------------------- /workshop/python/workshop2/plugins/ToolsPlugin/PPT/config.json: -------------------------------------------------------------------------------- 1 | { 2 | "schema": 1, 3 | "description": "Content conversion", 4 | "execution_settings": { 5 | "default": { 6 | "max_tokens": 12000, 7 | "temperature": 0.7, 8 | "top_p": 0.0, 9 | "presence_penalty": 0.0, 10 | "frequency_penalty": 0.0 11 | } 12 | } 13 | } 
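The workshop plugin folders above all share one layout: a `config.json` holding the execution settings (`max_tokens`, `temperature`, ...) next to an `skprompt.txt` holding the prompt template with `{{$input}}`-style variables. The Semantic Kernel SDK loads these for you; as a rough, library-free sketch of what that involves (the helper names `load_prompt_plugin` and `render` are made up for illustration, not SK API):

```python
import json
from pathlib import Path


def load_prompt_plugin(function_dir: str) -> dict:
    """Read one prompt function: config.json (settings) + skprompt.txt (template)."""
    folder = Path(function_dir)
    config = json.loads((folder / "config.json").read_text(encoding="utf-8"))
    template = (folder / "skprompt.txt").read_text(encoding="utf-8")
    return {"config": config, "template": template}


def render(template: str, variables: dict) -> str:
    """Fill {{$name}} placeholders, mimicking SK's basic prompt-template syntax."""
    for name, value in variables.items():
        template = template.replace("{{$" + name + "}}", value)
    return template
```

For example, for the PPT function above, `load_prompt_plugin(...)["config"]["execution_settings"]["default"]["max_tokens"]` would come back as `12000`, and `render(template, {"input": text})` would produce the final prompt sent to the model.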
-------------------------------------------------------------------------------- /workshop/python/workshop2/plugins/ToolsPlugin/PPT/skprompt.txt: -------------------------------------------------------------------------------- 1 | From the input content, generate the content of each ppt page according to the conditions below, using --- as the separator; each page's content is independent, and no content spans pages. 2 | 3 | 1. Extract the section title and output it as its own page, e.g. 4 | 5 | ## **第一节 xxxxx** 6 | 7 | --- 8 | 9 | 2. Within each section, the knowledge points are separated by Chinese numerals followed by the enumeration mark 、. Output each knowledge point as one page, and summarize the knowledge content corresponding to that knowledge point as another page, e.g. 10 | 11 | ### **一、xxxxx** 12 | 13 | --- 14 | 15 | ### **二、xxxxx** 16 | 17 | --- 18 | 19 | 20 | If the knowledge content of a knowledge point contains sub-points such as (一), (二), (三), (四), each sub-point is output as its own page, the content corresponding to that sub-point must appear on the same page, and it must be summarized in no fewer than 20 tokens, e.g. 21 | 22 | (一) xxxxx 23 | 24 | ....... 25 | 26 | 27 | 28 | (二) xxxxx 29 | 30 | ....... 31 | 32 | 33 | (三) xxxxx 34 | 35 | ....... 36 | 37 | 38 | (四) xxxxx 39 | 40 | ....... 41 | 42 | 43 | The output is 44 | 45 | ### **(一) xxxxx** 46 | 47 | ....... 48 | 49 | --- 50 | 51 | ### **(二) xxxxx** 52 | 53 | ....... 54 | 55 | --- 56 | 57 | ### **(三) xxxxx** 58 | 59 | ....... 60 | 61 | --- 62 | 63 | ### **(四) xxxxx** 64 | 65 | ....... 66 | 67 | --- 68 | 69 | 70 | 71 | 3. If there are no sub-points, output the content on a single page, e.g. 72 | 73 | xxxxx 74 | 75 | ....... 76 | 77 | Output 78 | 79 | 80 | xxxxx 81 | 82 | ....... 83 | 84 | --- 85 | 86 | user: {{$input}} --------------------------------------------------------------------------------
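The PPT prompt above asks the model to emit Marp-style markdown in which `---` on its own line separates pages (this is the shape of `myfile.md` shown earlier); workshop2's `genppt_notebook.ipynb` then turns that markdown into `myfile.pptx`. A minimal sketch of the first post-processing step, splitting such output into per-slide chunks (the function name and exact splitting rules are assumptions for illustration; Marp or python-pptx does the real deck building):

```python
def split_marp_pages(markdown: str) -> list[str]:
    """Split Marp-style markdown into slides on '---' separator lines."""
    text = markdown.strip()
    # Drop the YAML front-matter block (--- / marp: true / ---) if present:
    # it is deck metadata, not a slide.
    if text.startswith("---"):
        end = text.find("\n---", 3)
        if end != -1:
            text = text[end + len("\n---"):]
    # Every remaining '---' at the start of a line begins a new slide.
    pages = [page.strip() for page in text.split("\n---")]
    return [page for page in pages if page]


deck = "---\nmarp: true\n---\n### **第一节 xxxxx**\n\n---\n\n### **一、xxxxx**\n\n---"
print(split_marp_pages(deck))  # two slides, front matter removed
```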