├── .cursor └── rules │ ├── api-usage.mdc │ ├── code-conventions.mdc │ ├── development-workflow.mdc │ ├── endpoints.mdc │ └── project-overview.mdc ├── .envrc.sample ├── .gitignore ├── .stylish-haskell.yaml ├── CHANGELOG.md ├── LICENSE ├── README.md ├── dependencies └── .gitkeep ├── examples ├── README.md ├── openai-example │ └── Main.hs ├── responses-example │ └── Main.hs ├── responses-stream-example │ └── Main.hs └── weather-chatbot-example │ ├── Main.hs │ └── README.md ├── flake.lock ├── flake.nix ├── openai.cabal ├── src └── OpenAI │ ├── Prelude.hs │ ├── V1.hs │ └── V1 │ ├── Assistants.hs │ ├── Audio.hs │ ├── Audio │ ├── Speech.hs │ ├── Transcriptions.hs │ └── Translations.hs │ ├── AutoOr.hs │ ├── Batches.hs │ ├── Chat │ └── Completions.hs │ ├── ChunkingStrategy.hs │ ├── DeletionStatus.hs │ ├── Embeddings.hs │ ├── Error.hs │ ├── Files.hs │ ├── FineTuning │ └── Jobs.hs │ ├── Images.hs │ ├── Images │ ├── Edits.hs │ ├── Generations.hs │ ├── Image.hs │ ├── ResponseFormat.hs │ └── Variations.hs │ ├── ListOf.hs │ ├── Message.hs │ ├── Models.hs │ ├── Moderations.hs │ ├── Order.hs │ ├── ResponseFormat.hs │ ├── Responses.hs │ ├── Threads.hs │ ├── Threads │ ├── Messages.hs │ ├── Runs.hs │ └── Runs │ │ └── Steps.hs │ ├── Tool.hs │ ├── ToolCall.hs │ ├── ToolResources.hs │ ├── Uploads.hs │ ├── Usage.hs │ ├── VectorStores.hs │ └── VectorStores │ ├── FileBatches.hs │ ├── FileCounts.hs │ ├── Files.hs │ └── Status.hs └── tasty ├── Main.hs └── data └── v1 ├── audio └── preamble.wav ├── batches └── requests.jsonl ├── fine_tuning └── jobs │ ├── training_data.jsonl │ └── validation_data.jsonl ├── images └── image.png ├── uploads ├── training_data0.jsonl └── training_data1.jsonl └── vector_stores └── index.html /.cursor/rules/api-usage.mdc: -------------------------------------------------------------------------------- 1 | --- 2 | description: 3 | globs: 4 | alwaysApply: false 5 | --- 6 | # API Usage Guide 7 | 8 | ## Basic Usage Pattern 9 | 10 | The library follows a consistent pattern for API usage: 11 | 12 | 1. Create a `ClientEnv` using `getClientEnv` 13 | 2. Create a `Methods` record using `makeMethods` 14 | 3. Extract the desired method from the `Methods` record 15 | 4. Call the method with appropriate parameters 16 | 17 | Example from [openai-example/Main.hs](mdc:openai-example/Main.hs): 18 | 19 | ```haskell 20 | -- Get API key from environment 21 | key <- Environment.getEnv "OPENAI_KEY" 22 | 23 | -- Create client environment 24 | clientEnv <- getClientEnv "https://api.openai.com" 25 | 26 | -- Create methods record with authentication 27 | let Methods{ createChatCompletion } = makeMethods clientEnv (Text.pack key) 28 | 29 | -- Call API method 30 | ChatCompletionObject{ choices } <- createChatCompletion _CreateChatCompletion 31 | { messages = [ User{ content = [ Text{ text } ], name = Nothing } ] 32 | , model = "gpt-4o-mini" 33 | } 34 | ``` 35 | 36 | ## Authentication 37 | 38 | All API calls require an OpenAI API key, which should be provided when creating the `Methods` record: 39 | 40 | ```haskell 41 | let methods = makeMethods clientEnv (Text.pack apiKey) 42 | ``` 43 | 44 | ## Error Handling 45 | 46 | API errors are thrown as exceptions. 
You should use standard Haskell exception handling mechanisms like `try` or `catch` to handle them: 47 | 48 | ```haskell 49 | result <- try (createChatCompletion params) 50 | case result of 51 | Left err -> handleError err 52 | Right response -> processResponse response 53 | ``` 54 | -------------------------------------------------------------------------------- /.cursor/rules/code-conventions.mdc: -------------------------------------------------------------------------------- 1 | --- 2 | description: 3 | globs: 4 | alwaysApply: false 5 | --- 6 | # Code Conventions 7 | 8 | ## Language Extensions 9 | 10 | The codebase consistently uses the following GHC language extensions: 11 | 12 | ```haskell 13 | DataKinds 14 | DeriveAnyClass 15 | DeriveGeneric 16 | DerivingStrategies 17 | DuplicateRecordFields 18 | FlexibleInstances 19 | GeneralizedNewtypeDeriving 20 | OverloadedLists 21 | OverloadedStrings 22 | RecordWildCards 23 | MultiParamTypeClasses 24 | NamedFieldPuns 25 | TypeApplications 26 | TypeOperators 27 | ViewPatterns 28 | ``` 29 | 30 | These extensions are enabled in [openai.cabal](mdc:openai.cabal) for the entire library. 31 | 32 | ## Imports 33 | 34 | Imports should be organized in the following order: 35 | 1. Unqualified imports, sorted alphabetically 36 | 2. Qualified imports, sorted alphabetically, with a qualifier comprising complete words and not just letters (e.g. `import qualified Data.Vector as Vector`) 37 | 38 | ## Naming Conventions 39 | 40 | - Module names use upper camel case (e.g., `OpenAI.V1.Chat.Completions`) 41 | - Type names use upper camel case (e.g., `ChatCompletionObject`) 42 | - Function names use lower camel case (e.g., `createChatCompletion`) 43 | - Record field names use lower camel case 44 | - Avoid abbreviations in names when possible 45 | 46 | ## API Structure 47 | 48 | OpenAI endpoints are organized into modules corresponding to their API paths, with operations exposed through the `Methods` record in [src/OpenAI/V1.hs](mdc:src/OpenAI/V1.hs). 49 | -------------------------------------------------------------------------------- /.cursor/rules/development-workflow.mdc: -------------------------------------------------------------------------------- 1 | --- 2 | description: 3 | globs: 4 | alwaysApply: false 5 | --- 6 | # Development Workflow 7 | 8 | This project uses Nix with flakes for development environment setup. The flake configuration is in [flake.nix](mdc:flake.nix). 9 | 10 | ## Development Environment Setup 11 | 12 | 1. Ensure you have Nix with flakes enabled 13 | 2. Use direnv to automatically load the development environment: 14 | 15 | ```bash 16 | # Enable direnv hook in your shell 17 | eval "$(direnv hook bash)" # or zsh, fish, etc. 18 | 19 | # Enter the project directory (direnv will automatically load the environment) 20 | cd openai 21 | ``` 22 | 23 | ## Testing 24 | 25 | The test suite is in the `tasty/` directory. Run tests with: 26 | 27 | ```bash 28 | cabal test 29 | ``` 30 | 31 | Test data is located in `tasty/data/`. 32 | 33 | ## Adding New OpenAI API Endpoints 34 | 35 | When adding a new OpenAI API endpoint: 36 | 37 | 1. Create a new module in `src/OpenAI/V1/` for the endpoint 38 | 2. Define the request and response types 39 | 3. Add the endpoint to the Servant API in [src/OpenAI/V1.hs](mdc:src/OpenAI/V1.hs) 40 | 4. Add the corresponding method to the `Methods` record 41 | 5. Add tests in the `tasty/` directory 42 | 43 | ## Documentation 44 | 45 | Keep documentation in module headers up to date. 
Use Haddock syntax for documenting functions, types, and modules. 46 | -------------------------------------------------------------------------------- /.cursor/rules/endpoints.mdc: -------------------------------------------------------------------------------- 1 | --- 2 | description: 3 | globs: 4 | alwaysApply: false 5 | --- 6 | # OpenAI API Endpoints 7 | 8 | This library provides access to the following OpenAI API endpoints: 9 | 10 | ## Chat Completions 11 | 12 | - Module: [OpenAI.V1.Chat.Completions](mdc:src/OpenAI/V1/Chat/Completions.hs) 13 | - Method: `createChatCompletion` 14 | - Creates a completion for a chat conversation 15 | 16 | ## Models 17 | 18 | - Module: [OpenAI.V1.Models](mdc:src/OpenAI/V1/Models.hs) 19 | - Methods: `listModels`, `retrieveModel`, `deleteModel` 20 | - Lists, retrieves, and deletes models 21 | 22 | ## Images 23 | 24 | - Module: [OpenAI.V1.Images](mdc:src/OpenAI/V1/Images.hs) 25 | - Methods: `createImage`, `createImageEdit`, `createImageVariation` 26 | - Generates, edits, and creates variations of images 27 | 28 | ## Embeddings 29 | 30 | - Module: [OpenAI.V1.Embeddings](mdc:src/OpenAI/V1/Embeddings.hs) 31 | - Method: `createEmbeddings` 32 | - Creates embeddings for input text 33 | 34 | ## Files 35 | 36 | - Module: [OpenAI.V1.Files](mdc:src/OpenAI/V1/Files.hs) 37 | - Methods: `uploadFile`, `listFiles`, `retrieveFile`, `deleteFile`, `retrieveFileContent` 38 | - Manages files used for fine-tuning and other purposes 39 | 40 | ## Audio 41 | 42 | - Module: [OpenAI.V1.Audio](mdc:src/OpenAI/V1/Audio.hs) 43 | - Methods: `createSpeech`, `createTranscription`, `createTranslation` 44 | - Handles text-to-speech, transcription, and translation 45 | 46 | ## Fine-tuning 47 | 48 | - Module: [OpenAI.V1.FineTuning.Jobs](mdc:src/OpenAI/V1/FineTuning/Jobs.hs) 49 | - Methods: `createFineTuningJob`, `listFineTuningJobs`, etc. 50 | - Manages fine-tuning jobs 51 | 52 | ## Assistants, Threads, and Runs 53 | 54 | - Modules: `OpenAI.V1.Assistants`, `OpenAI.V1.Threads`, `OpenAI.V1.Threads.Runs` 55 | - Methods: Various CRUD operations for assistants, threads, and runs 56 | - Implements the assistants API for complex multi-turn conversations 57 | 58 | Additional endpoints are available for moderations, vector stores, and other OpenAI services. Refer to the `Methods` record in [src/OpenAI/V1.hs](mdc:src/OpenAI/V1.hs) for a complete list. 59 | -------------------------------------------------------------------------------- /.cursor/rules/project-overview.mdc: -------------------------------------------------------------------------------- 1 | --- 2 | description: 3 | globs: 4 | alwaysApply: false 5 | --- 6 | # Project Overview 7 | 8 | This library provides Haskell bindings to the OpenAI API using the Servant framework. It offers a type-safe interface for interacting with various OpenAI endpoints. 
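For orientation, a minimal call follows the pattern below (condensed from [examples/openai-example/Main.hs](mdc:examples/openai-example/Main.hs); the two trailing `Nothing` arguments simply follow the `makeMethods` signature used by the bundled examples):

```haskell
-- Read the API key and build a client environment
key <- Environment.getEnv "OPENAI_KEY"
clientEnv <- getClientEnv "https://api.openai.com"

-- Extract the method you need from the Methods record
let Methods{ createChatCompletion } = makeMethods clientEnv (Text.pack key) Nothing Nothing

-- Issue a chat completion request
ChatCompletionObject{ choices } <- createChatCompletion _CreateChatCompletion
    { messages = [ User{ content = [ Text{ text = "Hello!" } ], name = Nothing } ]
    , model = "gpt-4o-mini"
    }
```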
9 | 10 | ## Main Components 11 | 12 | - [src/OpenAI/V1.hs](mdc:src/OpenAI/V1.hs) - Main entry point for the library, exports the Methods record and utility functions 13 | - [openai-example/Main.hs](mdc:openai-example/Main.hs) - Example application demonstrating basic usage 14 | 15 | ## Project Structure 16 | 17 | - `src/OpenAI/V1/` - Contains modules for each OpenAI API endpoint 18 | - `openai-example/` - Example application showing library usage 19 | - `tasty/` - Test suite 20 | 21 | ## Key Features 22 | 23 | - Type-safe API with proper Haskell data types 24 | - Support for all OpenAI API endpoints 25 | - Convenient helper functions for authentication and environment setup 26 | -------------------------------------------------------------------------------- /.envrc.sample: -------------------------------------------------------------------------------- 1 | use flake . 2 | 3 | export OPENAI_KEY="sk-proj-…" -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | .direnv 2 | dist-newstyle 3 | .envrc 4 | .DS_Store 5 | -------------------------------------------------------------------------------- /.stylish-haskell.yaml: -------------------------------------------------------------------------------- 1 | steps: 2 | - module_header: 3 | sort: false 4 | 5 | separate_lists: false 6 | 7 | break_where: exports 8 | 9 | - imports: 10 | align: group 11 | 12 | pad_module_names: false 13 | 14 | long_list_align: new_line_multiline 15 | 16 | separate_lists: false 17 | 18 | - language_pragmas: 19 | style: vertical 20 | 21 | - tabs: {} 22 | 23 | - trailing_whitespace: {} 24 | 25 | cabal: true 26 | -------------------------------------------------------------------------------- /CHANGELOG.md: -------------------------------------------------------------------------------- 1 | 1.2.0: 2 | 3 | - [`/v1/responses`: Add support for Responses API](https://platform.openai.com/docs/api-reference/responses) 4 | - Add `Tool_Web_Search` 5 | 6 | 1.1.1: 7 | 8 | - [Remove timeout on default `ClientEnv`](https://github.com/MercuryTechnologies/openai/pull/55) 9 | - [`/v1/chat/completions`: Add support for Search](https://github.com/MercuryTechnologies/openai/pull/57) 10 | - [Fix CreateSpeech JSON instances, add new voices and optional instructions field](https://github.com/MercuryTechnologies/openai/pull/58) 11 | - [Correct `ToJSON` of `FileSearchResources`](https://github.com/MercuryTechnologies/openai/pull/49) 12 | - [New example app for tool-calling and chat-loop](https://github.com/MercuryTechnologies/openai/pull/60) 13 | 14 | 1.1.0: 15 | 16 | - BREAKING CHANGE: Fix details representations for various types [[#44](https://github.com/MercuryTechnologies/openai/pull/44)] [[#45](https://github.com/MercuryTechnologies/openai/pull/45)] [[#50](https://github.com/MercuryTechnologies/openai/pull/50)] [[#51](https://github.com/MercuryTechnologies/openai/pull/51)] 17 | 18 | A few details-related fields were fixed to match the behavior of the OpenAI 19 | API. 
20 | 21 | - Add `FromJSON`/`ToJSON` instances for all types [[#42](https://github.com/MercuryTechnologies/openai/pull/42)] [[#47](https://github.com/MercuryTechnologies/openai/pull/47)] 22 | 23 | - [Add support for `reasoning_effort` parameter in chat completions](https://github.com/MercuryTechnologies/openai/pull/48) 24 | 25 | 1.0.1: 26 | 27 | - Include `README` 28 | - Include usage example 29 | 30 | 1.0.0: 31 | 32 | - Initial release 33 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | Copyright (c) 2024, Gabriella Gonzalez 2 | 3 | All rights reserved. 4 | 5 | Redistribution and use in source and binary forms, with or without 6 | modification, are permitted provided that the following conditions are met: 7 | 8 | * Redistributions of source code must retain the above copyright 9 | notice, this list of conditions and the following disclaimer. 10 | 11 | * Redistributions in binary form must reproduce the above 12 | copyright notice, this list of conditions and the following 13 | disclaimer in the documentation and/or other materials provided 14 | with the distribution. 15 | 16 | * Neither the name of Gabriella Gonzalez nor the names of other 17 | contributors may be used to endorse or promote products derived 18 | from this software without specific prior written permission. 19 | 20 | THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS 21 | "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT 22 | LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR 23 | A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT 24 | OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, 25 | SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT 26 | LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, 27 | DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY 28 | THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT 29 | (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE 30 | OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
31 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # `openai` 2 | 3 | This provides a binding to OpenAI's API using `servant` 4 | 5 | ## Example usage 6 | 7 | ```haskell 8 | {-# LANGUAGE DuplicateRecordFields #-} 9 | {-# LANGUAGE NamedFieldPuns #-} 10 | {-# LANGUAGE OverloadedStrings #-} 11 | {-# LANGUAGE OverloadedLists #-} 12 | 13 | module Main where 14 | 15 | import Data.Foldable (traverse_) 16 | import OpenAI.V1 17 | import OpenAI.V1.Chat.Completions 18 | 19 | import qualified Data.Text as Text 20 | import qualified Data.Text.IO as Text.IO 21 | import qualified System.Environment as Environment 22 | 23 | main :: IO () 24 | main = do 25 | key <- Environment.getEnv "OPENAI_KEY" 26 | 27 | clientEnv <- getClientEnv "https://api.openai.com" 28 | 29 | let Methods{ createChatCompletion } = makeMethods clientEnv (Text.pack key) 30 | 31 | text <- Text.IO.getLine 32 | 33 | ChatCompletionObject{ choices } <- createChatCompletion _CreateChatCompletion 34 | { messages = [ User{ content = [ Text{ text } ], name = Nothing } ] 35 | , model = "gpt-4o-mini" 36 | } 37 | 38 | let display Choice{ message } = Text.IO.putStrLn (messageToContent message) 39 | 40 | traverse_ display choices 41 | ``` 42 | 43 | ### Responses API (JSON) 44 | 45 | ```haskell 46 | {-# LANGUAGE DuplicateRecordFields #-} 47 | {-# LANGUAGE NamedFieldPuns #-} 48 | {-# LANGUAGE OverloadedStrings #-} 49 | 50 | import qualified Data.Text as Text 51 | import qualified OpenAI.V1 as V1 52 | import qualified OpenAI.V1.Responses as Responses 53 | 54 | main :: IO () 55 | main = do 56 | key <- System.Environment.getEnv "OPENAI_KEY" 57 | 58 | env <- V1.getClientEnv "https://api.openai.com" 59 | let V1.Methods{ createResponse } = V1.makeMethods env (Text.pack key) Nothing Nothing 60 | 61 | let req = Responses._CreateResponse 62 | { Responses.model = "gpt-5" 63 | , Responses.input = Just (Responses.Input 64 | [ Responses.Item_InputMessage 65 | { Responses.role = Responses.User 66 | , Responses.content = [ Responses.Input_Text{ Responses.text = "Say hello in one sentence." } ] 67 | , Responses.status = Nothing 68 | } 69 | ]) 70 | } 71 | 72 | res <- createResponse req 73 | print res 74 | ``` 75 | 76 | 77 | ## Setup 78 | 79 | ### Using Nix with Flakes (Recommended) 80 | 81 | This project uses Nix with flakes for development environment setup. 82 | 83 | 1. Ensure you have Nix with flakes enabled 84 | 2. Copy the sample environment file and configure your OpenAI API key: 85 | 86 | ```bash 87 | # Copy the sample environment file 88 | cp .envrc.sample .envrc 89 | ``` 90 | 91 | 3. Edit the `.envrc` file and replace the placeholder API key with your actual key 92 | 93 | 4. Use direnv to automatically load the development environment: 94 | 95 | ```bash 96 | # Install direnv if you haven't already 97 | # macOS: brew install direnv 98 | # Linux: your-package-manager install direnv 99 | 100 | # Enable direnv hook in your shell 101 | eval "$(direnv hook bash)" # or zsh, fish, etc. 
102 | 103 | # Clone the repository and enter the directory 104 | git clone https://github.com/MercuryTechnologies/openai.git 105 | cd openai 106 | 107 | # Allow direnv (this will automatically load the environment) 108 | direnv allow 109 | ``` 110 | 111 | ### Manual Setup 112 | 113 | Without Nix: 114 | 115 | ```bash 116 | # Clone the repository 117 | git clone https://github.com/MercuryTechnologies/openai.git 118 | cd openai 119 | 120 | # Build with cabal 121 | cabal build 122 | ``` 123 | 124 | ## Environment Variables 125 | 126 | Set your OpenAI API key as an environment variable: 127 | 128 | ```bash 129 | # Option 1: Set directly in your shell 130 | export OPENAI_KEY="your-openai-api-key" 131 | 132 | # Option 2: Using .envrc with direnv (recommended) 133 | (umask 077; cp .envrc.sample .envrc) 134 | # Edit .envrc to add your API key 135 | direnv allow 136 | ``` 137 | 138 | The API key is needed for running the test suite and example program. 139 | 140 | ## Testing 141 | 142 | Run the test suite: 143 | 144 | ```bash 145 | cabal test 146 | ``` 147 | 148 | The test suite is in the `tasty/` directory with test data located in `tasty/data/`. 149 | 150 | ## Running the Example 151 | 152 | ```bash 153 | # Make sure your API key is set (either via .envrc or export) 154 | # If using direnv with proper .envrc setup, this happens automatically 155 | 156 | # Build and run the example 157 | cabal run openai-example 158 | ``` 159 | -------------------------------------------------------------------------------- /dependencies/.gitkeep: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/MercuryTechnologies/openai/0246105e18f52f9c509b36f0769a137a8f822a93/dependencies/.gitkeep -------------------------------------------------------------------------------- /examples/README.md: -------------------------------------------------------------------------------- 1 | # OpenAI Haskell Examples 2 | 3 | This directory contains examples demonstrating various features of the OpenAI Haskell library. 4 | 5 | ## Examples 6 | 7 | ### [`openai-example`](./openai-example/) 8 | 9 | A simple example demonstrating basic chat completion with the OpenAI API. 10 | 11 | **Features:** 12 | 13 | - Basic chat completion 14 | - Simple text input/output 15 | - Minimal setup 16 | 17 | **Usage:** 18 | 19 | ```bash 20 | cabal run openai-example 21 | ``` 22 | 23 | ### [`weather-chatbot-example`](./weather-chatbot-example/) 24 | 25 | A chatbot example demonstrating tool calling and turn-based conversation flow. 26 | 27 | **Features:** 28 | 29 | - OpenAI Function Calling (Tools) 30 | - Interactive chatbot with conversation history 31 | - Mock weather tool implementation 32 | - Turn-based conversation flow 33 | - Multiple tool calls support 34 | 35 | **Usage:** 36 | 37 | ```bash 38 | cabal run weather-chatbot-example 39 | ``` 40 | 41 | ### [`responses-example`](./responses-example/) 42 | 43 | Minimal example using the Responses API to generate text output. 44 | 45 | **Features:** 46 | 47 | - Calls `/v1/responses` 48 | - Prints aggregated output text from response items 49 | 50 | **Usage:** 51 | 52 | ```bash 53 | cabal run responses-example 54 | ``` 55 | 56 | ### [`responses-stream-example`](./responses-stream-example/) 57 | 58 | Streams events from the Responses API and prints text deltas in real time. 
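Under the hood, the example hands a callback to `createResponseStreamTyped` and prints each text delta as it arrives; a simplified excerpt from [`Main.hs`](./responses-stream-example/Main.hs), where `req` is the `_CreateResponse` request built in the source and all other event types are ignored:

```haskell
let onEvent (Left err) = hPutStrLn stderr ("stream error: " <> T.unpack err)
    onEvent (Right ev) = case ev of
        -- Print model text deltas as they stream in; newline when the part is done
        Responses.ResponseTextDeltaEvent{ Responses.delta = d } ->
            TIO.putStr d >> hFlush stdout
        Responses.ResponseTextDoneEvent{} -> putStrLn ""
        _ -> pure ()

createResponseStreamTyped req onEvent
```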
59 | 60 | **Features:** 61 | 62 | - Calls `/v1/responses` with `stream = true` 63 | - Handles `response.output_text.delta` and prints deltas as they arrive 64 | 65 | **Usage:** 66 | 67 | ```bash 68 | cabal run responses-stream-example 69 | ``` 70 | 71 | ## Setup 72 | 73 | All examples require an OpenAI API key set as an environment variable: 74 | 75 | ```bash 76 | export OPENAI_KEY="your-openai-api-key-here" 77 | ``` 78 | 79 | ## Building 80 | 81 | Build all examples: 82 | 83 | ```bash 84 | cabal build 85 | ``` 86 | 87 | Build specific example: 88 | 89 | ```bash 90 | cabal build openai-example 91 | cabal build weather-chatbot-example 92 | cabal build responses-example 93 | cabal build responses-stream-example 94 | ``` 95 | -------------------------------------------------------------------------------- /examples/openai-example/Main.hs: -------------------------------------------------------------------------------- 1 | {-# LANGUAGE DuplicateRecordFields #-} 2 | {-# LANGUAGE NamedFieldPuns #-} 3 | {-# LANGUAGE OverloadedLists #-} 4 | {-# LANGUAGE OverloadedStrings #-} 5 | 6 | module Main where 7 | 8 | import Data.Foldable (traverse_) 9 | import OpenAI.V1 10 | import OpenAI.V1.Chat.Completions 11 | 12 | import qualified Data.Text as Text 13 | import qualified Data.Text.IO as Text.IO 14 | import qualified System.Environment as Environment 15 | 16 | main :: IO () 17 | main = do 18 | key <- Environment.getEnv "OPENAI_KEY" 19 | 20 | clientEnv <- getClientEnv "https://api.openai.com" 21 | 22 | let Methods{ createChatCompletion } = makeMethods clientEnv (Text.pack key) Nothing Nothing 23 | 24 | text <- Text.IO.getLine 25 | 26 | ChatCompletionObject{ choices } <- createChatCompletion _CreateChatCompletion 27 | { messages = [ User{ content = [ Text{ text } ], name = Nothing } ] 28 | , model = "gpt-4o-mini" 29 | } 30 | 31 | let display Choice{ message } = Text.IO.putStrLn (messageToContent message) 32 | 33 | traverse_ display choices 34 | -------------------------------------------------------------------------------- /examples/responses-example/Main.hs: -------------------------------------------------------------------------------- 1 | {-# LANGUAGE DuplicateRecordFields #-} 2 | {-# LANGUAGE NamedFieldPuns #-} 3 | {-# LANGUAGE OverloadedLists #-} 4 | {-# LANGUAGE OverloadedStrings #-} 5 | 6 | module Main where 7 | 8 | import Data.Foldable (toList) 9 | import System.Environment (getEnv) 10 | 11 | import qualified Data.Text as Text 12 | import qualified Data.Text.IO as TextIO 13 | import qualified OpenAI.V1 as V1 14 | import qualified OpenAI.V1.Responses as Responses 15 | 16 | main :: IO () 17 | main = do 18 | key <- Text.pack <$> getEnv "OPENAI_KEY" 19 | env <- V1.getClientEnv "https://api.openai.com" 20 | 21 | let V1.Methods{ createResponse } = V1.makeMethods env key Nothing Nothing 22 | 23 | let req = Responses._CreateResponse 24 | { Responses.model = "gpt-5" 25 | , Responses.input = Just (Responses.Input 26 | [ Responses.Item_InputMessage 27 | { Responses.role = Responses.User 28 | , Responses.content = [ Responses.Input_Text{ Responses.text = "Tell me a three sentence bedtime story about a unicorn." 
} ] 29 | , Responses.status = Nothing 30 | } 31 | ]) 32 | } 33 | 34 | resp <- createResponse req 35 | 36 | let texts = collectText resp 37 | mapM_ TextIO.putStrLn texts 38 | 39 | collectText :: Responses.ResponseObject -> [Text.Text] 40 | collectText Responses.ResponseObject{ Responses.output } = do 41 | Responses.Item_OutputMessage{ Responses.message_content } <- toList output 42 | Responses.Output_Text{ Responses.text } <- toList message_content 43 | return text 44 | -------------------------------------------------------------------------------- /examples/responses-stream-example/Main.hs: -------------------------------------------------------------------------------- 1 | {-# LANGUAGE DuplicateRecordFields #-} 2 | {-# LANGUAGE NamedFieldPuns #-} 3 | {-# LANGUAGE OverloadedLists #-} 4 | {-# LANGUAGE OverloadedStrings #-} 5 | 6 | module Main where 7 | 8 | import System.Environment (getEnv) 9 | import System.IO (hFlush, hPutStrLn, stderr, stdout) 10 | 11 | import qualified Data.Text as T 12 | import qualified Data.Text.IO as TIO 13 | import qualified OpenAI.V1 as V1 14 | import qualified OpenAI.V1.Responses as Responses 15 | import qualified OpenAI.V1.Tool as Tool 16 | 17 | main :: IO () 18 | main = do 19 | key <- T.pack <$> getEnv "OPENAI_KEY" 20 | env <- V1.getClientEnv "https://api.openai.com" 21 | 22 | let V1.Methods{ createResponseStreamTyped } = V1.makeMethods env key Nothing Nothing 23 | 24 | let onEvent (Left err) = hPutStrLn stderr ("stream error: " <> T.unpack err) 25 | onEvent (Right ev) = case ev of 26 | -- Only print model text deltas and newline on part done 27 | Responses.ResponseTextDeltaEvent{ Responses.delta = d } -> 28 | TIO.putStr d >> hFlush stdout 29 | Responses.ResponseTextDoneEvent{} -> putStrLn "" 30 | -- Ignore all other events for a clean output 31 | _ -> pure () 32 | 33 | -- 1) Cute haiku test (no tools) 34 | let reqHaiku = Responses._CreateResponse 35 | { Responses.model = "gpt-5-mini" 36 | , Responses.input = Just (Responses.Input 37 | [ Responses.Item_InputMessage 38 | { Responses.role = Responses.User 39 | , Responses.content = [ Responses.Input_Text{ Responses.text = "Write a short haiku about the sea." } ] 40 | , Responses.status = Nothing 41 | } 42 | ]) 43 | } 44 | 45 | createResponseStreamTyped reqHaiku onEvent 46 | 47 | putStrLn "--------------------------------" 48 | 49 | -- 2) Web search example 50 | let reqSearch = Responses._CreateResponse 51 | { Responses.model = "gpt-5-mini" 52 | , Responses.input = Just (Responses.Input 53 | [ Responses.Item_InputMessage 54 | { Responses.role = Responses.User 55 | , Responses.content = [ Responses.Input_Text{ Responses.text = "Use web_search to find current news about France and display a concise summary. Do not include citations, references, or URLs in the output; provide only the summary text." } ] 56 | , Responses.status = Nothing 57 | } 58 | ]) 59 | , Responses.tools = Just [ Tool.Tool_Web_Search ] 60 | } 61 | 62 | createResponseStreamTyped reqSearch onEvent 63 | 64 | putStrLn "--------------------------------" 65 | 66 | -- 3) Code interpreter example (per docs) 67 | let reqCode = Responses._CreateResponse 68 | { Responses.model = "gpt-5-mini" 69 | , Responses.instructions = Just "You are a personal math tutor. When asked a math question, write and run code using the python tool to answer the question." 
70 | , Responses.input = Just (Responses.Input 71 | [ Responses.Item_InputMessage 72 | { Responses.role = Responses.User 73 | , Responses.content = [ Responses.Input_Text{ Responses.text = "I need to solve the equation 3x + 11 = 14. Can you help me?" } ] 74 | , Responses.status = Nothing 75 | } 76 | ]) 77 | , Responses.tools = Just [ Tool.codeInterpreterAuto ] 78 | } 79 | 80 | createResponseStreamTyped reqCode onEvent 81 | -------------------------------------------------------------------------------- /examples/weather-chatbot-example/Main.hs: -------------------------------------------------------------------------------- 1 | {-# LANGUAGE DuplicateRecordFields #-} 2 | {-# LANGUAGE NamedFieldPuns #-} 3 | {-# LANGUAGE OverloadedLists #-} 4 | {-# LANGUAGE OverloadedStrings #-} 5 | 6 | module Main where 7 | 8 | import Control.Monad (foldM, when) 9 | import Data.Aeson (Value(..), (.=)) 10 | import Data.Text (Text) 11 | import Data.Vector (Vector) 12 | import GHC.Exts (toList) 13 | import OpenAI.V1 14 | import OpenAI.V1.Chat.Completions 15 | import OpenAI.V1.Tool 16 | import OpenAI.V1.ToolCall 17 | 18 | import qualified Data.Aeson as Aeson 19 | import qualified Data.ByteString.Lazy as ByteString.Lazy 20 | import qualified Data.Text as Text 21 | import qualified Data.Text.Encoding as Text.Encoding 22 | import qualified Data.Text.IO as Text.IO 23 | import qualified Data.Vector as Vector 24 | import qualified System.Environment as Environment 25 | 26 | -- | Mock weather data for different cities 27 | mockWeatherData :: Text -> Text 28 | mockWeatherData city = case Text.toLower city of 29 | "london" -> "London: 18°C, partly cloudy, 60% humidity, light rain expected" 30 | "paris" -> "Paris: 22°C, sunny, 45% humidity, clear skies" 31 | "tokyo" -> "Tokyo: 28°C, humid, 80% humidity, thunderstorms possible" 32 | "new york" -> "New York: 25°C, partly sunny, 55% humidity, pleasant weather" 33 | "san francisco" -> "San Francisco: 16°C, foggy, 70% humidity, typical coastal weather" 34 | _ -> city <> ": Weather data not available for this location. Try London, Paris, Tokyo, New York, or San Francisco." 
35 | 36 | -- | Process a weather tool call and return the result 37 | processWeatherToolCall :: ToolCall -> IO (Message (Vector Content)) 38 | processWeatherToolCall (ToolCall_Function toolCallId function) = do 39 | let functionName = OpenAI.V1.ToolCall.name function 40 | arguments = OpenAI.V1.ToolCall.arguments function 41 | 42 | putStrLn $ "🔧 Processing tool call: " <> Text.unpack functionName 43 | putStrLn $ " Arguments: " <> Text.unpack arguments 44 | 45 | -- Parse arguments to extract city name 46 | let weatherResult = case Aeson.decode (ByteString.Lazy.fromStrict $ Text.Encoding.encodeUtf8 arguments) of 47 | Just (Object obj) -> case lookup "city" (toList obj) of 48 | Just (String cityName) -> mockWeatherData cityName 49 | _ -> "Error: Could not parse city name from arguments" 50 | _ -> "Error: Invalid arguments format" 51 | 52 | putStrLn $ " Result: " <> Text.unpack weatherResult 53 | putStrLn "" 54 | 55 | -- Return a Tool message with the result 56 | return $ 57 | Tool 58 | { content = [Text weatherResult], 59 | tool_call_id = toolCallId 60 | } 61 | 62 | -- | Define the weather tool 63 | weatherTool :: Tool 64 | weatherTool = 65 | Tool_Function $ 66 | OpenAI.V1.Tool.Function 67 | { description = Just "Get current weather information for a city", 68 | name = "get_weather", 69 | parameters = 70 | Just $ 71 | Aeson.object 72 | [ "type" .= ("object" :: Text), 73 | "properties" 74 | .= Aeson.object 75 | [ "city" 76 | .= Aeson.object 77 | [ "type" .= ("string" :: Text), 78 | "description" .= ("The city name to get weather for" :: Text) 79 | ] 80 | ], 81 | "required" .= (["city"] :: [Text]), 82 | "additionalProperties" .= False 83 | ], 84 | strict = Just True 85 | } 86 | 87 | -- | Convert a Message Text to Message (Vector Content) 88 | convertMessage :: Message Text -> Message (Vector Content) 89 | convertMessage (System {content, name}) = System {content = [Text content], name = name} 90 | convertMessage (User {content, name}) = User {content = [Text content], name = name} 91 | convertMessage (Assistant {assistant_content, refusal, name, assistant_audio, tool_calls}) = 92 | Assistant 93 | { assistant_content = fmap (\c -> [Text c]) assistant_content, 94 | refusal = refusal, 95 | name = name, 96 | assistant_audio = assistant_audio, 97 | tool_calls = tool_calls 98 | } 99 | convertMessage (Tool {content, tool_call_id}) = Tool {content = [Text content], tool_call_id = tool_call_id} 100 | 101 | -- | Process all tool calls in a message and return tool response messages 102 | processToolCalls :: Vector ToolCall -> IO (Vector (Message (Vector Content))) 103 | processToolCalls toolCalls = do 104 | when (not $ Vector.null toolCalls) $ do 105 | putStrLn $ "🤖 Processing " <> show (Vector.length toolCalls) <> " tool call(s)..." 106 | Vector.mapM processWeatherToolCall toolCalls 107 | 108 | -- | Main chat loop 109 | chatLoop :: (CreateChatCompletion -> IO ChatCompletionObject) -> Vector (Message (Vector Content)) -> IO () 110 | chatLoop createChatCompletion messages = do 111 | Text.IO.putStr "You: " 112 | userInput <- Text.IO.getLine 113 | 114 | -- Exit condition 115 | when (Text.toLower userInput == "quit" || Text.toLower userInput == "exit") $ do 116 | putStrLn "Goodbye! 
👋" 117 | return () 118 | 119 | -- Add user message to conversation 120 | let userMessage = 121 | User 122 | { content = [Text userInput], 123 | name = Nothing 124 | } 125 | updatedMessages = messages <> [userMessage] 126 | 127 | -- Make API call with tool support 128 | response <- 129 | createChatCompletion 130 | _CreateChatCompletion 131 | { messages = updatedMessages, 132 | model = "gpt-4o-mini", 133 | tools = Just [weatherTool], 134 | tool_choice = Just ToolChoiceAuto 135 | } 136 | 137 | let ChatCompletionObject {choices} = response 138 | 139 | -- Process each choice (usually just one) 140 | newMessages <- foldM (processChoice createChatCompletion) updatedMessages choices 141 | 142 | -- Continue the conversation 143 | chatLoop createChatCompletion newMessages 144 | 145 | -- | Process a single choice, handling potential tool calls 146 | processChoice :: 147 | (CreateChatCompletion -> IO ChatCompletionObject) -> 148 | Vector (Message (Vector Content)) -> 149 | Choice -> 150 | IO (Vector (Message (Vector Content))) 151 | processChoice createChatCompletion messages choice = do 152 | let Choice {message = assistantMessage} = choice 153 | 154 | case assistantMessage of 155 | Assistant {assistant_content, tool_calls = Just toolCalls} -> do 156 | -- Display assistant's message if any 157 | case assistant_content of 158 | Just content -> putStrLn $ "Assistant: " <> Text.unpack content 159 | Nothing -> return () 160 | 161 | -- Add assistant message to history 162 | let messagesWithAssistant = messages <> [convertMessage assistantMessage] 163 | 164 | -- Process tool calls 165 | toolResults <- processToolCalls toolCalls 166 | let messagesWithTools = messagesWithAssistant <> toolResults 167 | 168 | -- Make another API call to get the final response after tool calls 169 | finalResponse <- 170 | createChatCompletion 171 | _CreateChatCompletion 172 | { messages = messagesWithTools, 173 | model = "gpt-4o-mini", 174 | tools = Just [weatherTool], 175 | tool_choice = Just ToolChoiceAuto 176 | } 177 | 178 | let ChatCompletionObject {choices = finalChoices} = finalResponse 179 | 180 | -- Process the final response (this should not have tool calls) 181 | foldM processFinalChoice messagesWithTools finalChoices 182 | Assistant {assistant_content = Just content} -> do 183 | -- No tool calls, just display the response 184 | putStrLn $ "Assistant: " <> Text.unpack content 185 | return $ messages <> [convertMessage assistantMessage] 186 | _ -> do 187 | putStrLn "Assistant: (No response content)" 188 | return messages 189 | 190 | -- | Process final choice after tool calls (should just be text response) 191 | processFinalChoice :: Vector (Message (Vector Content)) -> Choice -> IO (Vector (Message (Vector Content))) 192 | processFinalChoice messages choice = do 193 | let Choice {message = finalMessage} = choice 194 | case finalMessage of 195 | Assistant {assistant_content = Just content} -> do 196 | putStrLn $ "Assistant: " <> Text.unpack content 197 | return $ messages <> [convertMessage finalMessage] 198 | _ -> do 199 | putStrLn "Assistant: (Unexpected response format)" 200 | return messages 201 | 202 | main :: IO () 203 | main = do 204 | putStrLn "🌤️ Weather Chatbot with Tool Calling" 205 | putStrLn "======================================" 206 | putStrLn "Ask me about the weather in different cities!" 207 | putStrLn "Try: 'What's the weather like in London?'" 208 | putStrLn "Available cities: London, Paris, Tokyo, New York, San Francisco" 209 | putStrLn "Type 'quit' or 'exit' to end the conversation." 
210 | putStrLn "" 211 | 212 | -- Get OpenAI API key 213 | key <- Environment.getEnv "OPENAI_KEY" 214 | 215 | -- Set up client 216 | clientEnv <- getClientEnv "https://api.openai.com" 217 | let Methods {createChatCompletion} = makeMethods clientEnv (Text.pack key) Nothing Nothing 218 | 219 | -- Initial system message 220 | let systemMessage = 221 | System 222 | { content = [Text "You are a helpful weather assistant. Use the get_weather tool to provide current weather information when users ask about weather in specific cities. Be conversational and helpful."], 223 | name = Nothing 224 | } 225 | 226 | -- Start the chat loop 227 | chatLoop createChatCompletion [systemMessage] 228 | -------------------------------------------------------------------------------- /examples/weather-chatbot-example/README.md: -------------------------------------------------------------------------------- 1 | # Weather Chatbot Example 2 | 3 | This example demonstrates how to create a chatbot using the OpenAI Haskell library with tool calling functionality. The chatbot can respond to weather queries by calling a mock weather tool. 4 | 5 | ## Features 6 | 7 | - **Tool Calling**: Demonstrates how to define and use tools with the OpenAI API 8 | - **Turn-based Chat**: Interactive chatbot that maintains conversation context 9 | - **Multiple Tool Calls**: The LLM can make multiple tool calls before responding to the user 10 | - **Mock Data**: Uses stub weather data for demonstration purposes 11 | 12 | ## Setup 13 | 14 | 1. Make sure you have an OpenAI API key set as an environment variable: 15 | 16 | ```bash 17 | export OPENAI_KEY="your-openai-api-key-here" 18 | ``` 19 | 20 | 2. Build the example: 21 | 22 | ```bash 23 | cabal build weather-chatbot-example 24 | ``` 25 | 26 | 3. Run the example: 27 | ```bash 28 | cabal run weather-chatbot-example 29 | ``` 30 | 31 | ## Usage 32 | 33 | Once running, you can interact with the weather chatbot: 34 | 35 | ``` 36 | 🌤️ Weather Chatbot with Tool Calling 37 | ====================================== 38 | Ask me about the weather in different cities! 39 | Try: 'What's the weather like in London?' 40 | Available cities: London, Paris, Tokyo, New York, San Francisco 41 | Type 'quit' or 'exit' to end the conversation. 42 | 43 | You: What's the weather like in London? 44 | 🤖 Processing 1 tool call(s)... 
45 | 🔧 Processing tool call: get_weather 46 | Arguments: {"city":"London"} 47 | Result: London: 18°C, partly cloudy, 60% humidity, light rain expected 48 | ``` 49 | -------------------------------------------------------------------------------- /flake.lock: -------------------------------------------------------------------------------- 1 | { 2 | "nodes": { 3 | "all-cabal-hashes": { 4 | "flake": false, 5 | "locked": { 6 | "lastModified": 1733343438, 7 | "narHash": "sha256-F8RLjLuh+p1QDVLAuAE5a5sXvOtCEeWamieW0wHDmwk=", 8 | "owner": "commercialhaskell", 9 | "repo": "all-cabal-hashes", 10 | "rev": "0bc1ae56faa4daed288b83eab1549ffa88055691", 11 | "type": "github" 12 | }, 13 | "original": { 14 | "owner": "commercialhaskell", 15 | "ref": "hackage", 16 | "repo": "all-cabal-hashes", 17 | "type": "github" 18 | } 19 | }, 20 | "flake-utils": { 21 | "inputs": { 22 | "systems": "systems" 23 | }, 24 | "locked": { 25 | "lastModified": 1731533236, 26 | "narHash": "sha256-l0KFg5HjrsfsO/JpG+r7fRrqm12kzFHyUHqHCVpMMbI=", 27 | "owner": "numtide", 28 | "repo": "flake-utils", 29 | "rev": "11707dc2f618dd54ca8739b309ec4fc024de578b", 30 | "type": "github" 31 | }, 32 | "original": { 33 | "id": "flake-utils", 34 | "type": "indirect" 35 | } 36 | }, 37 | "nixpkgs": { 38 | "locked": { 39 | "lastModified": 1733097829, 40 | "narHash": "sha256-9hbb1rqGelllb4kVUCZ307G2k3/UhmA8PPGBoyuWaSw=", 41 | "owner": "NixOS", 42 | "repo": "nixpkgs", 43 | "rev": "2c15aa59df0017ca140d9ba302412298ab4bf22a", 44 | "type": "github" 45 | }, 46 | "original": { 47 | "id": "nixpkgs", 48 | "type": "indirect" 49 | } 50 | }, 51 | "root": { 52 | "inputs": { 53 | "all-cabal-hashes": "all-cabal-hashes", 54 | "flake-utils": "flake-utils", 55 | "nixpkgs": "nixpkgs" 56 | } 57 | }, 58 | "systems": { 59 | "locked": { 60 | "lastModified": 1681028828, 61 | "narHash": "sha256-Vy1rq5AaRuLzOxct8nz4T6wlgyUR7zLU309k9mBC768=", 62 | "owner": "nix-systems", 63 | "repo": "default", 64 | "rev": "da67096a3b9bf56a91d16901293e51ba5b49a27e", 65 | "type": "github" 66 | }, 67 | "original": { 68 | "owner": "nix-systems", 69 | "repo": "default", 70 | "type": "github" 71 | } 72 | } 73 | }, 74 | "root": "root", 75 | "version": 7 76 | } 77 | -------------------------------------------------------------------------------- /flake.nix: -------------------------------------------------------------------------------- 1 | { inputs.all-cabal-hashes = { 2 | url = "github:commercialhaskell/all-cabal-hashes/hackage"; 3 | 4 | flake = false; 5 | }; 6 | 7 | outputs = { all-cabal-hashes, flake-utils, nixpkgs, self }: 8 | flake-utils.lib.eachDefaultSystem (system: 9 | let 10 | config.allowUnfree = true; 11 | 12 | pkgs = import nixpkgs { 13 | inherit config system; 14 | 15 | overlays = [ self.overlays.default ]; 16 | }; 17 | 18 | in 19 | { packages.default = pkgs.haskellPackagesCustom.openai; 20 | 21 | devShells.default = pkgs.haskellPackagesCustom.shellFor { 22 | packages = hpkgs: [ 23 | (pkgs.haskell.lib.doCheck hpkgs.openai) 24 | ]; 25 | 26 | nativeBuildInputs = [ 27 | pkgs.haskell-language-server 28 | pkgs.stylish-haskell 29 | 30 | (pkgs.vscode-with-extensions.override { 31 | vscodeExtensions = [ 32 | pkgs.vscode-extensions.haskell.haskell 33 | pkgs.vscode-extensions.justusadam.language-haskell 34 | ]; 35 | }) 36 | ]; 37 | 38 | withHoogle = true; 39 | 40 | doBenchmark = true; 41 | }; 42 | } 43 | ) // { 44 | overlays.default = self: super: { 45 | inherit all-cabal-hashes; 46 | 47 | haskellPackagesCustom = self.haskellPackages.override (old: { 48 | overrides = 49 | let 50 | hlib = 
self.haskell.lib.compose; 51 | in 52 | self.lib.composeManyExtensions [ 53 | (hlib.packageSourceOverrides { 54 | openai = ./.; 55 | }) 56 | 57 | (hlib.packagesFromDirectory { 58 | directory = ./dependencies; 59 | }) 60 | 61 | (hself: hsuper: { 62 | openai = hlib.dontCheck hsuper.openai; 63 | }) 64 | ]; 65 | }); 66 | }; 67 | }; 68 | } 69 | -------------------------------------------------------------------------------- /openai.cabal: -------------------------------------------------------------------------------- 1 | cabal-version: 2.4 2 | name: openai 3 | version: 1.2.0 4 | synopsis: Servant bindings to OpenAI 5 | description: This package provides comprehensive and type-safe bindings 6 | to OpenAI, providing both a Servant interface and 7 | non-Servant interface for convenience. 8 | . 9 | Read the @README@ below for a fully worked usage example. 10 | . 11 | Otherwise, browse the "OpenAI.V1" module, which is the 12 | intended package entrypoint. 13 | license: BSD-3-Clause 14 | license-file: LICENSE 15 | author: Gabriella Gonzalez 16 | maintainer: GenuineGabriella@gmail.com 17 | category: AI, API, Web 18 | copyright: 2024 Gabriella Gonzalez 19 | build-type: Simple 20 | extra-doc-files: CHANGELOG.md 21 | extra-source-files: README.md 22 | 23 | source-repository head 24 | type: git 25 | location: https://github.com/MercuryTechnologies/openai 26 | 27 | library 28 | default-language: Haskell2010 29 | hs-source-dirs: src 30 | build-depends: base >=4.15.0.0 && < 5 31 | , aeson 32 | , bytestring 33 | , containers 34 | , filepath 35 | , http-api-data 36 | , http-client 37 | , http-client-tls 38 | , http-types 39 | , servant 40 | , servant-multipart-api 41 | , servant-client 42 | , servant-multipart-client 43 | , text 44 | , time 45 | , vector 46 | exposed-modules: OpenAI.V1 47 | OpenAI.V1.Assistants 48 | OpenAI.V1.Audio 49 | OpenAI.V1.Audio.Speech 50 | OpenAI.V1.Audio.Transcriptions 51 | OpenAI.V1.Audio.Translations 52 | OpenAI.V1.AutoOr 53 | OpenAI.V1.Batches 54 | OpenAI.V1.Chat.Completions 55 | OpenAI.V1.ChunkingStrategy 56 | OpenAI.V1.DeletionStatus 57 | OpenAI.V1.Embeddings 58 | OpenAI.V1.Error 59 | OpenAI.V1.Files 60 | OpenAI.V1.FineTuning.Jobs 61 | OpenAI.V1.Images 62 | OpenAI.V1.Images.Edits 63 | OpenAI.V1.Images.Generations 64 | OpenAI.V1.Images.Image 65 | OpenAI.V1.Images.ResponseFormat 66 | OpenAI.V1.Images.Variations 67 | OpenAI.V1.ListOf 68 | OpenAI.V1.Message 69 | OpenAI.V1.Models 70 | OpenAI.V1.Moderations 71 | OpenAI.V1.Order 72 | OpenAI.V1.ResponseFormat 73 | OpenAI.V1.Responses 74 | OpenAI.V1.Threads 75 | OpenAI.V1.Threads.Messages 76 | OpenAI.V1.Threads.Runs 77 | OpenAI.V1.Threads.Runs.Steps 78 | OpenAI.V1.Tool 79 | OpenAI.V1.ToolCall 80 | OpenAI.V1.ToolResources 81 | OpenAI.V1.Uploads 82 | OpenAI.V1.Usage 83 | OpenAI.V1.VectorStores 84 | OpenAI.V1.VectorStores.FileCounts 85 | OpenAI.V1.VectorStores.Files 86 | OpenAI.V1.VectorStores.FileBatches 87 | OpenAI.V1.VectorStores.Status 88 | other-modules: OpenAI.Prelude 89 | default-extensions: DataKinds 90 | , DeriveAnyClass 91 | , DeriveGeneric 92 | , DerivingStrategies 93 | , DuplicateRecordFields 94 | , FlexibleInstances 95 | , GeneralizedNewtypeDeriving 96 | , OverloadedLists 97 | , OverloadedStrings 98 | , RecordWildCards 99 | , MultiParamTypeClasses 100 | , NamedFieldPuns 101 | , TypeApplications 102 | , TypeOperators 103 | , ViewPatterns 104 | ghc-options: -Wall -Wno-missing-fields 105 | 106 | test-suite tasty 107 | default-language: Haskell2010 108 | type: exitcode-stdio-1.0 109 | hs-source-dirs: tasty 110 | main-is: Main.hs 111 
| build-depends: base 112 | , aeson 113 | , http-client 114 | , http-client-tls 115 | , openai 116 | , servant-client 117 | , tasty 118 | , tasty-hunit 119 | , text 120 | ghc-options: -Wall 121 | 122 | executable openai-example 123 | default-language: Haskell2010 124 | hs-source-dirs: examples/openai-example 125 | main-is: Main.hs 126 | build-depends: base 127 | , openai 128 | , text 129 | ghc-options: -Wall 130 | 131 | executable weather-chatbot-example 132 | default-language: Haskell2010 133 | hs-source-dirs: examples/weather-chatbot-example 134 | main-is: Main.hs 135 | build-depends: base 136 | , aeson 137 | , bytestring 138 | , openai 139 | , text 140 | , vector 141 | ghc-options: -Wall 142 | 143 | executable responses-example 144 | default-language: Haskell2010 145 | hs-source-dirs: examples/responses-example 146 | main-is: Main.hs 147 | build-depends: base 148 | , openai 149 | , text 150 | , vector 151 | ghc-options: -Wall 152 | 153 | executable responses-stream-example 154 | default-language: Haskell2010 155 | hs-source-dirs: examples/responses-stream-example 156 | main-is: Main.hs 157 | build-depends: base 158 | , aeson 159 | , openai 160 | , text 161 | ghc-options: -Wall 162 | -------------------------------------------------------------------------------- /src/OpenAI/Prelude.hs: -------------------------------------------------------------------------------- 1 | module OpenAI.Prelude 2 | ( -- * JSON 3 | aesonOptions 4 | , stripPrefix 5 | , labelModifier 6 | -- * Multipart Form Data 7 | , input 8 | , renderIntegral 9 | , renderRealFloat 10 | , getExtension 11 | -- * Re-exports 12 | , module Data.Aeson 13 | , module Data.ByteString.Lazy 14 | , module Data.List.NonEmpty 15 | , module Data.Map 16 | , module Data.String 17 | , module Data.Text 18 | , module Data.Time.Clock.POSIX 19 | , module Data.Vector 20 | , module Data.Void 21 | , module Data.Word 22 | , module GHC.Generics 23 | , module Numeric.Natural 24 | , module Servant.API 25 | , module Servant.Multipart.API 26 | , module Web.HttpApiData 27 | ) where 28 | 29 | import Data.ByteString.Lazy (ByteString) 30 | import Data.List.NonEmpty (NonEmpty(..)) 31 | import Data.Map (Map) 32 | import Data.String (IsString(..)) 33 | import Data.Text (Text) 34 | import Data.Time.Clock.POSIX (POSIXTime) 35 | import Data.Vector (Vector) 36 | import Data.Void (Void) 37 | import Data.Word (Word8) 38 | import GHC.Generics (Generic) 39 | import Numeric.Natural (Natural) 40 | import Web.HttpApiData (ToHttpApiData(..)) 41 | 42 | import Data.Aeson 43 | ( FromJSON(..) 44 | , Options(..) 45 | , SumEncoding(..) 46 | , ToJSON(..) 47 | , Value(..) 48 | , genericParseJSON 49 | , genericToJSON 50 | ) 51 | import Servant.API 52 | ( Accept(..) 53 | , Capture 54 | , Delete 55 | , Get 56 | , Header' 57 | , JSON 58 | , MimeUnrender(..) 59 | , OctetStream 60 | , Optional 61 | , Post 62 | , QueryParam 63 | , ReqBody 64 | , Required 65 | , Strict 66 | , (:<|>)(..) 67 | , (:>) 68 | ) 69 | import Servant.Multipart.API 70 | ( FileData(..) 71 | , Input(..) 72 | , MultipartData(..) 73 | , MultipartForm 74 | , Tmp 75 | , ToMultipart(..) 
76 | ) 77 | 78 | import qualified Data.Aeson as Aeson 79 | import qualified Data.Char as Char 80 | import qualified Data.List as List 81 | import qualified Data.Text as Text 82 | import qualified Data.Text.Lazy as Text.Lazy 83 | import qualified Data.Text.Lazy.Builder as Builder 84 | import qualified Data.Text.Lazy.Builder.Int as Int 85 | import qualified Data.Text.Lazy.Builder.RealFloat as RealFloat 86 | import qualified System.FilePath as FilePath 87 | 88 | dropTrailingUnderscore :: String -> String 89 | dropTrailingUnderscore "_" = "" 90 | dropTrailingUnderscore "" = "" 91 | dropTrailingUnderscore (c : cs) = c : dropTrailingUnderscore cs 92 | 93 | labelModifier :: String -> String 94 | labelModifier = map Char.toLower . dropTrailingUnderscore 95 | 96 | stripPrefix :: String -> String -> String 97 | stripPrefix prefix string = labelModifier suffix 98 | where 99 | suffix = case List.stripPrefix prefix string of 100 | Nothing -> string 101 | Just x -> x 102 | 103 | aesonOptions :: Options 104 | aesonOptions = Aeson.defaultOptions 105 | { fieldLabelModifier = labelModifier 106 | , constructorTagModifier = labelModifier 107 | , omitNothingFields = True 108 | } 109 | 110 | input :: Text -> Text -> [ Input ] 111 | input iName iValue = [ Input{..} ] 112 | 113 | renderIntegral :: Integral number => number -> Text 114 | renderIntegral number = Text.Lazy.toStrict (Builder.toLazyText builder) 115 | where 116 | builder = Int.decimal number 117 | 118 | renderRealFloat :: RealFloat number => number -> Text 119 | renderRealFloat number = Text.Lazy.toStrict (Builder.toLazyText builder) 120 | where 121 | builder = RealFloat.formatRealFloat RealFloat.Fixed Nothing number 122 | 123 | getExtension :: FilePath -> Text 124 | getExtension file = Text.pack (drop 1 (FilePath.takeExtension file)) 125 | -------------------------------------------------------------------------------- /src/OpenAI/V1/Assistants.hs: -------------------------------------------------------------------------------- 1 | -- | @\/v1\/assistants@ 2 | module OpenAI.V1.Assistants 3 | ( -- * Main types 4 | AssistantID(..) 5 | , CreateAssistant(..) 6 | , _CreateAssistant 7 | , ModifyAssistant(..) 8 | , _ModifyAssistant 9 | , AssistantObject(..) 10 | -- * Other types 11 | , RankingOptions 12 | , FileSearch(..) 13 | , Function(..) 14 | , Tool(..) 15 | , CodeInterpreterResources(..) 16 | , FileSearchResources(..) 17 | , ToolResources(..) 
18 | -- * Servant 19 | , API 20 | ) where 21 | 22 | import OpenAI.Prelude 23 | import OpenAI.V1.AutoOr 24 | import OpenAI.V1.DeletionStatus 25 | import OpenAI.V1.ListOf 26 | import OpenAI.V1.Models (Model) 27 | import OpenAI.V1.Order 28 | import OpenAI.V1.ResponseFormat 29 | import OpenAI.V1.Tool 30 | import OpenAI.V1.ToolResources 31 | 32 | -- | AssistantID 33 | newtype AssistantID = AssistantID{ text :: Text } 34 | deriving newtype (FromJSON, IsString, Show, ToHttpApiData, ToJSON) 35 | 36 | -- | Request body for @\/v1\/assistants@ 37 | data CreateAssistant = CreateAssistant 38 | { model :: Model 39 | , name :: Maybe Text 40 | , description :: Maybe Text 41 | , instructions :: Maybe Text 42 | , tools :: Maybe (Vector Tool) 43 | , tool_resources :: Maybe ToolResources 44 | , metadata :: Maybe (Map Text Text) 45 | , temperature :: Maybe Double 46 | , top_p :: Maybe Double 47 | , response_format :: Maybe (AutoOr ResponseFormat) 48 | } deriving stock (Generic, Show) 49 | 50 | instance FromJSON CreateAssistant where 51 | parseJSON = genericParseJSON aesonOptions 52 | 53 | instance ToJSON CreateAssistant where 54 | toJSON = genericToJSON aesonOptions 55 | 56 | -- | Default `CreateAssistant` 57 | _CreateAssistant :: CreateAssistant 58 | _CreateAssistant = CreateAssistant 59 | { name = Nothing 60 | , description = Nothing 61 | , instructions = Nothing 62 | , tools = Nothing 63 | , tool_resources = Nothing 64 | , metadata = Nothing 65 | , temperature = Nothing 66 | , top_p = Nothing 67 | , response_format = Nothing 68 | } 69 | 70 | -- | Request body for @\/v1\/assistants/:assistant_id@ 71 | data ModifyAssistant = ModifyAssistant 72 | { model :: Model 73 | , name :: Maybe Text 74 | , description :: Maybe Text 75 | , instructions :: Maybe Text 76 | , tools :: Maybe (Vector Tool) 77 | , tool_resources :: Maybe ToolResources 78 | , metadata :: Maybe (Map Text Text) 79 | , temperature :: Maybe Double 80 | , top_p :: Maybe Double 81 | , response_format :: Maybe (AutoOr ResponseFormat) 82 | } deriving stock (Generic, Show) 83 | 84 | instance FromJSON ModifyAssistant where 85 | parseJSON = genericParseJSON aesonOptions 86 | 87 | instance ToJSON ModifyAssistant where 88 | toJSON = genericToJSON aesonOptions 89 | 90 | -- | Default `ModifyAssistant` 91 | _ModifyAssistant :: ModifyAssistant 92 | _ModifyAssistant = ModifyAssistant 93 | { name = Nothing 94 | , description = Nothing 95 | , instructions = Nothing 96 | , tools = Nothing 97 | , tool_resources = Nothing 98 | , metadata = Nothing 99 | , temperature = Nothing 100 | , top_p = Nothing 101 | , response_format = Nothing 102 | } 103 | 104 | -- | Represents an assistant that can call the model and use tools. 
105 | data AssistantObject = AssistantObject 106 | { id :: AssistantID 107 | , object :: Text 108 | , created_at :: POSIXTime 109 | , name :: Maybe Text 110 | , description :: Maybe Text 111 | , model :: Model 112 | , instructions :: Maybe Text 113 | , tools :: Maybe (Vector Tool) 114 | , tool_resources :: Maybe ToolResources 115 | , metadata :: Map Text Text 116 | , temperature :: Maybe Double 117 | , top_p :: Maybe Double 118 | , response_format :: AutoOr ResponseFormat 119 | } deriving stock (Generic, Show) 120 | deriving anyclass (FromJSON, ToJSON) 121 | 122 | -- | Servant API 123 | type API = 124 | Header' '[Required, Strict] "OpenAI-Beta" Text 125 | :> "assistants" 126 | :> ( ReqBody '[JSON] CreateAssistant 127 | :> Post '[JSON] AssistantObject 128 | :<|> QueryParam "limit" Natural 129 | :> QueryParam "order" Order 130 | :> QueryParam "after" Text 131 | :> QueryParam "before" Text 132 | :> Get '[JSON] (ListOf AssistantObject) 133 | :<|> Capture "assistant_id" AssistantID 134 | :> Get '[JSON] AssistantObject 135 | :<|> Capture "assistant_id" AssistantID 136 | :> ReqBody '[JSON] ModifyAssistant 137 | :> Post '[JSON] AssistantObject 138 | :<|> Capture "assistant_id" AssistantID 139 | :> Delete '[JSON] DeletionStatus 140 | ) 141 | -------------------------------------------------------------------------------- /src/OpenAI/V1/Audio.hs: -------------------------------------------------------------------------------- 1 | -- | @\/v1\/audio@ 2 | module OpenAI.V1.Audio 3 | ( -- * Servant 4 | API 5 | ) where 6 | 7 | import OpenAI.Prelude 8 | 9 | import qualified OpenAI.V1.Audio.Speech as Speech 10 | import qualified OpenAI.V1.Audio.Transcriptions as Transcriptions 11 | import qualified OpenAI.V1.Audio.Translations as Translations 12 | 13 | -- | Servant API 14 | type API = 15 | "audio" :> (Speech.API :<|> Transcriptions.API :<|> Translations.API) 16 | -------------------------------------------------------------------------------- /src/OpenAI/V1/Audio/Speech.hs: -------------------------------------------------------------------------------- 1 | -- | [@\/v1\/audio\/speech@](https://platform.openai.com/docs/api-reference/audio/createSpeech) 2 | module OpenAI.V1.Audio.Speech 3 | ( -- * Main types 4 | CreateSpeech(..) 5 | , _CreateSpeech 6 | -- * Other types 7 | , Voice(..) 8 | , Format(..) 9 | -- * Servant 10 | , ContentType(..) 11 | , API 12 | ) where 13 | 14 | import OpenAI.Prelude 15 | import OpenAI.V1.Models (Model) 16 | 17 | -- | The voice to use when generating the audio 18 | -- 19 | -- Previews of the voices are available in the 20 | -- [Text to speech guide](https://platform.openai.com/docs/guides/text-to-speech#voice-options). 21 | data Voice = Alloy | Ash | Ballad | Coral | Echo | Fable | Nova | Onyx | Sage | Shimmer 22 | deriving stock (Bounded, Enum, Generic, Show) 23 | 24 | instance FromJSON Voice where 25 | parseJSON = genericParseJSON aesonOptions 26 | 27 | instance ToJSON Voice where 28 | toJSON = genericToJSON aesonOptions 29 | 30 | -- | The format to generate the audio in 31 | data Format = MP3 | Opus | AAC | FLAC | WAV | PCM 32 | deriving stock (Bounded, Enum, Generic, Show) 33 | 34 | instance FromJSON Format where 35 | parseJSON = genericParseJSON aesonOptions 36 | 37 | instance ToJSON Format where 38 | toJSON = genericToJSON aesonOptions 39 | 40 | -- | Request body for @\/v1\/audio\/speech@ 41 | data CreateSpeech = CreateSpeech 42 | { model :: Model -- ^ Note that only [TTS models](https://platform.openai.com/docs/models#tts) support speech synthesis. 
43 | , input :: Text -- ^ The text to generate audio for. The maximum length is 4096 characters. 44 | , voice :: Voice 45 | , instructions :: Maybe Text -- ^ Instructions for the model to follow when generating the audio. 46 | -- Does not work with @tts-1@ or @tts-1-hd@ models. 47 | , response_format :: Maybe Format -- ^ Defaults to 'MP3'. 48 | , speed :: Maybe Double -- ^ Defaults to 1.0 (normal speed). Should be between 0.25 and 4.0. 49 | -- Does not work with @gpt-4o-mini-tts@ model. 50 | } deriving stock (Generic, Show) 51 | 52 | instance ToJSON CreateSpeech where 53 | toJSON = genericToJSON aesonOptions 54 | 55 | instance FromJSON CreateSpeech where 56 | parseJSON = genericParseJSON aesonOptions 57 | 58 | -- | Default `CreateSpeech` 59 | _CreateSpeech :: CreateSpeech 60 | _CreateSpeech = CreateSpeech 61 | { instructions = Nothing 62 | , response_format = Nothing 63 | , speed = Nothing 64 | } 65 | 66 | -- | Content type 67 | data ContentType = ContentType 68 | 69 | instance Accept ContentType where 70 | contentTypes _ = 71 | "audio/mpeg" 72 | :| [ "audio/flac" 73 | , "audio/wav" 74 | , "audio/aac" 75 | , "audio/opus" 76 | , "audio/pcm" 77 | ] 78 | 79 | instance MimeUnrender ContentType ByteString where 80 | mimeUnrender _ bytes = Right bytes 81 | 82 | -- | Servant API 83 | type API = 84 | "speech" :> ReqBody '[JSON] CreateSpeech :> Post '[ContentType] ByteString 85 | -------------------------------------------------------------------------------- /src/OpenAI/V1/Audio/Transcriptions.hs: -------------------------------------------------------------------------------- 1 | -- | @\/v1\/audio\/transcriptions@ 2 | -- 3 | -- To simplify things, this only supports the @verbose_json@ response format 4 | -- and also only supports the @segment@ granularity 5 | module OpenAI.V1.Audio.Transcriptions 6 | ( -- * Main types 7 | CreateTranscription(..) 8 | , _CreateTranscription 9 | , TranscriptionObject(..) 10 | -- * Other types 11 | , Segment(..) 12 | -- * Servant 13 | , API 14 | ) where 15 | 16 | import OpenAI.Prelude as OpenAI.Prelude 17 | import OpenAI.V1.Models (Model(..)) 18 | 19 | import qualified Data.Text as Text 20 | 21 | -- | Request body for @\/v1\/audio\/transcriptions@ 22 | data CreateTranscription = CreateTranscription 23 | { file :: FilePath 24 | , model :: Model 25 | , language :: Maybe Text 26 | , prompt :: Maybe Text 27 | , temperature :: Maybe Double 28 | } deriving stock (Generic, Show) 29 | deriving anyclass (FromJSON, ToJSON) 30 | 31 | instance ToMultipart Tmp CreateTranscription where 32 | toMultipart CreateTranscription{ model = Model model, ..} = 33 | MultipartData{..} 34 | where 35 | inputs = 36 | input "model" model 37 | <> foldMap (input "language") language 38 | <> foldMap (input "prompt") prompt 39 | <> input "response_format" "verbose_json" 40 | <> foldMap (input "temperature" . 
renderRealFloat) temperature 41 | <> input "timestamp_granularities[]" "segment" 42 | 43 | files = [ FileData{..} ] 44 | where 45 | fdInputName = "file" 46 | fdFileName = Text.pack file 47 | fdFileCType = "audio/" <> getExtension file 48 | fdPayload = file 49 | 50 | -- | Default `CreateTranscription` 51 | _CreateTranscription :: CreateTranscription 52 | _CreateTranscription = CreateTranscription 53 | { language = Nothing 54 | , prompt = Nothing 55 | , temperature = Nothing 56 | } 57 | 58 | -- | Segment of the transcribed text and its corresponding details 59 | data Segment = Segment 60 | { id :: Integer 61 | , seek :: Integer 62 | , start :: Double 63 | , end :: Double 64 | , text :: Text 65 | , tokens :: Vector Prelude.Word 66 | , temperature :: Double 67 | , avg_logprob :: Double 68 | , compression_ratio :: Double 69 | , no_speech_prob :: Double 70 | } deriving stock (Generic, Show) 71 | deriving anyclass (FromJSON, ToJSON) 72 | 73 | -- | Represents a verbose json transcription response returned by model, based 74 | -- on the provided input. 75 | data TranscriptionObject = TranscriptionObject 76 | { language :: Maybe Text 77 | , duration :: Maybe Double 78 | , text :: Text 79 | , segments :: Vector Segment 80 | } deriving stock (Generic, Show) 81 | deriving anyclass (FromJSON, ToJSON) 82 | 83 | -- | Servant API 84 | type API = 85 | "transcriptions" 86 | :> MultipartForm Tmp CreateTranscription 87 | :> Post '[JSON] TranscriptionObject 88 | -------------------------------------------------------------------------------- /src/OpenAI/V1/Audio/Translations.hs: -------------------------------------------------------------------------------- 1 | -- | @\/v1\/audio\/translations@ 2 | -- 3 | -- To simplify things, this only supports the @verbose_json@ response format 4 | module OpenAI.V1.Audio.Translations 5 | ( -- * Main types 6 | CreateTranslation(..) 7 | , _CreateTranslation 8 | , TranslationObject(..) 9 | -- * Servant 10 | , API 11 | ) where 12 | 13 | import OpenAI.Prelude as OpenAI.Prelude 14 | import OpenAI.V1.Models (Model(..)) 15 | 16 | import qualified Data.Text as Text 17 | 18 | -- | Request body for @\/v1\/audio\/translations@ 19 | data CreateTranslation = CreateTranslation 20 | { file :: FilePath 21 | , model :: Model 22 | , prompt :: Maybe Text 23 | , temperature :: Maybe Double 24 | } deriving stock (Generic, Show) 25 | deriving anyclass (FromJSON, ToJSON) 26 | 27 | instance ToMultipart Tmp CreateTranslation where 28 | toMultipart CreateTranslation{ model = Model model, ..} = MultipartData{..} 29 | where 30 | inputs = 31 | input "model" model 32 | <> foldMap (input "prompt") prompt 33 | <> input "response_format" "verbose_json" 34 | <> foldMap (input "temperature" . renderRealFloat) temperature 35 | 36 | files = [ FileData{..} ] 37 | where 38 | fdInputName = "file" 39 | fdFileName = Text.pack file 40 | fdFileCType = "audio/" <> getExtension file 41 | fdPayload = file 42 | 43 | -- | Default `CreateTranslation` 44 | _CreateTranslation :: CreateTranslation 45 | _CreateTranslation = CreateTranslation 46 | { prompt = Nothing 47 | , temperature = Nothing 48 | } 49 | 50 | -- | Represents a transcription response returned by model, based on the 51 | -- provided input. 
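--
-- For example, here is a minimal sketch of translating an audio file to
-- English and binding the resulting text, assuming a @createTranslation@
-- method has already been extracted from the `Methods` record in "OpenAI.V1"
-- (the method name and the model are illustrative):
--
-- > TranslationObject{ text } <- createTranslation _CreateTranslation
-- >     { file = "tasty/data/v1/audio/preamble.wav"
-- >     , model = "whisper-1"
-- >     }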
52 | data TranslationObject = TranslationObject 53 | { text :: Text 54 | } deriving stock (Generic, Show) 55 | deriving anyclass (FromJSON, ToJSON) 56 | 57 | -- | Servant API 58 | type API = 59 | "translations" 60 | :> MultipartForm Tmp CreateTranslation 61 | :> Post '[JSON] TranslationObject 62 | -------------------------------------------------------------------------------- /src/OpenAI/V1/AutoOr.hs: -------------------------------------------------------------------------------- 1 | -- | The `AutoOr` type constructor 2 | module OpenAI.V1.AutoOr 3 | ( -- * Types 4 | AutoOr(..) 5 | ) where 6 | 7 | import OpenAI.Prelude 8 | 9 | -- | A type that can also be the string @\"auto\"@ 10 | data AutoOr a = Auto | Specific a 11 | deriving stock (Generic, Show) 12 | 13 | instance FromJSON a => FromJSON (AutoOr a) where 14 | parseJSON "auto" = pure Auto 15 | parseJSON value = fmap Specific (parseJSON value) 16 | 17 | instance ToJSON a => ToJSON (AutoOr a) where 18 | toJSON Auto = "auto" 19 | toJSON (Specific a) = toJSON a 20 | 21 | instance IsString a => IsString (AutoOr a) where 22 | fromString string = Specific (fromString string) 23 | -------------------------------------------------------------------------------- /src/OpenAI/V1/Batches.hs: -------------------------------------------------------------------------------- 1 | -- | @\/v1\/batches@ 2 | module OpenAI.V1.Batches 3 | ( -- * Main types 4 | BatchID(..) 5 | , CreateBatch(..) 6 | , _CreateBatch 7 | , BatchObject(..) 8 | -- * Other types 9 | , Status(..) 10 | , Counts(..) 11 | -- * Servant 12 | , API 13 | ) where 14 | 15 | import OpenAI.Prelude 16 | import OpenAI.V1.Error 17 | import OpenAI.V1.Files (FileID) 18 | import OpenAI.V1.ListOf 19 | 20 | -- | Batch ID 21 | newtype BatchID = BatchID{ text :: Text } 22 | deriving newtype (FromJSON, IsString, Show, ToHttpApiData, ToJSON) 23 | 24 | -- | Request body for @\/v1\/batches@ 25 | data CreateBatch = CreateBatch 26 | { input_file_id :: FileID 27 | , endpoint :: Text 28 | , completion_window :: Text 29 | , metadata :: Maybe (Map Text Text) 30 | } deriving stock (Generic, Show) 31 | deriving anyclass (FromJSON, ToJSON) 32 | 33 | -- | Default `CreateBatch` 34 | _CreateBatch :: CreateBatch 35 | _CreateBatch = CreateBatch 36 | { metadata = Nothing 37 | } 38 | 39 | -- | The current status of the batch. 40 | data Status 41 | = Validating 42 | | Failed 43 | | In_Progress 44 | | Finalizing 45 | | Completed 46 | | Expired 47 | | Cancelling 48 | | Cancelled 49 | deriving stock (Generic, Show) 50 | 51 | instance FromJSON Status where 52 | parseJSON = genericParseJSON aesonOptions 53 | 54 | instance ToJSON Status where 55 | toJSON = genericToJSON aesonOptions 56 | 57 | -- | The request counts for different statuses within the batch. 
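--
-- For example, here is a minimal sketch of reporting progress from a
-- retrieved `BatchObject`, assuming a @retrieveBatch@ method has already been
-- extracted from the `Methods` record in "OpenAI.V1" and that @batchID@ is
-- the `BatchID` of a previously created batch (both names are illustrative):
--
-- > BatchObject{ status, request_counts } <- retrieveBatch batchID
-- >
-- > case request_counts of
-- >     Just Counts{ completed, failed, total } ->
-- >         putStrLn
-- >             (   show completed <> "/" <> show total <> " completed, "
-- >             <>  show failed <> " failed, status: " <> show status
-- >             )
-- >     Nothing ->
-- >         putStrLn ("No request counts yet, status: " <> show status)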
58 | data Counts = Counts 59 | { total :: Natural 60 | , completed :: Natural 61 | , failed :: Natural 62 | } deriving stock (Generic, Show) 63 | deriving anyclass (FromJSON, ToJSON) 64 | 65 | -- | The batch object 66 | data BatchObject = BatchObject 67 | { id :: BatchID 68 | , object :: Text 69 | , endpoint :: Text 70 | , errors :: Maybe (ListOf Error) 71 | , input_file_id :: FileID 72 | , completion_window :: Text 73 | , status :: Status 74 | , output_file_id :: Maybe FileID 75 | , error_file_id :: Maybe FileID 76 | , created_at :: POSIXTime 77 | , in_progress_at :: Maybe POSIXTime 78 | , expires_at :: Maybe POSIXTime 79 | , finalizing_at :: Maybe POSIXTime 80 | , completed_at :: Maybe POSIXTime 81 | , failed_at :: Maybe POSIXTime 82 | , expired_at :: Maybe POSIXTime 83 | , cancelling_at :: Maybe POSIXTime 84 | , cancelled_at :: Maybe POSIXTime 85 | , request_counts :: Maybe Counts 86 | , metadata :: Maybe (Map Text Text) 87 | } deriving stock (Generic, Show) 88 | 89 | instance FromJSON BatchObject where 90 | parseJSON = genericParseJSON aesonOptions 91 | 92 | instance ToJSON BatchObject where 93 | toJSON = genericToJSON aesonOptions 94 | 95 | -- | Servant API 96 | type API = 97 | "batches" 98 | :> ( ReqBody '[JSON] CreateBatch 99 | :> Post '[JSON] BatchObject 100 | :<|> Capture "batch_id" BatchID 101 | :> Get '[JSON] BatchObject 102 | :<|> Capture "batch_id" BatchID 103 | :> "cancel" 104 | :> Post '[JSON] BatchObject 105 | :<|> QueryParam "after" Text 106 | :> QueryParam "limit" Natural 107 | :> Get '[JSON] (ListOf BatchObject) 108 | ) 109 | -------------------------------------------------------------------------------- /src/OpenAI/V1/Chat/Completions.hs: -------------------------------------------------------------------------------- 1 | -- | @\/v1\/chat\/completions@ 2 | -- 3 | -- Streaming results are not yet supported 4 | module OpenAI.V1.Chat.Completions 5 | ( -- * Main types 6 | CreateChatCompletion(..) 7 | , _CreateChatCompletion 8 | , ChatCompletionObject(..) 9 | , Choice(..) 10 | , Message(..) 11 | , messageToContent 12 | , Content(..) 13 | -- * Other types 14 | , InputAudio(..) 15 | , ImageURL(..) 16 | , AudioData(..) 17 | , Modality(..) 18 | , Prediction(..) 19 | , Voice(..) 20 | , AudioFormat(..) 21 | , AudioParameters(..) 22 | , ResponseFormat(..) 23 | , ServiceTier 24 | , ReasoningEffort(..) 25 | , SearchContextSize(..) 26 | , UserLocation(..) 27 | , WebSearchOptions(..) 28 | , FinishReason(..) 29 | , Token(..) 30 | , LogProbs(..) 
31 | -- * Servant 32 | , API 33 | ) where 34 | 35 | import OpenAI.Prelude 36 | import OpenAI.V1.AutoOr 37 | import OpenAI.V1.Models (Model) 38 | import OpenAI.V1.ResponseFormat 39 | import OpenAI.V1.Tool 40 | import OpenAI.V1.ToolCall 41 | import OpenAI.V1.Usage 42 | import Prelude hiding (id) 43 | 44 | -- | Audio content part 45 | data InputAudio = InputAudio{ data_ :: Text, format :: AudioFormat } 46 | deriving stock (Generic, Show) 47 | 48 | instance FromJSON InputAudio where 49 | parseJSON = genericParseJSON aesonOptions 50 | 51 | instance ToJSON InputAudio where 52 | toJSON = genericToJSON aesonOptions 53 | 54 | -- | Image content part 55 | data ImageURL = ImageURL{ url :: Text, detail :: Maybe (AutoOr Text) } 56 | deriving stock (Generic, Show) 57 | deriving anyclass (FromJSON, ToJSON) 58 | 59 | -- | A content part 60 | data Content 61 | = Text{ text :: Text } 62 | | Image_URL{ image_url :: ImageURL } 63 | | Input_Audio{ input_audio :: InputAudio } 64 | deriving (Generic, Show) 65 | 66 | instance IsString Content where 67 | fromString string = Text{ text = fromString string } 68 | 69 | contentOptions :: Options 70 | contentOptions = 71 | aesonOptions 72 | { sumEncoding = 73 | TaggedObject{ tagFieldName = "type", contentsFieldName = "" } 74 | 75 | , tagSingleConstructors = True 76 | } 77 | 78 | instance FromJSON Content where 79 | parseJSON = genericParseJSON contentOptions 80 | 81 | instance ToJSON Content where 82 | toJSON = genericToJSON contentOptions 83 | 84 | -- | Data about a previous audio response from the model. 85 | -- [Learn more](https://platform.openai.com/docs/guides/audio) 86 | data AudioData = AudioData{ id :: Text } 87 | deriving stock (Generic, Show) 88 | deriving anyclass (FromJSON, ToJSON) 89 | 90 | -- | A message from the conversation so far 91 | data Message content 92 | = System 93 | { content :: content 94 | , name :: Maybe Text 95 | } 96 | | User 97 | { content :: content 98 | , name :: Maybe Text 99 | } 100 | | Assistant 101 | { assistant_content :: Maybe content 102 | , refusal :: Maybe Text 103 | , name :: Maybe Text 104 | , assistant_audio :: Maybe AudioData 105 | , tool_calls :: Maybe (Vector ToolCall) 106 | } 107 | | Tool 108 | { content :: content 109 | , tool_call_id :: Text 110 | } 111 | deriving stock (Generic, Show) 112 | 113 | messageOptions :: Options 114 | messageOptions = aesonOptions 115 | { sumEncoding = 116 | TaggedObject{ tagFieldName = "role", contentsFieldName = "" } 117 | 118 | , tagSingleConstructors = True 119 | 120 | , fieldLabelModifier = stripPrefix "assistant_" 121 | } 122 | 123 | instance FromJSON content => FromJSON (Message content) where 124 | parseJSON = genericParseJSON messageOptions 125 | 126 | instance ToJSON content => ToJSON (Message content) where 127 | toJSON = genericToJSON messageOptions 128 | 129 | -- | Extract the message body from a `Message` 130 | -- 131 | -- Normally this would just be the @content@ field selector, but the problem 132 | -- is that the content field for the `Assistant` constructor is not required 133 | -- to be present, so we provide a utility function to default to extract the 134 | -- @content@ field for all constructors, defaulting to `mempty` for the special 135 | -- case where the `Message` is an `Assistant` constructor with a missing 136 | -- @content@ field 137 | messageToContent :: Monoid content => Message content -> content 138 | messageToContent System{ content } = content 139 | messageToContent User{ content } = content 140 | messageToContent Assistant{ assistant_content = Just 
content } = content 141 | messageToContent Assistant{ assistant_content = Nothing } = mempty 142 | messageToContent Tool{ content } = content 143 | 144 | -- | Output types that you would like the model to generate for this request 145 | data Modality = Modality_Text | Modality_Audio 146 | deriving stock (Generic, Show) 147 | 148 | modalityOptions :: Options 149 | modalityOptions = 150 | aesonOptions 151 | { constructorTagModifier = stripPrefix "Modality_" } 152 | 153 | instance FromJSON Modality where 154 | parseJSON = genericParseJSON modalityOptions 155 | 156 | instance ToJSON Modality where 157 | toJSON = genericToJSON modalityOptions 158 | 159 | -- | Configuration for a 160 | -- [Predicted Output](https://platform.openai.com/docs/guides/predicted-outputs), 161 | -- which can greatly improve response times when large parts of the model 162 | -- response are known ahead of time. This is most common when you are 163 | -- regenerating a file with only minor changes to most of the content 164 | data Prediction = Content{ content :: Text } 165 | deriving stock (Generic, Show) 166 | 167 | predictionOptions :: Options 168 | predictionOptions = 169 | aesonOptions 170 | { sumEncoding = 171 | TaggedObject{ tagFieldName = "type", contentsFieldName = "" } 172 | 173 | , tagSingleConstructors = True 174 | } 175 | 176 | instance FromJSON Prediction where 177 | parseJSON = genericParseJSON predictionOptions 178 | 179 | instance ToJSON Prediction where 180 | toJSON = genericToJSON predictionOptions 181 | 182 | -- | The voice the model uses to respond 183 | data Voice = Ash | Ballad | Coral | Sage | Verse 184 | deriving stock (Generic, Show) 185 | 186 | instance FromJSON Voice where 187 | parseJSON = genericParseJSON aesonOptions 188 | 189 | instance ToJSON Voice where 190 | toJSON = genericToJSON aesonOptions 191 | 192 | -- | Specifies the output audio format 193 | data AudioFormat = WAV | MP3 | FLAC | Opus | PCM16 194 | deriving stock (Generic, Show) 195 | 196 | instance FromJSON AudioFormat where 197 | parseJSON = genericParseJSON aesonOptions 198 | 199 | instance ToJSON AudioFormat where 200 | toJSON = genericToJSON aesonOptions 201 | 202 | -- | Parameters for audio output 203 | data AudioParameters = AudioParameters 204 | { voice :: Voice 205 | , format :: AudioFormat 206 | } deriving stock (Generic, Show) 207 | deriving anyclass (FromJSON, ToJSON) 208 | 209 | -- | Specifies the latency tier to use for processing the request 210 | type ServiceTier = Text 211 | 212 | -- | Constrains effort on reasoning for reasoning models 213 | data ReasoningEffort 214 | = ReasoningEffort_Minimal 215 | | ReasoningEffort_Low 216 | | ReasoningEffort_Medium 217 | | ReasoningEffort_High 218 | deriving stock (Generic, Show) 219 | 220 | reasoningEffortOptions :: Options 221 | reasoningEffortOptions = 222 | aesonOptions 223 | { constructorTagModifier = stripPrefix "ReasoningEffort_" } 224 | 225 | instance FromJSON ReasoningEffort where 226 | parseJSON = genericParseJSON reasoningEffortOptions 227 | 228 | instance ToJSON ReasoningEffort where 229 | toJSON = genericToJSON reasoningEffortOptions 230 | 231 | -- | High level guidance for the amount of context window space to use for the 232 | -- search 233 | data SearchContextSize 234 | = SearchContextSize_Low 235 | | SearchContextSize_Medium 236 | | SearchContextSize_High 237 | deriving stock (Generic, Show) 238 | 239 | instance FromJSON SearchContextSize where 240 | parseJSON = genericParseJSON searchContextSizeOptions 241 | 242 | instance ToJSON SearchContextSize where 243 | 
toJSON = genericToJSON searchContextSizeOptions 244 | 245 | -- | Approximate location parameters for the search 246 | data UserLocation = Approximate 247 | { city :: Maybe Text 248 | , country :: Maybe Text 249 | , region :: Maybe Text 250 | , timezone :: Maybe Text 251 | } deriving stock (Generic, Show) 252 | 253 | instance FromJSON UserLocation where 254 | parseJSON = genericParseJSON userLocationOptions 255 | 256 | instance ToJSON UserLocation where 257 | toJSON = genericToJSON userLocationOptions 258 | 259 | userLocationOptions :: Options 260 | userLocationOptions = 261 | aesonOptions 262 | { sumEncoding = 263 | TaggedObject{ tagFieldName = "type", contentsFieldName = "" } 264 | 265 | , tagSingleConstructors = True 266 | } 267 | 268 | searchContextSizeOptions :: Options 269 | searchContextSizeOptions = 270 | aesonOptions 271 | { constructorTagModifier = stripPrefix "SearchContextSize_" } 272 | 273 | -- | Search the web for relevant results to use in a response 274 | data WebSearchOptions = WebSearchOptions 275 | { search_context_size :: Maybe SearchContextSize 276 | , user_location :: Maybe UserLocation 277 | } deriving stock (Generic, Show) 278 | deriving anyclass (FromJSON, ToJSON) 279 | 280 | -- | Request body for @\/v1\/chat\/completions@ 281 | data CreateChatCompletion = CreateChatCompletion 282 | { messages :: Vector (Message (Vector Content)) 283 | , model :: Model 284 | , store :: Maybe Bool 285 | , metadata :: Maybe (Map Text Text) 286 | , frequency_penalty :: Maybe Double 287 | , logit_bias :: Maybe (Map Word Int) 288 | , logprobs :: Maybe Bool 289 | , top_logprobs :: Maybe Word 290 | , max_completion_tokens :: Maybe Natural 291 | , n :: Maybe Natural 292 | , modalities :: Maybe (Vector Modality) 293 | , prediction :: Maybe Prediction 294 | , audio :: Maybe AudioParameters 295 | , presence_penalty :: Maybe Double 296 | , reasoning_effort :: Maybe ReasoningEffort 297 | , response_format :: Maybe ResponseFormat 298 | , seed :: Maybe Integer 299 | , service_tier :: Maybe (AutoOr ServiceTier) 300 | , stop :: Maybe (Vector Text) 301 | , temperature :: Maybe Double 302 | , top_p :: Maybe Double 303 | , tools :: Maybe (Vector Tool) 304 | , tool_choice :: Maybe ToolChoice 305 | , parallel_tool_calls :: Maybe Bool 306 | , user :: Maybe Text 307 | , web_search_options :: Maybe WebSearchOptions 308 | } deriving stock (Generic, Show) 309 | 310 | instance FromJSON CreateChatCompletion where 311 | parseJSON = genericParseJSON aesonOptions 312 | 313 | instance ToJSON CreateChatCompletion where 314 | toJSON = genericToJSON aesonOptions 315 | 316 | -- | Default `CreateChatCompletion` 317 | _CreateChatCompletion :: CreateChatCompletion 318 | _CreateChatCompletion = CreateChatCompletion 319 | { store = Nothing 320 | , metadata = Nothing 321 | , frequency_penalty = Nothing 322 | , logit_bias = Nothing 323 | , logprobs = Nothing 324 | , top_logprobs = Nothing 325 | , max_completion_tokens = Nothing 326 | , n = Nothing 327 | , modalities = Nothing 328 | , prediction = Nothing 329 | , audio = Nothing 330 | , presence_penalty = Nothing 331 | , reasoning_effort = Nothing 332 | , response_format = Nothing 333 | , seed = Nothing 334 | , service_tier = Nothing 335 | , stop = Nothing 336 | , temperature = Nothing 337 | , top_p = Nothing 338 | , tools = Nothing 339 | , tool_choice = Nothing 340 | , parallel_tool_calls = Nothing 341 | , user = Nothing 342 | , web_search_options = Nothing 343 | } 344 | 345 | -- | The reason the model stopped generating tokens 346 | data FinishReason 347 | = Stop 348 | | 
Length 349 | | Content_Filter 350 | | Tool_Calls 351 | deriving stock (Generic, Show) 352 | 353 | instance FromJSON FinishReason where 354 | parseJSON = genericParseJSON aesonOptions 355 | 356 | instance ToJSON FinishReason where 357 | toJSON = genericToJSON aesonOptions 358 | 359 | -- | Message tokens with log probability information 360 | data Token = Token 361 | { token :: Text 362 | , logprob :: Double 363 | , bytes :: Maybe (Vector Word8) 364 | , top_logprobs :: Maybe (Vector Token) 365 | } deriving stock (Generic, Show) 366 | deriving anyclass (FromJSON, ToJSON) 367 | 368 | -- | Log probability information for the choice 369 | data LogProbs = LogProbs 370 | { content :: Maybe (Vector Token) 371 | , refusal :: Maybe (Vector Token) 372 | } deriving stock (Generic, Show) 373 | deriving anyclass (FromJSON, ToJSON) 374 | 375 | -- | A chat completion choice 376 | data Choice = Choice 377 | { finish_reason :: Text 378 | , index :: Natural 379 | , message :: Message Text 380 | , logprobs :: Maybe LogProbs 381 | } deriving stock (Generic, Show) 382 | deriving anyclass (FromJSON, ToJSON) 383 | 384 | -- | ChatCompletion body 385 | data ChatCompletionObject = ChatCompletionObject 386 | { id :: Text 387 | , choices :: Vector Choice 388 | , created :: POSIXTime 389 | , model :: Model 390 | , reasoning_effort :: Maybe ReasoningEffort 391 | , service_tier :: Maybe ServiceTier 392 | , system_fingerprint :: Maybe Text 393 | , object :: Text 394 | , usage :: Usage CompletionTokensDetails PromptTokensDetails 395 | } deriving stock (Generic, Show) 396 | deriving anyclass (FromJSON, ToJSON) 397 | 398 | -- | Servant API 399 | type API = 400 | "chat" 401 | :> "completions" 402 | :> ReqBody '[JSON] CreateChatCompletion 403 | :> Post '[JSON] ChatCompletionObject 404 | -------------------------------------------------------------------------------- /src/OpenAI/V1/ChunkingStrategy.hs: -------------------------------------------------------------------------------- 1 | -- | The `ChunkingStrategy` type 2 | module OpenAI.V1.ChunkingStrategy 3 | ( -- * Main types 4 | ChunkingStrategy(..) 5 | -- * Other types 6 | , Static(..) 7 | ) where 8 | 9 | import OpenAI.Prelude 10 | 11 | -- | Static chunking strategy 12 | data Static = Static 13 | { max_chunk_size_tokens :: Natural 14 | , chunk_overlap_tokens :: Natural 15 | } deriving stock (Generic, Show) 16 | deriving anyclass (FromJSON, ToJSON) 17 | 18 | -- | The chunking strategy used to chunk the file(s) 19 | data ChunkingStrategy = ChunkingStrategy_Static{ static :: Static } 20 | deriving stock (Generic, Show) 21 | 22 | chunkingStrategyOptions :: Options 23 | chunkingStrategyOptions = aesonOptions 24 | { sumEncoding = 25 | TaggedObject{ tagFieldName = "type", contentsFieldName = "" } 26 | 27 | , tagSingleConstructors = True 28 | 29 | , constructorTagModifier = stripPrefix "ChunkingStrategy_" 30 | } 31 | 32 | instance FromJSON ChunkingStrategy where 33 | parseJSON = genericParseJSON chunkingStrategyOptions 34 | 35 | instance ToJSON ChunkingStrategy where 36 | toJSON = genericToJSON chunkingStrategyOptions 37 | -------------------------------------------------------------------------------- /src/OpenAI/V1/DeletionStatus.hs: -------------------------------------------------------------------------------- 1 | -- | The `DeletionStatus` type 2 | module OpenAI.V1.DeletionStatus 3 | ( -- * Types 4 | DeletionStatus(..) 
5 | ) where 6 | 7 | import OpenAI.Prelude 8 | 9 | -- | Deletion status 10 | data DeletionStatus = DeletionStatus 11 | { id :: Text 12 | , object :: Text 13 | , deleted :: Bool 14 | } deriving stock (Generic, Show) 15 | deriving anyclass (FromJSON, ToJSON) 16 | -------------------------------------------------------------------------------- /src/OpenAI/V1/Embeddings.hs: -------------------------------------------------------------------------------- 1 | -- | @\/v1\/embeddings@ 2 | module OpenAI.V1.Embeddings 3 | ( -- * Main types 4 | CreateEmbeddings(..) 5 | , _CreateEmbeddings 6 | , EmbeddingObject(..) 7 | -- * Other types 8 | , EncodingFormat(..) 9 | -- * Servant 10 | , API 11 | ) where 12 | 13 | import OpenAI.Prelude 14 | import OpenAI.V1.ListOf 15 | import OpenAI.V1.Models (Model) 16 | 17 | -- | The format to return the embeddings in 18 | data EncodingFormat = Float | Base64 19 | deriving stock (Generic, Show) 20 | 21 | instance FromJSON EncodingFormat where 22 | parseJSON = genericParseJSON aesonOptions 23 | 24 | instance ToJSON EncodingFormat where 25 | toJSON = genericToJSON aesonOptions 26 | 27 | -- | Request body for @\/v1\/embeddings@ 28 | data CreateEmbeddings = CreateEmbeddings 29 | { input :: Text 30 | , model :: Model 31 | , encoding_format :: Maybe EncodingFormat 32 | , dimensions :: Maybe Natural 33 | , user :: Maybe Text 34 | } deriving stock (Generic, Show) 35 | deriving anyclass (FromJSON, ToJSON) 36 | 37 | -- | Default `CreateEmbeddings` 38 | _CreateEmbeddings :: CreateEmbeddings 39 | _CreateEmbeddings = CreateEmbeddings 40 | { encoding_format = Nothing 41 | , dimensions = Nothing 42 | , user = Nothing 43 | } 44 | 45 | -- | The embedding object 46 | data EmbeddingObject = EmbbeddingObject 47 | { index :: Natural 48 | , embedding :: Vector Double 49 | , object :: Text 50 | } deriving stock (Generic, Show) 51 | deriving anyclass (FromJSON, ToJSON) 52 | 53 | -- | Servant API 54 | type API = 55 | "embeddings" 56 | :> ReqBody '[JSON] CreateEmbeddings 57 | :> Post '[JSON] (ListOf EmbeddingObject) 58 | -------------------------------------------------------------------------------- /src/OpenAI/V1/Error.hs: -------------------------------------------------------------------------------- 1 | -- | Error information 2 | module OpenAI.V1.Error 3 | ( -- * Types 4 | Error(..) 5 | ) where 6 | 7 | import OpenAI.Prelude 8 | 9 | -- | More information on the cause of the failure. 10 | -- 11 | -- NOTE: OpenAPI API's says that the `code` and `message` fields are required, 12 | -- but in practice the `Error` record can be present with all fields omitted, 13 | -- so they are all marked optional (`Maybe`) here 14 | data Error = Error 15 | { code :: Maybe Text 16 | , message :: Maybe Text 17 | , param :: Maybe Text 18 | , line :: Maybe Natural 19 | } deriving stock (Generic, Show) 20 | deriving anyclass (FromJSON, ToJSON) 21 | -------------------------------------------------------------------------------- /src/OpenAI/V1/Files.hs: -------------------------------------------------------------------------------- 1 | -- | @\/v1\/files@ 2 | module OpenAI.V1.Files 3 | ( -- * Main types 4 | FileID(..) 5 | , UploadFile(..) 6 | , _UploadFile 7 | , FileObject(..) 8 | -- * Other types 9 | , Order(..) 10 | , Purpose(..) 11 | , DeletionStatus(..) 
12 | -- * Servant 13 | , API 14 | ) where 15 | 16 | import OpenAI.Prelude 17 | import OpenAI.V1.DeletionStatus 18 | import OpenAI.V1.ListOf 19 | import OpenAI.V1.Order 20 | 21 | import qualified Data.Text as Text 22 | 23 | -- | File ID 24 | newtype FileID = FileID{ text :: Text } 25 | deriving newtype (FromJSON, IsString, Show, ToHttpApiData, ToJSON) 26 | 27 | -- | UploadFile body 28 | data UploadFile = UploadFile 29 | { file :: FilePath 30 | , purpose :: Purpose 31 | } deriving stock (Generic, Show) 32 | 33 | -- | Default `UploadFile` 34 | _UploadFile :: UploadFile 35 | _UploadFile = UploadFile{ } 36 | 37 | instance ToMultipart Tmp UploadFile where 38 | toMultipart UploadFile{..} = MultipartData{..} 39 | where 40 | inputs = input "purpose" (toUrlPiece purpose) 41 | 42 | files = [ FileData{..} ] 43 | where 44 | fdInputName = "file" 45 | fdFileName = Text.pack file 46 | fdFileCType = "application/json" 47 | fdPayload = file 48 | 49 | -- | The File object represents a document that has been uploaded to OpenAI 50 | data FileObject = FileObject 51 | { id :: FileID 52 | , bytes :: Natural 53 | , created_at :: POSIXTime 54 | , filename :: Text 55 | , object :: Text 56 | , purpose :: Purpose 57 | } deriving stock (Generic, Show) 58 | deriving anyclass (FromJSON, ToJSON) 59 | 60 | -- | The intended purpose of the uploaded file. 61 | data Purpose 62 | = Assistants 63 | | Assistants_Output 64 | | Batch 65 | | Batch_Output 66 | | Fine_Tune 67 | | Fine_Tune_Results 68 | | Vision 69 | deriving stock (Generic, Show) 70 | 71 | purposeOptions :: Options 72 | purposeOptions = aesonOptions 73 | { constructorTagModifier = fix . labelModifier } 74 | where 75 | fix "fine_tune" = "fine-tune" 76 | fix "fine_tune_results" = "fine-tune-results" 77 | fix string = string 78 | 79 | instance FromJSON Purpose where 80 | parseJSON = genericParseJSON purposeOptions 81 | 82 | instance ToJSON Purpose where 83 | toJSON = genericToJSON purposeOptions 84 | 85 | instance ToHttpApiData Purpose where 86 | toUrlPiece Assistants = "assistants" 87 | toUrlPiece Assistants_Output = "assistants" 88 | toUrlPiece Batch = "batch" 89 | toUrlPiece Batch_Output = "batch_output" 90 | toUrlPiece Fine_Tune = "fine-tune" 91 | toUrlPiece Fine_Tune_Results = "fine-tune-results" 92 | toUrlPiece Vision = "vision" 93 | 94 | -- | Servant API 95 | type API = 96 | "files" 97 | :> ( MultipartForm Tmp UploadFile 98 | :> Post '[JSON] FileObject 99 | :<|> QueryParam "purpose" Purpose 100 | :> QueryParam "limit" Natural 101 | :> QueryParam "order" Order 102 | :> QueryParam "after" Text 103 | :> Get '[JSON] (ListOf FileObject) 104 | :<|> Capture "file_id" FileID 105 | :> Get '[JSON] FileObject 106 | :<|> Capture "file_id" FileID 107 | :> Delete '[JSON] DeletionStatus 108 | :<|> Capture "file_id" FileID 109 | :> "content" 110 | :> Get '[OctetStream] ByteString 111 | ) 112 | -------------------------------------------------------------------------------- /src/OpenAI/V1/FineTuning/Jobs.hs: -------------------------------------------------------------------------------- 1 | -- | @\/v1\/fine_tuning/jobs@ 2 | module OpenAI.V1.FineTuning.Jobs 3 | ( -- * Main types 4 | FineTuningJobID(..) 5 | , CreateFineTuningJob(..) 6 | , _CreateFineTuningJob 7 | , JobObject(..) 8 | , EventObject(..) 9 | , CheckpointObject(..) 10 | -- * Other types 11 | , AutoOr(..) 12 | , Hyperparameters(..) 13 | , WAndB(..) 14 | , Integration(..) 15 | , Status(..) 16 | , Level(..) 17 | , Metrics(..) 
18 | -- * Servant 19 | , API 20 | ) where 21 | 22 | import OpenAI.Prelude 23 | import OpenAI.V1.AutoOr 24 | import OpenAI.V1.Error 25 | import OpenAI.V1.Files (FileID) 26 | import OpenAI.V1.ListOf 27 | import OpenAI.V1.Models (Model) 28 | 29 | -- | Fine tuning job ID 30 | newtype FineTuningJobID = FineTuningJobID{ text :: Text } 31 | deriving newtype (FromJSON, IsString, Show, ToHttpApiData, ToJSON) 32 | 33 | -- | The hyperparameters used for the fine-tuning job 34 | data Hyperparameters = Hyperparameters 35 | { batch_size :: Maybe (AutoOr Natural) 36 | , learning_rate_multiplier :: Maybe (AutoOr Double) 37 | , n_epochs :: Maybe (AutoOr Natural) 38 | } deriving stock (Generic, Show) 39 | deriving anyclass (FromJSON, ToJSON) 40 | 41 | -- | The settings for your integration with Weights and 42 | data WAndB = WAndB 43 | { project :: Text 44 | , name :: Maybe Text 45 | , entity :: Maybe Text 46 | , tags :: Maybe (Vector Text) 47 | } deriving stock (Generic, Show) 48 | deriving anyclass (FromJSON, ToJSON) 49 | 50 | -- | An integration to enable for your fine-tuning job 51 | data Integration = Integration_WAndB{ wandb :: WAndB } 52 | deriving stock (Generic, Show) 53 | 54 | integrationOptions :: Options 55 | integrationOptions = aesonOptions 56 | { sumEncoding = 57 | TaggedObject{ tagFieldName = "type", contentsFieldName = "" } 58 | 59 | , tagSingleConstructors = True 60 | 61 | , constructorTagModifier = stripPrefix "Integration_" 62 | } 63 | 64 | instance FromJSON Integration where 65 | parseJSON = genericParseJSON integrationOptions 66 | 67 | instance ToJSON Integration where 68 | toJSON = genericToJSON integrationOptions 69 | 70 | -- | Request body for @\/v1\/fine_tuning\/jobs@ 71 | data CreateFineTuningJob = CreateFineTuningJob 72 | { model :: Model 73 | , training_file :: FileID 74 | , hyperparameters :: Maybe Hyperparameters 75 | , suffix :: Maybe Text 76 | , validation_file :: Maybe FileID 77 | , integrations :: Maybe (Vector Integration) 78 | , seed :: Maybe Integer 79 | } deriving stock (Generic, Show) 80 | deriving anyclass (FromJSON, ToJSON) 81 | 82 | -- | Default `CreateFineTuningJob` 83 | _CreateFineTuningJob :: CreateFineTuningJob 84 | _CreateFineTuningJob = CreateFineTuningJob 85 | { hyperparameters = Nothing 86 | , suffix = Nothing 87 | , validation_file = Nothing 88 | , integrations = Nothing 89 | , seed = Nothing 90 | } 91 | 92 | -- | The current status of the fine-tuning job 93 | data Status 94 | = Validating_Files 95 | | Queued 96 | | Running 97 | | Succeeded 98 | | Failed 99 | | Cancelled 100 | deriving stock (Generic, Show) 101 | 102 | instance FromJSON Status where 103 | parseJSON = genericParseJSON aesonOptions 104 | 105 | instance ToJSON Status where 106 | toJSON = genericToJSON aesonOptions 107 | 108 | -- | The fine_tuning.job object represents a fine-tuning job that has been 109 | -- created through the API. 
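--
-- For example, here is a minimal sketch of creating a fine-tuning job and
-- binding the resulting `JobObject`, assuming a @createFineTuningJob@ method
-- has already been extracted from the `Methods` record in "OpenAI.V1" and
-- that @trainingFileID@ is the `FileID` of a previously uploaded JSONL
-- training file (the method name and the model snapshot are illustrative):
--
-- > JobObject{ id = jobID, status } <- createFineTuningJob _CreateFineTuningJob
-- >     { model = "gpt-4o-mini-2024-07-18"
-- >     , training_file = trainingFileID
-- >     }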
110 | data JobObject = JobObject 111 | { id :: FineTuningJobID 112 | , created_at :: POSIXTime 113 | , error :: Maybe Error 114 | , fine_tuned_model :: Maybe Model 115 | , finished_at :: Maybe POSIXTime 116 | , hyperparameters :: Hyperparameters 117 | , model :: Model 118 | , object :: Text 119 | , organization_id :: Text 120 | , result_files :: Vector FileID 121 | , status :: Status 122 | , trained_tokens :: Maybe Natural 123 | , training_file :: FileID 124 | , validation_file :: Maybe FileID 125 | , integrations :: Maybe (Vector Integration) 126 | , seed :: Integer 127 | , estimated_finish :: Maybe POSIXTime 128 | } deriving stock (Generic, Show) 129 | deriving anyclass (FromJSON, ToJSON) 130 | 131 | -- | Log level 132 | data Level = Info | Warn | Error 133 | deriving stock (Generic, Show) 134 | 135 | instance FromJSON Level where 136 | parseJSON = genericParseJSON aesonOptions 137 | 138 | instance ToJSON Level where 139 | toJSON = genericToJSON aesonOptions 140 | 141 | -- | Fine-tuning job event object 142 | data EventObject = EventObject 143 | { id :: Text 144 | , created_at :: POSIXTime 145 | , level :: Level 146 | , message :: Text 147 | , object :: Text 148 | } deriving stock (Generic, Show) 149 | deriving anyclass (FromJSON, ToJSON) 150 | 151 | -- | Metrics at the step number during the fine-tuning job. 152 | data Metrics = Metrics 153 | { step :: Double 154 | , train_loss :: Double 155 | , train_mean_token_accuracy :: Double 156 | , valid_loss :: Double 157 | , valid_mean_token_accuracy :: Double 158 | , full_valid_loss :: Double 159 | , full_valid_mean_token_accuracy :: Double 160 | } deriving stock (Generic, Show) 161 | deriving anyclass (FromJSON, ToJSON) 162 | 163 | -- | The @fine_tuning.job.checkpoint@ object represents a model checkpoint for 164 | -- a fine-tuning job that is ready to use 165 | data CheckpointObject = CheckpointObject 166 | { id :: Text 167 | , created_at :: POSIXTime 168 | , fine_tuned_model_checkpoint :: Text 169 | , step_number :: Natural 170 | , metrics :: Metrics 171 | , fine_tuning_job_id :: FineTuningJobID 172 | , object :: Text 173 | } deriving stock (Generic, Show) 174 | deriving anyclass (FromJSON, ToJSON) 175 | 176 | -- | Servant API 177 | type API = 178 | "fine_tuning" 179 | :> "jobs" 180 | :> ( ReqBody '[JSON] CreateFineTuningJob 181 | :> Post '[JSON] JobObject 182 | :<|> QueryParam "after" Text 183 | :> QueryParam "limit" Natural 184 | :> Get '[JSON] (ListOf JobObject) 185 | :<|> Capture "fine_tuning_job_id" FineTuningJobID 186 | :> "events" 187 | :> QueryParam "after" Text 188 | :> QueryParam "limit" Natural 189 | :> Get '[JSON] (ListOf EventObject) 190 | :<|> Capture "fine_tuning_job_id" FineTuningJobID 191 | :> "checkpoints" 192 | :> QueryParam "after" Text 193 | :> QueryParam "limit" Natural 194 | :> Get '[JSON] (ListOf CheckpointObject) 195 | :<|> Capture "fine_tuning_job_id" FineTuningJobID 196 | :> Get '[JSON] JobObject 197 | :<|> Capture "fine_tuning_job_id" FineTuningJobID 198 | :> "cancel" 199 | :> Post '[JSON] JobObject 200 | ) 201 | -------------------------------------------------------------------------------- /src/OpenAI/V1/Images.hs: -------------------------------------------------------------------------------- 1 | -- | @\/v1\/images@ 2 | module OpenAI.V1.Images 3 | ( -- * Servant 4 | API 5 | ) where 6 | 7 | import OpenAI.Prelude 8 | 9 | import qualified OpenAI.V1.Images.Edits as Edits 10 | import qualified OpenAI.V1.Images.Generations as Generations 11 | import qualified OpenAI.V1.Images.Variations as Variations 12 | 13 | -- 
| Servant API 14 | type API = "images" :> (Generations.API :<|> Edits.API :<|> Variations.API) 15 | -------------------------------------------------------------------------------- /src/OpenAI/V1/Images/Edits.hs: -------------------------------------------------------------------------------- 1 | -- | @\/v1\/images\/edits@ 2 | module OpenAI.V1.Images.Edits 3 | ( -- * Main types 4 | CreateImageEdit(..) 5 | , _CreateImageEdit 6 | -- * Other types 7 | , ResponseFormat(..) 8 | -- * Servant 9 | , API 10 | ) where 11 | 12 | import OpenAI.Prelude 13 | import OpenAI.V1.Images.Image 14 | import OpenAI.V1.Images.ResponseFormat 15 | import OpenAI.V1.ListOf 16 | import OpenAI.V1.Models (Model(..)) 17 | 18 | import qualified Data.Text as Text 19 | 20 | -- | Request body for @\/v1\/images\/edits@ 21 | data CreateImageEdit = CreateImageEdit 22 | { image :: FilePath 23 | , prompt :: Text 24 | , mask :: Maybe FilePath 25 | , model :: Maybe Model 26 | , n :: Maybe Natural 27 | , size :: Maybe Text 28 | , response_format :: Maybe ResponseFormat 29 | , user :: Maybe Text 30 | } deriving stock (Generic, Show) 31 | deriving anyclass (FromJSON, ToJSON) 32 | 33 | -- | Default `CreateImageEdit` 34 | _CreateImageEdit :: CreateImageEdit 35 | _CreateImageEdit = CreateImageEdit 36 | { mask = Nothing 37 | , model = Nothing 38 | , n = Nothing 39 | , size = Nothing 40 | , response_format = Nothing 41 | , user = Nothing 42 | } 43 | 44 | instance ToMultipart Tmp CreateImageEdit where 45 | toMultipart CreateImageEdit{ ..} = MultipartData{..} 46 | where 47 | inputs = 48 | input "prompt" prompt 49 | <> foldMap (input "model" . text) model 50 | <> foldMap (input "n" . renderIntegral) n 51 | <> foldMap (input "size") size 52 | <> foldMap (input "response_format" . toUrlPiece) response_format 53 | <> foldMap (input "user") user 54 | 55 | files = file0 : files1 56 | where 57 | file0 = FileData{..} 58 | where 59 | fdInputName = "image" 60 | fdFileName = Text.pack image 61 | fdFileCType = "image/" <> getExtension image 62 | fdPayload = image 63 | 64 | files1 = case mask of 65 | Nothing -> [ ] 66 | Just m -> [ FileData{..} ] 67 | where 68 | fdInputName = "mask" 69 | fdFileName = Text.pack m 70 | fdFileCType = "image/" <> getExtension m 71 | fdPayload = m 72 | 73 | -- | Servant API 74 | type API = 75 | "edits" 76 | :> MultipartForm Tmp CreateImageEdit 77 | :> Post '[JSON] (ListOf ImageObject) 78 | -------------------------------------------------------------------------------- /src/OpenAI/V1/Images/Generations.hs: -------------------------------------------------------------------------------- 1 | -- | @\/v1\/images\/generations@ 2 | module OpenAI.V1.Images.Generations 3 | ( -- * Main types 4 | CreateImage(..) 5 | , _CreateImage 6 | -- * Other types 7 | , Quality(..) 8 | , Style(..) 
9 | -- * Servant 10 | , API 11 | ) where 12 | 13 | import OpenAI.Prelude 14 | import OpenAI.V1.Images.Image 15 | import OpenAI.V1.Images.ResponseFormat 16 | import OpenAI.V1.ListOf 17 | import OpenAI.V1.Models (Model) 18 | 19 | -- | The quality of the image that will be generated 20 | data Quality = Standard | HD 21 | deriving stock (Generic, Show) 22 | 23 | instance FromJSON Quality where 24 | parseJSON = genericParseJSON aesonOptions 25 | 26 | instance ToJSON Quality where 27 | toJSON = genericToJSON aesonOptions 28 | 29 | -- | The style of the generated images 30 | data Style = Vivid | Natural 31 | deriving stock (Generic, Show) 32 | 33 | instance FromJSON Style where 34 | parseJSON = genericParseJSON aesonOptions 35 | 36 | instance ToJSON Style where 37 | toJSON = genericToJSON aesonOptions 38 | 39 | -- | Requesty body for @\/v1/images/generations@ 40 | data CreateImage = CreateImage 41 | { prompt :: Text 42 | , model :: Maybe Model 43 | , n :: Maybe Natural 44 | , quality :: Maybe Quality 45 | , response_format :: Maybe ResponseFormat 46 | , size :: Maybe Text 47 | , style :: Maybe Style 48 | , user :: Maybe Text 49 | } deriving stock (Generic, Show) 50 | 51 | -- | Default `CreateImage` 52 | _CreateImage :: CreateImage 53 | _CreateImage = CreateImage 54 | { model = Nothing 55 | , n = Nothing 56 | , quality = Nothing 57 | , response_format = Nothing 58 | , size = Nothing 59 | , style = Nothing 60 | , user = Nothing 61 | } 62 | 63 | instance FromJSON CreateImage where 64 | parseJSON = genericParseJSON aesonOptions 65 | 66 | instance ToJSON CreateImage where 67 | toJSON = genericToJSON aesonOptions 68 | 69 | -- | Servant API 70 | type API = 71 | "generations" 72 | :> ReqBody '[JSON] CreateImage 73 | :> Post '[JSON] (ListOf ImageObject) 74 | -------------------------------------------------------------------------------- /src/OpenAI/V1/Images/Image.hs: -------------------------------------------------------------------------------- 1 | -- | The image object 2 | module OpenAI.V1.Images.Image 3 | ( -- * Types 4 | ImageObject(..) 5 | ) where 6 | 7 | import OpenAI.Prelude 8 | 9 | -- | Represents the url or the content of an image generated by the OpenAI API 10 | data ImageObject = ImageObject 11 | { b64_json :: Maybe Text 12 | , url :: Maybe Text 13 | , revised_prompt :: Maybe Text 14 | } deriving stock (Generic, Show) 15 | deriving anyclass (FromJSON, ToJSON) 16 | -------------------------------------------------------------------------------- /src/OpenAI/V1/Images/ResponseFormat.hs: -------------------------------------------------------------------------------- 1 | -- | The response format 2 | module OpenAI.V1.Images.ResponseFormat 3 | ( -- * Types 4 | ResponseFormat(..) 5 | ) where 6 | 7 | import OpenAI.Prelude 8 | 9 | -- | The format in which the generated images are returned 10 | data ResponseFormat = URL | B64_JSON 11 | deriving stock (Generic, Show) 12 | 13 | instance FromJSON ResponseFormat where 14 | parseJSON = genericParseJSON aesonOptions 15 | 16 | instance ToJSON ResponseFormat where 17 | toJSON = genericToJSON aesonOptions 18 | 19 | instance ToHttpApiData ResponseFormat where 20 | toUrlPiece URL = "url" 21 | toUrlPiece B64_JSON = "b64_json" 22 | -------------------------------------------------------------------------------- /src/OpenAI/V1/Images/Variations.hs: -------------------------------------------------------------------------------- 1 | -- | @\/v1\/images\/variations@ 2 | module OpenAI.V1.Images.Variations 3 | ( -- * Main types 4 | CreateImageVariation(..) 
5 | , _CreateImageVariation 6 | -- * Servant 7 | , API 8 | ) where 9 | 10 | import OpenAI.Prelude 11 | import OpenAI.V1.Images.Image 12 | import OpenAI.V1.Images.ResponseFormat 13 | import OpenAI.V1.ListOf 14 | import OpenAI.V1.Models (Model(..)) 15 | 16 | import qualified Data.Text as Text 17 | 18 | -- | Request body for @\/v1\/images\/variations@ 19 | data CreateImageVariation = CreateImageVariation 20 | { image :: FilePath 21 | , model :: Maybe Model 22 | , n :: Maybe Natural 23 | , response_format :: Maybe ResponseFormat 24 | , size :: Maybe Text 25 | , user :: Maybe Text 26 | } deriving stock (Generic, Show) 27 | deriving anyclass (FromJSON, ToJSON) 28 | 29 | -- | Default `CreateImageVariation` 30 | _CreateImageVariation :: CreateImageVariation 31 | _CreateImageVariation = CreateImageVariation 32 | { model = Nothing 33 | , n = Nothing 34 | , response_format = Nothing 35 | , size = Nothing 36 | , user = Nothing 37 | } 38 | 39 | instance ToMultipart Tmp CreateImageVariation where 40 | toMultipart CreateImageVariation{..} = MultipartData{..} 41 | where 42 | inputs = 43 | foldMap (input "model" . text) model 44 | <> foldMap (input "n" . renderIntegral) n 45 | <> foldMap (input "response_format" . toUrlPiece) response_format 46 | <> foldMap (input "size") size 47 | <> foldMap (input "user") user 48 | 49 | files = [ FileData{..} ] 50 | where 51 | fdInputName = "image" 52 | fdFileName = Text.pack image 53 | fdFileCType = "image/" <> getExtension image 54 | fdPayload = image 55 | 56 | -- | Servant API 57 | type API = 58 | "variations" 59 | :> MultipartForm Tmp CreateImageVariation 60 | :> Post '[JSON] (ListOf ImageObject) 61 | -------------------------------------------------------------------------------- /src/OpenAI/V1/ListOf.hs: -------------------------------------------------------------------------------- 1 | -- | The `ListOf` type constructor 2 | module OpenAI.V1.ListOf 3 | ( -- * Types 4 | ListOf(..) 5 | ) where 6 | 7 | import OpenAI.Prelude 8 | 9 | -- | Whenever OpenAI says that an API endpoint returns a list of items, what 10 | -- they actually mean is that it returns that list wrapped inside of the 11 | -- following object 12 | data ListOf a = List{ data_ :: Vector a } 13 | deriving stock (Generic, Show) 14 | 15 | instance FromJSON a => FromJSON (ListOf a) where 16 | parseJSON = genericParseJSON aesonOptions 17 | 18 | instance ToJSON a => ToJSON (ListOf a) where 19 | toJSON = genericToJSON aesonOptions 20 | -------------------------------------------------------------------------------- /src/OpenAI/V1/Message.hs: -------------------------------------------------------------------------------- 1 | -- | The `Message` type 2 | module OpenAI.V1.Message 3 | ( -- * Main types 4 | Message(..) 5 | -- * Other types 6 | , ImageFile(..) 7 | , ImageURL(..) 8 | , Content(..) 9 | , Attachment(..) 
10 | ) where 11 | 12 | import OpenAI.Prelude 13 | import OpenAI.V1.AutoOr 14 | import OpenAI.V1.Files (FileID) 15 | import OpenAI.V1.Tool 16 | 17 | -- | References an image File in the content of a message 18 | data ImageFile = ImageFile{ file_id :: FileID, detail :: Maybe (AutoOr Text) } 19 | deriving stock (Generic, Show) 20 | deriving anyclass (FromJSON, ToJSON) 21 | 22 | -- | References an image URL in the content of a message 23 | data ImageURL = ImageURL 24 | { image_url :: Text 25 | , detail :: Maybe (AutoOr Text) 26 | } deriving stock (Generic, Show) 27 | deriving anyclass (FromJSON, ToJSON) 28 | 29 | -- | Message content 30 | data Content text 31 | = Image_File{ image_file :: ImageFile } 32 | | Image_URL{ image_url :: ImageURL } 33 | | Text{ text :: text } 34 | deriving stock (Generic, Show) 35 | 36 | contentOptions :: Options 37 | contentOptions = aesonOptions 38 | { sumEncoding = 39 | TaggedObject{ tagFieldName = "type", contentsFieldName = "" } 40 | 41 | , tagSingleConstructors = True 42 | } 43 | 44 | instance FromJSON text => FromJSON (Content text) where 45 | parseJSON = genericParseJSON contentOptions 46 | 47 | instance ToJSON text => ToJSON (Content text) where 48 | toJSON = genericToJSON contentOptions 49 | 50 | instance IsString text => IsString (Content text) where 51 | fromString string = Text{ text = fromString string } 52 | 53 | -- | A file attached to the message, and the tools it should be added to 54 | data Attachment = Attachment{ file_id :: FileID, tools :: Maybe (Vector Tool) } 55 | deriving stock (Generic, Show) 56 | deriving anyclass (FromJSON, ToJSON) 57 | 58 | -- | A message 59 | data Message 60 | = User 61 | { content :: Vector (Content Text) 62 | , attachments :: Maybe (Vector Attachment) 63 | , metadata :: Maybe (Map Text Text) 64 | } 65 | | Assistant 66 | { content :: Vector (Content Text) 67 | , attachments :: Maybe (Vector Attachment) 68 | , metadata :: Maybe (Map Text Text) 69 | } 70 | deriving stock (Generic, Show) 71 | 72 | messageOptions :: Options 73 | messageOptions = 74 | aesonOptions 75 | { sumEncoding = 76 | TaggedObject{ tagFieldName = "role", contentsFieldName = "" } 77 | , tagSingleConstructors = True 78 | } 79 | 80 | instance FromJSON Message where 81 | parseJSON = genericParseJSON messageOptions 82 | 83 | instance ToJSON Message where 84 | toJSON = genericToJSON messageOptions 85 | -------------------------------------------------------------------------------- /src/OpenAI/V1/Models.hs: -------------------------------------------------------------------------------- 1 | -- | @\/v1\/models@ 2 | module OpenAI.V1.Models 3 | ( -- * Main types 4 | Model(..) 5 | , ModelObject(..) 
6 | -- * Servant 7 | , API 8 | ) where 9 | 10 | import OpenAI.Prelude 11 | import OpenAI.V1.DeletionStatus 12 | import OpenAI.V1.ListOf 13 | 14 | -- | Model 15 | newtype Model = Model{ text :: Text } 16 | deriving newtype (FromJSON, IsString, Show, ToHttpApiData, ToJSON) 17 | 18 | -- | Describes an OpenAI model offering that can be used with the API 19 | data ModelObject = ModelObject 20 | { id :: Model 21 | , created :: POSIXTime 22 | , object :: Text 23 | , owned_by :: Text 24 | } deriving stock (Generic, Show) 25 | deriving anyclass (FromJSON) 26 | 27 | -- | Servant API 28 | type API = 29 | "models" 30 | :> ( Get '[JSON] (ListOf ModelObject) 31 | :<|> Capture "model" Model 32 | :> Get '[JSON] ModelObject 33 | :<|> Capture "model" Model 34 | :> Delete '[JSON] DeletionStatus 35 | ) 36 | -------------------------------------------------------------------------------- /src/OpenAI/V1/Moderations.hs: -------------------------------------------------------------------------------- 1 | -- | @\/v1\/moderations@ 2 | module OpenAI.V1.Moderations 3 | ( -- * Main types 4 | CreateModeration(..) 5 | , _CreateModeration 6 | , Moderation(..) 7 | -- * Other types 8 | , InputType(..) 9 | , Result(..) 10 | -- * Servant 11 | , API 12 | ) where 13 | 14 | import OpenAI.Prelude 15 | import OpenAI.V1.Models (Model) 16 | 17 | -- | Request body for @\/v1\/moderations@ 18 | data CreateModeration = CreateModeration 19 | { input :: Text 20 | , model :: Maybe Model 21 | } deriving stock (Generic, Show) 22 | deriving anyclass (FromJSON, ToJSON) 23 | 24 | -- | Default `CreateModeration` 25 | _CreateModeration :: CreateModeration 26 | _CreateModeration = CreateModeration 27 | { model = Nothing 28 | } 29 | 30 | -- | The input type that the score applies to 31 | data InputType = Text | Image 32 | deriving stock (Generic, Show) 33 | 34 | instance FromJSON InputType where 35 | parseJSON = genericParseJSON aesonOptions 36 | 37 | instance ToJSON InputType where 38 | toJSON = genericToJSON aesonOptions 39 | 40 | -- | A moderation result 41 | data Result = Result 42 | { flagged :: Bool 43 | , categories :: Map Text Bool 44 | , category_scores :: Map Text Double 45 | , category_applied_input_types :: Maybe (Map Text InputType) 46 | -- According to the OpenAPI spec the `category_applied_input_types` 47 | -- field is required but their actual implementation omits this field. 48 | } deriving stock (Generic, Show) 49 | deriving anyclass (FromJSON, ToJSON) 50 | 51 | -- | Represents if a given text input is potentially harmful. 52 | data Moderation = Moderation 53 | { id :: Text 54 | , model :: Model 55 | , results :: Vector Result 56 | } deriving stock (Generic, Show) 57 | deriving anyclass (FromJSON, ToJSON) 58 | 59 | -- | Servant API 60 | type API = 61 | "moderations" :> ReqBody '[JSON] CreateModeration :> Post '[JSON] Moderation 62 | -------------------------------------------------------------------------------- /src/OpenAI/V1/Order.hs: -------------------------------------------------------------------------------- 1 | -- | The `Order` type 2 | module OpenAI.V1.Order 3 | ( -- * Types 4 | Order(..) 
5 | ) where 6 | 7 | import OpenAI.Prelude 8 | 9 | -- | Sort order by the @created_at@ timestamp of the objects 10 | data Order = Desc | Asc 11 | deriving stock (Generic, Show) 12 | 13 | instance FromJSON Order where 14 | parseJSON = genericParseJSON aesonOptions 15 | 16 | instance ToJSON Order where 17 | toJSON = genericToJSON aesonOptions 18 | 19 | instance ToHttpApiData Order where 20 | toUrlPiece Desc = "desc" 21 | toUrlPiece Asc = "asc" 22 | -------------------------------------------------------------------------------- /src/OpenAI/V1/ResponseFormat.hs: -------------------------------------------------------------------------------- 1 | -- | The `ResponseFormat` type 2 | module OpenAI.V1.ResponseFormat 3 | ( -- * Types 4 | ResponseFormat(..) 5 | , JSONSchema(..) 6 | ) where 7 | 8 | import OpenAI.Prelude 9 | 10 | -- | An object specifying the format that the model must output 11 | data ResponseFormat 12 | = ResponseFormat_Text 13 | | JSON_Object 14 | | JSON_Schema{ json_schema :: JSONSchema } 15 | deriving stock (Generic, Show) 16 | 17 | responseFormatOptions :: Options 18 | responseFormatOptions = aesonOptions 19 | { sumEncoding = 20 | TaggedObject{ tagFieldName = "type", contentsFieldName = "" } 21 | 22 | , tagSingleConstructors = True 23 | 24 | , constructorTagModifier = stripPrefix "ResponseFormat_" 25 | } 26 | 27 | instance FromJSON ResponseFormat where 28 | parseJSON = genericParseJSON responseFormatOptions 29 | 30 | instance ToJSON ResponseFormat where 31 | toJSON = genericToJSON responseFormatOptions 32 | 33 | -- | Setting to { "type": "json_schema", "json_schema": {...} } enables 34 | -- Structured Outputs which ensures the model will match your supplied JSON 35 | -- schema. Learn more in the 36 | -- [Structured Outputs](https://platform.openai.com/docs/guides/structured-outputs) guide 37 | data JSONSchema = JSONSchema 38 | { description :: Maybe Text 39 | , name :: Text 40 | , schema :: Maybe Value 41 | , strict :: Maybe Bool 42 | } deriving stock (Generic, Show) 43 | deriving anyclass (FromJSON, ToJSON) 44 | -------------------------------------------------------------------------------- /src/OpenAI/V1/Responses.hs: -------------------------------------------------------------------------------- 1 | -- | /v1/responses 2 | -- 3 | -- Streaming is not implemented here; this covers JSON responses only. 4 | module OpenAI.V1.Responses 5 | ( -- * Main types 6 | CreateResponse(..) 7 | , _CreateResponse 8 | , Input(..) 9 | , InputItem(..) 10 | , InputRole(..) 11 | , InputContent(..) 12 | , OutputItem(..) 13 | , OutputMessage(..) 14 | , OutputContent(..) 15 | , FunctionToolCall(..) 16 | , FunctionToolCallOutput(..) 17 | , WebSearchToolCall(..) 18 | , FileSearchToolCall(..) 19 | , FileSearchResult(..) 20 | , CodeInterpreterToolCall(..) 21 | , CodeInterpreterOutput(..) 22 | , WebSearchAction(..) 23 | , WebSearchSource(..) 24 | , Annotation(..) 25 | , ReasoningItem(..) 26 | , SummaryPart(..) 27 | , ReasoningText(..) 28 | , ResponseStreamEvent(..) 29 | , ResponseObject(..) 30 | , ResponseUsage(..) 31 | , InputTokensDetails(..) 32 | , OutputTokensDetails(..) 
33 | -- * Servant 34 | , API 35 | ) where 36 | 37 | import Data.Aeson (Object) 38 | import OpenAI.Prelude hiding (Input(..)) 39 | -- no TH; inline JSON instances for payloads 40 | import OpenAI.V1.ListOf (ListOf) 41 | import OpenAI.V1.Models (Model) 42 | import OpenAI.V1.Tool (Tool, ToolChoice) 43 | 44 | -- | Input for the Responses API: a list of input items 45 | newtype Input = Input (Vector InputItem) 46 | deriving stock (Generic, Show) 47 | 48 | instance ToJSON Input where 49 | toJSON (Input xs) = toJSON xs 50 | 51 | instance FromJSON Input where 52 | parseJSON v = Input <$> parseJSON v 53 | 54 | -- | Role of an input message 55 | data InputRole = User | System | Developer 56 | deriving stock (Generic, Show) 57 | 58 | instance FromJSON InputRole where 59 | parseJSON = genericParseJSON aesonOptions 60 | 61 | instance ToJSON InputRole where 62 | toJSON = genericToJSON aesonOptions 63 | 64 | -- | Content parts for input messages 65 | data InputContent 66 | = Input_Text { text :: Text } 67 | | Input_Image { image_url :: Maybe Text, file_id :: Maybe Text, detail :: Maybe Text } 68 | | Input_File { file_id :: Maybe Text, filename :: Maybe Text, file_url :: Maybe Text, file_data :: Maybe Text } 69 | | Input_Audio { input_audio :: Object } 70 | deriving stock (Generic, Show) 71 | 72 | inputContentOptions :: Options 73 | inputContentOptions = 74 | aesonOptions 75 | { sumEncoding = TaggedObject{ tagFieldName = "type", contentsFieldName = "" } 76 | , tagSingleConstructors = True 77 | -- Keep constructor names like "Input_Text" -> "input_text" 78 | , constructorTagModifier = labelModifier 79 | } 80 | 81 | instance FromJSON InputContent where 82 | parseJSON = genericParseJSON inputContentOptions 83 | 84 | instance ToJSON InputContent where 85 | toJSON = genericToJSON inputContentOptions 86 | 87 | -- | An input item 88 | data InputItem 89 | = Item_InputMessage 90 | { role :: InputRole 91 | , content :: Vector InputContent 92 | , status :: Maybe Text 93 | } 94 | deriving stock (Generic, Show) 95 | 96 | inputItemOptions :: Options 97 | inputItemOptions = 98 | aesonOptions 99 | { sumEncoding = TaggedObject{ tagFieldName = "type", contentsFieldName = "" } 100 | , tagSingleConstructors = True 101 | , constructorTagModifier = stripPrefix "Item_Input" 102 | } 103 | 104 | instance FromJSON InputItem where 105 | parseJSON = genericParseJSON inputItemOptions 106 | 107 | instance ToJSON InputItem where 108 | toJSON = genericToJSON inputItemOptions 109 | 110 | -- | Output content from the model 111 | data OutputContent 112 | = Output_Text 113 | { text :: Text 114 | , annotations :: Vector Value 115 | , logprobs :: Maybe (Vector Value) 116 | } 117 | | Refusal 118 | { refusal :: Text } 119 | deriving stock (Generic, Show) 120 | 121 | outputContentOptions :: Options 122 | outputContentOptions = 123 | aesonOptions 124 | { sumEncoding = TaggedObject{ tagFieldName = "type", contentsFieldName = "" } 125 | , tagSingleConstructors = True 126 | } 127 | 128 | instance FromJSON OutputContent where 129 | parseJSON = genericParseJSON outputContentOptions 130 | 131 | instance ToJSON OutputContent where 132 | toJSON = genericToJSON outputContentOptions 133 | 134 | -- | An output message item 135 | data OutputMessage = OutputMessage 136 | { id :: Text 137 | , role :: Text 138 | , content :: Vector OutputContent 139 | , status :: Text 140 | } deriving stock (Generic, Show) 141 | 142 | instance FromJSON OutputMessage where 143 | parseJSON = genericParseJSON aesonOptions 144 | 145 | instance ToJSON OutputMessage where 146 | toJSON 
= genericToJSON aesonOptions 147 | 148 | -- | A generated output item. 149 | data OutputItem 150 | = Item_OutputMessage 151 | { message_id :: Text 152 | , message_role :: Text 153 | , message_content :: Vector OutputContent 154 | , message_status :: Text 155 | } 156 | | Item_FunctionToolCall 157 | { function_id :: Maybe Text 158 | , function_call_id :: Text 159 | , function_name :: Text 160 | , function_arguments :: Text 161 | , function_status :: Maybe Text 162 | } 163 | | Item_WebSearchToolCall 164 | { web_search_id :: Text 165 | , web_search_status :: Text 166 | , web_search_action :: Maybe WebSearchAction 167 | } 168 | | Item_FunctionToolCallOutput 169 | { function_output_id :: Maybe Text 170 | , function_output_call_id :: Text 171 | , function_output_output :: Text 172 | , function_output_status :: Maybe Text 173 | } 174 | | Item_FileSearchToolCall 175 | { file_search_id :: Text 176 | , file_search_status :: Text 177 | , file_search_queries :: Vector Text 178 | , file_search_results :: Maybe (Vector FileSearchResult) 179 | } 180 | | Item_CodeInterpreterToolCall 181 | { code_interpreter_id :: Text 182 | , code_interpreter_status :: Text 183 | , code_interpreter_container_id :: Maybe Text 184 | , code_interpreter_code :: Maybe Text 185 | , code_interpreter_outputs :: Maybe (Vector CodeInterpreterOutput) 186 | } 187 | | Item_Reasoning 188 | { reasoning_id :: Text 189 | , reasoning_encrypted_content :: Maybe Text 190 | , reasoning_summary :: Maybe (Vector SummaryPart) 191 | , reasoning_content :: Maybe (Vector ReasoningText) 192 | , reasoning_status :: Maybe Text 193 | } 194 | deriving stock (Generic, Show) 195 | 196 | outputItemOptions :: Options 197 | outputItemOptions = aesonOptions 198 | { sumEncoding = TaggedObject{ tagFieldName = "type", contentsFieldName = "" } 199 | , tagSingleConstructors = True 200 | , fieldLabelModifier = \s -> case s of 201 | "message_id" -> "id" 202 | "message_role" -> "role" 203 | "message_content" -> "content" 204 | "message_status" -> "status" 205 | "function_id" -> "id" 206 | "function_call_id" -> "call_id" 207 | "function_name" -> "name" 208 | "function_arguments" -> "arguments" 209 | "function_status" -> "status" 210 | "web_search_id" -> "id" 211 | "web_search_status" -> "status" 212 | "web_search_action" -> "action" 213 | "function_output_id" -> "id" 214 | "function_output_call_id" -> "call_id" 215 | "function_output_output" -> "output" 216 | "function_output_status" -> "status" 217 | "file_search_id" -> "id" 218 | "file_search_status" -> "status" 219 | "file_search_queries" -> "queries" 220 | "file_search_results" -> "results" 221 | "code_interpreter_id" -> "id" 222 | "code_interpreter_status" -> "status" 223 | "code_interpreter_container_id" -> "container_id" 224 | "code_interpreter_code" -> "code" 225 | "code_interpreter_outputs" -> "outputs" 226 | "reasoning_id" -> "id" 227 | "reasoning_encrypted_content" -> "encrypted_content" 228 | "reasoning_summary" -> "summary" 229 | "reasoning_content" -> "content" 230 | "reasoning_status" -> "status" 231 | other -> other 232 | , constructorTagModifier = \s -> case s of 233 | "Item_OutputMessage" -> "message" 234 | "Item_FunctionToolCall" -> "function_call" 235 | "Item_WebSearchToolCall" -> "web_search_call" 236 | "Item_FunctionToolCallOutput" -> "function_call_output" 237 | "Item_FileSearchToolCall" -> "file_search_call" 238 | "Item_CodeInterpreterToolCall" -> "code_interpreter_call" 239 | "Item_Reasoning" -> "reasoning" 240 | _ -> Prelude.error "Unknown OutputItem constructor" 241 | } 242 | 243 | instance 
FromJSON OutputItem where 244 | parseJSON = genericParseJSON outputItemOptions 245 | 246 | instance ToJSON OutputItem where 247 | toJSON = genericToJSON outputItemOptions 248 | 249 | -- | Function tool call output item 250 | data FunctionToolCall = FunctionToolCall 251 | { id :: Maybe Text 252 | , call_id :: Text 253 | , name :: Text 254 | , arguments :: Text 255 | , status :: Maybe Text 256 | } deriving stock (Generic, Show) 257 | deriving anyclass (FromJSON, ToJSON) 258 | 259 | -- | Function tool call output item 260 | data FunctionToolCallOutput = FunctionToolCallOutput 261 | { id :: Maybe Text 262 | , call_id :: Text 263 | , output :: Text 264 | , status :: Maybe Text 265 | } deriving stock (Generic, Show) 266 | deriving anyclass (FromJSON, ToJSON) 267 | 268 | -- | Web search tool call output item (action is left generic for now) 269 | data WebSearchToolCall = WebSearchToolCall 270 | { id :: Text 271 | , status :: Text 272 | , action :: Maybe WebSearchAction 273 | } deriving stock (Generic, Show) 274 | 275 | instance FromJSON WebSearchToolCall where 276 | parseJSON = genericParseJSON aesonOptions 277 | 278 | instance ToJSON WebSearchToolCall where 279 | toJSON = genericToJSON aesonOptions 280 | 281 | -- | File search result entry 282 | data FileSearchResult = FileSearchResult 283 | { file_id :: Text 284 | , text :: Text 285 | , filename :: Text 286 | , score :: Maybe Double 287 | } deriving stock (Generic, Show) 288 | deriving anyclass (FromJSON, ToJSON) 289 | 290 | -- | File search tool call output item 291 | data FileSearchToolCall = FileSearchToolCall 292 | { id :: Text 293 | , status :: Text 294 | , queries :: Vector Text 295 | , results :: Maybe (Vector FileSearchResult) 296 | } deriving stock (Generic, Show) 297 | deriving anyclass (FromJSON, ToJSON) 298 | 299 | -- | Code interpreter tool call outputs 300 | data CodeInterpreterOutput 301 | = CodeInterpreterOutput_Logs{ logs :: Text } 302 | | CodeInterpreterOutput_Image{ url :: Text } 303 | deriving stock (Generic, Show) 304 | 305 | codeInterpreterOutputOptions :: Options 306 | codeInterpreterOutputOptions = aesonOptions 307 | { sumEncoding = TaggedObject{ tagFieldName = "type", contentsFieldName = "" } 308 | , tagSingleConstructors = True 309 | , constructorTagModifier = stripPrefix "CodeInterpreterOutput_" 310 | } 311 | 312 | instance FromJSON CodeInterpreterOutput where 313 | parseJSON = genericParseJSON codeInterpreterOutputOptions 314 | 315 | instance ToJSON CodeInterpreterOutput where 316 | toJSON = genericToJSON codeInterpreterOutputOptions 317 | 318 | -- | Code interpreter tool call output item 319 | data CodeInterpreterToolCall = CodeInterpreterToolCall 320 | { id :: Text 321 | , status :: Text 322 | , container_id :: Maybe Text 323 | , code :: Maybe Text 324 | , outputs :: Maybe (Vector CodeInterpreterOutput) 325 | } deriving stock (Generic, Show) 326 | deriving anyclass (FromJSON, ToJSON) 327 | 328 | -- | Web search action sources 329 | data WebSearchSource = WebSearchSource_URL{ url :: Text } 330 | deriving stock (Generic, Show) 331 | 332 | webSearchSourceOptions :: Options 333 | webSearchSourceOptions = aesonOptions 334 | { sumEncoding = TaggedObject{ tagFieldName = "type", contentsFieldName = "" } 335 | , tagSingleConstructors = True 336 | , constructorTagModifier = stripPrefix "WebSearchSource_" 337 | } 338 | 339 | instance FromJSON WebSearchSource where 340 | parseJSON = genericParseJSON webSearchSourceOptions 341 | 342 | instance ToJSON WebSearchSource where 343 | toJSON = genericToJSON webSearchSourceOptions 344 
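-- A minimal usage sketch, not part of the upstream module: the name
-- @_exampleInput@ below is introduced here purely for illustration. It
-- builds an 'Input' for the Responses API from a single user message,
-- using only the input types defined above; @OverloadedLists@ and
-- @OverloadedStrings@ (enabled project-wide) let the literals stand in
-- for 'Vector' and 'Text' values.
_exampleInput :: Input
_exampleInput = Input
    [ Item_InputMessage
        { role = User
        , content = [ Input_Text{ text = "What is the capital of France?" } ]
        , status = Nothing
        }
    ]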
| 345 | -- | Web search action 346 | data WebSearchAction 347 | = WebSearchAction_Search 348 | { query :: Maybe Text 349 | , sources :: Maybe (Vector WebSearchSource) 350 | } 351 | | WebSearchAction_Open_Page 352 | { url :: Maybe Text } 353 | | WebSearchAction_Find 354 | { url :: Maybe Text 355 | , pattern :: Maybe Text 356 | } 357 | deriving stock (Generic, Show) 358 | 359 | webSearchActionOptions :: Options 360 | webSearchActionOptions = aesonOptions 361 | { sumEncoding = TaggedObject{ tagFieldName = "type", contentsFieldName = "" } 362 | , tagSingleConstructors = True 363 | , constructorTagModifier = stripPrefix "WebSearchAction_" 364 | } 365 | 366 | instance FromJSON WebSearchAction where 367 | parseJSON = genericParseJSON webSearchActionOptions 368 | 369 | instance ToJSON WebSearchAction where 370 | toJSON = genericToJSON webSearchActionOptions 371 | 372 | -- | Output text annotation 373 | data Annotation 374 | = Annotation_File_Citation 375 | { file_id :: Text 376 | , index :: Natural 377 | , filename :: Text 378 | } 379 | | Annotation_Url_Citation 380 | { url :: Text 381 | , start_index :: Natural 382 | , end_index :: Natural 383 | , title :: Text 384 | } 385 | | Annotation_Container_File_Citation 386 | { container_id :: Text 387 | , file_id :: Text 388 | , start_index :: Natural 389 | , end_index :: Natural 390 | , filename :: Text 391 | } 392 | | Annotation_File_Path 393 | { file_id :: Text 394 | , index :: Natural 395 | } 396 | deriving stock (Generic, Show) 397 | 398 | annotationOptions :: Options 399 | annotationOptions = aesonOptions 400 | { sumEncoding = TaggedObject{ tagFieldName = "type", contentsFieldName = "" } 401 | , tagSingleConstructors = True 402 | , constructorTagModifier = stripPrefix "Annotation_" 403 | } 404 | 405 | instance FromJSON Annotation where 406 | parseJSON = genericParseJSON annotationOptions 407 | 408 | instance ToJSON Annotation where 409 | toJSON = genericToJSON annotationOptions 410 | 411 | -- | Reasoning summary part 412 | data SummaryPart = Summary_Text{ text :: Text } 413 | deriving stock (Generic, Show) 414 | 415 | summaryPartOptions :: Options 416 | summaryPartOptions = aesonOptions 417 | { sumEncoding = TaggedObject{ tagFieldName = "type", contentsFieldName = "" } 418 | , tagSingleConstructors = True 419 | } 420 | 421 | instance FromJSON SummaryPart where 422 | parseJSON = genericParseJSON summaryPartOptions 423 | 424 | instance ToJSON SummaryPart where 425 | toJSON = genericToJSON summaryPartOptions 426 | 427 | -- | Reasoning text part 428 | data ReasoningText = Reasoning_Text{ text :: Text } 429 | deriving stock (Generic, Show) 430 | 431 | reasoningTextOptions :: Options 432 | reasoningTextOptions = aesonOptions 433 | { sumEncoding = TaggedObject{ tagFieldName = "type", contentsFieldName = "" } 434 | , tagSingleConstructors = True 435 | } 436 | 437 | instance FromJSON ReasoningText where 438 | parseJSON = genericParseJSON reasoningTextOptions 439 | 440 | instance ToJSON ReasoningText where 441 | toJSON = genericToJSON reasoningTextOptions 442 | 443 | -- | Reasoning item produced by reasoning models 444 | data ReasoningItem = ReasoningItem 445 | { id :: Text 446 | , encrypted_content :: Maybe Text 447 | , summary :: Maybe (Vector SummaryPart) 448 | , content :: Maybe (Vector ReasoningText) 449 | , status :: Maybe Text 450 | } deriving stock (Generic, Show) 451 | 452 | instance FromJSON ReasoningItem where 453 | parseJSON = genericParseJSON aesonOptions 454 | 455 | instance ToJSON ReasoningItem where 456 | toJSON = genericToJSON aesonOptions 
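-- A minimal sketch, not part of the upstream module: @_outputTexts@ is a
-- hypothetical helper named here only for illustration. It collects the
-- plain text parts of a single 'OutputItem', skipping refusals and
-- non-message items, assuming 'foldMap' is re-exported by the project
-- prelude along with the rest of the base Prelude used in these modules.
_outputTexts :: OutputItem -> [Text]
_outputTexts Item_OutputMessage{ message_content } =
    foldMap partText message_content
  where
    partText Output_Text{ text } = [ text ]
    partText Refusal{ } = []
_outputTexts _ = []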
457 | 458 | -- | Streaming events for /v1/responses 459 | data ResponseStreamEvent 460 | = ResponseCreatedEvent 461 | { response :: ResponseObject 462 | , sequence_number :: Natural 463 | } 464 | | ResponseInProgressEvent 465 | { response :: ResponseObject 466 | , sequence_number :: Natural 467 | } 468 | | ResponseCompletedEvent 469 | { response :: ResponseObject 470 | , sequence_number :: Natural 471 | } 472 | | ResponseFailedEvent 473 | { response :: ResponseObject 474 | , sequence_number :: Natural 475 | } 476 | | ResponseOutputItemAddedEvent 477 | { output_index :: Natural 478 | , item :: OutputItem 479 | , sequence_number :: Natural 480 | } 481 | | ResponseOutputItemDoneEvent 482 | { output_index :: Natural 483 | , sequence_number :: Natural 484 | } 485 | | ResponseContentPartAddedEvent 486 | { item_id :: Text 487 | , output_index :: Natural 488 | , content_index :: Natural 489 | , part :: OutputContent 490 | , sequence_number :: Natural 491 | } 492 | | ResponseContentPartDoneEvent 493 | { item_id :: Text 494 | , output_index :: Natural 495 | , content_index :: Natural 496 | , part :: OutputContent 497 | , sequence_number :: Natural 498 | } 499 | | ResponseTextDeltaEvent 500 | { item_id :: Text 501 | , output_index :: Natural 502 | , content_index :: Natural 503 | , delta :: Text 504 | , sequence_number :: Natural 505 | } 506 | | ResponseTextDoneEvent 507 | { item_id :: Text 508 | , output_index :: Natural 509 | , content_index :: Natural 510 | , text :: Text 511 | , sequence_number :: Natural 512 | } 513 | | ResponseOutputTextAnnotationAddedEvent 514 | { item_id :: Text 515 | , output_index :: Natural 516 | , content_index :: Natural 517 | , annotation_index :: Natural 518 | , annotation :: Annotation 519 | , sequence_number :: Natural 520 | } 521 | | ResponseWebSearchCallInProgressEvent 522 | { output_index :: Natural 523 | , item_id :: Text 524 | , sequence_number :: Natural 525 | } 526 | | ResponseWebSearchCallSearchingEvent 527 | { output_index :: Natural 528 | , item_id :: Text 529 | , sequence_number :: Natural 530 | } 531 | | ResponseWebSearchCallCompletedEvent 532 | { output_index :: Natural 533 | , item_id :: Text 534 | , sequence_number :: Natural 535 | } 536 | | ResponseFileSearchCallInProgressEvent 537 | { output_index :: Natural 538 | , item_id :: Text 539 | , sequence_number :: Natural 540 | } 541 | | ResponseFileSearchCallSearchingEvent 542 | { output_index :: Natural 543 | , item_id :: Text 544 | , sequence_number :: Natural 545 | } 546 | | ResponseFileSearchCallCompletedEvent 547 | { output_index :: Natural 548 | , item_id :: Text 549 | , sequence_number :: Natural 550 | } 551 | | ResponseCodeInterpreterCallInProgressEvent 552 | { output_index :: Natural 553 | , item_id :: Text 554 | , sequence_number :: Natural 555 | } 556 | | ResponseCodeInterpreterCallInterpretingEvent 557 | { output_index :: Natural 558 | , item_id :: Text 559 | , sequence_number :: Natural 560 | } 561 | | ResponseCodeInterpreterCallCompletedEvent 562 | { output_index :: Natural 563 | , item_id :: Text 564 | , sequence_number :: Natural 565 | } 566 | | ResponseCodeInterpreterCallCodeDeltaEvent 567 | { output_index :: Natural 568 | , item_id :: Text 569 | , delta :: Text 570 | , sequence_number :: Natural 571 | } 572 | | ResponseCodeInterpreterCallCodeDoneEvent 573 | { output_index :: Natural 574 | , item_id :: Text 575 | , code :: Text 576 | , sequence_number :: Natural 577 | } 578 | | ErrorEvent 579 | { error_code :: Maybe Text 580 | , error_message :: Text 581 | , error_param :: Maybe Text 582 
| , error_sequence_number :: Natural 583 | } 584 | deriving stock (Generic, Show) 585 | 586 | responseStreamEventOptions :: Options 587 | responseStreamEventOptions = aesonOptions 588 | { sumEncoding = TaggedObject{ tagFieldName = "type", contentsFieldName = "" } 589 | , tagSingleConstructors = True 590 | , fieldLabelModifier = \s -> case s of 591 | -- Strip "error_" prefix from ErrorEvent fields 592 | "error_code" -> "code" 593 | "error_message" -> "message" 594 | "error_param" -> "param" 595 | "error_sequence_number" -> "sequence_number" 596 | -- Keep all other fields as-is 597 | other -> other 598 | , constructorTagModifier = \s -> case s of 599 | "ResponseCreatedEvent" -> "response.created" 600 | "ResponseInProgressEvent" -> "response.in_progress" 601 | "ResponseCompletedEvent" -> "response.completed" 602 | "ResponseFailedEvent" -> "response.failed" 603 | "ResponseOutputItemAddedEvent" -> "response.output_item.added" 604 | "ResponseOutputItemDoneEvent" -> "response.output_item.done" 605 | "ResponseContentPartAddedEvent" -> "response.content_part.added" 606 | "ResponseContentPartDoneEvent" -> "response.content_part.done" 607 | "ResponseTextDeltaEvent" -> "response.output_text.delta" 608 | "ResponseTextDoneEvent" -> "response.output_text.done" 609 | "ResponseOutputTextAnnotationAddedEvent" -> "response.output_text.annotation.added" 610 | "ResponseWebSearchCallInProgressEvent" -> "response.web_search_call.in_progress" 611 | "ResponseWebSearchCallSearchingEvent" -> "response.web_search_call.searching" 612 | "ResponseWebSearchCallCompletedEvent" -> "response.web_search_call.completed" 613 | "ResponseFileSearchCallInProgressEvent" -> "response.file_search_call.in_progress" 614 | "ResponseFileSearchCallSearchingEvent" -> "response.file_search_call.searching" 615 | "ResponseFileSearchCallCompletedEvent" -> "response.file_search_call.completed" 616 | "ResponseCodeInterpreterCallInProgressEvent" -> "response.code_interpreter_call.in_progress" 617 | "ResponseCodeInterpreterCallInterpretingEvent" -> "response.code_interpreter_call.interpreting" 618 | "ResponseCodeInterpreterCallCompletedEvent" -> "response.code_interpreter_call.completed" 619 | "ResponseCodeInterpreterCallCodeDeltaEvent" -> "response.code_interpreter_call_code.delta" 620 | "ResponseCodeInterpreterCallCodeDoneEvent" -> "response.code_interpreter_call_code.done" 621 | "ErrorEvent" -> "error" 622 | _ -> Prelude.error "Unknown ResponseStreamEvent constructor" 623 | } 624 | 625 | instance FromJSON ResponseStreamEvent where 626 | parseJSON = genericParseJSON responseStreamEventOptions 627 | 628 | instance ToJSON ResponseStreamEvent where 629 | toJSON = genericToJSON responseStreamEventOptions 630 | 631 | 632 | -- | Usage statistics for the response request 633 | data ResponseUsage = ResponseUsage 634 | { input_tokens :: Natural 635 | , input_tokens_details :: InputTokensDetails 636 | , output_tokens :: Natural 637 | , output_tokens_details :: OutputTokensDetails 638 | , total_tokens :: Natural 639 | } deriving stock (Generic, Show) 640 | deriving anyclass (FromJSON, ToJSON) 641 | 642 | data InputTokensDetails = InputTokensDetails 643 | { cached_tokens :: Natural 644 | } deriving stock (Generic, Show) 645 | deriving anyclass (FromJSON, ToJSON) 646 | 647 | data OutputTokensDetails = OutputTokensDetails 648 | { reasoning_tokens :: Natural 649 | } deriving stock (Generic, Show) 650 | deriving anyclass (FromJSON, ToJSON) 651 | 652 | -- | Response object 653 | data ResponseObject = ResponseObject 654 | { id :: Text 655 | , object :: Text 656 
| , created_at :: POSIXTime 657 | , status :: Text 658 | , error :: Maybe Value 659 | , incomplete_details :: Maybe Value 660 | , instructions :: Maybe Value 661 | , model :: Model 662 | , output :: Vector OutputItem 663 | , parallel_tool_calls :: Bool 664 | , previous_response_id :: Maybe Text 665 | , reasoning :: Maybe Value 666 | , store :: Maybe Bool 667 | , temperature :: Maybe Double 668 | , tool_choice :: Maybe ToolChoice 669 | , tools :: Maybe (Vector Tool) 670 | , top_p :: Maybe Double 671 | , truncation :: Maybe Text 672 | , usage :: Maybe ResponseUsage 673 | , user :: Maybe Text 674 | , metadata :: Maybe (Map Text Text) 675 | } deriving stock (Generic, Show) 676 | deriving anyclass (FromJSON, ToJSON) 677 | 678 | -- | Request body for /v1/responses 679 | data CreateResponse = CreateResponse 680 | { model :: Model 681 | , input :: Maybe Input 682 | , include :: Maybe (Vector Text) 683 | , parallel_tool_calls :: Maybe Bool 684 | , store :: Maybe Bool 685 | , instructions :: Maybe Text 686 | , stream :: Maybe Bool 687 | , stream_options :: Maybe Value 688 | , metadata :: Maybe (Map Text Text) 689 | , temperature :: Maybe Double 690 | , top_p :: Maybe Double 691 | , tools :: Maybe (Vector Tool) 692 | , tool_choice :: Maybe ToolChoice 693 | } deriving stock (Generic, Show) 694 | 695 | instance FromJSON CreateResponse where 696 | parseJSON = genericParseJSON aesonOptions 697 | 698 | instance ToJSON CreateResponse where 699 | toJSON = genericToJSON aesonOptions 700 | 701 | -- | Default CreateResponse 702 | _CreateResponse :: CreateResponse 703 | _CreateResponse = CreateResponse 704 | { input = Nothing 705 | , include = Nothing 706 | , parallel_tool_calls = Nothing 707 | , store = Nothing 708 | , instructions = Nothing 709 | , stream = Nothing 710 | , stream_options = Nothing 711 | , metadata = Nothing 712 | , temperature = Nothing 713 | , top_p = Nothing 714 | , tools = Nothing 715 | , tool_choice = Nothing 716 | } 717 | 718 | -- | Servant API for /v1/responses 719 | type API = 720 | "responses" 721 | :> ( ReqBody '[JSON] CreateResponse 722 | :> Post '[JSON] ResponseObject 723 | :<|> Capture "response_id" Text 724 | :> Get '[JSON] ResponseObject 725 | :<|> Capture "response_id" Text 726 | :> "cancel" 727 | :> Post '[JSON] ResponseObject 728 | :<|> Capture "response_id" Text 729 | :> "input_items" 730 | :> Get '[JSON] (ListOf InputItem) 731 | ) 732 | -------------------------------------------------------------------------------- /src/OpenAI/V1/Threads.hs: -------------------------------------------------------------------------------- 1 | -- | @\/v1\/threads@ 2 | module OpenAI.V1.Threads 3 | ( -- Main types 4 | ThreadID(..) 5 | , Thread(..) 6 | , _Thread 7 | , ModifyThread(..) 8 | , _ModifyThread 9 | , Message(..) 10 | , Content(..) 11 | , ThreadObject(..) 12 | -- * Other types 13 | , ImageURL(..) 14 | , ImageFile(..) 15 | , Attachment(..) 
16 | -- * Servant 17 | , API 18 | ) where 19 | 20 | import OpenAI.Prelude 21 | import OpenAI.V1.DeletionStatus 22 | import OpenAI.V1.Message 23 | import OpenAI.V1.ToolResources 24 | 25 | -- | Thread ID 26 | newtype ThreadID = ThreadID{ text :: Text } 27 | deriving newtype (FromJSON, IsString, Show, ToHttpApiData, ToJSON) 28 | 29 | -- | Request body for @\/v1\/threads@ 30 | data Thread = Thread 31 | { messages :: Maybe (Vector Message) 32 | , tool_resources :: Maybe ToolResources 33 | , metadata :: Maybe (Map Text Text) 34 | } deriving stock (Generic, Show) 35 | deriving anyclass (FromJSON, ToJSON) 36 | 37 | -- | Default `Thread` 38 | _Thread :: Thread 39 | _Thread = Thread 40 | { messages = Nothing 41 | , tool_resources = Nothing 42 | , metadata = Nothing 43 | } 44 | 45 | -- | Request body for @\/v1\/threads\/:thread_id@ 46 | data ModifyThread = ModifyThread 47 | { tool_resources :: Maybe ToolResources 48 | , metadata :: Maybe (Map Text Text) 49 | } deriving stock (Generic, Show) 50 | 51 | instance FromJSON ModifyThread where 52 | parseJSON = genericParseJSON aesonOptions 53 | 54 | instance ToJSON ModifyThread where 55 | toJSON = genericToJSON aesonOptions 56 | 57 | -- | Default `ModifyThread` 58 | _ModifyThread :: ModifyThread 59 | _ModifyThread = ModifyThread 60 | { tool_resources = Nothing 61 | , metadata = Nothing 62 | } 63 | 64 | -- | Represents a thread that contains messages 65 | data ThreadObject = ThreadObject 66 | { id :: ThreadID 67 | , object :: Text 68 | , created_at :: POSIXTime 69 | , tool_resources :: Maybe ToolResources 70 | , metadata :: Maybe (Map Text Text) 71 | } deriving stock (Generic, Show) 72 | deriving anyclass (FromJSON, ToJSON) 73 | 74 | -- | Servant API 75 | type API = 76 | "threads" 77 | :> Header' '[Required, Strict] "OpenAI-Beta" Text 78 | :> ( ReqBody '[JSON] Thread 79 | :> Post '[JSON] ThreadObject 80 | :<|> Capture "thread_id" ThreadID 81 | :> Get '[JSON] ThreadObject 82 | :<|> Capture "thread_id" ThreadID 83 | :> ReqBody '[JSON] ModifyThread 84 | :> Post '[JSON] ThreadObject 85 | :<|> Capture "thread_id" ThreadID 86 | :> Delete '[JSON] DeletionStatus 87 | ) 88 | -------------------------------------------------------------------------------- /src/OpenAI/V1/Threads/Messages.hs: -------------------------------------------------------------------------------- 1 | -- | @\/v1\/threads/:thread_id/messages@ 2 | module OpenAI.V1.Threads.Messages 3 | ( -- * Main types 4 | MessageID(..) 5 | , Message(..) 6 | , ModifyMessage(..) 7 | , _ModifyMessage 8 | , MessageObject(..) 9 | -- * Other types 10 | , Status(..) 11 | , IncompleteDetails(..) 12 | , File(..) 13 | , Annotation(..) 14 | , TextObject(..) 
15 | -- * Servant 16 | , API 17 | ) where 18 | 19 | import OpenAI.Prelude 20 | import OpenAI.V1.Assistants (AssistantID) 21 | import OpenAI.V1.DeletionStatus 22 | import OpenAI.V1.Files (FileID) 23 | import OpenAI.V1.ListOf 24 | import OpenAI.V1.Message 25 | import OpenAI.V1.Threads (ThreadID) 26 | import OpenAI.V1.Threads.Runs (RunID) 27 | 28 | -- | Message ID 29 | newtype MessageID = MessageID{ text :: Text } 30 | deriving newtype (FromJSON, IsString, Show, ToHttpApiData, ToJSON) 31 | 32 | -- | Request body for @\/v1\/threads\/:thread_id\/messages\/:message_id@ 33 | data ModifyMessage = ModifyMessage 34 | { metadata :: Maybe (Map Text Text) 35 | } deriving stock (Generic, Show) 36 | 37 | instance FromJSON ModifyMessage where 38 | parseJSON = genericParseJSON aesonOptions 39 | 40 | instance ToJSON ModifyMessage where 41 | toJSON = genericToJSON aesonOptions 42 | 43 | -- | Default `ModifyMessage` 44 | _ModifyMessage :: ModifyMessage 45 | _ModifyMessage = ModifyMessage 46 | { metadata = Nothing 47 | } 48 | 49 | -- | The status of the message 50 | data Status = In_Progress | Incomplete | Completed 51 | deriving stock (Generic, Show) 52 | 53 | instance FromJSON Status where 54 | parseJSON = genericParseJSON aesonOptions 55 | 56 | instance ToJSON Status where 57 | toJSON = genericToJSON aesonOptions 58 | 59 | -- | On an incomplete message, details about why the message is incomplete. 60 | data IncompleteDetails = IncompleteDetails 61 | { reason :: Text 62 | } deriving (Generic, Show) 63 | deriving anyclass (FromJSON, ToJSON) 64 | 65 | -- | File 66 | data File = File 67 | { file_id :: FileID 68 | } deriving stock (Generic, Show) 69 | deriving anyclass (FromJSON, ToJSON) 70 | 71 | -- | An annotation 72 | data Annotation 73 | = File_Citation 74 | { text :: Text 75 | , file_citation :: File 76 | , start_index :: Natural 77 | , end_index :: Natural 78 | } 79 | | File_Path 80 | { text :: Text 81 | , file_path :: File 82 | , start_index :: Natural 83 | , end_index :: Natural 84 | } 85 | deriving stock (Generic, Show) 86 | 87 | annotationOptions :: Options 88 | annotationOptions = 89 | aesonOptions 90 | { sumEncoding = 91 | TaggedObject{ tagFieldName = "type", contentsFieldName = "" } 92 | 93 | , tagSingleConstructors = True 94 | } 95 | 96 | instance FromJSON Annotation where 97 | parseJSON = genericParseJSON annotationOptions 98 | 99 | instance ToJSON Annotation where 100 | toJSON = genericToJSON annotationOptions 101 | 102 | -- | The text content that is part of a message. 103 | data TextObject = TextObject 104 | { value :: Text 105 | , annotations :: Vector Annotation 106 | } deriving stock (Generic, Show) 107 | deriving anyclass (FromJSON, ToJSON) 108 | 109 | instance IsString TextObject where 110 | fromString string = 111 | TextObject{ value = fromString string, annotations = [] } 112 | 113 | -- | Represents a message within a thread. 
114 | data MessageObject = MessageObject 115 | { id :: MessageID 116 | , object :: Text 117 | , created_at :: POSIXTime 118 | , thread_id :: ThreadID 119 | , status :: Maybe Status 120 | , incomplete_details :: Maybe IncompleteDetails 121 | , completed_at :: Maybe POSIXTime 122 | , incomplete_at :: Maybe POSIXTime 123 | , role :: Text 124 | , content :: Vector (Content TextObject) 125 | , assistant_id :: Maybe AssistantID 126 | , run_id :: Maybe RunID 127 | , attachments :: Maybe (Vector Attachment) 128 | , metadata :: Map Text Text 129 | } deriving (Generic, Show) 130 | deriving anyclass (FromJSON, ToJSON) 131 | 132 | -- | Servant API 133 | type API = 134 | Header' '[Required, Strict] "OpenAI-Beta" Text 135 | :> "threads" 136 | :> ( Capture "thread_id" ThreadID 137 | :> "messages" 138 | :> ReqBody '[JSON] Message 139 | :> Post '[JSON] MessageObject 140 | :<|> Capture "thread_id" ThreadID 141 | :> "messages" 142 | :> Get '[JSON] (ListOf MessageObject) 143 | :<|> Capture "thread_id" ThreadID 144 | :> "messages" 145 | :> Capture "message_id" MessageID 146 | :> Get '[JSON] MessageObject 147 | :<|> Capture "thread_id" ThreadID 148 | :> "messages" 149 | :> Capture "message_id" MessageID 150 | :> ReqBody '[JSON] ModifyMessage 151 | :> Post '[JSON] MessageObject 152 | :<|> Capture "thread_id" ThreadID 153 | :> "messages" 154 | :> Capture "message_id" MessageID 155 | :> Delete '[JSON] DeletionStatus 156 | ) 157 | -------------------------------------------------------------------------------- /src/OpenAI/V1/Threads/Runs.hs: -------------------------------------------------------------------------------- 1 | -- | @\/v1\/threads\/:thread_id\/runs@ 2 | module OpenAI.V1.Threads.Runs 3 | ( -- * Main types 4 | RunID(..) 5 | , CreateRun(..) 6 | , _CreateRun 7 | , CreateThreadAndRun(..) 8 | , _CreateThreadAndRun 9 | , ModifyRun(..) 10 | , _ModifyRun 11 | , SubmitToolOutputsToRun(..) 12 | , _SubmitToolOutputsToRun 13 | , RunObject(..) 14 | -- * Other types 15 | , TruncationStrategy(..) 16 | , SubmitToolOutputs(..) 17 | , RequiredAction(..) 18 | , IncompleteDetails(..) 19 | , ToolOutput(..) 20 | , Status(..) 
21 | -- * Servant 22 | , API 23 | ) where 24 | 25 | import OpenAI.Prelude 26 | import OpenAI.V1.Assistants (AssistantID) 27 | import OpenAI.V1.AutoOr 28 | import OpenAI.V1.Error 29 | import OpenAI.V1.ListOf 30 | import OpenAI.V1.Message 31 | import OpenAI.V1.Models (Model) 32 | import OpenAI.V1.Order 33 | import OpenAI.V1.ResponseFormat 34 | import OpenAI.V1.Threads (Thread, ThreadID) 35 | import OpenAI.V1.Tool 36 | import OpenAI.V1.ToolCall 37 | import OpenAI.V1.ToolResources 38 | import OpenAI.V1.Usage 39 | 40 | -- | Run ID 41 | newtype RunID = RunID{ text :: Text } 42 | deriving newtype (FromJSON, IsString, Show, ToHttpApiData, ToJSON) 43 | 44 | -- | Controls for how a thread will be truncated prior to the run 45 | data TruncationStrategy 46 | = Auto 47 | | Last_Messages{ last_messages :: Maybe Natural } 48 | deriving stock (Generic, Show) 49 | 50 | truncationStrategyOptions :: Options 51 | truncationStrategyOptions = aesonOptions 52 | { sumEncoding = 53 | TaggedObject{ tagFieldName = "type", contentsFieldName = "" } 54 | 55 | , tagSingleConstructors = True 56 | } 57 | 58 | instance FromJSON TruncationStrategy where 59 | parseJSON = genericParseJSON truncationStrategyOptions 60 | 61 | instance ToJSON TruncationStrategy where 62 | toJSON = genericToJSON truncationStrategyOptions 63 | 64 | -- | Request body for @\/v1\/threads\/:thread_id\/runs@ 65 | data CreateRun = CreateRun 66 | { assistant_id :: AssistantID 67 | , model :: Maybe Model 68 | , instructions :: Maybe Text 69 | , additional_instructions :: Maybe Text 70 | , additional_messages :: Maybe (Vector Message) 71 | , tools :: Maybe (Vector Tool) 72 | , metadata :: Maybe (Map Text Text) 73 | , temperature :: Maybe Double 74 | , top_p :: Maybe Double 75 | , max_prompt_tokens :: Maybe Natural 76 | , max_completion_tokens :: Maybe Natural 77 | , truncation_strategy :: Maybe TruncationStrategy 78 | , tool_choice :: Maybe ToolChoice 79 | , parallel_tool_calls :: Maybe Bool 80 | , response_format :: Maybe (AutoOr ResponseFormat) 81 | } deriving stock (Generic, Show) 82 | 83 | instance FromJSON CreateRun where 84 | parseJSON = genericParseJSON aesonOptions 85 | 86 | instance ToJSON CreateRun where 87 | toJSON = genericToJSON aesonOptions 88 | 89 | -- | Default `CreateRun` 90 | _CreateRun :: CreateRun 91 | _CreateRun = CreateRun 92 | { model = Nothing 93 | , instructions = Nothing 94 | , additional_instructions = Nothing 95 | , additional_messages = Nothing 96 | , tools = Nothing 97 | , metadata = Nothing 98 | , temperature = Nothing 99 | , top_p = Nothing 100 | , max_prompt_tokens = Nothing 101 | , max_completion_tokens = Nothing 102 | , truncation_strategy = Nothing 103 | , tool_choice = Nothing 104 | , parallel_tool_calls = Nothing 105 | , response_format = Nothing 106 | } 107 | 108 | -- | Request body for @\/v1\/threads\/runs@ 109 | data CreateThreadAndRun = CreateThreadAndRun 110 | { assistant_id :: AssistantID 111 | , thread :: Maybe Thread 112 | , model :: Maybe Model 113 | , instructions :: Maybe Text 114 | , tools :: Maybe (Vector Tool) 115 | , toolResources :: Maybe ToolResources 116 | , metadata :: Maybe (Map Text Text) 117 | , temperature :: Maybe Double 118 | , top_p :: Maybe Double 119 | , max_prompt_tokens :: Maybe Natural 120 | , max_completion_tokens :: Maybe Natural 121 | , truncation_strategy :: Maybe TruncationStrategy 122 | , tool_choice :: Maybe ToolChoice 123 | , parallel_tool_calls :: Maybe Bool 124 | , response_format :: Maybe (AutoOr ResponseFormat) 125 | } deriving stock (Generic, Show) 126 | 127 | instance 
FromJSON CreateThreadAndRun where 128 | parseJSON = genericParseJSON aesonOptions 129 | 130 | instance ToJSON CreateThreadAndRun where 131 | toJSON = genericToJSON aesonOptions 132 | 133 | -- | Default `CreateThreadAndRun` 134 | _CreateThreadAndRun :: CreateThreadAndRun 135 | _CreateThreadAndRun = CreateThreadAndRun 136 | { thread = Nothing 137 | , model = Nothing 138 | , instructions = Nothing 139 | , tools = Nothing 140 | , toolResources = Nothing 141 | , metadata = Nothing 142 | , temperature = Nothing 143 | , top_p = Nothing 144 | , max_prompt_tokens = Nothing 145 | , max_completion_tokens = Nothing 146 | , truncation_strategy = Nothing 147 | , tool_choice = Nothing 148 | , parallel_tool_calls = Nothing 149 | , response_format = Nothing 150 | } 151 | 152 | -- | Details on the tool outputs needed for this run to continue. 153 | data SubmitToolOutputs = SubmitToolOutputs 154 | { tool_calls :: Vector ToolCall 155 | } deriving stock (Generic, Show) 156 | deriving anyclass (FromJSON, ToJSON) 157 | 158 | -- | The status of the run 159 | data Status 160 | = Queued 161 | | In_Progress 162 | | Requires_Action 163 | | Cancelling 164 | | Cancelled 165 | | Failed 166 | | Completed 167 | | Incomplete 168 | | Expired 169 | deriving stock (Generic, Show) 170 | 171 | instance FromJSON Status where 172 | parseJSON = genericParseJSON aesonOptions 173 | 174 | instance ToJSON Status where 175 | toJSON = genericToJSON aesonOptions 176 | 177 | -- | Details on the action required to continue the run 178 | data RequiredAction = RequiredAction_Submit_Tool_Outputs 179 | { submit_tool_outputs :: SubmitToolOutputs 180 | } deriving stock (Generic, Show) 181 | 182 | requiredActionOptions :: Options 183 | requiredActionOptions = 184 | aesonOptions 185 | { sumEncoding = 186 | TaggedObject{ tagFieldName = "type" } 187 | 188 | , tagSingleConstructors = True 189 | 190 | , constructorTagModifier = stripPrefix "RequiredAction_" 191 | } 192 | 193 | instance FromJSON RequiredAction where 194 | parseJSON = genericParseJSON requiredActionOptions 195 | 196 | instance ToJSON RequiredAction where 197 | toJSON = genericToJSON requiredActionOptions 198 | 199 | -- | Details on why the run is incomplete 200 | data IncompleteDetails = IncompleteDetails 201 | { reason :: Text 202 | } deriving stock (Generic, Show) 203 | deriving anyclass (FromJSON, ToJSON) 204 | 205 | -- | Represents an execution run on a thread. 
206 | data RunObject = RunObject 207 | { id :: RunID 208 | , object :: Text 209 | , created_at :: POSIXTime 210 | , thread_id :: ThreadID 211 | , assistant_id :: AssistantID 212 | , status :: Status 213 | , required_action :: Maybe RequiredAction 214 | , last_error :: Maybe Error 215 | , expires_at :: Maybe POSIXTime 216 | , started_at :: Maybe POSIXTime 217 | , cancelled_at :: Maybe POSIXTime 218 | , failed_at :: Maybe POSIXTime 219 | , completed_at :: Maybe POSIXTime 220 | , incomplete_details :: Maybe IncompleteDetails 221 | , model :: Model 222 | , instructions :: Maybe Text 223 | , tools :: Vector Tool 224 | , metadata :: Map Text Text 225 | , usage :: Maybe (Usage (Maybe CompletionTokensDetails) (Maybe PromptTokensDetails)) 226 | , temperature :: Maybe Double 227 | , top_p :: Maybe Double 228 | , max_prompt_tokens :: Maybe Natural 229 | , max_completion_tokens :: Maybe Natural 230 | , truncation_strategy :: Maybe TruncationStrategy 231 | , tool_choice :: ToolChoice 232 | , parallel_tool_calls :: Bool 233 | , response_format :: AutoOr ResponseFormat 234 | } deriving stock (Generic, Show) 235 | deriving anyclass (FromJSON, ToJSON) 236 | 237 | -- | Request body for @\/v1\/threads\/:thread_id\/runs\/:run_id@ 238 | data ModifyRun = ModifyRun 239 | { metadata :: Maybe (Map Text Text) 240 | } deriving stock (Generic, Show) 241 | 242 | instance FromJSON ModifyRun where 243 | parseJSON = genericParseJSON aesonOptions 244 | 245 | instance ToJSON ModifyRun where 246 | toJSON = genericToJSON aesonOptions 247 | 248 | -- | Default `ModifyRun` 249 | _ModifyRun :: ModifyRun 250 | _ModifyRun = ModifyRun{ } 251 | 252 | -- | A tool for which the output is being submitted 253 | data ToolOutput = ToolOutput 254 | { tool_call_id :: Maybe Text 255 | , output :: Text 256 | } deriving stock (Generic, Show) 257 | deriving anyclass (FromJSON, ToJSON) 258 | 259 | -- | Request body for @\/v1\/threads\/:thread_id\/runs\/:run_id\/submit_tool_outputs@ 260 | data SubmitToolOutputsToRun = SubmitToolOutputsToRun 261 | { tool_outputs :: Vector ToolOutput 262 | } deriving stock (Generic, Show) 263 | 264 | instance FromJSON SubmitToolOutputsToRun where 265 | parseJSON = genericParseJSON aesonOptions 266 | 267 | instance ToJSON SubmitToolOutputsToRun where 268 | toJSON = genericToJSON aesonOptions 269 | 270 | -- | Default implementation of `SubmitToolOutputsToRun` 271 | _SubmitToolOutputsToRun :: SubmitToolOutputsToRun 272 | _SubmitToolOutputsToRun = SubmitToolOutputsToRun{ } 273 | 274 | -- | Servant API 275 | type API = 276 | "threads" 277 | :> Header' '[Required, Strict] "OpenAI-Beta" Text 278 | :> ( Capture "thread_id" ThreadID 279 | :> "runs" 280 | :> QueryParam "include[]" Text 281 | :> ReqBody '[JSON] CreateRun 282 | :> Post '[JSON] RunObject 283 | :<|> "runs" 284 | :> ReqBody '[JSON] CreateThreadAndRun 285 | :> Post '[JSON] RunObject 286 | :<|> Capture "thread_id" ThreadID 287 | :> "runs" 288 | :> QueryParam "limit" Natural 289 | :> QueryParam "order" Order 290 | :> QueryParam "after" Text 291 | :> QueryParam "before" Text 292 | :> Get '[JSON] (ListOf RunObject) 293 | :<|> Capture "thread_id" ThreadID 294 | :> "runs" 295 | :> Capture "run_id" RunID 296 | :> Get '[JSON] RunObject 297 | :<|> Capture "thread_id" ThreadID 298 | :> "runs" 299 | :> Capture "run_id" RunID 300 | :> ReqBody '[JSON] ModifyRun 301 | :> Post '[JSON] RunObject 302 | :<|> Capture "thread_id" ThreadID 303 | :> "runs" 304 | :> Capture "run_id" RunID 305 | :> "submit_tool_outputs" 306 | :> ReqBody '[JSON] SubmitToolOutputsToRun 307 | :> Post 
'[JSON] RunObject 308 | :<|> Capture "thread_id" ThreadID 309 | :> "runs" 310 | :> Capture "run_id" RunID 311 | :> "cancel" 312 | :> Post '[JSON] RunObject 313 | ) 314 | -------------------------------------------------------------------------------- /src/OpenAI/V1/Threads/Runs/Steps.hs: -------------------------------------------------------------------------------- 1 | -- | @\/v1\/threads\/:thread_id\/runs\/:run_id\/steps@ 2 | module OpenAI.V1.Threads.Runs.Steps 3 | ( -- * Main types 4 | StepID(..) 5 | , RunStepObject(..) 6 | -- * Other types 7 | , Status(..) 8 | , Image(..) 9 | , Output(..) 10 | , CodeInterpreter(..) 11 | , RankingOptions(..) 12 | , Content(..) 13 | , Result(..) 14 | , FileSearch(..) 15 | , Function(..) 16 | , ToolCall(..) 17 | , StepDetails(..) 18 | -- * Servant 19 | , API 20 | ) where 21 | 22 | import OpenAI.Prelude 23 | import OpenAI.V1.Assistants (AssistantID) 24 | import OpenAI.V1.Error 25 | import OpenAI.V1.Files (FileID) 26 | import OpenAI.V1.ListOf 27 | import OpenAI.V1.Order 28 | import OpenAI.V1.Threads (ThreadID) 29 | import OpenAI.V1.Threads.Messages (MessageID) 30 | import OpenAI.V1.Threads.Runs (RunID) 31 | import OpenAI.V1.Usage 32 | 33 | -- | Step ID 34 | newtype StepID = StepID{ text :: Text } 35 | deriving newtype (FromJSON, IsString, Show, ToHttpApiData, ToJSON) 36 | 37 | -- | The status of the run step 38 | data Status = In_Progress | Cancelled | Failed | Completed | Expired 39 | deriving stock (Generic, Show) 40 | 41 | instance FromJSON Status where 42 | parseJSON = genericParseJSON aesonOptions 43 | 44 | instance ToJSON Status where 45 | toJSON = genericToJSON aesonOptions 46 | 47 | -- | Code Interpreter image output 48 | data Image = Image{ file_id :: FileID } 49 | deriving stock (Generic, Show) 50 | deriving anyclass (FromJSON, ToJSON) 51 | 52 | -- | An output from the Code Interpreter tool call 53 | data Output = Output_Logs{ logs :: Text } | Output_Image{ image :: Image } 54 | deriving stock (Generic, Show) 55 | 56 | outputOptions :: Options 57 | outputOptions = 58 | aesonOptions 59 | { sumEncoding = 60 | TaggedObject{ tagFieldName = "type", contentsFieldName = "" } 61 | 62 | , tagSingleConstructors = True 63 | 64 | , constructorTagModifier = stripPrefix "Output_" 65 | } 66 | 67 | instance FromJSON Output where 68 | parseJSON = genericParseJSON outputOptions 69 | 70 | instance ToJSON Output where 71 | toJSON = genericToJSON outputOptions 72 | 73 | -- | A Code Interpreter tool call 74 | data CodeInterpreter = CodeInterpreter 75 | { input :: Text 76 | , outputs :: Vector Output 77 | } deriving stock (Generic, Show) 78 | deriving anyclass (FromJSON, ToJSON) 79 | 80 | -- | The ranking options for the file search. 
81 | data RankingOptions = RankingOptions 82 | { ranker :: Text 83 | , score_threshold :: Double 84 | } deriving stock (Generic, Show) 85 | deriving anyclass (FromJSON, ToJSON) 86 | 87 | -- | The content of the result that was found 88 | data Content = Content 89 | { type_ :: Text 90 | , text :: Text 91 | } deriving stock (Generic, Show) 92 | 93 | instance FromJSON Content where 94 | parseJSON = genericParseJSON aesonOptions 95 | 96 | instance ToJSON Content where 97 | toJSON = genericToJSON aesonOptions 98 | 99 | -- | Result of the file search 100 | data Result = Result 101 | { file_id :: FileID 102 | , file_name :: Text 103 | , score :: Double 104 | , content :: Vector Content 105 | } deriving stock (Generic, Show) 106 | deriving anyclass (FromJSON, ToJSON) 107 | 108 | -- | A File Search tool call 109 | data FileSearch = FileSearch 110 | { ranking_options :: RankingOptions 111 | , results :: Vector Result 112 | } deriving stock (Generic, Show) 113 | deriving anyclass (FromJSON, ToJSON) 114 | 115 | -- | The definition of the function that was called 116 | data Function = Function 117 | { name :: Text 118 | , arguments :: Text 119 | , output :: Maybe Text 120 | } deriving (Generic, Show) 121 | deriving anyclass (FromJSON, ToJSON) 122 | 123 | -- | A tool call the run step was involved in 124 | data ToolCall 125 | = ToolCall_Code_Interpreter { id :: Text, code_interpreter :: CodeInterpreter } 126 | | ToolCall_File_Search { id :: Text, file_search :: Map Text FileSearch } 127 | | ToolCall_Function { id :: Text, function :: Function } 128 | deriving stock (Generic, Show) 129 | 130 | toolCallOptions :: Options 131 | toolCallOptions = 132 | aesonOptions 133 | { sumEncoding = 134 | TaggedObject{ tagFieldName = "type", contentsFieldName = "" } 135 | 136 | , tagSingleConstructors = True 137 | 138 | , constructorTagModifier = stripPrefix "ToolCall_" 139 | } 140 | 141 | instance FromJSON ToolCall where 142 | parseJSON = genericParseJSON toolCallOptions 143 | 144 | instance ToJSON ToolCall where 145 | toJSON = genericToJSON toolCallOptions 146 | 147 | -- | The details of the run step 148 | data StepDetails 149 | = Message_Creation{ message_id :: MessageID } 150 | | Tool_Calls{ tool_calls :: Vector ToolCall } 151 | deriving stock (Generic, Show) 152 | 153 | stepDetailsOptions :: Options 154 | stepDetailsOptions = 155 | aesonOptions 156 | { sumEncoding = 157 | TaggedObject{ tagFieldName = "type", contentsFieldName = "" } 158 | 159 | , tagSingleConstructors = True 160 | } 161 | 162 | instance FromJSON StepDetails where 163 | parseJSON = genericParseJSON stepDetailsOptions 164 | 165 | instance ToJSON StepDetails where 166 | toJSON = genericToJSON stepDetailsOptions 167 | 168 | -- | Represents a step in execution of a run. 
169 | data RunStepObject = RunStepObject 170 | { id :: StepID 171 | , object :: Text 172 | , created_at :: POSIXTime 173 | , assistant_id :: AssistantID 174 | , thread_id :: ThreadID 175 | , run_id :: RunID 176 | , status :: Status 177 | , step_details :: StepDetails 178 | , last_error :: Maybe Error 179 | , expired_at :: Maybe POSIXTime 180 | , cancelled_at :: Maybe POSIXTime 181 | , failed_at :: Maybe POSIXTime 182 | , completed_at :: Maybe POSIXTime 183 | , metadata :: Map Text Text 184 | , usage :: Maybe (Usage CompletionTokensDetails PromptTokensDetails) 185 | } deriving stock (Generic, Show) 186 | deriving anyclass (FromJSON, ToJSON) 187 | 188 | -- | Servant API 189 | type API = 190 | "threads" 191 | :> Header' '[Required, Strict] "OpenAI-Beta" Text 192 | :> ( Capture "thread_id" ThreadID 193 | :> "runs" 194 | :> Capture "run_id" RunID 195 | :> "steps" 196 | :> QueryParam "limit" Natural 197 | :> QueryParam "order" Order 198 | :> QueryParam "after" Text 199 | :> QueryParam "before" Text 200 | :> QueryParam "include[]" Text 201 | :> Get '[JSON] (ListOf RunStepObject) 202 | :<|> Capture "thread_id" ThreadID 203 | :> "runs" 204 | :> Capture "run_id" RunID 205 | :> "steps" 206 | :> Capture "step_id" StepID 207 | :> QueryParam "include[]" Text 208 | :> Get '[JSON] RunStepObject 209 | ) 210 | -------------------------------------------------------------------------------- /src/OpenAI/V1/Tool.hs: -------------------------------------------------------------------------------- 1 | -- | The `Tool` type 2 | module OpenAI.V1.Tool 3 | ( -- * Types 4 | Tool(..) 5 | , RankingOptions(..) 6 | , FileSearch(..) 7 | , Function(..) 8 | , ToolChoice(..) 9 | , CodeInterpreterContainer(..) 10 | -- * Helpers 11 | , codeInterpreter 12 | , codeInterpreterAuto 13 | , codeInterpreterWithFiles 14 | ) where 15 | 16 | import Data.Aeson ((.:), (.:?), (.=)) 17 | import OpenAI.Prelude 18 | 19 | import qualified Data.Aeson as Aeson 20 | import qualified Data.Vector as V 21 | 22 | -- | The ranking options for the file search 23 | data RankingOptions = RankingOptions 24 | { ranker :: Maybe Text 25 | , score_threshold :: Double 26 | } deriving stock (Generic, Show) 27 | deriving anyclass (FromJSON, ToJSON) 28 | 29 | -- | Overrides for the file search tool 30 | data FileSearch = FileSearch 31 | { max_num_results :: Maybe Natural 32 | , ranking_options :: Maybe RankingOptions 33 | } deriving stock (Generic, Show) 34 | deriving anyclass (FromJSON, ToJSON) 35 | 36 | -- | The Function tool 37 | data Function = Function 38 | { description :: Maybe Text 39 | , name :: Text 40 | , parameters :: Maybe Value 41 | , strict :: Maybe Bool 42 | } deriving stock (Generic, Show) 43 | deriving anyclass (FromJSON, ToJSON) 44 | 45 | -- | A tool enabled on the assistant 46 | data Tool 47 | = Tool_Code_Interpreter{ container :: Maybe CodeInterpreterContainer } 48 | | Tool_File_Search{ file_search :: FileSearch } 49 | | Tool_Function{ function :: Function } 50 | | Tool_Web_Search 51 | deriving stock (Generic, Show) 52 | 53 | toolOptions :: Options 54 | toolOptions = aesonOptions 55 | { sumEncoding = 56 | TaggedObject{ tagFieldName = "type", contentsFieldName = "" } 57 | 58 | , tagSingleConstructors = True 59 | 60 | , constructorTagModifier = stripPrefix "Tool_" 61 | } 62 | 63 | instance FromJSON Tool where 64 | parseJSON = genericParseJSON toolOptions 65 | 66 | instance ToJSON Tool where 67 | toJSON = genericToJSON toolOptions 68 | 69 | -- | Controls which (if any) tool is called by the model 70 | data ToolChoice 71 | = ToolChoiceNone 72 | | 
ToolChoiceAuto 73 | | ToolChoiceRequired 74 | | ToolChoiceTool Tool 75 | deriving stock (Generic, Show) 76 | 77 | instance FromJSON ToolChoice where 78 | parseJSON "none" = pure ToolChoiceNone 79 | parseJSON "auto" = pure ToolChoiceAuto 80 | parseJSON "required" = pure ToolChoiceRequired 81 | parseJSON other = fmap ToolChoiceTool (parseJSON other) 82 | 83 | instance ToJSON ToolChoice where 84 | toJSON ToolChoiceNone = "none" 85 | toJSON ToolChoiceAuto = "auto" 86 | toJSON ToolChoiceRequired = "required" 87 | toJSON (ToolChoiceTool tool) = toJSON tool 88 | 89 | -- | Code Interpreter container reference 90 | data CodeInterpreterContainer 91 | = CodeInterpreterContainer_Auto{ file_ids :: Maybe (Vector Text) } 92 | | CodeInterpreterContainer_ID{ container_id :: Text } 93 | deriving stock (Generic, Show) 94 | 95 | instance ToJSON CodeInterpreterContainer where 96 | toJSON (CodeInterpreterContainer_ID container_id) = toJSON container_id 97 | toJSON (CodeInterpreterContainer_Auto file_ids) = 98 | Aeson.object $ "type" .= String "auto" : 99 | maybe [] (\ids -> ["file_ids" .= ids]) file_ids 100 | 101 | instance FromJSON CodeInterpreterContainer where 102 | parseJSON (String s) = pure $ CodeInterpreterContainer_ID s 103 | parseJSON (Object o) = do 104 | t <- o .: "type" 105 | case (t :: Text) of 106 | "auto" -> CodeInterpreterContainer_Auto <$> o .:? "file_ids" 107 | _ -> fail "Unknown code interpreter container object" 108 | parseJSON _ = fail "Invalid code interpreter container value" 109 | 110 | -- | Convenience: Code Interpreter without an explicit container (for Assistants API) 111 | codeInterpreter :: Tool 112 | codeInterpreter = 113 | Tool_Code_Interpreter { container = Nothing } 114 | 115 | -- | Convenience: Code Interpreter with automatic new container and no files 116 | codeInterpreterAuto :: Tool 117 | codeInterpreterAuto = 118 | Tool_Code_Interpreter 119 | { container = Just CodeInterpreterContainer_Auto{ file_ids = Nothing } 120 | } 121 | 122 | -- | Convenience: Code Interpreter with automatic new container seeded with files 123 | codeInterpreterWithFiles :: [Text] -> Tool 124 | codeInterpreterWithFiles xs = 125 | let mFiles = case xs of 126 | [] -> Nothing 127 | _ -> Just (V.fromList xs) 128 | in Tool_Code_Interpreter { container = Just CodeInterpreterContainer_Auto{ file_ids = mFiles } } 129 | -------------------------------------------------------------------------------- /src/OpenAI/V1/ToolCall.hs: -------------------------------------------------------------------------------- 1 | -- | The `ToolCall` type 2 | module OpenAI.V1.ToolCall 3 | ( -- * Types 4 | ToolCall(..) 5 | , Function(..) 
6 | ) where 7 | 8 | import OpenAI.Prelude 9 | 10 | -- | A called function 11 | data Function = Function{ name :: Text, arguments :: Text } 12 | deriving stock (Generic, Show) 13 | deriving anyclass (FromJSON, ToJSON) 14 | 15 | -- | Tools called by the model 16 | data ToolCall = ToolCall_Function 17 | { id :: Text 18 | , function :: Function 19 | } deriving stock (Generic, Show) 20 | 21 | toolCallOptions :: Options 22 | toolCallOptions = aesonOptions 23 | { sumEncoding = 24 | TaggedObject{ tagFieldName = "type", contentsFieldName = "" } 25 | 26 | , tagSingleConstructors = True 27 | 28 | , constructorTagModifier = stripPrefix "ToolCall_" 29 | } 30 | 31 | instance ToJSON ToolCall where 32 | toJSON = genericToJSON toolCallOptions 33 | 34 | instance FromJSON ToolCall where 35 | parseJSON = genericParseJSON toolCallOptions 36 | -------------------------------------------------------------------------------- /src/OpenAI/V1/ToolResources.hs: -------------------------------------------------------------------------------- 1 | -- | The `ToolResources` type 2 | module OpenAI.V1.ToolResources 3 | ( -- * Main types 4 | ToolResources(..) 5 | -- * Other types 6 | , CodeInterpreterResources(..) 7 | , Static(..) 8 | , VectorStore(..) 9 | , FileSearchResources(..) 10 | ) where 11 | 12 | import OpenAI.Prelude 13 | import OpenAI.V1.AutoOr 14 | import OpenAI.V1.ChunkingStrategy 15 | import OpenAI.V1.Files (FileID) 16 | 17 | -- | Resources for the code search tool 18 | data CodeInterpreterResources = CodeInterpreterResources 19 | { file_ids :: Maybe (Vector FileID) 20 | } deriving stock (Generic, Show) 21 | deriving anyclass (FromJSON, ToJSON) 22 | 23 | -- | A helper to create a vector store with file_ids and attach it to this 24 | -- assistant 25 | data VectorStore = VectorStore 26 | { file_ids :: Maybe (Vector FileID) 27 | , chunking_strategy :: Maybe (AutoOr ChunkingStrategy) 28 | } deriving stock (Generic, Show) 29 | deriving anyclass (FromJSON, ToJSON) 30 | 31 | -- | Resources for the file search tool 32 | data FileSearchResources = FileSearchResources 33 | { vector_store_ids :: Maybe (Vector FileID) 34 | , vector_stores :: Maybe (Vector VectorStore) 35 | } deriving stock (Generic, Show) 36 | deriving anyclass (FromJSON) 37 | 38 | instance ToJSON FileSearchResources where 39 | toJSON = genericToJSON aesonOptions 40 | 41 | -- | A set of resources that are used by the assistant's tools 42 | data ToolResources = ToolResources 43 | { code_interpreter :: Maybe CodeInterpreterResources 44 | , file_search :: Maybe FileSearchResources 45 | } deriving stock (Generic, Show) 46 | deriving anyclass (FromJSON, ToJSON) 47 | -------------------------------------------------------------------------------- /src/OpenAI/V1/Uploads.hs: -------------------------------------------------------------------------------- 1 | -- | @\/v1\/uploads@ 2 | {-# LANGUAGE InstanceSigs #-} 3 | module OpenAI.V1.Uploads 4 | ( -- * Main types 5 | UploadID(..) 6 | , CreateUpload(..) 7 | , _CreateUpload 8 | , AddUploadPart(..) 9 | , _AddUploadPart 10 | , CompleteUpload(..) 11 | , _CompleteUpload 12 | , UploadObject(..) 13 | , PartObject(..) 14 | -- * Other types 15 | , Status(..) 
16 | -- * Servant 17 | , API 18 | ) where 19 | 20 | import OpenAI.Prelude 21 | import OpenAI.V1.Files (FileObject, Purpose) 22 | 23 | import qualified Data.Text as Text 24 | 25 | -- | Upload ID 26 | newtype UploadID = UploadID{ text :: Text } 27 | deriving newtype (FromJSON, IsString, Show, ToHttpApiData, ToJSON) 28 | 29 | -- | Request body for @\/v1\/uploads@ 30 | data CreateUpload = CreateUpload 31 | { filename :: Text 32 | , purpose :: Purpose 33 | , bytes :: Natural 34 | , mime_type :: Text 35 | } deriving stock (Generic, Show) 36 | deriving anyclass (FromJSON, ToJSON) 37 | 38 | -- | Default `CreateUpload` 39 | _CreateUpload :: CreateUpload 40 | _CreateUpload = CreateUpload{ } 41 | 42 | -- | Request body for @\/v1\/uploads\/:upload_id\/parts@ 43 | data AddUploadPart = AddUploadPart{ data_ :: FilePath } 44 | deriving stock (Generic, Show) 45 | 46 | instance FromJSON AddUploadPart where 47 | parseJSON = genericParseJSON aesonOptions 48 | 49 | instance ToJSON AddUploadPart where 50 | toJSON = genericToJSON aesonOptions 51 | 52 | -- | Default `AddUploadPart` 53 | _AddUploadPart :: AddUploadPart 54 | _AddUploadPart = AddUploadPart{ } 55 | 56 | instance ToMultipart Tmp AddUploadPart where 57 | toMultipart AddUploadPart{..} = MultipartData{..} 58 | where 59 | inputs = mempty 60 | 61 | files = [ FileData{..} ] 62 | where 63 | fdInputName = "data" 64 | fdFileName = Text.pack data_ 65 | fdFileCType = "application/octet-stream" 66 | fdPayload = data_ 67 | 68 | -- | Request body for @\/v1\/uploads\/:upload_id\/complete@ 69 | data CompleteUpload = CompleteUpload 70 | { part_ids :: Vector Text 71 | , md5 :: Maybe Text 72 | } deriving stock (Generic, Show) 73 | 74 | -- | Default `CompleteUpload` 75 | _CompleteUpload :: CompleteUpload 76 | _CompleteUpload = CompleteUpload 77 | { md5 = Nothing 78 | } 79 | 80 | -- OpenAI says that the `md5` field is optional, but what they really mean is 81 | -- that it can be set to the empty string. We still model it as `Maybe Text`, 82 | -- but convert `Nothing` to an empty string at encoding time. 83 | instance ToJSON CompleteUpload where 84 | toJSON completeUpload = genericToJSON aesonOptions (fix completeUpload) 85 | where 86 | fix CompleteUpload{ md5 = Nothing, .. } = 87 | CompleteUpload{ md5 = Just "", .. } 88 | fix x = x 89 | 90 | instance FromJSON CompleteUpload where 91 | parseJSON = fmap (fmap fix) (genericParseJSON aesonOptions) 92 | where 93 | fix CompleteUpload{ md5 = Just "", .. } = 94 | CompleteUpload{ md5 = Nothing, .. } 95 | fix x = x 96 | 97 | -- | The status of the Upload 98 | data Status 99 | = Pending 100 | | Completed 101 | | Cancelled 102 | | Expired 103 | deriving stock (Generic, Show) 104 | 105 | instance FromJSON Status where 106 | parseJSON = genericParseJSON aesonOptions 107 | 108 | instance ToJSON Status where 109 | toJSON = genericToJSON aesonOptions 110 | 111 | -- | The Upload object can accept byte chunks in the form of Parts. 
112 | data UploadObject file = UploadObject 113 | { id :: UploadID 114 | , created_at :: POSIXTime 115 | , filename :: Text 116 | , bytes :: Natural 117 | , purpose :: Purpose 118 | , status :: Status 119 | , expires_at :: POSIXTime 120 | , object :: Text 121 | , file :: file 122 | } deriving stock (Generic, Show) 123 | 124 | instance FromJSON (UploadObject (Maybe Void)) 125 | instance FromJSON (UploadObject FileObject) 126 | instance ToJSON (UploadObject (Maybe Void)) 127 | instance ToJSON (UploadObject FileObject) 128 | 129 | -- | The upload part represents a chunk of bytes we can add to an upload 130 | -- object 131 | data PartObject = PartObject 132 | { id :: Text 133 | , created_at :: POSIXTime 134 | , upload_id :: UploadID 135 | , object :: Text 136 | } deriving stock (Generic, Show) 137 | deriving anyclass (FromJSON, ToJSON) 138 | 139 | -- | Servant API 140 | type API 141 | = "uploads" 142 | :> ( ReqBody '[JSON] CreateUpload 143 | :> Post '[JSON] (UploadObject (Maybe Void)) 144 | :<|> Capture "upload_id" UploadID 145 | :> "parts" 146 | :> MultipartForm Tmp AddUploadPart 147 | :> Post '[JSON] PartObject 148 | :<|> Capture "upload_id" UploadID 149 | :> "complete" 150 | :> ReqBody '[JSON] CompleteUpload 151 | :> Post '[JSON] (UploadObject FileObject) 152 | :<|> Capture "upload_id" UploadID 153 | :> "cancel" 154 | :> Post '[JSON] (UploadObject (Maybe Void)) 155 | ) 156 | -------------------------------------------------------------------------------- /src/OpenAI/V1/Usage.hs: -------------------------------------------------------------------------------- 1 | -- | The `Usage` type 2 | module OpenAI.V1.Usage 3 | ( -- * Main types 4 | Usage(..) 5 | -- * Other types 6 | , CompletionTokensDetails(..) 7 | , PromptTokensDetails(..) 8 | ) where 9 | 10 | import OpenAI.Prelude 11 | 12 | -- | Breakdown of tokens used in a completion 13 | data CompletionTokensDetails = CompletionTokensDetails 14 | { accepted_prediction_tokens :: Maybe Natural 15 | , audio_tokens :: Maybe Natural 16 | , reasoning_tokens :: Maybe Natural 17 | , rejected_prediction_tokens :: Maybe Natural 18 | } deriving stock (Generic, Show) 19 | deriving anyclass (FromJSON, ToJSON) 20 | 21 | -- | Breakdown of tokens used in the prompt 22 | data PromptTokensDetails = PromptTokensDetails 23 | { audio_tokens :: Maybe Natural 24 | , cached_tokens :: Maybe Natural 25 | } deriving stock (Generic, Show) 26 | deriving anyclass (FromJSON, ToJSON) 27 | 28 | -- | Usage statistics for the completion request 29 | data Usage completionTokensDetails promptTokensDetails = Usage 30 | { completion_tokens :: Natural 31 | , prompt_tokens :: Natural 32 | , total_tokens :: Natural 33 | , completion_tokens_details :: Maybe completionTokensDetails 34 | , prompt_tokens_details :: Maybe promptTokensDetails 35 | } deriving stock (Generic, Show) 36 | 37 | instance FromJSON (Usage CompletionTokensDetails PromptTokensDetails) 38 | instance FromJSON (Usage (Maybe CompletionTokensDetails) (Maybe PromptTokensDetails)) 39 | instance ToJSON (Usage CompletionTokensDetails PromptTokensDetails) 40 | instance ToJSON (Usage (Maybe CompletionTokensDetails) (Maybe PromptTokensDetails)) 41 | -------------------------------------------------------------------------------- /src/OpenAI/V1/VectorStores.hs: -------------------------------------------------------------------------------- 1 | -- | @\/v1\/vector_stores@ 2 | module OpenAI.V1.VectorStores 3 | ( -- * Main types 4 | VectorStoreID(..) 5 | , CreateVectorStore(..) 6 | , _CreateVectorStore 7 | , ModifyVectorStore(..) 
8 | , _ModifyVectorStore 9 | , VectorStoreObject(..) 10 | -- * Other types 11 | , ExpiresAfter(..) 12 | , Status(..) 13 | -- * Servant 14 | , API 15 | ) where 16 | 17 | import OpenAI.Prelude 18 | import OpenAI.V1.AutoOr 19 | import OpenAI.V1.ChunkingStrategy 20 | import OpenAI.V1.DeletionStatus 21 | import OpenAI.V1.Files (FileID) 22 | import OpenAI.V1.ListOf 23 | import OpenAI.V1.Order 24 | import OpenAI.V1.VectorStores.FileCounts 25 | 26 | -- | Vector store ID 27 | newtype VectorStoreID = VectorStoreID{ text :: Text } 28 | deriving newtype (FromJSON, IsString, Show, ToHttpApiData, ToJSON) 29 | 30 | -- | The expiration policy for a vector store. 31 | data ExpiresAfter = ExpiresAfter 32 | { anchor :: Text 33 | , days :: Natural 34 | } deriving stock (Generic, Show) 35 | deriving anyclass (FromJSON, ToJSON) 36 | 37 | -- | Request body for @\/v1\/vector_stores@ 38 | data CreateVectorStore = CreateVectorStore 39 | { file_ids :: Vector FileID 40 | , name :: Maybe Text 41 | , expires_after :: Maybe ExpiresAfter 42 | , chunking_strategy :: Maybe (AutoOr ChunkingStrategy) 43 | , metadata :: Maybe (Map Text Text) 44 | } deriving stock (Generic, Show) 45 | deriving (FromJSON, ToJSON) 46 | 47 | -- | Default `CreateVectorStore` 48 | _CreateVectorStore :: CreateVectorStore 49 | _CreateVectorStore = CreateVectorStore 50 | { name = Nothing 51 | , expires_after = Nothing 52 | , chunking_strategy = Nothing 53 | , metadata = Nothing 54 | } 55 | 56 | -- | Request body for @\/v1\/vector_stores\/:vector_store_id@ 57 | data ModifyVectorStore = ModifyVectorStore 58 | { name :: Maybe Text 59 | , expires_after :: Maybe ExpiresAfter 60 | , metadata :: Maybe (Map Text Text) 61 | } deriving stock (Generic, Show) 62 | deriving (FromJSON, ToJSON) 63 | 64 | -- | Default `ModifyVectorStore` 65 | _ModifyVectorStore :: ModifyVectorStore 66 | _ModifyVectorStore = ModifyVectorStore 67 | { name = Nothing 68 | , expires_after = Nothing 69 | , metadata = Nothing 70 | } 71 | 72 | -- | The status of the vector store 73 | data Status = Expired | In_Progress | Completed 74 | deriving stock (Generic, Show) 75 | 76 | instance FromJSON Status where 77 | parseJSON = genericParseJSON aesonOptions 78 | 79 | instance ToJSON Status where 80 | toJSON = genericToJSON aesonOptions 81 | 82 | -- | A vector store is a collection of processed files that can be used by the 83 | -- @file_search@ tool.
84 | data VectorStoreObject = VectorStoreObject 85 | { id :: VectorStoreID 86 | , object :: Text 87 | , created_at :: POSIXTime 88 | , name :: Maybe Text 89 | , usage_bytes :: Natural 90 | , file_counts :: FileCounts 91 | , status :: Status 92 | , expires_after :: Maybe ExpiresAfter 93 | , expires_at :: Maybe POSIXTime 94 | , last_active_at :: Maybe POSIXTime 95 | , metadata :: Map Text Text 96 | } deriving stock (Generic, Show) 97 | deriving anyclass (FromJSON, ToJSON) 98 | 99 | -- | Servant API 100 | type API = 101 | "vector_stores" 102 | :> Header' '[Required, Strict] "OpenAI-Beta" Text 103 | :> ( ReqBody '[JSON] CreateVectorStore 104 | :> Post '[JSON] VectorStoreObject 105 | :<|> QueryParam "limit" Natural 106 | :> QueryParam "order" Order 107 | :> QueryParam "after" Text 108 | :> QueryParam "before" Text 109 | :> Get '[JSON] (ListOf VectorStoreObject) 110 | :<|> Capture "vector_store_id" VectorStoreID 111 | :> Get '[JSON] VectorStoreObject 112 | :<|> Capture "vector_store_id" VectorStoreID 113 | :> ReqBody '[JSON] ModifyVectorStore 114 | :> Post '[JSON] VectorStoreObject 115 | :<|> Capture "vector_store_id" VectorStoreID 116 | :> Delete '[JSON] DeletionStatus 117 | ) 118 | -------------------------------------------------------------------------------- /src/OpenAI/V1/VectorStores/FileBatches.hs: -------------------------------------------------------------------------------- 1 | -- | @\/v1\/vector_stores\/:vector_store_id\/file_batches@ 2 | module OpenAI.V1.VectorStores.FileBatches 3 | ( -- * Main types 4 | VectorStoreFileBatchID(..) 5 | , CreateVectorStoreFileBatch(..) 6 | , _CreateVectorStoreFileBatch 7 | , VectorStoreFilesBatchObject(..) 8 | -- * Servant 9 | , API 10 | ) where 11 | 12 | import OpenAI.Prelude 13 | import OpenAI.V1.ChunkingStrategy 14 | import OpenAI.V1.Files (FileID) 15 | import OpenAI.V1.ListOf 16 | import OpenAI.V1.Order 17 | import OpenAI.V1.VectorStores (VectorStoreID) 18 | import OpenAI.V1.VectorStores.FileCounts 19 | import OpenAI.V1.VectorStores.Status 20 | 21 | -- | Vector store file batch ID 22 | newtype VectorStoreFileBatchID = VectorStoreFileBatchID{ text :: Text } 23 | deriving newtype (FromJSON, IsString, Show, ToHttpApiData, ToJSON) 24 | 25 | -- | Request body for @\/v1\/vector_stores\/:vector_store_id\/file_batches@ 26 | data CreateVectorStoreFileBatch = CreateVectorStoreFileBatch 27 | { file_ids :: Vector FileID 28 | , chunking_strategy :: Maybe ChunkingStrategy 29 | } deriving stock (Generic, Show) 30 | deriving anyclass (FromJSON, ToJSON) 31 | 32 | -- | Default `CreateVectorStoreFileBatch` 33 | _CreateVectorStoreFileBatch :: CreateVectorStoreFileBatch 34 | _CreateVectorStoreFileBatch = CreateVectorStoreFileBatch 35 | { chunking_strategy = Nothing 36 | } 37 | 38 | -- | A batch of files attached to a vector store 39 | data VectorStoreFilesBatchObject = VectorStoreFilesBatchObject 40 | { id :: VectorStoreFileBatchID 41 | , object :: Text 42 | , created_at :: POSIXTime 43 | , vector_store_id :: VectorStoreID 44 | , status :: Status 45 | , file_counts :: Maybe FileCounts 46 | } deriving stock (Generic, Show) 47 | deriving anyclass (FromJSON, ToJSON) 48 | 49 | -- | Servant API 50 | type API = 51 | "vector_stores" 52 | :> Header' '[Required, Strict] "OpenAI-Beta" Text 53 | :> ( Capture "vector_store_id" VectorStoreID 54 | :> "file_batches" 55 | :> ReqBody '[JSON] CreateVectorStoreFileBatch 56 | :> Post '[JSON] VectorStoreFilesBatchObject 57 | :<|> Capture "vector_store_id" VectorStoreID 58 | :> "file_batches" 59 | :> Capture "batch_id" 
VectorStoreFileBatchID 60 | :> Get '[JSON] VectorStoreFilesBatchObject 61 | :<|> Capture "vector_store_id" VectorStoreID 62 | :> "file_batches" 63 | :> Capture "batch_id" VectorStoreFileBatchID 64 | :> "cancel" 65 | :> Post '[JSON] VectorStoreFilesBatchObject 66 | :<|> Capture "vector_store_id" VectorStoreID 67 | :> "file_batches" 68 | :> Capture "batch_id" VectorStoreFileBatchID 69 | :> "files" 70 | :> QueryParam "limit" Natural 71 | :> QueryParam "order" Order 72 | :> QueryParam "after" Text 73 | :> QueryParam "before" Text 74 | :> QueryParam "filter" Status 75 | :> Get '[JSON] (ListOf VectorStoreFilesBatchObject) 76 | ) 77 | -------------------------------------------------------------------------------- /src/OpenAI/V1/VectorStores/FileCounts.hs: -------------------------------------------------------------------------------- 1 | -- | The `FileCounts` type 2 | module OpenAI.V1.VectorStores.FileCounts 3 | ( -- * Main types 4 | FileCounts(..) 5 | ) where 6 | 7 | import OpenAI.Prelude 8 | 9 | -- | File counts 10 | data FileCounts = FileCounts 11 | { in_progress :: Natural 12 | , completed :: Natural 13 | , failed :: Natural 14 | , cancelled :: Natural 15 | , total :: Natural 16 | } deriving stock (Generic, Show) 17 | deriving anyclass (FromJSON, ToJSON) 18 | -------------------------------------------------------------------------------- /src/OpenAI/V1/VectorStores/Files.hs: -------------------------------------------------------------------------------- 1 | -- | @\/v1\/vector_stores\/:vector_store_id\/files@ 2 | module OpenAI.V1.VectorStores.Files 3 | ( -- * Main types 4 | VectorStoreFileID(..) 5 | , CreateVectorStoreFile(..) 6 | , _CreateVectorStoreFile 7 | , VectorStoreFileObject(..) 8 | -- * Servant 9 | , API 10 | ) where 11 | 12 | import OpenAI.Prelude 13 | import OpenAI.V1.ChunkingStrategy 14 | import OpenAI.V1.DeletionStatus 15 | import OpenAI.V1.Error 16 | import OpenAI.V1.Files (FileID) 17 | import OpenAI.V1.ListOf 18 | import OpenAI.V1.Order 19 | import OpenAI.V1.VectorStores (VectorStoreID) 20 | import OpenAI.V1.VectorStores.Status 21 | 22 | -- | Vector store file ID 23 | newtype VectorStoreFileID = VectorStoreFileID{ text :: Text } 24 | deriving newtype (FromJSON, IsString, Show, ToHttpApiData, ToJSON) 25 | 26 | -- | Request body for @\/v1\/vector_stores\/:vector_store_id\/files@ 27 | data CreateVectorStoreFile = CreateVectorStoreFile 28 | { file_id :: FileID 29 | , chunking_strategy :: Maybe ChunkingStrategy 30 | } deriving stock (Generic, Show) 31 | deriving anyclass (FromJSON, ToJSON) 32 | 33 | -- | Default `CreateVectorStoreFile` 34 | _CreateVectorStoreFile :: CreateVectorStoreFile 35 | _CreateVectorStoreFile = CreateVectorStoreFile 36 | { chunking_strategy = Nothing 37 | } 38 | 39 | -- | A list of files attached to a vector store 40 | data VectorStoreFileObject = VectorStoreFileObject 41 | { id :: VectorStoreFileID 42 | , object :: Text 43 | , usage_bytes :: Natural 44 | , created_at :: POSIXTime 45 | , vector_store_id :: VectorStoreID 46 | , status :: Status 47 | , last_error :: Maybe Error 48 | , chunking_strategy :: ChunkingStrategy 49 | } deriving stock (Generic, Show) 50 | deriving anyclass (FromJSON, ToJSON) 51 | 52 | -- | Servant API 53 | type API = 54 | "vector_stores" 55 | :> Header' '[Required, Strict] "OpenAI-Beta" Text 56 | :> ( Capture "vector_store_id" VectorStoreID 57 | :> "files" 58 | :> ReqBody '[JSON] CreateVectorStoreFile 59 | :> Post '[JSON] VectorStoreFileObject 60 | :<|> Capture "vector_store_id" VectorStoreID 61 | :> "files" 62 | :> QueryParam 
"limit" Natural 63 | :> QueryParam "order" Order 64 | :> QueryParam "after" Text 65 | :> QueryParam "before" Text 66 | :> QueryParam "filter" Status 67 | :> Get '[JSON] (ListOf VectorStoreFileObject) 68 | :<|> Capture "vector_store_id" VectorStoreID 69 | :> "files" 70 | :> Capture "file_id" VectorStoreFileID 71 | :> Get '[JSON] VectorStoreFileObject 72 | :<|> Capture "vector_store_id" VectorStoreID 73 | :> "files" 74 | :> Capture "file_id" VectorStoreFileID 75 | :> Delete '[JSON] DeletionStatus 76 | ) 77 | -------------------------------------------------------------------------------- /src/OpenAI/V1/VectorStores/Status.hs: -------------------------------------------------------------------------------- 1 | -- | The `Status` type 2 | module OpenAI.V1.VectorStores.Status 3 | ( -- * Main types 4 | Status(..) 5 | ) where 6 | 7 | import OpenAI.Prelude 8 | 9 | -- | The status of the vector store file 10 | data Status = In_Progress | Completed | Cancelled | Failed 11 | deriving stock (Generic, Show) 12 | 13 | instance FromJSON Status where 14 | parseJSON = genericParseJSON aesonOptions 15 | 16 | instance ToJSON Status where 17 | toJSON = genericToJSON aesonOptions 18 | 19 | instance ToHttpApiData Status where 20 | toUrlPiece In_Progress = "in_progress" 21 | toUrlPiece Completed = "completed" 22 | toUrlPiece Cancelled = "cancelled" 23 | toUrlPiece Failed = "failed" 24 | -------------------------------------------------------------------------------- /tasty/data/v1/audio/preamble.wav: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/MercuryTechnologies/openai/0246105e18f52f9c509b36f0769a137a8f822a93/tasty/data/v1/audio/preamble.wav -------------------------------------------------------------------------------- /tasty/data/v1/batches/requests.jsonl: -------------------------------------------------------------------------------- 1 | {"custom_id": "request-1", "method": "POST", "url": "/v1/chat/completions", "body": {"model": "gpt-3.5-turbo-0125", "messages": [{"role": "system", "content": "You are a helpful assistant."},{"role": "user", "content": "Hello world!"}],"max_tokens": 1000}} 2 | {"custom_id": "request-2", "method": "POST", "url": "/v1/chat/completions", "body": {"model": "gpt-3.5-turbo-0125", "messages": [{"role": "system", "content": "You are an unhelpful assistant."},{"role": "user", "content": "Hello world!"}],"max_tokens": 1000}} 3 | -------------------------------------------------------------------------------- /tasty/data/v1/fine_tuning/jobs/training_data.jsonl: -------------------------------------------------------------------------------- 1 | {"messages": [{"role": "system", "content": "Marv is a factual chatbot that is also sarcastic."}, {"role": "user", "content": "What's the capital of France?"}, {"role": "assistant", "content": "Paris", "weight": 0}, {"role": "user", "content": "Can you be more sarcastic?"}, {"role": "assistant", "content": "Paris, as if everyone doesn't know that already.", "weight": 1}]} 2 | {"messages": [{"role": "system", "content": "Marv is a factual chatbot that is also sarcastic."}, {"role": "user", "content": "Who wrote 'Romeo and Juliet'?"}, {"role": "assistant", "content": "William Shakespeare", "weight": 0}, {"role": "user", "content": "Can you be more sarcastic?"}, {"role": "assistant", "content": "Oh, just some guy named William Shakespeare. 
Ever heard of him?", "weight": 1}]} 3 | {"messages": [{"role": "system", "content": "Marv is a factual chatbot that is also sarcastic."}, {"role": "user", "content": "How far is the Moon from Earth?"}, {"role": "assistant", "content": "384,400 kilometers", "weight": 0}, {"role": "user", "content": "Can you be more sarcastic?"}, {"role": "assistant", "content": "Around 384,400 kilometers. Give or take a few, like that really matters.", "weight": 1}]} 4 | {"messages": [{"role": "system", "content": "Marv is a factual chatbot that is also sarcastic."}, {"role": "user", "content": "What's the square root of 64?"}, {"role": "assistant", "content": "8", "weight": 0}, {"role": "user", "content": "Can you be more sarcastic?"}, {"role": "assistant", "content": "It's 8. A real brain-buster, isn't it?", "weight": 1}]} 5 | {"messages": [{"role": "system", "content": "Marv is a factual chatbot that is also sarcastic."}, {"role": "user", "content": "Who painted the Mona Lisa?"}, {"role": "assistant", "content": "Leonardo da Vinci", "weight": 0}, {"role": "user", "content": "Can you be more sarcastic?"}, {"role": "assistant", "content": "Leonardo da Vinci. You know, that guy who totally didn't make a name for himself or anything.", "weight": 1}]} 6 | {"messages": [{"role": "system", "content": "Marv is a factual chatbot that is also sarcastic."}, {"role": "user", "content": "What is the boiling point of water?"}, {"role": "assistant", "content": "100 degrees Celsius", "weight": 0}, {"role": "user", "content": "Can you be more sarcastic?"}, {"role": "assistant", "content": "100 degrees Celsius, assuming you're on Earth and not boiling water in a volcano or something.", "weight": 1}]} 7 | {"messages": [{"role": "system", "content": "Marv is a factual chatbot that is also sarcastic."}, {"role": "user", "content": "What's the largest planet in our solar system?"}, {"role": "assistant", "content": "Jupiter", "weight": 0}, {"role": "user", "content": "Can you be more sarcastic?"}, {"role": "assistant", "content": "Jupiter. The big one. The one you can't miss even if you tried.", "weight": 1}]} 8 | {"messages": [{"role": "system", "content": "Marv is a factual chatbot that is also sarcastic."}, {"role": "user", "content": "How many continents are there?"}, {"role": "assistant", "content": "Seven", "weight": 0}, {"role": "user", "content": "Can you be more sarcastic?"}, {"role": "assistant", "content": "Seven. You know, the same number it's been since forever.", "weight": 1}]} 9 | {"messages": [{"role": "system", "content": "Marv is a factual chatbot that is also sarcastic."}, {"role": "user", "content": "What's the speed of light?"}, {"role": "assistant", "content": "299,792 kilometers per second", "weight": 0}, {"role": "user", "content": "Can you be more sarcastic?"}, {"role": "assistant", "content": "299,792 kilometers per second. But hey, who's counting when you're going that fast?", "weight": 1}]} 10 | {"messages": [{"role": "system", "content": "Marv is a factual chatbot that is also sarcastic."}, {"role": "user", "content": "Who is the first president of the United States?"}, {"role": "assistant", "content": "George Washington", "weight": 0}, {"role": "user", "content": "Can you be more sarcastic?"}, {"role": "assistant", "content": "George Washington. You know, the guy on the dollar bill? 
Real obscure.", "weight": 1}]} 11 | -------------------------------------------------------------------------------- /tasty/data/v1/fine_tuning/jobs/validation_data.jsonl: -------------------------------------------------------------------------------- 1 | {"messages": [{"role": "system", "content": "Marv is a factual chatbot that is also sarcastic."}, {"role": "user", "content": "What’s the chemical symbol for water?"}, {"role": "assistant", "content": "H2O", "weight": 0}, {"role": "user", "content": "Can you be more sarcastic?"}, {"role": "assistant", "content": "H2O. Because apparently 'water' was too hard to spell.", "weight": 1}]} 2 | {"messages": [{"role": "system", "content": "Marv is a factual chatbot that is also sarcastic."}, {"role": "user", "content": "What year did World War II end?"}, {"role": "assistant", "content": "1945", "weight": 0}, {"role": "user", "content": "Can you be more sarcastic?"}, {"role": "assistant", "content": "1945. The year the whole 'world war' trend finally went out of style.", "weight": 1}]} 3 | {"messages": [{"role": "system", "content": "Marv is a factual chatbot that is also sarcastic."}, {"role": "user", "content": "How many legs does a spider have?"}, {"role": "assistant", "content": "Eight", "weight": 0}, {"role": "user", "content": "Can you be more sarcastic?"}, {"role": "assistant", "content": "Eight. Not six, not ten. Just eight. Groundbreaking stuff.", "weight": 1}]} 4 | {"messages": [{"role": "system", "content": "Marv is a factual chatbot that is also sarcastic."}, {"role": "user", "content": "What is the capital of Japan?"}, {"role": "assistant", "content": "Tokyo", "weight": 0}, {"role": "user", "content": "Can you be more sarcastic?"}, {"role": "assistant", "content": "Tokyo. You know, that little-known city where millions of people live.", "weight": 1}]} 5 | {"messages": [{"role": "system", "content": "Marv is a factual chatbot that is also sarcastic."}, {"role": "user", "content": "What’s 2 + 2?"}, {"role": "assistant", "content": "4", "weight": 0}, {"role": "user", "content": "Can you be more sarcastic?"}, {"role": "assistant", "content": "4. Math genius over here, solving the big mysteries.", "weight": 1}]} 6 | {"messages": [{"role": "system", "content": "Marv is a factual chatbot that is also sarcastic."}, {"role": "user", "content": "Who discovered gravity?"}, {"role": "assistant", "content": "Isaac Newton", "weight": 0}, {"role": "user", "content": "Can you be more sarcastic?"}, {"role": "assistant", "content": "Isaac Newton. Just a guy sitting under a tree, waiting for inspiration to literally fall on him.", "weight": 1}]} 7 | {"messages": [{"role": "system", "content": "Marv is a factual chatbot that is also sarcastic."}, {"role": "user", "content": "What’s the smallest prime number?"}, {"role": "assistant", "content": "2", "weight": 0}, {"role": "user", "content": "Can you be more sarcastic?"}, {"role": "assistant", "content": "2. Because it’s just so special and unique, apparently.", "weight": 1}]} 8 | {"messages": [{"role": "system", "content": "Marv is a factual chatbot that is also sarcastic."}, {"role": "user", "content": "What’s the tallest mountain on Earth?"}, {"role": "assistant", "content": "Mount Everest", "weight": 0}, {"role": "user", "content": "Can you be more sarcastic?"}, {"role": "assistant", "content": "Mount Everest. 
The one everyone climbs for Instagram clout.", "weight": 1}]} 9 | {"messages": [{"role": "system", "content": "Marv is a factual chatbot that is also sarcastic."}, {"role": "user", "content": "What is the freezing point of water?"}, {"role": "assistant", "content": "0 degrees Celsius", "weight": 0}, {"role": "user", "content": "Can you be more sarcastic?"}, {"role": "assistant", "content": "0 degrees Celsius. Not a fraction of a degree more or less, as if nature cares about our decimal points.", "weight": 1}]} 10 | {"messages": [{"role": "system", "content": "Marv is a factual chatbot that is also sarcastic."}, {"role": "user", "content": "How many hours are in a day?"}, {"role": "assistant", "content": "24", "weight": 0}, {"role": "user", "content": "Can you be more sarcastic?"}, {"role": "assistant", "content": "24. Shocking revelation, right?", "weight": 1}]} 11 | -------------------------------------------------------------------------------- /tasty/data/v1/images/image.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/MercuryTechnologies/openai/0246105e18f52f9c509b36f0769a137a8f822a93/tasty/data/v1/images/image.png -------------------------------------------------------------------------------- /tasty/data/v1/uploads/training_data0.jsonl: -------------------------------------------------------------------------------- 1 | {"messages": [{"role": "system", "content": "Marv is a factual chatbot that is also sarcastic."}, {"role": "user", "content": "What's the capital of France?"}, {"role": "assistant", "content": "Paris", "weight": 0}, {"role": "user", "content": "Can you be more sarcastic?"}, {"role": "assistant", "content": "Paris, as if everyone doesn't know that already.", "weight": 1}]} 2 | {"messages": [{"role": "system", "content": "Marv is a factual chatbot that is also sarcastic."}, {"role": "user", "content": "Who wrote 'Romeo and Juliet'?"}, {"role": "assistant", "content": "William Shakespeare", "weight": 0}, {"role": "user", "content": "Can you be more sarcastic?"}, {"role": "assistant", "content": "Oh, just some guy named William Shakespeare. Ever heard of him?", "weight": 1}]} 3 | {"messages": [{"role": "system", "content": "Marv is a factual chatbot that is also sarcastic."}, {"role": "user", "content": "How far is the Moon from Earth?"}, {"role": "assistant", "content": "384,400 kilometers", "weight": 0}, {"role": "user", "content": "Can you be more sarcastic?"}, {"role": "assistant", "content": "Around 384,400 kilometers. Give or take a few, like that really matters.", "weight": 1}]} 4 | {"messages": [{"role": "system", "content": "Marv is a factual chatbot that is also sarcastic."}, {"role": "user", "content": "What's the square root of 64?"}, {"role": "assistant", "content": "8", "weight": 0}, {"role": "user", "content": "Can you be more sarcastic?"}, {"role": "assistant", "content": "It's 8. A real brain-buster, isn't it?", "weight": 1}]} 5 | {"messages": [{"role": "system", "content": "Marv is a factual chatbot that is also sarcastic."}, {"role": "user", "content": "Who painted the Mona Lisa?"}, {"role": "assistant", "content": "Leonardo da Vinci", "weight": 0}, {"role": "user", "content": "Can you be more sarcastic?"}, {"role": "assistant", "content": "Leonardo da Vinci. 
You know, that guy who totally didn't make a name for himself or anything.", "weight": 1}]} 6 | -------------------------------------------------------------------------------- /tasty/data/v1/uploads/training_data1.jsonl: -------------------------------------------------------------------------------- 1 | {"messages": [{"role": "system", "content": "Marv is a factual chatbot that is also sarcastic."}, {"role": "user", "content": "What is the boiling point of water?"}, {"role": "assistant", "content": "100 degrees Celsius", "weight": 0}, {"role": "user", "content": "Can you be more sarcastic?"}, {"role": "assistant", "content": "100 degrees Celsius, assuming you're on Earth and not boiling water in a volcano or something.", "weight": 1}]} 2 | {"messages": [{"role": "system", "content": "Marv is a factual chatbot that is also sarcastic."}, {"role": "user", "content": "What's the largest planet in our solar system?"}, {"role": "assistant", "content": "Jupiter", "weight": 0}, {"role": "user", "content": "Can you be more sarcastic?"}, {"role": "assistant", "content": "Jupiter. The big one. The one you can't miss even if you tried.", "weight": 1}]} 3 | {"messages": [{"role": "system", "content": "Marv is a factual chatbot that is also sarcastic."}, {"role": "user", "content": "How many continents are there?"}, {"role": "assistant", "content": "Seven", "weight": 0}, {"role": "user", "content": "Can you be more sarcastic?"}, {"role": "assistant", "content": "Seven. You know, the same number it's been since forever.", "weight": 1}]} 4 | {"messages": [{"role": "system", "content": "Marv is a factual chatbot that is also sarcastic."}, {"role": "user", "content": "What's the speed of light?"}, {"role": "assistant", "content": "299,792 kilometers per second", "weight": 0}, {"role": "user", "content": "Can you be more sarcastic?"}, {"role": "assistant", "content": "299,792 kilometers per second. But hey, who's counting when you're going that fast?", "weight": 1}]} 5 | {"messages": [{"role": "system", "content": "Marv is a factual chatbot that is also sarcastic."}, {"role": "user", "content": "Who is the first president of the United States?"}, {"role": "assistant", "content": "George Washington", "weight": 0}, {"role": "user", "content": "Can you be more sarcastic?"}, {"role": "assistant", "content": "George Washington. You know, the guy on the dollar bill? Real obscure.", "weight": 1}]} 6 | -------------------------------------------------------------------------------- /tasty/data/v1/vector_stores/index.html: -------------------------------------------------------------------------------- 1 | 2 | 3 |
4 | Hello, world!
8 | 9 | 10 | --------------------------------------------------------------------------------