├── .gitignore ├── .python-version ├── Dockerfile ├── LICENSE ├── README.md ├── pyproject.toml ├── smithery.yaml ├── src └── huggingface │ ├── __init__.py │ └── server.py └── uv.lock /.gitignore: -------------------------------------------------------------------------------- 1 | # Python-generated files 2 | __pycache__/ 3 | *.py[oc] 4 | build/ 5 | dist/ 6 | wheels/ 7 | *.egg-info 8 | 9 | # Virtual environments 10 | .venv 11 | -------------------------------------------------------------------------------- /.python-version: -------------------------------------------------------------------------------- 1 | 3.13 2 | -------------------------------------------------------------------------------- /Dockerfile: -------------------------------------------------------------------------------- 1 | # Generated by https://smithery.ai. See: https://smithery.ai/docs/config#dockerfile 2 | FROM python:3.11-slim 3 | 4 | # Set working directory 5 | WORKDIR /app 6 | 7 | # Copy necessary files 8 | COPY pyproject.toml ./ 9 | COPY README.md ./ 10 | COPY src ./src 11 | COPY uv.lock ./ 12 | 13 | # Upgrade pip and install build tools and the package 14 | RUN pip install --upgrade pip \ 15 | && pip install hatchling \ 16 | && pip install . --ignore-requires-python --no-build-isolation 17 | 18 | CMD ["huggingface"] 19 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2025 Shreyas Karnik 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 22 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # 🤗 Hugging Face MCP Server 🤗 2 | 3 | [![smithery badge](https://smithery.ai/badge/@shreyaskarnik/huggingface-mcp-server)](https://smithery.ai/server/@shreyaskarnik/huggingface-mcp-server) 4 | 5 | A Model Context Protocol (MCP) server that provides read-only access to the Hugging Face Hub APIs. This server allows LLMs like Claude to interact with Hugging Face's models, datasets, spaces, papers, and collections. 
6 | 7 | ## Components 8 | 9 | ### Resources 10 | 11 | The server exposes popular Hugging Face resources: 12 | 13 | - Custom `hf://` URI scheme for accessing resources 14 | - Models with `hf://model/{model_id}` URIs 15 | - Datasets with `hf://dataset/{dataset_id}` URIs 16 | - Spaces with `hf://space/{space_id}` URIs 17 | - All resources have descriptive names and JSON content type 18 | 19 | ### Prompts 20 | 21 | The server provides two prompt templates: 22 | 23 | - `compare-models`: Generates a comparison between multiple Hugging Face models 24 | - Required `model_ids` argument (comma-separated model IDs) 25 | - Retrieves model details and formats them for comparison 26 | 27 | - `summarize-paper`: Summarizes a research paper from Hugging Face 28 | - Required `arxiv_id` argument for paper identification 29 | - Optional `detail_level` argument (brief/detailed) to control summary depth 30 | - Combines paper metadata with implementation details 31 | 32 | ### Tools 33 | 34 | The server implements several tool categories: 35 | 36 | - **Model Tools** 37 | - `search-models`: Search models with filters for query, author, tags, and limit 38 | - `get-model-info`: Get detailed information about a specific model 39 | 40 | - **Dataset Tools** 41 | - `search-datasets`: Search datasets with filters 42 | - `get-dataset-info`: Get detailed information about a specific dataset 43 | 44 | - **Space Tools** 45 | - `search-spaces`: Search Spaces with filters including SDK type 46 | - `get-space-info`: Get detailed information about a specific Space 47 | 48 | - **Paper Tools** 49 | - `get-paper-info`: Get information about a paper and its implementations 50 | - `get-daily-papers`: Get the list of curated daily papers 51 | 52 | - **Collection Tools** 53 | - `search-collections`: Search collections with various filters 54 | - `get-collection-info`: Get detailed information about a specific collection 55 | 56 | ## Configuration 57 | 58 | The server does not require configuration, but supports optional Hugging Face authentication: 59 | 60 | - Set `HF_TOKEN` environment variable with your Hugging Face API token for: 61 | - Higher API rate limits 62 | - Access to private repositories (if authorized) 63 | - Improved reliability for high-volume requests 64 | 65 | ## Quickstart 66 | 67 | ### Install 68 | 69 | #### Installing via Smithery 70 | 71 | To install huggingface-mcp-server for Claude Desktop automatically via [Smithery](https://smithery.ai/server/@shreyaskarnik/huggingface-mcp-server): 72 | 73 | ```bash 74 | npx -y @smithery/cli install @shreyaskarnik/huggingface-mcp-server --client claude 75 | ``` 76 | 77 | #### Claude Desktop 78 | 79 | On MacOS: `~/Library/Application\ Support/Claude/claude_desktop_config.json` 80 | On Windows: `%APPDATA%/Claude/claude_desktop_config.json` 81 | 82 |
83 | **Development/Unpublished Servers Configuration** 84 | 85 | ```json 86 | "mcpServers": { 87 | "huggingface": { 88 | "command": "uv", 89 | "args": [ 90 | "--directory", 91 | "/absolute/path/to/huggingface-mcp-server", 92 | "run", 93 | "huggingface" 94 | ], 95 | "env": { 96 | "HF_TOKEN": "your_token_here" 97 | } 98 | } 99 | } 100 | ``` 101 | 102 |
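The entry above launches the project through `uv` using the `huggingface` console script declared in `pyproject.toml`. If the package is instead installed into your environment (for example with `pip install .`, as the Dockerfile and the Smithery configuration do), Claude Desktop can call that script directly. A minimal sketch of such an entry — the command name comes from `[project.scripts]`, and `HF_TOKEN` remains optional:

```json
"mcpServers": {
  "huggingface": {
    "command": "huggingface",
    "args": [],
    "env": {
      "HF_TOKEN": "your_token_here"
    }
  }
}
```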
103 | 104 | ## Development 105 | 106 | ### Building and Publishing 107 | 108 | To prepare the package for distribution: 109 | 110 | 1. Sync dependencies and update lockfile: 111 | 112 | ```bash 113 | uv sync 114 | ``` 115 | 116 | 1. Build package distributions: 117 | 118 | ```bash 119 | uv build 120 | ``` 121 | 122 | This will create source and wheel distributions in the `dist/` directory. 123 | 124 | 1. Publish to PyPI: 125 | 126 | ```bash 127 | uv publish 128 | ``` 129 | 130 | Note: You'll need to set PyPI credentials via environment variables or command flags: 131 | 132 | - Token: `--token` or `UV_PUBLISH_TOKEN` 133 | - Or username/password: `--username`/`UV_PUBLISH_USERNAME` and `--password`/`UV_PUBLISH_PASSWORD` 134 | 135 | ### Debugging 136 | 137 | Since MCP servers run over stdio, debugging can be challenging. For the best debugging 138 | experience, we strongly recommend using the [MCP Inspector](https://github.com/modelcontextprotocol/inspector). 139 | 140 | You can launch the MCP Inspector via [`npm`](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm) with this command: 141 | 142 | ```bash 143 | npx @modelcontextprotocol/inspector uv --directory /path/to/huggingface-mcp-server run huggingface 144 | ``` 145 | 146 | Upon launching, the Inspector will display a URL that you can access in your browser to begin debugging. 147 | 148 | ## Example Prompts for Claude 149 | 150 | When using this server with Claude, try these example prompts: 151 | 152 | - "Search for BERT models on Hugging Face with less than 100 million parameters" 153 | - "Find the most popular datasets for text classification on Hugging Face" 154 | - "What are today's featured AI research papers on Hugging Face?" 155 | - "Summarize the paper with arXiv ID 2307.09288 using the Hugging Face MCP server" 156 | - "Compare the Llama-3-8B and Mistral-7B models from Hugging Face" 157 | - "Show me the most popular Gradio spaces for image generation" 158 | - "Find collections created by TheBloke that include Mixtral models" 159 | 160 | ## Troubleshooting 161 | 162 | If you encounter issues with the server: 163 | 164 | 1. Check server logs in Claude Desktop: 165 | - macOS: `~/Library/Logs/Claude/mcp-server-huggingface.log` 166 | - Windows: `%APPDATA%\Claude\logs\mcp-server-huggingface.log` 167 | 168 | 2. For API rate limiting errors, consider adding a Hugging Face API token 169 | 170 | 3. Make sure your machine has internet connectivity to reach the Hugging Face API 171 | 172 | 4. 
If a particular tool is failing, try accessing the same data through the Hugging Face website to verify it exists 173 | -------------------------------------------------------------------------------- /pyproject.toml: -------------------------------------------------------------------------------- 1 | [project] 2 | name = "huggingface" 3 | version = "0.1.0" 4 | description = "Hugging Face MCP Server" 5 | readme = "README.md" 6 | requires-python = ">=3.13" 7 | dependencies = ["huggingface-hub>=0.29.3", "mcp>=1.4.1"] 8 | [[project.authors]] 9 | name = "Shreyas Karnik" 10 | email = "karnik.shreyas@gmail.com" 11 | 12 | [build-system] 13 | requires = ["hatchling"] 14 | build-backend = "hatchling.build" 15 | 16 | [project.scripts] 17 | huggingface = "huggingface:main" 18 | -------------------------------------------------------------------------------- /smithery.yaml: -------------------------------------------------------------------------------- 1 | # Smithery configuration file: https://smithery.ai/docs/config#smitheryyaml 2 | 3 | startCommand: 4 | type: stdio 5 | configSchema: 6 | # JSON Schema defining the configuration options for the MCP. 7 | type: object 8 | properties: 9 | hfToken: 10 | type: string 11 | default: "" 12 | description: Optional Hugging Face API Token. Leave empty if not provided. 13 | commandFunction: 14 | # A JS function that produces the CLI command based on the given config to start the MCP on stdio. 15 | |- 16 | (config) => ({ 17 | command: 'huggingface', 18 | args: [], 19 | env: { 20 | HF_TOKEN: config.hfToken || "" 21 | } 22 | }) 23 | exampleConfig: 24 | hfToken: your_hf_token_here 25 | -------------------------------------------------------------------------------- /src/huggingface/__init__.py: -------------------------------------------------------------------------------- 1 | from . import server 2 | import asyncio 3 | 4 | def main(): 5 | """Main entry point for the package.""" 6 | asyncio.run(server.main()) 7 | 8 | # Optionally expose other important items at package level 9 | __all__ = ['main', 'server'] -------------------------------------------------------------------------------- /src/huggingface/server.py: -------------------------------------------------------------------------------- 1 | """ 2 | 🤗 Hugging Face MCP Server 🤗 3 | 4 | This server provides Model Context Protocol (MCP) access to the Hugging Face API, 5 | allowing models like Claude to interact with models, datasets, spaces, and other 6 | Hugging Face resources in a read-only manner. 
7 | """ 8 | 9 | import asyncio 10 | import json 11 | from typing import Any, Dict, Optional 12 | from urllib.parse import quote_plus 13 | 14 | import httpx 15 | import mcp.server.stdio 16 | import mcp.types as types 17 | from huggingface_hub import HfApi 18 | from mcp.server import NotificationOptions, Server 19 | from mcp.server.models import InitializationOptions 20 | from pydantic import AnyUrl 21 | 22 | # Initialize server 23 | server = Server("huggingface") 24 | 25 | # Initialize Hugging Face API client 26 | hf_api = HfApi() 27 | 28 | # Base URL for the Hugging Face API 29 | HF_API_BASE = "https://huggingface.co/api" 30 | 31 | # Initialize HTTP client for making requests 32 | http_client = httpx.AsyncClient(timeout=30.0) 33 | 34 | 35 | # Helper Functions 36 | async def make_hf_request( 37 | endpoint: str, params: Optional[Dict[str, Any]] = None 38 | ) -> Dict: 39 | """Make a request to the Hugging Face API with proper error handling.""" 40 | url = f"{HF_API_BASE}/{endpoint}" 41 | try: 42 | response = await http_client.get(url, params=params) 43 | response.raise_for_status() 44 | return response.json() 45 | except Exception as e: 46 | return {"error": str(e)} 47 | 48 | 49 | # Tool Handlers 50 | @server.list_tools() 51 | async def handle_list_tools() -> list[types.Tool]: 52 | """ 53 | List available tools for interacting with the Hugging Face Hub. 54 | Each tool specifies its arguments using JSON Schema validation. 55 | """ 56 | return [ 57 | # Model Tools 58 | types.Tool( 59 | name="search-models", 60 | description="Search for models on Hugging Face Hub", 61 | inputSchema={ 62 | "type": "object", 63 | "properties": { 64 | "query": { 65 | "type": "string", 66 | "description": "Search term (e.g., 'bert', 'gpt')", 67 | }, 68 | "author": { 69 | "type": "string", 70 | "description": "Filter by author/organization (e.g., 'huggingface', 'google')", 71 | }, 72 | "tags": { 73 | "type": "string", 74 | "description": "Filter by tags (e.g., 'text-classification', 'translation')", 75 | }, 76 | "limit": { 77 | "type": "integer", 78 | "description": "Maximum number of results to return", 79 | }, 80 | }, 81 | }, 82 | ), 83 | types.Tool( 84 | name="get-model-info", 85 | description="Get detailed information about a specific model", 86 | inputSchema={ 87 | "type": "object", 88 | "properties": { 89 | "model_id": { 90 | "type": "string", 91 | "description": "The ID of the model (e.g., 'google/bert-base-uncased')", 92 | }, 93 | }, 94 | "required": ["model_id"], 95 | }, 96 | ), 97 | # Dataset Tools 98 | types.Tool( 99 | name="search-datasets", 100 | description="Search for datasets on Hugging Face Hub", 101 | inputSchema={ 102 | "type": "object", 103 | "properties": { 104 | "query": {"type": "string", "description": "Search term"}, 105 | "author": { 106 | "type": "string", 107 | "description": "Filter by author/organization", 108 | }, 109 | "tags": {"type": "string", "description": "Filter by tags"}, 110 | "limit": { 111 | "type": "integer", 112 | "description": "Maximum number of results to return", 113 | }, 114 | }, 115 | }, 116 | ), 117 | types.Tool( 118 | name="get-dataset-info", 119 | description="Get detailed information about a specific dataset", 120 | inputSchema={ 121 | "type": "object", 122 | "properties": { 123 | "dataset_id": { 124 | "type": "string", 125 | "description": "The ID of the dataset (e.g., 'squad')", 126 | }, 127 | }, 128 | "required": ["dataset_id"], 129 | }, 130 | ), 131 | # Space Tools 132 | types.Tool( 133 | name="search-spaces", 134 | description="Search for Spaces on Hugging 
Face Hub", 135 | inputSchema={ 136 | "type": "object", 137 | "properties": { 138 | "query": {"type": "string", "description": "Search term"}, 139 | "author": { 140 | "type": "string", 141 | "description": "Filter by author/organization", 142 | }, 143 | "tags": {"type": "string", "description": "Filter by tags"}, 144 | "sdk": { 145 | "type": "string", 146 | "description": "Filter by SDK (e.g., 'streamlit', 'gradio', 'docker')", 147 | }, 148 | "limit": { 149 | "type": "integer", 150 | "description": "Maximum number of results to return", 151 | }, 152 | }, 153 | }, 154 | ), 155 | types.Tool( 156 | name="get-space-info", 157 | description="Get detailed information about a specific Space", 158 | inputSchema={ 159 | "type": "object", 160 | "properties": { 161 | "space_id": { 162 | "type": "string", 163 | "description": "The ID of the Space (e.g., 'huggingface/diffusers-demo')", 164 | }, 165 | }, 166 | "required": ["space_id"], 167 | }, 168 | ), 169 | # Papers Tools 170 | types.Tool( 171 | name="get-paper-info", 172 | description="Get information about a specific paper on Hugging Face", 173 | inputSchema={ 174 | "type": "object", 175 | "properties": { 176 | "arxiv_id": { 177 | "type": "string", 178 | "description": "The arXiv ID of the paper (e.g., '1810.04805')", 179 | }, 180 | }, 181 | "required": ["arxiv_id"], 182 | }, 183 | ), 184 | types.Tool( 185 | name="get-daily-papers", 186 | description="Get the list of daily papers curated by Hugging Face", 187 | inputSchema={ 188 | "type": "object", 189 | "properties": {}, 190 | }, 191 | ), 192 | # Collections Tools 193 | types.Tool( 194 | name="search-collections", 195 | description="Search for collections on Hugging Face Hub", 196 | inputSchema={ 197 | "type": "object", 198 | "properties": { 199 | "owner": {"type": "string", "description": "Filter by owner"}, 200 | "item": { 201 | "type": "string", 202 | "description": "Filter by item (e.g., 'models/teknium/OpenHermes-2.5-Mistral-7B')", 203 | }, 204 | "query": { 205 | "type": "string", 206 | "description": "Search term for titles and descriptions", 207 | }, 208 | "limit": { 209 | "type": "integer", 210 | "description": "Maximum number of results to return", 211 | }, 212 | }, 213 | }, 214 | ), 215 | types.Tool( 216 | name="get-collection-info", 217 | description="Get detailed information about a specific collection", 218 | inputSchema={ 219 | "type": "object", 220 | "properties": { 221 | "namespace": { 222 | "type": "string", 223 | "description": "The namespace of the collection (user or organization)", 224 | }, 225 | "collection_id": { 226 | "type": "string", 227 | "description": "The ID part of the collection", 228 | }, 229 | }, 230 | "required": ["namespace", "collection_id"], 231 | }, 232 | ), 233 | ] 234 | 235 | 236 | @server.call_tool() 237 | async def handle_call_tool( 238 | name: str, arguments: dict | None 239 | ) -> list[types.TextContent | types.ImageContent | types.EmbeddedResource]: 240 | """ 241 | Handle tool execution requests for Hugging Face API. 
242 | """ 243 | if not arguments: 244 | arguments = {} 245 | 246 | if name == "search-models": 247 | query = arguments.get("query") 248 | author = arguments.get("author") 249 | tags = arguments.get("tags") 250 | limit = arguments.get("limit", 10) 251 | 252 | params = {"limit": limit} 253 | if query: 254 | params["search"] = query 255 | if author: 256 | params["author"] = author 257 | if tags: 258 | params["filter"] = tags 259 | 260 | data = await make_hf_request("models", params) 261 | 262 | if "error" in data: 263 | return [ 264 | types.TextContent( 265 | type="text", text=f"Error searching models: {data['error']}" 266 | ) 267 | ] 268 | 269 | # Format the results 270 | results = [] 271 | for model in data: 272 | model_info = { 273 | "id": model.get("id", ""), 274 | "name": model.get("modelId", ""), 275 | "author": model.get("author", ""), 276 | "tags": model.get("tags", []), 277 | "downloads": model.get("downloads", 0), 278 | "likes": model.get("likes", 0), 279 | "lastModified": model.get("lastModified", ""), 280 | } 281 | results.append(model_info) 282 | 283 | return [types.TextContent(type="text", text=json.dumps(results, indent=2))] 284 | 285 | elif name == "get-model-info": 286 | model_id = arguments.get("model_id") 287 | if not model_id: 288 | return [types.TextContent(type="text", text="Error: model_id is required")] 289 | 290 | data = await make_hf_request(f"models/{quote_plus(model_id)}") 291 | 292 | if "error" in data: 293 | return [ 294 | types.TextContent( 295 | type="text", 296 | text=f"Error retrieving model information: {data['error']}", 297 | ) 298 | ] 299 | 300 | # Format the result 301 | model_info = { 302 | "id": data.get("id", ""), 303 | "name": data.get("modelId", ""), 304 | "author": data.get("author", ""), 305 | "tags": data.get("tags", []), 306 | "pipeline_tag": data.get("pipeline_tag", ""), 307 | "downloads": data.get("downloads", 0), 308 | "likes": data.get("likes", 0), 309 | "lastModified": data.get("lastModified", ""), 310 | "description": data.get("description", "No description available"), 311 | } 312 | 313 | # Add model card if available 314 | if "card" in data and data["card"]: 315 | model_info["model_card"] = ( 316 | data["card"].get("data", {}).get("text", "No model card available") 317 | ) 318 | 319 | return [types.TextContent(type="text", text=json.dumps(model_info, indent=2))] 320 | 321 | elif name == "search-datasets": 322 | query = arguments.get("query") 323 | author = arguments.get("author") 324 | tags = arguments.get("tags") 325 | limit = arguments.get("limit", 10) 326 | 327 | params = {"limit": limit} 328 | if query: 329 | params["search"] = query 330 | if author: 331 | params["author"] = author 332 | if tags: 333 | params["filter"] = tags 334 | 335 | data = await make_hf_request("datasets", params) 336 | 337 | if "error" in data: 338 | return [ 339 | types.TextContent( 340 | type="text", text=f"Error searching datasets: {data['error']}" 341 | ) 342 | ] 343 | 344 | # Format the results 345 | results = [] 346 | for dataset in data: 347 | dataset_info = { 348 | "id": dataset.get("id", ""), 349 | "name": dataset.get("datasetId", ""), 350 | "author": dataset.get("author", ""), 351 | "tags": dataset.get("tags", []), 352 | "downloads": dataset.get("downloads", 0), 353 | "likes": dataset.get("likes", 0), 354 | "lastModified": dataset.get("lastModified", ""), 355 | } 356 | results.append(dataset_info) 357 | 358 | return [types.TextContent(type="text", text=json.dumps(results, indent=2))] 359 | 360 | elif name == "get-dataset-info": 361 | dataset_id = 
arguments.get("dataset_id") 362 | if not dataset_id: 363 | return [ 364 | types.TextContent(type="text", text="Error: dataset_id is required") 365 | ] 366 | 367 | data = await make_hf_request(f"datasets/{quote_plus(dataset_id)}") 368 | 369 | if "error" in data: 370 | return [ 371 | types.TextContent( 372 | type="text", 373 | text=f"Error retrieving dataset information: {data['error']}", 374 | ) 375 | ] 376 | 377 | # Format the result 378 | dataset_info = { 379 | "id": data.get("id", ""), 380 | "name": data.get("datasetId", ""), 381 | "author": data.get("author", ""), 382 | "tags": data.get("tags", []), 383 | "downloads": data.get("downloads", 0), 384 | "likes": data.get("likes", 0), 385 | "lastModified": data.get("lastModified", ""), 386 | "description": data.get("description", "No description available"), 387 | } 388 | 389 | # Add dataset card if available 390 | if "card" in data and data["card"]: 391 | dataset_info["dataset_card"] = ( 392 | data["card"].get("data", {}).get("text", "No dataset card available") 393 | ) 394 | 395 | return [types.TextContent(type="text", text=json.dumps(dataset_info, indent=2))] 396 | 397 | elif name == "search-spaces": 398 | query = arguments.get("query") 399 | author = arguments.get("author") 400 | tags = arguments.get("tags") 401 | sdk = arguments.get("sdk") 402 | limit = arguments.get("limit", 10) 403 | 404 | params = {"limit": limit} 405 | if query: 406 | params["search"] = query 407 | if author: 408 | params["author"] = author 409 | if tags: 410 | params["filter"] = tags 411 | if sdk: 412 | params["filter"] = params.get("filter", "") + f" sdk:{sdk}" 413 | 414 | data = await make_hf_request("spaces", params) 415 | 416 | if "error" in data: 417 | return [ 418 | types.TextContent( 419 | type="text", text=f"Error searching spaces: {data['error']}" 420 | ) 421 | ] 422 | 423 | # Format the results 424 | results = [] 425 | for space in data: 426 | space_info = { 427 | "id": space.get("id", ""), 428 | "name": space.get("spaceId", ""), 429 | "author": space.get("author", ""), 430 | "sdk": space.get("sdk", ""), 431 | "tags": space.get("tags", []), 432 | "likes": space.get("likes", 0), 433 | "lastModified": space.get("lastModified", ""), 434 | } 435 | results.append(space_info) 436 | 437 | return [types.TextContent(type="text", text=json.dumps(results, indent=2))] 438 | 439 | elif name == "get-space-info": 440 | space_id = arguments.get("space_id") 441 | if not space_id: 442 | return [types.TextContent(type="text", text="Error: space_id is required")] 443 | 444 | data = await make_hf_request(f"spaces/{quote_plus(space_id)}") 445 | 446 | if "error" in data: 447 | return [ 448 | types.TextContent( 449 | type="text", 450 | text=f"Error retrieving space information: {data['error']}", 451 | ) 452 | ] 453 | 454 | # Format the result 455 | space_info = { 456 | "id": data.get("id", ""), 457 | "name": data.get("spaceId", ""), 458 | "author": data.get("author", ""), 459 | "sdk": data.get("sdk", ""), 460 | "tags": data.get("tags", []), 461 | "likes": data.get("likes", 0), 462 | "lastModified": data.get("lastModified", ""), 463 | "description": data.get("description", "No description available"), 464 | "url": f"https://huggingface.co/spaces/{space_id}", 465 | } 466 | 467 | return [types.TextContent(type="text", text=json.dumps(space_info, indent=2))] 468 | 469 | elif name == "get-paper-info": 470 | arxiv_id = arguments.get("arxiv_id") 471 | if not arxiv_id: 472 | return [types.TextContent(type="text", text="Error: arxiv_id is required")] 473 | 474 | data = await 
make_hf_request(f"papers/{arxiv_id}") 475 | 476 | if "error" in data: 477 | return [ 478 | types.TextContent( 479 | type="text", 480 | text=f"Error retrieving paper information: {data['error']}", 481 | ) 482 | ] 483 | 484 | # Format the result 485 | paper_info = { 486 | "arxiv_id": data.get("arxivId", ""), 487 | "title": data.get("title", ""), 488 | "authors": data.get("authors", []), 489 | "summary": data.get("summary", "No summary available"), 490 | "url": f"https://huggingface.co/papers/{arxiv_id}", 491 | } 492 | 493 | # Get implementations 494 | implementations = await make_hf_request(f"arxiv/{arxiv_id}/repos") 495 | if "error" not in implementations: 496 | paper_info["implementations"] = implementations 497 | 498 | return [types.TextContent(type="text", text=json.dumps(paper_info, indent=2))] 499 | 500 | elif name == "get-daily-papers": 501 | data = await make_hf_request("daily_papers") 502 | 503 | if "error" in data: 504 | return [ 505 | types.TextContent( 506 | type="text", text=f"Error retrieving daily papers: {data['error']}" 507 | ) 508 | ] 509 | 510 | # Format the results 511 | results = [] 512 | for paper in data: 513 | paper_info = { 514 | "arxiv_id": paper.get("paper", {}).get("arxivId", ""), 515 | "title": paper.get("paper", {}).get("title", ""), 516 | "authors": paper.get("paper", {}).get("authors", []), 517 | "summary": paper.get("paper", {}).get("summary", "")[:200] + "..." 518 | if len(paper.get("paper", {}).get("summary", "")) > 200 519 | else paper.get("paper", {}).get("summary", ""), 520 | } 521 | results.append(paper_info) 522 | 523 | return [types.TextContent(type="text", text=json.dumps(results, indent=2))] 524 | 525 | elif name == "search-collections": 526 | owner = arguments.get("owner") 527 | item = arguments.get("item") 528 | query = arguments.get("query") 529 | limit = arguments.get("limit", 10) 530 | 531 | params = {"limit": limit} 532 | if owner: 533 | params["owner"] = owner 534 | if item: 535 | params["item"] = item 536 | if query: 537 | params["q"] = query 538 | 539 | data = await make_hf_request("collections", params) 540 | 541 | if "error" in data: 542 | return [ 543 | types.TextContent( 544 | type="text", text=f"Error searching collections: {data['error']}" 545 | ) 546 | ] 547 | 548 | # Format the results 549 | results = [] 550 | for collection in data: 551 | collection_info = { 552 | "id": collection.get("id", ""), 553 | "title": collection.get("title", ""), 554 | "owner": collection.get("owner", {}).get("name", ""), 555 | "description": collection.get( 556 | "description", "No description available" 557 | ), 558 | "items_count": collection.get("itemsCount", 0), 559 | "upvotes": collection.get("upvotes", 0), 560 | "last_modified": collection.get("lastModified", ""), 561 | } 562 | results.append(collection_info) 563 | 564 | return [types.TextContent(type="text", text=json.dumps(results, indent=2))] 565 | 566 | elif name == "get-collection-info": 567 | namespace = arguments.get("namespace") 568 | collection_id = arguments.get("collection_id") 569 | 570 | if not namespace or not collection_id: 571 | return [ 572 | types.TextContent( 573 | type="text", text="Error: namespace and collection_id are required" 574 | ) 575 | ] 576 | 577 | # Extract the slug from the collection_id if it contains a dash 578 | slug = collection_id.split("-")[0] if "-" in collection_id else collection_id 579 | endpoint = f"collections/{namespace}/{slug}-{collection_id}" 580 | 581 | data = await make_hf_request(endpoint) 582 | 583 | if "error" in data: 584 | return [ 585 | 
types.TextContent( 586 | type="text", 587 | text=f"Error retrieving collection information: {data['error']}", 588 | ) 589 | ] 590 | 591 | # Format the result 592 | collection_info = { 593 | "id": data.get("id", ""), 594 | "title": data.get("title", ""), 595 | "owner": data.get("owner", {}).get("name", ""), 596 | "description": data.get("description", "No description available"), 597 | "upvotes": data.get("upvotes", 0), 598 | "last_modified": data.get("lastModified", ""), 599 | "items": [], 600 | } 601 | 602 | # Add items 603 | for item in data.get("items", []): 604 | item_info = { 605 | "type": item.get("item", {}).get("type", ""), 606 | "id": item.get("item", {}).get("id", ""), 607 | "note": item.get("note", ""), 608 | } 609 | collection_info["items"].append(item_info) 610 | 611 | return [ 612 | types.TextContent(type="text", text=json.dumps(collection_info, indent=2)) 613 | ] 614 | 615 | else: 616 | return [types.TextContent(type="text", text=f"Unknown tool: {name}")] 617 | 618 | 619 | # Resource Handlers - Define popular models, datasets, and spaces as resources 620 | @server.list_resources() 621 | async def handle_list_resources() -> list[types.Resource]: 622 | """ 623 | List available Hugging Face resources. 624 | This provides direct access to popular models, datasets, and spaces. 625 | """ 626 | resources = [] 627 | 628 | # Popular models 629 | popular_models = [ 630 | ( 631 | "meta-llama/Llama-3-8B-Instruct", 632 | "Llama 3 8B Instruct", 633 | "Meta's Llama 3 8B Instruct model", 634 | ), 635 | ( 636 | "mistralai/Mistral-7B-Instruct-v0.2", 637 | "Mistral 7B Instruct v0.2", 638 | "Mistral AI's 7B instruction-following model", 639 | ), 640 | ( 641 | "openchat/openchat-3.5-0106", 642 | "OpenChat 3.5", 643 | "Open-source chatbot based on Mistral 7B", 644 | ), 645 | ( 646 | "stabilityai/stable-diffusion-xl-base-1.0", 647 | "Stable Diffusion XL 1.0", 648 | "SDXL text-to-image model", 649 | ), 650 | ] 651 | 652 | for model_id, name, description in popular_models: 653 | resources.append( 654 | types.Resource( 655 | uri=AnyUrl(f"hf://model/{model_id}"), 656 | name=name, 657 | description=description, 658 | mimeType="application/json", 659 | ) 660 | ) 661 | 662 | # Popular datasets 663 | popular_datasets = [ 664 | ( 665 | "databricks/databricks-dolly-15k", 666 | "Databricks Dolly 15k", 667 | "15k instruction-following examples", 668 | ), 669 | ("squad", "SQuAD", "Stanford Question Answering Dataset"), 670 | ("glue", "GLUE", "General Language Understanding Evaluation benchmark"), 671 | ( 672 | "openai/summarize_from_feedback", 673 | "Summarize From Feedback", 674 | "OpenAI summarization dataset", 675 | ), 676 | ] 677 | 678 | for dataset_id, name, description in popular_datasets: 679 | resources.append( 680 | types.Resource( 681 | uri=AnyUrl(f"hf://dataset/{dataset_id}"), 682 | name=name, 683 | description=description, 684 | mimeType="application/json", 685 | ) 686 | ) 687 | 688 | # Popular spaces 689 | popular_spaces = [ 690 | ( 691 | "huggingface/diffusers-demo", 692 | "Diffusers Demo", 693 | "Demo of Stable Diffusion models", 694 | ), 695 | ("gradio/chatbot-demo", "Chatbot Demo", "Demo of a Gradio chatbot interface"), 696 | ( 697 | "prompthero/midjourney-v4-diffusion", 698 | "Midjourney v4 Diffusion", 699 | "Replica of Midjourney v4", 700 | ), 701 | ("stabilityai/stablevicuna", "StableVicuna", "Fine-tuned Vicuna with RLHF"), 702 | ] 703 | 704 | for space_id, name, description in popular_spaces: 705 | resources.append( 706 | types.Resource( 707 | uri=AnyUrl(f"hf://space/{space_id}"), 708 | 
name=name, 709 | description=description, 710 | mimeType="application/json", 711 | ) 712 | ) 713 | 714 | return resources 715 | 716 | 717 | @server.read_resource() 718 | async def handle_read_resource(uri: AnyUrl) -> str: 719 | """ 720 | Read a specific Hugging Face resource by its URI. 721 | """ 722 | if uri.scheme != "hf": 723 | raise ValueError(f"Unsupported URI scheme: {uri.scheme}") 724 | 725 | if not uri.path: 726 | raise ValueError("Invalid Hugging Face resource URI") 727 | 728 | parts = uri.path.lstrip("/").split("/", 1) 729 | if len(parts) != 2: 730 | raise ValueError("Invalid Hugging Face resource URI format") 731 | 732 | resource_type, resource_id = parts 733 | 734 | if resource_type == "model": 735 | data = await make_hf_request(f"models/{quote_plus(resource_id)}") 736 | elif resource_type == "dataset": 737 | data = await make_hf_request(f"datasets/{quote_plus(resource_id)}") 738 | elif resource_type == "space": 739 | data = await make_hf_request(f"spaces/{quote_plus(resource_id)}") 740 | else: 741 | raise ValueError(f"Unsupported resource type: {resource_type}") 742 | 743 | if "error" in data: 744 | raise ValueError(f"Error retrieving resource: {data['error']}") 745 | 746 | return json.dumps(data, indent=2) 747 | 748 | 749 | # Prompt Handlers 750 | @server.list_prompts() 751 | async def handle_list_prompts() -> list[types.Prompt]: 752 | """ 753 | List available prompts for Hugging Face integration. 754 | """ 755 | return [ 756 | types.Prompt( 757 | name="compare-models", 758 | description="Compare multiple Hugging Face models", 759 | arguments=[ 760 | types.PromptArgument( 761 | name="model_ids", 762 | description="Comma-separated list of model IDs to compare", 763 | required=True, 764 | ) 765 | ], 766 | ), 767 | types.Prompt( 768 | name="summarize-paper", 769 | description="Summarize an AI research paper from arXiv", 770 | arguments=[ 771 | types.PromptArgument( 772 | name="arxiv_id", 773 | description="arXiv ID of the paper to summarize", 774 | required=True, 775 | ), 776 | types.PromptArgument( 777 | name="detail_level", 778 | description="Level of detail in the summary (brief/detailed/eli5)", 779 | required=False, 780 | ), 781 | ], 782 | ), 783 | ] 784 | 785 | 786 | @server.get_prompt() 787 | async def handle_get_prompt( 788 | name: str, arguments: dict[str, str] | None 789 | ) -> types.GetPromptResult: 790 | """ 791 | Generate a prompt related to Hugging Face resources. 
792 | """ 793 | if not arguments: 794 | arguments = {} 795 | 796 | if name == "compare-models": 797 | model_ids = arguments.get("model_ids", "") 798 | if not model_ids: 799 | raise ValueError("model_ids argument is required") 800 | 801 | model_list = [model_id.strip() for model_id in model_ids.split(",")] 802 | models_data = [] 803 | 804 | for model_id in model_list: 805 | data = await make_hf_request(f"models/{quote_plus(model_id)}") 806 | if "error" not in data: 807 | models_data.append(data) 808 | 809 | model_details = [] 810 | for data in models_data: 811 | details = { 812 | "id": data.get("id", ""), 813 | "author": data.get("author", ""), 814 | "downloads": data.get("downloads", 0), 815 | "tags": data.get("tags", []), 816 | "description": data.get("description", "No description available"), 817 | } 818 | model_details.append(details) 819 | 820 | return types.GetPromptResult( 821 | description=f"Comparing models: {model_ids}", 822 | messages=[ 823 | types.PromptMessage( 824 | role="user", 825 | content=types.TextContent( 826 | type="text", 827 | text="I'd like you to compare these Hugging Face models and help me understand their differences, strengths, and suitable use cases:\n\n" 828 | + json.dumps(model_details, indent=2) 829 | + "\n\nPlease structure your comparison with sections on architecture, performance, use cases, and limitations.", 830 | ), 831 | ) 832 | ], 833 | ) 834 | 835 | elif name == "summarize-paper": 836 | arxiv_id = arguments.get("arxiv_id", "") 837 | if not arxiv_id: 838 | raise ValueError("arxiv_id argument is required") 839 | 840 | detail_level = arguments.get("detail_level", "detailed") 841 | 842 | paper_data = await make_hf_request(f"papers/{arxiv_id}") 843 | if "error" in paper_data: 844 | raise ValueError(f"Error retrieving paper: {paper_data['error']}") 845 | 846 | # Get implementations 847 | implementations = await make_hf_request(f"arxiv/{arxiv_id}/repos") 848 | 849 | return types.GetPromptResult( 850 | description=f"Summarizing paper: {paper_data.get('title', arxiv_id)}", 851 | messages=[ 852 | types.PromptMessage( 853 | role="user", 854 | content=types.TextContent( 855 | type="text", 856 | text=f"Please provide a {'detailed' if detail_level == 'detailed' else 'brief'} summary of this AI research paper:\n\n" 857 | + f"Title: {paper_data.get('title', 'Unknown')}\n" 858 | + f"Authors: {', '.join(paper_data.get('authors', []))}\n" 859 | + f"Abstract: {paper_data.get('summary', 'No abstract available')}\n\n" 860 | + ( 861 | f"Implementations on Hugging Face: {json.dumps(implementations, indent=2)}\n\n" 862 | if "error" not in implementations 863 | else "" 864 | ) 865 | + f"Please {'cover all key aspects including methodology, results, and implications' if detail_level == 'detailed' else 'provide a concise overview of the main contributions'}.", 866 | ), 867 | ) 868 | ], 869 | ) 870 | 871 | else: 872 | raise ValueError(f"Unknown prompt: {name}") 873 | 874 | 875 | async def main(): 876 | # Run the server using stdin/stdout streams 877 | async with mcp.server.stdio.stdio_server() as (read_stream, write_stream): 878 | await server.run( 879 | read_stream, 880 | write_stream, 881 | InitializationOptions( 882 | server_name="huggingface", 883 | server_version="0.1.0", 884 | capabilities=server.get_capabilities( 885 | notification_options=NotificationOptions(), 886 | experimental_capabilities={}, 887 | ), 888 | ), 889 | ) 890 | 891 | 892 | if __name__ == "__main__": 893 | asyncio.run(main()) 894 | 
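The server above only speaks MCP over stdio, so exercising it by hand requires an MCP client. The sketch below (not part of `server.py`) shows one way to call the `search-models` tool from Python using the `mcp` client API; it assumes the package has been installed so the `huggingface` console script from `pyproject.toml` is on `PATH`, and that the tool returns text content as implemented above.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def demo() -> None:
    # Launch the installed `huggingface` entry point as a stdio MCP server.
    params = StdioServerParameters(command="huggingface", args=[])
    async with stdio_client(params) as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            # Invoke the search-models tool defined in handle_list_tools().
            result = await session.call_tool(
                "search-models", {"query": "bert", "limit": 3}
            )
            for item in result.content:
                # This server returns TextContent blocks carrying JSON payloads.
                print(item.text)


if __name__ == "__main__":
    asyncio.run(demo())
```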
-------------------------------------------------------------------------------- /uv.lock: -------------------------------------------------------------------------------- 1 | version = 1 2 | revision = 1 3 | requires-python = ">=3.13" 4 | 5 | [[package]] 6 | name = "annotated-types" 7 | version = "0.7.0" 8 | source = { registry = "https://pypi.org/simple" } 9 | sdist = { url = "https://files.pythonhosted.org/packages/ee/67/531ea369ba64dcff5ec9c3402f9f51bf748cec26dde048a2f973a4eea7f5/annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89", size = 16081 } 10 | wheels = [ 11 | { url = "https://files.pythonhosted.org/packages/78/b6/6307fbef88d9b5ee7421e68d78a9f162e0da4900bc5f5793f6d3d0e34fb8/annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53", size = 13643 }, 12 | ] 13 | 14 | [[package]] 15 | name = "anyio" 16 | version = "4.9.0" 17 | source = { registry = "https://pypi.org/simple" } 18 | dependencies = [ 19 | { name = "idna" }, 20 | { name = "sniffio" }, 21 | ] 22 | sdist = { url = "https://files.pythonhosted.org/packages/95/7d/4c1bd541d4dffa1b52bd83fb8527089e097a106fc90b467a7313b105f840/anyio-4.9.0.tar.gz", hash = "sha256:673c0c244e15788651a4ff38710fea9675823028a6f08a5eda409e0c9840a028", size = 190949 } 23 | wheels = [ 24 | { url = "https://files.pythonhosted.org/packages/a1/ee/48ca1a7c89ffec8b6a0c5d02b89c305671d5ffd8d3c94acf8b8c408575bb/anyio-4.9.0-py3-none-any.whl", hash = "sha256:9f76d541cad6e36af7beb62e978876f3b41e3e04f2c1fbf0884604c0a9c4d93c", size = 100916 }, 25 | ] 26 | 27 | [[package]] 28 | name = "certifi" 29 | version = "2025.1.31" 30 | source = { registry = "https://pypi.org/simple" } 31 | sdist = { url = "https://files.pythonhosted.org/packages/1c/ab/c9f1e32b7b1bf505bf26f0ef697775960db7932abeb7b516de930ba2705f/certifi-2025.1.31.tar.gz", hash = "sha256:3d5da6925056f6f18f119200434a4780a94263f10d1c21d032a6f6b2baa20651", size = 167577 } 32 | wheels = [ 33 | { url = "https://files.pythonhosted.org/packages/38/fc/bce832fd4fd99766c04d1ee0eead6b0ec6486fb100ae5e74c1d91292b982/certifi-2025.1.31-py3-none-any.whl", hash = "sha256:ca78db4565a652026a4db2bcdf68f2fb589ea80d0be70e03929ed730746b84fe", size = 166393 }, 34 | ] 35 | 36 | [[package]] 37 | name = "charset-normalizer" 38 | version = "3.4.1" 39 | source = { registry = "https://pypi.org/simple" } 40 | sdist = { url = "https://files.pythonhosted.org/packages/16/b0/572805e227f01586461c80e0fd25d65a2115599cc9dad142fee4b747c357/charset_normalizer-3.4.1.tar.gz", hash = "sha256:44251f18cd68a75b56585dd00dae26183e102cd5e0f9f1466e6df5da2ed64ea3", size = 123188 } 41 | wheels = [ 42 | { url = "https://files.pythonhosted.org/packages/38/94/ce8e6f63d18049672c76d07d119304e1e2d7c6098f0841b51c666e9f44a0/charset_normalizer-3.4.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:aabfa34badd18f1da5ec1bc2715cadc8dca465868a4e73a0173466b688f29dda", size = 195698 }, 43 | { url = "https://files.pythonhosted.org/packages/24/2e/dfdd9770664aae179a96561cc6952ff08f9a8cd09a908f259a9dfa063568/charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22e14b5d70560b8dd51ec22863f370d1e595ac3d024cb8ad7d308b4cd95f8313", size = 140162 }, 44 | { url = "https://files.pythonhosted.org/packages/24/4e/f646b9093cff8fc86f2d60af2de4dc17c759de9d554f130b140ea4738ca6/charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = 
"sha256:8436c508b408b82d87dc5f62496973a1805cd46727c34440b0d29d8a2f50a6c9", size = 150263 }, 45 | { url = "https://files.pythonhosted.org/packages/5e/67/2937f8d548c3ef6e2f9aab0f6e21001056f692d43282b165e7c56023e6dd/charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2d074908e1aecee37a7635990b2c6d504cd4766c7bc9fc86d63f9c09af3fa11b", size = 142966 }, 46 | { url = "https://files.pythonhosted.org/packages/52/ed/b7f4f07de100bdb95c1756d3a4d17b90c1a3c53715c1a476f8738058e0fa/charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:955f8851919303c92343d2f66165294848d57e9bba6cf6e3625485a70a038d11", size = 144992 }, 47 | { url = "https://files.pythonhosted.org/packages/96/2c/d49710a6dbcd3776265f4c923bb73ebe83933dfbaa841c5da850fe0fd20b/charset_normalizer-3.4.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:44ecbf16649486d4aebafeaa7ec4c9fed8b88101f4dd612dcaf65d5e815f837f", size = 147162 }, 48 | { url = "https://files.pythonhosted.org/packages/b4/41/35ff1f9a6bd380303dea55e44c4933b4cc3c4850988927d4082ada230273/charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:0924e81d3d5e70f8126529951dac65c1010cdf117bb75eb02dd12339b57749dd", size = 140972 }, 49 | { url = "https://files.pythonhosted.org/packages/fb/43/c6a0b685fe6910d08ba971f62cd9c3e862a85770395ba5d9cad4fede33ab/charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:2967f74ad52c3b98de4c3b32e1a44e32975e008a9cd2a8cc8966d6a5218c5cb2", size = 149095 }, 50 | { url = "https://files.pythonhosted.org/packages/4c/ff/a9a504662452e2d2878512115638966e75633519ec11f25fca3d2049a94a/charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:c75cb2a3e389853835e84a2d8fb2b81a10645b503eca9bcb98df6b5a43eb8886", size = 152668 }, 51 | { url = "https://files.pythonhosted.org/packages/6c/71/189996b6d9a4b932564701628af5cee6716733e9165af1d5e1b285c530ed/charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:09b26ae6b1abf0d27570633b2b078a2a20419c99d66fb2823173d73f188ce601", size = 150073 }, 52 | { url = "https://files.pythonhosted.org/packages/e4/93/946a86ce20790e11312c87c75ba68d5f6ad2208cfb52b2d6a2c32840d922/charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:fa88b843d6e211393a37219e6a1c1df99d35e8fd90446f1118f4216e307e48cd", size = 145732 }, 53 | { url = "https://files.pythonhosted.org/packages/cd/e5/131d2fb1b0dddafc37be4f3a2fa79aa4c037368be9423061dccadfd90091/charset_normalizer-3.4.1-cp313-cp313-win32.whl", hash = "sha256:eb8178fe3dba6450a3e024e95ac49ed3400e506fd4e9e5c32d30adda88cbd407", size = 95391 }, 54 | { url = "https://files.pythonhosted.org/packages/27/f2/4f9a69cc7712b9b5ad8fdb87039fd89abba997ad5cbe690d1835d40405b0/charset_normalizer-3.4.1-cp313-cp313-win_amd64.whl", hash = "sha256:b1ac5992a838106edb89654e0aebfc24f5848ae2547d22c2c3f66454daa11971", size = 102702 }, 55 | { url = "https://files.pythonhosted.org/packages/0e/f6/65ecc6878a89bb1c23a086ea335ad4bf21a588990c3f535a227b9eea9108/charset_normalizer-3.4.1-py3-none-any.whl", hash = "sha256:d98b1668f06378c6dbefec3b92299716b931cd4e6061f3c875a71ced1780ab85", size = 49767 }, 56 | ] 57 | 58 | [[package]] 59 | name = "click" 60 | version = "8.1.8" 61 | source = { registry = "https://pypi.org/simple" } 62 | dependencies = [ 63 | { name = "colorama", marker = "sys_platform == 'win32'" }, 64 | ] 65 | sdist = { url = 
"https://files.pythonhosted.org/packages/b9/2e/0090cbf739cee7d23781ad4b89a9894a41538e4fcf4c31dcdd705b78eb8b/click-8.1.8.tar.gz", hash = "sha256:ed53c9d8990d83c2a27deae68e4ee337473f6330c040a31d4225c9574d16096a", size = 226593 } 66 | wheels = [ 67 | { url = "https://files.pythonhosted.org/packages/7e/d4/7ebdbd03970677812aac39c869717059dbb71a4cfc033ca6e5221787892c/click-8.1.8-py3-none-any.whl", hash = "sha256:63c132bbbed01578a06712a2d1f497bb62d9c1c0d329b7903a866228027263b2", size = 98188 }, 68 | ] 69 | 70 | [[package]] 71 | name = "colorama" 72 | version = "0.4.6" 73 | source = { registry = "https://pypi.org/simple" } 74 | sdist = { url = "https://files.pythonhosted.org/packages/d8/53/6f443c9a4a8358a93a6792e2acffb9d9d5cb0a5cfd8802644b7b1c9a02e4/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44", size = 27697 } 75 | wheels = [ 76 | { url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335 }, 77 | ] 78 | 79 | [[package]] 80 | name = "filelock" 81 | version = "3.18.0" 82 | source = { registry = "https://pypi.org/simple" } 83 | sdist = { url = "https://files.pythonhosted.org/packages/0a/10/c23352565a6544bdc5353e0b15fc1c563352101f30e24bf500207a54df9a/filelock-3.18.0.tar.gz", hash = "sha256:adbc88eabb99d2fec8c9c1b229b171f18afa655400173ddc653d5d01501fb9f2", size = 18075 } 84 | wheels = [ 85 | { url = "https://files.pythonhosted.org/packages/4d/36/2a115987e2d8c300a974597416d9de88f2444426de9571f4b59b2cca3acc/filelock-3.18.0-py3-none-any.whl", hash = "sha256:c401f4f8377c4464e6db25fff06205fd89bdd83b65eb0488ed1b160f780e21de", size = 16215 }, 86 | ] 87 | 88 | [[package]] 89 | name = "fsspec" 90 | version = "2025.3.0" 91 | source = { registry = "https://pypi.org/simple" } 92 | sdist = { url = "https://files.pythonhosted.org/packages/34/f4/5721faf47b8c499e776bc34c6a8fc17efdf7fdef0b00f398128bc5dcb4ac/fsspec-2025.3.0.tar.gz", hash = "sha256:a935fd1ea872591f2b5148907d103488fc523295e6c64b835cfad8c3eca44972", size = 298491 } 93 | wheels = [ 94 | { url = "https://files.pythonhosted.org/packages/56/53/eb690efa8513166adef3e0669afd31e95ffde69fb3c52ec2ac7223ed6018/fsspec-2025.3.0-py3-none-any.whl", hash = "sha256:efb87af3efa9103f94ca91a7f8cb7a4df91af9f74fc106c9c7ea0efd7277c1b3", size = 193615 }, 95 | ] 96 | 97 | [[package]] 98 | name = "h11" 99 | version = "0.14.0" 100 | source = { registry = "https://pypi.org/simple" } 101 | sdist = { url = "https://files.pythonhosted.org/packages/f5/38/3af3d3633a34a3316095b39c8e8fb4853a28a536e55d347bd8d8e9a14b03/h11-0.14.0.tar.gz", hash = "sha256:8f19fbbe99e72420ff35c00b27a34cb9937e902a8b810e2c88300c6f0a3b699d", size = 100418 } 102 | wheels = [ 103 | { url = "https://files.pythonhosted.org/packages/95/04/ff642e65ad6b90db43e668d70ffb6736436c7ce41fcc549f4e9472234127/h11-0.14.0-py3-none-any.whl", hash = "sha256:e3fe4ac4b851c468cc8363d500db52c2ead036020723024a109d37346efaa761", size = 58259 }, 104 | ] 105 | 106 | [[package]] 107 | name = "httpcore" 108 | version = "1.0.7" 109 | source = { registry = "https://pypi.org/simple" } 110 | dependencies = [ 111 | { name = "certifi" }, 112 | { name = "h11" }, 113 | ] 114 | sdist = { url = "https://files.pythonhosted.org/packages/6a/41/d7d0a89eb493922c37d343b607bc1b5da7f5be7e383740b4753ad8943e90/httpcore-1.0.7.tar.gz", hash = "sha256:8551cb62a169ec7162ac7be8d4817d561f60e08eaa485234898414bb5a8a0b4c", size 
= 85196 } 115 | wheels = [ 116 | { url = "https://files.pythonhosted.org/packages/87/f5/72347bc88306acb359581ac4d52f23c0ef445b57157adedb9aee0cd689d2/httpcore-1.0.7-py3-none-any.whl", hash = "sha256:a3fff8f43dc260d5bd363d9f9cf1830fa3a458b332856f34282de498ed420edd", size = 78551 }, 117 | ] 118 | 119 | [[package]] 120 | name = "httpx" 121 | version = "0.28.1" 122 | source = { registry = "https://pypi.org/simple" } 123 | dependencies = [ 124 | { name = "anyio" }, 125 | { name = "certifi" }, 126 | { name = "httpcore" }, 127 | { name = "idna" }, 128 | ] 129 | sdist = { url = "https://files.pythonhosted.org/packages/b1/df/48c586a5fe32a0f01324ee087459e112ebb7224f646c0b5023f5e79e9956/httpx-0.28.1.tar.gz", hash = "sha256:75e98c5f16b0f35b567856f597f06ff2270a374470a5c2392242528e3e3e42fc", size = 141406 } 130 | wheels = [ 131 | { url = "https://files.pythonhosted.org/packages/2a/39/e50c7c3a983047577ee07d2a9e53faf5a69493943ec3f6a384bdc792deb2/httpx-0.28.1-py3-none-any.whl", hash = "sha256:d909fcccc110f8c7faf814ca82a9a4d816bc5a6dbfea25d6591d6985b8ba59ad", size = 73517 }, 132 | ] 133 | 134 | [[package]] 135 | name = "httpx-sse" 136 | version = "0.4.0" 137 | source = { registry = "https://pypi.org/simple" } 138 | sdist = { url = "https://files.pythonhosted.org/packages/4c/60/8f4281fa9bbf3c8034fd54c0e7412e66edbab6bc74c4996bd616f8d0406e/httpx-sse-0.4.0.tar.gz", hash = "sha256:1e81a3a3070ce322add1d3529ed42eb5f70817f45ed6ec915ab753f961139721", size = 12624 } 139 | wheels = [ 140 | { url = "https://files.pythonhosted.org/packages/e1/9b/a181f281f65d776426002f330c31849b86b31fc9d848db62e16f03ff739f/httpx_sse-0.4.0-py3-none-any.whl", hash = "sha256:f329af6eae57eaa2bdfd962b42524764af68075ea87370a2de920af5341e318f", size = 7819 }, 141 | ] 142 | 143 | [[package]] 144 | name = "huggingface" 145 | version = "0.1.0" 146 | source = { editable = "." 
}
147 | dependencies = [
148 | { name = "huggingface-hub" },
149 | { name = "mcp" },
150 | ]
151 |
152 | [package.metadata]
153 | requires-dist = [
154 | { name = "huggingface-hub", specifier = ">=0.29.3" },
155 | { name = "mcp", specifier = ">=1.4.1" },
156 | ]
157 |
158 | [[package]]
159 | name = "huggingface-hub"
160 | version = "0.29.3"
161 | source = { registry = "https://pypi.org/simple" }
162 | dependencies = [
163 | { name = "filelock" },
164 | { name = "fsspec" },
165 | { name = "packaging" },
166 | { name = "pyyaml" },
167 | { name = "requests" },
168 | { name = "tqdm" },
169 | { name = "typing-extensions" },
170 | ]
171 | sdist = { url = "https://files.pythonhosted.org/packages/e5/f9/851f34b02970e8143d41d4001b2d49e54ef113f273902103823b8bc95ada/huggingface_hub-0.29.3.tar.gz", hash = "sha256:64519a25716e0ba382ba2d3fb3ca082e7c7eb4a2fc634d200e8380006e0760e5", size = 390123 }
172 | wheels = [
173 | { url = "https://files.pythonhosted.org/packages/40/0c/37d380846a2e5c9a3c6a73d26ffbcfdcad5fc3eacf42fdf7cff56f2af634/huggingface_hub-0.29.3-py3-none-any.whl", hash = "sha256:0b25710932ac649c08cdbefa6c6ccb8e88eef82927cacdb048efb726429453aa", size = 468997 },
174 | ]
175 |
176 | [[package]]
177 | name = "idna"
178 | version = "3.10"
179 | source = { registry = "https://pypi.org/simple" }
180 | sdist = { url = "https://files.pythonhosted.org/packages/f1/70/7703c29685631f5a7590aa73f1f1d3fa9a380e654b86af429e0934a32f7d/idna-3.10.tar.gz", hash = "sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9", size = 190490 }
181 | wheels = [
182 | { url = "https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl", hash = "sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3", size = 70442 },
183 | ]
184 |
185 | [[package]]
186 | name = "mcp"
187 | version = "1.4.1"
188 | source = { registry = "https://pypi.org/simple" }
189 | dependencies = [
190 | { name = "anyio" },
191 | { name = "httpx" },
192 | { name = "httpx-sse" },
193 | { name = "pydantic" },
194 | { name = "pydantic-settings" },
195 | { name = "sse-starlette" },
196 | { name = "starlette" },
197 | { name = "uvicorn" },
198 | ]
199 | sdist = { url = "https://files.pythonhosted.org/packages/50/cc/5c5bb19f1a0f8f89a95e25cb608b0b07009e81fd4b031e519335404e1422/mcp-1.4.1.tar.gz", hash = "sha256:b9655d2de6313f9d55a7d1df62b3c3fe27a530100cc85bf23729145b0dba4c7a", size = 154942 }
200 | wheels = [
201 | { url = "https://files.pythonhosted.org/packages/e8/0e/885f156ade60108e67bf044fada5269da68e29d758a10b0c513f4d85dd76/mcp-1.4.1-py3-none-any.whl", hash = "sha256:a7716b1ec1c054e76f49806f7d96113b99fc1166fc9244c2c6f19867cb75b593", size = 72448 },
202 | ]
203 |
204 | [[package]]
205 | name = "packaging"
206 | version = "24.2"
207 | source = { registry = "https://pypi.org/simple" }
208 | sdist = { url = "https://files.pythonhosted.org/packages/d0/63/68dbb6eb2de9cb10ee4c9c14a0148804425e13c4fb20d61cce69f53106da/packaging-24.2.tar.gz", hash = "sha256:c228a6dc5e932d346bc5739379109d49e8853dd8223571c7c5b55260edc0b97f", size = 163950 }
209 | wheels = [
210 | { url = "https://files.pythonhosted.org/packages/88/ef/eb23f262cca3c0c4eb7ab1933c3b1f03d021f2c48f54763065b6f0e321be/packaging-24.2-py3-none-any.whl", hash = "sha256:09abb1bccd265c01f4a3aa3f7a7db064b36514d2cba19a2f694fe6150451a759", size = 65451 },
211 | ]
212 |
213 | [[package]]
214 | name = "pydantic"
215 | version = "2.10.6"
216 | source = { registry = "https://pypi.org/simple" }
217 | dependencies = [
218 | { name = "annotated-types" },
219 | { name = "pydantic-core" },
220 | { name = "typing-extensions" },
221 | ]
222 | sdist = { url = "https://files.pythonhosted.org/packages/b7/ae/d5220c5c52b158b1de7ca89fc5edb72f304a70a4c540c84c8844bf4008de/pydantic-2.10.6.tar.gz", hash = "sha256:ca5daa827cce33de7a42be142548b0096bf05a7e7b365aebfa5f8eeec7128236", size = 761681 }
223 | wheels = [
224 | { url = "https://files.pythonhosted.org/packages/f4/3c/8cc1cc84deffa6e25d2d0c688ebb80635dfdbf1dbea3e30c541c8cf4d860/pydantic-2.10.6-py3-none-any.whl", hash = "sha256:427d664bf0b8a2b34ff5dd0f5a18df00591adcee7198fbd71981054cef37b584", size = 431696 },
225 | ]
226 |
227 | [[package]]
228 | name = "pydantic-core"
229 | version = "2.27.2"
230 | source = { registry = "https://pypi.org/simple" }
231 | dependencies = [
232 | { name = "typing-extensions" },
233 | ]
234 | sdist = { url = "https://files.pythonhosted.org/packages/fc/01/f3e5ac5e7c25833db5eb555f7b7ab24cd6f8c322d3a3ad2d67a952dc0abc/pydantic_core-2.27.2.tar.gz", hash = "sha256:eb026e5a4c1fee05726072337ff51d1efb6f59090b7da90d30ea58625b1ffb39", size = 413443 }
235 | wheels = [
236 | { url = "https://files.pythonhosted.org/packages/41/b1/9bc383f48f8002f99104e3acff6cba1231b29ef76cfa45d1506a5cad1f84/pydantic_core-2.27.2-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:7d14bd329640e63852364c306f4d23eb744e0f8193148d4044dd3dacdaacbd8b", size = 1892709 },
237 | { url = "https://files.pythonhosted.org/packages/10/6c/e62b8657b834f3eb2961b49ec8e301eb99946245e70bf42c8817350cbefc/pydantic_core-2.27.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:82f91663004eb8ed30ff478d77c4d1179b3563df6cdb15c0817cd1cdaf34d154", size = 1811273 },
238 | { url = "https://files.pythonhosted.org/packages/ba/15/52cfe49c8c986e081b863b102d6b859d9defc63446b642ccbbb3742bf371/pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:71b24c7d61131bb83df10cc7e687433609963a944ccf45190cfc21e0887b08c9", size = 1823027 },
239 | { url = "https://files.pythonhosted.org/packages/b1/1c/b6f402cfc18ec0024120602bdbcebc7bdd5b856528c013bd4d13865ca473/pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:fa8e459d4954f608fa26116118bb67f56b93b209c39b008277ace29937453dc9", size = 1868888 },
240 | { url = "https://files.pythonhosted.org/packages/bd/7b/8cb75b66ac37bc2975a3b7de99f3c6f355fcc4d89820b61dffa8f1e81677/pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ce8918cbebc8da707ba805b7fd0b382816858728ae7fe19a942080c24e5b7cd1", size = 2037738 },
241 | { url = "https://files.pythonhosted.org/packages/c8/f1/786d8fe78970a06f61df22cba58e365ce304bf9b9f46cc71c8c424e0c334/pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:eda3f5c2a021bbc5d976107bb302e0131351c2ba54343f8a496dc8783d3d3a6a", size = 2685138 },
242 | { url = "https://files.pythonhosted.org/packages/a6/74/d12b2cd841d8724dc8ffb13fc5cef86566a53ed358103150209ecd5d1999/pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bd8086fa684c4775c27f03f062cbb9eaa6e17f064307e86b21b9e0abc9c0f02e", size = 1997025 },
243 | { url = "https://files.pythonhosted.org/packages/a0/6e/940bcd631bc4d9a06c9539b51f070b66e8f370ed0933f392db6ff350d873/pydantic_core-2.27.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:8d9b3388db186ba0c099a6d20f0604a44eabdeef1777ddd94786cdae158729e4", size = 2004633 },
244 | { url = "https://files.pythonhosted.org/packages/50/cc/a46b34f1708d82498c227d5d80ce615b2dd502ddcfd8376fc14a36655af1/pydantic_core-2.27.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:7a66efda2387de898c8f38c0cf7f14fca0b51a8ef0b24bfea5849f1b3c95af27", size = 1999404 },
245 | { url = "https://files.pythonhosted.org/packages/ca/2d/c365cfa930ed23bc58c41463bae347d1005537dc8db79e998af8ba28d35e/pydantic_core-2.27.2-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:18a101c168e4e092ab40dbc2503bdc0f62010e95d292b27827871dc85450d7ee", size = 2130130 },
246 | { url = "https://files.pythonhosted.org/packages/f4/d7/eb64d015c350b7cdb371145b54d96c919d4db516817f31cd1c650cae3b21/pydantic_core-2.27.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:ba5dd002f88b78a4215ed2f8ddbdf85e8513382820ba15ad5ad8955ce0ca19a1", size = 2157946 },
247 | { url = "https://files.pythonhosted.org/packages/a4/99/bddde3ddde76c03b65dfd5a66ab436c4e58ffc42927d4ff1198ffbf96f5f/pydantic_core-2.27.2-cp313-cp313-win32.whl", hash = "sha256:1ebaf1d0481914d004a573394f4be3a7616334be70261007e47c2a6fe7e50130", size = 1834387 },
248 | { url = "https://files.pythonhosted.org/packages/71/47/82b5e846e01b26ac6f1893d3c5f9f3a2eb6ba79be26eef0b759b4fe72946/pydantic_core-2.27.2-cp313-cp313-win_amd64.whl", hash = "sha256:953101387ecf2f5652883208769a79e48db18c6df442568a0b5ccd8c2723abee", size = 1990453 },
249 | { url = "https://files.pythonhosted.org/packages/51/b2/b2b50d5ecf21acf870190ae5d093602d95f66c9c31f9d5de6062eb329ad1/pydantic_core-2.27.2-cp313-cp313-win_arm64.whl", hash = "sha256:ac4dbfd1691affb8f48c2c13241a2e3b60ff23247cbcf981759c768b6633cf8b", size = 1885186 },
250 | ]
251 |
252 | [[package]]
253 | name = "pydantic-settings"
254 | version = "2.8.1"
255 | source = { registry = "https://pypi.org/simple" }
256 | dependencies = [
257 | { name = "pydantic" },
258 | { name = "python-dotenv" },
259 | ]
260 | sdist = { url = "https://files.pythonhosted.org/packages/88/82/c79424d7d8c29b994fb01d277da57b0a9b09cc03c3ff875f9bd8a86b2145/pydantic_settings-2.8.1.tar.gz", hash = "sha256:d5c663dfbe9db9d5e1c646b2e161da12f0d734d422ee56f567d0ea2cee4e8585", size = 83550 }
261 | wheels = [
262 | { url = "https://files.pythonhosted.org/packages/0b/53/a64f03044927dc47aafe029c42a5b7aabc38dfb813475e0e1bf71c4a59d0/pydantic_settings-2.8.1-py3-none-any.whl", hash = "sha256:81942d5ac3d905f7f3ee1a70df5dfb62d5569c12f51a5a647defc1c3d9ee2e9c", size = 30839 },
263 | ]
264 |
265 | [[package]]
266 | name = "python-dotenv"
267 | version = "1.0.1"
268 | source = { registry = "https://pypi.org/simple" }
269 | sdist = { url = "https://files.pythonhosted.org/packages/bc/57/e84d88dfe0aec03b7a2d4327012c1627ab5f03652216c63d49846d7a6c58/python-dotenv-1.0.1.tar.gz", hash = "sha256:e324ee90a023d808f1959c46bcbc04446a10ced277783dc6ee09987c37ec10ca", size = 39115 }
270 | wheels = [
271 | { url = "https://files.pythonhosted.org/packages/6a/3e/b68c118422ec867fa7ab88444e1274aa40681c606d59ac27de5a5588f082/python_dotenv-1.0.1-py3-none-any.whl", hash = "sha256:f7b63ef50f1b690dddf550d03497b66d609393b40b564ed0d674909a68ebf16a", size = 19863 },
272 | ]
273 |
274 | [[package]]
275 | name = "pyyaml"
276 | version = "6.0.2"
277 | source = { registry = "https://pypi.org/simple" }
278 | sdist = { url = "https://files.pythonhosted.org/packages/54/ed/79a089b6be93607fa5cdaedf301d7dfb23af5f25c398d5ead2525b063e17/pyyaml-6.0.2.tar.gz", hash = "sha256:d584d9ec91ad65861cc08d42e834324ef890a082e591037abe114850ff7bbc3e", size = 130631 }
279 | wheels = [
280 | { url = "https://files.pythonhosted.org/packages/ef/e3/3af305b830494fa85d95f6d95ef7fa73f2ee1cc8ef5b495c7c3269fb835f/PyYAML-6.0.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:efdca5630322a10774e8e98e1af481aad470dd62c3170801852d752aa7a783ba", size = 181309 },
281 | { url = "https://files.pythonhosted.org/packages/45/9f/3b1c20a0b7a3200524eb0076cc027a970d320bd3a6592873c85c92a08731/PyYAML-6.0.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:50187695423ffe49e2deacb8cd10510bc361faac997de9efef88badc3bb9e2d1", size = 171679 },
282 | { url = "https://files.pythonhosted.org/packages/7c/9a/337322f27005c33bcb656c655fa78325b730324c78620e8328ae28b64d0c/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0ffe8360bab4910ef1b9e87fb812d8bc0a308b0d0eef8c8f44e0254ab3b07133", size = 733428 },
283 | { url = "https://files.pythonhosted.org/packages/a3/69/864fbe19e6c18ea3cc196cbe5d392175b4cf3d5d0ac1403ec3f2d237ebb5/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:17e311b6c678207928d649faa7cb0d7b4c26a0ba73d41e99c4fff6b6c3276484", size = 763361 },
284 | { url = "https://files.pythonhosted.org/packages/04/24/b7721e4845c2f162d26f50521b825fb061bc0a5afcf9a386840f23ea19fa/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:70b189594dbe54f75ab3a1acec5f1e3faa7e8cf2f1e08d9b561cb41b845f69d5", size = 759523 },
285 | { url = "https://files.pythonhosted.org/packages/2b/b2/e3234f59ba06559c6ff63c4e10baea10e5e7df868092bf9ab40e5b9c56b6/PyYAML-6.0.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:41e4e3953a79407c794916fa277a82531dd93aad34e29c2a514c2c0c5fe971cc", size = 726660 },
286 | { url = "https://files.pythonhosted.org/packages/fe/0f/25911a9f080464c59fab9027482f822b86bf0608957a5fcc6eaac85aa515/PyYAML-6.0.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:68ccc6023a3400877818152ad9a1033e3db8625d899c72eacb5a668902e4d652", size = 751597 },
287 | { url = "https://files.pythonhosted.org/packages/14/0d/e2c3b43bbce3cf6bd97c840b46088a3031085179e596d4929729d8d68270/PyYAML-6.0.2-cp313-cp313-win32.whl", hash = "sha256:bc2fa7c6b47d6bc618dd7fb02ef6fdedb1090ec036abab80d4681424b84c1183", size = 140527 },
288 | { url = "https://files.pythonhosted.org/packages/fa/de/02b54f42487e3d3c6efb3f89428677074ca7bf43aae402517bc7cca949f3/PyYAML-6.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:8388ee1976c416731879ac16da0aff3f63b286ffdd57cdeb95f3f2e085687563", size = 156446 },
289 | ]
290 |
291 | [[package]]
292 | name = "requests"
293 | version = "2.32.3"
294 | source = { registry = "https://pypi.org/simple" }
295 | dependencies = [
296 | { name = "certifi" },
297 | { name = "charset-normalizer" },
298 | { name = "idna" },
299 | { name = "urllib3" },
300 | ]
301 | sdist = { url = "https://files.pythonhosted.org/packages/63/70/2bf7780ad2d390a8d301ad0b550f1581eadbd9a20f896afe06353c2a2913/requests-2.32.3.tar.gz", hash = "sha256:55365417734eb18255590a9ff9eb97e9e1da868d4ccd6402399eaf68af20a760", size = 131218 }
302 | wheels = [
303 | { url = "https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl", hash = "sha256:70761cfe03c773ceb22aa2f671b4757976145175cdfca038c02654d061d6dcc6", size = 64928 },
304 | ]
305 |
306 | [[package]]
307 | name = "sniffio"
308 | version = "1.3.1"
309 | source = { registry = "https://pypi.org/simple" }
310 | sdist = { url = "https://files.pythonhosted.org/packages/a2/87/a6771e1546d97e7e041b6ae58d80074f81b7d5121207425c964ddf5cfdbd/sniffio-1.3.1.tar.gz", hash = "sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc", size = 20372 }
311 | wheels = [
312 | { url = "https://files.pythonhosted.org/packages/e9/44/75a9c9421471a6c4805dbf2356f7c181a29c1879239abab1ea2cc8f38b40/sniffio-1.3.1-py3-none-any.whl", hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2", size = 10235 },
313 | ]
314 |
315 | [[package]]
316 | name = "sse-starlette"
317 | version = "2.2.1"
318 | source = { registry = "https://pypi.org/simple" }
319 | dependencies = [
320 | { name = "anyio" },
321 | { name = "starlette" },
322 | ]
323 | sdist = { url = "https://files.pythonhosted.org/packages/71/a4/80d2a11af59fe75b48230846989e93979c892d3a20016b42bb44edb9e398/sse_starlette-2.2.1.tar.gz", hash = "sha256:54470d5f19274aeed6b2d473430b08b4b379ea851d953b11d7f1c4a2c118b419", size = 17376 }
324 | wheels = [
325 | { url = "https://files.pythonhosted.org/packages/d9/e0/5b8bd393f27f4a62461c5cf2479c75a2cc2ffa330976f9f00f5f6e4f50eb/sse_starlette-2.2.1-py3-none-any.whl", hash = "sha256:6410a3d3ba0c89e7675d4c273a301d64649c03a5ef1ca101f10b47f895fd0e99", size = 10120 },
326 | ]
327 |
328 | [[package]]
329 | name = "starlette"
330 | version = "0.46.1"
331 | source = { registry = "https://pypi.org/simple" }
332 | dependencies = [
333 | { name = "anyio" },
334 | ]
335 | sdist = { url = "https://files.pythonhosted.org/packages/04/1b/52b27f2e13ceedc79a908e29eac426a63465a1a01248e5f24aa36a62aeb3/starlette-0.46.1.tar.gz", hash = "sha256:3c88d58ee4bd1bb807c0d1acb381838afc7752f9ddaec81bbe4383611d833230", size = 2580102 }
336 | wheels = [
337 | { url = "https://files.pythonhosted.org/packages/a0/4b/528ccf7a982216885a1ff4908e886b8fb5f19862d1962f56a3fce2435a70/starlette-0.46.1-py3-none-any.whl", hash = "sha256:77c74ed9d2720138b25875133f3a2dae6d854af2ec37dceb56aef370c1d8a227", size = 71995 },
338 | ]
339 |
340 | [[package]]
341 | name = "tqdm"
342 | version = "4.67.1"
343 | source = { registry = "https://pypi.org/simple" }
344 | dependencies = [
345 | { name = "colorama", marker = "sys_platform == 'win32'" },
346 | ]
347 | sdist = { url = "https://files.pythonhosted.org/packages/a8/4b/29b4ef32e036bb34e4ab51796dd745cdba7ed47ad142a9f4a1eb8e0c744d/tqdm-4.67.1.tar.gz", hash = "sha256:f8aef9c52c08c13a65f30ea34f4e5aac3fd1a34959879d7e59e63027286627f2", size = 169737 }
348 | wheels = [
349 | { url = "https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl", hash = "sha256:26445eca388f82e72884e0d580d5464cd801a3ea01e63e5601bdff9ba6a48de2", size = 78540 },
350 | ]
351 |
352 | [[package]]
353 | name = "typing-extensions"
354 | version = "4.12.2"
355 | source = { registry = "https://pypi.org/simple" }
356 | sdist = { url = "https://files.pythonhosted.org/packages/df/db/f35a00659bc03fec321ba8bce9420de607a1d37f8342eee1863174c69557/typing_extensions-4.12.2.tar.gz", hash = "sha256:1a7ead55c7e559dd4dee8856e3a88b41225abfe1ce8df57b7c13915fe121ffb8", size = 85321 }
357 | wheels = [
358 | { url = "https://files.pythonhosted.org/packages/26/9f/ad63fc0248c5379346306f8668cda6e2e2e9c95e01216d2b8ffd9ff037d0/typing_extensions-4.12.2-py3-none-any.whl", hash = "sha256:04e5ca0351e0f3f85c6853954072df659d0d13fac324d0072316b67d7794700d", size = 37438 },
359 | ]
360 |
361 | [[package]]
362 | name = "urllib3"
363 | version = "2.3.0"
364 | source = { registry = "https://pypi.org/simple" }
365 | sdist = { url = "https://files.pythonhosted.org/packages/aa/63/e53da845320b757bf29ef6a9062f5c669fe997973f966045cb019c3f4b66/urllib3-2.3.0.tar.gz", hash = "sha256:f8c5449b3cf0861679ce7e0503c7b44b5ec981bec0d1d3795a07f1ba96f0204d", size = 307268 }
366 | wheels = [
367 | { url = "https://files.pythonhosted.org/packages/c8/19/4ec628951a74043532ca2cf5d97b7b14863931476d117c471e8e2b1eb39f/urllib3-2.3.0-py3-none-any.whl", hash = "sha256:1cee9ad369867bfdbbb48b7dd50374c0967a0bb7710050facf0dd6911440e3df", size = 128369 },
368 | ]
369 |
370 | [[package]]
371 | name = "uvicorn"
372 | version = "0.34.0"
373 | source = { registry = "https://pypi.org/simple" }
374 | dependencies = [
375 | { name = "click" },
376 | { name = "h11" },
377 | ]
378 | sdist = { url = "https://files.pythonhosted.org/packages/4b/4d/938bd85e5bf2edeec766267a5015ad969730bb91e31b44021dfe8b22df6c/uvicorn-0.34.0.tar.gz", hash = "sha256:404051050cd7e905de2c9a7e61790943440b3416f49cb409f965d9dcd0fa73e9", size = 76568 }
379 | wheels = [
380 | { url = "https://files.pythonhosted.org/packages/61/14/33a3a1352cfa71812a3a21e8c9bfb83f60b0011f5e36f2b1399d51928209/uvicorn-0.34.0-py3-none-any.whl", hash = "sha256:023dc038422502fa28a09c7a30bf2b6991512da7dcdb8fd35fe57cfc154126f4", size = 62315 },
381 | ]
382 |
--------------------------------------------------------------------------------