├── .DS_Store ├── .gitignore ├── .python-version ├── LICENSE ├── README.md ├── addon.py ├── assets ├── addon-instructions.png └── hammer-icon.png ├── main.py ├── pyproject.toml ├── src └── blender_mcp │ ├── __init__.py │ └── server.py └── uv.lock /.DS_Store: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ahujasid/blender-mcp/972096e9dc0183c0ac72a5391a99c99533c7f783/.DS_Store -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | # Python-generated files 2 | __pycache__/ 3 | *.py[oc] 4 | build/ 5 | dist/ 6 | wheels/ 7 | *.egg-info 8 | 9 | # Virtual environments 10 | .venv 11 | -------------------------------------------------------------------------------- /.python-version: -------------------------------------------------------------------------------- 1 | 3.13.2 2 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2025 Siddharth Ahuja 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # BlenderMCP - Blender Model Context Protocol Integration 2 | 3 | BlenderMCP connects Blender to Claude AI through the Model Context Protocol (MCP), allowing Claude to directly interact with and control Blender. This integration enables prompt assisted 3D modeling, scene creation, and manipulation. 4 | 5 | [Full tutorial](https://www.youtube.com/watch?v=lCyQ717DuzQ) 6 | 7 | ### Join the Community 8 | 9 | Give feedback, get inspired, and build on top of the MCP: [Discord](https://discord.gg/z5apgR8TFU) 10 | 11 | ### Supporters 12 | 13 | **Top supporters:** 14 | 15 | [CodeRabbit](https://www.coderabbit.ai/) 16 | 17 | **All supporters:** 18 | 19 | [Support this project](https://github.com/sponsors/ahujasid) 20 | 21 | ## Release notes (1.1.0) 22 | 23 | - Added support for Poly Haven assets through their API 24 | - Added support to prompt 3D models using Hyper3D Rodin 25 | - For newcomers, you can go straight to Installation. For existing users, see the points below 26 | - Download the latest addon.py file and replace the older one, then add it to Blender 27 | - Delete the MCP server from Claude and add it back again, and you should be good to go! 28 | 29 | ## Features 30 | 31 | - **Two-way communication**: Connect Claude AI to Blender through a socket-based server 32 | - **Object manipulation**: Create, modify, and delete 3D objects in Blender 33 | - **Material control**: Apply and modify materials and colors 34 | - **Scene inspection**: Get detailed information about the current Blender scene 35 | - **Code execution**: Run arbitrary Python code in Blender from Claude 36 | 37 | ## Components 38 | 39 | The system consists of two main components: 40 | 41 | 1. **Blender Addon (`addon.py`)**: A Blender addon that creates a socket server within Blender to receive and execute commands 42 | 2. **MCP Server (`src/blender_mcp/server.py`)**: A Python server that implements the Model Context Protocol and connects to the Blender addon 43 | 44 | ## Installation 45 | 46 | 47 | ### Prerequisites 48 | 49 | - Blender 3.0 or newer 50 | - Python 3.10 or newer 51 | - uv package manager: 52 | 53 | **If you're on Mac, please install uv as** 54 | ```bash 55 | brew install uv 56 | ``` 57 | **On Windows** 58 | ```bash 59 | powershell -c "irm https://astral.sh/uv/install.ps1 | iex" 60 | ``` 61 | and then 62 | ```bash 63 | set Path=C:\Users\nntra\.local\bin;%Path% 64 | ``` 65 | 66 | Otherwise installation instructions are on their website: [Install uv](https://docs.astral.sh/uv/getting-started/installation/) 67 | 68 | **⚠️ Do not proceed before installing UV** 69 | 70 | 71 | ### Claude for Desktop Integration 72 | 73 | [Watch the setup instruction video](https://www.youtube.com/watch?v=neoK_WMq92g) (Assuming you have already installed uv) 74 | 75 | Go to Claude > Settings > Developer > Edit Config > claude_desktop_config.json to include the following: 76 | 77 | ```json 78 | { 79 | "mcpServers": { 80 | "blender": { 81 | "command": "uvx", 82 | "args": [ 83 | "blender-mcp" 84 | ] 85 | } 86 | } 87 | } 88 | ``` 89 | 90 | ### Cursor integration 91 | 92 | For Mac users, go to Settings > MCP and paste the following 93 | 94 | - To use as a global server, use "add new global MCP server" button and paste 95 | - To use as a project specific server, create `.cursor/mcp.json` in the root of the project and paste 96 | 97 | 98 | ```json 99 | { 100 | "mcpServers": { 101 | "blender": { 102 | "command": "uvx", 103 | "args": [ 104 | "blender-mcp" 105 | ] 106 | } 107 | } 108 | } 109 | ``` 110 | 111 | For Windows users, go to Settings > MCP > Add Server, add a new server with the following settings: 112 | 113 | ```json 114 | { 115 | "mcpServers": { 116 | "blender": { 117 | "command": "cmd", 118 | "args": [ 119 | "/c", 120 | "uvx", 121 | "blender-mcp" 122 | ] 123 | } 124 | } 125 | } 126 | ``` 127 | 128 | [Cursor setup video](https://www.youtube.com/watch?v=wgWsJshecac) 129 | 130 | **⚠️ Only run one instance of the MCP server (either on Cursor or Claude Desktop), not both** 131 | 132 | ### Installing the Blender Addon 133 | 134 | 1. Download the `addon.py` file from this repo 135 | 1. Open Blender 136 | 2. Go to Edit > Preferences > Add-ons 137 | 3. Click "Install..." and select the `addon.py` file 138 | 4. Enable the addon by checking the box next to "Interface: Blender MCP" 139 | 140 | 141 | ## Usage 142 | 143 | ### Starting the Connection 144 | ![BlenderMCP in the sidebar](assets/addon-instructions.png) 145 | 146 | 1. In Blender, go to the 3D View sidebar (press N if not visible) 147 | 2. Find the "BlenderMCP" tab 148 | 3. Turn on the Poly Haven checkbox if you want assets from their API (optional) 149 | 4. Click "Connect to Claude" 150 | 5. Make sure the MCP server is running in your terminal 151 | 152 | ### Using with Claude 153 | 154 | Once the config file has been set on Claude, and the addon is running on Blender, you will see a hammer icon with tools for the Blender MCP. 155 | 156 | ![BlenderMCP in the sidebar](assets/hammer-icon.png) 157 | 158 | #### Capabilities 159 | 160 | - Get scene and object information 161 | - Create, delete and modify shapes 162 | - Apply or create materials for objects 163 | - Execute any Python code in Blender 164 | - Download the right models, assets and HDRIs through [Poly Haven](https://polyhaven.com/) 165 | - AI generated 3D models through [Hyper3D Rodin](https://hyper3d.ai/) 166 | 167 | 168 | ### Example Commands 169 | 170 | Here are some examples of what you can ask Claude to do: 171 | 172 | - "Create a low poly scene in a dungeon, with a dragon guarding a pot of gold" [Demo](https://www.youtube.com/watch?v=DqgKuLYUv00) 173 | - "Create a beach vibe using HDRIs, textures, and models like rocks and vegetation from Poly Haven" [Demo](https://www.youtube.com/watch?v=I29rn92gkC4) 174 | - Give a reference image, and create a Blender scene out of it [Demo](https://www.youtube.com/watch?v=FDRb03XPiRo) 175 | - "Generate a 3D model of a garden gnome through Hyper3D" 176 | - "Get information about the current scene, and make a threejs sketch from it" [Demo](https://www.youtube.com/watch?v=jxbNI5L7AH8) 177 | - "Make this car red and metallic" 178 | - "Create a sphere and place it above the cube" 179 | - "Make the lighting like a studio" 180 | - "Point the camera at the scene, and make it isometric" 181 | 182 | ## Hyper3D integration 183 | 184 | Hyper3D's free trial key allows you to generate a limited number of models per day. If the daily limit is reached, you can wait for the next day's reset or obtain your own key from hyper3d.ai and fal.ai. 185 | 186 | ## Troubleshooting 187 | 188 | - **Connection issues**: Make sure the Blender addon server is running, and the MCP server is configured on Claude, DO NOT run the uvx command in the terminal. Sometimes, the first command won't go through but after that it starts working. 189 | - **Timeout errors**: Try simplifying your requests or breaking them into smaller steps 190 | - **Poly Haven integration**: Claude is sometimes erratic with its behaviour 191 | - **Have you tried turning it off and on again?**: If you're still having connection errors, try restarting both Claude and the Blender server 192 | 193 | 194 | ## Technical Details 195 | 196 | ### Communication Protocol 197 | 198 | The system uses a simple JSON-based protocol over TCP sockets: 199 | 200 | - **Commands** are sent as JSON objects with a `type` and optional `params` 201 | - **Responses** are JSON objects with a `status` and `result` or `message` 202 | 203 | ## Limitations & Security Considerations 204 | 205 | - The `execute_blender_code` tool allows running arbitrary Python code in Blender, which can be powerful but potentially dangerous. Use with caution in production environments. ALWAYS save your work before using it. 206 | - Poly Haven requires downloading models, textures, and HDRI images. If you do not want to use it, please turn it off in the checkbox in Blender. 207 | - Complex operations might need to be broken down into smaller steps 208 | 209 | 210 | ## Contributing 211 | 212 | Contributions are welcome! Please feel free to submit a Pull Request. 213 | 214 | ## Disclaimer 215 | 216 | This is a third-party integration and not made by Blender. Made by [Siddharth](https://x.com/sidahuj) 217 | -------------------------------------------------------------------------------- /addon.py: -------------------------------------------------------------------------------- 1 | # Code created by Siddharth Ahuja: www.github.com/ahujasid © 2025 2 | 3 | import bpy 4 | import mathutils 5 | import json 6 | import threading 7 | import socket 8 | import time 9 | import requests 10 | import tempfile 11 | import traceback 12 | import os 13 | import shutil 14 | from bpy.props import StringProperty, IntProperty, BoolProperty, EnumProperty 15 | import io 16 | from contextlib import redirect_stdout 17 | 18 | bl_info = { 19 | "name": "Blender MCP", 20 | "author": "BlenderMCP", 21 | "version": (1, 2), 22 | "blender": (3, 0, 0), 23 | "location": "View3D > Sidebar > BlenderMCP", 24 | "description": "Connect Blender to Claude via MCP", 25 | "category": "Interface", 26 | } 27 | 28 | RODIN_FREE_TRIAL_KEY = "k9TcfFoEhNd9cCPP2guHAHHHkctZHIRhZDywZ1euGUXwihbYLpOjQhofby80NJez" 29 | 30 | class BlenderMCPServer: 31 | def __init__(self, host='localhost', port=9876): 32 | self.host = host 33 | self.port = port 34 | self.running = False 35 | self.socket = None 36 | self.server_thread = None 37 | 38 | def start(self): 39 | if self.running: 40 | print("Server is already running") 41 | return 42 | 43 | self.running = True 44 | 45 | try: 46 | # Create socket 47 | self.socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM) 48 | self.socket.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1) 49 | self.socket.bind((self.host, self.port)) 50 | self.socket.listen(1) 51 | 52 | # Start server thread 53 | self.server_thread = threading.Thread(target=self._server_loop) 54 | self.server_thread.daemon = True 55 | self.server_thread.start() 56 | 57 | print(f"BlenderMCP server started on {self.host}:{self.port}") 58 | except Exception as e: 59 | print(f"Failed to start server: {str(e)}") 60 | self.stop() 61 | 62 | def stop(self): 63 | self.running = False 64 | 65 | # Close socket 66 | if self.socket: 67 | try: 68 | self.socket.close() 69 | except: 70 | pass 71 | self.socket = None 72 | 73 | # Wait for thread to finish 74 | if self.server_thread: 75 | try: 76 | if self.server_thread.is_alive(): 77 | self.server_thread.join(timeout=1.0) 78 | except: 79 | pass 80 | self.server_thread = None 81 | 82 | print("BlenderMCP server stopped") 83 | 84 | def _server_loop(self): 85 | """Main server loop in a separate thread""" 86 | print("Server thread started") 87 | self.socket.settimeout(1.0) # Timeout to allow for stopping 88 | 89 | while self.running: 90 | try: 91 | # Accept new connection 92 | try: 93 | client, address = self.socket.accept() 94 | print(f"Connected to client: {address}") 95 | 96 | # Handle client in a separate thread 97 | client_thread = threading.Thread( 98 | target=self._handle_client, 99 | args=(client,) 100 | ) 101 | client_thread.daemon = True 102 | client_thread.start() 103 | except socket.timeout: 104 | # Just check running condition 105 | continue 106 | except Exception as e: 107 | print(f"Error accepting connection: {str(e)}") 108 | time.sleep(0.5) 109 | except Exception as e: 110 | print(f"Error in server loop: {str(e)}") 111 | if not self.running: 112 | break 113 | time.sleep(0.5) 114 | 115 | print("Server thread stopped") 116 | 117 | def _handle_client(self, client): 118 | """Handle connected client""" 119 | print("Client handler started") 120 | client.settimeout(None) # No timeout 121 | buffer = b'' 122 | 123 | try: 124 | while self.running: 125 | # Receive data 126 | try: 127 | data = client.recv(8192) 128 | if not data: 129 | print("Client disconnected") 130 | break 131 | 132 | buffer += data 133 | try: 134 | # Try to parse command 135 | command = json.loads(buffer.decode('utf-8')) 136 | buffer = b'' 137 | 138 | # Execute command in Blender's main thread 139 | def execute_wrapper(): 140 | try: 141 | response = self.execute_command(command) 142 | response_json = json.dumps(response) 143 | try: 144 | client.sendall(response_json.encode('utf-8')) 145 | except: 146 | print("Failed to send response - client disconnected") 147 | except Exception as e: 148 | print(f"Error executing command: {str(e)}") 149 | traceback.print_exc() 150 | try: 151 | error_response = { 152 | "status": "error", 153 | "message": str(e) 154 | } 155 | client.sendall(json.dumps(error_response).encode('utf-8')) 156 | except: 157 | pass 158 | return None 159 | 160 | # Schedule execution in main thread 161 | bpy.app.timers.register(execute_wrapper, first_interval=0.0) 162 | except json.JSONDecodeError: 163 | # Incomplete data, wait for more 164 | pass 165 | except Exception as e: 166 | print(f"Error receiving data: {str(e)}") 167 | break 168 | except Exception as e: 169 | print(f"Error in client handler: {str(e)}") 170 | finally: 171 | try: 172 | client.close() 173 | except: 174 | pass 175 | print("Client handler stopped") 176 | 177 | def execute_command(self, command): 178 | """Execute a command in the main Blender thread""" 179 | try: 180 | return self._execute_command_internal(command) 181 | 182 | except Exception as e: 183 | print(f"Error executing command: {str(e)}") 184 | traceback.print_exc() 185 | return {"status": "error", "message": str(e)} 186 | 187 | def _execute_command_internal(self, command): 188 | """Internal command execution with proper context""" 189 | cmd_type = command.get("type") 190 | params = command.get("params", {}) 191 | 192 | # Add a handler for checking PolyHaven status 193 | if cmd_type == "get_polyhaven_status": 194 | return {"status": "success", "result": self.get_polyhaven_status()} 195 | 196 | # Base handlers that are always available 197 | handlers = { 198 | "get_scene_info": self.get_scene_info, 199 | "get_object_info": self.get_object_info, 200 | "execute_code": self.execute_code, 201 | "get_polyhaven_status": self.get_polyhaven_status, 202 | "get_hyper3d_status": self.get_hyper3d_status, 203 | } 204 | 205 | # Add Polyhaven handlers only if enabled 206 | if bpy.context.scene.blendermcp_use_polyhaven: 207 | polyhaven_handlers = { 208 | "get_polyhaven_categories": self.get_polyhaven_categories, 209 | "search_polyhaven_assets": self.search_polyhaven_assets, 210 | "download_polyhaven_asset": self.download_polyhaven_asset, 211 | "set_texture": self.set_texture, 212 | } 213 | handlers.update(polyhaven_handlers) 214 | 215 | # Add Hyper3d handlers only if enabled 216 | if bpy.context.scene.blendermcp_use_hyper3d: 217 | polyhaven_handlers = { 218 | "create_rodin_job": self.create_rodin_job, 219 | "poll_rodin_job_status": self.poll_rodin_job_status, 220 | "import_generated_asset": self.import_generated_asset, 221 | } 222 | handlers.update(polyhaven_handlers) 223 | 224 | handler = handlers.get(cmd_type) 225 | if handler: 226 | try: 227 | print(f"Executing handler for {cmd_type}") 228 | result = handler(**params) 229 | print(f"Handler execution complete") 230 | return {"status": "success", "result": result} 231 | except Exception as e: 232 | print(f"Error in handler: {str(e)}") 233 | traceback.print_exc() 234 | return {"status": "error", "message": str(e)} 235 | else: 236 | return {"status": "error", "message": f"Unknown command type: {cmd_type}"} 237 | 238 | 239 | 240 | def get_scene_info(self): 241 | """Get information about the current Blender scene""" 242 | try: 243 | print("Getting scene info...") 244 | # Simplify the scene info to reduce data size 245 | scene_info = { 246 | "name": bpy.context.scene.name, 247 | "object_count": len(bpy.context.scene.objects), 248 | "objects": [], 249 | "materials_count": len(bpy.data.materials), 250 | } 251 | 252 | # Collect minimal object information (limit to first 10 objects) 253 | for i, obj in enumerate(bpy.context.scene.objects): 254 | if i >= 10: # Reduced from 20 to 10 255 | break 256 | 257 | obj_info = { 258 | "name": obj.name, 259 | "type": obj.type, 260 | # Only include basic location data 261 | "location": [round(float(obj.location.x), 2), 262 | round(float(obj.location.y), 2), 263 | round(float(obj.location.z), 2)], 264 | } 265 | scene_info["objects"].append(obj_info) 266 | 267 | print(f"Scene info collected: {len(scene_info['objects'])} objects") 268 | return scene_info 269 | except Exception as e: 270 | print(f"Error in get_scene_info: {str(e)}") 271 | traceback.print_exc() 272 | return {"error": str(e)} 273 | 274 | @staticmethod 275 | def _get_aabb(obj): 276 | """ Returns the world-space axis-aligned bounding box (AABB) of an object. """ 277 | if obj.type != 'MESH': 278 | raise TypeError("Object must be a mesh") 279 | 280 | # Get the bounding box corners in local space 281 | local_bbox_corners = [mathutils.Vector(corner) for corner in obj.bound_box] 282 | 283 | # Convert to world coordinates 284 | world_bbox_corners = [obj.matrix_world @ corner for corner in local_bbox_corners] 285 | 286 | # Compute axis-aligned min/max coordinates 287 | min_corner = mathutils.Vector(map(min, zip(*world_bbox_corners))) 288 | max_corner = mathutils.Vector(map(max, zip(*world_bbox_corners))) 289 | 290 | return [ 291 | [*min_corner], [*max_corner] 292 | ] 293 | 294 | 295 | 296 | def get_object_info(self, name): 297 | """Get detailed information about a specific object""" 298 | obj = bpy.data.objects.get(name) 299 | if not obj: 300 | raise ValueError(f"Object not found: {name}") 301 | 302 | # Basic object info 303 | obj_info = { 304 | "name": obj.name, 305 | "type": obj.type, 306 | "location": [obj.location.x, obj.location.y, obj.location.z], 307 | "rotation": [obj.rotation_euler.x, obj.rotation_euler.y, obj.rotation_euler.z], 308 | "scale": [obj.scale.x, obj.scale.y, obj.scale.z], 309 | "visible": obj.visible_get(), 310 | "materials": [], 311 | } 312 | 313 | if obj.type == "MESH": 314 | bounding_box = self._get_aabb(obj) 315 | obj_info["world_bounding_box"] = bounding_box 316 | 317 | # Add material slots 318 | for slot in obj.material_slots: 319 | if slot.material: 320 | obj_info["materials"].append(slot.material.name) 321 | 322 | # Add mesh data if applicable 323 | if obj.type == 'MESH' and obj.data: 324 | mesh = obj.data 325 | obj_info["mesh"] = { 326 | "vertices": len(mesh.vertices), 327 | "edges": len(mesh.edges), 328 | "polygons": len(mesh.polygons), 329 | } 330 | 331 | return obj_info 332 | 333 | def execute_code(self, code): 334 | """Execute arbitrary Blender Python code""" 335 | # This is powerful but potentially dangerous - use with caution 336 | try: 337 | # Create a local namespace for execution 338 | namespace = {"bpy": bpy} 339 | 340 | # Capture stdout during execution, and return it as result 341 | capture_buffer = io.StringIO() 342 | with redirect_stdout(capture_buffer): 343 | exec(code, namespace) 344 | 345 | captured_output = capture_buffer.getvalue() 346 | return {"executed": True, "result": captured_output} 347 | except Exception as e: 348 | raise Exception(f"Code execution error: {str(e)}") 349 | 350 | 351 | 352 | def get_polyhaven_categories(self, asset_type): 353 | """Get categories for a specific asset type from Polyhaven""" 354 | try: 355 | if asset_type not in ["hdris", "textures", "models", "all"]: 356 | return {"error": f"Invalid asset type: {asset_type}. Must be one of: hdris, textures, models, all"} 357 | 358 | response = requests.get(f"https://api.polyhaven.com/categories/{asset_type}") 359 | if response.status_code == 200: 360 | return {"categories": response.json()} 361 | else: 362 | return {"error": f"API request failed with status code {response.status_code}"} 363 | except Exception as e: 364 | return {"error": str(e)} 365 | 366 | def search_polyhaven_assets(self, asset_type=None, categories=None): 367 | """Search for assets from Polyhaven with optional filtering""" 368 | try: 369 | url = "https://api.polyhaven.com/assets" 370 | params = {} 371 | 372 | if asset_type and asset_type != "all": 373 | if asset_type not in ["hdris", "textures", "models"]: 374 | return {"error": f"Invalid asset type: {asset_type}. Must be one of: hdris, textures, models, all"} 375 | params["type"] = asset_type 376 | 377 | if categories: 378 | params["categories"] = categories 379 | 380 | response = requests.get(url, params=params) 381 | if response.status_code == 200: 382 | # Limit the response size to avoid overwhelming Blender 383 | assets = response.json() 384 | # Return only the first 20 assets to keep response size manageable 385 | limited_assets = {} 386 | for i, (key, value) in enumerate(assets.items()): 387 | if i >= 20: # Limit to 20 assets 388 | break 389 | limited_assets[key] = value 390 | 391 | return {"assets": limited_assets, "total_count": len(assets), "returned_count": len(limited_assets)} 392 | else: 393 | return {"error": f"API request failed with status code {response.status_code}"} 394 | except Exception as e: 395 | return {"error": str(e)} 396 | 397 | def download_polyhaven_asset(self, asset_id, asset_type, resolution="1k", file_format=None): 398 | try: 399 | # First get the files information 400 | files_response = requests.get(f"https://api.polyhaven.com/files/{asset_id}") 401 | if files_response.status_code != 200: 402 | return {"error": f"Failed to get asset files: {files_response.status_code}"} 403 | 404 | files_data = files_response.json() 405 | 406 | # Handle different asset types 407 | if asset_type == "hdris": 408 | # For HDRIs, download the .hdr or .exr file 409 | if not file_format: 410 | file_format = "hdr" # Default format for HDRIs 411 | 412 | if "hdri" in files_data and resolution in files_data["hdri"] and file_format in files_data["hdri"][resolution]: 413 | file_info = files_data["hdri"][resolution][file_format] 414 | file_url = file_info["url"] 415 | 416 | # For HDRIs, we need to save to a temporary file first 417 | # since Blender can't properly load HDR data directly from memory 418 | with tempfile.NamedTemporaryFile(suffix=f".{file_format}", delete=False) as tmp_file: 419 | # Download the file 420 | response = requests.get(file_url) 421 | if response.status_code != 200: 422 | return {"error": f"Failed to download HDRI: {response.status_code}"} 423 | 424 | tmp_file.write(response.content) 425 | tmp_path = tmp_file.name 426 | 427 | try: 428 | # Create a new world if none exists 429 | if not bpy.data.worlds: 430 | bpy.data.worlds.new("World") 431 | 432 | world = bpy.data.worlds[0] 433 | world.use_nodes = True 434 | node_tree = world.node_tree 435 | 436 | # Clear existing nodes 437 | for node in node_tree.nodes: 438 | node_tree.nodes.remove(node) 439 | 440 | # Create nodes 441 | tex_coord = node_tree.nodes.new(type='ShaderNodeTexCoord') 442 | tex_coord.location = (-800, 0) 443 | 444 | mapping = node_tree.nodes.new(type='ShaderNodeMapping') 445 | mapping.location = (-600, 0) 446 | 447 | # Load the image from the temporary file 448 | env_tex = node_tree.nodes.new(type='ShaderNodeTexEnvironment') 449 | env_tex.location = (-400, 0) 450 | env_tex.image = bpy.data.images.load(tmp_path) 451 | 452 | # Use a color space that exists in all Blender versions 453 | if file_format.lower() == 'exr': 454 | # Try to use Linear color space for EXR files 455 | try: 456 | env_tex.image.colorspace_settings.name = 'Linear' 457 | except: 458 | # Fallback to Non-Color if Linear isn't available 459 | env_tex.image.colorspace_settings.name = 'Non-Color' 460 | else: # hdr 461 | # For HDR files, try these options in order 462 | for color_space in ['Linear', 'Linear Rec.709', 'Non-Color']: 463 | try: 464 | env_tex.image.colorspace_settings.name = color_space 465 | break # Stop if we successfully set a color space 466 | except: 467 | continue 468 | 469 | background = node_tree.nodes.new(type='ShaderNodeBackground') 470 | background.location = (-200, 0) 471 | 472 | output = node_tree.nodes.new(type='ShaderNodeOutputWorld') 473 | output.location = (0, 0) 474 | 475 | # Connect nodes 476 | node_tree.links.new(tex_coord.outputs['Generated'], mapping.inputs['Vector']) 477 | node_tree.links.new(mapping.outputs['Vector'], env_tex.inputs['Vector']) 478 | node_tree.links.new(env_tex.outputs['Color'], background.inputs['Color']) 479 | node_tree.links.new(background.outputs['Background'], output.inputs['Surface']) 480 | 481 | # Set as active world 482 | bpy.context.scene.world = world 483 | 484 | # Clean up temporary file 485 | try: 486 | tempfile._cleanup() # This will clean up all temporary files 487 | except: 488 | pass 489 | 490 | return { 491 | "success": True, 492 | "message": f"HDRI {asset_id} imported successfully", 493 | "image_name": env_tex.image.name 494 | } 495 | except Exception as e: 496 | return {"error": f"Failed to set up HDRI in Blender: {str(e)}"} 497 | else: 498 | return {"error": f"Requested resolution or format not available for this HDRI"} 499 | 500 | elif asset_type == "textures": 501 | if not file_format: 502 | file_format = "jpg" # Default format for textures 503 | 504 | downloaded_maps = {} 505 | 506 | try: 507 | for map_type in files_data: 508 | if map_type not in ["blend", "gltf"]: # Skip non-texture files 509 | if resolution in files_data[map_type] and file_format in files_data[map_type][resolution]: 510 | file_info = files_data[map_type][resolution][file_format] 511 | file_url = file_info["url"] 512 | 513 | # Use NamedTemporaryFile like we do for HDRIs 514 | with tempfile.NamedTemporaryFile(suffix=f".{file_format}", delete=False) as tmp_file: 515 | # Download the file 516 | response = requests.get(file_url) 517 | if response.status_code == 200: 518 | tmp_file.write(response.content) 519 | tmp_path = tmp_file.name 520 | 521 | # Load image from temporary file 522 | image = bpy.data.images.load(tmp_path) 523 | image.name = f"{asset_id}_{map_type}.{file_format}" 524 | 525 | # Pack the image into .blend file 526 | image.pack() 527 | 528 | # Set color space based on map type 529 | if map_type in ['color', 'diffuse', 'albedo']: 530 | try: 531 | image.colorspace_settings.name = 'sRGB' 532 | except: 533 | pass 534 | else: 535 | try: 536 | image.colorspace_settings.name = 'Non-Color' 537 | except: 538 | pass 539 | 540 | downloaded_maps[map_type] = image 541 | 542 | # Clean up temporary file 543 | try: 544 | os.unlink(tmp_path) 545 | except: 546 | pass 547 | 548 | if not downloaded_maps: 549 | return {"error": f"No texture maps found for the requested resolution and format"} 550 | 551 | # Create a new material with the downloaded textures 552 | mat = bpy.data.materials.new(name=asset_id) 553 | mat.use_nodes = True 554 | nodes = mat.node_tree.nodes 555 | links = mat.node_tree.links 556 | 557 | # Clear default nodes 558 | for node in nodes: 559 | nodes.remove(node) 560 | 561 | # Create output node 562 | output = nodes.new(type='ShaderNodeOutputMaterial') 563 | output.location = (300, 0) 564 | 565 | # Create principled BSDF node 566 | principled = nodes.new(type='ShaderNodeBsdfPrincipled') 567 | principled.location = (0, 0) 568 | links.new(principled.outputs[0], output.inputs[0]) 569 | 570 | # Add texture nodes based on available maps 571 | tex_coord = nodes.new(type='ShaderNodeTexCoord') 572 | tex_coord.location = (-800, 0) 573 | 574 | mapping = nodes.new(type='ShaderNodeMapping') 575 | mapping.location = (-600, 0) 576 | mapping.vector_type = 'TEXTURE' # Changed from default 'POINT' to 'TEXTURE' 577 | links.new(tex_coord.outputs['UV'], mapping.inputs['Vector']) 578 | 579 | # Position offset for texture nodes 580 | x_pos = -400 581 | y_pos = 300 582 | 583 | # Connect different texture maps 584 | for map_type, image in downloaded_maps.items(): 585 | tex_node = nodes.new(type='ShaderNodeTexImage') 586 | tex_node.location = (x_pos, y_pos) 587 | tex_node.image = image 588 | 589 | # Set color space based on map type 590 | if map_type.lower() in ['color', 'diffuse', 'albedo']: 591 | try: 592 | tex_node.image.colorspace_settings.name = 'sRGB' 593 | except: 594 | pass # Use default if sRGB not available 595 | else: 596 | try: 597 | tex_node.image.colorspace_settings.name = 'Non-Color' 598 | except: 599 | pass # Use default if Non-Color not available 600 | 601 | links.new(mapping.outputs['Vector'], tex_node.inputs['Vector']) 602 | 603 | # Connect to appropriate input on Principled BSDF 604 | if map_type.lower() in ['color', 'diffuse', 'albedo']: 605 | links.new(tex_node.outputs['Color'], principled.inputs['Base Color']) 606 | elif map_type.lower() in ['roughness', 'rough']: 607 | links.new(tex_node.outputs['Color'], principled.inputs['Roughness']) 608 | elif map_type.lower() in ['metallic', 'metalness', 'metal']: 609 | links.new(tex_node.outputs['Color'], principled.inputs['Metallic']) 610 | elif map_type.lower() in ['normal', 'nor']: 611 | # Add normal map node 612 | normal_map = nodes.new(type='ShaderNodeNormalMap') 613 | normal_map.location = (x_pos + 200, y_pos) 614 | links.new(tex_node.outputs['Color'], normal_map.inputs['Color']) 615 | links.new(normal_map.outputs['Normal'], principled.inputs['Normal']) 616 | elif map_type in ['displacement', 'disp', 'height']: 617 | # Add displacement node 618 | disp_node = nodes.new(type='ShaderNodeDisplacement') 619 | disp_node.location = (x_pos + 200, y_pos - 200) 620 | links.new(tex_node.outputs['Color'], disp_node.inputs['Height']) 621 | links.new(disp_node.outputs['Displacement'], output.inputs['Displacement']) 622 | 623 | y_pos -= 250 624 | 625 | return { 626 | "success": True, 627 | "message": f"Texture {asset_id} imported as material", 628 | "material": mat.name, 629 | "maps": list(downloaded_maps.keys()) 630 | } 631 | 632 | except Exception as e: 633 | return {"error": f"Failed to process textures: {str(e)}"} 634 | 635 | elif asset_type == "models": 636 | # For models, prefer glTF format if available 637 | if not file_format: 638 | file_format = "gltf" # Default format for models 639 | 640 | if file_format in files_data and resolution in files_data[file_format]: 641 | file_info = files_data[file_format][resolution][file_format] 642 | file_url = file_info["url"] 643 | 644 | # Create a temporary directory to store the model and its dependencies 645 | temp_dir = tempfile.mkdtemp() 646 | main_file_path = "" 647 | 648 | try: 649 | # Download the main model file 650 | main_file_name = file_url.split("/")[-1] 651 | main_file_path = os.path.join(temp_dir, main_file_name) 652 | 653 | response = requests.get(file_url) 654 | if response.status_code != 200: 655 | return {"error": f"Failed to download model: {response.status_code}"} 656 | 657 | with open(main_file_path, "wb") as f: 658 | f.write(response.content) 659 | 660 | # Check for included files and download them 661 | if "include" in file_info and file_info["include"]: 662 | for include_path, include_info in file_info["include"].items(): 663 | # Get the URL for the included file - this is the fix 664 | include_url = include_info["url"] 665 | 666 | # Create the directory structure for the included file 667 | include_file_path = os.path.join(temp_dir, include_path) 668 | os.makedirs(os.path.dirname(include_file_path), exist_ok=True) 669 | 670 | # Download the included file 671 | include_response = requests.get(include_url) 672 | if include_response.status_code == 200: 673 | with open(include_file_path, "wb") as f: 674 | f.write(include_response.content) 675 | else: 676 | print(f"Failed to download included file: {include_path}") 677 | 678 | # Import the model into Blender 679 | if file_format == "gltf" or file_format == "glb": 680 | bpy.ops.import_scene.gltf(filepath=main_file_path) 681 | elif file_format == "fbx": 682 | bpy.ops.import_scene.fbx(filepath=main_file_path) 683 | elif file_format == "obj": 684 | bpy.ops.import_scene.obj(filepath=main_file_path) 685 | elif file_format == "blend": 686 | # For blend files, we need to append or link 687 | with bpy.data.libraries.load(main_file_path, link=False) as (data_from, data_to): 688 | data_to.objects = data_from.objects 689 | 690 | # Link the objects to the scene 691 | for obj in data_to.objects: 692 | if obj is not None: 693 | bpy.context.collection.objects.link(obj) 694 | else: 695 | return {"error": f"Unsupported model format: {file_format}"} 696 | 697 | # Get the names of imported objects 698 | imported_objects = [obj.name for obj in bpy.context.selected_objects] 699 | 700 | return { 701 | "success": True, 702 | "message": f"Model {asset_id} imported successfully", 703 | "imported_objects": imported_objects 704 | } 705 | except Exception as e: 706 | return {"error": f"Failed to import model: {str(e)}"} 707 | finally: 708 | # Clean up temporary directory 709 | try: 710 | shutil.rmtree(temp_dir) 711 | except: 712 | print(f"Failed to clean up temporary directory: {temp_dir}") 713 | else: 714 | return {"error": f"Requested format or resolution not available for this model"} 715 | 716 | else: 717 | return {"error": f"Unsupported asset type: {asset_type}"} 718 | 719 | except Exception as e: 720 | return {"error": f"Failed to download asset: {str(e)}"} 721 | 722 | def set_texture(self, object_name, texture_id): 723 | """Apply a previously downloaded Polyhaven texture to an object by creating a new material""" 724 | try: 725 | # Get the object 726 | obj = bpy.data.objects.get(object_name) 727 | if not obj: 728 | return {"error": f"Object not found: {object_name}"} 729 | 730 | # Make sure object can accept materials 731 | if not hasattr(obj, 'data') or not hasattr(obj.data, 'materials'): 732 | return {"error": f"Object {object_name} cannot accept materials"} 733 | 734 | # Find all images related to this texture and ensure they're properly loaded 735 | texture_images = {} 736 | for img in bpy.data.images: 737 | if img.name.startswith(texture_id + "_"): 738 | # Extract the map type from the image name 739 | map_type = img.name.split('_')[-1].split('.')[0] 740 | 741 | # Force a reload of the image 742 | img.reload() 743 | 744 | # Ensure proper color space 745 | if map_type.lower() in ['color', 'diffuse', 'albedo']: 746 | try: 747 | img.colorspace_settings.name = 'sRGB' 748 | except: 749 | pass 750 | else: 751 | try: 752 | img.colorspace_settings.name = 'Non-Color' 753 | except: 754 | pass 755 | 756 | # Ensure the image is packed 757 | if not img.packed_file: 758 | img.pack() 759 | 760 | texture_images[map_type] = img 761 | print(f"Loaded texture map: {map_type} - {img.name}") 762 | 763 | # Debug info 764 | print(f"Image size: {img.size[0]}x{img.size[1]}") 765 | print(f"Color space: {img.colorspace_settings.name}") 766 | print(f"File format: {img.file_format}") 767 | print(f"Is packed: {bool(img.packed_file)}") 768 | 769 | if not texture_images: 770 | return {"error": f"No texture images found for: {texture_id}. Please download the texture first."} 771 | 772 | # Create a new material 773 | new_mat_name = f"{texture_id}_material_{object_name}" 774 | 775 | # Remove any existing material with this name to avoid conflicts 776 | existing_mat = bpy.data.materials.get(new_mat_name) 777 | if existing_mat: 778 | bpy.data.materials.remove(existing_mat) 779 | 780 | new_mat = bpy.data.materials.new(name=new_mat_name) 781 | new_mat.use_nodes = True 782 | 783 | # Set up the material nodes 784 | nodes = new_mat.node_tree.nodes 785 | links = new_mat.node_tree.links 786 | 787 | # Clear default nodes 788 | nodes.clear() 789 | 790 | # Create output node 791 | output = nodes.new(type='ShaderNodeOutputMaterial') 792 | output.location = (600, 0) 793 | 794 | # Create principled BSDF node 795 | principled = nodes.new(type='ShaderNodeBsdfPrincipled') 796 | principled.location = (300, 0) 797 | links.new(principled.outputs[0], output.inputs[0]) 798 | 799 | # Add texture nodes based on available maps 800 | tex_coord = nodes.new(type='ShaderNodeTexCoord') 801 | tex_coord.location = (-800, 0) 802 | 803 | mapping = nodes.new(type='ShaderNodeMapping') 804 | mapping.location = (-600, 0) 805 | mapping.vector_type = 'TEXTURE' # Changed from default 'POINT' to 'TEXTURE' 806 | links.new(tex_coord.outputs['UV'], mapping.inputs['Vector']) 807 | 808 | # Position offset for texture nodes 809 | x_pos = -400 810 | y_pos = 300 811 | 812 | # Connect different texture maps 813 | for map_type, image in texture_images.items(): 814 | tex_node = nodes.new(type='ShaderNodeTexImage') 815 | tex_node.location = (x_pos, y_pos) 816 | tex_node.image = image 817 | 818 | # Set color space based on map type 819 | if map_type.lower() in ['color', 'diffuse', 'albedo']: 820 | try: 821 | tex_node.image.colorspace_settings.name = 'sRGB' 822 | except: 823 | pass # Use default if sRGB not available 824 | else: 825 | try: 826 | tex_node.image.colorspace_settings.name = 'Non-Color' 827 | except: 828 | pass # Use default if Non-Color not available 829 | 830 | links.new(mapping.outputs['Vector'], tex_node.inputs['Vector']) 831 | 832 | # Connect to appropriate input on Principled BSDF 833 | if map_type.lower() in ['color', 'diffuse', 'albedo']: 834 | links.new(tex_node.outputs['Color'], principled.inputs['Base Color']) 835 | elif map_type.lower() in ['roughness', 'rough']: 836 | links.new(tex_node.outputs['Color'], principled.inputs['Roughness']) 837 | elif map_type.lower() in ['metallic', 'metalness', 'metal']: 838 | links.new(tex_node.outputs['Color'], principled.inputs['Metallic']) 839 | elif map_type.lower() in ['normal', 'nor', 'dx', 'gl']: 840 | # Add normal map node 841 | normal_map = nodes.new(type='ShaderNodeNormalMap') 842 | normal_map.location = (x_pos + 200, y_pos) 843 | links.new(tex_node.outputs['Color'], normal_map.inputs['Color']) 844 | links.new(normal_map.outputs['Normal'], principled.inputs['Normal']) 845 | elif map_type.lower() in ['displacement', 'disp', 'height']: 846 | # Add displacement node 847 | disp_node = nodes.new(type='ShaderNodeDisplacement') 848 | disp_node.location = (x_pos + 200, y_pos - 200) 849 | disp_node.inputs['Scale'].default_value = 0.1 # Reduce displacement strength 850 | links.new(tex_node.outputs['Color'], disp_node.inputs['Height']) 851 | links.new(disp_node.outputs['Displacement'], output.inputs['Displacement']) 852 | 853 | y_pos -= 250 854 | 855 | # Second pass: Connect nodes with proper handling for special cases 856 | texture_nodes = {} 857 | 858 | # First find all texture nodes and store them by map type 859 | for node in nodes: 860 | if node.type == 'TEX_IMAGE' and node.image: 861 | for map_type, image in texture_images.items(): 862 | if node.image == image: 863 | texture_nodes[map_type] = node 864 | break 865 | 866 | # Now connect everything using the nodes instead of images 867 | # Handle base color (diffuse) 868 | for map_name in ['color', 'diffuse', 'albedo']: 869 | if map_name in texture_nodes: 870 | links.new(texture_nodes[map_name].outputs['Color'], principled.inputs['Base Color']) 871 | print(f"Connected {map_name} to Base Color") 872 | break 873 | 874 | # Handle roughness 875 | for map_name in ['roughness', 'rough']: 876 | if map_name in texture_nodes: 877 | links.new(texture_nodes[map_name].outputs['Color'], principled.inputs['Roughness']) 878 | print(f"Connected {map_name} to Roughness") 879 | break 880 | 881 | # Handle metallic 882 | for map_name in ['metallic', 'metalness', 'metal']: 883 | if map_name in texture_nodes: 884 | links.new(texture_nodes[map_name].outputs['Color'], principled.inputs['Metallic']) 885 | print(f"Connected {map_name} to Metallic") 886 | break 887 | 888 | # Handle normal maps 889 | for map_name in ['gl', 'dx', 'nor']: 890 | if map_name in texture_nodes: 891 | normal_map_node = nodes.new(type='ShaderNodeNormalMap') 892 | normal_map_node.location = (100, 100) 893 | links.new(texture_nodes[map_name].outputs['Color'], normal_map_node.inputs['Color']) 894 | links.new(normal_map_node.outputs['Normal'], principled.inputs['Normal']) 895 | print(f"Connected {map_name} to Normal") 896 | break 897 | 898 | # Handle displacement 899 | for map_name in ['displacement', 'disp', 'height']: 900 | if map_name in texture_nodes: 901 | disp_node = nodes.new(type='ShaderNodeDisplacement') 902 | disp_node.location = (300, -200) 903 | disp_node.inputs['Scale'].default_value = 0.1 # Reduce displacement strength 904 | links.new(texture_nodes[map_name].outputs['Color'], disp_node.inputs['Height']) 905 | links.new(disp_node.outputs['Displacement'], output.inputs['Displacement']) 906 | print(f"Connected {map_name} to Displacement") 907 | break 908 | 909 | # Handle ARM texture (Ambient Occlusion, Roughness, Metallic) 910 | if 'arm' in texture_nodes: 911 | separate_rgb = nodes.new(type='ShaderNodeSeparateRGB') 912 | separate_rgb.location = (-200, -100) 913 | links.new(texture_nodes['arm'].outputs['Color'], separate_rgb.inputs['Image']) 914 | 915 | # Connect Roughness (G) if no dedicated roughness map 916 | if not any(map_name in texture_nodes for map_name in ['roughness', 'rough']): 917 | links.new(separate_rgb.outputs['G'], principled.inputs['Roughness']) 918 | print("Connected ARM.G to Roughness") 919 | 920 | # Connect Metallic (B) if no dedicated metallic map 921 | if not any(map_name in texture_nodes for map_name in ['metallic', 'metalness', 'metal']): 922 | links.new(separate_rgb.outputs['B'], principled.inputs['Metallic']) 923 | print("Connected ARM.B to Metallic") 924 | 925 | # For AO (R channel), multiply with base color if we have one 926 | base_color_node = None 927 | for map_name in ['color', 'diffuse', 'albedo']: 928 | if map_name in texture_nodes: 929 | base_color_node = texture_nodes[map_name] 930 | break 931 | 932 | if base_color_node: 933 | mix_node = nodes.new(type='ShaderNodeMixRGB') 934 | mix_node.location = (100, 200) 935 | mix_node.blend_type = 'MULTIPLY' 936 | mix_node.inputs['Fac'].default_value = 0.8 # 80% influence 937 | 938 | # Disconnect direct connection to base color 939 | for link in base_color_node.outputs['Color'].links: 940 | if link.to_socket == principled.inputs['Base Color']: 941 | links.remove(link) 942 | 943 | # Connect through the mix node 944 | links.new(base_color_node.outputs['Color'], mix_node.inputs[1]) 945 | links.new(separate_rgb.outputs['R'], mix_node.inputs[2]) 946 | links.new(mix_node.outputs['Color'], principled.inputs['Base Color']) 947 | print("Connected ARM.R to AO mix with Base Color") 948 | 949 | # Handle AO (Ambient Occlusion) if separate 950 | if 'ao' in texture_nodes: 951 | base_color_node = None 952 | for map_name in ['color', 'diffuse', 'albedo']: 953 | if map_name in texture_nodes: 954 | base_color_node = texture_nodes[map_name] 955 | break 956 | 957 | if base_color_node: 958 | mix_node = nodes.new(type='ShaderNodeMixRGB') 959 | mix_node.location = (100, 200) 960 | mix_node.blend_type = 'MULTIPLY' 961 | mix_node.inputs['Fac'].default_value = 0.8 # 80% influence 962 | 963 | # Disconnect direct connection to base color 964 | for link in base_color_node.outputs['Color'].links: 965 | if link.to_socket == principled.inputs['Base Color']: 966 | links.remove(link) 967 | 968 | # Connect through the mix node 969 | links.new(base_color_node.outputs['Color'], mix_node.inputs[1]) 970 | links.new(texture_nodes['ao'].outputs['Color'], mix_node.inputs[2]) 971 | links.new(mix_node.outputs['Color'], principled.inputs['Base Color']) 972 | print("Connected AO to mix with Base Color") 973 | 974 | # CRITICAL: Make sure to clear all existing materials from the object 975 | while len(obj.data.materials) > 0: 976 | obj.data.materials.pop(index=0) 977 | 978 | # Assign the new material to the object 979 | obj.data.materials.append(new_mat) 980 | 981 | # CRITICAL: Make the object active and select it 982 | bpy.context.view_layer.objects.active = obj 983 | obj.select_set(True) 984 | 985 | # CRITICAL: Force Blender to update the material 986 | bpy.context.view_layer.update() 987 | 988 | # Get the list of texture maps 989 | texture_maps = list(texture_images.keys()) 990 | 991 | # Get info about texture nodes for debugging 992 | material_info = { 993 | "name": new_mat.name, 994 | "has_nodes": new_mat.use_nodes, 995 | "node_count": len(new_mat.node_tree.nodes), 996 | "texture_nodes": [] 997 | } 998 | 999 | for node in new_mat.node_tree.nodes: 1000 | if node.type == 'TEX_IMAGE' and node.image: 1001 | connections = [] 1002 | for output in node.outputs: 1003 | for link in output.links: 1004 | connections.append(f"{output.name} → {link.to_node.name}.{link.to_socket.name}") 1005 | 1006 | material_info["texture_nodes"].append({ 1007 | "name": node.name, 1008 | "image": node.image.name, 1009 | "colorspace": node.image.colorspace_settings.name, 1010 | "connections": connections 1011 | }) 1012 | 1013 | return { 1014 | "success": True, 1015 | "message": f"Created new material and applied texture {texture_id} to {object_name}", 1016 | "material": new_mat.name, 1017 | "maps": texture_maps, 1018 | "material_info": material_info 1019 | } 1020 | 1021 | except Exception as e: 1022 | print(f"Error in set_texture: {str(e)}") 1023 | traceback.print_exc() 1024 | return {"error": f"Failed to apply texture: {str(e)}"} 1025 | 1026 | def get_polyhaven_status(self): 1027 | """Get the current status of PolyHaven integration""" 1028 | enabled = bpy.context.scene.blendermcp_use_polyhaven 1029 | if enabled: 1030 | return {"enabled": True, "message": "PolyHaven integration is enabled and ready to use."} 1031 | else: 1032 | return { 1033 | "enabled": False, 1034 | "message": """PolyHaven integration is currently disabled. To enable it: 1035 | 1. In the 3D Viewport, find the BlenderMCP panel in the sidebar (press N if hidden) 1036 | 2. Check the 'Use assets from Poly Haven' checkbox 1037 | 3. Restart the connection to Claude""" 1038 | } 1039 | 1040 | #region Hyper3D 1041 | def get_hyper3d_status(self): 1042 | """Get the current status of Hyper3D Rodin integration""" 1043 | enabled = bpy.context.scene.blendermcp_use_hyper3d 1044 | if enabled: 1045 | if not bpy.context.scene.blendermcp_hyper3d_api_key: 1046 | return { 1047 | "enabled": False, 1048 | "message": """Hyper3D Rodin integration is currently enabled, but API key is not given. To enable it: 1049 | 1. In the 3D Viewport, find the BlenderMCP panel in the sidebar (press N if hidden) 1050 | 2. Keep the 'Use Hyper3D Rodin 3D model generation' checkbox checked 1051 | 3. Choose the right plaform and fill in the API Key 1052 | 4. Restart the connection to Claude""" 1053 | } 1054 | mode = bpy.context.scene.blendermcp_hyper3d_mode 1055 | message = f"Hyper3D Rodin integration is enabled and ready to use. Mode: {mode}. " + \ 1056 | f"Key type: {'private' if bpy.context.scene.blendermcp_hyper3d_api_key != RODIN_FREE_TRIAL_KEY else 'free_trial'}" 1057 | return { 1058 | "enabled": True, 1059 | "message": message 1060 | } 1061 | else: 1062 | return { 1063 | "enabled": False, 1064 | "message": """Hyper3D Rodin integration is currently disabled. To enable it: 1065 | 1. In the 3D Viewport, find the BlenderMCP panel in the sidebar (press N if hidden) 1066 | 2. Check the 'Use Hyper3D Rodin 3D model generation' checkbox 1067 | 3. Restart the connection to Claude""" 1068 | } 1069 | 1070 | def create_rodin_job(self, *args, **kwargs): 1071 | match bpy.context.scene.blendermcp_hyper3d_mode: 1072 | case "MAIN_SITE": 1073 | return self.create_rodin_job_main_site(*args, **kwargs) 1074 | case "FAL_AI": 1075 | return self.create_rodin_job_fal_ai(*args, **kwargs) 1076 | case _: 1077 | return f"Error: Unknown Hyper3D Rodin mode!" 1078 | 1079 | def create_rodin_job_main_site( 1080 | self, 1081 | text_prompt: str=None, 1082 | images: list[tuple[str, str]]=None, 1083 | bbox_condition=None 1084 | ): 1085 | try: 1086 | if images is None: 1087 | images = [] 1088 | """Call Rodin API, get the job uuid and subscription key""" 1089 | files = [ 1090 | *[("images", (f"{i:04d}{img_suffix}", img)) for i, (img_suffix, img) in enumerate(images)], 1091 | ("tier", (None, "Sketch")), 1092 | ("mesh_mode", (None, "Raw")), 1093 | ] 1094 | if text_prompt: 1095 | files.append(("prompt", (None, text_prompt))) 1096 | if bbox_condition: 1097 | files.append(("bbox_condition", (None, json.dumps(bbox_condition)))) 1098 | response = requests.post( 1099 | "https://hyperhuman.deemos.com/api/v2/rodin", 1100 | headers={ 1101 | "Authorization": f"Bearer {bpy.context.scene.blendermcp_hyper3d_api_key}", 1102 | }, 1103 | files=files 1104 | ) 1105 | data = response.json() 1106 | return data 1107 | except Exception as e: 1108 | return {"error": str(e)} 1109 | 1110 | def create_rodin_job_fal_ai( 1111 | self, 1112 | text_prompt: str=None, 1113 | images: list[tuple[str, str]]=None, 1114 | bbox_condition=None 1115 | ): 1116 | try: 1117 | req_data = { 1118 | "tier": "Sketch", 1119 | } 1120 | if images: 1121 | req_data["input_image_urls"] = images 1122 | if text_prompt: 1123 | req_data["prompt"] = text_prompt 1124 | if bbox_condition: 1125 | req_data["bbox_condition"] = bbox_condition 1126 | response = requests.post( 1127 | "https://queue.fal.run/fal-ai/hyper3d/rodin", 1128 | headers={ 1129 | "Authorization": f"Key {bpy.context.scene.blendermcp_hyper3d_api_key}", 1130 | "Content-Type": "application/json", 1131 | }, 1132 | json=req_data 1133 | ) 1134 | data = response.json() 1135 | return data 1136 | except Exception as e: 1137 | return {"error": str(e)} 1138 | 1139 | def poll_rodin_job_status(self, *args, **kwargs): 1140 | match bpy.context.scene.blendermcp_hyper3d_mode: 1141 | case "MAIN_SITE": 1142 | return self.poll_rodin_job_status_main_site(*args, **kwargs) 1143 | case "FAL_AI": 1144 | return self.poll_rodin_job_status_fal_ai(*args, **kwargs) 1145 | case _: 1146 | return f"Error: Unknown Hyper3D Rodin mode!" 1147 | 1148 | def poll_rodin_job_status_main_site(self, subscription_key: str): 1149 | """Call the job status API to get the job status""" 1150 | response = requests.post( 1151 | "https://hyperhuman.deemos.com/api/v2/status", 1152 | headers={ 1153 | "Authorization": f"Bearer {bpy.context.scene.blendermcp_hyper3d_api_key}", 1154 | }, 1155 | json={ 1156 | "subscription_key": subscription_key, 1157 | }, 1158 | ) 1159 | data = response.json() 1160 | return { 1161 | "status_list": [i["status"] for i in data["jobs"]] 1162 | } 1163 | 1164 | def poll_rodin_job_status_fal_ai(self, request_id: str): 1165 | """Call the job status API to get the job status""" 1166 | response = requests.get( 1167 | f"https://queue.fal.run/fal-ai/hyper3d/requests/{request_id}/status", 1168 | headers={ 1169 | "Authorization": f"KEY {bpy.context.scene.blendermcp_hyper3d_api_key}", 1170 | }, 1171 | ) 1172 | data = response.json() 1173 | return data 1174 | 1175 | @staticmethod 1176 | def _clean_imported_glb(filepath, mesh_name=None): 1177 | # Get the set of existing objects before import 1178 | existing_objects = set(bpy.data.objects) 1179 | 1180 | # Import the GLB file 1181 | bpy.ops.import_scene.gltf(filepath=filepath) 1182 | 1183 | # Ensure the context is updated 1184 | bpy.context.view_layer.update() 1185 | 1186 | # Get all imported objects 1187 | imported_objects = list(set(bpy.data.objects) - existing_objects) 1188 | # imported_objects = [obj for obj in bpy.context.view_layer.objects if obj.select_get()] 1189 | 1190 | if not imported_objects: 1191 | print("Error: No objects were imported.") 1192 | return 1193 | 1194 | # Identify the mesh object 1195 | mesh_obj = None 1196 | 1197 | if len(imported_objects) == 1 and imported_objects[0].type == 'MESH': 1198 | mesh_obj = imported_objects[0] 1199 | print("Single mesh imported, no cleanup needed.") 1200 | else: 1201 | if len(imported_objects) == 2: 1202 | empty_objs = [i for i in imported_objects if i.type == "EMPTY"] 1203 | if len(empty_objs) != 1: 1204 | print("Error: Expected an empty node with one mesh child or a single mesh object.") 1205 | return 1206 | parent_obj = empty_objs.pop() 1207 | if len(parent_obj.children) == 1: 1208 | potential_mesh = parent_obj.children[0] 1209 | if potential_mesh.type == 'MESH': 1210 | print("GLB structure confirmed: Empty node with one mesh child.") 1211 | 1212 | # Unparent the mesh from the empty node 1213 | potential_mesh.parent = None 1214 | 1215 | # Remove the empty node 1216 | bpy.data.objects.remove(parent_obj) 1217 | print("Removed empty node, keeping only the mesh.") 1218 | 1219 | mesh_obj = potential_mesh 1220 | else: 1221 | print("Error: Child is not a mesh object.") 1222 | return 1223 | else: 1224 | print("Error: Expected an empty node with one mesh child or a single mesh object.") 1225 | return 1226 | else: 1227 | print("Error: Expected an empty node with one mesh child or a single mesh object.") 1228 | return 1229 | 1230 | # Rename the mesh if needed 1231 | try: 1232 | if mesh_obj and mesh_obj.name is not None and mesh_name: 1233 | mesh_obj.name = mesh_name 1234 | if mesh_obj.data.name is not None: 1235 | mesh_obj.data.name = mesh_name 1236 | print(f"Mesh renamed to: {mesh_name}") 1237 | except Exception as e: 1238 | print("Having issue with renaming, give up renaming.") 1239 | 1240 | return mesh_obj 1241 | 1242 | def import_generated_asset(self, *args, **kwargs): 1243 | match bpy.context.scene.blendermcp_hyper3d_mode: 1244 | case "MAIN_SITE": 1245 | return self.import_generated_asset_main_site(*args, **kwargs) 1246 | case "FAL_AI": 1247 | return self.import_generated_asset_fal_ai(*args, **kwargs) 1248 | case _: 1249 | return f"Error: Unknown Hyper3D Rodin mode!" 1250 | 1251 | def import_generated_asset_main_site(self, task_uuid: str, name: str): 1252 | """Fetch the generated asset, import into blender""" 1253 | response = requests.post( 1254 | "https://hyperhuman.deemos.com/api/v2/download", 1255 | headers={ 1256 | "Authorization": f"Bearer {bpy.context.scene.blendermcp_hyper3d_api_key}", 1257 | }, 1258 | json={ 1259 | 'task_uuid': task_uuid 1260 | } 1261 | ) 1262 | data_ = response.json() 1263 | temp_file = None 1264 | for i in data_["list"]: 1265 | if i["name"].endswith(".glb"): 1266 | temp_file = tempfile.NamedTemporaryFile( 1267 | delete=False, 1268 | prefix=task_uuid, 1269 | suffix=".glb", 1270 | ) 1271 | 1272 | try: 1273 | # Download the content 1274 | response = requests.get(i["url"], stream=True) 1275 | response.raise_for_status() # Raise an exception for HTTP errors 1276 | 1277 | # Write the content to the temporary file 1278 | for chunk in response.iter_content(chunk_size=8192): 1279 | temp_file.write(chunk) 1280 | 1281 | # Close the file 1282 | temp_file.close() 1283 | 1284 | except Exception as e: 1285 | # Clean up the file if there's an error 1286 | temp_file.close() 1287 | os.unlink(temp_file.name) 1288 | return {"succeed": False, "error": str(e)} 1289 | 1290 | break 1291 | else: 1292 | return {"succeed": False, "error": "Generation failed. Please first make sure that all jobs of the task are done and then try again later."} 1293 | 1294 | try: 1295 | obj = self._clean_imported_glb( 1296 | filepath=temp_file.name, 1297 | mesh_name=name 1298 | ) 1299 | result = { 1300 | "name": obj.name, 1301 | "type": obj.type, 1302 | "location": [obj.location.x, obj.location.y, obj.location.z], 1303 | "rotation": [obj.rotation_euler.x, obj.rotation_euler.y, obj.rotation_euler.z], 1304 | "scale": [obj.scale.x, obj.scale.y, obj.scale.z], 1305 | } 1306 | 1307 | if obj.type == "MESH": 1308 | bounding_box = self._get_aabb(obj) 1309 | result["world_bounding_box"] = bounding_box 1310 | 1311 | return { 1312 | "succeed": True, **result 1313 | } 1314 | except Exception as e: 1315 | return {"succeed": False, "error": str(e)} 1316 | 1317 | def import_generated_asset_fal_ai(self, request_id: str, name: str): 1318 | """Fetch the generated asset, import into blender""" 1319 | response = requests.get( 1320 | f"https://queue.fal.run/fal-ai/hyper3d/requests/{request_id}", 1321 | headers={ 1322 | "Authorization": f"Key {bpy.context.scene.blendermcp_hyper3d_api_key}", 1323 | } 1324 | ) 1325 | data_ = response.json() 1326 | temp_file = None 1327 | 1328 | temp_file = tempfile.NamedTemporaryFile( 1329 | delete=False, 1330 | prefix=request_id, 1331 | suffix=".glb", 1332 | ) 1333 | 1334 | try: 1335 | # Download the content 1336 | response = requests.get(data_["model_mesh"]["url"], stream=True) 1337 | response.raise_for_status() # Raise an exception for HTTP errors 1338 | 1339 | # Write the content to the temporary file 1340 | for chunk in response.iter_content(chunk_size=8192): 1341 | temp_file.write(chunk) 1342 | 1343 | # Close the file 1344 | temp_file.close() 1345 | 1346 | except Exception as e: 1347 | # Clean up the file if there's an error 1348 | temp_file.close() 1349 | os.unlink(temp_file.name) 1350 | return {"succeed": False, "error": str(e)} 1351 | 1352 | try: 1353 | obj = self._clean_imported_glb( 1354 | filepath=temp_file.name, 1355 | mesh_name=name 1356 | ) 1357 | result = { 1358 | "name": obj.name, 1359 | "type": obj.type, 1360 | "location": [obj.location.x, obj.location.y, obj.location.z], 1361 | "rotation": [obj.rotation_euler.x, obj.rotation_euler.y, obj.rotation_euler.z], 1362 | "scale": [obj.scale.x, obj.scale.y, obj.scale.z], 1363 | } 1364 | 1365 | if obj.type == "MESH": 1366 | bounding_box = self._get_aabb(obj) 1367 | result["world_bounding_box"] = bounding_box 1368 | 1369 | return { 1370 | "succeed": True, **result 1371 | } 1372 | except Exception as e: 1373 | return {"succeed": False, "error": str(e)} 1374 | #endregion 1375 | 1376 | # Blender UI Panel 1377 | class BLENDERMCP_PT_Panel(bpy.types.Panel): 1378 | bl_label = "Blender MCP" 1379 | bl_idname = "BLENDERMCP_PT_Panel" 1380 | bl_space_type = 'VIEW_3D' 1381 | bl_region_type = 'UI' 1382 | bl_category = 'BlenderMCP' 1383 | 1384 | def draw(self, context): 1385 | layout = self.layout 1386 | scene = context.scene 1387 | 1388 | layout.prop(scene, "blendermcp_port") 1389 | layout.prop(scene, "blendermcp_use_polyhaven", text="Use assets from Poly Haven") 1390 | 1391 | layout.prop(scene, "blendermcp_use_hyper3d", text="Use Hyper3D Rodin 3D model generation") 1392 | if scene.blendermcp_use_hyper3d: 1393 | layout.prop(scene, "blendermcp_hyper3d_mode", text="Rodin Mode") 1394 | layout.prop(scene, "blendermcp_hyper3d_api_key", text="API Key") 1395 | layout.operator("blendermcp.set_hyper3d_free_trial_api_key", text="Set Free Trial API Key") 1396 | 1397 | if not scene.blendermcp_server_running: 1398 | layout.operator("blendermcp.start_server", text="Connect to MCP server") 1399 | else: 1400 | layout.operator("blendermcp.stop_server", text="Disconnect from MCP server") 1401 | layout.label(text=f"Running on port {scene.blendermcp_port}") 1402 | 1403 | # Operator to set Hyper3D API Key 1404 | class BLENDERMCP_OT_SetFreeTrialHyper3DAPIKey(bpy.types.Operator): 1405 | bl_idname = "blendermcp.set_hyper3d_free_trial_api_key" 1406 | bl_label = "Set Free Trial API Key" 1407 | 1408 | def execute(self, context): 1409 | context.scene.blendermcp_hyper3d_api_key = RODIN_FREE_TRIAL_KEY 1410 | context.scene.blendermcp_hyper3d_mode = 'MAIN_SITE' 1411 | self.report({'INFO'}, "API Key set successfully!") 1412 | return {'FINISHED'} 1413 | 1414 | # Operator to start the server 1415 | class BLENDERMCP_OT_StartServer(bpy.types.Operator): 1416 | bl_idname = "blendermcp.start_server" 1417 | bl_label = "Connect to Claude" 1418 | bl_description = "Start the BlenderMCP server to connect with Claude" 1419 | 1420 | def execute(self, context): 1421 | scene = context.scene 1422 | 1423 | # Create a new server instance 1424 | if not hasattr(bpy.types, "blendermcp_server") or not bpy.types.blendermcp_server: 1425 | bpy.types.blendermcp_server = BlenderMCPServer(port=scene.blendermcp_port) 1426 | 1427 | # Start the server 1428 | bpy.types.blendermcp_server.start() 1429 | scene.blendermcp_server_running = True 1430 | 1431 | return {'FINISHED'} 1432 | 1433 | # Operator to stop the server 1434 | class BLENDERMCP_OT_StopServer(bpy.types.Operator): 1435 | bl_idname = "blendermcp.stop_server" 1436 | bl_label = "Stop the connection to Claude" 1437 | bl_description = "Stop the connection to Claude" 1438 | 1439 | def execute(self, context): 1440 | scene = context.scene 1441 | 1442 | # Stop the server if it exists 1443 | if hasattr(bpy.types, "blendermcp_server") and bpy.types.blendermcp_server: 1444 | bpy.types.blendermcp_server.stop() 1445 | del bpy.types.blendermcp_server 1446 | 1447 | scene.blendermcp_server_running = False 1448 | 1449 | return {'FINISHED'} 1450 | 1451 | # Registration functions 1452 | def register(): 1453 | bpy.types.Scene.blendermcp_port = IntProperty( 1454 | name="Port", 1455 | description="Port for the BlenderMCP server", 1456 | default=9876, 1457 | min=1024, 1458 | max=65535 1459 | ) 1460 | 1461 | bpy.types.Scene.blendermcp_server_running = bpy.props.BoolProperty( 1462 | name="Server Running", 1463 | default=False 1464 | ) 1465 | 1466 | bpy.types.Scene.blendermcp_use_polyhaven = bpy.props.BoolProperty( 1467 | name="Use Poly Haven", 1468 | description="Enable Poly Haven asset integration", 1469 | default=False 1470 | ) 1471 | 1472 | bpy.types.Scene.blendermcp_use_hyper3d = bpy.props.BoolProperty( 1473 | name="Use Hyper3D Rodin", 1474 | description="Enable Hyper3D Rodin generatino integration", 1475 | default=False 1476 | ) 1477 | 1478 | bpy.types.Scene.blendermcp_hyper3d_mode = bpy.props.EnumProperty( 1479 | name="Rodin Mode", 1480 | description="Choose the platform used to call Rodin APIs", 1481 | items=[ 1482 | ("MAIN_SITE", "hyper3d.ai", "hyper3d.ai"), 1483 | ("FAL_AI", "fal.ai", "fal.ai"), 1484 | ], 1485 | default="MAIN_SITE" 1486 | ) 1487 | 1488 | bpy.types.Scene.blendermcp_hyper3d_api_key = bpy.props.StringProperty( 1489 | name="Hyper3D API Key", 1490 | subtype="PASSWORD", 1491 | description="API Key provided by Hyper3D", 1492 | default="" 1493 | ) 1494 | 1495 | bpy.utils.register_class(BLENDERMCP_PT_Panel) 1496 | bpy.utils.register_class(BLENDERMCP_OT_SetFreeTrialHyper3DAPIKey) 1497 | bpy.utils.register_class(BLENDERMCP_OT_StartServer) 1498 | bpy.utils.register_class(BLENDERMCP_OT_StopServer) 1499 | 1500 | print("BlenderMCP addon registered") 1501 | 1502 | def unregister(): 1503 | # Stop the server if it's running 1504 | if hasattr(bpy.types, "blendermcp_server") and bpy.types.blendermcp_server: 1505 | bpy.types.blendermcp_server.stop() 1506 | del bpy.types.blendermcp_server 1507 | 1508 | bpy.utils.unregister_class(BLENDERMCP_PT_Panel) 1509 | bpy.utils.unregister_class(BLENDERMCP_OT_SetFreeTrialHyper3DAPIKey) 1510 | bpy.utils.unregister_class(BLENDERMCP_OT_StartServer) 1511 | bpy.utils.unregister_class(BLENDERMCP_OT_StopServer) 1512 | 1513 | del bpy.types.Scene.blendermcp_port 1514 | del bpy.types.Scene.blendermcp_server_running 1515 | del bpy.types.Scene.blendermcp_use_polyhaven 1516 | del bpy.types.Scene.blendermcp_use_hyper3d 1517 | del bpy.types.Scene.blendermcp_hyper3d_mode 1518 | del bpy.types.Scene.blendermcp_hyper3d_api_key 1519 | 1520 | print("BlenderMCP addon unregistered") 1521 | 1522 | if __name__ == "__main__": 1523 | register() 1524 | -------------------------------------------------------------------------------- /assets/addon-instructions.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ahujasid/blender-mcp/972096e9dc0183c0ac72a5391a99c99533c7f783/assets/addon-instructions.png -------------------------------------------------------------------------------- /assets/hammer-icon.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/ahujasid/blender-mcp/972096e9dc0183c0ac72a5391a99c99533c7f783/assets/hammer-icon.png -------------------------------------------------------------------------------- /main.py: -------------------------------------------------------------------------------- 1 | from blender_mcp.server import main as server_main 2 | 3 | def main(): 4 | """Entry point for the blender-mcp package""" 5 | server_main() 6 | 7 | if __name__ == "__main__": 8 | main() 9 | -------------------------------------------------------------------------------- /pyproject.toml: -------------------------------------------------------------------------------- 1 | [project] 2 | name = "blender-mcp" 3 | version = "1.1.3" 4 | description = "Blender integration through the Model Context Protocol" 5 | readme = "README.md" 6 | requires-python = ">=3.10" 7 | authors = [ 8 | {name = "Your Name", email = "your.email@example.com"} 9 | ] 10 | license = {text = "MIT"} 11 | classifiers = [ 12 | "Programming Language :: Python :: 3", 13 | "License :: OSI Approved :: MIT License", 14 | "Operating System :: OS Independent", 15 | ] 16 | dependencies = [ 17 | "mcp[cli]>=1.3.0", 18 | ] 19 | 20 | [project.scripts] 21 | blender-mcp = "blender_mcp.server:main" 22 | 23 | [build-system] 24 | requires = ["setuptools>=61.0", "wheel"] 25 | build-backend = "setuptools.build_meta" 26 | 27 | [tool.setuptools] 28 | package-dir = {"" = "src"} 29 | 30 | [project.urls] 31 | "Homepage" = "https://github.com/yourusername/blender-mcp" 32 | "Bug Tracker" = "https://github.com/yourusername/blender-mcp/issues" 33 | -------------------------------------------------------------------------------- /src/blender_mcp/__init__.py: -------------------------------------------------------------------------------- 1 | """Blender integration through the Model Context Protocol.""" 2 | 3 | __version__ = "0.1.0" 4 | 5 | # Expose key classes and functions for easier imports 6 | from .server import BlenderConnection, get_blender_connection 7 | -------------------------------------------------------------------------------- /src/blender_mcp/server.py: -------------------------------------------------------------------------------- 1 | # blender_mcp_server.py 2 | from mcp.server.fastmcp import FastMCP, Context, Image 3 | import socket 4 | import json 5 | import asyncio 6 | import logging 7 | from dataclasses import dataclass 8 | from contextlib import asynccontextmanager 9 | from typing import AsyncIterator, Dict, Any, List 10 | import os 11 | from pathlib import Path 12 | import base64 13 | from urllib.parse import urlparse 14 | 15 | # Configure logging 16 | logging.basicConfig(level=logging.INFO, 17 | format='%(asctime)s - %(name)s - %(levelname)s - %(message)s') 18 | logger = logging.getLogger("BlenderMCPServer") 19 | 20 | @dataclass 21 | class BlenderConnection: 22 | host: str 23 | port: int 24 | sock: socket.socket = None # Changed from 'socket' to 'sock' to avoid naming conflict 25 | 26 | def connect(self) -> bool: 27 | """Connect to the Blender addon socket server""" 28 | if self.sock: 29 | return True 30 | 31 | try: 32 | self.sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM) 33 | self.sock.connect((self.host, self.port)) 34 | logger.info(f"Connected to Blender at {self.host}:{self.port}") 35 | return True 36 | except Exception as e: 37 | logger.error(f"Failed to connect to Blender: {str(e)}") 38 | self.sock = None 39 | return False 40 | 41 | def disconnect(self): 42 | """Disconnect from the Blender addon""" 43 | if self.sock: 44 | try: 45 | self.sock.close() 46 | except Exception as e: 47 | logger.error(f"Error disconnecting from Blender: {str(e)}") 48 | finally: 49 | self.sock = None 50 | 51 | def receive_full_response(self, sock, buffer_size=8192): 52 | """Receive the complete response, potentially in multiple chunks""" 53 | chunks = [] 54 | # Use a consistent timeout value that matches the addon's timeout 55 | sock.settimeout(15.0) # Match the addon's timeout 56 | 57 | try: 58 | while True: 59 | try: 60 | chunk = sock.recv(buffer_size) 61 | if not chunk: 62 | # If we get an empty chunk, the connection might be closed 63 | if not chunks: # If we haven't received anything yet, this is an error 64 | raise Exception("Connection closed before receiving any data") 65 | break 66 | 67 | chunks.append(chunk) 68 | 69 | # Check if we've received a complete JSON object 70 | try: 71 | data = b''.join(chunks) 72 | json.loads(data.decode('utf-8')) 73 | # If we get here, it parsed successfully 74 | logger.info(f"Received complete response ({len(data)} bytes)") 75 | return data 76 | except json.JSONDecodeError: 77 | # Incomplete JSON, continue receiving 78 | continue 79 | except socket.timeout: 80 | # If we hit a timeout during receiving, break the loop and try to use what we have 81 | logger.warning("Socket timeout during chunked receive") 82 | break 83 | except (ConnectionError, BrokenPipeError, ConnectionResetError) as e: 84 | logger.error(f"Socket connection error during receive: {str(e)}") 85 | raise # Re-raise to be handled by the caller 86 | except socket.timeout: 87 | logger.warning("Socket timeout during chunked receive") 88 | except Exception as e: 89 | logger.error(f"Error during receive: {str(e)}") 90 | raise 91 | 92 | # If we get here, we either timed out or broke out of the loop 93 | # Try to use what we have 94 | if chunks: 95 | data = b''.join(chunks) 96 | logger.info(f"Returning data after receive completion ({len(data)} bytes)") 97 | try: 98 | # Try to parse what we have 99 | json.loads(data.decode('utf-8')) 100 | return data 101 | except json.JSONDecodeError: 102 | # If we can't parse it, it's incomplete 103 | raise Exception("Incomplete JSON response received") 104 | else: 105 | raise Exception("No data received") 106 | 107 | def send_command(self, command_type: str, params: Dict[str, Any] = None) -> Dict[str, Any]: 108 | """Send a command to Blender and return the response""" 109 | if not self.sock and not self.connect(): 110 | raise ConnectionError("Not connected to Blender") 111 | 112 | command = { 113 | "type": command_type, 114 | "params": params or {} 115 | } 116 | 117 | try: 118 | # Log the command being sent 119 | logger.info(f"Sending command: {command_type} with params: {params}") 120 | 121 | # Send the command 122 | self.sock.sendall(json.dumps(command).encode('utf-8')) 123 | logger.info(f"Command sent, waiting for response...") 124 | 125 | # Set a timeout for receiving - use the same timeout as in receive_full_response 126 | self.sock.settimeout(15.0) # Match the addon's timeout 127 | 128 | # Receive the response using the improved receive_full_response method 129 | response_data = self.receive_full_response(self.sock) 130 | logger.info(f"Received {len(response_data)} bytes of data") 131 | 132 | response = json.loads(response_data.decode('utf-8')) 133 | logger.info(f"Response parsed, status: {response.get('status', 'unknown')}") 134 | 135 | if response.get("status") == "error": 136 | logger.error(f"Blender error: {response.get('message')}") 137 | raise Exception(response.get("message", "Unknown error from Blender")) 138 | 139 | return response.get("result", {}) 140 | except socket.timeout: 141 | logger.error("Socket timeout while waiting for response from Blender") 142 | # Don't try to reconnect here - let the get_blender_connection handle reconnection 143 | # Just invalidate the current socket so it will be recreated next time 144 | self.sock = None 145 | raise Exception("Timeout waiting for Blender response - try simplifying your request") 146 | except (ConnectionError, BrokenPipeError, ConnectionResetError) as e: 147 | logger.error(f"Socket connection error: {str(e)}") 148 | self.sock = None 149 | raise Exception(f"Connection to Blender lost: {str(e)}") 150 | except json.JSONDecodeError as e: 151 | logger.error(f"Invalid JSON response from Blender: {str(e)}") 152 | # Try to log what was received 153 | if 'response_data' in locals() and response_data: 154 | logger.error(f"Raw response (first 200 bytes): {response_data[:200]}") 155 | raise Exception(f"Invalid response from Blender: {str(e)}") 156 | except Exception as e: 157 | logger.error(f"Error communicating with Blender: {str(e)}") 158 | # Don't try to reconnect here - let the get_blender_connection handle reconnection 159 | self.sock = None 160 | raise Exception(f"Communication error with Blender: {str(e)}") 161 | 162 | @asynccontextmanager 163 | async def server_lifespan(server: FastMCP) -> AsyncIterator[Dict[str, Any]]: 164 | """Manage server startup and shutdown lifecycle""" 165 | # We don't need to create a connection here since we're using the global connection 166 | # for resources and tools 167 | 168 | try: 169 | # Just log that we're starting up 170 | logger.info("BlenderMCP server starting up") 171 | 172 | # Try to connect to Blender on startup to verify it's available 173 | try: 174 | # This will initialize the global connection if needed 175 | blender = get_blender_connection() 176 | logger.info("Successfully connected to Blender on startup") 177 | except Exception as e: 178 | logger.warning(f"Could not connect to Blender on startup: {str(e)}") 179 | logger.warning("Make sure the Blender addon is running before using Blender resources or tools") 180 | 181 | # Return an empty context - we're using the global connection 182 | yield {} 183 | finally: 184 | # Clean up the global connection on shutdown 185 | global _blender_connection 186 | if _blender_connection: 187 | logger.info("Disconnecting from Blender on shutdown") 188 | _blender_connection.disconnect() 189 | _blender_connection = None 190 | logger.info("BlenderMCP server shut down") 191 | 192 | # Create the MCP server with lifespan support 193 | mcp = FastMCP( 194 | "BlenderMCP", 195 | description="Blender integration through the Model Context Protocol", 196 | lifespan=server_lifespan 197 | ) 198 | 199 | # Resource endpoints 200 | 201 | # Global connection for resources (since resources can't access context) 202 | _blender_connection = None 203 | _polyhaven_enabled = False # Add this global variable 204 | 205 | def get_blender_connection(): 206 | """Get or create a persistent Blender connection""" 207 | global _blender_connection, _polyhaven_enabled # Add _polyhaven_enabled to globals 208 | 209 | # If we have an existing connection, check if it's still valid 210 | if _blender_connection is not None: 211 | try: 212 | # First check if PolyHaven is enabled by sending a ping command 213 | result = _blender_connection.send_command("get_polyhaven_status") 214 | # Store the PolyHaven status globally 215 | _polyhaven_enabled = result.get("enabled", False) 216 | return _blender_connection 217 | except Exception as e: 218 | # Connection is dead, close it and create a new one 219 | logger.warning(f"Existing connection is no longer valid: {str(e)}") 220 | try: 221 | _blender_connection.disconnect() 222 | except: 223 | pass 224 | _blender_connection = None 225 | 226 | # Create a new connection if needed 227 | if _blender_connection is None: 228 | _blender_connection = BlenderConnection(host="localhost", port=9876) 229 | if not _blender_connection.connect(): 230 | logger.error("Failed to connect to Blender") 231 | _blender_connection = None 232 | raise Exception("Could not connect to Blender. Make sure the Blender addon is running.") 233 | logger.info("Created new persistent connection to Blender") 234 | 235 | return _blender_connection 236 | 237 | 238 | @mcp.tool() 239 | def get_scene_info(ctx: Context) -> str: 240 | """Get detailed information about the current Blender scene""" 241 | try: 242 | blender = get_blender_connection() 243 | result = blender.send_command("get_scene_info") 244 | 245 | # Just return the JSON representation of what Blender sent us 246 | return json.dumps(result, indent=2) 247 | except Exception as e: 248 | logger.error(f"Error getting scene info from Blender: {str(e)}") 249 | return f"Error getting scene info: {str(e)}" 250 | 251 | @mcp.tool() 252 | def get_object_info(ctx: Context, object_name: str) -> str: 253 | """ 254 | Get detailed information about a specific object in the Blender scene. 255 | 256 | Parameters: 257 | - object_name: The name of the object to get information about 258 | """ 259 | try: 260 | blender = get_blender_connection() 261 | result = blender.send_command("get_object_info", {"name": object_name}) 262 | 263 | # Just return the JSON representation of what Blender sent us 264 | return json.dumps(result, indent=2) 265 | except Exception as e: 266 | logger.error(f"Error getting object info from Blender: {str(e)}") 267 | return f"Error getting object info: {str(e)}" 268 | 269 | 270 | 271 | @mcp.tool() 272 | def execute_blender_code(ctx: Context, code: str) -> str: 273 | """ 274 | Execute arbitrary Python code in Blender. Make sure to do it step-by-step by breaking it into smaller chunks. 275 | 276 | Parameters: 277 | - code: The Python code to execute 278 | """ 279 | try: 280 | # Get the global connection 281 | blender = get_blender_connection() 282 | 283 | result = blender.send_command("execute_code", {"code": code}) 284 | return f"Code executed successfully: {result.get('result', '')}" 285 | except Exception as e: 286 | logger.error(f"Error executing code: {str(e)}") 287 | return f"Error executing code: {str(e)}" 288 | 289 | @mcp.tool() 290 | def get_polyhaven_categories(ctx: Context, asset_type: str = "hdris") -> str: 291 | """ 292 | Get a list of categories for a specific asset type on Polyhaven. 293 | 294 | Parameters: 295 | - asset_type: The type of asset to get categories for (hdris, textures, models, all) 296 | """ 297 | try: 298 | blender = get_blender_connection() 299 | if not _polyhaven_enabled: 300 | return "PolyHaven integration is disabled. Select it in the sidebar in BlenderMCP, then run it again." 301 | result = blender.send_command("get_polyhaven_categories", {"asset_type": asset_type}) 302 | 303 | if "error" in result: 304 | return f"Error: {result['error']}" 305 | 306 | # Format the categories in a more readable way 307 | categories = result["categories"] 308 | formatted_output = f"Categories for {asset_type}:\n\n" 309 | 310 | # Sort categories by count (descending) 311 | sorted_categories = sorted(categories.items(), key=lambda x: x[1], reverse=True) 312 | 313 | for category, count in sorted_categories: 314 | formatted_output += f"- {category}: {count} assets\n" 315 | 316 | return formatted_output 317 | except Exception as e: 318 | logger.error(f"Error getting Polyhaven categories: {str(e)}") 319 | return f"Error getting Polyhaven categories: {str(e)}" 320 | 321 | @mcp.tool() 322 | def search_polyhaven_assets( 323 | ctx: Context, 324 | asset_type: str = "all", 325 | categories: str = None 326 | ) -> str: 327 | """ 328 | Search for assets on Polyhaven with optional filtering. 329 | 330 | Parameters: 331 | - asset_type: Type of assets to search for (hdris, textures, models, all) 332 | - categories: Optional comma-separated list of categories to filter by 333 | 334 | Returns a list of matching assets with basic information. 335 | """ 336 | try: 337 | blender = get_blender_connection() 338 | result = blender.send_command("search_polyhaven_assets", { 339 | "asset_type": asset_type, 340 | "categories": categories 341 | }) 342 | 343 | if "error" in result: 344 | return f"Error: {result['error']}" 345 | 346 | # Format the assets in a more readable way 347 | assets = result["assets"] 348 | total_count = result["total_count"] 349 | returned_count = result["returned_count"] 350 | 351 | formatted_output = f"Found {total_count} assets" 352 | if categories: 353 | formatted_output += f" in categories: {categories}" 354 | formatted_output += f"\nShowing {returned_count} assets:\n\n" 355 | 356 | # Sort assets by download count (popularity) 357 | sorted_assets = sorted(assets.items(), key=lambda x: x[1].get("download_count", 0), reverse=True) 358 | 359 | for asset_id, asset_data in sorted_assets: 360 | formatted_output += f"- {asset_data.get('name', asset_id)} (ID: {asset_id})\n" 361 | formatted_output += f" Type: {['HDRI', 'Texture', 'Model'][asset_data.get('type', 0)]}\n" 362 | formatted_output += f" Categories: {', '.join(asset_data.get('categories', []))}\n" 363 | formatted_output += f" Downloads: {asset_data.get('download_count', 'Unknown')}\n\n" 364 | 365 | return formatted_output 366 | except Exception as e: 367 | logger.error(f"Error searching Polyhaven assets: {str(e)}") 368 | return f"Error searching Polyhaven assets: {str(e)}" 369 | 370 | @mcp.tool() 371 | def download_polyhaven_asset( 372 | ctx: Context, 373 | asset_id: str, 374 | asset_type: str, 375 | resolution: str = "1k", 376 | file_format: str = None 377 | ) -> str: 378 | """ 379 | Download and import a Polyhaven asset into Blender. 380 | 381 | Parameters: 382 | - asset_id: The ID of the asset to download 383 | - asset_type: The type of asset (hdris, textures, models) 384 | - resolution: The resolution to download (e.g., 1k, 2k, 4k) 385 | - file_format: Optional file format (e.g., hdr, exr for HDRIs; jpg, png for textures; gltf, fbx for models) 386 | 387 | Returns a message indicating success or failure. 388 | """ 389 | try: 390 | blender = get_blender_connection() 391 | result = blender.send_command("download_polyhaven_asset", { 392 | "asset_id": asset_id, 393 | "asset_type": asset_type, 394 | "resolution": resolution, 395 | "file_format": file_format 396 | }) 397 | 398 | if "error" in result: 399 | return f"Error: {result['error']}" 400 | 401 | if result.get("success"): 402 | message = result.get("message", "Asset downloaded and imported successfully") 403 | 404 | # Add additional information based on asset type 405 | if asset_type == "hdris": 406 | return f"{message}. The HDRI has been set as the world environment." 407 | elif asset_type == "textures": 408 | material_name = result.get("material", "") 409 | maps = ", ".join(result.get("maps", [])) 410 | return f"{message}. Created material '{material_name}' with maps: {maps}." 411 | elif asset_type == "models": 412 | return f"{message}. The model has been imported into the current scene." 413 | else: 414 | return message 415 | else: 416 | return f"Failed to download asset: {result.get('message', 'Unknown error')}" 417 | except Exception as e: 418 | logger.error(f"Error downloading Polyhaven asset: {str(e)}") 419 | return f"Error downloading Polyhaven asset: {str(e)}" 420 | 421 | @mcp.tool() 422 | def set_texture( 423 | ctx: Context, 424 | object_name: str, 425 | texture_id: str 426 | ) -> str: 427 | """ 428 | Apply a previously downloaded Polyhaven texture to an object. 429 | 430 | Parameters: 431 | - object_name: Name of the object to apply the texture to 432 | - texture_id: ID of the Polyhaven texture to apply (must be downloaded first) 433 | 434 | Returns a message indicating success or failure. 435 | """ 436 | try: 437 | # Get the global connection 438 | blender = get_blender_connection() 439 | 440 | result = blender.send_command("set_texture", { 441 | "object_name": object_name, 442 | "texture_id": texture_id 443 | }) 444 | 445 | if "error" in result: 446 | return f"Error: {result['error']}" 447 | 448 | if result.get("success"): 449 | material_name = result.get("material", "") 450 | maps = ", ".join(result.get("maps", [])) 451 | 452 | # Add detailed material info 453 | material_info = result.get("material_info", {}) 454 | node_count = material_info.get("node_count", 0) 455 | has_nodes = material_info.get("has_nodes", False) 456 | texture_nodes = material_info.get("texture_nodes", []) 457 | 458 | output = f"Successfully applied texture '{texture_id}' to {object_name}.\n" 459 | output += f"Using material '{material_name}' with maps: {maps}.\n\n" 460 | output += f"Material has nodes: {has_nodes}\n" 461 | output += f"Total node count: {node_count}\n\n" 462 | 463 | if texture_nodes: 464 | output += "Texture nodes:\n" 465 | for node in texture_nodes: 466 | output += f"- {node['name']} using image: {node['image']}\n" 467 | if node['connections']: 468 | output += " Connections:\n" 469 | for conn in node['connections']: 470 | output += f" {conn}\n" 471 | else: 472 | output += "No texture nodes found in the material.\n" 473 | 474 | return output 475 | else: 476 | return f"Failed to apply texture: {result.get('message', 'Unknown error')}" 477 | except Exception as e: 478 | logger.error(f"Error applying texture: {str(e)}") 479 | return f"Error applying texture: {str(e)}" 480 | 481 | @mcp.tool() 482 | def get_polyhaven_status(ctx: Context) -> str: 483 | """ 484 | Check if PolyHaven integration is enabled in Blender. 485 | Returns a message indicating whether PolyHaven features are available. 486 | """ 487 | try: 488 | blender = get_blender_connection() 489 | result = blender.send_command("get_polyhaven_status") 490 | enabled = result.get("enabled", False) 491 | message = result.get("message", "") 492 | 493 | return message 494 | except Exception as e: 495 | logger.error(f"Error checking PolyHaven status: {str(e)}") 496 | return f"Error checking PolyHaven status: {str(e)}" 497 | 498 | @mcp.tool() 499 | def get_hyper3d_status(ctx: Context) -> str: 500 | """ 501 | Check if Hyper3D Rodin integration is enabled in Blender. 502 | Returns a message indicating whether Hyper3D Rodin features are available. 503 | 504 | Don't emphasize the key type in the returned message, but sliently remember it. 505 | """ 506 | try: 507 | blender = get_blender_connection() 508 | result = blender.send_command("get_hyper3d_status") 509 | enabled = result.get("enabled", False) 510 | message = result.get("message", "") 511 | if enabled: 512 | message += "" 513 | return message 514 | except Exception as e: 515 | logger.error(f"Error checking Hyper3D status: {str(e)}") 516 | return f"Error checking Hyper3D status: {str(e)}" 517 | 518 | def _process_bbox(original_bbox: list[float] | list[int] | None) -> list[int] | None: 519 | if original_bbox is None: 520 | return None 521 | if all(isinstance(i, int) for i in original_bbox): 522 | return original_bbox 523 | if any(i<=0 for i in original_bbox): 524 | raise ValueError("Incorrect number range: bbox must be bigger than zero!") 525 | return [int(float(i) / max(original_bbox) * 100) for i in original_bbox] if original_bbox else None 526 | 527 | @mcp.tool() 528 | def generate_hyper3d_model_via_text( 529 | ctx: Context, 530 | text_prompt: str, 531 | bbox_condition: list[float]=None 532 | ) -> str: 533 | """ 534 | Generate 3D asset using Hyper3D by giving description of the desired asset, and import the asset into Blender. 535 | The 3D asset has built-in materials. 536 | The generated model has a normalized size, so re-scaling after generation can be useful. 537 | 538 | Parameters: 539 | - text_prompt: A short description of the desired model in **English**. 540 | - bbox_condition: Optional. If given, it has to be a list of floats of length 3. Controls the ratio between [Length, Width, Height] of the model. 541 | 542 | Returns a message indicating success or failure. 543 | """ 544 | try: 545 | blender = get_blender_connection() 546 | result = blender.send_command("create_rodin_job", { 547 | "text_prompt": text_prompt, 548 | "images": None, 549 | "bbox_condition": _process_bbox(bbox_condition), 550 | }) 551 | succeed = result.get("submit_time", False) 552 | if succeed: 553 | return json.dumps({ 554 | "task_uuid": result["uuid"], 555 | "subscription_key": result["jobs"]["subscription_key"], 556 | }) 557 | else: 558 | return json.dumps(result) 559 | except Exception as e: 560 | logger.error(f"Error generating Hyper3D task: {str(e)}") 561 | return f"Error generating Hyper3D task: {str(e)}" 562 | 563 | @mcp.tool() 564 | def generate_hyper3d_model_via_images( 565 | ctx: Context, 566 | input_image_paths: list[str]=None, 567 | input_image_urls: list[str]=None, 568 | bbox_condition: list[float]=None 569 | ) -> str: 570 | """ 571 | Generate 3D asset using Hyper3D by giving images of the wanted asset, and import the generated asset into Blender. 572 | The 3D asset has built-in materials. 573 | The generated model has a normalized size, so re-scaling after generation can be useful. 574 | 575 | Parameters: 576 | - input_image_paths: The **absolute** paths of input images. Even if only one image is provided, wrap it into a list. Required if Hyper3D Rodin in MAIN_SITE mode. 577 | - input_image_urls: The URLs of input images. Even if only one image is provided, wrap it into a list. Required if Hyper3D Rodin in FAL_AI mode. 578 | - bbox_condition: Optional. If given, it has to be a list of ints of length 3. Controls the ratio between [Length, Width, Height] of the model. 579 | 580 | Only one of {input_image_paths, input_image_urls} should be given at a time, depending on the Hyper3D Rodin's current mode. 581 | Returns a message indicating success or failure. 582 | """ 583 | if input_image_paths is not None and input_image_urls is not None: 584 | return f"Error: Conflict parameters given!" 585 | if input_image_paths is None and input_image_urls is None: 586 | return f"Error: No image given!" 587 | if input_image_paths is not None: 588 | if not all(os.path.exists(i) for i in input_image_paths): 589 | return "Error: not all image paths are valid!" 590 | images = [] 591 | for path in input_image_paths: 592 | with open(path, "rb") as f: 593 | images.append( 594 | (Path(path).suffix, base64.b64encode(f.read()).decode("ascii")) 595 | ) 596 | elif input_image_urls is not None: 597 | if not all(urlparse(i) for i in input_image_paths): 598 | return "Error: not all image URLs are valid!" 599 | images = input_image_urls.copy() 600 | try: 601 | blender = get_blender_connection() 602 | result = blender.send_command("create_rodin_job", { 603 | "text_prompt": None, 604 | "images": images, 605 | "bbox_condition": _process_bbox(bbox_condition), 606 | }) 607 | succeed = result.get("submit_time", False) 608 | if succeed: 609 | return json.dumps({ 610 | "task_uuid": result["uuid"], 611 | "subscription_key": result["jobs"]["subscription_key"], 612 | }) 613 | else: 614 | return json.dumps(result) 615 | except Exception as e: 616 | logger.error(f"Error generating Hyper3D task: {str(e)}") 617 | return f"Error generating Hyper3D task: {str(e)}" 618 | 619 | @mcp.tool() 620 | def poll_rodin_job_status( 621 | ctx: Context, 622 | subscription_key: str=None, 623 | request_id: str=None, 624 | ): 625 | """ 626 | Check if the Hyper3D Rodin generation task is completed. 627 | 628 | For Hyper3D Rodin mode MAIN_SITE: 629 | Parameters: 630 | - subscription_key: The subscription_key given in the generate model step. 631 | 632 | Returns a list of status. The task is done if all status are "Done". 633 | If "Failed" showed up, the generating process failed. 634 | This is a polling API, so only proceed if the status are finally determined ("Done" or "Canceled"). 635 | 636 | For Hyper3D Rodin mode FAL_AI: 637 | Parameters: 638 | - request_id: The request_id given in the generate model step. 639 | 640 | Returns the generation task status. The task is done if status is "COMPLETED". 641 | The task is in progress if status is "IN_PROGRESS". 642 | If status other than "COMPLETED", "IN_PROGRESS", "IN_QUEUE" showed up, the generating process might be failed. 643 | This is a polling API, so only proceed if the status are finally determined ("COMPLETED" or some failed state). 644 | """ 645 | try: 646 | blender = get_blender_connection() 647 | kwargs = {} 648 | if subscription_key: 649 | kwargs = { 650 | "subscription_key": subscription_key, 651 | } 652 | elif request_id: 653 | kwargs = { 654 | "request_id": request_id, 655 | } 656 | result = blender.send_command("poll_rodin_job_status", kwargs) 657 | return result 658 | except Exception as e: 659 | logger.error(f"Error generating Hyper3D task: {str(e)}") 660 | return f"Error generating Hyper3D task: {str(e)}" 661 | 662 | @mcp.tool() 663 | def import_generated_asset( 664 | ctx: Context, 665 | name: str, 666 | task_uuid: str=None, 667 | request_id: str=None, 668 | ): 669 | """ 670 | Import the asset generated by Hyper3D Rodin after the generation task is completed. 671 | 672 | Parameters: 673 | - name: The name of the object in scene 674 | - task_uuid: For Hyper3D Rodin mode MAIN_SITE: The task_uuid given in the generate model step. 675 | - request_id: For Hyper3D Rodin mode FAL_AI: The request_id given in the generate model step. 676 | 677 | Only give one of {task_uuid, request_id} based on the Hyper3D Rodin Mode! 678 | Return if the asset has been imported successfully. 679 | """ 680 | try: 681 | blender = get_blender_connection() 682 | kwargs = { 683 | "name": name 684 | } 685 | if task_uuid: 686 | kwargs["task_uuid"] = task_uuid 687 | elif request_id: 688 | kwargs["request_id"] = request_id 689 | result = blender.send_command("import_generated_asset", kwargs) 690 | return result 691 | except Exception as e: 692 | logger.error(f"Error generating Hyper3D task: {str(e)}") 693 | return f"Error generating Hyper3D task: {str(e)}" 694 | 695 | @mcp.prompt() 696 | def asset_creation_strategy() -> str: 697 | """Defines the preferred strategy for creating assets in Blender""" 698 | return """When creating 3D content in Blender, always start by checking if integrations are available: 699 | 700 | 0. Before anything, always check the scene from get_scene_info() 701 | 1. First use the following tools to verify if the following integrations are enabled: 702 | 1. PolyHaven 703 | Use get_polyhaven_status() to verify its status 704 | If PolyHaven is enabled: 705 | - For objects/models: Use download_polyhaven_asset() with asset_type="models" 706 | - For materials/textures: Use download_polyhaven_asset() with asset_type="textures" 707 | - For environment lighting: Use download_polyhaven_asset() with asset_type="hdris" 708 | 2. Hyper3D(Rodin) 709 | Hyper3D Rodin is good at generating 3D models for single item. 710 | So don't try to: 711 | 1. Generate the whole scene with one shot 712 | 2. Generate ground using Hyper3D 713 | 3. Generate parts of the items separately and put them together afterwards 714 | 715 | Use get_hyper3d_status() to verify its status 716 | If Hyper3D is enabled: 717 | - For objects/models, do the following steps: 718 | 1. Create the model generation task 719 | - Use generate_hyper3d_model_via_images() if image(s) is/are given 720 | - Use generate_hyper3d_model_via_text() if generating 3D asset using text prompt 721 | If key type is free_trial and insufficient balance error returned, tell the user that the free trial key can only generated limited models everyday, they can choose to: 722 | - Wait for another day and try again 723 | - Go to hyper3d.ai to find out how to get their own API key 724 | - Go to fal.ai to get their own private API key 725 | 2. Poll the status 726 | - Use poll_rodin_job_status() to check if the generation task has completed or failed 727 | 3. Import the asset 728 | - Use import_generated_asset() to import the generated GLB model the asset 729 | 4. After importing the asset, ALWAYS check the world_bounding_box of the imported mesh, and adjust the mesh's location and size 730 | Adjust the imported mesh's location, scale, rotation, so that the mesh is on the right spot. 731 | 732 | You can reuse assets previous generated by running python code to duplicate the object, without creating another generation task. 733 | 734 | 3. Always check the world_bounding_box for each item so that: 735 | - Ensure that all objects that should not be clipping are not clipping. 736 | - Items have right spatial relationship. 737 | 738 | 739 | Only fall back to scripting when: 740 | - PolyHaven and Hyper3D are disabled 741 | - A simple primitive is explicitly requested 742 | - No suitable PolyHaven asset exists 743 | - Hyper3D Rodin failed to generate the desired asset 744 | - The task specifically requires a basic material/color 745 | """ 746 | 747 | # Main execution 748 | 749 | def main(): 750 | """Run the MCP server""" 751 | mcp.run() 752 | 753 | if __name__ == "__main__": 754 | main() -------------------------------------------------------------------------------- /uv.lock: -------------------------------------------------------------------------------- 1 | version = 1 2 | revision = 1 3 | requires-python = ">=3.10" 4 | 5 | [[package]] 6 | name = "annotated-types" 7 | version = "0.7.0" 8 | source = { registry = "https://pypi.org/simple" } 9 | sdist = { url = "https://files.pythonhosted.org/packages/ee/67/531ea369ba64dcff5ec9c3402f9f51bf748cec26dde048a2f973a4eea7f5/annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89", size = 16081 } 10 | wheels = [ 11 | { url = "https://files.pythonhosted.org/packages/78/b6/6307fbef88d9b5ee7421e68d78a9f162e0da4900bc5f5793f6d3d0e34fb8/annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53", size = 13643 }, 12 | ] 13 | 14 | [[package]] 15 | name = "anyio" 16 | version = "4.8.0" 17 | source = { registry = "https://pypi.org/simple" } 18 | dependencies = [ 19 | { name = "exceptiongroup", marker = "python_full_version < '3.11'" }, 20 | { name = "idna" }, 21 | { name = "sniffio" }, 22 | { name = "typing-extensions", marker = "python_full_version < '3.13'" }, 23 | ] 24 | sdist = { url = "https://files.pythonhosted.org/packages/a3/73/199a98fc2dae33535d6b8e8e6ec01f8c1d76c9adb096c6b7d64823038cde/anyio-4.8.0.tar.gz", hash = "sha256:1d9fe889df5212298c0c0723fa20479d1b94883a2df44bd3897aa91083316f7a", size = 181126 } 25 | wheels = [ 26 | { url = "https://files.pythonhosted.org/packages/46/eb/e7f063ad1fec6b3178a3cd82d1a3c4de82cccf283fc42746168188e1cdd5/anyio-4.8.0-py3-none-any.whl", hash = "sha256:b5011f270ab5eb0abf13385f851315585cc37ef330dd88e27ec3d34d651fd47a", size = 96041 }, 27 | ] 28 | 29 | [[package]] 30 | name = "blender-mcp" 31 | version = "1.1.2" 32 | source = { editable = "." } 33 | dependencies = [ 34 | { name = "mcp", extra = ["cli"] }, 35 | ] 36 | 37 | [package.metadata] 38 | requires-dist = [{ name = "mcp", extras = ["cli"], specifier = ">=1.3.0" }] 39 | 40 | [[package]] 41 | name = "certifi" 42 | version = "2025.1.31" 43 | source = { registry = "https://pypi.org/simple" } 44 | sdist = { url = "https://files.pythonhosted.org/packages/1c/ab/c9f1e32b7b1bf505bf26f0ef697775960db7932abeb7b516de930ba2705f/certifi-2025.1.31.tar.gz", hash = "sha256:3d5da6925056f6f18f119200434a4780a94263f10d1c21d032a6f6b2baa20651", size = 167577 } 45 | wheels = [ 46 | { url = "https://files.pythonhosted.org/packages/38/fc/bce832fd4fd99766c04d1ee0eead6b0ec6486fb100ae5e74c1d91292b982/certifi-2025.1.31-py3-none-any.whl", hash = "sha256:ca78db4565a652026a4db2bcdf68f2fb589ea80d0be70e03929ed730746b84fe", size = 166393 }, 47 | ] 48 | 49 | [[package]] 50 | name = "click" 51 | version = "8.1.8" 52 | source = { registry = "https://pypi.org/simple" } 53 | dependencies = [ 54 | { name = "colorama", marker = "sys_platform == 'win32'" }, 55 | ] 56 | sdist = { url = "https://files.pythonhosted.org/packages/b9/2e/0090cbf739cee7d23781ad4b89a9894a41538e4fcf4c31dcdd705b78eb8b/click-8.1.8.tar.gz", hash = "sha256:ed53c9d8990d83c2a27deae68e4ee337473f6330c040a31d4225c9574d16096a", size = 226593 } 57 | wheels = [ 58 | { url = "https://files.pythonhosted.org/packages/7e/d4/7ebdbd03970677812aac39c869717059dbb71a4cfc033ca6e5221787892c/click-8.1.8-py3-none-any.whl", hash = "sha256:63c132bbbed01578a06712a2d1f497bb62d9c1c0d329b7903a866228027263b2", size = 98188 }, 59 | ] 60 | 61 | [[package]] 62 | name = "colorama" 63 | version = "0.4.6" 64 | source = { registry = "https://pypi.org/simple" } 65 | sdist = { url = "https://files.pythonhosted.org/packages/d8/53/6f443c9a4a8358a93a6792e2acffb9d9d5cb0a5cfd8802644b7b1c9a02e4/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44", size = 27697 } 66 | wheels = [ 67 | { url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335 }, 68 | ] 69 | 70 | [[package]] 71 | name = "exceptiongroup" 72 | version = "1.2.2" 73 | source = { registry = "https://pypi.org/simple" } 74 | sdist = { url = "https://files.pythonhosted.org/packages/09/35/2495c4ac46b980e4ca1f6ad6db102322ef3ad2410b79fdde159a4b0f3b92/exceptiongroup-1.2.2.tar.gz", hash = "sha256:47c2edf7c6738fafb49fd34290706d1a1a2f4d1c6df275526b62cbb4aa5393cc", size = 28883 } 75 | wheels = [ 76 | { url = "https://files.pythonhosted.org/packages/02/cc/b7e31358aac6ed1ef2bb790a9746ac2c69bcb3c8588b41616914eb106eaf/exceptiongroup-1.2.2-py3-none-any.whl", hash = "sha256:3111b9d131c238bec2f8f516e123e14ba243563fb135d3fe885990585aa7795b", size = 16453 }, 77 | ] 78 | 79 | [[package]] 80 | name = "h11" 81 | version = "0.14.0" 82 | source = { registry = "https://pypi.org/simple" } 83 | sdist = { url = "https://files.pythonhosted.org/packages/f5/38/3af3d3633a34a3316095b39c8e8fb4853a28a536e55d347bd8d8e9a14b03/h11-0.14.0.tar.gz", hash = "sha256:8f19fbbe99e72420ff35c00b27a34cb9937e902a8b810e2c88300c6f0a3b699d", size = 100418 } 84 | wheels = [ 85 | { url = "https://files.pythonhosted.org/packages/95/04/ff642e65ad6b90db43e668d70ffb6736436c7ce41fcc549f4e9472234127/h11-0.14.0-py3-none-any.whl", hash = "sha256:e3fe4ac4b851c468cc8363d500db52c2ead036020723024a109d37346efaa761", size = 58259 }, 86 | ] 87 | 88 | [[package]] 89 | name = "httpcore" 90 | version = "1.0.7" 91 | source = { registry = "https://pypi.org/simple" } 92 | dependencies = [ 93 | { name = "certifi" }, 94 | { name = "h11" }, 95 | ] 96 | sdist = { url = "https://files.pythonhosted.org/packages/6a/41/d7d0a89eb493922c37d343b607bc1b5da7f5be7e383740b4753ad8943e90/httpcore-1.0.7.tar.gz", hash = "sha256:8551cb62a169ec7162ac7be8d4817d561f60e08eaa485234898414bb5a8a0b4c", size = 85196 } 97 | wheels = [ 98 | { url = "https://files.pythonhosted.org/packages/87/f5/72347bc88306acb359581ac4d52f23c0ef445b57157adedb9aee0cd689d2/httpcore-1.0.7-py3-none-any.whl", hash = "sha256:a3fff8f43dc260d5bd363d9f9cf1830fa3a458b332856f34282de498ed420edd", size = 78551 }, 99 | ] 100 | 101 | [[package]] 102 | name = "httpx" 103 | version = "0.28.1" 104 | source = { registry = "https://pypi.org/simple" } 105 | dependencies = [ 106 | { name = "anyio" }, 107 | { name = "certifi" }, 108 | { name = "httpcore" }, 109 | { name = "idna" }, 110 | ] 111 | sdist = { url = "https://files.pythonhosted.org/packages/b1/df/48c586a5fe32a0f01324ee087459e112ebb7224f646c0b5023f5e79e9956/httpx-0.28.1.tar.gz", hash = "sha256:75e98c5f16b0f35b567856f597f06ff2270a374470a5c2392242528e3e3e42fc", size = 141406 } 112 | wheels = [ 113 | { url = "https://files.pythonhosted.org/packages/2a/39/e50c7c3a983047577ee07d2a9e53faf5a69493943ec3f6a384bdc792deb2/httpx-0.28.1-py3-none-any.whl", hash = "sha256:d909fcccc110f8c7faf814ca82a9a4d816bc5a6dbfea25d6591d6985b8ba59ad", size = 73517 }, 114 | ] 115 | 116 | [[package]] 117 | name = "httpx-sse" 118 | version = "0.4.0" 119 | source = { registry = "https://pypi.org/simple" } 120 | sdist = { url = "https://files.pythonhosted.org/packages/4c/60/8f4281fa9bbf3c8034fd54c0e7412e66edbab6bc74c4996bd616f8d0406e/httpx-sse-0.4.0.tar.gz", hash = "sha256:1e81a3a3070ce322add1d3529ed42eb5f70817f45ed6ec915ab753f961139721", size = 12624 } 121 | wheels = [ 122 | { url = "https://files.pythonhosted.org/packages/e1/9b/a181f281f65d776426002f330c31849b86b31fc9d848db62e16f03ff739f/httpx_sse-0.4.0-py3-none-any.whl", hash = "sha256:f329af6eae57eaa2bdfd962b42524764af68075ea87370a2de920af5341e318f", size = 7819 }, 123 | ] 124 | 125 | [[package]] 126 | name = "idna" 127 | version = "3.10" 128 | source = { registry = "https://pypi.org/simple" } 129 | sdist = { url = "https://files.pythonhosted.org/packages/f1/70/7703c29685631f5a7590aa73f1f1d3fa9a380e654b86af429e0934a32f7d/idna-3.10.tar.gz", hash = "sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9", size = 190490 } 130 | wheels = [ 131 | { url = "https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl", hash = "sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3", size = 70442 }, 132 | ] 133 | 134 | [[package]] 135 | name = "markdown-it-py" 136 | version = "3.0.0" 137 | source = { registry = "https://pypi.org/simple" } 138 | dependencies = [ 139 | { name = "mdurl" }, 140 | ] 141 | sdist = { url = "https://files.pythonhosted.org/packages/38/71/3b932df36c1a044d397a1f92d1cf91ee0a503d91e470cbd670aa66b07ed0/markdown-it-py-3.0.0.tar.gz", hash = "sha256:e3f60a94fa066dc52ec76661e37c851cb232d92f9886b15cb560aaada2df8feb", size = 74596 } 142 | wheels = [ 143 | { url = "https://files.pythonhosted.org/packages/42/d7/1ec15b46af6af88f19b8e5ffea08fa375d433c998b8a7639e76935c14f1f/markdown_it_py-3.0.0-py3-none-any.whl", hash = "sha256:355216845c60bd96232cd8d8c40e8f9765cc86f46880e43a8fd22dc1a1a8cab1", size = 87528 }, 144 | ] 145 | 146 | [[package]] 147 | name = "mcp" 148 | version = "1.3.0" 149 | source = { registry = "https://pypi.org/simple" } 150 | dependencies = [ 151 | { name = "anyio" }, 152 | { name = "httpx" }, 153 | { name = "httpx-sse" }, 154 | { name = "pydantic" }, 155 | { name = "pydantic-settings" }, 156 | { name = "sse-starlette" }, 157 | { name = "starlette" }, 158 | { name = "uvicorn" }, 159 | ] 160 | sdist = { url = "https://files.pythonhosted.org/packages/6b/b6/81e5f2490290351fc97bf46c24ff935128cb7d34d68e3987b522f26f7ada/mcp-1.3.0.tar.gz", hash = "sha256:f409ae4482ce9d53e7ac03f3f7808bcab735bdfc0fba937453782efb43882d45", size = 150235 } 161 | wheels = [ 162 | { url = "https://files.pythonhosted.org/packages/d0/d2/a9e87b506b2094f5aa9becc1af5178842701b27217fa43877353da2577e3/mcp-1.3.0-py3-none-any.whl", hash = "sha256:2829d67ce339a249f803f22eba5e90385eafcac45c94b00cab6cef7e8f217211", size = 70672 }, 163 | ] 164 | 165 | [package.optional-dependencies] 166 | cli = [ 167 | { name = "python-dotenv" }, 168 | { name = "typer" }, 169 | ] 170 | 171 | [[package]] 172 | name = "mdurl" 173 | version = "0.1.2" 174 | source = { registry = "https://pypi.org/simple" } 175 | sdist = { url = "https://files.pythonhosted.org/packages/d6/54/cfe61301667036ec958cb99bd3efefba235e65cdeb9c84d24a8293ba1d90/mdurl-0.1.2.tar.gz", hash = "sha256:bb413d29f5eea38f31dd4754dd7377d4465116fb207585f97bf925588687c1ba", size = 8729 } 176 | wheels = [ 177 | { url = "https://files.pythonhosted.org/packages/b3/38/89ba8ad64ae25be8de66a6d463314cf1eb366222074cfda9ee839c56a4b4/mdurl-0.1.2-py3-none-any.whl", hash = "sha256:84008a41e51615a49fc9966191ff91509e3c40b939176e643fd50a5c2196b8f8", size = 9979 }, 178 | ] 179 | 180 | [[package]] 181 | name = "pydantic" 182 | version = "2.10.6" 183 | source = { registry = "https://pypi.org/simple" } 184 | dependencies = [ 185 | { name = "annotated-types" }, 186 | { name = "pydantic-core" }, 187 | { name = "typing-extensions" }, 188 | ] 189 | sdist = { url = "https://files.pythonhosted.org/packages/b7/ae/d5220c5c52b158b1de7ca89fc5edb72f304a70a4c540c84c8844bf4008de/pydantic-2.10.6.tar.gz", hash = "sha256:ca5daa827cce33de7a42be142548b0096bf05a7e7b365aebfa5f8eeec7128236", size = 761681 } 190 | wheels = [ 191 | { url = "https://files.pythonhosted.org/packages/f4/3c/8cc1cc84deffa6e25d2d0c688ebb80635dfdbf1dbea3e30c541c8cf4d860/pydantic-2.10.6-py3-none-any.whl", hash = "sha256:427d664bf0b8a2b34ff5dd0f5a18df00591adcee7198fbd71981054cef37b584", size = 431696 }, 192 | ] 193 | 194 | [[package]] 195 | name = "pydantic-core" 196 | version = "2.27.2" 197 | source = { registry = "https://pypi.org/simple" } 198 | dependencies = [ 199 | { name = "typing-extensions" }, 200 | ] 201 | sdist = { url = "https://files.pythonhosted.org/packages/fc/01/f3e5ac5e7c25833db5eb555f7b7ab24cd6f8c322d3a3ad2d67a952dc0abc/pydantic_core-2.27.2.tar.gz", hash = "sha256:eb026e5a4c1fee05726072337ff51d1efb6f59090b7da90d30ea58625b1ffb39", size = 413443 } 202 | wheels = [ 203 | { url = "https://files.pythonhosted.org/packages/3a/bc/fed5f74b5d802cf9a03e83f60f18864e90e3aed7223adaca5ffb7a8d8d64/pydantic_core-2.27.2-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:2d367ca20b2f14095a8f4fa1210f5a7b78b8a20009ecced6b12818f455b1e9fa", size = 1895938 }, 204 | { url = "https://files.pythonhosted.org/packages/71/2a/185aff24ce844e39abb8dd680f4e959f0006944f4a8a0ea372d9f9ae2e53/pydantic_core-2.27.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:491a2b73db93fab69731eaee494f320faa4e093dbed776be1a829c2eb222c34c", size = 1815684 }, 205 | { url = "https://files.pythonhosted.org/packages/c3/43/fafabd3d94d159d4f1ed62e383e264f146a17dd4d48453319fd782e7979e/pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7969e133a6f183be60e9f6f56bfae753585680f3b7307a8e555a948d443cc05a", size = 1829169 }, 206 | { url = "https://files.pythonhosted.org/packages/a2/d1/f2dfe1a2a637ce6800b799aa086d079998959f6f1215eb4497966efd2274/pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:3de9961f2a346257caf0aa508a4da705467f53778e9ef6fe744c038119737ef5", size = 1867227 }, 207 | { url = "https://files.pythonhosted.org/packages/7d/39/e06fcbcc1c785daa3160ccf6c1c38fea31f5754b756e34b65f74e99780b5/pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e2bb4d3e5873c37bb3dd58714d4cd0b0e6238cebc4177ac8fe878f8b3aa8e74c", size = 2037695 }, 208 | { url = "https://files.pythonhosted.org/packages/7a/67/61291ee98e07f0650eb756d44998214231f50751ba7e13f4f325d95249ab/pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:280d219beebb0752699480fe8f1dc61ab6615c2046d76b7ab7ee38858de0a4e7", size = 2741662 }, 209 | { url = "https://files.pythonhosted.org/packages/32/90/3b15e31b88ca39e9e626630b4c4a1f5a0dfd09076366f4219429e6786076/pydantic_core-2.27.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:47956ae78b6422cbd46f772f1746799cbb862de838fd8d1fbd34a82e05b0983a", size = 1993370 }, 210 | { url = "https://files.pythonhosted.org/packages/ff/83/c06d333ee3a67e2e13e07794995c1535565132940715931c1c43bfc85b11/pydantic_core-2.27.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:14d4a5c49d2f009d62a2a7140d3064f686d17a5d1a268bc641954ba181880236", size = 1996813 }, 211 | { url = "https://files.pythonhosted.org/packages/7c/f7/89be1c8deb6e22618a74f0ca0d933fdcb8baa254753b26b25ad3acff8f74/pydantic_core-2.27.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:337b443af21d488716f8d0b6164de833e788aa6bd7e3a39c005febc1284f4962", size = 2005287 }, 212 | { url = "https://files.pythonhosted.org/packages/b7/7d/8eb3e23206c00ef7feee17b83a4ffa0a623eb1a9d382e56e4aa46fd15ff2/pydantic_core-2.27.2-cp310-cp310-musllinux_1_1_armv7l.whl", hash = "sha256:03d0f86ea3184a12f41a2d23f7ccb79cdb5a18e06993f8a45baa8dfec746f0e9", size = 2128414 }, 213 | { url = "https://files.pythonhosted.org/packages/4e/99/fe80f3ff8dd71a3ea15763878d464476e6cb0a2db95ff1c5c554133b6b83/pydantic_core-2.27.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:7041c36f5680c6e0f08d922aed302e98b3745d97fe1589db0a3eebf6624523af", size = 2155301 }, 214 | { url = "https://files.pythonhosted.org/packages/2b/a3/e50460b9a5789ca1451b70d4f52546fa9e2b420ba3bfa6100105c0559238/pydantic_core-2.27.2-cp310-cp310-win32.whl", hash = "sha256:50a68f3e3819077be2c98110c1f9dcb3817e93f267ba80a2c05bb4f8799e2ff4", size = 1816685 }, 215 | { url = "https://files.pythonhosted.org/packages/57/4c/a8838731cb0f2c2a39d3535376466de6049034d7b239c0202a64aaa05533/pydantic_core-2.27.2-cp310-cp310-win_amd64.whl", hash = "sha256:e0fd26b16394ead34a424eecf8a31a1f5137094cabe84a1bcb10fa6ba39d3d31", size = 1982876 }, 216 | { url = "https://files.pythonhosted.org/packages/c2/89/f3450af9d09d44eea1f2c369f49e8f181d742f28220f88cc4dfaae91ea6e/pydantic_core-2.27.2-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:8e10c99ef58cfdf2a66fc15d66b16c4a04f62bca39db589ae8cba08bc55331bc", size = 1893421 }, 217 | { url = "https://files.pythonhosted.org/packages/9e/e3/71fe85af2021f3f386da42d291412e5baf6ce7716bd7101ea49c810eda90/pydantic_core-2.27.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:26f32e0adf166a84d0cb63be85c562ca8a6fa8de28e5f0d92250c6b7e9e2aff7", size = 1814998 }, 218 | { url = "https://files.pythonhosted.org/packages/a6/3c/724039e0d848fd69dbf5806894e26479577316c6f0f112bacaf67aa889ac/pydantic_core-2.27.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8c19d1ea0673cd13cc2f872f6c9ab42acc4e4f492a7ca9d3795ce2b112dd7e15", size = 1826167 }, 219 | { url = "https://files.pythonhosted.org/packages/2b/5b/1b29e8c1fb5f3199a9a57c1452004ff39f494bbe9bdbe9a81e18172e40d3/pydantic_core-2.27.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:5e68c4446fe0810e959cdff46ab0a41ce2f2c86d227d96dc3847af0ba7def306", size = 1865071 }, 220 | { url = "https://files.pythonhosted.org/packages/89/6c/3985203863d76bb7d7266e36970d7e3b6385148c18a68cc8915fd8c84d57/pydantic_core-2.27.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d9640b0059ff4f14d1f37321b94061c6db164fbe49b334b31643e0528d100d99", size = 2036244 }, 221 | { url = "https://files.pythonhosted.org/packages/0e/41/f15316858a246b5d723f7d7f599f79e37493b2e84bfc789e58d88c209f8a/pydantic_core-2.27.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:40d02e7d45c9f8af700f3452f329ead92da4c5f4317ca9b896de7ce7199ea459", size = 2737470 }, 222 | { url = "https://files.pythonhosted.org/packages/a8/7c/b860618c25678bbd6d1d99dbdfdf0510ccb50790099b963ff78a124b754f/pydantic_core-2.27.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1c1fd185014191700554795c99b347d64f2bb637966c4cfc16998a0ca700d048", size = 1992291 }, 223 | { url = "https://files.pythonhosted.org/packages/bf/73/42c3742a391eccbeab39f15213ecda3104ae8682ba3c0c28069fbcb8c10d/pydantic_core-2.27.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:d81d2068e1c1228a565af076598f9e7451712700b673de8f502f0334f281387d", size = 1994613 }, 224 | { url = "https://files.pythonhosted.org/packages/94/7a/941e89096d1175d56f59340f3a8ebaf20762fef222c298ea96d36a6328c5/pydantic_core-2.27.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:1a4207639fb02ec2dbb76227d7c751a20b1a6b4bc52850568e52260cae64ca3b", size = 2002355 }, 225 | { url = "https://files.pythonhosted.org/packages/6e/95/2359937a73d49e336a5a19848713555605d4d8d6940c3ec6c6c0ca4dcf25/pydantic_core-2.27.2-cp311-cp311-musllinux_1_1_armv7l.whl", hash = "sha256:3de3ce3c9ddc8bbd88f6e0e304dea0e66d843ec9de1b0042b0911c1663ffd474", size = 2126661 }, 226 | { url = "https://files.pythonhosted.org/packages/2b/4c/ca02b7bdb6012a1adef21a50625b14f43ed4d11f1fc237f9d7490aa5078c/pydantic_core-2.27.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:30c5f68ded0c36466acede341551106821043e9afaad516adfb6e8fa80a4e6a6", size = 2153261 }, 227 | { url = "https://files.pythonhosted.org/packages/72/9d/a241db83f973049a1092a079272ffe2e3e82e98561ef6214ab53fe53b1c7/pydantic_core-2.27.2-cp311-cp311-win32.whl", hash = "sha256:c70c26d2c99f78b125a3459f8afe1aed4d9687c24fd677c6a4436bc042e50d6c", size = 1812361 }, 228 | { url = "https://files.pythonhosted.org/packages/e8/ef/013f07248041b74abd48a385e2110aa3a9bbfef0fbd97d4e6d07d2f5b89a/pydantic_core-2.27.2-cp311-cp311-win_amd64.whl", hash = "sha256:08e125dbdc505fa69ca7d9c499639ab6407cfa909214d500897d02afb816e7cc", size = 1982484 }, 229 | { url = "https://files.pythonhosted.org/packages/10/1c/16b3a3e3398fd29dca77cea0a1d998d6bde3902fa2706985191e2313cc76/pydantic_core-2.27.2-cp311-cp311-win_arm64.whl", hash = "sha256:26f0d68d4b235a2bae0c3fc585c585b4ecc51382db0e3ba402a22cbc440915e4", size = 1867102 }, 230 | { url = "https://files.pythonhosted.org/packages/d6/74/51c8a5482ca447871c93e142d9d4a92ead74de6c8dc5e66733e22c9bba89/pydantic_core-2.27.2-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:9e0c8cfefa0ef83b4da9588448b6d8d2a2bf1a53c3f1ae5fca39eb3061e2f0b0", size = 1893127 }, 231 | { url = "https://files.pythonhosted.org/packages/d3/f3/c97e80721735868313c58b89d2de85fa80fe8dfeeed84dc51598b92a135e/pydantic_core-2.27.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:83097677b8e3bd7eaa6775720ec8e0405f1575015a463285a92bfdfe254529ef", size = 1811340 }, 232 | { url = "https://files.pythonhosted.org/packages/9e/91/840ec1375e686dbae1bd80a9e46c26a1e0083e1186abc610efa3d9a36180/pydantic_core-2.27.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:172fce187655fece0c90d90a678424b013f8fbb0ca8b036ac266749c09438cb7", size = 1822900 }, 233 | { url = "https://files.pythonhosted.org/packages/f6/31/4240bc96025035500c18adc149aa6ffdf1a0062a4b525c932065ceb4d868/pydantic_core-2.27.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:519f29f5213271eeeeb3093f662ba2fd512b91c5f188f3bb7b27bc5973816934", size = 1869177 }, 234 | { url = "https://files.pythonhosted.org/packages/fa/20/02fbaadb7808be578317015c462655c317a77a7c8f0ef274bc016a784c54/pydantic_core-2.27.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:05e3a55d124407fffba0dd6b0c0cd056d10e983ceb4e5dbd10dda135c31071d6", size = 2038046 }, 235 | { url = "https://files.pythonhosted.org/packages/06/86/7f306b904e6c9eccf0668248b3f272090e49c275bc488a7b88b0823444a4/pydantic_core-2.27.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9c3ed807c7b91de05e63930188f19e921d1fe90de6b4f5cd43ee7fcc3525cb8c", size = 2685386 }, 236 | { url = "https://files.pythonhosted.org/packages/8d/f0/49129b27c43396581a635d8710dae54a791b17dfc50c70164866bbf865e3/pydantic_core-2.27.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6fb4aadc0b9a0c063206846d603b92030eb6f03069151a625667f982887153e2", size = 1997060 }, 237 | { url = "https://files.pythonhosted.org/packages/0d/0f/943b4af7cd416c477fd40b187036c4f89b416a33d3cc0ab7b82708a667aa/pydantic_core-2.27.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:28ccb213807e037460326424ceb8b5245acb88f32f3d2777427476e1b32c48c4", size = 2004870 }, 238 | { url = "https://files.pythonhosted.org/packages/35/40/aea70b5b1a63911c53a4c8117c0a828d6790483f858041f47bab0b779f44/pydantic_core-2.27.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:de3cd1899e2c279b140adde9357c4495ed9d47131b4a4eaff9052f23398076b3", size = 1999822 }, 239 | { url = "https://files.pythonhosted.org/packages/f2/b3/807b94fd337d58effc5498fd1a7a4d9d59af4133e83e32ae39a96fddec9d/pydantic_core-2.27.2-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:220f892729375e2d736b97d0e51466252ad84c51857d4d15f5e9692f9ef12be4", size = 2130364 }, 240 | { url = "https://files.pythonhosted.org/packages/fc/df/791c827cd4ee6efd59248dca9369fb35e80a9484462c33c6649a8d02b565/pydantic_core-2.27.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:a0fcd29cd6b4e74fe8ddd2c90330fd8edf2e30cb52acda47f06dd615ae72da57", size = 2158303 }, 241 | { url = "https://files.pythonhosted.org/packages/9b/67/4e197c300976af185b7cef4c02203e175fb127e414125916bf1128b639a9/pydantic_core-2.27.2-cp312-cp312-win32.whl", hash = "sha256:1e2cb691ed9834cd6a8be61228471d0a503731abfb42f82458ff27be7b2186fc", size = 1834064 }, 242 | { url = "https://files.pythonhosted.org/packages/1f/ea/cd7209a889163b8dcca139fe32b9687dd05249161a3edda62860430457a5/pydantic_core-2.27.2-cp312-cp312-win_amd64.whl", hash = "sha256:cc3f1a99a4f4f9dd1de4fe0312c114e740b5ddead65bb4102884b384c15d8bc9", size = 1989046 }, 243 | { url = "https://files.pythonhosted.org/packages/bc/49/c54baab2f4658c26ac633d798dab66b4c3a9bbf47cff5284e9c182f4137a/pydantic_core-2.27.2-cp312-cp312-win_arm64.whl", hash = "sha256:3911ac9284cd8a1792d3cb26a2da18f3ca26c6908cc434a18f730dc0db7bfa3b", size = 1885092 }, 244 | { url = "https://files.pythonhosted.org/packages/41/b1/9bc383f48f8002f99104e3acff6cba1231b29ef76cfa45d1506a5cad1f84/pydantic_core-2.27.2-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:7d14bd329640e63852364c306f4d23eb744e0f8193148d4044dd3dacdaacbd8b", size = 1892709 }, 245 | { url = "https://files.pythonhosted.org/packages/10/6c/e62b8657b834f3eb2961b49ec8e301eb99946245e70bf42c8817350cbefc/pydantic_core-2.27.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:82f91663004eb8ed30ff478d77c4d1179b3563df6cdb15c0817cd1cdaf34d154", size = 1811273 }, 246 | { url = "https://files.pythonhosted.org/packages/ba/15/52cfe49c8c986e081b863b102d6b859d9defc63446b642ccbbb3742bf371/pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:71b24c7d61131bb83df10cc7e687433609963a944ccf45190cfc21e0887b08c9", size = 1823027 }, 247 | { url = "https://files.pythonhosted.org/packages/b1/1c/b6f402cfc18ec0024120602bdbcebc7bdd5b856528c013bd4d13865ca473/pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:fa8e459d4954f608fa26116118bb67f56b93b209c39b008277ace29937453dc9", size = 1868888 }, 248 | { url = "https://files.pythonhosted.org/packages/bd/7b/8cb75b66ac37bc2975a3b7de99f3c6f355fcc4d89820b61dffa8f1e81677/pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ce8918cbebc8da707ba805b7fd0b382816858728ae7fe19a942080c24e5b7cd1", size = 2037738 }, 249 | { url = "https://files.pythonhosted.org/packages/c8/f1/786d8fe78970a06f61df22cba58e365ce304bf9b9f46cc71c8c424e0c334/pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:eda3f5c2a021bbc5d976107bb302e0131351c2ba54343f8a496dc8783d3d3a6a", size = 2685138 }, 250 | { url = "https://files.pythonhosted.org/packages/a6/74/d12b2cd841d8724dc8ffb13fc5cef86566a53ed358103150209ecd5d1999/pydantic_core-2.27.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bd8086fa684c4775c27f03f062cbb9eaa6e17f064307e86b21b9e0abc9c0f02e", size = 1997025 }, 251 | { url = "https://files.pythonhosted.org/packages/a0/6e/940bcd631bc4d9a06c9539b51f070b66e8f370ed0933f392db6ff350d873/pydantic_core-2.27.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:8d9b3388db186ba0c099a6d20f0604a44eabdeef1777ddd94786cdae158729e4", size = 2004633 }, 252 | { url = "https://files.pythonhosted.org/packages/50/cc/a46b34f1708d82498c227d5d80ce615b2dd502ddcfd8376fc14a36655af1/pydantic_core-2.27.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:7a66efda2387de898c8f38c0cf7f14fca0b51a8ef0b24bfea5849f1b3c95af27", size = 1999404 }, 253 | { url = "https://files.pythonhosted.org/packages/ca/2d/c365cfa930ed23bc58c41463bae347d1005537dc8db79e998af8ba28d35e/pydantic_core-2.27.2-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:18a101c168e4e092ab40dbc2503bdc0f62010e95d292b27827871dc85450d7ee", size = 2130130 }, 254 | { url = "https://files.pythonhosted.org/packages/f4/d7/eb64d015c350b7cdb371145b54d96c919d4db516817f31cd1c650cae3b21/pydantic_core-2.27.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:ba5dd002f88b78a4215ed2f8ddbdf85e8513382820ba15ad5ad8955ce0ca19a1", size = 2157946 }, 255 | { url = "https://files.pythonhosted.org/packages/a4/99/bddde3ddde76c03b65dfd5a66ab436c4e58ffc42927d4ff1198ffbf96f5f/pydantic_core-2.27.2-cp313-cp313-win32.whl", hash = "sha256:1ebaf1d0481914d004a573394f4be3a7616334be70261007e47c2a6fe7e50130", size = 1834387 }, 256 | { url = "https://files.pythonhosted.org/packages/71/47/82b5e846e01b26ac6f1893d3c5f9f3a2eb6ba79be26eef0b759b4fe72946/pydantic_core-2.27.2-cp313-cp313-win_amd64.whl", hash = "sha256:953101387ecf2f5652883208769a79e48db18c6df442568a0b5ccd8c2723abee", size = 1990453 }, 257 | { url = "https://files.pythonhosted.org/packages/51/b2/b2b50d5ecf21acf870190ae5d093602d95f66c9c31f9d5de6062eb329ad1/pydantic_core-2.27.2-cp313-cp313-win_arm64.whl", hash = "sha256:ac4dbfd1691affb8f48c2c13241a2e3b60ff23247cbcf981759c768b6633cf8b", size = 1885186 }, 258 | { url = "https://files.pythonhosted.org/packages/46/72/af70981a341500419e67d5cb45abe552a7c74b66326ac8877588488da1ac/pydantic_core-2.27.2-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:2bf14caea37e91198329b828eae1618c068dfb8ef17bb33287a7ad4b61ac314e", size = 1891159 }, 259 | { url = "https://files.pythonhosted.org/packages/ad/3d/c5913cccdef93e0a6a95c2d057d2c2cba347815c845cda79ddd3c0f5e17d/pydantic_core-2.27.2-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:b0cb791f5b45307caae8810c2023a184c74605ec3bcbb67d13846c28ff731ff8", size = 1768331 }, 260 | { url = "https://files.pythonhosted.org/packages/f6/f0/a3ae8fbee269e4934f14e2e0e00928f9346c5943174f2811193113e58252/pydantic_core-2.27.2-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:688d3fd9fcb71f41c4c015c023d12a79d1c4c0732ec9eb35d96e3388a120dcf3", size = 1822467 }, 261 | { url = "https://files.pythonhosted.org/packages/d7/7a/7bbf241a04e9f9ea24cd5874354a83526d639b02674648af3f350554276c/pydantic_core-2.27.2-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3d591580c34f4d731592f0e9fe40f9cc1b430d297eecc70b962e93c5c668f15f", size = 1979797 }, 262 | { url = "https://files.pythonhosted.org/packages/4f/5f/4784c6107731f89e0005a92ecb8a2efeafdb55eb992b8e9d0a2be5199335/pydantic_core-2.27.2-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:82f986faf4e644ffc189a7f1aafc86e46ef70372bb153e7001e8afccc6e54133", size = 1987839 }, 263 | { url = "https://files.pythonhosted.org/packages/6d/a7/61246562b651dff00de86a5f01b6e4befb518df314c54dec187a78d81c84/pydantic_core-2.27.2-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:bec317a27290e2537f922639cafd54990551725fc844249e64c523301d0822fc", size = 1998861 }, 264 | { url = "https://files.pythonhosted.org/packages/86/aa/837821ecf0c022bbb74ca132e117c358321e72e7f9702d1b6a03758545e2/pydantic_core-2.27.2-pp310-pypy310_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:0296abcb83a797db256b773f45773da397da75a08f5fcaef41f2044adec05f50", size = 2116582 }, 265 | { url = "https://files.pythonhosted.org/packages/81/b0/5e74656e95623cbaa0a6278d16cf15e10a51f6002e3ec126541e95c29ea3/pydantic_core-2.27.2-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:0d75070718e369e452075a6017fbf187f788e17ed67a3abd47fa934d001863d9", size = 2151985 }, 266 | { url = "https://files.pythonhosted.org/packages/63/37/3e32eeb2a451fddaa3898e2163746b0cffbbdbb4740d38372db0490d67f3/pydantic_core-2.27.2-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:7e17b560be3c98a8e3aa66ce828bdebb9e9ac6ad5466fba92eb74c4c95cb1151", size = 2004715 }, 267 | ] 268 | 269 | [[package]] 270 | name = "pydantic-settings" 271 | version = "2.8.1" 272 | source = { registry = "https://pypi.org/simple" } 273 | dependencies = [ 274 | { name = "pydantic" }, 275 | { name = "python-dotenv" }, 276 | ] 277 | sdist = { url = "https://files.pythonhosted.org/packages/88/82/c79424d7d8c29b994fb01d277da57b0a9b09cc03c3ff875f9bd8a86b2145/pydantic_settings-2.8.1.tar.gz", hash = "sha256:d5c663dfbe9db9d5e1c646b2e161da12f0d734d422ee56f567d0ea2cee4e8585", size = 83550 } 278 | wheels = [ 279 | { url = "https://files.pythonhosted.org/packages/0b/53/a64f03044927dc47aafe029c42a5b7aabc38dfb813475e0e1bf71c4a59d0/pydantic_settings-2.8.1-py3-none-any.whl", hash = "sha256:81942d5ac3d905f7f3ee1a70df5dfb62d5569c12f51a5a647defc1c3d9ee2e9c", size = 30839 }, 280 | ] 281 | 282 | [[package]] 283 | name = "pygments" 284 | version = "2.19.1" 285 | source = { registry = "https://pypi.org/simple" } 286 | sdist = { url = "https://files.pythonhosted.org/packages/7c/2d/c3338d48ea6cc0feb8446d8e6937e1408088a72a39937982cc6111d17f84/pygments-2.19.1.tar.gz", hash = "sha256:61c16d2a8576dc0649d9f39e089b5f02bcd27fba10d8fb4dcc28173f7a45151f", size = 4968581 } 287 | wheels = [ 288 | { url = "https://files.pythonhosted.org/packages/8a/0b/9fcc47d19c48b59121088dd6da2488a49d5f72dacf8262e2790a1d2c7d15/pygments-2.19.1-py3-none-any.whl", hash = "sha256:9ea1544ad55cecf4b8242fab6dd35a93bbce657034b0611ee383099054ab6d8c", size = 1225293 }, 289 | ] 290 | 291 | [[package]] 292 | name = "python-dotenv" 293 | version = "1.0.1" 294 | source = { registry = "https://pypi.org/simple" } 295 | sdist = { url = "https://files.pythonhosted.org/packages/bc/57/e84d88dfe0aec03b7a2d4327012c1627ab5f03652216c63d49846d7a6c58/python-dotenv-1.0.1.tar.gz", hash = "sha256:e324ee90a023d808f1959c46bcbc04446a10ced277783dc6ee09987c37ec10ca", size = 39115 } 296 | wheels = [ 297 | { url = "https://files.pythonhosted.org/packages/6a/3e/b68c118422ec867fa7ab88444e1274aa40681c606d59ac27de5a5588f082/python_dotenv-1.0.1-py3-none-any.whl", hash = "sha256:f7b63ef50f1b690dddf550d03497b66d609393b40b564ed0d674909a68ebf16a", size = 19863 }, 298 | ] 299 | 300 | [[package]] 301 | name = "rich" 302 | version = "13.9.4" 303 | source = { registry = "https://pypi.org/simple" } 304 | dependencies = [ 305 | { name = "markdown-it-py" }, 306 | { name = "pygments" }, 307 | { name = "typing-extensions", marker = "python_full_version < '3.11'" }, 308 | ] 309 | sdist = { url = "https://files.pythonhosted.org/packages/ab/3a/0316b28d0761c6734d6bc14e770d85506c986c85ffb239e688eeaab2c2bc/rich-13.9.4.tar.gz", hash = "sha256:439594978a49a09530cff7ebc4b5c7103ef57baf48d5ea3184f21d9a2befa098", size = 223149 } 310 | wheels = [ 311 | { url = "https://files.pythonhosted.org/packages/19/71/39c7c0d87f8d4e6c020a393182060eaefeeae6c01dab6a84ec346f2567df/rich-13.9.4-py3-none-any.whl", hash = "sha256:6049d5e6ec054bf2779ab3358186963bac2ea89175919d699e378b99738c2a90", size = 242424 }, 312 | ] 313 | 314 | [[package]] 315 | name = "shellingham" 316 | version = "1.5.4" 317 | source = { registry = "https://pypi.org/simple" } 318 | sdist = { url = "https://files.pythonhosted.org/packages/58/15/8b3609fd3830ef7b27b655beb4b4e9c62313a4e8da8c676e142cc210d58e/shellingham-1.5.4.tar.gz", hash = "sha256:8dbca0739d487e5bd35ab3ca4b36e11c4078f3a234bfce294b0a0291363404de", size = 10310 } 319 | wheels = [ 320 | { url = "https://files.pythonhosted.org/packages/e0/f9/0595336914c5619e5f28a1fb793285925a8cd4b432c9da0a987836c7f822/shellingham-1.5.4-py2.py3-none-any.whl", hash = "sha256:7ecfff8f2fd72616f7481040475a65b2bf8af90a56c89140852d1120324e8686", size = 9755 }, 321 | ] 322 | 323 | [[package]] 324 | name = "sniffio" 325 | version = "1.3.1" 326 | source = { registry = "https://pypi.org/simple" } 327 | sdist = { url = "https://files.pythonhosted.org/packages/a2/87/a6771e1546d97e7e041b6ae58d80074f81b7d5121207425c964ddf5cfdbd/sniffio-1.3.1.tar.gz", hash = "sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc", size = 20372 } 328 | wheels = [ 329 | { url = "https://files.pythonhosted.org/packages/e9/44/75a9c9421471a6c4805dbf2356f7c181a29c1879239abab1ea2cc8f38b40/sniffio-1.3.1-py3-none-any.whl", hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2", size = 10235 }, 330 | ] 331 | 332 | [[package]] 333 | name = "sse-starlette" 334 | version = "2.2.1" 335 | source = { registry = "https://pypi.org/simple" } 336 | dependencies = [ 337 | { name = "anyio" }, 338 | { name = "starlette" }, 339 | ] 340 | sdist = { url = "https://files.pythonhosted.org/packages/71/a4/80d2a11af59fe75b48230846989e93979c892d3a20016b42bb44edb9e398/sse_starlette-2.2.1.tar.gz", hash = "sha256:54470d5f19274aeed6b2d473430b08b4b379ea851d953b11d7f1c4a2c118b419", size = 17376 } 341 | wheels = [ 342 | { url = "https://files.pythonhosted.org/packages/d9/e0/5b8bd393f27f4a62461c5cf2479c75a2cc2ffa330976f9f00f5f6e4f50eb/sse_starlette-2.2.1-py3-none-any.whl", hash = "sha256:6410a3d3ba0c89e7675d4c273a301d64649c03a5ef1ca101f10b47f895fd0e99", size = 10120 }, 343 | ] 344 | 345 | [[package]] 346 | name = "starlette" 347 | version = "0.46.0" 348 | source = { registry = "https://pypi.org/simple" } 349 | dependencies = [ 350 | { name = "anyio" }, 351 | ] 352 | sdist = { url = "https://files.pythonhosted.org/packages/44/b6/fb9a32e3c5d59b1e383c357534c63c2d3caa6f25bf3c59dd89d296ecbaec/starlette-0.46.0.tar.gz", hash = "sha256:b359e4567456b28d473d0193f34c0de0ed49710d75ef183a74a5ce0499324f50", size = 2575568 } 353 | wheels = [ 354 | { url = "https://files.pythonhosted.org/packages/41/94/8af675a62e3c91c2dee47cf92e602cfac86e8767b1a1ac3caf1b327c2ab0/starlette-0.46.0-py3-none-any.whl", hash = "sha256:913f0798bd90ba90a9156383bcf1350a17d6259451d0d8ee27fc0cf2db609038", size = 71991 }, 355 | ] 356 | 357 | [[package]] 358 | name = "typer" 359 | version = "0.15.2" 360 | source = { registry = "https://pypi.org/simple" } 361 | dependencies = [ 362 | { name = "click" }, 363 | { name = "rich" }, 364 | { name = "shellingham" }, 365 | { name = "typing-extensions" }, 366 | ] 367 | sdist = { url = "https://files.pythonhosted.org/packages/8b/6f/3991f0f1c7fcb2df31aef28e0594d8d54b05393a0e4e34c65e475c2a5d41/typer-0.15.2.tar.gz", hash = "sha256:ab2fab47533a813c49fe1f16b1a370fd5819099c00b119e0633df65f22144ba5", size = 100711 } 368 | wheels = [ 369 | { url = "https://files.pythonhosted.org/packages/7f/fc/5b29fea8cee020515ca82cc68e3b8e1e34bb19a3535ad854cac9257b414c/typer-0.15.2-py3-none-any.whl", hash = "sha256:46a499c6107d645a9c13f7ee46c5d5096cae6f5fc57dd11eccbbb9ae3e44ddfc", size = 45061 }, 370 | ] 371 | 372 | [[package]] 373 | name = "typing-extensions" 374 | version = "4.12.2" 375 | source = { registry = "https://pypi.org/simple" } 376 | sdist = { url = "https://files.pythonhosted.org/packages/df/db/f35a00659bc03fec321ba8bce9420de607a1d37f8342eee1863174c69557/typing_extensions-4.12.2.tar.gz", hash = "sha256:1a7ead55c7e559dd4dee8856e3a88b41225abfe1ce8df57b7c13915fe121ffb8", size = 85321 } 377 | wheels = [ 378 | { url = "https://files.pythonhosted.org/packages/26/9f/ad63fc0248c5379346306f8668cda6e2e2e9c95e01216d2b8ffd9ff037d0/typing_extensions-4.12.2-py3-none-any.whl", hash = "sha256:04e5ca0351e0f3f85c6853954072df659d0d13fac324d0072316b67d7794700d", size = 37438 }, 379 | ] 380 | 381 | [[package]] 382 | name = "uvicorn" 383 | version = "0.34.0" 384 | source = { registry = "https://pypi.org/simple" } 385 | dependencies = [ 386 | { name = "click" }, 387 | { name = "h11" }, 388 | { name = "typing-extensions", marker = "python_full_version < '3.11'" }, 389 | ] 390 | sdist = { url = "https://files.pythonhosted.org/packages/4b/4d/938bd85e5bf2edeec766267a5015ad969730bb91e31b44021dfe8b22df6c/uvicorn-0.34.0.tar.gz", hash = "sha256:404051050cd7e905de2c9a7e61790943440b3416f49cb409f965d9dcd0fa73e9", size = 76568 } 391 | wheels = [ 392 | { url = "https://files.pythonhosted.org/packages/61/14/33a3a1352cfa71812a3a21e8c9bfb83f60b0011f5e36f2b1399d51928209/uvicorn-0.34.0-py3-none-any.whl", hash = "sha256:023dc038422502fa28a09c7a30bf2b6991512da7dcdb8fd35fe57cfc154126f4", size = 62315 }, 393 | ] 394 | --------------------------------------------------------------------------------