├── tests ├── __init__.py ├── empty.txt ├── Noëlご.jpg ├── sg_logo.jpg ├── test_config_file ├── run_appveyor.bat ├── ci_requirements.txt ├── test_proxy.py ├── example_config └── test_api_long.py ├── shotgun_api3 ├── py.typed ├── lib │ ├── __init__.py │ ├── certifi │ │ ├── py.typed │ │ ├── __init__.py │ │ ├── __main__.py │ │ └── core.py │ ├── .gitignore │ ├── httplib2 │ │ ├── certs.py │ │ ├── error.py │ │ ├── auth.py │ │ └── iri2uri.py │ ├── requirements.txt │ ├── README.md │ ├── mockgun │ │ ├── __init__.py │ │ ├── errors.py │ │ └── schema.py │ └── sgtimezone.py └── __init__.py ├── docs ├── changelog.rst ├── images │ ├── scripts_page.png │ ├── split_tasks_1.png │ ├── split_tasks_2.png │ ├── split_tasks_3.png │ ├── split_tasks_4.png │ ├── split_tasks_5.png │ ├── split_tasks_6.png │ ├── split_tasks_7.png │ ├── split_tasks_8.png │ └── split_tasks_9.png ├── cookbook │ ├── tasks.rst │ ├── tutorials.rst │ ├── examples │ │ ├── basic_sg_instance.rst │ │ ├── basic_upload_thumbnail_version.rst │ │ ├── basic_delete_shot.rst │ │ ├── basic_find_shot.rst │ │ ├── basic_create_shot_task_template.rst │ │ ├── basic_update_shot.rst │ │ ├── basic_create_shot.rst │ │ ├── basic_create_version_link_shot.rst │ │ ├── svn_integration.rst │ │ ├── ami_handler.rst │ │ └── ami_version_packager.rst │ ├── smart_cut_fields.rst │ ├── tasks │ │ ├── split_tasks.rst │ │ ├── updating_tasks.rst │ │ └── task_dependencies.rst │ └── usage_tips.rst ├── advanced.rst ├── advanced │ ├── iron_python.rst │ └── packaging.rst ├── cookbook.rst ├── index.rst ├── installation.rst └── authentication.rst ├── .gitattributes ├── nose.cfg ├── .flake8 ├── run-tests ├── .coveragerc ├── .gitignore ├── SECURITY.md ├── setup.py ├── .pre-commit-config.yaml ├── azure-pipelines-templates ├── type_checking.yml └── run-tests.yml ├── LICENSE ├── developer └── README.md ├── README.md ├── azure-pipelines.yml ├── update_httplib2.py └── software_credits /tests/__init__.py: -------------------------------------------------------------------------------- 1 | -------------------------------------------------------------------------------- /tests/empty.txt: -------------------------------------------------------------------------------- 1 | -------------------------------------------------------------------------------- /shotgun_api3/py.typed: -------------------------------------------------------------------------------- 1 | -------------------------------------------------------------------------------- /shotgun_api3/lib/__init__.py: -------------------------------------------------------------------------------- 1 | -------------------------------------------------------------------------------- /shotgun_api3/lib/certifi/py.typed: -------------------------------------------------------------------------------- 1 | -------------------------------------------------------------------------------- /shotgun_api3/lib/.gitignore: -------------------------------------------------------------------------------- 1 | *.dist-info -------------------------------------------------------------------------------- /tests/Noëlご.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/shotgunsoftware/python-api/HEAD/tests/Noëlご.jpg -------------------------------------------------------------------------------- /tests/sg_logo.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/shotgunsoftware/python-api/HEAD/tests/sg_logo.jpg 
-------------------------------------------------------------------------------- /docs/changelog.rst: -------------------------------------------------------------------------------- 1 | .. currentmodule:: shotgun_api3.shotgun.Shotgun 2 | 3 | .. include:: ../HISTORY.rst 4 | -------------------------------------------------------------------------------- /docs/images/scripts_page.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/shotgunsoftware/python-api/HEAD/docs/images/scripts_page.png -------------------------------------------------------------------------------- /docs/images/split_tasks_1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/shotgunsoftware/python-api/HEAD/docs/images/split_tasks_1.png -------------------------------------------------------------------------------- /docs/images/split_tasks_2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/shotgunsoftware/python-api/HEAD/docs/images/split_tasks_2.png -------------------------------------------------------------------------------- /docs/images/split_tasks_3.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/shotgunsoftware/python-api/HEAD/docs/images/split_tasks_3.png -------------------------------------------------------------------------------- /docs/images/split_tasks_4.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/shotgunsoftware/python-api/HEAD/docs/images/split_tasks_4.png -------------------------------------------------------------------------------- /docs/images/split_tasks_5.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/shotgunsoftware/python-api/HEAD/docs/images/split_tasks_5.png -------------------------------------------------------------------------------- /docs/images/split_tasks_6.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/shotgunsoftware/python-api/HEAD/docs/images/split_tasks_6.png -------------------------------------------------------------------------------- /docs/images/split_tasks_7.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/shotgunsoftware/python-api/HEAD/docs/images/split_tasks_7.png -------------------------------------------------------------------------------- /docs/images/split_tasks_8.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/shotgunsoftware/python-api/HEAD/docs/images/split_tasks_8.png -------------------------------------------------------------------------------- /docs/images/split_tasks_9.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/shotgunsoftware/python-api/HEAD/docs/images/split_tasks_9.png -------------------------------------------------------------------------------- /shotgun_api3/lib/certifi/__init__.py: -------------------------------------------------------------------------------- 1 | from .core import contents, where 2 | 3 | __all__ = ["contents", "where"] 4 | __version__ = "2025.07.14" 5 | 
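The vendored ``certifi`` package above only re-exports ``where()`` and ``contents()`` from its ``core`` module. As a quick, hedged illustration (this snippet is not part of the repository and assumes ``shotgun_api3`` is importable with the package layout shown in the tree), the bundled CA material can be located like this::

    # Locate the CA bundle shipped with the vendored certifi copy.
    from shotgun_api3.lib import certifi

    ca_bundle_path = certifi.where()   # absolute path to the bundled PEM file
    pem_text = certifi.contents()      # the PEM data as a string

    print(ca_bundle_path)
    print(pem_text.splitlines()[0])    # peek at the first line of the bundle

``shotgun_api3/lib/httplib2/certs.py`` (shown further down) falls back to ``certifi.where()`` when neither the ``HTTPLIB2_CA_CERTS`` environment variable nor a custom ``ca_certs_locater`` module is available.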
-------------------------------------------------------------------------------- /tests/test_config_file: -------------------------------------------------------------------------------- 1 | [SERVER_INFO] 2 | server_url : https://url 3 | script_name : xyz 4 | api_key : %%abce 5 | 6 | [TEST_DATA] 7 | project_name : hjkl 8 | -------------------------------------------------------------------------------- /.gitattributes: -------------------------------------------------------------------------------- 1 | # Handle line endings automatically for files detected as text 2 | # and leave all files detected as binary untouched. 3 | * text=auto 4 | 5 | # Force the following filetypes to have unix eols, so Windows does not break them 6 | *.pickle text eol=lf 7 | -------------------------------------------------------------------------------- /shotgun_api3/lib/certifi/__main__.py: -------------------------------------------------------------------------------- 1 | import argparse 2 | 3 | from certifi import contents, where 4 | 5 | parser = argparse.ArgumentParser() 6 | parser.add_argument("-c", "--contents", action="store_true") 7 | args = parser.parse_args() 8 | 9 | if args.contents: 10 | print(contents()) 11 | else: 12 | print(where()) 13 | -------------------------------------------------------------------------------- /docs/cookbook/tasks.rst: -------------------------------------------------------------------------------- 1 | ################## 2 | Working With Tasks 3 | ################## 4 | 5 | Tasks have various special functionality available in the UI that can also be queried and 6 | manipulated through the API. The sections below cover these topics. 7 | 8 | .. toctree:: 9 | :maxdepth: 2 10 | 11 | tasks/updating_tasks 12 | tasks/task_dependencies 13 | tasks/split_tasks 14 | -------------------------------------------------------------------------------- /docs/advanced.rst: -------------------------------------------------------------------------------- 1 | .. _advanced_topics: 2 | 3 | ############### 4 | Advanced Topics 5 | ############### 6 | 7 | Below are some more advanced topics regarding usage of the Python API. If you would like to see 8 | something that's missing here, please feel free to contact support at https://www.autodesk.com/support 9 | with your suggestions and we'll get it added! 10 | 11 | .. toctree:: 12 | :maxdepth: 1 13 | 14 | advanced/packaging 15 | advanced/iron_python 16 | changelog 17 | -------------------------------------------------------------------------------- /nose.cfg: -------------------------------------------------------------------------------- 1 | # Copyright (c) 2019 Shotgun Software Inc. 2 | # 3 | # CONFIDENTIAL AND PROPRIETARY 4 | # 5 | # This work is provided "AS IS" and subject to the Shotgun Pipeline Toolkit 6 | # Source Code License included in this distribution package. See LICENSE. 7 | # By accessing, using, copying or modifying this work you indicate your 8 | # agreement to the Shotgun Pipeline Toolkit Source Code License. All rights 9 | # not expressly granted therein are reserved by Shotgun Software Inc. 10 | 11 | [nosetests] 12 | exclude-dir=shotgun_api3/lib 13 | -------------------------------------------------------------------------------- /.flake8: -------------------------------------------------------------------------------- 1 | # Copyright (c) 2019 Shotgun Software Inc. 
2 | # 3 | # CONFIDENTIAL AND PROPRIETARY 4 | # 5 | # This work is provided "AS IS" and subject to the Shotgun Pipeline Toolkit 6 | # Source Code License included in this distribution package. See LICENSE. 7 | # By accessing, using, copying or modifying this work you indicate your 8 | # agreement to the Shotgun Pipeline Toolkit Source Code License. All rights 9 | # not expressly granted therein are reserved by Shotgun Software Inc. 10 | 11 | [flake8] 12 | max-line-length = 120 13 | exclude = shotgun_api3/lib/httplib2/*,tests/httplib2test.py 14 | -------------------------------------------------------------------------------- /run-tests: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env bash 2 | 3 | # Copyright (c) 2019 Shotgun Software Inc. 4 | # 5 | # CONFIDENTIAL AND PROPRIETARY 6 | # 7 | # This work is provided "AS IS" and subject to the Shotgun Pipeline Toolkit 8 | # Source Code License included in this distribution package. See LICENSE. 9 | # By accessing, using, copying or modifying this work you indicate your 10 | # agreement to the Shotgun Pipeline Toolkit Source Code License. All rights 11 | # not expressly granted therein are reserved by Shotgun Software Inc. 12 | 13 | clear && find ./ -name ".coverage" -delete && find ./ -name "*.pyc" -delete && nosetests -vd --config="nose.cfg" --with-cover --cover-package=shotgun_api3 14 | -------------------------------------------------------------------------------- /.coveragerc: -------------------------------------------------------------------------------- 1 | # Copyright (c) 2018 Shotgun Software Inc. 2 | # 3 | # CONFIDENTIAL AND PROPRIETARY 4 | # 5 | # This work is provided "AS IS" and subject to the Shotgun Pipeline Toolkit 6 | # Source Code License included in this distribution package. See LICENSE. 7 | # By accessing, using, copying or modifying this work you indicate your 8 | # agreement to the Shotgun Pipeline Toolkit Source Code License. All rights 9 | # not expressly granted therein are reserved by Shotgun Software Inc. 10 | # 11 | # coverage configuration - used by https://coveralls.io/ integration 12 | # 13 | # 14 | 15 | [run] 16 | source=shotgun_api3 17 | omit= 18 | shotgun_api3/lib/httplib2/* 19 | shotgun_api3/lib/certify/* 20 | shotgun_api3/lib/pyparsing.py 21 | -------------------------------------------------------------------------------- /tests/run_appveyor.bat: -------------------------------------------------------------------------------- 1 | :: Copyright (c) 2018 Shotgun Software Inc. 2 | :: 3 | :: CONFIDENTIAL AND PROPRIETARY 4 | :: 5 | :: This work is provided "AS IS" and subject to the Shotgun Pipeline Toolkit 6 | :: Source Code License included in this distribution package. See LICENSE. 7 | :: By accessing, using, copying or modifying this work you indicate your 8 | :: agreement to the Shotgun Pipeline Toolkit Source Code License. All rights 9 | :: not expressly granted therein are reserved by Shotgun Software Inc. 10 | 11 | :: 12 | :: This file is run by the appveyor builds. 13 | :: 14 | 15 | copy tests\example_config tests\config 16 | %PYTHON%\Scripts\pip install -r tests/ci_requirements.txt 17 | %PYTHON%\Scripts\pip install . -U 18 | cd tests 19 | %PYTHON%\Scripts\nosetests.exe -v --no-path-adjustment 20 | -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | # Copyright (c) 2019 Shotgun Software Inc. 
2 | # 3 | # CONFIDENTIAL AND PROPRIETARY 4 | # 5 | # This work is provided "AS IS" and subject to the Shotgun Pipeline Toolkit 6 | # Source Code License included in this distribution package. See LICENSE. 7 | # By accessing, using, copying or modifying this work you indicate your 8 | # agreement to the Shotgun Pipeline Toolkit Source Code License. All rights 9 | # not expressly granted therein are reserved by Shotgun Software Inc. 10 | 11 | #python specific 12 | *.pyc 13 | 14 | ## generic files to ignore 15 | *~ 16 | *.lock 17 | *.DS_Store 18 | *.swp 19 | *.out 20 | *.bak 21 | 22 | # test related 23 | tests/config 24 | .coverage 25 | .cache 26 | .travis-solo 27 | htmlcov 28 | test-output.xml 29 | coverage.xml 30 | 31 | # setup related 32 | build 33 | dist 34 | shotgun_api3.egg-info 35 | /%1 36 | -------------------------------------------------------------------------------- /docs/cookbook/tutorials.rst: -------------------------------------------------------------------------------- 1 | ######## 2 | Examples 3 | ######## 4 | 5 | Here's a list of various simple tutorials to walk through that should provide you with a good base 6 | understanding of how to use the Flow Production Tracking API and what you can do with it. 7 | 8 | ***** 9 | Basic 10 | ***** 11 | 12 | .. toctree:: 13 | :maxdepth: 1 14 | 15 | examples/basic_sg_instance 16 | examples/basic_create_shot 17 | examples/basic_find_shot 18 | examples/basic_update_shot 19 | examples/basic_delete_shot 20 | examples/basic_create_shot_task_template 21 | examples/basic_create_version_link_shot 22 | examples/basic_upload_thumbnail_version 23 | 24 | ******** 25 | Advanced 26 | ******** 27 | 28 | .. toctree:: 29 | :maxdepth: 1 30 | 31 | examples/ami_handler 32 | examples/ami_version_packager 33 | examples/svn_integration 34 | -------------------------------------------------------------------------------- /tests/ci_requirements.txt: -------------------------------------------------------------------------------- 1 | # Copyright (c) 2019 Shotgun Software Inc. 2 | # 3 | # CONFIDENTIAL AND PROPRIETARY 4 | # 5 | # This work is provided "AS IS" and subject to the Shotgun Pipeline Toolkit 6 | # Source Code License included in this distribution package. See LICENSE. 7 | # By accessing, using, copying or modifying this work you indicate your 8 | # agreement to the Shotgun Pipeline Toolkit Source Code License. All rights 9 | # not expressly granted therein are reserved by Shotgun Software Inc. 10 | 11 | coverage 12 | coveralls==1.1 13 | # Don't restrict flake8 version, since we install this in CI against Python 2.6, 14 | # where flake8 has discontinued support for newer releases. 
On Python 2.7 and 15 | # Python 3.7, linting has been performed with flake8 3.7.8 16 | flake8 17 | nose==1.3.7 18 | nose-exclude==0.5.0 19 | pytest 20 | pytest-azurepipelines 21 | pytest-coverage 22 | -------------------------------------------------------------------------------- /shotgun_api3/lib/httplib2/certs.py: -------------------------------------------------------------------------------- 1 | """Utilities for certificate management.""" 2 | 3 | import os 4 | 5 | certifi_available = False 6 | certifi_where = None 7 | try: 8 | from certifi import where as certifi_where 9 | certifi_available = True 10 | except ImportError: 11 | pass 12 | 13 | custom_ca_locater_available = False 14 | custom_ca_locater_where = None 15 | try: 16 | from ca_certs_locater import get as custom_ca_locater_where 17 | custom_ca_locater_available = True 18 | except ImportError: 19 | pass 20 | 21 | 22 | BUILTIN_CA_CERTS = os.path.join( 23 | os.path.dirname(os.path.abspath(__file__)), "cacerts.txt" 24 | ) 25 | 26 | 27 | def where(): 28 | env = os.environ.get("HTTPLIB2_CA_CERTS") 29 | if env is not None: 30 | if os.path.isfile(env): 31 | return env 32 | else: 33 | raise RuntimeError("Environment variable HTTPLIB2_CA_CERTS not a valid file") 34 | if custom_ca_locater_available: 35 | return custom_ca_locater_where() 36 | if certifi_available: 37 | return certifi_where() 38 | return BUILTIN_CA_CERTS 39 | 40 | 41 | if __name__ == "__main__": 42 | print(where()) 43 | -------------------------------------------------------------------------------- /docs/cookbook/examples/basic_sg_instance.rst: -------------------------------------------------------------------------------- 1 | .. _example_sg_instance: 2 | 3 | Create a Flow Production Tracking API instance 4 | ============================================== 5 | 6 | This example shows you how to establish your initial connection to Flow Production Tracking using script-based 7 | authentication. ``sg`` represents your Flow Production Tracking API instance. Be sure you've read 8 | :ref:`Setting Up Flow Production Tracking for API Access `. 9 | :: 10 | 11 | import pprint # Useful for debugging 12 | 13 | import shotgun_api3 14 | 15 | SERVER_PATH = "https://my-site.shotgrid.autodesk.com" 16 | SCRIPT_NAME = 'my_script' 17 | SCRIPT_KEY = '27b65d7063f46b82e670fe807bd2b6f3fd1676c1' 18 | 19 | sg = shotgun_api3.Shotgun(SERVER_PATH, SCRIPT_NAME, SCRIPT_KEY) 20 | 21 | # Just for demo purposes, this will print out property and method names available on the 22 | # sg connection object 23 | pprint.pprint([symbol for symbol in sorted(dir(sg)) if not symbol.startswith('_')]) 24 | 25 | For further information on what you can do with this Flow Production Tracking object you can read the 26 | :ref:`API reference `. 27 | -------------------------------------------------------------------------------- /shotgun_api3/lib/httplib2/error.py: -------------------------------------------------------------------------------- 1 | # All exceptions raised here derive from HttpLib2Error 2 | class HttpLib2Error(Exception): 3 | pass 4 | 5 | 6 | # Some exceptions can be caught and optionally 7 | # be turned back into responses. 
8 | class HttpLib2ErrorWithResponse(HttpLib2Error): 9 | def __init__(self, desc, response, content): 10 | self.response = response 11 | self.content = content 12 | HttpLib2Error.__init__(self, desc) 13 | 14 | 15 | class RedirectMissingLocation(HttpLib2ErrorWithResponse): 16 | pass 17 | 18 | 19 | class RedirectLimit(HttpLib2ErrorWithResponse): 20 | pass 21 | 22 | 23 | class FailedToDecompressContent(HttpLib2ErrorWithResponse): 24 | pass 25 | 26 | 27 | class UnimplementedDigestAuthOptionError(HttpLib2ErrorWithResponse): 28 | pass 29 | 30 | 31 | class UnimplementedHmacDigestAuthOptionError(HttpLib2ErrorWithResponse): 32 | pass 33 | 34 | 35 | class MalformedHeader(HttpLib2Error): 36 | pass 37 | 38 | 39 | class RelativeURIError(HttpLib2Error): 40 | pass 41 | 42 | 43 | class ServerNotFoundError(HttpLib2Error): 44 | pass 45 | 46 | 47 | class ProxiesUnavailableError(HttpLib2Error): 48 | pass 49 | -------------------------------------------------------------------------------- /tests/test_proxy.py: -------------------------------------------------------------------------------- 1 | #! /opt/local/bin/python 2 | 3 | # Copyright (c) 2019 Shotgun Software Inc. 4 | # 5 | # CONFIDENTIAL AND PROPRIETARY 6 | # 7 | # This work is provided "AS IS" and subject to the Shotgun Pipeline Toolkit 8 | # Source Code License included in this distribution package. See LICENSE. 9 | # By accessing, using, copying or modifying this work you indicate your 10 | # agreement to the Shotgun Pipeline Toolkit Source Code License. All rights 11 | # not expressly granted therein are reserved by Shotgun Software Inc. 12 | import sys 13 | from . import base 14 | import shotgun_api3 as api 15 | 16 | 17 | class ServerConnectionTest(base.TestBase): 18 | """Tests for server connection""" 19 | 20 | def setUp(self): 21 | super().setUp() 22 | 23 | def test_connection(self): 24 | """Tests server connects and returns nothing""" 25 | result = self.sg.connect() 26 | self.assertEqual(result, None) 27 | 28 | def test_proxy_info(self): 29 | """check proxy value depending http_proxy setting in config""" 30 | self.sg.connect() 31 | if self.config.http_proxy: 32 | sys.stderr.write("[WITH PROXY] ") 33 | self.assertTrue( 34 | isinstance(self.sg._connection.proxy_info, api.lib.httplib2.ProxyInfo) 35 | ) 36 | else: 37 | sys.stderr.write("[NO PROXY] ") 38 | self.assertEqual(self.sg._connection.proxy_info, None) 39 | -------------------------------------------------------------------------------- /tests/example_config: -------------------------------------------------------------------------------- 1 | # Copyright (c) 2019 Shotgun Software Inc. 2 | # 3 | # CONFIDENTIAL AND PROPRIETARY 4 | # 5 | # This work is provided "AS IS" and subject to the Shotgun Pipeline Toolkit 6 | # Source Code License included in this distribution package. See LICENSE. 7 | # By accessing, using, copying or modifying this work you indicate your 8 | # agreement to the Shotgun Pipeline Toolkit Source Code License. All rights 9 | # not expressly granted therein are reserved by Shotgun Software Inc. 10 | 11 | # Example server and login information to use for tests which require a live server. 12 | # This file should be renamed to config and the appropriate values should be added. 13 | 14 | [SERVER_INFO] 15 | 16 | # Full url to the Flow Production Tracking server 17 | # e.g. 
https://my-site.shotgrid.autodesk.com
18 | # be careful to not end server_url with a "/", or some tests may fail
19 | server_url : http://0.0.0.0:3000
20 | # script name as key as listed in admin panel for your server
21 | script_name : test script name
22 | api_key : test script api key
23 | http_proxy :
24 | session_uuid :
25 |
26 | [TEST_DATA]
27 | project_name : PTR unittest project
28 |
29 | human_name : Sg unittest human
30 | human_login : sgunittesthuman
31 | human_password : human password
32 |
33 | asset_code : Sg unittest asset
34 | version_code : Sg unittest version
35 | shot_code : Sg unittest shot
36 | task_content : Sg unittest task
37 | playlist_code : Sg unittest playlist
38 |
--------------------------------------------------------------------------------
/SECURITY.md:
--------------------------------------------------------------------------------
1 | # Security Policy
2 |
3 | ## Security
4 |
5 | At Autodesk, we know that the security of your data is critical to your studio’s
6 | operation.
7 | As the industry shifts to the cloud, Flow Production Tracking knows that security and service
8 | models are more important than ever.
9 |
10 | The confidentiality, integrity, and availability of your content are at the top
11 | of our priority list.
12 | Not only do we have a team of Flow Production Tracking engineers dedicated to platform security
13 | and performance, we are also backed by Autodesk’s security team, which also invests
14 | heavily in security for a broad range of industries and customers.
15 | We constantly reassess, develop, and improve our risk management program because
16 | we know that the landscape of security is ever-changing.
17 |
18 | If you believe you have found a security vulnerability in any Flow Production Tracking-owned
19 | repository, please report it to us as described below.
20 |
21 |
22 | ## Reporting Security Issues
23 |
24 | **Please do not report security vulnerabilities through public GitHub issues.**
25 |
26 | Instead, please report them by sending a message to
27 | [Autodesk Trust Center](https://www.autodesk.com/trust/contact-us).
28 |
29 | Please include as much information as you can provide, such as locations,
30 | configurations, reproduction steps, exploit code, impact, etc.
31 |
32 |
33 | ## Additional Information
34 |
35 | Please check out the [Flow Production Tracking Security White Paper](https://help.autodesk.com/view/SGSUB/ENU/?guid=SG_Administrator_ar_general_security_ar_security_white_paper_html).
--------------------------------------------------------------------------------
/docs/cookbook/examples/basic_upload_thumbnail_version.rst:
--------------------------------------------------------------------------------
1 | Upload a Thumbnail for a Version
2 | ================================
3 |
4 | So you've created a new Version of a Shot, and you've updated Flow Production Tracking, but now you want to upload a
5 | beauty frame to display as the thumbnail for your Version. We'll assume you already have the image
6 | made (located on your machine at ``/v1/gun/s100/010/beauties/anim/100_010_animv1.jpg``). And since
7 | you've just created your Version in Flow Production Tracking, you know its ``id`` is **214**.
8 |
9 | .. note:: If you upload a movie file or image to the ``sg_uploaded_movie`` field and you have
10 |    transcoding enabled on your server (the default for hosted sites), a thumbnail will be
11 |    generated automatically as well as a filmstrip thumbnail (if possible).
12 | This is a basic example of how to manually provide or replace a thumbnail image. 13 | 14 | Upload the Image using :meth:`~shotgun_api3.Shotgun.upload_thumbnail` 15 | --------------------------------------------------------------------- 16 | :: 17 | 18 | sg.upload_thumbnail("Version", 214, "/v1/gun/s100/010/beauties/anim/100_010_animv1.jpg") 19 | 20 | 21 | Flow Production Tracking will take care of resizing the thumbnail for you. If something does go wrong, an exception 22 | will be thrown and you'll see the error details. 23 | 24 | .. note:: The result returned by :meth:`~shotgun_api3.Shotgun.upload_thumbnail` is an integer 25 | representing the id of a special ``Attachment`` entity in Flow Production Tracking. Working with Attachments 26 | is beyond the scope of this example. :) 27 | -------------------------------------------------------------------------------- /docs/cookbook/examples/basic_delete_shot.rst: -------------------------------------------------------------------------------- 1 | Delete A Shot 2 | ============= 3 | 4 | Calling :meth:`~shotgun_api3.Shotgun.delete` 5 | -------------------------------------------- 6 | Deleting an entity in Flow Production Tracking is pretty straight-forward. No extraneous steps required.:: 7 | 8 | result = sg.delete("Shot", 40435) 9 | 10 | Delete Shot Result 11 | ------------------ 12 | If the Shot was deleted successfully ``result`` will contain:: 13 | 14 | True 15 | 16 | The Complete Example for deleting a Shot 17 | ---------------------------------------- 18 | :: 19 | 20 | #!/usr/bin/env python 21 | 22 | # -------------------------------------- 23 | # Imports 24 | # -------------------------------------- 25 | import shotgun_api3 26 | from pprint import pprint # useful for debugging 27 | 28 | # -------------------------------------- 29 | # Globals 30 | # -------------------------------------- 31 | # make sure to change this to match your Flow Production Tracking server and auth credentials. 32 | SERVER_PATH = "https://my-site.shotgrid.autodesk.com" 33 | SCRIPT_NAME = 'my_script' 34 | SCRIPT_KEY = '27b65d7063f46b82e670fe807bd2b6f3fd1676c1' 35 | 36 | # -------------------------------------- 37 | # Main 38 | # -------------------------------------- 39 | if __name__ == '__main__': 40 | 41 | sg = shotgun_api3.Shotgun(SERVER_PATH, SCRIPT_NAME, SCRIPT_KEY) 42 | 43 | # -------------------------------------- 44 | # Delete a Shot by id 45 | # -------------------------------------- 46 | result = sg.delete("Shot", 40435) 47 | pprint(result) 48 | 49 | And here is the output:: 50 | 51 | True 52 | -------------------------------------------------------------------------------- /setup.py: -------------------------------------------------------------------------------- 1 | # -*- coding: utf-8 -*- 2 | # Copyright (c) 2019 Shotgun Software Inc. 3 | # 4 | # CONFIDENTIAL AND PROPRIETARY 5 | # 6 | # This work is provided "AS IS" and subject to the Shotgun Pipeline Toolkit 7 | # Source Code License included in this distribution package. See LICENSE. 8 | # By accessing, using, copying or modifying this work you indicate your 9 | # agreement to the Shotgun Pipeline Toolkit Source Code License. All rights 10 | # not expressly granted therein are reserved by Shotgun Software Inc. 
11 |
12 | import sys
13 | from setuptools import setup, find_packages
14 |
15 | f = open("README.md")
16 | readme = f.read().strip()
17 |
18 | f = open("LICENSE")
19 | license = f.read().strip()
20 |
21 | setup(
22 |     name="shotgun_api3",
23 |     version="3.9.2",
24 |     description="Flow Production Tracking Python API",
25 |     long_description=readme,
26 |     author="Autodesk",
27 |     author_email="https://www.autodesk.com/support",
28 |     url="https://github.com/shotgunsoftware/python-api",
29 |     license=license,
30 |     packages=find_packages(exclude=("tests",)),
31 |     script_args=sys.argv[1:],
32 |     include_package_data=True,
33 |     package_data={"": ["cacerts.txt", "cacert.pem", "py.typed"]},
34 |     zip_safe=False,
35 |     python_requires=">=3.9.0",
36 |     classifiers=[
37 |         "Development Status :: 5 - Production/Stable",
38 |         "Intended Audience :: Developers",
39 |         "Programming Language :: Python",
40 |         "Programming Language :: Python :: 3.9",
41 |         "Programming Language :: Python :: 3.10",
42 |         "Programming Language :: Python :: 3.11",
43 |         "Programming Language :: Python :: 3.12",
44 |         "Operating System :: OS Independent",
45 |     ],
46 | )
47 |
--------------------------------------------------------------------------------
/shotgun_api3/lib/requirements.txt:
--------------------------------------------------------------------------------
1 | # Copyright (c) 2009-2019, Shotgun Software Inc.
2 | #
3 | # Redistribution and use in source and binary forms, with or without
4 | # modification, are permitted provided that the following conditions are met:
5 | #
6 | # - Redistributions of source code must retain the above copyright notice,
7 | #   this list of conditions and the following disclaimer.
8 | #
9 | # - Redistributions in binary form must reproduce the above copyright notice,
10 | #   this list of conditions and the following disclaimer in the documentation
11 | #   and/or other materials provided with the distribution.
12 | #
13 | # - Neither the name of the Shotgun Software Inc nor the names of its
14 | #   contributors may be used to endorse or promote products derived from this
15 | #   software without specific prior written permission.
16 | #
17 | # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
18 | # AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
19 | # IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
20 | # ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE
21 | # LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
22 | # CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
23 | # SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
24 | # INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
25 | # CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
26 | # ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
27 | # POSSIBILITY OF SUCH DAMAGE.
28 |
29 | # This file is unused. It is left there so GitHub can warn us if a CVE is
30 | # released for our dependencies.
31 | httplib2==0.22.0
32 | certifi==2024.7.4
33 | pyparsing==2.4.7
--------------------------------------------------------------------------------
/shotgun_api3/lib/README.md:
--------------------------------------------------------------------------------
1 | # Lib Submodules
2 |
3 | ## Third Party Modules
4 |
5 | Some third-party modules are bundled with `python-api` inside lib.
6 |
7 | ### httplib2
8 |
9 | `httplib2` is used to make HTTP connections to the Flow Production Tracking server. We bundle both python2 and python3 compatible versions since httplib2 chose to maintain parallel versions of the module for python 2 and 3 compatibility.
10 |
11 | The version of `httplib2` bundled should be updated manually; however, its version is included in the unused `shotgun_api3/lib/requirements.txt` to allow GitHub's automated CVE notifications to work.
12 |
13 | ## Flow Production Tracking Modules
14 |
15 | ### sgtimezone
16 |
17 | `sgtimezone` contains classes for easing the conversion between the server (UTC) timezone and client timezone.
18 |
19 | ### mockgun
20 |
21 | Mockgun is a Flow Production Tracking API mocker. It's a class that has *most* of the same
22 | methods and parameters that the Flow Production Tracking API has. Mockgun is essentially a
23 | Flow Production Tracking *emulator* that (for basic operations) looks and feels like Flow Production Tracking.
24 |
25 | The primary purpose of Mockgun is to drive unit test rigs where it becomes
26 | too slow, cumbersome, or impractical to connect to a real Flow Production Tracking site. Using
27 | Mockgun for unit tests means that a test can be rerun over and over again
28 | from exactly the same database state. This can be hard to do if you connect
29 | to a live Flow Production Tracking instance.
30 |
31 | ## Lib `requirements.txt`
32 |
33 | The file `shotgun_api3/lib/requirements.txt` is not used to install any packages; however, it exists so that automated checks for CVEs in dependencies will know about bundled packages.
34 |
35 | For this reason, it's important to add any newly bundled packages to this file, and to keep the file up to date if the bundled version of a module changes.
--------------------------------------------------------------------------------
/shotgun_api3/lib/mockgun/__init__.py:
--------------------------------------------------------------------------------
1 | """
2 | -----------------------------------------------------------------------------
3 | Copyright (c) 2009-2017, Shotgun Software Inc
4 |
5 | Redistribution and use in source and binary forms, with or without
6 | modification, are permitted provided that the following conditions are met:
7 |
8 | - Redistributions of source code must retain the above copyright notice, this
9 |   list of conditions and the following disclaimer.
10 |
11 | - Redistributions in binary form must reproduce the above copyright notice,
12 |   this list of conditions and the following disclaimer in the documentation
13 |   and/or other materials provided with the distribution.
14 |
15 | - Neither the name of the Shotgun Software Inc nor the names of its
16 |   contributors may be used to endorse or promote products derived from this
17 |   software without specific prior written permission.
18 |
19 | THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
20 | AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
21 | IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
22 | DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
23 | FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
24 | DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
25 | SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
26 | CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
27 | OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
28 | OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
29 |
30 | -----------------------------------------------------------------------------
31 | """
32 |
33 | from .schema import generate_schema  # noqa
34 | from .mockgun import Shotgun  # noqa
35 | from .errors import MockgunError  # noqa
--------------------------------------------------------------------------------
/docs/cookbook/smart_cut_fields.rst:
--------------------------------------------------------------------------------
1 | .. _smart_cut_fields:
2 |
3 | ################
4 | Smart Cut Fields
5 | ################
6 |
7 | .. warning::
8 |     Smart Cut Fields should be considered deprecated. ShotGrid v7.0 introduced a new version of
9 |     cut support. `Read the Cut Support Documentation here `_.
10 |
11 | If you want to work with 'smart' cut fields through the API, your script should use the corresponding
12 | 'raw' fields for all updates. The 'smart_cut_fields' are primarily for display in the UI; the real
13 | data is stored in a set of 'raw' fields that have different names.
14 |
15 | ************
16 | Smart Fields
17 | ************
18 |
19 | In the UI these fields attempt to calculate values based on data entered into the various fields.
20 | These fields can be queried via the API using the find() method, but not updated. Note that we are
21 | deprecating this feature and recommend creating your own cut fields from scratch, which will not
22 | contain any calculations which have proven to be too magical at times.
23 |
24 | - ``smart_cut_duration``
25 | - ``smart_cut_in``
26 | - ``smart_cut_out``
27 | - ``smart_cut_summary_display`` *
28 | - ``smart_cut_duration_display`` *
29 | - ``smart_head_duration``
30 | - ``smart_head_in``
31 | - ``smart_head_out``
32 | - ``smart_tail_duration``
33 | - ``smart_tail_in``
34 | - ``smart_tail_out``
35 | - ``smart_working_duration`` *
36 |
37 | .. note:: \* these are special summary fields that have no corresponding "raw" field.
38 |
39 | **********
40 | Raw Fields
41 | **********
42 |
43 | These are the "raw" fields that can be queried and updated through the API:
44 |
45 | - ``cut_duration``
46 | - ``cut_in``
47 | - ``cut_out``
48 | - ``head_duration``
49 | - ``head_in``
50 | - ``head_out``
51 | - ``tail_duration``
52 | - ``tail_in``
53 | - ``tail_out``
--------------------------------------------------------------------------------
/docs/advanced/iron_python.rst:
--------------------------------------------------------------------------------
1 | **********
2 | IronPython
3 | **********
4 |
5 | We do not test against IronPython and cannot be sure that we won't introduce breaking changes or
6 | that we will be compatible with future releases of IronPython. While we don't officially support
7 | IronPython, we certainly will do our best to figure out any issues that come up while using it and
8 | how to avoid them.
9 |
10 |
11 | Legacy Information
12 | ------------------
13 |
14 | The following information is provided for historical purposes only.
15 | 16 | As of July 9, 2015 you can look at this fork of the repo to see what changes were needed as of that 17 | date to make things work. The original fork was as of v3.0.20 of the API. Big thanks to our awesome 18 | clients Pixomondo for making their work public and letting us refer to it: 19 | 20 | https://github.com/Pixomondo/python-api/tree/v3.0.20.ipy 21 | 22 | v3.0.20 can be used with IronPython with a little bit of added work: 23 | 24 | - The Python API uses the zlib module to handle decompressing the gzipped response from the server. 25 | There's no built-in zlib module in IronPython, but there's a potential solution from Jeff Hardy at 26 | https://bitbucket.org/jdhardy/ironpythonzlib/src/. And the blog post about it here 27 | http://blog.jdhardy.ca/2008/12/solving-zlib-problem-ironpythonzlib.html 28 | 29 | - If you encounter ``LookupError: unknown encoding: idna``, you can force utf-8 by changing 30 | iri2uri.py ~ln 71 from ``authority = authority.encode('idna')`` to 31 | ``authority = authority.encode('utf-8')`` 32 | 33 | - If you encounter an SSL error such as ``SSL3_READ_BYTES:sslv3 alert handshake failure``, then the 34 | lower level SSL library backing python's network infrastructure is attempting to connect to our 35 | servers via SSLv3, which is no longer supported. You can use the code from this gist to force the 36 | SSL connections to use a specific protocol. The forked repo linked above has an example of how to 37 | do that to force the use of TLSv1. 38 | -------------------------------------------------------------------------------- /docs/advanced/packaging.rst: -------------------------------------------------------------------------------- 1 | .. _packaging: 2 | 3 | ################################################ 4 | Packaging an application with py2app (or py2exe) 5 | ################################################ 6 | 7 | You can create standalone applications with Python scripts by using 8 | `py2app `_ on OS X or `py2exe `_ on 9 | Windows. This is often done to more easily distribute applications that have a GUI based on 10 | toolkits like Tk, Qt or others. 11 | 12 | There are caveats you need to be aware of when creating such an app. 13 | 14 | ******************************** 15 | HTTPS Validation and cacerts.txt 16 | ******************************** 17 | When creating the connection to Flow Production Tracking, a file is used to validate the Flow Production Tracking 18 | certificate. This file is located at ``shotgun_api3/lib/httplib2/cacerts.txt``. 
Because this file is not a Python 19 | file imported by your application, py2app will not know to include it in your package, it will 20 | need to be explicitly specified in your ``setup.py`` file (edit the path based on the location 21 | where your ``shotgun_api3`` package is located):: 22 | 23 | DATA_FILES = [ 24 | ('shotgun_api3', ['/shotgun/src/python-api/shotgun_api3/lib/httplib2/cacerts.txt']) 25 | ] 26 | 27 | Once you create your py2app package its contents should include two files (among others) in the 28 | following structure:: 29 | 30 | ./Contents/Resources/shotgun_api3/cacerts.txt 31 | ./Contents/Resources/my_script.py 32 | 33 | Where in ``my_script.py`` you can access the ``cacerts.txt`` file using a relative path to pass it 34 | into the Flow Production Tracking connection's constructor:: 35 | 36 | ca_certs = os.path.join(os.path.dirname(__file__), 'shotgun_api3', 'cacerts.txt') 37 | sg = shotgun_api3.Shotgun('https://my-site.shotgrid.autodesk.com', 'script_name', 'script_key', 38 | ca_certs=ca_certs) 39 | 40 | The process for py2exe should be similar. 41 | -------------------------------------------------------------------------------- /docs/cookbook.rst: -------------------------------------------------------------------------------- 1 | ************ 2 | API Cookbook 3 | ************ 4 | 5 | Here we have a collection of useful information you can use for reference when writing your API 6 | scripts. From usage tips and gotchas to deeper examples of working with entities like Tasks and 7 | Files, there's a lot of example code in here for you to play with. 8 | 9 | .. rubric:: Usage Tips 10 | 11 | These are some best-practices and good guidelines to follow when developing your scripts. 12 | You'll also find some gotchas you can avoid. 13 | 14 | .. toctree:: 15 | :maxdepth: 2 16 | 17 | cookbook/usage_tips 18 | 19 | .. rubric:: Examples 20 | 21 | Some basic example scripts that we walk you through from beginning to end. Feel free to copy 22 | and paste any of these into your own scripts. 23 | 24 | .. toctree:: 25 | :maxdepth: 3 26 | 27 | cookbook/tutorials 28 | 29 | .. rubric:: Working With Files 30 | 31 | You'll probably be doing some work with files at your studio. This is a deep dive into some of 32 | the inners of how Flow Production Tracking handles files (also called Attachments) and the different ways to link 33 | to them. 34 | 35 | .. toctree:: 36 | :maxdepth: 2 37 | 38 | cookbook/attachments 39 | 40 | .. rubric:: Working With Tasks 41 | 42 | Scheduling is a complex beast. Flow Production Tracking can handle lots of different types of functionality around 43 | scheduling like split tasks, dependencies, and more. These docs walk you through the details of 44 | how Flow Production Tracking thinks when it's handling Task changes and how you can make your scripts do what you 45 | need to do. 46 | 47 | .. toctree:: 48 | :maxdepth: 2 49 | 50 | cookbook/tasks 51 | 52 | .. rubric:: Smart Cut Fields 53 | 54 | Smart Cut Fields are deprecated in favor of the 55 | `new cut support added in ShotGrid v7.0 `_. 56 | This documentation remains only to support studios who may not have upgraded to the new cut support 57 | features. 58 | 59 | .. 
toctree:: 60 | :maxdepth: 2 61 | 62 | cookbook/smart_cut_fields 63 | -------------------------------------------------------------------------------- /shotgun_api3/lib/mockgun/errors.py: -------------------------------------------------------------------------------- 1 | """ 2 | ----------------------------------------------------------------------------- 3 | Copyright (c) 2009-2017, Shotgun Software Inc 4 | 5 | Redistribution and use in source and binary forms, with or without 6 | modification, are permitted provided that the following conditions are met: 7 | 8 | - Redistributions of source code must retain the above copyright notice, this 9 | list of conditions and the following disclaimer. 10 | 11 | - Redistributions in binary form must reproduce the above copyright notice, 12 | this list of conditions and the following disclaimer in the documentation 13 | and/or other materials provided with the distribution. 14 | 15 | - Neither the name of the Shotgun Software Inc nor the names of its 16 | contributors may be used to endorse or promote products derived from this 17 | software without specific prior written permission. 18 | 19 | THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" 20 | AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE 21 | IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE 22 | DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE 23 | FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL 24 | DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR 25 | SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER 26 | CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, 27 | OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE 28 | OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 29 | 30 | ----------------------------------------------------------------------------- 31 | """ 32 | 33 | 34 | # ---------------------------------------------------------------------------- 35 | # Errors 36 | class MockgunError(Exception): 37 | """ 38 | Base for all Mockgun related API Errors. 39 | These are errors that relate to mockgun specifically, for example 40 | relating to mockups setup and initialization. For operational errors, 41 | mockgun raises ShotgunErrors just like the Shotgun API. 42 | """ 43 | -------------------------------------------------------------------------------- /.pre-commit-config.yaml: -------------------------------------------------------------------------------- 1 | # Copyright (c) 2024, Shotgun Software Inc. 2 | # 3 | # Redistribution and use in source and binary forms, with or without 4 | # modification, are permitted provided that the following conditions are met: 5 | # 6 | # - Redistributions of source code must retain the above copyright notice, this 7 | # list of conditions and the following disclaimer. 8 | # 9 | # - Redistributions in binary form must reproduce the above copyright notice, 10 | # this list of conditions and the following disclaimer in the documentation 11 | # and/or other materials provided with the distribution. 12 | # 13 | # - Neither the name of the Shotgun Software Inc nor the names of its 14 | # contributors may be used to endorse or promote products derived from this 15 | # software without specific prior written permission. 
16 | # 17 | # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" 18 | # AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE 19 | # IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE 20 | # DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE 21 | # FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL 22 | # DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR 23 | # SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER 24 | # CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, 25 | # OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE 26 | # OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 27 | 28 | # Styles the code properly 29 | 30 | # Exclude Third Pary components 31 | exclude: "shotgun_api3/lib/.*" 32 | 33 | # List of super useful formatters. 34 | repos: 35 | - repo: https://github.com/pre-commit/pre-commit-hooks 36 | rev: v5.0.0 37 | hooks: 38 | - id: check-ast 39 | - id: check-case-conflict 40 | - id: check-executables-have-shebangs 41 | - id: check-merge-conflict 42 | - id: end-of-file-fixer 43 | - id: requirements-txt-fixer 44 | - id: trailing-whitespace 45 | 46 | - repo: https://github.com/psf/black 47 | rev: 25.1.0 48 | hooks: 49 | - id: black 50 | -------------------------------------------------------------------------------- /shotgun_api3/__init__.py: -------------------------------------------------------------------------------- 1 | # Copyright (c) 2019 Shotgun Software Inc. 2 | # 3 | # CONFIDENTIAL AND PROPRIETARY 4 | # 5 | # This work is provided "AS IS" and subject to the Shotgun Pipeline Toolkit 6 | # Source Code License included in this distribution package. See LICENSE. 7 | # By accessing, using, copying or modifying this work you indicate your 8 | # agreement to the Shotgun Pipeline Toolkit Source Code License. All rights 9 | # not expressly granted therein are reserved by Shotgun Software Inc. 10 | 11 | import os 12 | import sys 13 | import warnings 14 | 15 | if sys.version_info < (3, 7): 16 | if os.environ.get("SHOTGUN_ALLOW_OLD_PYTHON", "0") != "1": 17 | # This is our preferred default behavior when using an old 18 | # unsupported Python version. 19 | # This way, we can control where the exception is raised, and it provides a 20 | # comprehensive error message rather than having users facing a random 21 | # Python traceback and trying to understand this is due to using an 22 | # unsupported Python version. 23 | 24 | raise RuntimeError("This module requires Python version 3.7 or higher.") 25 | 26 | warnings.warn( 27 | "Python versions older than 3.7 are no longer supported as of January " 28 | "2023. Since the SHOTGUN_ALLOW_OLD_PYTHON variable is enabled, this " 29 | "module is raising a warning instead of an exception. " 30 | "However, it is very likely that this module will not be able to work " 31 | "on this Python version.", 32 | RuntimeWarning, 33 | stacklevel=2, 34 | ) 35 | elif sys.version_info < (3, 9): 36 | warnings.warn( 37 | "Python versions older than 3.9 are no longer supported as of March " 38 | "2025 and compatibility will be discontinued after March 2026. 
" 39 | "Please update to Python 3.11 or any other supported version.", 40 | DeprecationWarning, 41 | stacklevel=2, 42 | ) 43 | 44 | 45 | from .shotgun import ( 46 | Shotgun, 47 | ShotgunError, 48 | ShotgunFileDownloadError, # noqa unused imports 49 | ShotgunThumbnailNotReady, 50 | Fault, 51 | AuthenticationFault, 52 | MissingTwoFactorAuthenticationFault, 53 | UserCredentialsNotAllowedForSSOAuthenticationFault, 54 | ProtocolError, 55 | ResponseError, 56 | Error, 57 | __version__, 58 | ) 59 | from .shotgun import SG_TIMEZONE as sg_timezone # noqa unused imports 60 | -------------------------------------------------------------------------------- /shotgun_api3/lib/httplib2/auth.py: -------------------------------------------------------------------------------- 1 | import base64 2 | import re 3 | 4 | from .. import pyparsing as pp 5 | 6 | from .error import * 7 | 8 | 9 | try: # pyparsing>=3.0.0 10 | downcaseTokens = pp.common.downcaseTokens 11 | except AttributeError: 12 | downcaseTokens = pp.downcaseTokens 13 | 14 | UNQUOTE_PAIRS = re.compile(r"\\(.)") 15 | unquote = lambda s, l, t: UNQUOTE_PAIRS.sub(r"\1", t[0][1:-1]) 16 | 17 | # https://tools.ietf.org/html/rfc7235#section-1.2 18 | # https://tools.ietf.org/html/rfc7235#appendix-B 19 | tchar = "!#$%&'*+-.^_`|~" + pp.nums + pp.alphas 20 | token = pp.Word(tchar).setName("token") 21 | token68 = pp.Combine(pp.Word("-._~+/" + pp.nums + pp.alphas) + pp.Optional(pp.Word("=").leaveWhitespace())).setName( 22 | "token68" 23 | ) 24 | 25 | quoted_string = pp.dblQuotedString.copy().setName("quoted-string").setParseAction(unquote) 26 | auth_param_name = token.copy().setName("auth-param-name").addParseAction(downcaseTokens) 27 | auth_param = auth_param_name + pp.Suppress("=") + (quoted_string | token) 28 | params = pp.Dict(pp.delimitedList(pp.Group(auth_param))) 29 | 30 | scheme = token("scheme") 31 | challenge = scheme + (params("params") | token68("token")) 32 | 33 | authentication_info = params.copy() 34 | www_authenticate = pp.delimitedList(pp.Group(challenge)) 35 | 36 | 37 | def _parse_authentication_info(headers, headername="authentication-info"): 38 | """https://tools.ietf.org/html/rfc7615 39 | """ 40 | header = headers.get(headername, "").strip() 41 | if not header: 42 | return {} 43 | try: 44 | parsed = authentication_info.parseString(header) 45 | except pp.ParseException as ex: 46 | # print(ex.explain(ex)) 47 | raise MalformedHeader(headername) 48 | 49 | return parsed.asDict() 50 | 51 | 52 | def _parse_www_authenticate(headers, headername="www-authenticate"): 53 | """Returns a dictionary of dictionaries, one dict per auth_scheme.""" 54 | header = headers.get(headername, "").strip() 55 | if not header: 56 | return {} 57 | try: 58 | parsed = www_authenticate.parseString(header) 59 | except pp.ParseException as ex: 60 | # print(ex.explain(ex)) 61 | raise MalformedHeader(headername) 62 | 63 | retval = { 64 | challenge["scheme"].lower(): challenge["params"].asDict() 65 | if "params" in challenge 66 | else {"token": challenge.get("token")} 67 | for challenge in parsed 68 | } 69 | return retval 70 | -------------------------------------------------------------------------------- /azure-pipelines-templates/type_checking.yml: -------------------------------------------------------------------------------- 1 | # Copyright (c) 2025, Shotgun Software Inc. 
2 | # 3 | # Redistribution and use in source and binary forms, with or without 4 | # modification, are permitted provided that the following conditions are met: 5 | # 6 | # - Redistributions of source code must retain the above copyright notice, this 7 | # list of conditions and the following disclaimer. 8 | # 9 | # - Redistributions in binary form must reproduce the above copyright notice, 10 | # this list of conditions and the following disclaimer in the documentation 11 | # and/or other materials provided with the distribution. 12 | # 13 | # - Neither the name of the Shotgun Software Inc nor the names of its 14 | # contributors may be used to endorse or promote products derived from this 15 | # software without specific prior written permission. 16 | # 17 | # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" 18 | # AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE 19 | # IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE 20 | # DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE 21 | # FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL 22 | # DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR 23 | # SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER 24 | # CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, 25 | # OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE 26 | # OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 27 | 28 | jobs: 29 | - job: type_checking 30 | displayName: Type Checking (beta) 31 | pool: 32 | vmImage: 'ubuntu-latest' 33 | 34 | steps: 35 | - task: UsePythonVersion@0 36 | inputs: 37 | versionSpec: 3.9 38 | addToPath: True 39 | architecture: 'x64' 40 | 41 | - script: | 42 | pip install --upgrade pip setuptools wheel 43 | pip install --upgrade mypy 44 | displayName: Install dependencies 45 | 46 | # Placeholder to future static type checking. For now we just run mypy and skip all known errors. 47 | - bash: mypy shotgun_api3/shotgun.py --follow-imports skip --pretty --no-strict-optional --disable-error-code arg-type --disable-error-code assignment --disable-error-code return --disable-error-code return-value --disable-error-code attr-defined 48 | displayName: Run type checking 49 | -------------------------------------------------------------------------------- /docs/index.rst: -------------------------------------------------------------------------------- 1 | ########################################### 2 | Flow Production Tracking Python API library 3 | ########################################### 4 | Release |version|. (:ref:`Installation `) 5 | 6 | .. image:: https://img.shields.io/badge/shotgun-api-blue.svg 7 | 8 | 9 | 10 | Flow Production Tracking (FPTR) provides a simple Python-based API for accessing FPTR and integrating with other tools. 11 | The Flow Production Tracking API3, also known as "Python API", allows users to integrate their tools with Flow Production Tracking very easily. Using this simple 12 | but powerful python module, you can quickly get your scripts integrated with Flow Production Tracking's CRUD-based 13 | API. 14 | 15 | Because the needs of every studio can prove to be very different, we don't include a lot of 16 | "automation" or "smarts" in our API. We have kept it pretty low-level and leave most of those 17 | decisions to you. 
The API is powerful enough that you can write your own "smarts" in a wrapper on top 18 | of the FPTR API3. 19 | 20 | .. _pythonoverviewvideo: 21 | 22 | Overview Video of Setting Up Your Environment with the Python API 23 | ================================================================= 24 | 25 | .. raw:: html 26 | 27 | 28 | 29 | In addition to basic metadata, the API contains methods for managing media including thumbnails, 30 | filmstrip thumbnails, uploaded images, and both locally and remotely linked media like 31 | Quicktimes, etc. 32 | 33 | **Example**:: 34 | 35 | sg = shotgun_api3.Shotgun("https://my-site.shotgrid.autodesk.com", 36 | login="rhendriks", 37 | password="c0mPre$Hi0n") 38 | sg.find("Shot", filters=[["sg_status_list", "is", "ip"]], fields=["code", "sg_status_list"]) 39 | 40 | **Output**:: 41 | 42 | [{'code': 'bunny_020_0170', 43 | 'id': 896, 44 | 'sg_sequence': {'id': 5, 'name': 'bunny_020', 'type': 'Sequence'}, 45 | 'sg_status_list': 'ip', 46 | 'type': 'Shot'}, 47 | {'code': 'bunny_020_0200', 48 | 'id': 899, 49 | 'sg_sequence': {'id': 5, 'name': 'bunny_020', 'type': 'Sequence'}, 50 | 'sg_status_list': 'ip', 51 | 'type': 'Shot'}, 52 | {'code': 'bunny_030_0080', 53 | 'id': 907, 54 | 'sg_sequence': {'id': 6, 'name': 'bunny_030', 'type': 'Sequence'}, 55 | 'sg_status_list': 'ip', 56 | 'type': 'Shot'}] 57 | 58 | 59 | ********** 60 | User Guide 61 | ********** 62 | .. toctree:: 63 | :maxdepth: 2 64 | 65 | installation 66 | authentication 67 | reference 68 | cookbook 69 | advanced 70 | 71 | 72 | Indices and tables 73 | ================== 74 | 75 | * :ref:`genindex` 76 | * :ref:`modindex` 77 | * :ref:`search` 78 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | Copyright (c) 2009-2011, Shotgun Software Inc 2 | All rights reserved. 3 | 4 | Redistribution and use in source and binary forms, with or without 5 | modification, are permitted provided that the following conditions are met: 6 | 7 | - Redistributions of source code must retain the above copyright notice, this 8 | list of conditions and the following disclaimer. 9 | 10 | - Redistributions in binary form must reproduce the above copyright notice, 11 | this list of conditions and the following disclaimer in the documentation 12 | and/or other materials provided with the distribution. 13 | 14 | - Neither the name of the Shotgun Software Inc nor the names of its 15 | contributors may be used to endorse or promote products derived from this 16 | software without specific prior written permission. 17 | 18 | THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" 19 | AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE 20 | IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE 21 | DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE 22 | FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL 23 | DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR 24 | SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER 25 | CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, 26 | OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE 27 | OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
28 | 29 | Portions of code (xml-rpc client libs from standard python distro): 30 | 31 | Copyright (c) 1999-2002 by Secret Labs AB 32 | Copyright (c) 1999-2002 by Fredrik Lundh 33 | 34 | By obtaining, using, and/or copying this software and/or its 35 | associated documentation, you agree that you have read, understood, 36 | and will comply with the following terms and conditions: 37 | 38 | Permission to use, copy, modify, and distribute this software and 39 | its associated documentation for any purpose and without fee is 40 | hereby granted, provided that the above copyright notice appears in 41 | all copies, and that both that copyright notice and this permission 42 | notice appear in supporting documentation, and that the name of 43 | Secret Labs AB or the author not be used in advertising or publicity 44 | pertaining to distribution of the software without specific, written 45 | prior permission. 46 | 47 | SECRET LABS AB AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD 48 | TO THIS SOFTWARE, INCLUDING ALL IMPLIED WARRANTIES OF MERCHANT- 49 | ABILITY AND FITNESS. IN NO EVENT SHALL SECRET LABS AB OR THE AUTHOR 50 | BE LIABLE FOR ANY SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR ANY 51 | DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, 52 | WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS 53 | ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE 54 | OF THIS SOFTWARE. 55 | -------------------------------------------------------------------------------- /docs/cookbook/examples/basic_find_shot.rst: -------------------------------------------------------------------------------- 1 | .. _example_find_shot: 2 | 3 | Find a Shot 4 | =========== 5 | 6 | Building the Query 7 | ------------------ 8 | We are going to assume we know the 'id' of the Shot we're looking for in this example.:: 9 | 10 | filters = [['id', 'is', 40435]] 11 | result = sg.find_one('Shot', filters) 12 | 13 | Pretty simple right? Well here's a little more insight into what's going on. 14 | 15 | - ``filters`` is an list of filter conditions. In this example we are filtering for Shots where 16 | the ``id`` column is **40435**. 17 | - ``sg`` is the Flow Production Tracking API instance. 18 | - ``find_one()`` is the :meth:`~shotgun_api3.Shotgun.find_one` API method we are calling. We 19 | provide it with the entity type we're searching for and our filters. 20 | 21 | 22 | Seeing the Result 23 | ----------------- 24 | So what does this return? The variable result now contains:: 25 | 26 | {'type': 'Shot','id': 40435} 27 | 28 | By default, :meth:`~shotgun_api3.Shotgun.find_one` returns a single dictionary object with 29 | the ``type`` and ``id`` fields. So in this example, we found a Shot matching that id, and Flow Production Tracking 30 | returned it as a dictionary object with ``type`` and ``id`` keys . 31 | 32 | How do we know that result contains the Shot dictionary object? You can trust us... but just to be 33 | sure, the :mod:`pprint` (PrettyPrint) module from the Python standard library is a really good tool 34 | to help with debugging. It will print out objects in a nicely formatted way that makes things 35 | easier to read. 
So we'll add that to the import section of our script.:: 36 | 37 | import shotgun_api3 38 | from pprint import pprint # useful for debugging 39 | 40 | The Complete Example for finding a Shot 41 | --------------------------------------- 42 | :: 43 | 44 | #!/usr/bin/env python 45 | 46 | # -------------------------------------- 47 | # Imports 48 | # -------------------------------------- 49 | import shotgun_api3 50 | from pprint import pprint # useful for debugging 51 | 52 | # -------------------------------------- 53 | # Globals 54 | # -------------------------------------- 55 | # make sure to change this to match your Flow Production Tracking server and auth credentials. 56 | SERVER_PATH = "https://my-site.shotgrid.autodesk.com" 57 | SCRIPT_NAME = 'my_script' 58 | SCRIPT_KEY = '27b65d7063f46b82e670fe807bd2b6f3fd1676c1' 59 | 60 | # -------------------------------------- 61 | # Main 62 | # -------------------------------------- 63 | if __name__ == '__main__': 64 | 65 | sg = shotgun_api3.Shotgun(SERVER_PATH, SCRIPT_NAME, SCRIPT_KEY) 66 | 67 | # -------------------------------------- 68 | # Find a Shot by id 69 | # -------------------------------------- 70 | filters = [['id', 'is', 40435]] 71 | result = sg.find_one('Shot', filters) 72 | pprint(result) 73 | 74 | And here is the output:: 75 | 76 | {'type': 'Shot','id': 40435} 77 | -------------------------------------------------------------------------------- /developer/README.md: -------------------------------------------------------------------------------- 1 | 2 | # Updating HTTPLib2 3 | 4 | The API comes with a copy of the `httplib2` inside the `shotgun_api3/lib` folder. To update the copy to a more recent version of the API, you can run the `update_httplib2.py` script at the root of this repository like this: 5 | 6 | python update_httplib2.py vX.Y.Z 7 | 8 | where `vX.Y.Z` is a release found on `httplib2`'s [release page](https://github.com/httplib2/httplib2/releases). 9 | 10 | 11 | # Release process 12 | 13 | ## Packaging up new release 14 | 15 | 1) Update the Changelog in the `HISTORY.rst` file 16 | - Add bullet points for any changes that have happened since the previous release. This may include changes you did not make so look at the commit history and make sure we don't miss anything. If you notice something was done that wasn't added to the changelog, hunt down that engineer and make them feel guilty for not doing so. This is a required step in making changes to the API. 17 | - Try and match the language of previous change log messages. We want to keep a consistent voice. 18 | - Make sure the date of the release matches today. We try and keep this TBD until we're ready to do a release so it's easy to catch that it needs to be updated. 19 | - Make sure the version number is filled out and correct. We follow semantic versioning. 20 | 2) Ensure any changes or additions to public methods are documented 21 | - Ensure that doc strings are updated in the code itself to work with Sphinx and are correctly formatted. 22 | - Examples are always good especially if this a new feature or method. 23 | - Think about a new user to the API trying to figure out how to use the features you're documenting. 24 | 3) Update the version value in `python-api/setup.py` to match the version you are packaging. This controls what version users will get when installing via pip. 25 | 4) Update the `__version__` value in `shotgun_api3/shotgun.py` to the version you're releasing. This identifies the current version within the API itself. 
26 | 5) Commit these changes in master with a commit message like `packaging for the vx.x.x release`. 27 | 6) Create a tag based off of the master branch called `vx.x.x` to match the version number you're releasing. 28 | 7) Push master and your tag to Github. 29 | 8) Update the Releases page with your new release. 30 | - The release should already be there from your tag but if not, create a new one. 31 | - Add more detailed information regarding the changes in this release. This is a great place to add examples, and reasons for the change! 32 | 33 | ## Letting the world know 34 | Post a message in the [Pipeline Community channel](https://community.shotgridsoftware.com/c/pipeline). 35 | 36 | ## Prepare for the Next Dev Cycle 37 | 1) Update the `__version__` value in `shotgun_api3/shotgun.py` to the next version number with `.dev` appended to it. For example, `v3.0.24.dev` 38 | 2) Add a new section to the Changelog in the `HISTORY.rst` file with the next version number and a TBD date 39 | ``` 40 | **v3.0.24 - TBD** 41 | + TBD 42 | ``` 43 | 3) Commit the changes to master with a commit message like `Bump version to v3.0.24.dev` 44 | 4) Push master to Github 45 | -------------------------------------------------------------------------------- /docs/installation.rst: -------------------------------------------------------------------------------- 1 | ############ 2 | Installation 3 | ############ 4 | 5 | ******************** 6 | Minimum Requirements 7 | ******************** 8 | 9 | .. note:: 10 | Some features of the API are only supported by more recent versions of the Flow Production Tracking server. 11 | These features are added to the Python API in a backwards compatible way so that existing 12 | scripts will continue to function as expected. Accessing a method that is not supported for 13 | your version of Flow Production Tracking will raise an appropriate exception. In general, we attempt to 14 | document these where possible. 15 | 16 | Python versions 17 | =============== 18 | 19 | The Python API library supports the following Python versions: `3.9 - 3.11`. We recommend using Python 3.11. 20 | 21 | .. important:: 22 | Python versions older than 3.9 are no longer supported as of March 2025 and compatibility will be discontinued after 23 | March 2026. 24 | 25 | 26 | ****************************** 27 | Installing into ``PYTHONPATH`` 28 | ****************************** 29 | You can `download the latest release from Github `_ 30 | or `clone the repo `_ to your local filesystem. 31 | You'll need to save it somewhere your local Python installation can find it. 32 | 33 | .. seealso:: For more information on ``PYTHONPATH`` and using modules in Python, see 34 | http://docs.python.org/tutorial/modules.html 35 | 36 | .. note:: 37 | :ref:`Visit the introduction to the Python API ` to see an overview video of Setting Up Your Environment with the Python API. 38 | 39 | *********************** 40 | Installing with ``pip`` 41 | *********************** 42 | 43 | Installing the Master Branch From Github 44 | ======================================== 45 | If you wish to install the current master, use the following command:: 46 | 47 | pip install git+https://github.com/shotgunsoftware/python-api.git 48 | 49 | .. note:: The master branch contains the latest revisions and while largely considered "stable" it 50 | is not an official packaged release. 
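Whichever installation method you use, a quick way to confirm that Python can see the library is to import it and print its version string. This is only a sanity check; the exact version reported depends on the release you installed, and ``__version__`` is re-exported by the ``shotgun_api3`` package itself::

    import shotgun_api3

    print(shotgun_api3.__version__)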
51 | 52 | Installing A specific Version From Github 53 | ========================================= 54 | To install a specific version of the package from Github, run the following command. This example 55 | installs the v3.0.26 tag, replace the version tag with the one you want:: 56 | 57 | pip install git+https://github.com/shotgunsoftware/python-api.git@v3.0.26 58 | 59 | 60 | ``requirements.txt`` 61 | ~~~~~~~~~~~~~~~~~~~~ 62 | If you're using pip with `requirements.txt`, add the following line:: 63 | 64 | git+https://github.com/shotgunsoftware/python-api.git 65 | 66 | 67 | **************************** 68 | Installing with ``setup.py`` 69 | **************************** 70 | 71 | From a local copy of the repository, you can run ``python setup.py install`` to copy the package inside your python ``site-packages``. Note that while ``setuptools`` will complain about syntax errors when installing the library, the library is fully functional. However, it ships with both Python 2 and Python 3 copies of ``httplib2``, which will generate syntax errors when byte-compiling the Python modules. 72 | -------------------------------------------------------------------------------- /docs/cookbook/examples/basic_create_shot_task_template.rst: -------------------------------------------------------------------------------- 1 | Create a Shot with a Task Template 2 | ================================== 3 | Creating a new Shot with a Task Template is just like linking it to another entity, but Flow Production Tracking will apply the Task Template in the background and create the appropriate Tasks on the Shot for you. 4 | 5 | Find the Task Template 6 | ---------------------- 7 | First we need to find the Task Template we're going to apply. We'll assume you know the name of the Task Template you want to use. 8 | :: 9 | 10 | filters = [['code','is', '3D Shot Template' ]] 11 | template = sg.find_one('TaskTemplate', filters) 12 | 13 | 14 | The Resulting Task Template 15 | --------------------------- 16 | 17 | Assuming the task template was found, we will now have something like this in our ``template`` 18 | variable:: 19 | 20 | {'type': 'TaskTemplate', 'id': 12} 21 | 22 | Create the Shot 23 | --------------- 24 | Now we can create the Shot with the link to the ``TaskTemplate`` to apply. 25 | :: 26 | 27 | data = {'project': {'type': 'Project','id': 4}, 28 | 'code': '100_010', 29 | 'description': 'dark shot with wicked cool ninjas', 30 | 'task_template': template } 31 | result = sg.create('Shot', data) 32 | 33 | This will create a new Shot named "100_010" linked to the TaskTemplate "3D Shot Template" and 34 | Flow Production Tracking will then create the Tasks defined in the template and link them to the Shot you just 35 | created. 36 | 37 | - ``data`` is a list of key/value pairs where the key is the column name to update and the value is 38 | the value. 39 | - ``project`` and `code` are required 40 | - ``description`` is just a text field that you might want to update as well 41 | - ``task_template`` is another entity column where we set the Task Template which has the Tasks we 42 | wish to create by default on this Shot. We found the specific template we wanted to assign in the 43 | previous block by searching 44 | 45 | Create Shot Result 46 | ------------------ 47 | The variable ``result`` now contains the dictionary of the new Shot that was created. 
48 | :: 49 | 50 | { 51 | 'code': '100_010', 52 | 'description': 'dark shot with wicked cool ninjas', 53 | 'id': 2345, 54 | 'project': {'id': 4, 'name': 'Gunslinger', 'type': 'Project'}, 55 | 'task_template': {'id': 12, 56 | 'name': '3D Shot Template', 57 | 'type': 'TaskTemplate'}, 58 | 'type': 'Shot' 59 | } 60 | 61 | 62 | If we now search for the Tasks linked to the Shot, we'll find the Tasks that match our 63 | ``TaskTemplate``:: 64 | 65 | tasks = sg.find('Task', filters=[['entity', 'is', result]]) 66 | 67 | .. note:: You can use an entity dictionary that was returned from the API in a filter as we have 68 | done above. Flow Production Tracking will only look at the ``id`` and ``type`` keys and will ignore the rest. 69 | This is a handy way to pass around entity dictionaries without having to reformat them. 70 | 71 | Now the ``tasks`` variable contains the following:: 72 | 73 | [{'id': 3253, 'type': 'Task'}, 74 | {'id': 3254, 'type': 'Task'}, 75 | {'id': 3255, 'type': 'Task'}, 76 | {'id': 3256, 'type': 'Task'}, 77 | {'id': 3257, 'type': 'Task'}] 78 | -------------------------------------------------------------------------------- /docs/cookbook/examples/basic_update_shot.rst: -------------------------------------------------------------------------------- 1 | Update A Shot 2 | ============= 3 | 4 | Building the data and calling :meth:`~shotgun_api3.Shotgun.update` 5 | ------------------------------------------------------------------ 6 | To update a Shot, you need to provide the ``id`` of the Shot and a list of fields you want to 7 | update.:: 8 | 9 | data = { 10 | 'description': 'Open on a beautiful field with fuzzy bunnies', 11 | 'sg_status_list': 'ip' 12 | } 13 | result = sg.update('Shot', 40435, data) 14 | 15 | This will update the ``description`` and the ``sg_status_list`` fields for the Shot with ``id`` of 16 | **40435**. 17 | 18 | - ``data`` is a list of key/value pairs where the key is the field name to update and the value is the 19 | value to update it to. 20 | - ``sg`` is the Flow Production Tracking API instance. 21 | - ``update()`` is the :meth:`shotgun_api3.Shotgun.update` API method we are calling. We provide it 22 | with the entity type we're updating, the ``id`` of the entity, and the data we're updating it 23 | with. 24 | 25 | Result 26 | ------ 27 | The variable ``result`` now contains the Shot object with the updated values.:: 28 | 29 | { 30 | 'description': 'Open on a beautiful field with fuzzy bunnies', 31 | 'sg_status_list': 'ip', 32 | 'type': 'Shot', 33 | 'id': 40435 34 | } 35 | 36 | In addition, Flow Production Tracking has returned the ``id`` for the Shot, as well as a ``type`` value. ``type`` 37 | is provided for convenience simply to help you identify what entity type this dictionary represents. 38 | It does not correspond to any field in Flow Production Tracking. 39 | 40 | Flow Production Tracking will *always* return the ``id`` and ``type`` keys in the dictionary when there are results 41 | representing an entity. 42 | 43 | The Complete Example for updating a Shot 44 | ---------------------------------------- 45 | :: 46 | 47 | #!/usr/bin/env python 48 | 49 | # -------------------------------------- 50 | # Imports 51 | # -------------------------------------- 52 | import shotgun_api3 53 | from pprint import pprint # useful for debugging 54 | 55 | # -------------------------------------- 56 | # Globals 57 | # -------------------------------------- 58 | # make sure to change this to match your Flow Production Tracking server and auth credentials.
59 | SERVER_PATH = "https://my-site.shotgrid.autodesk.com" 60 | SCRIPT_NAME = 'my_script' 61 | SCRIPT_KEY = '27b65d7063f46b82e670fe807bd2b6f3fd1676c1' 62 | 63 | # -------------------------------------- 64 | # Main 65 | # -------------------------------------- 66 | if __name__ == '__main__': 67 | 68 | sg = shotgun_api3.Shotgun(SERVER_PATH, SCRIPT_NAME, SCRIPT_KEY) 69 | 70 | # -------------------------------------- 71 | # Update Shot with data 72 | # -------------------------------------- 73 | data = { 74 | 'description': 'Open on a beautiful field with fuzzy bunnies', 75 | 'sg_status_list': 'ip' 76 | } 77 | result = sg.update('Shot', 40435, data) 78 | pprint(result) 79 | 80 | And here is the output:: 81 | 82 | {'description': 'Open on a beautiful field with fuzzy bunnies', 83 | 'id': 40435, 84 | 'sg_status_list': 'ip', 85 | 'type': 'Shot'} 86 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | [![Supported VFX Platform: 2022 - 2025](https://img.shields.io/badge/VFX_Platform-2022_|_2023_|_2024_|_2025-blue)](http://www.vfxplatform.com/ "Supported VFX Platform") 2 | [![Supported Python versions: 3.9 - 3.11](https://img.shields.io/badge/Python-3.9_|_3.10_|_3.11-blue?logo=python&logoColor=f5f5f5)](https://www.python.org/ "Supported Python versions") 3 | [![Reference Documentation](http://img.shields.io/badge/Reference-documentation-blue.svg?logo=wikibooks&logoColor=f5f5f5)](http://developer.shotgridsoftware.com/python-api) 4 | 5 | [![Build Status](https://dev.azure.com/shotgun-ecosystem/Python%20API/_apis/build/status/shotgunsoftware.python-api?branchName=master)](https://dev.azure.com/shotgun-ecosystem/Python%20API/_build/latest?definitionId=108&branchName=master) 6 | [![Coverage Status](https://coveralls.io/repos/github/shotgunsoftware/python-api/badge.svg?branch=master)](https://coveralls.io/github/shotgunsoftware/python-api?branch=master) 7 | 8 | # Flow Production Tracking Python API 9 | 10 | Autodesk provides a simple Python-based API for accessing Flow Production Tracking and integrating with other tools. This is the official API that is maintained by Autodesk (https://www.autodesk.com/support) 11 | 12 | The latest version can always be found at http://github.com/shotgunsoftware/python-api 13 | 14 | ## Documentation 15 | Tutorials and detailed documentation about the Python API are available at http://developer.shotgridsoftware.com/python-api. 16 | 17 | Some useful direct links: 18 | 19 | * [Installing](http://developer.shotgridsoftware.com/python-api/installation.html) 20 | * [Tutorials](http://developer.shotgridsoftware.com/python-api/cookbook/tutorials.html) 21 | * [API Reference](http://developer.shotgridsoftware.com/python-api/reference.html) 22 | * [Data Types](http://developer.shotgridsoftware.com/python-api/reference.html#data-types) 23 | * [Filter Syntax](http://developer.shotgridsoftware.com/python-api/reference.html#filter-syntax) 24 | 25 | ## Changelog 26 | 27 | You can see the [full history of the Python API on the documentation site](http://developer.shotgridsoftware.com/python-api/changelog.html). 28 | 29 | 30 | ## Tests 31 | 32 | Integration and unit tests are provided.
33 | 34 | - All tests require: 35 | - The [nose unit testing tools](http://nose.readthedocs.org), 36 | - The [nose-exclude nose plugin](https://pypi.org/project/nose-exclude/) 37 | - (Note: Running `pip install -r tests/ci_requirements.txt` will install this package) 38 | - A `tests/config` file (you can copy an example from `tests/example_config`). 39 | - Tests can be run individually like this: `nosetests --config="nose.cfg" tests/test_client.py` 40 | - Make sure to not forget the `--config="nose.cfg"` option. This option tells nose to use our config file. 41 | - `test_client` and `tests_unit` use mock server interaction and do not require a Flow Production Tracking instance to be available (no modifications to `tests/config` are necessary). 42 | - `test_api` and `test_api_long` *do* require a Flow Production Tracking instance, with a script key available for the tests. The server and script user values must be supplied in the `tests/config` file. The tests will add test data to your server based on information in your config. This data will be manipulated by the tests, and should not be used for other purposes. 43 | - To run all of the tests, use the shell script `run-tests`. 44 | -------------------------------------------------------------------------------- /shotgun_api3/lib/certifi/core.py: -------------------------------------------------------------------------------- 1 | """ 2 | certifi.py 3 | ~~~~~~~~~~ 4 | 5 | This module returns the installation location of cacert.pem or its contents. 6 | """ 7 | import sys 8 | import atexit 9 | 10 | def exit_cacert_ctx() -> None: 11 | _CACERT_CTX.__exit__(None, None, None) # type: ignore[union-attr] 12 | 13 | 14 | if sys.version_info >= (3, 11): 15 | 16 | from importlib.resources import as_file, files 17 | 18 | _CACERT_CTX = None 19 | _CACERT_PATH = None 20 | 21 | def where() -> str: 22 | # This is slightly terrible, but we want to delay extracting the file 23 | # in cases where we're inside of a zipimport situation until someone 24 | # actually calls where(), but we don't want to re-extract the file 25 | # on every call of where(), so we'll do it once then store it in a 26 | # global variable. 27 | global _CACERT_CTX 28 | global _CACERT_PATH 29 | if _CACERT_PATH is None: 30 | # This is slightly janky, the importlib.resources API wants you to 31 | # manage the cleanup of this file, so it doesn't actually return a 32 | # path, it returns a context manager that will give you the path 33 | # when you enter it and will do any cleanup when you leave it. In 34 | # the common case of not needing a temporary file, it will just 35 | # return the file system location and the __exit__() is a no-op. 36 | # 37 | # We also have to hold onto the actual context manager, because 38 | # it will do the cleanup whenever it gets garbage collected, so 39 | # we will also store that at the global level as well. 
40 | _CACERT_CTX = as_file(files("certifi").joinpath("cacert.pem")) 41 | _CACERT_PATH = str(_CACERT_CTX.__enter__()) 42 | atexit.register(exit_cacert_ctx) 43 | 44 | return _CACERT_PATH 45 | 46 | def contents() -> str: 47 | return files("certifi").joinpath("cacert.pem").read_text(encoding="ascii") 48 | 49 | else: 50 | 51 | from importlib.resources import path as get_path, read_text 52 | 53 | _CACERT_CTX = None 54 | _CACERT_PATH = None 55 | 56 | def where() -> str: 57 | # This is slightly terrible, but we want to delay extracting the 58 | # file in cases where we're inside of a zipimport situation until 59 | # someone actually calls where(), but we don't want to re-extract 60 | # the file on every call of where(), so we'll do it once then store 61 | # it in a global variable. 62 | global _CACERT_CTX 63 | global _CACERT_PATH 64 | if _CACERT_PATH is None: 65 | # This is slightly janky, the importlib.resources API wants you 66 | # to manage the cleanup of this file, so it doesn't actually 67 | # return a path, it returns a context manager that will give 68 | # you the path when you enter it and will do any cleanup when 69 | # you leave it. In the common case of not needing a temporary 70 | # file, it will just return the file system location and the 71 | # __exit__() is a no-op. 72 | # 73 | # We also have to hold onto the actual context manager, because 74 | # it will do the cleanup whenever it gets garbage collected, so 75 | # we will also store that at the global level as well. 76 | _CACERT_CTX = get_path("certifi", "cacert.pem") 77 | _CACERT_PATH = str(_CACERT_CTX.__enter__()) 78 | atexit.register(exit_cacert_ctx) 79 | 80 | return _CACERT_PATH 81 | 82 | def contents() -> str: 83 | return read_text("certifi", "cacert.pem", encoding="ascii") 84 | -------------------------------------------------------------------------------- /azure-pipelines.yml: -------------------------------------------------------------------------------- 1 | # ----------------------------------------------------------------------------- 2 | # Copyright (c) 2009-2021, Shotgun Software Inc. 3 | # 4 | # Redistribution and use in source and binary forms, with or without 5 | # modification, are permitted provided that the following conditions are met: 6 | # 7 | # - Redistributions of source code must retain the above copyright notice, this 8 | # list of conditions and the following disclaimer. 9 | # 10 | # - Redistributions in binary form must reproduce the above copyright notice, 11 | # this list of conditions and the following disclaimer in the documentation 12 | # and/or other materials provided with the distribution. 13 | # 14 | # - Neither the name of the Shotgun Software Inc nor the names of its 15 | # contributors may be used to endorse or promote products derived from this 16 | # software without specific prior written permission. 17 | # 18 | # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" 19 | # AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE 20 | # IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE 21 | # DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE 22 | # FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL 23 | # DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR 24 | # SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER 25 | # CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, 26 | # OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE 27 | # OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 28 | 29 | resources: 30 | repositories: 31 | - repository: templates 32 | type: github 33 | name: shotgunsoftware/tk-ci-tools 34 | ref: refs/heads/master 35 | endpoint: shotgunsoftware 36 | # Despite using the "tk-" prefix, tk-ci-tools is not a Toolkit only tool. 37 | # We use it to avoid duplicating and maintaining CI pipeline code. 38 | 39 | # We've stored some variables in Azure. They contain credentials 40 | # and are encrypted. They are also not available to clients. 41 | # This statement says which variable groups this repo requires. 42 | # We have two: one that can be shared by all repos that use CI 43 | # and another that is used just by the Python API, which creates 44 | # HumanUser's and as such need a password, which we can't 45 | # hardcode in the .yml files. 46 | variables: 47 | - group: sg-credentials 48 | 49 | # We want builds to trigger for 3 reasons: 50 | # - The master branch sees new commits 51 | # - Each PR should get rebuilt when commits are added to it. 52 | trigger: 53 | branches: 54 | include: 55 | - master 56 | pr: 57 | branches: 58 | include: 59 | - "*" 60 | 61 | # This here is the list of jobs we want to run for our build. 62 | # Jobs run in parallel. 63 | jobs: 64 | - template: build-pipeline.yml@templates 65 | parameters: 66 | # Python API does not follow the exact same Python version lifecycle than 67 | # Toolkit. So we prefer to control the test execution here instead. 68 | has_unit_tests: false 69 | 70 | has_ui_resources: false 71 | 72 | - template: azure-pipelines-templates/type_checking.yml 73 | 74 | # These are jobs templates, they allow to reduce the redundancy between 75 | # variations of the same build. We pass in the image name 76 | # and a friendly name that then template can then use to create jobs. 77 | - template: azure-pipelines-templates/run-tests.yml 78 | parameters: 79 | name: Linux 80 | vm_image: 'ubuntu-latest' 81 | 82 | - template: azure-pipelines-templates/run-tests.yml 83 | parameters: 84 | name: macOS 85 | vm_image: 'macOS-latest' 86 | 87 | - template: azure-pipelines-templates/run-tests.yml 88 | parameters: 89 | name: Windows 90 | vm_image: 'windows-latest' 91 | -------------------------------------------------------------------------------- /shotgun_api3/lib/sgtimezone.py: -------------------------------------------------------------------------------- 1 | #! /opt/local/bin/python 2 | 3 | # Copyright (c) 2019 Shotgun Software Inc. 4 | # 5 | # CONFIDENTIAL AND PROPRIETARY 6 | # 7 | # This work is provided "AS IS" and subject to the Shotgun Pipeline Toolkit 8 | # Source Code License included in this distribution package. See LICENSE. 9 | # By accessing, using, copying or modifying this work you indicate your 10 | # agreement to the Shotgun Pipeline Toolkit Source Code License. All rights 11 | # not expressly granted therein are reserved by Shotgun Software Inc. 
12 | 13 | # ---------------------------------------------------------------------------- 14 | # SG_TIMEZONE module 15 | # this is rolled into the this shotgun api file to avoid having to require 16 | # current users of api2 to install new modules and modify PYTHONPATH info. 17 | # ---------------------------------------------------------------------------- 18 | 19 | from datetime import tzinfo, timedelta 20 | import time as _time 21 | 22 | 23 | class SgTimezone(object): 24 | ''' 25 | Shotgun's server infrastructure is configured for Coordinated Universal 26 | Time (UTC). In order to provide relevant local timestamps to users, we wrap 27 | the datetime module's tzinfo to provide convenient conversion methods. 28 | ''' 29 | 30 | ZERO = timedelta(0) 31 | STDOFFSET = timedelta(seconds=-_time.timezone) 32 | if _time.daylight: 33 | DSTOFFSET = timedelta(seconds=-_time.altzone) 34 | else: 35 | DSTOFFSET = STDOFFSET 36 | DSTDIFF = DSTOFFSET - STDOFFSET 37 | 38 | def __init__(self): 39 | self.utc = UTC() 40 | self.local = LocalTimezone() 41 | 42 | @classmethod 43 | def UTC(cls): 44 | ''' 45 | For backwards compatibility, from when UTC was a nested class, 46 | we allow instantiation via SgTimezone 47 | ''' 48 | return UTC() 49 | 50 | @classmethod 51 | def LocalTimezone(cls): 52 | ''' 53 | For backwards compatibility, from when LocalTimezone was a nested 54 | class, we allow instantiation via SgTimezone 55 | ''' 56 | return LocalTimezone() 57 | 58 | 59 | class UTC(tzinfo): 60 | ''' 61 | Implementation of datetime's tzinfo to provide consistent calculated 62 | offsets against Coordinated Universal Time (UTC) 63 | ''' 64 | 65 | def utcoffset(self, dt): 66 | return SgTimezone.ZERO 67 | 68 | def tzname(self, dt): 69 | return "UTC" 70 | 71 | def dst(self, dt): 72 | return SgTimezone.ZERO 73 | 74 | 75 | class LocalTimezone(tzinfo): 76 | ''' 77 | Implementation of datetime's tzinfo to provide convenient conversion 78 | between Shotgun server time and local user time 79 | ''' 80 | 81 | def utcoffset(self, dt): 82 | ''' 83 | Difference between the user's local timezone and UTC timezone in seconds 84 | ''' 85 | if self._isdst(dt): 86 | return SgTimezone.DSTOFFSET 87 | else: 88 | return SgTimezone.STDOFFSET 89 | 90 | def dst(self, dt): 91 | ''' 92 | Daylight savings time (dst) offset in seconds 93 | ''' 94 | if self._isdst(dt): 95 | return SgTimezone.DSTDIFF 96 | else: 97 | return SgTimezone.ZERO 98 | 99 | def tzname(self, dt): 100 | ''' 101 | Name of the user's local timezone, including a reference 102 | to daylight savings time (dst) if applicable 103 | ''' 104 | return _time.tzname[self._isdst(dt)] 105 | 106 | def _isdst(self, dt): 107 | ''' 108 | Calculate whether the timestamp in question was in daylight savings 109 | ''' 110 | tt = (dt.year, dt.month, dt.day, dt.hour, dt.minute, dt.second, dt.weekday(), 0, -1) 111 | stamp = _time.mktime(tt) 112 | tt = _time.localtime(stamp) 113 | return tt.tm_isdst > 0 114 | -------------------------------------------------------------------------------- /update_httplib2.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python3 2 | 3 | """ 4 | Updates the httplib2 module. 5 | 6 | Run as "./upgrade_httplib2.py vX.Y.Z" to get a specific release from github. 
7 | """ 8 | 9 | import pathlib 10 | import tempfile 11 | import shutil 12 | import subprocess 13 | import sys 14 | 15 | 16 | class Utilities: 17 | def download_archive(self, file_path, file_name): 18 | """Download the archive from github.""" 19 | print(f"Downloading {file_name}") 20 | subprocess.check_output( 21 | [ 22 | "curl", 23 | "-L", 24 | f"https://github.com/httplib2/httplib2/archive/{file_name}", 25 | "-o", 26 | file_path, 27 | ] 28 | ) 29 | 30 | def unzip_archive(self, file_path, file_name, temp_dir): 31 | """Unzip in a temp dir.""" 32 | print(f"Unzipping {file_name}") 33 | subprocess.check_output(["unzip", str(file_path), "-d", str(temp_dir)]) 34 | 35 | def remove_folder(self, path): 36 | """Remove a folder recursively.""" 37 | print(f"Removing the folder {path}") 38 | shutil.rmtree(path, ignore_errors=True) 39 | 40 | def git_remove(self, target): 41 | print(f"Removing {target} in git.") 42 | try: 43 | subprocess.check_output( 44 | [ 45 | "git", 46 | "rm", 47 | "-rf", 48 | ] 49 | + target 50 | ) 51 | except Exception as e: 52 | pass 53 | 54 | def copy_folder(self, source, target): 55 | """Copy a folder recursively.""" 56 | shutil.copytree(source, target) 57 | 58 | def sanitize_file(self, file_path): 59 | """Normalize file imports and remove unnecessary strings.""" 60 | with open(file_path, "r") as f: 61 | contents = f.read() 62 | 63 | contents = contents.replace("from httplib2.", "from .") 64 | contents = contents.replace("from httplib2", "from .") 65 | contents = contents.replace( 66 | "import pyparsing as pp", "from .. import pyparsing as pp" 67 | ) 68 | 69 | with open(file_path, "w") as f: 70 | f.write(contents) 71 | 72 | 73 | def main(temp_path, repo_root, version): 74 | # Paths and file names 75 | httplib2_dir = repo_root / "shotgun_api3" / "lib" / "httplib2" 76 | file_name = f"{version}.zip" 77 | file_path = temp_path / file_name 78 | 79 | utilities = Utilities() 80 | 81 | # Downloads the archive from github 82 | utilities.download_archive(file_path, file_name) 83 | 84 | # Unzip in a temp dir 85 | unzipped_folder = temp_path / "unzipped" 86 | unzipped_folder.mkdir() 87 | utilities.unzip_archive(file_path, file_name, unzipped_folder) 88 | 89 | # Removes the previous version of httplib2 90 | utilities.git_remove([str(httplib2_dir)]) 91 | utilities.remove_folder(httplib2_dir) 92 | 93 | # Copies a new version into place. 94 | print("Copying new version of httplib2") 95 | root_folder = unzipped_folder / f"httplib2-{version[1:]}" 96 | utilities.copy_folder(str(root_folder / "python3" / "httplib2"), httplib2_dir) 97 | utilities.remove_folder(f"{httplib2_dir}/test") 98 | 99 | # Patches the httplib2 imports so they are relative instead of absolute. 100 | print("Patching imports") 101 | for python_file in httplib2_dir.rglob("*.py"): 102 | utilities.sanitize_file(python_file) 103 | 104 | # Adding files to the git repo. 105 | print("Adding to git") 106 | subprocess.check_output(["git", "add", str(httplib2_dir)]) # nosec B607 107 | 108 | 109 | if __name__ == "__main__": 110 | try: 111 | temp_path = pathlib.Path(tempfile.mkdtemp()) 112 | main(temp_path, pathlib.Path(__file__).parent, sys.argv[1]) 113 | finally: 114 | shutil.rmtree(temp_path) 115 | -------------------------------------------------------------------------------- /docs/cookbook/examples/basic_create_shot.rst: -------------------------------------------------------------------------------- 1 | .. 
_example_create_shot: 2 | 3 | Create A Shot 4 | ============= 5 | 6 | Building the data and calling :meth:`~shotgun_api3.Shotgun.create` 7 | ------------------------------------------------------------------ 8 | To create a Shot, you need to provide the following values: 9 | 10 | - ``project`` is a link to the Project the Shot belongs to. It should be a dictionary like 11 | ``{"type": "Project", "id": 123}`` where ``id`` is the ``id`` of the Project. 12 | - ``code`` (this is the field that stores the Shot name) 13 | - optionally any other info you want to provide 14 | 15 | Example:: 16 | 17 | data = { 18 | 'project': {"type":"Project","id": 4}, 19 | 'code': '100_010', 20 | 'description': 'Open on a beautiful field with fuzzy bunnies', 21 | 'sg_status_list': 'ip' 22 | } 23 | result = sg.create('Shot', data) 24 | 25 | 26 | This will create a new Shot named "100_010" in the Project "Gunslinger" (which has an ``id`` of 4). 27 | 28 | - ``data`` is a list of key/value pairs where the key is the column name to update and the value 29 | is the value to set. 30 | - ``sg`` is the Flow Production Tracking API instance you created in :ref:`example_sg_instance`. 31 | - ``create()`` is the :meth:`shotgun_api3.Shotgun.create` API method we are calling. We pass in the 32 | entity type we're creating and the data we're setting. 33 | 34 | .. rubric:: Result 35 | 36 | The variable ``result`` now contains a dictionary hash with the Shot information you created.:: 37 | 38 | { 39 | 'code': '100_010', 40 | 'description': 'Open on a beautiful field with fuzzy bunnies', 41 | 'id': 40435, 42 | 'project': {'id': 4, 'name': 'Demo Project', 'type': 'Project'}, 43 | 'sg_status_list': 'ip', 44 | 'type': 'Shot' 45 | } 46 | 47 | In addition, Flow Production Tracking has returned the ``id`` that it has assigned to the Shot, as well as a 48 | ``type`` value. ``type`` is provided for convenience simply to help you identify what entity type 49 | this dictionary represents. It does not correspond to any field in Flow Production Tracking. 50 | 51 | Flow Production Tracking will *always* return the ``id`` and ``type`` keys in the dictionary when there are results 52 | representing an entity. 53 | 54 | The Complete Example for creating a Shot 55 | ---------------------------------------- 56 | :: 57 | 58 | #!/usr/bin/env python 59 | 60 | # -------------------------------------- 61 | # Imports 62 | # -------------------------------------- 63 | import shotgun_api3 64 | from pprint import pprint # useful for debugging 65 | 66 | # -------------------------------------- 67 | # Globals 68 | # -------------------------------------- 69 | # make sure to change this to match your Flow Production Tracking server and auth credentials.
70 | SERVER_PATH = "https://my-site.shotgrid.autodesk.com" 71 | SCRIPT_NAME = 'my_script' 72 | SCRIPT_KEY = '27b65d7063f46b82e670fe807bd2b6f3fd1676c1' 73 | 74 | # -------------------------------------- 75 | # Main 76 | # -------------------------------------- 77 | if __name__ == '__main__': 78 | 79 | sg = shotgun_api3.Shotgun(SERVER_PATH, SCRIPT_NAME, SCRIPT_KEY) 80 | 81 | # -------------------------------------- 82 | # Create a Shot with data 83 | # -------------------------------------- 84 | data = { 85 | 'project': {"type":"Project","id": 4}, 86 | 'code': '100_010', 87 | 'description': 'Open on a beautiful field with fuzzy bunnies', 88 | 'sg_status_list': 'ip' 89 | } 90 | result = sg.create('Shot', data) 91 | pprint(result) 92 | print("The id of the {} is {}.".format(result['type'], result['id'])) 93 | 94 | And here is the output:: 95 | 96 | {'code': '100_010', 97 | 'description': 'Open on a beautiful field with fuzzy bunnies', 98 | 'id': 40435, 99 | 'project': {'id': 4, 'name': 'Demo Project', 'type': 'Project'}, 100 | 'sg_status_list': 'ip', 101 | 'type': 'Shot'} 102 | The id of the Shot is 40435. 103 | -------------------------------------------------------------------------------- /docs/cookbook/examples/basic_create_version_link_shot.rst: -------------------------------------------------------------------------------- 1 | Create a Version Linked to a Shot 2 | ================================= 3 | You've just created a sweet new Version of your shot. Now you want to update Flow Production Tracking and create a 4 | new ``Version`` entity linked to the Shot. 5 | 6 | Find the Shot 7 | ------------- 8 | First we need to find the Shot since we'll need to know its ``id`` in order to link our Version 9 | to it. 10 | :: 11 | 12 | filters = [ ['project', 'is', {'type': 'Project', 'id': 4}], 13 | ['code', 'is', '100_010'] ] 14 | shot = sg.find_one('Shot', filters) 15 | 16 | 17 | Find the Task 18 | ------------- 19 | Now we find the Task that the Version relates to, again so we can use the ``id`` to link it to the 20 | Version we're creating. For this search we'll use the Shot ``id`` (which we have now in the ``shot`` 21 | variable from the previous search) and the Task Name, which maps to the ``content`` field. 22 | :: 23 | 24 | filters = [ ['project', 'is', {'type': 'Project', 'id': 4}], 25 | ['entity', 'is',{'type':'Shot', 'id': shot['id']}], 26 | ['content', 'is', 'Animation'] ] 27 | task = sg.find_one('Task', filters) 28 | 29 | .. note:: Linking a Task to the Version is good practice. By doing so it is easy for users to see 30 | at what stage a particular Version was created, and opens up other possibilities for tracking 31 | in Flow Production Tracking. We highly recommend doing this whenever possible. 32 | 33 | Create the Version 34 | ------------------ 35 | Now we can create the Version with the link to the Shot and the Task:: 36 | 37 | data = { 'project': {'type': 'Project','id': 4}, 38 | 'code': '100_010_anim_v1', 39 | 'description': 'first pass at opening shot with bunnies', 40 | 'sg_path_to_frames': '/v1/gun/s100/010/frames/anim/100_010_animv1_jack.#.jpg', 41 | 'sg_status_list': 'rev', 42 | 'entity': {'type': 'Shot', 'id': shot['id']}, 43 | 'sg_task': {'type': 'Task', 'id': task['id']}, 44 | 'user': {'type': 'HumanUser', 'id': 165} } 45 | result = sg.create('Version', data) 46 | 47 | This will create a new Version named '100_010_anim_v1' linked to the 'Animation' Task for Shot 48 | '100_010' in the Project 'Gunslinger'.
49 | 50 | - ``data`` is a list of key/value pairs where the key is the column name to update and the value is 51 | the value to set. 52 | - ``project`` and ``code`` are required 53 | - ``description`` and ``sg_path_to_frames`` are just text fields that you might want to update as 54 | well 55 | - ``sg_status_list`` is the status column for the Version. Here we are setting it to "rev" (Pending 56 | Review) so that it will get reviewed in the next dailies session and people will "ooh" and "aaah". 57 | - ``entity`` is where we link this version to the Shot. Entity columns are always handled with this 58 | format. You must provide the entity ``type`` and its ``id``. 59 | - ``sg_task`` is another entity link field specifically for the Version's Task link. This uses the 60 | same entity format as the Shot link, but pointing to the Task entity this time. 61 | - ``user`` is another entity column where we set the artist responsible for this masterpiece. In 62 | this example, I know the 'id' that corresponds to this user, but if you don't know the id you can 63 | look it up by searching on any of the fields, similar to what we did for the Shot above, like:: 64 | 65 | filters = [['login', 'is', 'jschmoe']] 66 | user = sg.find('HumanUser', filters) 67 | 68 | The ``result`` variable now contains the ``id`` of the new Version that was created:: 69 | 70 | 214 71 | 72 | 73 | Upload a movie for review in Screening Room 74 | ------------------------------------------- 75 | If Screening Room's transcoding feature is enabled on your site (hosted sites have this by 76 | default), then you can use the :meth:`~shotgun_api3.Shotgun.upload` method to upload a QuickTime 77 | movie, PDF, still image, etc. to the ``sg_uploaded_movie`` field on a Version. Once the movie is 78 | uploaded, it will automatically be queued for transcoding. When transcoding is complete, the 79 | Version will be playable in the Screening Room app, or in the Overlay player by clicking on the 80 | Play button that will appear on the Version's thumbnail. 81 | 82 | .. note:: Transcoding also generates a thumbnail and filmstrip thumbnail automatically. 
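As an illustration, a call to :meth:`~shotgun_api3.Shotgun.upload` for this example could look like the following sketch. The movie path and display name below are placeholders, and ``214`` is the Version id returned by the create step above::

    # Upload a QuickTime into the Version's "Uploaded Movie" field.
    # Once the upload completes, the movie is automatically queued for transcoding.
    sg.upload(
        "Version",
        214,
        "/v1/gun/s100/010/movies/100_010_anim_v1.mov",  # placeholder path
        field_name="sg_uploaded_movie",
        display_name="100_010_anim_v1",
    )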
83 | -------------------------------------------------------------------------------- /shotgun_api3/lib/httplib2/iri2uri.py: -------------------------------------------------------------------------------- 1 | # -*- coding: utf-8 -*- 2 | """Converts an IRI to a URI.""" 3 | 4 | __author__ = "Joe Gregorio (joe@bitworking.org)" 5 | __copyright__ = "Copyright 2006, Joe Gregorio" 6 | __contributors__ = [] 7 | __version__ = "1.0.0" 8 | __license__ = "MIT" 9 | 10 | import urllib.parse 11 | 12 | # Convert an IRI to a URI following the rules in RFC 3987 13 | # 14 | # The characters we need to enocde and escape are defined in the spec: 15 | # 16 | # iprivate = %xE000-F8FF / %xF0000-FFFFD / %x100000-10FFFD 17 | # ucschar = %xA0-D7FF / %xF900-FDCF / %xFDF0-FFEF 18 | # / %x10000-1FFFD / %x20000-2FFFD / %x30000-3FFFD 19 | # / %x40000-4FFFD / %x50000-5FFFD / %x60000-6FFFD 20 | # / %x70000-7FFFD / %x80000-8FFFD / %x90000-9FFFD 21 | # / %xA0000-AFFFD / %xB0000-BFFFD / %xC0000-CFFFD 22 | # / %xD0000-DFFFD / %xE1000-EFFFD 23 | 24 | escape_range = [ 25 | (0xA0, 0xD7FF), 26 | (0xE000, 0xF8FF), 27 | (0xF900, 0xFDCF), 28 | (0xFDF0, 0xFFEF), 29 | (0x10000, 0x1FFFD), 30 | (0x20000, 0x2FFFD), 31 | (0x30000, 0x3FFFD), 32 | (0x40000, 0x4FFFD), 33 | (0x50000, 0x5FFFD), 34 | (0x60000, 0x6FFFD), 35 | (0x70000, 0x7FFFD), 36 | (0x80000, 0x8FFFD), 37 | (0x90000, 0x9FFFD), 38 | (0xA0000, 0xAFFFD), 39 | (0xB0000, 0xBFFFD), 40 | (0xC0000, 0xCFFFD), 41 | (0xD0000, 0xDFFFD), 42 | (0xE1000, 0xEFFFD), 43 | (0xF0000, 0xFFFFD), 44 | (0x100000, 0x10FFFD), 45 | ] 46 | 47 | 48 | def encode(c): 49 | retval = c 50 | i = ord(c) 51 | for low, high in escape_range: 52 | if i < low: 53 | break 54 | if i >= low and i <= high: 55 | retval = "".join(["%%%2X" % o for o in c.encode("utf-8")]) 56 | break 57 | return retval 58 | 59 | 60 | def iri2uri(uri): 61 | """Convert an IRI to a URI. Note that IRIs must be 62 | passed in a unicode strings. That is, do not utf-8 encode 63 | the IRI before passing it into the function.""" 64 | if isinstance(uri, str): 65 | (scheme, authority, path, query, fragment) = urllib.parse.urlsplit(uri) 66 | authority = authority.encode("idna").decode("utf-8") 67 | # For each character in 'ucschar' or 'iprivate' 68 | # 1. encode as utf-8 69 | # 2. 
then %-encode each octet of that utf-8 70 | uri = urllib.parse.urlunsplit((scheme, authority, path, query, fragment)) 71 | uri = "".join([encode(c) for c in uri]) 72 | return uri 73 | 74 | 75 | if __name__ == "__main__": 76 | import unittest 77 | 78 | class Test(unittest.TestCase): 79 | def test_uris(self): 80 | """Test that URIs are invariant under the transformation.""" 81 | invariant = [ 82 | "ftp://ftp.is.co.za/rfc/rfc1808.txt", 83 | "http://www.ietf.org/rfc/rfc2396.txt", 84 | "ldap://[2001:db8::7]/c=GB?objectClass?one", 85 | "mailto:John.Doe@example.com", 86 | "news:comp.infosystems.www.servers.unix", 87 | "tel:+1-816-555-1212", 88 | "telnet://192.0.2.16:80/", 89 | "urn:oasis:names:specification:docbook:dtd:xml:4.1.2", 90 | ] 91 | for uri in invariant: 92 | self.assertEqual(uri, iri2uri(uri)) 93 | 94 | def test_iri(self): 95 | """Test that the right type of escaping is done for each part of the URI.""" 96 | self.assertEqual( 97 | "http://xn--o3h.com/%E2%98%84", 98 | iri2uri("http://\N{COMET}.com/\N{COMET}"), 99 | ) 100 | self.assertEqual( 101 | "http://bitworking.org/?fred=%E2%98%84", 102 | iri2uri("http://bitworking.org/?fred=\N{COMET}"), 103 | ) 104 | self.assertEqual( 105 | "http://bitworking.org/#%E2%98%84", 106 | iri2uri("http://bitworking.org/#\N{COMET}"), 107 | ) 108 | self.assertEqual("#%E2%98%84", iri2uri("#\N{COMET}")) 109 | self.assertEqual( 110 | "/fred?bar=%E2%98%9A#%E2%98%84", 111 | iri2uri("/fred?bar=\N{BLACK LEFT POINTING INDEX}#\N{COMET}"), 112 | ) 113 | self.assertEqual( 114 | "/fred?bar=%E2%98%9A#%E2%98%84", 115 | iri2uri(iri2uri("/fred?bar=\N{BLACK LEFT POINTING INDEX}#\N{COMET}")), 116 | ) 117 | self.assertNotEqual( 118 | "/fred?bar=%E2%98%9A#%E2%98%84", 119 | iri2uri( 120 | "/fred?bar=\N{BLACK LEFT POINTING INDEX}#\N{COMET}".encode("utf-8") 121 | ), 122 | ) 123 | 124 | unittest.main() 125 | -------------------------------------------------------------------------------- /shotgun_api3/lib/mockgun/schema.py: -------------------------------------------------------------------------------- 1 | """ 2 | ----------------------------------------------------------------------------- 3 | Copyright (c) 2009-2017, Shotgun Software Inc 4 | 5 | Redistribution and use in source and binary forms, with or without 6 | modification, are permitted provided that the following conditions are met: 7 | 8 | - Redistributions of source code must retain the above copyright notice, this 9 | list of conditions and the following disclaimer. 10 | 11 | - Redistributions in binary form must reproduce the above copyright notice, 12 | this list of conditions and the following disclaimer in the documentation 13 | and/or other materials provided with the distribution. 14 | 15 | - Neither the name of the Shotgun Software Inc nor the names of its 16 | contributors may be used to endorse or promote products derived from this 17 | software without specific prior written permission. 18 | 19 | THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" 20 | AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE 21 | IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE 22 | DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE 23 | FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL 24 | DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR 25 | SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER 26 | CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, 27 | OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE 28 | OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 29 | 30 | ----------------------------------------------------------------------------- 31 | """ 32 | 33 | import os 34 | import pickle 35 | 36 | from .errors import MockgunError 37 | 38 | 39 | class SchemaFactory(object): 40 | """ 41 | Allows to instantiate a pickled schema. 42 | """ 43 | 44 | _schema_entity_cache = None 45 | _schema_entity_cache_path = None 46 | _schema_cache = None 47 | _schema_cache_path = None 48 | 49 | @classmethod 50 | def get_schemas(cls, schema_path: str, schema_entity_path: str) -> tuple: 51 | """ 52 | Retrieves the schemas from disk. 53 | 54 | :param str schema_path: Path to the schema. 55 | :param str schema_entity_path: Path to the entities schema. 56 | 57 | :returns: Pair of dictionaries holding the schema and entities schema. 58 | :rtype: tuple 59 | """ 60 | if not os.path.exists(schema_path): 61 | raise MockgunError("Cannot locate Mockgun schema file '%s'!" % schema_path) 62 | 63 | if not os.path.exists(schema_entity_path): 64 | raise MockgunError("Cannot locate Mockgun schema file '%s'!" % schema_entity_path) 65 | 66 | # Poor man's attempt at a cache. All of our use cases deal with a single pair of files 67 | # for the duration of the unit tests, so keep a cache for both inputs. We don't want 68 | # to deal with ever growing caches anyway. Just having this simple cache has shown 69 | # speed increases of up to 500% for Toolkit unit tests alone. 70 | 71 | if schema_path != cls._schema_cache_path: 72 | cls._schema_cache = cls._read_file(schema_path) 73 | cls._schema_cache_path = schema_path 74 | 75 | if schema_entity_path != cls._schema_entity_cache_path: 76 | cls._schema_entity_cache = cls._read_file(schema_entity_path) 77 | cls._schema_entity_cache_path = schema_entity_path 78 | 79 | return cls._schema_cache, cls._schema_entity_cache 80 | 81 | @classmethod 82 | def _read_file(cls, path): 83 | with open(path, "rb") as fh: 84 | return pickle.load(fh) 85 | 86 | 87 | # Highest protocol that Python 2.4 supports, which is the earliest version of Python we support. 88 | # Actually, this is the same version that Python 2.7 supports at the moment! 89 | _HIGHEST_24_PICKLE_PROTOCOL = 2 90 | 91 | 92 | # ---------------------------------------------------------------------------- 93 | # Utility methods 94 | def generate_schema(shotgun, schema_file_path, schema_entity_file_path): 95 | """ 96 | Helper method for mockgun. 97 | Generates the schema files needed by the mocker by connecting to a real shotgun 98 | and downloading the schema information for that site. Once the generated schema 99 | files are being passed to mockgun, it will mimic the site's schema structure. 
100 | 101 | :param shotgun: An authenticated Shotgun connection to the site whose schema 102 | should be downloaded 103 | 104 | :param schema_file_path: Path where to write the main schema file to 105 | :param schema_entity_file_path: Path where to write the entity schema file to 106 | """ 107 | 108 | schema = shotgun.schema_read() 109 | fh = open(schema_file_path, "wb") 110 | try: 111 | pickle.dump(schema, fh, protocol=_HIGHEST_24_PICKLE_PROTOCOL) 112 | finally: 113 | fh.close() 114 | 115 | schema_entity = shotgun.schema_entity_read() 116 | fh = open(schema_entity_file_path, "wb") 117 | try: 118 | pickle.dump(schema_entity, fh, protocol=_HIGHEST_24_PICKLE_PROTOCOL) 119 | finally: 120 | fh.close() 121 | -------------------------------------------------------------------------------- /docs/authentication.rst: -------------------------------------------------------------------------------- 1 | ############## 2 | Authentication 3 | ############## 4 | 5 | In order to communicate with your server via the API, you must provide valid authentication credentials. The API allows you to authenticate with user-based or script-based credentials. 6 | 7 | ************************* 8 | User-based Authentication 9 | ************************* 10 | When authenticating as a user, you provide your normal login and password when instantiating your :class:`shotgun_api3.Shotgun` object. The actions performed by this instance will be limited to your permission level just as they are in the web application. :: 11 | 12 | sg = shotgun_api3.Shotgun("https://my-site.shotgrid.autodesk.com", 13 | login="rhendriks", 14 | password="c0mPre$Hi0n") 15 | 16 | 17 | *************************** 18 | Script-based Authentication 19 | *************************** 20 | In order to authenticate as a script, your script must be :ref:`registered with Flow Production Tracking and have a valid API key `. When creating your :class:`shotgun_api3.Shotgun` object, provide the ``script_name`` and ``api_key``:: 21 | 22 | sg = shotgun_api3.Shotgun("https://my-site.shotgrid.autodesk.com", 23 | script_name="compress", 24 | api_key="0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef") 25 | 26 | .. note:: When using script-based authentication, we **strongly** recommend you register each script separately with Flow Production Tracking and have individual API keys for each. This allows you to track down each of your scripts and the actions they are performing much more accurately in the event logs. 27 | 28 | 29 | .. _setting_up_shotgrid: 30 | 31 | Adding Script Users 32 | =================== 33 | If you'll be using script-based authentication, you need to create a Script entity in Flow Production Tracking. To create a new key, click the + button on the "Scripts" page in the Admin section and give your script a useful name. It's a good idea to add any other relevant information that will be helpful to your fellow Flow Production Tracking users, such as a description of what the script using this key does, the email address of the maintainer, etc.: 34 | 35 | .. image:: images/scripts_page.png 36 | 37 | Once you save your new Script entity, Flow Production Tracking will automatically generate an application key which will act as the script's password. The key will look something like this: ``0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef``. 38 | 39 | Why have different application keys for different scripts?
40 | ========================================================== 41 | We recommend you create a new Script entity (and application key) for each script that uses script-based authentication so you can accurately log which script is doing what in case one of them causes problems. This will also allow you to better see which scripts are performing which actions in the EventLog. We've found that even though you may *think* you'll probably never need to know, the extra 2 minutes of setup now can prevent hours of headache in the future. 42 | 43 | Event Logging 44 | ============= 45 | By default, events generated by scripts using script-based authentication are logged in Flow Production Tracking's event log. You can turn this off by unchecking the "Generate Events" checkbox either in the script detail page or from the main Scripts admin page in Flow Production Tracking. 46 | 47 | .. note:: Turning off event logging will also prevent any email notifications from being triggered by your scripts since the email notifier relies on the event log to find events to notify for. 48 | 49 | Scripts using user-based authentication will generate events just as if you were performing the same actions in the Flow Production Tracking web application, though there is some additional metadata stored in the ``EventLogEntry`` that identifies the event as created from a script acting on behalf of the user. 50 | 51 | Why would you want to turn event logging off for scripts? 52 | --------------------------------------------------------- 53 | It is an optimization that is not used often, but some users have integration scripts that push data into Flow Production Tracking just for reference, like publishes from their asset management system. This publish data is never changed later, so the data itself has the entire history, and the events would just clutter the event log. The event log can grow very large. So if you have no need to audit the history of what your script does, and it's generating a large amount of event log entries, you may find it's not necessary to create these events. 54 | 55 | *********** 56 | Permissions 57 | *********** 58 | Users and scripts are both bound by the restrictions of their permission role in Flow Production Tracking. The permission role is assigned by the **Permission Role** field for each entity type. 59 | 60 | For Scripts, the default permission role is "API Admin User" which allows full access to create, update, and delete entities and fields, including editing the "date created" audit field and creating event log entries. If you have other permission roles for ApiUsers, you can set the default role that will be assigned when a new script is created, in your Flow Production Tracking site preferences. 61 | 62 | When using user-based authentication in your script, it will be bound by the permission role assigned to you in Flow Production Tracking. For example, if you don't have access to edit the status field on Shots, your script won't be able to either. Attempting to perform actions that are prohibited by permissions will raise an appropriate exception. 63 | 64 | .. seealso:: `Permissions Documentation `_ 65 | -------------------------------------------------------------------------------- /software_credits: -------------------------------------------------------------------------------- 1 | The Flow Production Tracking Python API uses the following software. 2 | Thanks to their creators, license information below.
3 | 4 | ============================== PYTHON ============================== 5 | 6 | Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, 7 | 2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019 Python Software Foundation. All 8 | rights reserved. 9 | 10 | Copyright (c) 2000 BeOpen.com. 11 | All rights reserved. 12 | 13 | Copyright (c) 1995-2001 Corporation for National Research Initiatives. 14 | All rights reserved. 15 | 16 | Copyright (c) 1991-1995 Stichting Mathematisch Centrum. 17 | All rights reserved. 18 | 19 | 1. This LICENSE AGREEMENT is between the Python Software Foundation ("PSF"), and 20 | the Individual or Organization ("Licensee") accessing and otherwise using Python 21 | 2.7.17 software in source or binary form and its associated documentation. 22 | 23 | 2. Subject to the terms and conditions of this License Agreement, PSF hereby 24 | grants Licensee a nonexclusive, royalty-free, world-wide license to reproduce, 25 | analyze, test, perform and/or display publicly, prepare derivative works, 26 | distribute, and otherwise use Python 2.7.17 alone or in any derivative 27 | version, provided, however, that PSF's License Agreement and PSF's notice of 28 | copyright, i.e., "Copyright © 2001-2020 Python Software Foundation; All Rights 29 | Reserved" are retained in Python 2.7.17 alone or in any derivative version 30 | prepared by Licensee. 31 | 32 | 3. In the event Licensee prepares a derivative work that is based on or 33 | incorporates Python 2.7.17 or any part thereof, and wants to make the 34 | derivative work available to others as provided herein, then Licensee hereby 35 | agrees to include in any such work a brief summary of the changes made to Python 36 | 2.7.17. 37 | 38 | 4. PSF is making Python 2.7.17 available to Licensee on an "AS IS" basis. 39 | PSF MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR IMPLIED. BY WAY OF 40 | EXAMPLE, BUT NOT LIMITATION, PSF MAKES NO AND DISCLAIMS ANY REPRESENTATION OR 41 | WARRANTY OF MERCHANTABILITY OR FITNESS FOR ANY PARTICULAR PURPOSE OR THAT THE 42 | USE OF PYTHON 2.7.17 WILL NOT INFRINGE ANY THIRD PARTY RIGHTS. 43 | 44 | 5. PSF SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF PYTHON 2.7.17 45 | FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS AS A RESULT OF 46 | MODIFYING, DISTRIBUTING, OR OTHERWISE USING PYTHON 2.7.17, OR ANY DERIVATIVE 47 | THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF. 48 | 49 | 6. This License Agreement will automatically terminate upon a material breach of 50 | its terms and conditions. 51 | 52 | 7. Nothing in this License Agreement shall be deemed to create any relationship 53 | of agency, partnership, or joint venture between PSF and Licensee. This License 54 | Agreement does not grant permission to use PSF trademarks or trade name in a 55 | trademark sense to endorse or promote products or services of Licensee, or any 56 | third party. 57 | 58 | 8. By copying, installing or otherwise using Python 2.7.17, Licensee agrees 59 | to be bound by the terms and conditions of this License Agreement. 60 | 61 | 62 | ============================== Certifi ============================== 63 | 64 | Copyright © 2024 Contributors 65 | This Autodesk software contains the unmodified python-certifi 2024.07.04 66 | package. The use and distribution terms for this software are covered by the 67 | Mozilla Public License 2.0 (https://www.mozilla.org/en-US/MPL/2.0/ ). By using 68 | this software in any fashion, you are agreeing to be bound by the terms of this 69 | license. 
The source code for python-certifi is available from 70 | https://github.com/certifi/python-certifi/releases/tag/2024.07.04 71 | 72 | 73 | ============================== Httplib2 ============================== 74 | Httplib2 Software License 75 | 76 | Copyright (c) 2006 by Joe Gregorio 77 | 78 | Permission is hereby granted, free of charge, to any person 79 | obtaining a copy of this software and associated documentation 80 | files (the "Software"), to deal in the Software without restriction, 81 | including without limitation the rights to use, copy, modify, merge, 82 | publish, distribute, sublicense, and/or sell copies of the Software, 83 | and to permit persons to whom the Software is furnished to do so, 84 | subject to the following conditions: 85 | 86 | The above copyright notice and this permission notice shall be 87 | included in all copies or substantial portions of the Software. 88 | 89 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, 90 | EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES 91 | OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND 92 | NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS 93 | BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN 94 | ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN 95 | CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 96 | SOFTWARE. 97 | 98 | 99 | ============================== pyparsing ============================== 100 | 101 | Copyright (c) 2003-2019 Paul T. McGuire 102 | 103 | Permission is hereby granted, free of charge, to any person obtaining 104 | a copy of this software and associated documentation files (the 105 | "Software"), to deal in the Software without restriction, including 106 | without limitation the rights to use, copy, modify, merge, publish, 107 | distribute, sublicense, and/or sell copies of the Software, and to 108 | permit persons to whom the Software is furnished to do so, subject to 109 | the following conditions: 110 | 111 | The above copyright notice and this permission notice shall be 112 | included in all copies or substantial portions of the Software. 113 | 114 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, 115 | EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF 116 | MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 117 | IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY 118 | CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, 119 | TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE 120 | SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 121 | 122 | 123 | ============================== SIX ============================== 124 | 125 | Copyright (c) 2010-2020 Benjamin Peterson 126 | 127 | Permission is hereby granted, free of charge, to any person obtaining a copy of 128 | this software and associated documentation files (the "Software"), to deal in 129 | the Software without restriction, including without limitation the rights to 130 | use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of 131 | the Software, and to permit persons to whom the Software is furnished to do so, 132 | subject to the following conditions: 133 | 134 | The above copyright notice and this permission notice shall be included in all 135 | copies or substantial portions of the Software. 
136 | 137 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 138 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS 139 | FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR 140 | COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER 141 | IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN 142 | CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 143 | -------------------------------------------------------------------------------- /docs/cookbook/tasks/split_tasks.rst: -------------------------------------------------------------------------------- 1 | .. _split_tasks: 2 | 3 | ########### 4 | Split Tasks 5 | ########### 6 | 7 | Split tasks can be created and edited via the API but must comply with some rules. Before going 8 | further, a good understanding of :ref:`how Flow Production Tracking handles task dates is useful `. 9 | 10 | ******** 11 | Overview 12 | ******** 13 | 14 | The Task entity has a field called ``splits`` which is a list of dictionaries. Each dictionary 15 | in the list has two string keys, ``start`` and ``end``, whose values are strings representing dates 16 | in the ``YYYY-mm-dd`` format. 17 | 18 | :: 19 | 20 | [{'start': '2012-12-11', 'end': '2012-12-12'}, {'start': '2012-12-18', 'end': '2012-12-19'}] 21 | 22 | - Splits should be ordered from oldest to newest. 23 | - There should be gaps between each split. 24 | 25 | - Gaps are defined as at least one working day. Non-workdays such as weekends and holidays 26 | are not gaps. 27 | 28 | If there is no gap between two or more splits, an error will be 29 | raised. For example:: 30 | 31 | >>> sg.update('Task', 2088, {'splits':[{'start':'2012-12-10', 'end':'2012-12-11'}, {'start':'2012-12-12', 'end':'2012-12-14'}, {'start':'2012-12-19', 'end':'2012-12-20'}]}) 32 | Traceback (most recent call last): 33 | File "", line 1, in 34 | File "/shotgun/src/python-api/shotgun_api3/shotgun.py", line 600, in update 35 | record = self._call_rpc("update", params) 36 | File "/shotgun/src/python-api/shotgun_api3/shotgun.py", line 1239, in _call_rpc 37 | self._response_errors(response) 38 | File "/shotgun/src/python-api/shotgun_api3/shotgun.py", line 1435, in _response_errors 39 | "Unknown Error")) 40 | shotgun_api3.shotgun.Fault: API update() CRUD ERROR #5: Update failed for [Task.splits]: (task.rb) The start date in split segment 2 is only one calendar day away from the end date of the previous segment. There must be calendar days between split segments. 41 | 42 | Alternatively, the ``splits`` value can be set to ``None`` to remove splits (you can also use an empty list). 43 | This will preserve the ``start_date`` and ``due_date`` values but recalculate the ``duration`` value 44 | while appropriately considering all workday rules in effect. 45 | 46 | ******************************************************** 47 | How Do Splits Influence Dates And Dates Influence Splits 48 | ******************************************************** 49 | 50 | - If splits are specified, the supplied ``start_date``, ``due_date`` and ``duration`` fields will be ignored. 51 | - The ``start_date`` will be inferred from the earliest split. 52 | - The ``due_date`` will be inferred from the last split.
53 | - If the ``start_date`` is changed on a task that has splits the first split will be moved to start 54 | on the new ``start_date`` and all further splits will be moved while maintaining gap lengths 55 | between splits and respecting workday rules. 56 | - If the ``due_date`` is changed on a task that has splits the last split will be moved to end on 57 | the new ``due_date`` and all prior splits will be moved while maintaining gap lengths between 58 | splits and respecting workday rules. 59 | - If the ``duration`` is changed two scenarios are possible 60 | 61 | - In the case of a longer duration, additional days will be added to the end of the last split 62 | - In the case of a shorter duration splits, starting with the latest ones, will be either 63 | removed or shortened until the new duration is met. 64 | 65 | Examples for splitting Tasks 66 | ============================ 67 | Throughout the following examples, each successive one will build on the previous. 68 | 69 | start_date, due_date and duration being ignored 70 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ 71 | 72 | :: 73 | 74 | sg.update('Task', 2088, { 75 | 'start_date': '2012-12-06', 76 | 'due_date': '2012-12-23', 77 | 'duration': 3600, 78 | 'splits': [ 79 | {'start': '2012-12-11', 'end': '2012-12-12'}, 80 | {'start': '2012-12-18', 'end': '2012-12-19'} 81 | ] 82 | }) 83 | 84 | # Task = { 85 | # 'start_date': '2012-12-11', 86 | # 'due_date': '2012-12-19', 87 | # 'duration': 2400, 88 | # 'splits': [ 89 | # {'start': '2012-12-11', 'end': '2012-12-12'}, 90 | # {'start': '2012-12-18', 'end': '2012-12-19'} 91 | # ] 92 | # } 93 | 94 | Result: 95 | 96 | .. image:: /images/split_tasks_1.png 97 | 98 | Moving the start_date of a split task 99 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ 100 | 101 | :: 102 | 103 | sg.update('Task', 2088, { 104 | 'start_date': '2012-12-10' 105 | }) 106 | 107 | # Task = { 108 | # 'start_date': '2012-12-10', 109 | # 'due_date': '2012-12-18', 110 | # 'splits': [ 111 | # {'start': '2012-12-10', 'end': '2012-12-11'}, 112 | # {'start': '2012-12-14', 'end': '2012-12-18'} 113 | # ] 114 | # } 115 | 116 | Result: 117 | 118 | .. image:: /images/split_tasks_2.png 119 | 120 | Moving the due_date of a split task 121 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ 122 | 123 | :: 124 | 125 | sg.update('Task', 2088, { 126 | 'due_date': '2012-12-19' 127 | }) 128 | 129 | # Task = { 130 | # 'start_date': '2012-12-10', 131 | # 'due_date': '2012-12-19', 132 | # 'splits': [ 133 | # {'start': '2012-12-10', 'end': '2012-12-11'}, 134 | # {'start': '2012-12-14', 'end': '2012-12-19'} 135 | # ] 136 | # } 137 | 138 | Result: 139 | 140 | .. image:: /images/split_tasks_3.png 141 | 142 | Setting a longer duration 143 | ~~~~~~~~~~~~~~~~~~~~~~~~~ 144 | 145 | :: 146 | 147 | sg.update('Task', 2088, { 148 | 'duration': 4200 149 | }) 150 | 151 | # Task = { 152 | # 'start_date': '2012-12-10', 153 | # 'due_date': '2012-12-21', 154 | # 'duration': 4200, 155 | # 'splits': [ 156 | # {'start': '2012-12-10', 'end': '2012-12-11'}, 157 | # {'start': '2012-12-14', 'end': '2012-12-21'} 158 | # ] 159 | # } 160 | 161 | Result: 162 | 163 | .. 
image:: /images/split_tasks_4.png 164 | 165 | Setting a shorter duration 166 | ~~~~~~~~~~~~~~~~~~~~~~~~~~ 167 | 168 | :: 169 | 170 | sg.update('Task', 2088, { 171 | 'duration': 2400 172 | }) 173 | 174 | # Task = { 175 | # 'start_date': '2012-12-10', 176 | # 'due_date': '2012-12-18', 177 | # 'duration': 2400, 178 | # 'splits': [ 179 | # {'start': '2012-12-10', 'end': '2012-12-11'}, 180 | # {'start': '2012-12-14', 'end': '2012-12-18'} 181 | # ] 182 | # } 183 | 184 | Result: 185 | 186 | .. image:: /images/split_tasks_5.png 187 | 188 | Another example of shorter duration 189 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ 190 | We won't be using the previous result for this example but rather, the following: 191 | 192 | .. image:: /images/split_tasks_6.png 193 | 194 | who's duration we will shorten past the last split. 195 | 196 | :: 197 | 198 | sg.update('Task', 2088, { 199 | 'duration': 1800 200 | }) 201 | 202 | # Task = { 203 | # 'start_date': '2012-12-10', 204 | # 'due_date': '2012-12-18', 205 | # 'duration': 2400, 206 | # 'splits': [ 207 | # {'start': '2012-12-10', 'end': '2012-12-11'}, 208 | # {'start': '2012-12-14', 'end': '2012-12-18'} 209 | # ] 210 | # } 211 | 212 | Result: 213 | 214 | .. image:: /images/split_tasks_7.png 215 | 216 | Setting the due_date in a gap 217 | ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ 218 | 219 | When a due date is set in a gap later splits are removed and the day of the due date is considered 220 | a day when work will be done. 221 | 222 | For this example let's assume as a starting point the result of the 5th example: 223 | 224 | .. image:: /images/split_tasks_8.png 225 | 226 | :: 227 | 228 | sg.update('Task', 2088, { 229 | 'due_date': '2012-12-13' 230 | }) 231 | 232 | # Task = { 233 | # 'start_date': '2012-12-10', 234 | # 'due_date': '2012-12-13', 235 | # 'duration': 1800, 236 | # 'splits': [ 237 | # {'start': '2012-12-10', 'end': '2012-12-11'}, 238 | # {'start': '2012-12-13', 'end': '2012-12-13'} 239 | # ] 240 | # } 241 | 242 | Result: 243 | 244 | .. image:: /images/split_tasks_9.png 245 | -------------------------------------------------------------------------------- /tests/test_api_long.py: -------------------------------------------------------------------------------- 1 | # Copyright (c) 2019 Shotgun Software Inc. 2 | # 3 | # CONFIDENTIAL AND PROPRIETARY 4 | # 5 | # This work is provided "AS IS" and subject to the Shotgun Pipeline Toolkit 6 | # Source Code License included in this distribution package. See LICENSE. 7 | # By accessing, using, copying or modifying this work you indicate your 8 | # agreement to the Shotgun Pipeline Toolkit Source Code License. All rights 9 | # not expressly granted therein are reserved by Shotgun Software Inc. 10 | 11 | """Longer tests for calling the Shotgun API functions. 12 | 13 | Includes the schema functions and the automated searching for all entity types 14 | """ 15 | 16 | from . import base 17 | import random 18 | import shotgun_api3 19 | 20 | 21 | class TestShotgunApiLong(base.LiveTestBase): 22 | def test_automated_find(self): 23 | """Called find for each entity type and read all fields""" 24 | 25 | # Whitelist certain data types for order_field, since no_sorting is not 26 | # currently exposed. These should be good bets to be sortable. 
27 | sortable_types = ("number", "date") 28 | 29 | all_entities = list(self.sg.schema_entity_read().keys()) 30 | direction = "asc" 31 | filter_operator = "all" 32 | limit = 1 33 | page = 1 34 | for entity_type in all_entities: 35 | if entity_type in ( 36 | "Asset", 37 | "Task", 38 | "Shot", 39 | "Attachment", 40 | "Candidate", 41 | "MimProject", 42 | "MimEntity", 43 | "MimField", 44 | ): 45 | continue 46 | print("Finding entity type %s" % entity_type) 47 | 48 | fields = self.sg.schema_field_read(entity_type) 49 | if not fields: 50 | print("No fields for %s skipping" % entity_type) 51 | continue 52 | 53 | # trying to use some different code paths to the other find test 54 | # pivot_column fields aren't valid for sorting so ensure we're 55 | # not using one. 56 | order_field = None 57 | for field_name, field in fields.items(): 58 | # Restrict sorting to only types we know will always be sortable 59 | # Since no_sorting is not exposed to us, we'll have to rely on 60 | # this as a safeguard against trying to sort by a field with 61 | # allow_sorting=false. 62 | if field["data_type"]["value"] in sortable_types: 63 | order_field = field_name 64 | break 65 | # TODO for our test project, we haven't populated these entities.... 66 | order = None 67 | if order_field: 68 | order = [{"field_name": order_field, "direction": direction}] 69 | if "project" in fields: 70 | filters = [["project", "is", self.project]] 71 | else: 72 | filters = [] 73 | 74 | records = self.sg.find( 75 | entity_type, 76 | filters, 77 | fields=list(fields.keys()), 78 | order=order, 79 | filter_operator=filter_operator, 80 | limit=limit, 81 | page=page, 82 | ) 83 | 84 | self.assertTrue(isinstance(records, list)) 85 | 86 | if filter_operator == "all": 87 | filter_operator = "any" 88 | else: 89 | filter_operator = "all" 90 | if direction == "desc": 91 | direction = "asc" 92 | else: 93 | direction = "desc" 94 | limit = (limit % 5) + 1 95 | page = (page % 3) + 1 96 | 97 | def test_schema(self): 98 | """Called schema functions""" 99 | 100 | schema = self.sg.schema_entity_read() 101 | self.assertTrue(schema, dict) 102 | self.assertTrue(len(schema) > 0) 103 | 104 | schema = self.sg.schema_read() 105 | self.assertTrue(schema, dict) 106 | self.assertTrue(len(schema) > 0) 107 | 108 | schema = self.sg.schema_field_read("Version") 109 | self.assertTrue(schema, dict) 110 | self.assertTrue(len(schema) > 0) 111 | 112 | schema = self.sg.schema_field_read("Version", field_name="user") 113 | self.assertTrue(schema, dict) 114 | self.assertTrue(len(schema) > 0) 115 | self.assertTrue("user" in schema) 116 | 117 | # An explanation is in order here. the field code that is created in shotgun is based on the human display name 118 | # that is provided , so for example "Money Count" would generate the field code 'sg_monkey_count' . The field 119 | # that is created in this test is retired at the end of the test but when this test is run again against 120 | # the same database ( which happens on our Continuous Integration server ) trying to create a new field 121 | # called "Monkey Count" will now fail due to the new Delete Field Forever features we have added to shotgun 122 | # since there will a retired field called sg_monkey_count. The old behavior was to go ahead and create a new 123 | # "Monkey Count" field with a field code with an incremented number of the end like sg_monkey_count_1. The new 124 | # behavior is to raise an error in hopes the user will go into the UI and delete the old retired field forever. 
125 | 126 | # make a the name of the field somewhat unique 127 | human_field_name = "Monkey " + str(random.getrandbits(24)) 128 | 129 | properties = {"description": "How many monkeys were needed"} 130 | new_field_name = self.sg.schema_field_create( 131 | "Version", "number", human_field_name, properties=properties 132 | ) 133 | 134 | properties = {"description": "How many monkeys turned up"} 135 | ret_val = self.sg.schema_field_update("Version", new_field_name, properties) 136 | self.assertTrue(ret_val) 137 | 138 | ret_val = self.sg.schema_field_delete("Version", new_field_name) 139 | self.assertTrue(ret_val) 140 | 141 | def test_schema_with_project(self): 142 | """Called schema functions with project""" 143 | 144 | project_entity = {"type": "Project", "id": 0} 145 | 146 | if not self.sg.server_caps.version or self.sg.server_caps.version < (5, 4, 4): 147 | # server does not support this! 148 | self.assertRaises( 149 | shotgun_api3.ShotgunError, self.sg.schema_entity_read, project_entity 150 | ) 151 | self.assertRaises( 152 | shotgun_api3.ShotgunError, self.sg.schema_read, project_entity 153 | ) 154 | self.assertRaises( 155 | shotgun_api3.ShotgunError, 156 | self.sg.schema_field_read, 157 | "Version", 158 | None, 159 | project_entity, 160 | ) 161 | self.assertRaises( 162 | shotgun_api3.ShotgunError, 163 | self.sg.schema_field_read, 164 | "Version", 165 | "user", 166 | project_entity, 167 | ) 168 | 169 | else: 170 | schema = self.sg.schema_entity_read(project_entity) 171 | self.assertTrue(schema, dict) 172 | self.assertTrue(len(schema) > 0) 173 | self.assertTrue("Project" in schema) 174 | self.assertTrue("visible" in schema["Project"]) 175 | 176 | schema = self.sg.schema_read(project_entity) 177 | self.assertTrue(schema, dict) 178 | self.assertTrue(len(schema) > 0) 179 | self.assertTrue("Version" in schema) 180 | self.assertFalse("visible" in schema) 181 | 182 | schema = self.sg.schema_field_read("Version", project_entity=project_entity) 183 | self.assertTrue(schema, dict) 184 | self.assertTrue(len(schema) > 0) 185 | self.assertTrue("user" in schema) 186 | self.assertTrue("visible" in schema["user"]) 187 | 188 | schema = self.sg.schema_field_read("Version", "user", project_entity) 189 | self.assertTrue(schema, dict) 190 | self.assertTrue(len(schema) > 0) 191 | self.assertTrue("user" in schema) 192 | self.assertTrue("visible" in schema["user"]) 193 | 194 | 195 | if __name__ == "__main__": 196 | base.unittest.main() 197 | -------------------------------------------------------------------------------- /azure-pipelines-templates/run-tests.yml: -------------------------------------------------------------------------------- 1 | # ----------------------------------------------------------------------------- 2 | # Copyright (c) 2009-2021, Shotgun Software Inc. 3 | # 4 | # Redistribution and use in source and binary forms, with or without 5 | # modification, are permitted provided that the following conditions are met: 6 | # 7 | # - Redistributions of source code must retain the above copyright notice, this 8 | # list of conditions and the following disclaimer. 9 | # 10 | # - Redistributions in binary form must reproduce the above copyright notice, 11 | # this list of conditions and the following disclaimer in the documentation 12 | # and/or other materials provided with the distribution. 
13 | # 14 | # - Neither the name of the Shotgun Software Inc nor the names of its 15 | # contributors may be used to endorse or promote products derived from this 16 | # software without specific prior written permission. 17 | # 18 | # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" 19 | # AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE 20 | # IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE 21 | # DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE 22 | # FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL 23 | # DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR 24 | # SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER 25 | # CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, 26 | # OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE 27 | # OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 28 | 29 | # This is the list of parameters for this template and their default values. 30 | parameters: 31 | name: '' 32 | vm_image: '' 33 | 34 | jobs: 35 | # The job will be named after the OS and Azure will suffix the strategy to make it unique 36 | # so we'll have a job name "Windows Python 3.9" for example. What's a strategy? Strategies are the 37 | # name of the keys under the strategy.matrix scope. So for each OS we'll have " Python 3.9" and 38 | # " Python 3.10". 39 | - job: ${{ parameters.name }} 40 | pool: 41 | vmImage: ${{ parameters.vm_image }} 42 | # The strategy is another way of removing repetition. It will create one job per entry in the 43 | # matrix. 44 | strategy: 45 | matrix: 46 | # We support these versions of Python. 47 | Python 3.9: 48 | python.version: '3.9' 49 | Python 3.10: 50 | python.version: '3.10' 51 | Python 3.11: 52 | python.version: '3.11' 53 | 54 | maxParallel: 4 55 | 56 | variables: 57 | group: sg-credentials 58 | 59 | # These are the steps that will be executed inside each job. 60 | steps: 61 | # Specifies which version of Python we want to use. That's where the strategy comes in. 62 | # Each job will share this set of steps, but each of them will receive a different 63 | # $(python.version) 64 | # TODO: We should provide `githubToken` if we want to download a python release. 65 | # Otherwise we may hit the GitHub anonymous download limit. 66 | - task: UsePythonVersion@0 67 | inputs: 68 | versionSpec: '$(python.version)' 69 | addToPath: True 70 | 71 | # Install all dependencies needed for running the tests. This command is good 72 | # for all OSes 73 | - script: | 74 | python -m pip install --upgrade pip setuptools wheel 75 | python -m pip install -r tests/ci_requirements.txt 76 | displayName: Install tools 77 | 78 | # The {{}} syntax is meant for the the pre-processor of Azure pipeline. Every statement inside 79 | # a {{}} block will be evaluated and substituted before the file is parsed to create the jobs. 80 | # So here we're inserting an extra step if the template is being invoked for Windows. 81 | - ${{ if eq(parameters.name, 'Windows') }}: 82 | # On Windows, we need to update the certificates, the cert store is missing the newer one 83 | # from Amazon like some clients experienced a while back. Who would have thought Microsoft 84 | # would have been out of date! 
;) 85 | - powershell: | 86 | $cert_url = "https://www.amazontrust.com/repository/SFSRootCAG2.cer" 87 | $cert_file = New-TemporaryFile 88 | Invoke-WebRequest -Uri $cert_url -UseBasicParsing -OutFile $cert_file.FullName 89 | Import-Certificate -FilePath $cert_file.FullName -CertStoreLocation Cert:\LocalMachine\Root 90 | displayName: Updating OS Certificates 91 | 92 | # Runs the tests and generates test coverage. The tests results are uploaded to Azure Pipelines in the 93 | # Tests tab of the build and each test run will be named after the --test-run-title argument to pytest, 94 | # for example 'Windows - 2.7' 95 | - bash: | 96 | cp ./tests/example_config ./tests/config 97 | pytest --durations=0 -v --cov shotgun_api3 --cov-report xml --test-run-title="${{parameters.name}}-$(python.version)" 98 | displayName: Running tests 99 | env: 100 | # Pass the values needed to authenticate with the Flow Production Tracking site and create some entities. 101 | # Remember, on a pull request from a client or on forked repos, those variables 102 | # will be empty! 103 | SG_SERVER_URL: $(ci_site) 104 | SG_SCRIPT_NAME: $(ci_site_script_name) 105 | SG_API_KEY: $(ci_site_script_key) 106 | # The unit tests manipulate the user and project during the tests, which can cause collisions, 107 | # so sandbox each build variant. 108 | # Ideally, we would use the agent name here. The problem is that the agent name is in a build 109 | # variable, so we can't edit it's value through a ${{replace(a,b,c)}} expression, which are evaluated before 110 | # build variables are available. Because of this, we need to find another way to generate a 111 | # unique login. So instead, we'll use the the name of the platform and the python version, 112 | # which should make it unique. 113 | SG_HUMAN_LOGIN: $(python_api_human_login)-${{parameters.name}}-$(python.version) 114 | # This will give a user name like 'something macOS 2.7' 115 | SG_HUMAN_NAME: $(python_api_human_name) ${{parameters.name}} $(python.version) 116 | SG_HUMAN_PASSWORD: $(python_api_human_password) 117 | # So, first, we need to make sure that two builds running at the same time do not manipulate 118 | # the same entities, so we're sandboxing build nodes based on their name. 119 | SG_PROJECT_NAME: Python API CI - $(Agent.Name) 120 | # The entities created and then reused between tests assume that the same user is always 121 | # manipulating them. Because different builds will be assigned different agents and therefore 122 | # different projects, it means each project needs to have an entity specific to a given user. 123 | # Again, this would have been a lot simpler if we could simply have had a login based on the 124 | # agent name, but alas, the agent name has a space in it which needs to be replaced to something 125 | # else and string substitution can't be made on build variables, only template parameters. 126 | SG_ASSET_CODE: CI-$(python_api_human_login)-${{parameters.name}}-$(python.version) 127 | SG_VERSION_CODE: CI-$(python_api_human_login)-${{parameters.name}}-$(python.version) 128 | SG_SHOT_CODE: CI-$(python_api_human_login)-${{parameters.name}}-$(python.version) 129 | SG_TASK_CONTENT: CI-$(python_api_human_login)-${{parameters.name}}-$(python.version) 130 | SG_PLAYLIST_CODE: CI-$(python_api_human_login)-${{parameters.name}}-$(python.version) 131 | 132 | # Upload the code coverage result to codecov.io. 
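    # (The codecov uploader is a separate binary per operating system, so the
    # conditional steps below fetch and run the Windows, Linux, or macOS build
    # that matches the job executing this template.)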
133 | - ${{ if eq(parameters.name, 'Windows') }}: 134 | - powershell: | 135 | $ProgressPreference = 'SilentlyContinue' 136 | Invoke-WebRequest -Uri https://uploader.codecov.io/latest/windows/codecov.exe -Outfile codecov.exe 137 | .\codecov.exe -f coverage.xml 138 | displayName: Uploading code coverage 139 | - ${{ elseif eq(parameters.name, 'Linux') }}: 140 | - script: | 141 | curl -Os https://uploader.codecov.io/latest/linux/codecov 142 | chmod +x codecov 143 | ./codecov -f coverage.xml 144 | displayName: Uploading code coverage 145 | - ${{ else }}: 146 | - script: | 147 | curl -Os https://uploader.codecov.io/v0.7.3/macos/codecov 148 | chmod +x codecov 149 | ./codecov -f coverage.xml 150 | displayName: Uploading code coverage 151 | -------------------------------------------------------------------------------- /docs/cookbook/examples/svn_integration.rst: -------------------------------------------------------------------------------- 1 | .. _svn_integration: 2 | 3 | ############################ 4 | Subversion (SVN) Integration 5 | ############################ 6 | 7 | Integrating Flow Production Tracking with Subversion consists of two basic parts: 8 | 9 | - Setup a post-commit hook in Subversion. 10 | - Create a Flow Production Tracking API script to create the Revision in Flow Production Tracking. This script will be called by 11 | the post-commit hook. 12 | 13 | **************** 14 | Post-Commit Hook 15 | **************** 16 | 17 | To setup the post-commit hook: 18 | 19 | - Locate the ``post-commit.tmpl`` file, which is found inside the ``hooks`` folder in your 20 | repository directory. This is a template script that has lots of useful comments and can serve 21 | as a starting point for the real thing. 22 | - Create your very own executable script, and save it in the same ``hooks`` folder, name it 23 | ``post-commit``, and give it executable permission. 24 | - In your ``post-commit`` script, invoke your Flow Production Tracking API script. 25 | 26 | If this is entirely new to you, we highly suggest reading up on the topic. O'Reilly has `a free 27 | online guide for Subversion 1.5 and 1.6 28 | `_ 29 | 30 | Here's an example of a post-commit hook that we've made for Subversion 1.6 using an executable 31 | Unix shell script. The last line invokes "shotgun_api_script.py" which is our Python script that 32 | will do all the heavy lifting. Lines 4 thru 8 queue up some objects that we'll use later on. 33 | 34 | .. code-block:: sh 35 | :linenos: 36 | 37 | #!/bin/sh 38 | # POST-COMMIT HOOK 39 | 40 | REPOS="$1" 41 | REV="$2" 42 | 43 | export AUTHOR="$(svnlook author $REPOS --revision $REV)" 44 | export COMMENT="$(svnlook log $REPOS --revision $REV)" 45 | 46 | /Absolute/path/to/shotgun_api_script.py 47 | 48 | Explanation of selected lines 49 | ============================= 50 | 51 | - lines ``4-5``: After the commit, Subversion leaves us two string objects in the environment: 52 | ``REPOS`` and ``REV`` (the repository path and the revision number, respectively). 53 | - lines ``7-8``: Here we use the shell command ``export`` to create two more string objects in the 54 | environment: ``AUTHOR`` and ``COMMENT``. To get each value, we use the ``svnlook`` command with 55 | our ``REPOS`` and ``REV`` values, first with the ``author``, and then with ``log`` subcommand. 56 | These are actually the first two original lines of code - everything else to this point was 57 | pre-written already in the ``post-commit.tmpl`` file. 
nice :) 58 | - line ``10``: This is the absolute path to our Flow Production Tracking API Script. 59 | 60 | *********************************** 61 | Flow Production Tracking API Script 62 | *********************************** 63 | 64 | This script will create the Revision and populate it with some metadata using the Flow Production Tracking Python 65 | API. It will create our Revision in Flow Production Tracking along with the author, comment, and because we use 66 | Trac (a web-based interface for Subversion), it will also populate a URL field with a clickable 67 | link to the Revision. 68 | 69 | .. code-block:: python 70 | :linenos: 71 | 72 | 73 | #!/usr/bin/env python 74 | # --------------------------------------------------------------------------------------------- 75 | 76 | # --------------------------------------------------------------------------------------------- 77 | # Imports 78 | # --------------------------------------------------------------------------------------------- 79 | import sys 80 | from shotgun_api3_preview import Shotgun 81 | import os 82 | 83 | # --------------------------------------------------------------------------------------------- 84 | # Globals - update all of these values to those of your studio 85 | # --------------------------------------------------------------------------------------------- 86 | SERVER_PATH = 'https ://my-site.shotgrid.autodesk.com' # or http: 87 | SCRIPT_USER = 'script_name' 88 | SCRIPT_KEY = '3333333333333333333333333333333333333333' 89 | REVISIONS_PATH = 'https ://serveraddress/trac/changeset/' # or other web-based UI 90 | PROJECT = {'type':'Project', 'id':27} 91 | 92 | # --------------------------------------------------------------------------------------------- 93 | # Main 94 | # --------------------------------------------------------------------------------------------- 95 | if __name__ == '__main__': 96 | 97 | sg = Shotgun(SERVER_PATH, SCRIPT_USER, SCRIPT_KEY) 98 | 99 | # Set Python variables from the environment objects 100 | revision_code = os.environ['REV'] 101 | repository = os.environ['REPOS'] 102 | description = os.environ['COMMENT'] 103 | author = os.environ['AUTHOR'] 104 | 105 | # Set the Trac path for this specific revision 106 | revision_url = REVISIONS_PATH + revision_code 107 | 108 | # Validate that author is a valid Flow Production Tracking HumanUser 109 | result = sg.find_one("HumanUser", [['login', 'is', author]]) 110 | if result: 111 | # Create Revision 112 | url = {'content_type':'http_url', 'url':revision_url, 'name':'Trac'} 113 | parameters = {'project':PROJECT, 114 | 'code':str(revision_code), 115 | 'description':description, 116 | 'attachment':url, 117 | 'created_by':{"type":"HumanUser", "id":result['id']} 118 | } 119 | revision = sg.create("Revision", parameters) 120 | print("created Revision #"+str(revision_code)) 121 | 122 | # Send error message if valid HumanUser is not found 123 | else: 124 | print("Unable to find a valid Flow Production Tracking User with login: {}, Revision not created in Flow Production Tracking.".format(author)) 125 | 126 | 127 | 128 | Explanation of selected lines: 129 | ============================== 130 | 131 | - line ``14``: This should be the URL to your instance of Flow Production Tracking. 132 | - lines ``15-16``: Make sure you get these values from the "Scripts" page in the Admin section of 133 | the Flow Production Tracking web application. If you're not sure how to do this, check out :ref:`authentication`. 
134 | - line ``17``: This is the address of Trac, our web-based interface that we use with Subversion. 135 | You may use a different interface, or none at all, so feel free to adjust this line or ignore it 136 | as your case may be. 137 | - line ``18``: Every Revision in Flow Production Tracking must have a Project, which is passed to the API as a 138 | dictionary with two keys, the ``type`` and the ``id``. Of course the ``type`` value will always 139 | remain ``Project`` (case sensitive), but the ``id`` will change by Project. To find out the 140 | ``id`` of your Project, go to the Projects page in the Flow Production Tracking web application, locate the 141 | Project where you want your Revisions created, and then locate its ``id`` field (which you may 142 | need to display - if you don't see it, right click on any column header then select 143 | "Insert Column" > "Id"). Note that for this example we assume that all Revisions in this 144 | Subversion repository will belong to the same Project. 145 | - lines ``28-31``: Grab the values from the objects that were left for us in the environment. 146 | - line ``34``: Add the Revision number to complete the path of our Trac URL. 147 | - line ``37``: Make sure that a valid User exists in Flow Production Tracking. In our example, we assume that our 148 | Users' Flow Production Tracking logins match their Subversion names. If the user exists in Flow Production Tracking, that 149 | user's ``id`` will be returned as ``result['id']``, which we will need later on in line 46. 150 | - lines ``40-48``: Use all the metadata we've gathered to create a Revision in Flow Production Tracking. If none 151 | of these lines make any sense, check out more on the :meth:`~shotgun_api3.Shotgun.create` method 152 | here. Line 41 deserves special mention: notice that we define a dictionary called ``url`` that 153 | has three important keys: ``content_type``, ``url``, and ``name``, and we then pass this in as 154 | the value for the ``attachment`` field when we create the Revision. If you're ever in doubt, 155 | double check the syntax and requirements for the different field types here. 156 | 157 | *************** 158 | Troubleshooting 159 | *************** 160 | 161 | My post-commit script is simply not running. I can run it manually, but commits are not triggering it. 162 | ====================================================================================================== 163 | 164 | Make sure that the script has explicitly been made executable and that all users who will 165 | invoke it have appropriate permissions for the script and for the folders going back to root. 166 | 167 | My Flow Production Tracking API script is not getting called by the post-commit hook. 168 | ===================================================================================== 169 | 170 | Make sure that the script is called using its absolute path. 171 | -------------------------------------------------------------------------------- /docs/cookbook/examples/ami_handler.rst: -------------------------------------------------------------------------------- 1 | .. _ami_handler: 2 | 3 | ############################### 4 | Handling Action Menu Item Calls 5 | ############################### 6 | 7 | This is an example ActionMenu Python class to handle the ``GET`` request sent from an 8 | ActionMenuItem.
It doesn't manage dispatching custom protocols but rather takes the arguments 9 | from any ``GET`` data and parses them into the easily accessible and correctly typed instance 10 | variables for your Python scripts. 11 | 12 | Available as a Gist at https://gist.github.com/3253287 13 | 14 | .. seealso:: 15 | Our `support site has more information about Action Menu Items 16 | `_. 17 | 18 | ************ 19 | GET vs. POST 20 | ************ 21 | 22 | Action Menu Items that open a url via `http` or `https` to another web server send their data 23 | via ``POST``. If you're using a custom protocol the data is sent via ``GET``. 24 | 25 | .. note:: 26 | Browsers limit the length of a ``GET`` request. If you exceed this limit by attempting to 27 | select a lot of rows and launch your custom protocol, you may encounter 28 | "Failed to load resource" errors in your console. 29 | 30 | .. seealso:: 31 | This `Stack Overflow article "What is the maximum length of a URL in different browsers?" 32 | `_ 33 | 34 | :: 35 | 36 | #!/usr/bin/env python 37 | # encoding: utf-8 38 | 39 | # --------------------------------------------------------------------------------------------- 40 | # Description 41 | # --------------------------------------------------------------------------------------------- 42 | """ 43 | The values sent by the Action Menu Item are in the form of a GET request that is similar to the 44 | format: myCoolProtocol://doSomethingCool?user_id=24&user_login=shotgun&title=All%20Versions&... 45 | 46 | In a more human-readable state that would translate to something like this: 47 | { 48 | 'project_name': 'Demo Project', 49 | 'user_id': '24', 50 | 'title': 'All Versions', 51 | 'user_login': 'shotgun', 52 | 'sort_column': 'created_at', 53 | 'entity_type': 'Version', 54 | 'cols': 'created_at', 55 | 'ids': '5,2', 56 | 'selected_ids': '2,5', 57 | 'sort_direction': 'desc', 58 | 'project_id': '4', 59 | 'session_uuid': 'd8592bd6-fc41-11e1-b2c5-000c297a5f50', 60 | 'column_display_names': 61 | [ 62 | 'Version Name', 63 | 'Thumbnail', 64 | 'Link', 65 | 'Artist', 66 | 'Description', 67 | 'Status', 68 | 'Path to frames', 69 | 'QT', 70 | 'Date Created' 71 | ] 72 | } 73 | 74 | This simple class parses the url into easy to access types variables from the parameters, 75 | action, and protocol sections of the url. This example url 76 | myCoolProtocol://doSomethingCool?user_id=123&user_login=miled&title=All%20Versions&... 77 | would be parsed like this: 78 | 79 | (string) protocol: myCoolProtocol 80 | (string) action: doSomethingCool 81 | (dict) params: user_id=123&user_login=miled&title=All%20Versions&... 82 | 83 | The parameters variable will be returned as a dictionary of string key/value pairs. 
Here's 84 | how to instantiate: 85 | 86 | sa = ShotgunAction(sys.argv[1]) # sys.argv[1] 87 | 88 | sa.params['user_login'] # returns 'miled' 89 | sa.params['user_id'] # returns 123 90 | sa.protocol # returns 'myCoolProtocol' 91 | """ 92 | 93 | 94 | # --------------------------------------------------------------------------------------------- 95 | # Imports 96 | # --------------------------------------------------------------------------------------------- 97 | import sys, os 98 | import logging as logger 99 | 100 | # --------------------------------------------------------------------------------------------- 101 | # Variables 102 | # --------------------------------------------------------------------------------------------- 103 | # location to write logfile for this script 104 | # logging is a bit of overkill for this class, but can still be useful. 105 | logfile = os.path.dirname(sys.argv[0]) + "/shotgun_action.log" 106 | 107 | 108 | # ---------------------------------------------- 109 | # Generic ShotgunAction Exception Class 110 | # ---------------------------------------------- 111 | class ShotgunActionException(Exception): 112 | pass 113 | 114 | 115 | # ---------------------------------------------- 116 | # ShotgunAction Class to manage ActionMenuItem call 117 | # ---------------------------------------------- 118 | class ShotgunAction: 119 | def __init__(self, url): 120 | self.logger = self._init_log(logfile) 121 | self.url = url 122 | self.protocol, self.action, self.params = self._parse_url() 123 | 124 | # entity type that the page was displaying 125 | self.entity_type = self.params["entity_type"] 126 | 127 | # Project info (if the ActionMenuItem was launched from a page not belonging 128 | # to a Project (Global Page, My Page, etc.), this will be blank 129 | if "project_id" in self.params: 130 | self.project = { 131 | "id": int(self.params["project_id"]), 132 | "name": self.params["project_name"], 133 | } 134 | else: 135 | self.project = None 136 | 137 | # Internal column names currently displayed on the page 138 | self.columns = self.params["cols"] 139 | 140 | # Human readable names of the columns currently displayed on the page 141 | self.column_display_names = self.params["column_display_names"] 142 | 143 | # All ids of the entities returned by the query (not just those visible on the page) 144 | self.ids = [] 145 | if len(self.params["ids"]) > 0: 146 | ids = self.params["ids"].split(",") 147 | self.ids = [int(id) for id in ids] 148 | 149 | # All ids of the entities returned by the query in filter format ready 150 | # to use in a find() query 151 | self.ids_filter = self._convert_ids_to_filter(self.ids) 152 | 153 | # ids of entities that were currently selected 154 | self.selected_ids = [] 155 | if len(self.params["selected_ids"]) > 0: 156 | sids = self.params["selected_ids"].split(",") 157 | self.selected_ids = [int(id) for id in sids] 158 | 159 | # All selected ids of the entities returned by the query in filter format ready 160 | # to use in a find() query 161 | self.selected_ids_filter = self._convert_ids_to_filter(self.selected_ids) 162 | 163 | # sort values for the page 164 | # (we don't allow no sort anymore, but not sure if there's legacy here) 165 | if "sort_column" in self.params: 166 | self.sort = { 167 | "column": self.params["sort_column"], 168 | "direction": self.params["sort_direction"], 169 | } 170 | else: 171 | self.sort = None 172 | 173 | # title of the page 174 | self.title = self.params["title"] 175 | 176 | # user info who launched the ActionMenuItem 177 
| self.user = {"id": self.params["user_id"], "login": self.params["user_login"]} 178 | 179 | # session_uuid 180 | self.session_uuid = self.params["session_uuid"] 181 | 182 | # ---------------------------------------------- 183 | # Set up logging 184 | # ---------------------------------------------- 185 | def _init_log(self, filename="shotgun_action.log"): 186 | try: 187 | logger.basicConfig( 188 | level=logger.DEBUG, 189 | format="%(asctime)s %(levelname)-8s %(message)s", 190 | datefmt="%Y-%b-%d %H:%M:%S", 191 | filename=filename, 192 | filemode="w+", 193 | ) 194 | except IOError as e: 195 | raise ShotgunActionException("Unable to open logfile for writing: %s" % e) 196 | logger.info("ShotgunAction logging started.") 197 | return logger 198 | 199 | # ---------------------------------------------- 200 | 201 | # Parse ActionMenuItem call into protocol, action and params 202 | # ---------------------------------------------- 203 | def _parse_url(self): 204 | logger.info("Parsing full url received: %s" % self.url) 205 | 206 | # get the protocol used 207 | protocol, path = self.url.split(":", 1) 208 | logger.info("protocol: %s" % protocol) 209 | 210 | # extract the action 211 | action, params = path.split("?", 1) 212 | action = action.strip("/") 213 | logger.info("action: %s" % action) 214 | 215 | # extract the parameters 216 | # 'column_display_names' and 'cols' occurs once for each column displayed so we store it as a list 217 | params = params.split("&") 218 | p = {"column_display_names": [], "cols": []} 219 | for arg in params: 220 | key, value = map(urllib.parse.unquote, arg.split("=", 1)) 221 | if key == "column_display_names" or key == "cols": 222 | p[key].append(value) 223 | else: 224 | p[key] = value 225 | params = p 226 | logger.info("params: %s" % params) 227 | return (protocol, action, params) 228 | 229 | # ---------------------------------------------- 230 | # Convert IDs to filter format to us in find() queries 231 | # ---------------------------------------------- 232 | def _convert_ids_to_filter(self, ids): 233 | filter = [] 234 | for id in ids: 235 | filter.append(["id", "is", id]) 236 | logger.debug("parsed ids into: %s" % filter) 237 | return filter 238 | 239 | 240 | # ---------------------------------------------- 241 | # Main Block 242 | # ---------------------------------------------- 243 | if __name__ == "__main__": 244 | try: 245 | sa = ShotgunAction(sys.argv[1]) 246 | logger.info("ShotgunAction: Firing... %s" % (sys.argv[1])) 247 | except IndexError as e: 248 | raise ShotgunActionException("Missing GET arguments") 249 | logger.info("ShotgunAction process finished.") 250 | -------------------------------------------------------------------------------- /docs/cookbook/examples/ami_version_packager.rst: -------------------------------------------------------------------------------- 1 | .. _ami_version_packager: 2 | 3 | ######################################################## 4 | Using an ActionMenuItem to Package Versions for a Client 5 | ######################################################## 6 | 7 | This is an example script to demonstrate how you can use an ActionMenuItem to launch a local 8 | script to package up files for a client. It performs the following: 9 | 10 | - Downloads Attachments from a specified field for all selected entities. 11 | - Creates an archive. 12 | - Copies the archive to a specified directory. 13 | 14 | It is intended to be used in conjunction with the script dicussed in :ref:`ami_handler`. 
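Before the full script, here is a minimal sketch of the core idea: find the selected entities, download the attachment stored in a given field, and bundle the downloads into one archive. The ``sa`` and ``sg`` objects come from the :ref:`ami_handler` example, the ``sg_qt`` field and the ``/tmp`` paths are placeholders, and Python's ``tarfile`` module is used here for brevity where the full script shells out to ``tar`` instead::

    import tarfile

    # ``sa`` is a ShotgunAction parsed from sys.argv[1]; ``sg`` is an
    # authenticated shotgun_api3.Shotgun instance.
    versions = sg.find(
        sa.entity_type,
        sa.selected_ids_filter,
        ["code", "sg_qt"],
        filter_operator="any",  # match any of the selected ids
    )

    local_files = []
    for version in versions:
        if version["sg_qt"]:
            # download_attachment() writes the file to disk and returns its
            # path when file_path is given.
            local_files.append(
                sg.download_attachment(
                    version["sg_qt"], file_path="/tmp/%s.mov" % version["code"]
                )
            )

    # Bundle everything that was downloaded into one archive for the client.
    with tarfile.open("/tmp/client_package.tar.gz", "w:gz") as archive:
        for path in local_files:
            archive.add(path)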
15 | 16 | :: 17 | 18 | #!/usr/bin/env python 19 | # encoding: utf-8 20 | """ 21 | version_packager.py 22 | 23 | This example script is meant to be run from an ActionMenuItem in Flow Production Tracking. The menu item uses a custom 24 | protocol in order to launch this script, and is followed by the action 'package4client'. So the full 25 | url would be something like launchme://package4client?.... See: 26 | https://developer.shotgridsoftware.com/python-api/cookbook/examples/ami_handler.html 27 | 28 | It uses the example ActionMenu Python class also located in our docs for parsing the ActionMenuItem 29 | POST variables. For more information about it and accessing the variables in the ActionMenuItem POST request, 30 | See: http://developer.shotgridsoftware.com/python-api/examples/ami_handler 31 | 32 | The purpose of this script is to download attachment files from Flow Production Tracking, create an archive of them 33 | and copy them to a specified directory. You can invoke it with the following minimal example to connect 34 | to Flow Production Tracking, download any file that exists in the specified field ('sg_qt') for each selected_id passed from the 35 | ActionMenu. Then it will create a single archive of the files and move it to the specified directory 36 | ('/path/where/i/want/to/put/the/archive/'). The archive is named with the Project Name, timestamp, and user 37 | login who ran the ActionMenuItem ('Demo_Project_2010-04-29-172210_kp.tar.gz'): 38 | 39 | sa = ShotgunAction(sys.argv[1]) 40 | sg = shotgun_connect() 41 | if sa.action == 'package4client': 42 | r = packageFilesForClient('sg_qt','/path/where/i/want/to/put/the/archive/') 43 | 44 | """ 45 | 46 | # --------------------------------------------------------------------------------------------- 47 | # Imports 48 | # --------------------------------------------------------------------------------------------- 49 | import sys, os 50 | import logging as logger 51 | import subprocess 52 | import re 53 | from datetime import datetime 54 | 55 | from shotgun_api3 import Shotgun 56 | from shotgun_action import ShotgunAction 57 | from pprint import pprint 58 | 59 | # --------------------------------------------------------------------------------------------- 60 | # Variables 61 | # --------------------------------------------------------------------------------------------- 62 | # Flow Production Tracking server auth info 63 | shotgun_conf = { 64 | 'url':'https://my-site.shotgrid.autodesk.com', 65 | 'name':'YOUR_SCRIPT_NAME_HERE', 66 | 'key':'YOUR_SCRIPT_KEY_HERE' 67 | } 68 | 69 | # location to write logfile for this script 70 | logfile = os.path.dirname(sys.argv[0])+"/version_packager.log" 71 | 72 | # temporary directory to download movie files to and create thumbnail files in 73 | file_dir = os.path.dirname(sys.argv[0])+"/tmp" 74 | 75 | # compress command 76 | # tar czf /home/user/backup_www.tar.gz -C / var/www/html 77 | compress_cmd = "tar czf %s -C / %s" 78 | 79 | 80 | 81 | # ---------------------------------------------- 82 | # Generic Flow Production Tracking Exception Class 83 | # ---------------------------------------------- 84 | class ShotgunException(Exception): 85 | pass 86 | 87 | 88 | 89 | # ---------------------------------------------- 90 | # Set up logging 91 | # ---------------------------------------------- 92 | def init_log(filename="version_packager.log"): 93 | try: 94 | logger.basicConfig(level=logger.DEBUG, 95 | format='%(asctime)s %(levelname)-8s %(message)s', 96 | datefmt='%Y-%b-%d %H:%M:%s', 97 | filename=filename, 
98 |                        filemode='w+')
99 |     except IOError as e:
100 |         raise ShotgunException ("Unable to open logfile for writing: %s" % e)
101 |     logger.info("Version Packager logging started.")
102 |     return logger
103 |
104 |
105 | # ----------------------------------------------
106 | # Extract Attachment id from entity field
107 | # ----------------------------------------------
108 | def extract_attachment_id(attachment):
109 |     # extract the Attachment id from the url location
110 |     attachment_id = attachment['url'].rsplit('/',1)[1]
111 |     try:
112 |         attachment_id = int(attachment_id)
113 |     except:
114 |         # not an integer.
115 |         return None
116 |         # raise ShotgunException("invalid Attachment id returned. Expected an integer: %s "% attachment_id)
117 |
118 |     return attachment_id
119 |
120 |
121 | # ----------------------------------------------
122 | # Download Movie to Disk
123 | # ----------------------------------------------
124 | def download_attachment_to_disk(attachment,destination_filename):
125 |     attachment_id = extract_attachment_id(attachment)
126 |     if type(attachment_id) != int:
127 |         return None
128 |     # download the attachment file from Flow Production Tracking and write it to local disk
129 |     logger.info("Downloading Attachment #%s" % (attachment_id))
130 |     stream = sg.download_attachment(attachment_id)
131 |     try:
132 |         file = open(destination_filename, 'wb')
133 |         file.write(stream)
134 |         file.close()
135 |         logger.info("Downloaded attachment %s" % (destination_filename))
136 |         return True
137 |     except Exception as e:
138 |         raise ShotgunException("unable to write attachment to disk: %s"% e)
139 |
140 |
141 | # ----------------------------------------------
142 | # Compress files
143 | # ----------------------------------------------
144 | def compress_files(files,destination_filename):
145 |     destination_filename += ".tar.gz"
146 |     files = [path.lstrip("/") for path in files]
147 |     squish_me = compress_cmd % (destination_filename, " ".join(files) )
148 |     logger.info("Compressing %s files..." % len(files))
149 |     logger.info("Running command: %s" % squish_me)
150 |     try:
151 |         output = subprocess.Popen(squish_me, shell=True, stdout=subprocess.PIPE).stdout.read()
152 |         logger.info('tar/gzip command returned: %s' % output)
153 |     except Exception as e:
154 |         raise ShotgunException("unable to compress files: %s"% e)
155 |     logger.info("compressed files to: %s" % destination_filename)
156 |     return destination_filename
157 |
158 |
159 | # ----------------------------------------------
160 | # Remove downloaded files
161 | # ----------------------------------------------
162 | def remove_downloaded_files(files):
163 |     remove_me = 'rm %s' % ( " ".join(files) )
164 |     logger.info("Removing %s files..." % len(files))
165 |     logger.info("Running command: %s" % remove_me)
166 |     try:
167 |         output = subprocess.Popen(remove_me, shell=True, stdout=subprocess.PIPE).stdout.read()
168 |         logger.info('rm command returned: %s' % output)
169 |         logger.info("removed downloaded files")
170 |         return True
171 |     except Exception as e:
172 |         logger.error("unable to remove files: %s"% e)
173 |         return False
174 |
175 |
176 | # ----------------------------------------------
177 | # Copy files
178 | # ----------------------------------------------
179 | def copy_files(files,destination_directory):
180 |     if type(files) == list:
181 |         files = " ".join(files)
182 |     copy_me_args = "%s %s" % (files, destination_directory)
183 |     logger.info("Running command: mv %s" % copy_me_args)
184 |     try:
185 |         result = subprocess.Popen("mv " + copy_me_args, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
186 |         # wait for mv to finish: 0 = success, 1 = recoverable issues
187 |         if result.wait() > 0:
188 |             response = result.stderr.read()
189 |             logger.error("Copy failed: %s"% response)
190 |             raise ShotgunException("Copy failed: %s"% response)
191 |     except OSError as e:
192 |         raise ShotgunException("unable to copy files: %s"% e)
193 |
194 |     logger.info("copied files to: %s" % destination_directory)
195 |     return destination_directory
196 |
197 |
198 |
199 | def packageFilesForClient(file_field,destination_dir):
200 |
201 |     # get entities matching the selected ids
202 |     logger.info("Querying Shotgun for %s %ss" % (len(sa.selected_ids_filter), sa.params['entity_type']))
203 |     entities = sg.find(sa.params['entity_type'],sa.selected_ids_filter,['id','code',file_field],filter_operator='any')
204 |
205 |     # download the attachments for each entity, zip them, and copy to destination directory
206 |     files = []
207 |     for e in entities:
208 |         if not e[file_field]:
209 |             logger.info("%s #%s: No file exists. Skipping." % (sa.params['entity_type'], e['id']))
210 |         else:
211 |             logger.info("%s #%s: %s" % (sa.params['entity_type'], e['id'], e[file_field]))
212 |             path_to_file = file_dir+"/"+re.sub(r"\s+", '_', e[file_field]['name'])
213 |             result = download_attachment_to_disk(e[file_field], path_to_file )
214 |
215 |             # only include attachments. urls won't return true
216 |             if result:
217 |                 files.append(path_to_file)
218 |
219 |     # compress files
220 |     # create a nice valid destination filename
221 |     project_name = ''
222 |     if 'project_name' in sa.params:
223 |         project_name = re.sub(r"\s+", '_', sa.params['project_name'])+'_'
224 |     dest_filename = project_name+datetime.today().strftime('%Y-%m-%d-%H%M%S')+"_"+sa.params['user_login']
225 |     archive = compress_files(files,file_dir+"/"+dest_filename)
226 |
227 |     # now that we have the archive, remove the downloads
228 |     r = remove_downloaded_files(files)
229 |
230 |     # copy to directory
231 |     result = copy_files([archive],destination_dir)
232 |
233 |     return True
234 |
235 |
236 | # ----------------------------------------------
237 | # Main Block
238 | # ----------------------------------------------
239 | if __name__ == "__main__":
240 |     init_log(logfile)
241 |
242 |     try:
243 |         sa = ShotgunAction(sys.argv[1])
244 |         logger.info("Firing... %s" % (sys.argv[1]))
245 |     except IndexError:
246 |         raise ShotgunException("Missing POST arguments")
247 |
248 |     sg = Shotgun(shotgun_conf['url'], shotgun_conf['name'], shotgun_conf['key'])
249 |
250 |     if sa.action == 'package4client':
251 |         result = packageFilesForClient('sg_qt','/Users/kp/Documents/shotgun/dev/api/files/')
252 |     else:
253 |         raise ShotgunException("Unknown action... :%s" % sa.action)
254 |
255 |
256 |     print("\nVersion Packager done!")
257 |
--------------------------------------------------------------------------------
/docs/cookbook/tasks/updating_tasks.rst:
--------------------------------------------------------------------------------
1 | .. _updating_tasks:
2 |
3 | ########################################################
4 | Updating Task Dates: How Flow Production Tracking Thinks
5 | ########################################################
6 |
7 | When updating Task dates in an API update() request, there is no specified order to the values that
8 | are passed in. Flow Production Tracking also does automatic calculation of the ``start_date``, ``due_date``, and ``duration`` fields for convenience. In order to clarify how updates are handled by Flow Production Tracking, we are
9 | providing some general rules below and examples of what will happen when you make updates to your
10 | Tasks.
11 |
12 | **************
13 | General Rules
14 | **************
15 |
16 | - Updating the ``start_date`` automatically updates the ``due_date`` (``duration`` remains constant)
17 | - Updating the ``due_date`` automatically updates the ``duration`` (``start_date`` remains constant)
18 | - Updating the ``duration`` automatically updates the ``due_date`` (``start_date`` remains constant)
19 | - When updating Task values, Flow Production Tracking sets schedule fields (``milestone``, ``duration``,
20 |   ``start_date``, ``due_date``) after all other fields, because the Project and Task Assignees
21 |   affect schedule calculations.
22 | - If ``start_date`` and ``due_date`` are both set, ``duration`` is ignored (``duration`` can often
23 |   be wrong since it's easy to calculate scheduling incorrectly).
24 | - If both ``start_date`` and ``due_date`` are provided, Flow Production Tracking sets ``start_date`` before
25 |   ``due_date``.
26 | - Set ``milestone`` before other schedule fields (because ``start_date``, ``due_date``, and
27 |   ``duration`` get lost if ``milestone`` is not set to ``False`` first)
28 | - If ``milestone`` is being set to ``True``, ``duration`` is ignored.
29 | - If ``milestone`` is set to ``True`` and ``start_date`` and ``due_date`` are also being set to
30 |   conflicting values, an Exception is raised.
31 | - If ``due_date`` and ``duration`` are set together (without ``start_date``), ``duration`` is set
32 |   first, then ``due_date`` (otherwise setting ``duration`` will change ``due_date`` after it is
33 |   set).
34 |
35 | ***************************
36 | Examples for updating Tasks
37 | ***************************
38 |
39 | The following examples show what the resulting Task object will look like after being run on the
40 | initial Task object listed under the header of each section.
41 |
42 | The ``duration`` values in the following examples assume your Flow Production Tracking instance is set to
43 | 10-hour work days. If your server is configured with a different setting, the ``duration`` values
44 | will vary.
45 |
46 | .. note:: The ``duration`` field stores ``duration`` values in minutes.
47 |
48 |
49 | ----
50 |
51 | ..
rubric:: Universal 52 | 53 | Regardless of current values on the Task, this behavior rules:: 54 | 55 | Task = {'start_date': '2011-05-20', 'due_date': '2011-05-25', 'duration': 2400, 'id':123} 56 | 57 | **Update start_date and due_date** 58 | 59 | ``duration`` is ignored if also provided. It is instead set automatically as (``due_date`` - 60 | ``start_date``) 61 | 62 | :: 63 | 64 | sg.update ('Task', 123, {'start_date':'2011-05-25', 'due_date':'2011-05-30', 'duration':1200}) 65 | # Task = {'start_date': '2011-05-25', 'due_date': '2011-05-30', 'duration': 2400, 'id':123} 66 | 67 | - ``start_date`` is updated. 68 | - ``due_date`` is updated. 69 | - ``duration`` is calculated as (``due_date`` - ``start_date``) 70 | 71 | .. note:: The value provided in the update() is ignored (and in this case was also incorrect). 72 | 73 | **Update start_date and duration** 74 | 75 | :: 76 | 77 | sg.update ('Task', 123, {'start_date':'2011-05-25', 'duration':3600}) 78 | # Task = {'start_date': '2011-05-25', 'due_date': '2011-06-01', 'duration': 3600, 'id':123} 79 | 80 | - ``start_date`` is updated. 81 | - ``duration`` is updated. 82 | - ``due_date`` is updated to (``start_date`` + ``duration``). 83 | 84 | **Update due_date and duration** 85 | 86 | :: 87 | 88 | sg.update ('Task', 123, {'due_date': '2011-05-20', 'duration':3600}) 89 | # Task = {'start_date': '2011-05-20', 'due_date': '2011-05-20', 'duration': 600, 'id':123} 90 | 91 | - ``duration`` is updated. 92 | - ``due_date`` is updated. 93 | - ``duration`` is calculated as (``due_date`` - ``start_date``) 94 | 95 | .. note:: This means the ``duration`` provided is overwritten. 96 | 97 | 98 | ---- 99 | 100 | .. rubric:: Task has start_date only 101 | 102 | If the Task only has a ``start_date`` value and has no other date values, this is how updates 103 | will behave. 104 | 105 | :: 106 | 107 | Task = {'start_date': '2011-05-20', 'due_date': None, 'duration': None, 'id':123} 108 | 109 | **Update start_date** 110 | 111 | :: 112 | 113 | sg.update ('Task', 123, {'start_date':'2011-05-25'}) 114 | # Task = {'start_date': '2011-05-25', 'due_date': None, 'duration': None, 'id':123} 115 | 116 | - Only ``start_date`` is updated. 117 | 118 | **Update due_date** 119 | 120 | :: 121 | 122 | sg.update ('Task', 123, {'due_date':'2011-05-25'}) 123 | # Task = {'start_date': '2011-05-20', 'due_date': '2011-05-25', 'duration': 2400, 'id':123} 124 | 125 | - ``due_date`` is updated. 126 | - ``duration`` is updated to (``due_date`` - ``start_date``). 127 | 128 | **Update duration** 129 | 130 | :: 131 | 132 | sg.update ('Task', 123, {'duration':2400}) 133 | # Task = {'start_date': '2011-05-20', 'due_date': '2011-05-25', 'duration': 2400, 'id':123} 134 | 135 | - ``duration`` is updated. 136 | - ``due_date`` is set to (``start_date`` + ``duration``) 137 | 138 | 139 | ---- 140 | 141 | .. rubric:: Task has due_date only 142 | 143 | If the Task only has a ``due_date`` value and has no other date values, this is how updates 144 | will behave. 145 | 146 | :: 147 | 148 | # Task = {'start_date': None, 'due_date': '2011-05-25', 'duration': None, 'id':123} 149 | 150 | **Update start_date** 151 | 152 | :: 153 | 154 | sg.update ('Task', 123, {'start_date':'2011-05-20'}) 155 | # Task = {'start_date': '2011-05-20', 'due_date': '2011-05-25', 'duration': 2400, 'id':123} 156 | 157 | - ``start_date`` is updated. 158 | - ``duration`` is updated to (``due_date`` - ``start_date``). 
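As a quick sanity check on where the ``2400`` above comes from, here is a small illustrative calculation (assuming the 10-hour work days mentioned earlier and a standard Monday to Friday schedule)::

    # 2011-05-20 is a Friday, so 2011-05-20 through 2011-05-25 covers the
    # working days May 20, 23, 24 and 25: four days once the weekend is skipped.
    working_days = 4
    minutes_per_day = 10 * 60               # duration is stored in minutes
    print(working_days * minutes_per_day)   # 2400

The same arithmetic accounts for the other ``duration`` values used throughout these examples.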
159 | 160 | **Update due_date** 161 | 162 | :: 163 | 164 | sg.update ('Task', 123, {'due_date':'2011-05-20'}) 165 | # Task = {'start_date': None, 'due_date': '2011-05-20', 'duration': None, 'id':123} 166 | 167 | - only ``due_date`` is updated. 168 | 169 | **Update duration** 170 | 171 | :: 172 | 173 | sg.update ('Task', 123, {'duration':2400}) 174 | # Task = {'start_date': '2011-05-20', 'due_date': '2011-05-25', 'duration': 2400, 'id':123} 175 | 176 | - ``duration`` is updated. 177 | - ``start_date`` is set to (``due_date`` - ``duration``) 178 | 179 | 180 | ---- 181 | 182 | .. rubric:: Task has duration only 183 | 184 | If the Task only has a ``duration`` value and has no other date values, this is how updates 185 | will behave. 186 | 187 | :: 188 | 189 | # Task = {'start_date': None, 'due_date': None, 'duration': 2400, 'id':123} 190 | 191 | **Update start_date** 192 | 193 | :: 194 | 195 | sg.update ('Task', 123, {'start_date':'2011-05-20'}) 196 | # Task = {'start_date': '2011-05-20', 'due_date': '2011-05-25', 'duration': 2400, 'id':123} 197 | 198 | - ``start_date`` is updated. 199 | - ``due_date`` is updated to (``start_date`` + ``duration``). 200 | 201 | **Update due_date** 202 | 203 | :: 204 | 205 | sg.update ('Task', 123, {'due_date':'2011-05-25'}) 206 | # Task = {'start_date': '2011-05-20', 'due_date': '2011-05-25', 'duration': 2400, 'id':123} 207 | 208 | - ``due_date`` is updated. 209 | - ``start_date`` is updated to (``due_date`` - ``duration``) 210 | 211 | **Update duration** 212 | 213 | :: 214 | 215 | sg.update ('Task', 123, {'duration':3600}) 216 | # Task = {'start_date': None, 'due_date': None, 'duration': 3600, 'id':123} 217 | 218 | - only ``duration`` is updated. 219 | 220 | 221 | ---- 222 | 223 | .. rubric:: Task has start_date and due_date 224 | 225 | If the Task has ``start_date`` and ``due_date`` values but has no ``duration``, this is how updates 226 | will behave. 227 | 228 | :: 229 | 230 | # Task = {'start_date': '2011-05-20', 'due_date': '2011-05-25', 'duration': None, 'id':123} 231 | 232 | **Update start_date** 233 | 234 | :: 235 | 236 | sg.update ('Task', 123, {'start_date':'2011-05-25'}) 237 | # Task = {'start_date': '2011-05-25', 'due_date': '2011-05-25', 'duration': 600, 'id':123} 238 | 239 | - ``start_date`` is updated. 240 | - ``duration`` is updated to (``due_date`` - ``start_date``). 241 | 242 | **Update due_date** 243 | 244 | :: 245 | 246 | sg.update ('Task', 123, {'due_date':'2011-05-30'}) 247 | # Task = {'start_date': '2011-05-20', 'due_date': '2011-05-30', 'duration': 4200, 'id':123} 248 | 249 | - ``due_date`` is updated. 250 | - ``duration`` is updated to (``due_date`` - ``start_date``) 251 | 252 | **Update duration** 253 | 254 | :: 255 | 256 | sg.update ('Task', 123, {'duration':3600}) 257 | # Task = {'start_date': '2011-05-20', 'due_date': '2011-05-27', 'duration': 3600, 'id':123} 258 | 259 | - ``duration`` is updated. 260 | - ``due_date`` is updated to (``start_date`` + ``duration``) 261 | 262 | 263 | ---- 264 | 265 | .. rubric:: Task has start_date and duration 266 | 267 | If the Task has ``start_date`` and ``duration`` values but has no ``due_date``, this is how updates 268 | will behave. 269 | 270 | :: 271 | 272 | # Task = {'start_date': '2011-05-20', 'due_date': None, 'duration': 2400, 'id':123} 273 | 274 | **Update start_date** 275 | 276 | :: 277 | 278 | sg.update ('Task', 123, {'start_date':'2011-05-25'}) 279 | # Task = {'start_date': '2011-05-25', 'due_date': '2011-05-30', 'duration': 2400, 'id':123} 280 | 281 | - ``start_date`` is updated. 
282 | - ``due_date`` is updated to (``start_date`` + ``duration``).
283 |
284 | **Update due_date**
285 |
286 | ::
287 |
288 |     sg.update ('Task', 123, {'due_date':'2011-05-30'})
289 |     # Task = {'start_date': '2011-05-20', 'due_date': '2011-05-30', 'duration': 4200, 'id':123}
290 |
291 | - ``due_date`` is updated.
292 | - ``duration`` is updated to (``due_date`` - ``start_date``).
293 |
294 | **Update duration**
295 |
296 | ::
297 |
298 |     sg.update ('Task', 123, {'duration':3600})
299 |     # Task = {'start_date': '2011-05-20', 'due_date': '2011-05-27', 'duration': 3600, 'id':123}
300 |
301 | - ``duration`` is updated.
302 | - ``due_date`` is updated to (``start_date`` + ``duration``)
303 |
304 |
305 | ----
306 |
307 | .. rubric:: Task has due_date and duration
308 |
309 | If the Task has ``due_date`` and ``duration`` values but has no ``start_date``, this is how updates
310 | will behave.
311 |
312 | ::
313 |
314 |     # Task = {'start_date': None, 'due_date': '2011-05-25', 'duration': 2400, 'id':123}
315 |
316 | **Update start_date**
317 |
318 | ::
319 |
320 |     sg.update ('Task', 123, {'start_date':'2011-05-25'})
321 |     # Task = {'start_date': '2011-05-25', 'due_date': '2011-05-30', 'duration': 2400, 'id':123}
322 |
323 | - ``start_date`` is updated.
324 | - ``due_date`` is updated to (``start_date`` + ``duration``).
325 |
326 | **Update due_date**
327 |
328 | ::
329 |
330 |     sg.update ('Task', 123, {'due_date':'2011-05-30'})
331 |     # Task = {'start_date': '2011-05-25', 'due_date': '2011-05-30', 'duration': 2400, 'id':123}
332 |
333 | - ``due_date`` is updated.
334 | - ``start_date`` is updated to (``due_date`` - ``duration``).
335 |
336 | **Update duration**
337 |
338 | ::
339 |
340 |     sg.update ('Task', 123, {'duration':3600})
341 |     # Task = {'start_date': '2011-05-18', 'due_date': '2011-05-25', 'duration': 3600, 'id':123}
342 |
343 | - ``duration`` is updated.
344 | - ``start_date`` is updated to (``due_date`` - ``duration``)
345 |
346 |
347 | ----
348 |
349 | .. rubric:: Task has start_date, due_date, and duration
350 |
351 | If the Task has ``start_date``, ``due_date``, and ``duration``, this is how updates
352 | will behave.
353 |
354 | ::
355 |
356 |     # Task = {'start_date': '2011-05-20', 'due_date': '2011-05-25', 'duration': 2400, 'id':123}
357 |
358 | **Update start_date**
359 |
360 | ::
361 |
362 |     sg.update ('Task', 123, {'start_date':'2011-05-25'})
363 |     # Task = {'start_date': '2011-05-25', 'due_date': '2011-05-30', 'duration': 2400, 'id':123}
364 |
365 | - ``start_date`` is updated.
366 | - ``due_date`` is updated to (``start_date`` + ``duration``).
367 |
368 | **Update due_date**
369 |
370 | ::
371 |
372 |     sg.update ('Task', 123, {'due_date':'2011-05-30'})
373 |     # Task = {'start_date': '2011-05-20', 'due_date': '2011-05-30', 'duration': 4200, 'id':123}
374 |
375 | - ``due_date`` is updated.
376 | - ``duration`` is updated to (``due_date`` - ``start_date``)
377 |
378 | **Update duration**
379 |
380 | ::
381 |
382 |     sg.update ('Task', 123, {'duration':3600})
383 |     # Task = {'start_date': '2011-05-20', 'due_date': '2011-05-27', 'duration': 3600, 'id':123}
384 |
385 | - ``duration`` is updated.
386 | - ``due_date`` is updated to (``start_date`` + ``duration``) 387 | -------------------------------------------------------------------------------- /docs/cookbook/usage_tips.rst: -------------------------------------------------------------------------------- 1 | ############## 2 | API Usage Tips 3 | ############## 4 | 5 | Below is a list of helpful tips when using the Flow Production Tracking API3. We have tried to make the API very 6 | simple to use with predictable results while remaining a powerful tool to integrate with your 7 | pipeline. However, there's always a couple of things that crop up that our users might not be 8 | aware of. Those are the types of things you'll find below. We'll be adding to this document over 9 | time as new questions come up from our users that exhibit these types of cases. 10 | 11 | ********* 12 | Importing 13 | ********* 14 | 15 | We strongly recommend you import the entire `shotgun_api3` module instead of just importing the 16 | :class:`shotgun_api3.Shotgun` class from the module. There is other important functionality that 17 | is managed at the module level which may not work as expected if you only import the 18 | :class:`shotgun_api3.Shotgun` object. 19 | 20 | Do:: 21 | 22 | import shotgun_api3 23 | 24 | Don't:: 25 | 26 | from shotgun_api3 import Shotgun 27 | 28 | *************** 29 | Multi-threading 30 | *************** 31 | The Flow Production Tracking API is not thread-safe. If you want to do threading we strongly suggest that you use 32 | one connection object per thread and not share the connection. 33 | 34 | .. _entity-fields: 35 | 36 | ************* 37 | Entity Fields 38 | ************* 39 | 40 | When you do a :meth:`~shotgun_api3.Shotgun.find` or a :meth:`~shotgun_api3.Shotgun.create` call 41 | that returns a field of type **entity** or **multi-entity** (for example the 'Assets' column on Shot), 42 | the entities are returned in a standard dictionary:: 43 | 44 | {'type': 'Asset', 'name': 'redBall', 'id': 1} 45 | 46 | For each entity returned, you will get a ``type``, ``name``, and ``id`` key. This does not mean 47 | there are fields named ``type`` and ``name`` on the Asset. These are only used to provide a 48 | consistent way to represent entities returned via the API. 49 | 50 | - ``type``: the entity type (CamelCase) 51 | - ``name``: the display name of the entity. For most entity types this is the value of the ``code`` 52 | field but not always. For example, on the Ticket and Delivery entities the ``name`` key would 53 | contain the value of the ``title`` field. 54 | 55 | .. _custom_entities: 56 | 57 | ************** 58 | CustomEntities 59 | ************** 60 | Entity types are always referenced by their original names. So if you enable CustomEntity01 and 61 | call it **Widget**. When you access it via the API, you'll still use CustomEntity01 as the 62 | ``entity_type``. 63 | 64 | If you want to be able to remember what all of your CustomEntities represent in a way where you 65 | don't need to go look it up all the time when you're writing a new script, we'd suggest creating 66 | a mapping table or something similar and dumping it in a shared module that your studio uses. 
67 | Something like the following::
68 |
69 |     # studio_globals.py
70 |
71 |     entity_type_map = {
72 |         'Widget': 'CustomEntity01',
73 |         'Foobar': 'CustomEntity02',
74 |         'Baz': 'CustomNonProjectEntity01',
75 |     }
76 |
77 |     # or even simpler, you could use a global like this
78 |     ENTITY_WIDGET = 'CustomEntity01'
79 |     ENTITY_FOOBAR = 'CustomEntity02'
80 |     ENTITY_BAZ = 'CustomNonProjectEntity01'
81 |
82 | Then when you're writing scripts, you don't need to worry about remembering which Custom Entity
83 | "Foobars" are; you just use your global::
84 |
85 |     import shotgun_api3
86 |     import studio_globals
87 |
88 |     sg = shotgun_api3.Shotgun('https://my-site.shotgrid.autodesk.com', 'script_name', '0123456789abcdef0123456789abcdef0123456')
89 |     result = sg.find(studio_globals.ENTITY_WIDGET,
90 |                      filters=[['sg_status_list', 'is', 'ip']],
91 |                      fields=['code', 'sg_shot'])
92 |
93 | .. _connection_entities:
94 |
95 | ******************
96 | ConnectionEntities
97 | ******************
98 |
99 | Connection entities exist behind the scenes for any many-to-many relationship. Most of the time
100 | you won't need to pay any attention to them. But in some cases, you may need to track information
101 | on the instance of one entity's relationship to another.
102 |
103 | For example, when viewing a list of Versions on a Playlist, the Sort Order (``sg_sort_order``) field is an
104 | example of a field that resides on the connection entity between Playlists and Versions. This
105 | connection entity is appropriately called `PlaylistVersionConnection`. Because any Version can
106 | exist in multiple Playlists, the sort order isn't specific to the Version, it's specific to
107 | each *instance* of the Version in a Playlist. These instances are tracked using connection
108 | entities and are accessible just like any other entity type in Flow Production Tracking.
109 |
110 | To find information about your Versions in the Playlist "Director Review" (let's say it has an
111 | ``id`` of 4).
We'd run a query like so:: 112 | 113 | filters = [['playlist', 'is', {'type':'Playlist', 'id':4}]] 114 | fields = ['playlist.Playlist.code', 'sg_sort_order', 'version.Version.code', 'version.Version.user', 'version.Version.entity'] 115 | order=[{'column':'sg_sort_order','direction':'asc'}] 116 | result = sg.find('PlaylistVersionConnection', filters, fields, order) 117 | 118 | 119 | Which returns the following:: 120 | 121 | [{'id': 28, 122 | 'playlist.Playlist.code': 'Director Review', 123 | 'sg_sort_order': 1.0, 124 | 'type': 'PlaylistVersionConnection', 125 | 'version.Version.code': 'bunny_020_0010_comp_v003', 126 | 'version.Version.entity': {'id': 880, 127 | 'name': 'bunny_020_0010', 128 | 'type': 'Shot'}, 129 | 'version.Version.user': {'id': 19, 'name': 'Artist 1', 'type': 'HumanUser'}}, 130 | {'id': 29, 131 | 'playlist.Playlist.code': 'Director Review', 132 | 'sg_sort_order': 2.0, 133 | 'type': 'PlaylistVersionConnection', 134 | 'version.Version.code': 'bunny_020_0020_comp_v003', 135 | 'version.Version.entity': {'id': 881, 136 | 'name': 'bunny_020_0020', 137 | 'type': 'Shot'}, 138 | 'version.Version.user': {'id': 12, 'name': 'Artist 8', 'type': 'HumanUser'}}, 139 | {'id': 30, 140 | 'playlist.Playlist.code': 'Director Review', 141 | 'sg_sort_order': 3.0, 142 | 'type': 'PlaylistVersionConnection', 143 | 'version.Version.code': 'bunny_020_0030_comp_v003', 144 | 'version.Version.entity': {'id': 882, 145 | 'name': 'bunny_020_0030', 146 | 'type': 'Shot'}, 147 | 'version.Version.user': {'id': 33, 'name': 'Admin 5', 'type': 'HumanUser'}}, 148 | {'id': 31, 149 | 'playlist.Playlist.code': 'Director Review', 150 | 'sg_sort_order': 4.0, 151 | 'type': 'PlaylistVersionConnection', 152 | 'version.Version.code': 'bunny_020_0040_comp_v003', 153 | 'version.Version.entity': {'id': 883, 154 | 'name': 'bunny_020_0040', 155 | 'type': 'Shot'}, 156 | 'version.Version.user': {'id': 18, 'name': 'Artist 2', 'type': 'HumanUser'}}, 157 | {'id': 32, 158 | 'playlist.Playlist.code': 'Director Review', 159 | 'sg_sort_order': 5.0, 160 | 'type': 'PlaylistVersionConnection', 161 | 'version.Version.code': 'bunny_020_0050_comp_v003', 162 | 'version.Version.entity': {'id': 884, 163 | 'name': 'bunny_020_0050', 164 | 'type': 'Shot'}, 165 | 'version.Version.user': {'id': 15, 'name': 'Artist 5', 'type': 'HumanUser'}}] 166 | 167 | 168 | - ``version`` is the Version record for this connection instance. 169 | - ``playlist`` is the Playlist record for this connection instance. 170 | - ``sg_sort_order`` is the sort order field on the connection instance. 171 | 172 | We can pull in field values from the linked Playlist and Version entities using dot notation like 173 | ``version.Version.code``. The syntax is ``fieldname.EntityType.fieldname``. In this example, 174 | ``PlaylistVersionConnection`` has a field named ``version``. That field contains a ``Version`` 175 | entity. The field we are interested on the Version is ``code``. Put those together with our f 176 | riend the dot and we have ``version.Version.code``. 177 | 178 | ************************************************************ 179 | Flow Production Tracking UI fields not available via the API 180 | ************************************************************ 181 | 182 | Summary type fields like Query Fields and Pipeline Step summary fields are currently only available 183 | via the UI. Some other fields may not work as expected through the API because they are "display 184 | only" fields made available for convenience and are only available in the browser UI. 
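If you are unsure whether a particular field is a real API field or one of these UI-only conveniences, one quick check is to ask the API for the entity type's schema and see whether the field is listed. A minimal sketch, assuming an existing ``sg`` connection as in the examples above::

    # Fetch the Shot schema and list the field names the API itself knows about.
    shot_schema = sg.schema_field_read('Shot')
    print(sorted(shot_schema.keys()))

    # Check whether a field you plan to query is actually exposed by the API.
    print('smart_cut_in' in shot_schema)

This avoids guesswork when a query errors on a field name that you can see in the browser but that is computed purely in the UI.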
185 | 186 | HumanUser 187 | ========= 188 | 189 | - ``name``: This is a UI-only field that is a combination of the ``firstname`` + ``' '`` + 190 | ``lastname``. 191 | 192 | Shot 193 | ==== 194 | 195 | **Smart Cut Fields**: These fields are available only in the browser UI. You can read more about 196 | smart cut fields and the API in the :ref:`Smart Cut Fields doc `:: 197 | 198 | smart_cut_in 199 | smart_cut_out 200 | smart_cut_duration 201 | smart_cut_summary_display 202 | smart_duration_summary_display 203 | smart_head_in 204 | smart_head_out 205 | smart_head_duration 206 | smart_tail_in 207 | smart_tail_out 208 | smart_tail_duration 209 | smart_working_duration 210 | 211 | 212 | Pipeline Step summary fields on entities 213 | ======================================== 214 | 215 | The Pipeline Step summary fields on entities that have Tasks aren't currently available via the API 216 | and are calculated on the client side in the UI. These fields are like ``step_0``, or ``step_13``. 217 | Note that the Pipeline Step entity itself is available via the API as the entity type ``Step``. 218 | 219 | Query Fields 220 | ============ 221 | 222 | Query fields are also summary fields like Pipeline Steps, the query is run from the client side UI 223 | and therefore is not currently supported in the API. 224 | 225 | ************ 226 | Audit Fields 227 | ************ 228 | You can set the ``created_by`` and ``created_at`` fields via the API at creation time. This is 229 | often useful for when you're importing or migrating data from another source and want to keep the 230 | history in tact. However, you cannot set the ``updated_by`` and ``updated_at`` fields. These are 231 | automatically set whenever an entity is created or updated. 232 | 233 | .. _logging: 234 | 235 | ***************************** 236 | Logging Messages from the API 237 | ***************************** 238 | 239 | The API uses standard python logging but does not define a handler. 240 | 241 | To see the logging output in stdout, define a streamhandler in your script:: 242 | 243 | import logging 244 | import shotgun_api3 as shotgun 245 | logging.basicConfig(level=logging.DEBUG) 246 | 247 | To write logging output from the Flow Production Tracking API to a file, define a file handler in your script:: 248 | 249 | import logging 250 | import shotgun_api3 as shotgun 251 | logging.basicConfig(level=logging.DEBUG, filename='/path/to/your/log') 252 | 253 | To suppress the logging output from the API in a script which uses logging, set the level of the 254 | Flow Production Tracking logger to a higher level:: 255 | 256 | import logging 257 | import shotgun_api3 as shotgun 258 | sg_log = logging.getLogger('shotgun_api3') 259 | sg_log.setLevel(logging.ERROR) 260 | 261 | ************* 262 | Optimizations 263 | ************* 264 | 265 | .. _combining-related-queries: 266 | 267 | Combining Related Queries 268 | ========================= 269 | Reducing round-trips for data via the API can significantly improve the speed of your application. 270 | Much like "Bubble Fields" / "Field Hopping" in the UI, we can poll Flow Production Tracking for data on the fields 271 | of entities linked to our main query, both as a part of the query parameters as well as in the 272 | data returned. 273 | 274 | Starting with a simple and common example, many queries require knowing what project your data is 275 | associated with. 
Without using "field hopping" in an API call, you would first get the project and 276 | then use that data for your follow up query, like so:: 277 | 278 | # Get the project 279 | project_name = 'Big Buck Bunny' 280 | sg_project = sg.find("Project", [['name', 'is', project_name]]) 281 | 282 | # Use project result to get associated shots 283 | sg_shots = sg.find("Shot", [['project', 'is', sg_project]], ['code']) 284 | 285 | With "field hopping" you can combine these queries into:: 286 | 287 | # Get all shots on 'Big Buck Bunny' project 288 | project_name = 'Big Buck Bunny' 289 | sg_shots = sg.find("Shot", [['project.Project.name', 'is', project_name]], ['code']) 290 | 291 | As you can see above, the syntax is to use "``.``" dot notation, joining field names to entity 292 | types in a chain. In this example we start with the field ``project`` on the ``Shot`` entity, then 293 | specify we're looking for the "name" field on the Project entity by specifying ``Project.name``. 294 | 295 | Now that we've demonstrated querying using dot notation, let's take a look at returning linked data 296 | by adding the status of each Sequence entity associated with each Shot in our previous query:: 297 | 298 | # Get shot codes and sequence status all in one query 299 | project_name = 'Big Buck Bunny' 300 | sg_shots = sg.find("Shot", [['project.Project.name', 'is', project_name]], 301 | ['code', 'sg_sequence.Sequence.sg_status_list']) 302 | 303 | The previous examples use the :meth:`~shotgun_api3.Shotgun.find` method. However, it's also applicable 304 | to the :meth:`~shotgun_api3.Shotgun.create` method. 305 | 306 | .. note:: 307 | Due to performance concerns with deep linking, we only support using dot notation chains for 308 | single-entity relationships. This means that if you try to pull data through a multi-entity 309 | field you will not get the desired result. 310 | -------------------------------------------------------------------------------- /docs/cookbook/tasks/task_dependencies.rst: -------------------------------------------------------------------------------- 1 | .. _task_dependencies: 2 | 3 | ################# 4 | Task Dependencies 5 | ################# 6 | 7 | Task dependencies work the same way in the API as they do in the UI. You can filter and sort on 8 | any of the fields. For information about Task Dependencies in Flow Production Tracking, check out the `main 9 | documentation page on our support site 10 | `_ 11 | 12 | ************ 13 | Create Tasks 14 | ************ 15 | 16 | Let's create a couple of Tasks and create dependencies between them. 
First we'll create a "Layout"
17 | Task for our Shot::
18 |
19 |     data = {
20 |         'project': {'type':'Project', 'id':65},
21 |         'content': 'Layout',
22 |         'start_date': '2010-04-28',
23 |         'due_date': '2010-05-05',
24 |         'entity': {'type':'Shot', 'id':860}
25 |     }
26 |     result = sg.create('Task', data)
27 |
28 |
29 | Returns::
30 |
31 |     {'content': 'Layout',
32 |      'due_date': '2010-05-05',
33 |      'entity': {'id': 860, 'name': 'bunny_010_0010', 'type': 'Shot'},
34 |      'id': 556,
35 |      'project': {'id': 65, 'name': 'Demo Animation Project', 'type': 'Project'},
36 |      'start_date': '2010-04-28',
37 |      'type': 'Task'}
38 |
39 |
40 | Now let's create an "Anm" Task for our Shot::
41 |
42 |     data = {
43 |         'project': {'type':'Project', 'id':65},
44 |         'content': 'Anm',
45 |         'start_date': '2010-05-06',
46 |         'due_date': '2010-05-12',
47 |         'entity': {'type':'Shot', 'id':860}
48 |     }
49 |     result = sg.create('Task', data)
50 |
51 | Returns::
52 |
53 |     {'content': 'Anm',
54 |      'due_date': '2010-05-12',
55 |      'entity': {'id': 860, 'name': 'bunny_010_0010', 'type': 'Shot'},
56 |      'id': 557,
57 |      'project': {'id': 65, 'name': 'Demo Animation Project', 'type': 'Project'},
58 |      'start_date': '2010-05-06',
59 |      'type': 'Task'}
60 |
61 |
62 | *******************
63 | Create A Dependency
64 | *******************
65 |
66 | Tasks each have an ``upstream_tasks`` field and a ``downstream_tasks`` field. Each field is a
67 | list ``[]`` type and can contain zero, one, or multiple Task entity dictionaries representing the
68 | dependent Tasks.
69 | There are four dependency types from which you can choose: ``finish-to-start-next-day``, ``start-to-finish-next-day``, ``start-to-start``, ``finish-to-finish``.
70 | If no dependency type is provided the default ``finish-to-start-next-day`` will be used.
71 | Here is how to create a dependency between our "Layout" and "Anm" Tasks::
72 |
73 |     # make 'Layout' an upstream Task to 'Anm' (i.e. make 'Anm' dependent on 'Layout') with a finish-to-start-next-day dependency type
74 |     data = {
75 |         'upstream_tasks':[{'type':'Task','id':556, 'dependency_type': 'finish-to-start-next-day'}]
76 |     }
77 |     result = sg.update('Task', 557, data)
78 |
79 | Returns::
80 |
81 |     [{'id': 557,
82 |       'type': 'Task',
83 |       'upstream_tasks': [{'id': 556, 'name': 'Layout', 'type': 'Task'}]}]
84 |
85 | This will also automatically update the ``downstream_tasks`` field on 'Layout' to include the 'Anm' Task.
86 |
87 | ***********************
88 | Query Task Dependencies
89 | ***********************
90 |
91 | Task Dependencies each have a ``dependent_task_id`` and a ``task_id`` field.
92 | They correspond to the ``upstream_task`` and ``downstream_task`` ids of the dependency, respectively.
93 | Here is how to get a TaskDependency using a ``dependent_task_id`` and a ``task_id`` fields:: 94 | 95 | filters = [["dependent_task_id", "is", 72], ["task_id", "is", 75]] 96 | result = sg.find_one('TaskDependency', filters) 97 | 98 | Returns:: 99 | 100 | {'type': 'TaskDependency', 'id': 128} 101 | 102 | **************************** 103 | Updating the Dependency type 104 | **************************** 105 | 106 | When updating the dependency type for the existing dependencies, 107 | update the ``dependency_type`` field of the TaskDependency directly:: 108 | 109 | result = sg.update('TaskDependency', 128, {'dependency_type': 'start-to-start'}) 110 | 111 | Returns:: 112 | 113 | {'dependency_type': 'start-to-start', 'type': 'TaskDependency', 'id': 128} 114 | 115 | ********************************** 116 | Query Tasks with Dependency Fields 117 | ********************************** 118 | 119 | So now lets look at the Tasks we've created and their dependency-related fields:: 120 | 121 | filters = [ 122 | ['entity', 'is', {'type':'Shot', 'id':860}] 123 | ] 124 | fields = [ 125 | 'content', 126 | 'start_date', 127 | 'due_date', 128 | 'upstream_tasks', 129 | 'downstream_tasks', 130 | 'dependency_violation', 131 | 'pinned' 132 | ] 133 | result = sg.find("Task", filters, fields) 134 | 135 | Returns:: 136 | 137 | [{'content': 'Layout', 138 | 'dependency_violation': False, 139 | 'downstream_tasks': [{'type': 'Task', 'name': 'Anm', 'id': 557}], 140 | 'due_date': '2010-05-05', 141 | 'id': 556, 142 | 'pinned': False, 143 | 'start_date': '2010-04-28', 144 | 'type': 'Task', 145 | 'upstream_tasks': []}, 146 | {'content': 'Anm', 147 | 'dependency_violation': False, 148 | 'downstream_tasks': [{'type': 'Task', 'name': 'FX', 'id': 558}], 149 | 'due_date': '2010-05-12', 150 | 'id': 557, 151 | 'pinned': False, 152 | 'start_date': '2010-05-06', 153 | 'type': 'Task', 154 | 'upstream_tasks': [{'type': 'Task', 'name': 'Layout', 'id': 556}]}, 155 | ... 156 | 157 | *Note that we have also created additional Tasks for this Shot but we're going to focus on these 158 | first two for simplicity.* 159 | 160 | ***************************************************************** 161 | Updating the End Date on a Task with Downstream Task Dependencies 162 | ***************************************************************** 163 | 164 | If we update the ``due_date`` field on our "Layout" Task, we'll see that the "Anm" Task dates 165 | will automatically get pushed back to keep the dependency satisfied:: 166 | 167 | result = sg.update('Task', 556, {'due_date': '2010-05-07'}) 168 | 169 | Returns:: 170 | 171 | [{'due_date': '2010-05-07', 'type': 'Task', 'id': 556}] 172 | 173 | Our Tasks now look like this (notice the new dates on the "Anm" Task):: 174 | 175 | [{'content': 'Layout', 176 | 'dependency_violation': False, 177 | 'downstream_tasks': [{'type': 'Task', 'name': 'Anm', 'id': 557}], 178 | 'due_date': '2010-05-07', 179 | 'id': 556, 180 | 'pinned': False, 181 | 'start_date': '2010-04-28', 182 | 'type': 'Task', 183 | 'upstream_tasks': []}, 184 | {'content': 'Anm', 185 | 'dependency_violation': False, 186 | 'downstream_tasks': [{'type': 'Task', 'name': 'FX', 'id': 558}], 187 | 'due_date': '2010-05-14', 188 | 'id': 557, 189 | 'pinned': False, 190 | 'start_date': '2010-05-10', 191 | 'type': 'Task', 192 | 'upstream_tasks': [{'type': 'Task', 'name': 'Layout', 'id': 556}]}, 193 | ... 
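The next sections walk through how a Dependency Violation is created and resolved. If you simply want to audit a Project for Tasks that are currently flagged, the same fields shown above can be used in an ordinary query; a minimal sketch, reusing the Project id 65 from these examples::

    # Find every Task in the Project whose schedule currently violates a dependency.
    filters = [
        ['project', 'is', {'type': 'Project', 'id': 65}],
        ['dependency_violation', 'is', True],
    ]
    fields = ['content', 'entity', 'start_date', 'due_date', 'pinned']
    violating_tasks = sg.find('Task', filters, fields)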
194 | 195 | 196 | ********************************************************** 197 | Creating a Dependency Violation by pushing up a Start Date 198 | ********************************************************** 199 | 200 | Task Dependencies can work nicely if you are pushing out an end date for a Task as it will just 201 | recalculate the dates for all of the dependent Tasks. But what if we push up the Start Date of our 202 | "Anm" Task to start before our "Layout" Task is scheduled to end? 203 | 204 | :: 205 | 206 | result = sg.update('Task', 557, {'start_date': '2010-05-06'}) 207 | 208 | Returns:: 209 | 210 | [{'type': 'Task', 'start_date': '2010-05-06', 'id': 557}] 211 | 212 | Our Tasks now look like this:: 213 | 214 | [{'content': 'Layout', 215 | 'dependency_violation': False, 216 | 'downstream_tasks': [{'type': 'Task', 'name': 'Anm', 'id': 557}], 217 | 'due_date': '2010-05-07', 218 | 'id': 556, 219 | 'pinned': False, 220 | 'start_date': '2010-04-28', 221 | 'type': 'Task', 222 | 'upstream_tasks': []}, 223 | {'content': 'Anm', 224 | 'dependency_violation': True, 225 | 'downstream_tasks': [{'type': 'Task', 'name': 'FX', 'id': 558}], 226 | 'due_date': '2010-05-12', 227 | 'id': 557, 228 | 'pinned': True, 229 | 'start_date': '2010-05-06', 230 | 'type': 'Task', 231 | 'upstream_tasks': [{'type': 'Task', 'name': 'Layout', 'id': 556}]}, 232 | ... 233 | 234 | Because the "Anm" Task ``start_date`` depends on the ``due_date`` of the "Layout" Task, this 235 | change creates a dependency violation. The update succeeds, but Flow Production Tracking has also set the 236 | ``dependency_violation`` field to ``True`` and has also updated the ``pinned`` field to ``True``. 237 | 238 | The ``pinned`` field simply means that if the upstream Task(s) are moved, the "Anm" Task will no 239 | longer get moved with it. The dependency is still there (in ``upstream_tasks``) but the Task is 240 | now "pinned" to those dates until the Dependency Violation is resolved. 241 | 242 | *********************************************************** 243 | Resolving a Dependency Violation by updating the Start Date 244 | *********************************************************** 245 | 246 | We don't want that violation there. Let's revert that change so the Start Date for "Anm" is after 247 | the End Date of "Layout":: 248 | 249 | result = sg.update('Task', 557, {'start_date': '2010-05-10'}) 250 | 251 | Returns:: 252 | 253 | [{'type': 'Task', 'start_date': '2010-05-10', 'id': 557}] 254 | 255 | Our Tasks now look like this:: 256 | 257 | [{'content': 'Layout', 258 | 'dependency_violation': False, 259 | 'downstream_tasks': [{'type': 'Task', 'name': 'Anm', 'id': 557}], 260 | 'due_date': '2010-05-07', 261 | 'id': 556, 262 | 'pinned': False, 263 | 'start_date': '2010-04-28', 264 | 'type': 'Task', 265 | 'upstream_tasks': []}, 266 | {'content': 'Anm', 267 | 'dependency_violation': False, 268 | 'downstream_tasks': [{'type': 'Task', 'name': 'FX', 'id': 558}], 269 | 'due_date': '2010-05-14', 270 | 'id': 557, 271 | 'pinned': True, 272 | 'start_date': '2010-05-10', 273 | 'type': 'Task', 274 | 'upstream_tasks': [{'type': 'Task', 'name': 'Layout', 'id': 556}]}, 275 | ... 276 | 277 | The ``dependency_violation`` field has now been set back to ``False`` since there is no longer 278 | a violation. But notice that the ``pinned`` field is still ``True``. 
We will have to manually 279 | update that if we want the Task to travel with its dependencies again:: 280 | 281 | result = sg.update('Task', 557, {'pinned': False}) 282 | 283 | Returns:: 284 | 285 | [{'pinned': False, 'type': 'Task', 'id': 557}] 286 | 287 | Our Tasks now look like this:: 288 | 289 | [{'content': 'Layout', 290 | 'dependency_violation': False, 291 | 'downstream_tasks': [{'type': 'Task', 'name': 'Anm', 'id': 557}], 292 | 'due_date': '2010-05-07', 293 | 'id': 556, 294 | 'pinned': False, 295 | 'start_date': '2010-04-28', 296 | 'type': 'Task', 297 | 'upstream_tasks': []}, 298 | {'content': 'Anm', 299 | 'dependency_violation': False, 300 | 'downstream_tasks': [{'type': 'Task', 'name': 'FX', 'id': 558}], 301 | 'due_date': '2010-05-14', 302 | 'id': 557, 303 | 'pinned': False, 304 | 'start_date': '2010-05-10', 305 | 'type': 'Task', 306 | 'upstream_tasks': [{'type': 'Task', 'name': 'Layout', 'id': 556}]}, 307 | ... 308 | 309 | Looks great. But that's an annoying manual process. What if we want to just reset a Task so that 310 | it automatically gets updated so that the Start Date is after its dependent Tasks? 311 | 312 | ******************************************************************* 313 | Updating the ``pinned`` field on a Task with a Dependency Violation 314 | ******************************************************************* 315 | 316 | Let's go back a couple of steps to where our "Anm" Task had a Dependency Violation because we had 317 | moved the Start Date up before the "Layout" Task End Date. Remember that the ``pinned`` field 318 | was also ``True``. If we simply update the ``pinned`` field to be ``False``, Flow Production Tracking will also 319 | automatically update the Task dates to satisfy the upstream dependencies and reset the 320 | ``dependency_violation`` value to ``False``:: 321 | 322 | result = sg.update('Task', 557, {'pinned': False}) 323 | 324 | Returns:: 325 | 326 | [{'pinned': False, 'type': 'Task', 'id': 557}] 327 | 328 | 329 | Our Tasks now look like this:: 330 | 331 | [{'content': 'Layout', 332 | 'dependency_violation': False, 333 | 'downstream_tasks': [{'type': 'Task', 'name': 'Anm', 'id': 557}], 334 | 'due_date': '2010-05-07', 335 | 'id': 556, 336 | 'pinned': False, 337 | 'start_date': '2010-04-28', 338 | 'type': 'Task', 339 | 'upstream_tasks': []}, 340 | {'content': 'Anm', 341 | 'dependency_violation': False, 342 | 'downstream_tasks': [{'type': 'Task', 'name': 'FX', 'id': 558}], 343 | 'due_date': '2010-05-14', 344 | 'id': 557, 345 | 'pinned': False, 346 | 'start_date': '2010-05-10', 347 | 'type': 'Task', 348 | 'upstream_tasks': [{'type': 'Task', 'name': 'Layout', 'id': 556}]}, 349 | ... 350 | 351 | 352 | Notice by updating ``pinned`` to ``False``, Flow Production Tracking also updated the ``start_date`` and 353 | ``due_date`` fields of our "Anm" Task so it will satisfy the upstream Task dependencies. And since 354 | that succeeded, the ``dependency_violation`` field has also been set to ``False`` 355 | 356 | ******************************************* 357 | ``dependency_violation`` field is read-only 358 | ******************************************* 359 | 360 | The ``dependency_violation`` field is the only dependency-related field that is read-only. 
Trying 361 | to modify it will generate a Fault:: 362 | 363 | result = sg.update('Task', 557, {'dependency_violation': False}) 364 | 365 | Returns:: 366 | 367 | # -------------------------------------------------------------------------------- 368 | # XMLRPC Fault 103: 369 | # API update() Task.dependency_violation is read only: 370 | # {"value"=>false, "field_name"=>"dependency_violation"} 371 | # -------------------------------------------------------------------------------- 372 | # Traceback (most recent call last): 373 | # ... 374 | --------------------------------------------------------------------------------