├── .gitignore ├── Install.bat ├── LICENSE.txt ├── README.md ├── plug-ins └── zDsonImport.py └── scripts └── dsonimport ├── Qt.py ├── __init__.py ├── auto_rig.py ├── config.py ├── data └── FKHandleShapes.ma ├── dson ├── DSON.py ├── DSONURL.py ├── __init__.py ├── content.py ├── modifiers.py └── test.py ├── dsonimport.py ├── materials ├── __init__.py ├── create_materials.py ├── material_arnold.py ├── material_base.py ├── material_viewport.py ├── materials.py ├── schlick.py └── texture_manager.py ├── maya_helpers.py ├── mayadson.py ├── modifiers_maya.py ├── prefs.py ├── qt ├── import_window.ui ├── modifier_list.ui └── prefs_tab.ui ├── qtpy ├── .gitignore ├── __init__.py └── qt_widgets │ ├── .gitignore │ ├── __init__.py │ └── tree_view.py ├── rig_tools.py ├── rigging.py ├── tcb.py ├── ui.py ├── util.py ├── uvset.py └── wscandir ├── __init__.py ├── _scandir.pyd ├── scandir.py └── source.txt /.gitignore: -------------------------------------------------------------------------------- 1 | *.pyc 2 | .mayaSwatches 3 | *.sqp 4 | -------------------------------------------------------------------------------- /Install.bat: -------------------------------------------------------------------------------- 1 | @echo off 2 | 3 | rem Get the "Documents" directory. 4 | FOR /F "tokens=2* delims= " %%A IN ('REG QUERY "HKCU\Software\Microsoft\Windows\CurrentVersion\Explorer\Shell Folders" /v "Personal"') DO SET Documents=%%B 5 | set OUTPUT=%Documents%\maya\modules\zDsonImport.mod 6 | set INSTALL_DIR=%cd% 7 | 8 | rem Create the global modules directory if it doesn't exist. 9 | if not exist %Documents%\maya\modules mkdir %Documents%\maya\modules 10 | 11 | rem Create the .mod file in the user's Maya modules directory. 
12 | echo + zDsonImport 1.0 %INSTALL_DIR% > "%OUTPUT%" 13 | echo MAYA_CUSTOM_TEMPLATE_PATH +:= scripts/NETemplates >> "%OUTPUT%" 14 | 15 | echo Installed 16 | pause 17 | -------------------------------------------------------------------------------- /LICENSE.txt: -------------------------------------------------------------------------------- 1 | The MIT License (MIT) 2 | 3 | Copyright (c) 2017 Glenn Maynard 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE.
22 | 23 | ***************************************************************************** 24 | 25 | tcb.py: 26 | 27 | The MIT License (MIT) 28 | 29 | Copyright (c) 2015 Bret Battey 30 | 31 | Permission is hereby granted, free of charge, to any person obtaining a copy 32 | of this software and associated documentation files (the "Software"), to deal 33 | in the Software without restriction, including without limitation the rights 34 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 35 | copies of the Software, and to permit persons to whom the Software is 36 | furnished to do so, subject to the following conditions: 37 | 38 | The above copyright notice and this permission notice shall be included in all 39 | copies or substantial portions of the Software. 40 | 41 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 42 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 43 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 44 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 45 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 46 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 47 | SOFTWARE. 48 | 49 | Copyright (c) 1993-2015 Ken Martin, Will Schroeder, Bill Lorensen 50 | All rights reserved. 51 | 52 | Redistribution and use in source and binary forms, with or without 53 | modification, are permitted provided that the following conditions are met: 54 | 55 | * Redistributions of source code must retain the above copyright notice, 56 | this list of conditions and the following disclaimer. 57 | 58 | * Redistributions in binary form must reproduce the above copyright notice, 59 | this list of conditions and the following disclaimer in the documentation 60 | and/or other materials provided with the distribution. 
61 | 62 | * Neither name of Ken Martin, Will Schroeder, or Bill Lorensen nor the names 63 | of any contributors may be used to endorse or promote products derived 64 | from this software without specific prior written permission. 65 | 66 | THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS ``AS IS'' 67 | AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE 68 | IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE 69 | ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHORS OR CONTRIBUTORS BE LIABLE FOR 70 | ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL 71 | DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR 72 | SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER 73 | CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, 74 | OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE 75 | OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 76 | 77 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | zDsonImport 2 | ----------- 3 | 4 | Note 5 | ---- 6 | 7 | I'm releasing this in case somebody finds it useful, but I don't plan to do further 8 | work on it. [DazToMaya](https://www.DAZ3D.com/daz-to-maya) (untested) might be better 9 | maintained and supported. This is only tested with Maya 2018. 10 | 11 | This requires the [zMayaTools](https://github.com/zewt/zMayaTools) plugin for zRigHandle 12 | and zOBBTransform. 13 | 14 | Introduction 15 | ------------ 16 | 17 | zDsonImport imports DSON files from DAZ3D into Maya. 
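The importer is driven entirely from menu items, but those items (see plug-ins/zDsonImport.py in this repo) are thin wrappers that lazily import entry points under scripts/dsonimport. A minimal sketch of that mapping, for driving the same steps from Maya's script editor; the `run_menu_command` helper is illustrative and not part of the codebase, and actually calling the entry points requires a running Maya session:

```python
import importlib

# Menu label -> (module, entry point), as wired up in plug-ins/zDsonImport.py.
# 'Apply rigging' and 'Apply HumanIK' both call auto_rig.go(), with
# humanik=False and humanik=True respectively.
MENU_COMMANDS = {
    'Import DUF':      ('dsonimport.ui', 'go'),
    'Update library':  ('dsonimport.ui', 'refresh_cache'),
    'Apply materials': ('dsonimport.materials.create_materials', 'go'),
    'Apply rigging':   ('dsonimport.auto_rig', 'go'),
    'Apply HumanIK':   ('dsonimport.auto_rig', 'go'),
}

def run_menu_command(label, **kwargs):
    # Import the target module on demand, the same way the plugin's menu
    # handlers defer imports until a menu item is actually used.
    module_name, func_name = MENU_COMMANDS[label]
    module = importlib.import_module(module_name)
    return getattr(module, func_name)(**kwargs)

# Example (inside Maya): run_menu_command('Apply rigging', humanik=False)
```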
18 | 19 | Some features: 20 | 21 | - Mesh skinning 22 | - Corrective blend shapes 23 | - Rigged facial expressions 24 | - Automatic rigging 25 | - HumanIK rigging 26 | - Character modifiers 27 | - Cloth/prop fitting 28 | - Rigidity groups 29 | - Grafts 30 | - Arnold materials 31 | - Multiple UV sets 32 | 33 | Installation 34 | ------------ 35 | 36 | Run Install.bat, and then enable the zDsonImport plugin in Maya's Plug-In Manager. 37 | This will add menu items to the File menu. 38 | 39 | Basic usage 40 | ----------- 41 | 42 | All menu items are in **File > zDsonImport**. First, select **Update library**. This must be 43 | done once before the first import, and again whenever new assets are added to your DAZ3D 44 | library. 45 | 46 | Create a character (or import other assets) normally in DAZ3D, and save the result to 47 | a .DUF file. If you're importing a character, be sure that the DUF file contains only 48 | a single character. 49 | 50 | Run **Import DUF**, and select your .DUF file from the file open dialog. Modifiers 51 | required by the scene will be automatically selected. You usually also want corrective 52 | blend shapes, so click the checkbox next to "Base Correctives". Click OK to import the 53 | character. 54 | 55 | Run **Apply materials** to create materials. 56 | 57 | Finally, apply either custom rigging with **Apply rigging**, or a HumanIK rig with **Apply HumanIK**. 58 | 59 | Tiled textures 60 | -------------- 61 | 62 | If textures don't load in the viewport, they may need to be loaded manually. Open 63 | **Renderer > Viewport 2.0 > Options box** and click **Regenerate All UV Tile Preview Textures**. 64 | 65 | Favorite modifiers 66 | ------------------ 67 | 68 | Facial expressions and other modifiers can be imported. It's tedious to select these every 69 | time on import. Instead, mark any modifiers that you want to import as favorites in DAZ3D.
This will be saved to the .DUF file, and all favorite modifiers will be selected for import 71 | automatically when you open the file. 72 | 73 | Materials 74 | --------- 75 | 76 | Viewport and Arnold materials are supported, and both will be created when materials 77 | are created. This gives a clean appearance in the viewport, and a best-effort attempt 78 | at replicating the original materials in Arnold renders. Note that the renderers 79 | and materials in DAZ3D are different from Arnold, so this is only an approximation. 80 | 81 | Textures will point directly to the original copy in the DAZ3D library. In some cases 82 | a converted copy of textures will be made, to deal with UV tiled textures. Textures will 83 | load with absolute paths, which can be adjusted using the Maya File Path Editor if necessary. 84 | 85 | DSON files can use a lot of different basic material types. Only a few of the most 86 | common ones are supported. 87 | 88 | Limitations 89 | ----------- 90 | 91 | General limitations: 92 | 93 | - Characters can only be exported. If you want to make changes, like adding another 94 | clothing option, re-export the asset. There's no way to export accessories as a separate 95 | referenced scene file. 96 | - DSON corrective pose formulas are based on Euler rotation angles. This is a 97 | simplistic way to do pose reading, and doesn't work well when joints are controlled 98 | by quaternion-based constraints. A best effort is made to deal with this. 99 | - Imported meshes sometimes have unused components, which result in warnings when loading 100 | the scene. 101 | 102 | There are a lot of unsupported DSON features. A partial list: 103 | 104 | - "HD" modifiers. These use an opaque file format that I haven't tried to parse. They're 105 | probably just displacement maps. 106 | - Rigid follow nodes. 107 | - Meshes with shared geometry can be instanced, but we won't assign separate materials to 108 | each instance.
If your scene has instances of the same geometry with different materials, 109 | turn off instancing. 110 | - A scene can have a node in node_library with "source" pointing at a library asset, and 111 | it'll inherit modifiers for that asset. This seems like inheritance for assets. Assets 112 | are complicated enough as is, and this is rarely used. 113 | - Conformed meshes seem to have their skinning adjusted or smoothed when they're fit to a mesh. 114 | - Smoothing with base mesh collision detection. This is helpful for posing, but would 115 | probably not work for animation. 116 | - Only X, Y and Z scale is supported, not "overall" scale. 117 | - Dynamic properties which affect the skeleton are unsupported, since this would complicate 118 | the rigging significantly. 119 | - Geometry shells. These are sometimes used for layering things like tattoos over the skin 120 | of a character. This is rarely used, and not a good approach for layering (this should be 121 | done with layered textures or layered materials, not with actual geometry). 122 | 123 | Cloth fitting 124 | ------------- 125 | 126 | DAZ3D clothing is modelled against the base character, and a fitting algorithm is used to 127 | fit (conform) it to characters with different shapes. We handle this by using a wrap deformer. 128 | DAZ3D does this dynamically, but to give a simpler final scene, we store the result of the wrap 129 | as a blend shape and then delete the wrap, since wrap deformers can be very slow, especially 130 | with dense meshes like mesh hair. 131 | 132 | In addition to conforming clothing to the character, a blend shape is created for each corrective 133 | blend shape on the character that affects the prop. For example, if an elbow corrective shape 134 | exists, the sleeves of a shirt will have a matching blend shape created, which will activate when 135 | the body corrective shape activates. 136 | 137 | CVWrap 138 | ------ 139 | 140 | Clothing props are applied using a wrap deformer.
This is only used at import time, and 141 | the deformer is baked and removed so these deformers don't slow down the scene. Maya's wrap 142 | deformer gives good results, but is very slow, especially with very dense meshes such as 143 | hair. It can also fail silently and corrupt the mesh if it runs out of memory. 144 | 145 | If cvWrap is installed, it can be used instead of Maya's wrap deformer. It's much faster 146 | and doesn't fail silently. The results aren't always quite as good, and it can be disabled 147 | in the options tab. 148 | 149 | https://github.com/chadmv/cvwrap 150 | 151 | Auto-rigging 152 | ============ 153 | 154 | Characters can either be auto-rigged directly, or HumanIK can be applied. 155 | 156 | If facial expressions or other modifiers are imported, they can be controlled with 157 | nodes inside "Controls". Once a rig is attached, facial handles will be visible in the 158 | viewport, which can be selected for quick access. 159 | 160 | An eye control is created. This can be rotated, which is usually the most convenient way 161 | to manipulate it. It can also be translated, or parented to something the eyes should be 162 | following. By default, the eyes are pointing straight forward and rotate together. 163 | "Eyes Focused" can be used to pull the eyes together or push them apart. "Soft Eyes" 164 | can be used to adjust how much eye motion affects the eyelids. 165 | 166 | HumanIK 167 | ------- 168 | 169 | If a HumanIK rig is applied, the character can be animated normally with HIK, or 170 | motion capture data can be retargeted using HIK retargeting. 171 | 172 | Direct rigging 173 | -------------- 174 | 175 | A custom direct rig can be applied as an alternative to HumanIK. 176 | 177 | The arms and legs can be in IK or FK mode. To switch modes, select an IK or 178 | FK handle and adjust the "IK/FK" attribute. This can be animated to transition 179 | between IK and FK.
When IK/FK is 1 (IK only), only the IK handle and pole vector 180 | control will be visible. When it's 0 (FK only), only the FK controls will be visible. 181 | This way, only the controls that are active are visible. 182 | 183 | **Match IK -> FK** sets the FK position for a limb to its IK position. Select 184 | any control on a limb first. 185 | 186 | **Match FK -> IK** sets the IK position to its FK position. Since not every FK 187 | pose can be exactly matched with IK, two algorithms are available. In "distance" mode, we try 188 | to match the elbow and hand position as closely as possible. In "angle" mode, we try 189 | to match their angles. 190 | 191 | Each control has a list of coordinate spaces it can be in. For example, by default 192 | the head control is in the "chestUpper" coordinate space, so it follows the movements of 193 | the upper chest. Changing this to WorldSpace will cause the head to stay in place as the 194 | body moves. 195 | 196 | To cleanly change coordinate spaces, select the coordinate space to switch to in 197 | "Change Coordinate Space" and select "Switch coordinate space". "Coordinate Space" 198 | will switch to the new coordinate space, and the rotation (and position for IK 199 | controls) will be updated to keep the control in the same place. 200 | 201 | "CustomSpace1" and "CustomSpace2" use the position of the transforms in ExtraCoordinateSpaces 202 | as a coordinate space. These controls can be parent constrained to other objects. 203 | 204 | Advanced 205 | ======== 206 | 207 | Options 208 | ------- 209 | 210 | The options panel can generally be left at its defaults. Options include: 211 | 212 | - Straighten pose 213 | 214 | DAZ3D characters are in a relaxed T-pose, which is convenient for modelling, but not 215 | what you want when you're rigging a character. Straighten Pose will adjust the default 216 | pose to a strict T-pose. This is required for auto-rigging.
217 | 218 | - Hide face rig 219 | 220 | Hide joints associated with the face. These are normally controlled with pose modifiers, 221 | and since there are a huge number of joints in the face it's very busy in the viewport. 222 | 223 | - Create end joints 224 | 225 | DAZ3D character skeletons define bones, with a starting point and a length, rather than defining 226 | joints. "Create end joints" will create joints at the end of bones based on their length. This 227 | should be left enabled. 228 | 229 | - Conform joints 230 | 231 | Enable conforming the joints of meshes to their target. 232 | 233 | - Conform meshes 234 | 235 | Enable conforming meshes to their target, such as conformed clothing. 236 | 237 | - Geometry 238 | 239 | Import geometry. 240 | 241 | - Materials 242 | 243 | Create base materials and import UVs. These materials are later replaced by the material 244 | import. 245 | 246 | - Morphs 247 | 248 | Import morphs (blend shapes). 249 | 250 | - Skinning 251 | 252 | Import mesh skinning. 253 | 254 | - Modifiers 255 | 256 | Import modifier formulas. This is the rigging that controls dynamic effects, especially 257 | corrective blend shapes. 258 | 259 | - Grafts 260 | 261 | Apply grafts. 262 | 263 | - Hide internal controls in outliner 264 | 265 | A lot of internal helper controls are created, which can clutter the outliner. Mark these 266 | as "hidden in outliner". 267 | 268 | - Bake static morphs 269 | 270 | Bake morphs (blend shapes) in their configured pose if they're set as static rather than dynamic. 271 | For example, this applies the main character blend shape to the mesh directly. 272 | 273 | - Use cvWrap instead of Maya wrap if available 274 | 275 | If cvWrap is installed, use it for mesh conforming rather than builtin Maya wrap. cvWrap generally 276 | doesn't give quite as good results, but it's much faster, and Maya wrap tends to fail silently 277 | if it runs out of memory, which gives corrupted meshes. 
278 | 279 | - Mesh splitting 280 | 281 | DAZ3D meshes tend to be one big complex mesh with a lot of materials. In Maya it's generally preferable 282 | to split meshes. For example, since the eyes contain transparent elements, splitting them into a 283 | separate mesh lets Maya know that the entire mesh doesn't need to be rendered in the viewport as 284 | a transparent object, which improves the viewport significantly. However, splitting incorrectly, such 285 | as splitting arms from the body, can result in lighting seams at the boundaries. 286 | 287 | "Smart split" attempts to split meshes correctly, splitting body parts other than skin (eg. eyes 288 | and tongue) into their own meshes, leaving the body itself as one mesh. Props (clothing) will be 289 | split by material. 290 | 291 | "Don't split props" is like "smart split", but props will be left alone. 292 | 293 | "Don't split meshes" imports meshes as-is. 294 | 295 | Modifier dependencies 296 | --------------------- 297 | 298 | Modifiers can depend on other modifiers. If a modifier is required by another modifier, 299 | it will be selected and can't be disabled. To see the modifiers a modifier depends on, 300 | or is depended on by, right click it in the list and see "Requirements" and "Required by". 301 | If a modifier is checked in dark grey, that means it's enabled as a requirement, and will 302 | go away if it's no longer needed. It can still be checked, turning white, which will 303 | cause it to stay enabled even if it's no longer required, like any other modifier. 304 | 305 | Static vs Dynamic modifiers 306 | --------------------------- 307 | 308 | Modifiers can either be applied statically or dynamically. Dynamic modifiers are rigged 309 | according to their DSON formulas, and static modifiers are permanently baked into the 310 | character at import time. 311 | 312 | To select whether a modifier is dynamic, check or uncheck the "D" column in the modifier 313 | list. 
The default will be automatically guessed based on the type of modifier, so this 314 | usually doesn't need to be changed. 315 | 316 | For example, corrective blend shapes are always dynamic. An arm bend corrective is connected 317 | to the elbow angle. 318 | 319 | Modifiers that change the size of the character can only be applied statically. These 320 | affect the shape of the skeleton. Changing the base skeleton dynamically isn't supported, 321 | since it would complicate the resulting scene too much and isn't useful most of the time. 322 | For example, major character modifiers generally must be applied statically. 323 | 324 | Caching 325 | ------- 326 | 327 | DAZ3D caches its library using a Postgresql database, which is needed to find and load 328 | assets quickly. zDsonImport also caches data about the library, which is stored as 329 | a simple JSON file in **Documents/maya/dsonimport**. This isn't as fast, but it's simple 330 | and doesn't have any external dependencies. 331 | 332 | Asset references 333 | ---------------- 334 | 335 | Meshes and other resources imported into Maya have their DSON URLs and other data stored 336 | as properties. This allows scripts to find the source file for assets. For example, the 337 | material importer uses this so it can load materials. Note that these contain absolute 338 | paths to the imported DUF file, so they won't work if the file is moved. 339 | 340 | Implementation notes 341 | -------------------- 342 | 343 | DSON files use a URL-like scheme to refer to things like other assets and formulas. 344 | This isn't correctly documented anywhere, and resolving these is fairly complex. 345 | The strings also aren't actually URLs: they can't be correctly parsed with a regular URL 346 | parser. dson/DSONURL.py parses these, and dson/DSON.py handles resolving them against 347 | each other. 348 | 349 | Applying DSON modifiers with Maya nodes results in a huge number of nodes. 
It can create 350 | so many nodes that graphing it in the node editor fails. Using expressions, or having a 351 | helper plugin to reduce these expressions to one node might have given more manageable 352 | results. This doesn't matter if you're just using the output, but trying to make changes 353 | to the scene isn't much fun. 354 | 355 | DAZ3D's transforms are very different from Maya's: transforms are in world space rather 356 | than object space, which complicates applying formulas. This is handled by MayaWorldSpaceTransformProperty. 357 | 358 | Some rigged controls won't do anything. For example, tongue controls are replaced with 359 | a normal set of viewport handles, but the original sliders aren't removed and simply won't 360 | do anything. 361 | 362 | -------------------------------------------------------------------------------- /plug-ins/zDsonImport.py: -------------------------------------------------------------------------------- 1 | import inspect, os, sys 2 | 3 | from maya import OpenMaya as om 4 | import pymel.core as pm 5 | from pprint import pprint 6 | 7 | from dsonimport import maya_helpers 8 | 9 | gMainFileMenu = pm.language.melGlobals['gMainFileMenu'] 10 | 11 | # Import only when a menu item is actually used, so we don't spend time during Maya 12 | # load importing scripts that aren't being used.
13 | def do_import(unused): 14 | from dsonimport import ui 15 | ui.go() 16 | 17 | def do_update_library(unused): 18 | from dsonimport import ui 19 | ui.refresh_cache() 20 | 21 | def do_apply_materials(unused): 22 | from dsonimport.materials import create_materials 23 | create_materials.go() 24 | 25 | def do_apply_rigging(unused): 26 | from dsonimport import auto_rig 27 | auto_rig.go(humanik=False) 28 | 29 | def do_apply_humanik(unused): 30 | from dsonimport import auto_rig 31 | auto_rig.go(humanik=True) 32 | 33 | def do_ik_fk(unused): 34 | from dsonimport import rig_tools 35 | rig_tools.ik_to_fk() 36 | def do_fk_ik_angle(unused): 37 | from dsonimport import rig_tools 38 | rig_tools.fk_to_ik('angle') 39 | def do_fk_ik_distance(unused): 40 | from dsonimport import rig_tools 41 | rig_tools.fk_to_ik('distance') 42 | def do_coordinate_space(unused): 43 | from dsonimport import rig_tools 44 | rig_tools.switch_selected_coordinate_spaces() 45 | 46 | class PluginMenu(object): 47 | def add_menu_items(self): 48 | # Work around Python forgetting to define __file__ for files run with execfile(). 49 | filename = os.path.abspath(inspect.getsourcefile(lambda: 0)) 50 | installation_path = os.path.dirname(filename) 51 | 52 | # Add this directory to the Python path. 53 | if installation_path not in sys.path: 54 | sys.path.append(installation_path) 55 | 56 | # Make sure the file menu is built. 57 | pm.mel.eval('buildFileMenu') 58 | 59 | pm.setParent(gMainFileMenu, menu=True) 60 | 61 | # In case this has already been created, remove the old one. Maya is a little silly 62 | # here and throws an error if it doesn't exist, so just ignore that if it happens. 63 | try: 64 | pm.deleteUI('zDsonImport_Top', menuItem=True) 65 | except RuntimeError: 66 | pass 67 | 68 | # Add menu items.
69 | pm.menuItem('zDsonImport_Top', label='zDsonImport', subMenu=True, tearOff=True, insertAfter='exportActiveFileOptions') 70 | pm.menuItem('zDsonImport_Import', label='Import DUF', command=do_import) 71 | pm.menuItem('zDsonImport_UpdateLibrary', label='Update library', command=do_update_library) 72 | pm.menuItem('zDsonImport_Materials', label='Apply materials', command=do_apply_materials) 73 | pm.menuItem('zDsonImport_Rigging', label='Apply rigging', command=do_apply_rigging) 74 | pm.menuItem('zDsonImport_HIK', label='Apply HumanIK', command=do_apply_humanik) 75 | 76 | pm.menuItem(divider=True, label='Rig tools') 77 | pm.menuItem('zDsonImport_IK_FK', label='Match IK -> FK', command=do_ik_fk, 78 | annotation='Match IK to FK for selected rig controls.') 79 | pm.menuItem('zDsonImport_FK_IK1', label='Match FK -> IK (match angle)', command=do_fk_ik_angle, 80 | annotation='Match FK to IK for selected rig controls.') 81 | pm.menuItem('zDsonImport_FK_IK2', label='Match FK -> IK (match distance)', command=do_fk_ik_distance, 82 | annotation='Match FK to IK for selected rig controls.') 83 | pm.menuItem('zDsonImport_CoordinateSpace', label='Switch coordinate space', command=do_coordinate_space, 84 | annotation='Switch the coordinate space for the selected rig controls.') 85 | pm.setParent("..", menu=True) 86 | 87 | def remove_menu_items(self): 88 | try: 89 | pm.deleteUI('zDsonImport_Top', menuItem=True) 90 | except RuntimeError: 91 | pass 92 | 93 | menu = PluginMenu() 94 | def initializePlugin(mobject): 95 | maya_helpers.setup_logging() 96 | menu.add_menu_items() 97 | 98 | def uninitializePlugin(mobject): 99 | menu.remove_menu_items() 100 | 101 | -------------------------------------------------------------------------------- /scripts/dsonimport/Qt.py: -------------------------------------------------------------------------------- 1 | # QT5 moved a bunch of stuff around, moving some things from QtGui into QtWidgets. 
2 | # This means that in order to make their API slightly "prettier", QT created a ton 3 | # of useless busywork for thousands of developers, and makes compatibility with Qt4 4 | # and Qt5 painful. I couldn't care less about which module these are in, so work 5 | # around this by flattening everything into one module. 6 | from maya.OpenMaya import MGlobal 7 | if MGlobal.apiVersion() >= 201700: 8 | from PySide2.QtCore import * 9 | from PySide2.QtGui import * 10 | from PySide2.QtWidgets import * 11 | from shiboken2 import wrapInstance 12 | import pyside2uic as pysideuic 13 | else: 14 | from PySide.QtCore import * 15 | from PySide.QtGui import * 16 | from shiboken import wrapInstance 17 | import pysideuic as pysideuic 18 | 19 | -------------------------------------------------------------------------------- /scripts/dsonimport/__init__.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/zewt/zDsonImport/ae9beb87e09a6366e5f1b07a598478f20fa101ae/scripts/dsonimport/__init__.py -------------------------------------------------------------------------------- /scripts/dsonimport/config.py: -------------------------------------------------------------------------------- 1 | global _config 2 | 3 | def get(key, default=None): 4 | return _config.get(key, default) 5 | 6 | def set(config): 7 | global _config 8 | _config = config 9 | 10 | _default_config = { 11 | } 12 | set(_default_config) 13 | -------------------------------------------------------------------------------- /scripts/dsonimport/dson/DSONURL.py: -------------------------------------------------------------------------------- 1 | # A hack of urlparse to parse DSON "URLs", which are actually not valid URLs at 2 | # all. They can have invalid characters in the scheme, and the fragment and query 3 | # are backwards. 
4 | from collections import namedtuple 5 | import urllib 6 | 7 | __all__ = ["DSONURL"] 8 | 9 | # Characters valid in scheme names 10 | _scheme_chars = ('abcdefghijklmnopqrstuvwxyz' 11 | 'ABCDEFGHIJKLMNOPQRSTUVWXYZ' 12 | '0123456789' 13 | '+-._%/') 14 | 15 | _cache = {} 16 | class DSONURL(object): 17 | """ 18 | >>> url = DSONURL('scheme:path#fragment?query') 19 | >>> str(url) 20 | 'scheme:path#fragment?query' 21 | >>> url.scheme = 'test' 22 | >>> str(url) 23 | 'test:path#fragment?query' 24 | 25 | >>> url = DSONURL('scheme:path#fragment?query') 26 | >>> url.scheme = 'test%' 27 | >>> url.escaped_scheme 28 | 'test%25' 29 | >>> url.escaped_scheme = '%70 test' 30 | >>> url.scheme 31 | 'p test' 32 | >>> url.escaped_scheme 33 | '%70 test' 34 | >>> str(url) 35 | '%70 test:path#fragment?query' 36 | 37 | # Escaping is preserved for each part. 38 | >>> url = DSONURL('scheme:path#fragment?query') 39 | >>> url.path = 'test%' 40 | >>> url.escaped_path 41 | 'test%25' 42 | >>> url.escaped_path = '%70 test' 43 | >>> url.path 44 | 'p test' 45 | >>> url.escaped_path 46 | '%70 test' 47 | >>> str(url) 48 | 'scheme:%70 test#fragment?query' 49 | 50 | >>> url = DSONURL('scheme:path#fragment?query') 51 | >>> url.fragment = 'test%' 52 | >>> url.escaped_fragment 53 | 'test%25' 54 | >>> url.escaped_fragment = '%70 test' 55 | >>> url.fragment 56 | 'p test' 57 | >>> url.escaped_fragment 58 | '%70 test' 59 | >>> str(url) 60 | 'scheme:path#%70 test?query' 61 | 62 | >>> url = DSONURL('scheme:path#fragment?query') 63 | >>> url.query = 'test%' 64 | >>> url.escaped_query 65 | 'test%25' 66 | >>> url.escaped_query = '%70 test' 67 | >>> url.query 68 | 'p test' 69 | >>> url.escaped_query 70 | '%70 test' 71 | >>> str(url) 72 | 'scheme:path#fragment?%70 test' 73 | 74 | >>> url = DSONURL('sch%20eme:pa%20th#frag%20ment?que%20ry') 75 | >>> str(url) 76 | 'sch%20eme:pa%20th#frag%20ment?que%20ry' 77 | >>> url.scheme 78 | 'sch eme' 79 | >>> url.path 80 | 'pa th' 81 | >>> url.fragment 82 | 'frag ment' 83 | >>> 
url.query 84 | 'que ry' 85 | >>> DSONURL('/path') == DSONURL('/path') 86 | True 87 | 88 | # Escaping is preserved, so these URLs aren't the same. 89 | >>> DSONURL('/path') == DSONURL('/%70ath') 90 | False 91 | >>> hash(DSONURL('/path')) == hash(DSONURL('/%70ath')) 92 | False 93 | """ 94 | def __init__(self, url): 95 | self._cached_unquoted_scheme = None 96 | self._cached_unquoted_path = None 97 | self._cached_unquoted_fragment = None 98 | self._cached_unquoted_query = None 99 | 100 | # We don't expire cache. 101 | key = url 102 | cached = _cache.get(key) 103 | if cached is not None: 104 | self._scheme, self._path, self._fragment, self._query = cached 105 | return 106 | 107 | 108 | self._fragment = self._query = self._scheme = '' 109 | i = url.find(':') 110 | if i > 0: 111 | for c in url[:i]: 112 | if c not in _scheme_chars: 113 | break 114 | else: 115 | rest = url[i+1:] 116 | self._scheme, url = url[:i], rest 117 | self._cached_unquoted_scheme = urllib.unquote(self._scheme) 118 | 119 | 120 | if '?' in url: 121 | url, self._query = url.split('?', 1) 122 | 123 | if '#' in url: 124 | url, self._fragment = url.split('#', 1) 125 | 126 | self._path = url 127 | _cache[key] = self._scheme, self._path, self._fragment, self._query 128 | 129 | def __repr__(self): 130 | url = self._path 131 | if self.scheme: 132 | url = self._scheme + ':' + url 133 | if self._fragment: 134 | url = url + '#' + self._fragment 135 | if self._query: 136 | url = url + '?'
+ self._query 137 | return url 138 | 139 | @property 140 | def scheme(self): 141 | if self._cached_unquoted_scheme is None: 142 | self._cached_unquoted_scheme = urllib.unquote(self._scheme) 143 | 144 | return self._cached_unquoted_scheme 145 | 146 | @scheme.setter 147 | def scheme(self, value): 148 | self._cached_unquoted_scheme = value 149 | self._scheme = urllib.quote(value) 150 | 151 | @property 152 | def escaped_scheme(self): 153 | return self._scheme 154 | 155 | @escaped_scheme.setter 156 | def escaped_scheme(self, value): 157 | self._scheme = value 158 | self._cached_unquoted_scheme = None 159 | 160 | @property 161 | def path(self): 162 | if self._cached_unquoted_path is None: 163 | self._cached_unquoted_path = urllib.unquote(self._path) 164 | 165 | return self._cached_unquoted_path 166 | 167 | @path.setter 168 | def path(self, value): 169 | self._cached_unquoted_path = value 170 | self._path = urllib.quote(value) 171 | 172 | @property 173 | def escaped_path(self): 174 | return self._path 175 | 176 | @escaped_path.setter 177 | def escaped_path(self, value): 178 | self._cached_unquoted_path = None 179 | self._path = value 180 | 181 | @property 182 | def fragment(self): 183 | if self._cached_unquoted_fragment is None: 184 | self._cached_unquoted_fragment = urllib.unquote(self._fragment) 185 | 186 | return self._cached_unquoted_fragment 187 | 188 | @fragment.setter 189 | def fragment(self, value): 190 | self._fragment = urllib.quote(value) 191 | self._cached_unquoted_fragment = value 192 | 193 | @property 194 | def escaped_fragment(self): 195 | return self._fragment 196 | 197 | @escaped_fragment.setter 198 | def escaped_fragment(self, value): 199 | self._fragment = value 200 | self._cached_unquoted_fragment = None 201 | 202 | @property 203 | def query(self): 204 | if self._cached_unquoted_query is None: 205 | self._cached_unquoted_query = urllib.unquote(self._query) 206 | 207 | return self._cached_unquoted_query 208 | 209 | @query.setter 210 | def query(self, 
value): 211 | self._query = urllib.quote(value) 212 | self._cached_unquoted_query = value 213 | 214 | @property 215 | def escaped_query(self): 216 | return self._query 217 | 218 | @escaped_query.setter 219 | def escaped_query(self, value): 220 | self._query = value 221 | self._cached_unquoted_query = None 222 | 223 | # We're mutable but also hashable. Don't modify a DSONURL if it's in a set 224 | # or the key of a dictionary. 225 | def __hash__(self): 226 | return hash((self._scheme, self._path, self._fragment, self._query)) 227 | 228 | def __eq__(self, other): 229 | return isinstance(other, DSONURL) and self._scheme == other._scheme and self._path == other._path and \ 230 | self._fragment == other._fragment and self._query == other._query 231 | 232 | 233 | if __name__ == "__main__": 234 | import doctest 235 | doctest.testmod() 236 | 237 | -------------------------------------------------------------------------------- /scripts/dsonimport/dson/__init__.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/zewt/zDsonImport/ae9beb87e09a6366e5f1b07a598478f20fa101ae/scripts/dsonimport/dson/__init__.py -------------------------------------------------------------------------------- /scripts/dsonimport/dson/test.py: -------------------------------------------------------------------------------- 1 | import unittest 2 | import DSON 3 | 4 | import logging 5 | logging.basicConfig() 6 | logging.getLogger().setLevel('DEBUG') 7 | 8 | def get_one(entries): 9 | entries = list(entries) 10 | assert len(entries) == 1, entries 11 | return entries[0] 12 | 13 | 14 | class Tests(unittest.TestCase): 15 | def test_basic(self): 16 | # Quick tests that don't require loading assets. 17 | env = DSON.DSONEnv() 18 | 19 | # These are both root nodes of the scene. 
20 | self.assertTrue(env.scene.is_root) 21 | self.assertTrue(env.library.is_root) 22 | 23 | def test_foo(self): 24 | env = DSON.DSONEnv() 25 | env.get_or_load_file('/t.duf') 26 | 27 | # Test referencing array elements. This is an extension. 28 | figure = env.scene.get_url('#Genesis3Female-1') 29 | joints = figure.get_url('#SkinBinding?skin/joints') 30 | for idx in xrange(0, 5): 31 | joint = figure.get_url('#SkinBinding?skin/joints/~%i' % idx) 32 | assert joint.value is joints.value[idx] 33 | 34 | # Test iterating over children. This will create DSONProperty objects with URLs using the indexing 35 | # extension. 36 | for idx, joint in enumerate(joints.array_children): 37 | same_joint = joint.node.get_property(joint.path) 38 | assert joint.value is same_joint.value 39 | if idx > 5: 40 | break 41 | 42 | # asset_file = env.get_or_load_file('data/DAZ 3D/Genesis 3/Female/Morphs/DAZ 3D/Base Correctives/pJCMHandDwn_70_L.dsf') 43 | # modifier = env.scene.get_url('#pJCMHandDwn_70_L') 44 | # env.scene.get_url('/data/DAZ%203D/Genesis%203/Female/UV%20Sets/DAZ%203D/Base/Base%20Female.dsf#SkinBinding') 45 | # self.assertIsNotNone(modifier.get_url('rThumb3:')) 46 | 47 | def test_loading_uninstanced_assets(self): 48 | env = DSON.DSONEnv() 49 | 50 | # This file doesn't override #eCTRLMouthSmileOpen. Until we load the modifier asset, it won't 51 | # have this modifier at all. 52 | env.get_or_load_file('/t.duf') 53 | figure = env.scene.get_url('Genesis3Female:') 54 | 55 | # The modifier isn't loaded yet. 56 | assert sum(1 for c in figure.children if c.node_id == 'eCTRLMouthSmileOpen') == 0 57 | 58 | # Load the modifier. This will add the modifier asset under the Genesis3Female asset, and 59 | # the instance created in t.duf will inherit it. 60 | asset_file = env.get_or_load_file('/data/DAZ 3D/Genesis 3/Female/Morphs/DAZ 3D/Base Pose/eCTRLMouthSmileOpen.dsf') 61 | 62 | modifier_asset = env.library.get_url('#eCTRLMouthSmileOpen') 63 | 64 | # This modifier is an asset, not an instance. 
65 | self.assertFalse(modifier_asset.is_instanced) 66 | 67 | # Children of assets don't automatically show up in instances. They need to be 68 | # instanced explicitly. 69 | assert not any(c for c in figure.children if c.node_id == 'eCTRLMouthSmileOpen') 70 | 71 | # Explicitly instance the asset. 72 | DSON.Helpers.recursively_instance_assets(env.scene) 73 | 74 | # Now that we've instanced the modifier, we'll see an instance of it in the figure. 75 | modifier = get_one(c for c in figure.children if c.node_id == 'eCTRLMouthSmileOpen-1') 76 | 77 | # The instanced asset will be registered in the file of the figure, not the asset. 78 | assert modifier.dson_file is figure.dson_file 79 | 80 | # This modifier is an instance. 81 | self.assertTrue(modifier.is_instanced) 82 | 83 | self.assertEqual(modifier.node_type, 'modifier') 84 | 85 | # Test searching for URLs referenced by the modifier. 86 | assert modifier.get_url('lNasolabialMouthCorner:/data/DAZ%203D/Genesis%203/Female/Genesis3Female.dsf#lNasolabialMouthCorner?translation/x') 87 | assert modifier.get_url('Genesis3Female:#eCTRLMouthSmileOpen-1?value') 88 | 89 | # The modifier can be found by ID. 90 | result = figure.get_url('#eCTRLMouthSmileOpen-1') 91 | assert result is not None 92 | 93 | # The asset of the instance is the original asset. 94 | self.assertIs(result.asset, modifier_asset) 95 | 96 | # We can read the value of the modifier on the figure. 97 | result = figure.get_url('Genesis3Female:#eCTRLMouthSmileOpen-1?value') 98 | assert result is not None 99 | 100 | # Since this property is from an instance, instance is the same as the node. 101 | result = modifier.get_property('id') 102 | self.assertIs(result.instance, result.node) 103 | 104 | def test_loading_instanced_assets(self): 105 | env = DSON.DSONEnv() 106 | 107 | # This file does override #eCTRLMouthSmileOpen. It'll load the modifier automatically. 
108 | scene_file = env.get_or_load_file('/t2.duf') 109 | figure = env.scene.get_url('Genesis3Female:') 110 | 111 | # We should see one instance of the modifier, and it should be from the scene file, not 112 | # the one inherited from the asset. 113 | modifier = get_one(c for c in figure.children if c.node_id == 'eCTRLMouthSmileOpen') 114 | assert modifier.dson_file == scene_file 115 | 116 | # The type of this modifier is inherited from the asset, which is inherited from 117 | # the asset's asset_info. 118 | self.assertEqual(modifier.node_type, 'modifier') 119 | 120 | # This file's already loaded, so loading it again won't do anything. 121 | asset_file = env.get_or_load_file('/data/DAZ 3D/Genesis 3/Female/Morphs/DAZ 3D/Base Pose/eCTRLMouthSmileOpen.dsf') 122 | modifier = get_one(c for c in figure.children if c.node_id == 'eCTRLMouthSmileOpen') 123 | assert modifier.dson_file == scene_file 124 | 125 | # This modifier is an instance. 126 | self.assertTrue(modifier.is_instanced) 127 | 128 | 129 | 130 | 131 | # Instanced assets are tricky. When you look up a property on an instanced modifier and the 132 | # instance isn't overriding it, you get the property on the underlying asset. If this is a URL, 133 | # that URL is resolved relative to the asset, not the instance. 134 | formulas = modifier['formulas'] 135 | self.assertIs(formulas.__class__, DSON.DSONProperty) 136 | self.assertIs(formulas.node, modifier.asset) 137 | 138 | # When you look up a property on an instance that isn't overridden, the property comes from the 139 | # underlying asset. In this case, .instance points back at the instance you started with. 140 | self.assertIs(formulas.instance, modifier) 141 | 142 | formula = formulas.value[8] 143 | self.assertEqual(formula['output'], 'lLipCorver:/data/DAZ%203D/Genesis%203/Female/Genesis3Female.dsf#lLipCorver?translation/x') 144 | self.assertIsNotNone(formulas.node.get_url(formula['output'])) 145 | 146 | # Test searching for URLs referenced by the modifier. 
147 | assert modifier.get_url('lNasolabialMouthCorner:/data/DAZ%203D/Genesis%203/Female/Genesis3Female.dsf#lNasolabialMouthCorner?translation/x') 148 | assert modifier.get_url('Genesis3Female:#eCTRLMouthSmileOpen?value') 149 | 150 | # This property has no parent, so parent_property raises KeyError. 151 | with self.assertRaises(KeyError): 152 | formulas.parent_property 153 | 154 | # Get the output from the first formula. This property keeps the instance you started with. 155 | output = formulas.get_property('~0/output') 156 | self.assertIs(output.instance, modifier) 157 | 158 | # The parent of the output is the array, and the parent of that is the original formulas property. 159 | # This will point to the same place, but it won't be the same instance of DSONProperty. 160 | formulas2 = output.parent_property.parent_property 161 | self.assertIs(formulas2.instance, modifier) 162 | self.assertEqual(formulas, formulas2) 163 | 164 | # The instance follows through array_children. 165 | for formula in formulas.array_children: 166 | self.assertIs(formula.instance, modifier) 167 | break 168 | 169 | def test(self): 170 | #modifier = DSON.env.get_url('/data/DAZ 3D/Genesis 3/Female/Morphs/DAZ 3D/Base Pose/eCTRLMouthSmileOpen.dsf#eCTRLMouthSmileOpen') 171 | #modifier = DSON.env.get_url('/data/Age%20of%20Armour/Subsurface%20Shaders/AoA_Subsurface/AoA_Subsurface.dsf') 172 | # scene = DSON.env.get_url('/Light%20Presets/omnifreaker/UberEnvironment2/!UberEnvironment2%20Base.duf#environmentSphere_1923?extra/studio_node_channels/channels/Renderable/current_value') 173 | 174 | env = DSON.DSONEnv() 175 | 176 | env.get_or_load_file('/t.duf') 177 | # result = env.scene.get_url('Genesis3Female:') 178 | # assert result 179 | 180 | # Check that our geometry instance was loaded. 181 | result = env.scene.get_url('#Genesis3Female-1') 182 | assert result 183 | 184 | # Test reading the vertex count. 
This will search the geometry instance in t.duf, not find it, 185 | # and then search the base geometry library. 186 | assert result['vertices/count'] > 0 187 | 188 | # Check that material instances were loaded, and added to the geometry's material list. 189 | # Materials aren't normally parented under the geometry. 190 | assert result.materials.get('Face') 191 | 192 | # Test searching for an ID within another ID, by putting the outer ID on the scheme 193 | # and the inner ID in the fragment. Note that the path is actually unused if the 194 | # scheme is present. 195 | # result = env.scene.get_url('Genesis3Female:/data/DAZ%203D/Genesis%203/Female/Morphs/DAZ%203D/Base%20Pose/eCTRLMouthSmileOpen.dsf#hip') 196 | # assert result 197 | 198 | # Test searching for a property directly on a node. 199 | # assert result.get_url('#?label') 200 | 201 | # Search for a property through a scheme search. 202 | self.assertIsNotNone(env.scene.get_url('Genesis3Female:/data/DAZ%203D/Genesis%203/Female/Morphs/DAZ%203D/Base%20Pose/eCTRLMouthSmileOpen.dsf#hip?label')) 203 | 204 | # Test searching for IDs containing slashes. 205 | self.assertIsNotNone(env.scene.get_url('/Light%20Presets/omnifreaker/UberEnvironment2/!UberEnvironment2%20Base.duf#DzShaderLight%2FomUberEnvironment2')) 206 | 207 | # Test searching for properties containing slashes. 208 | result = env.scene.get_url('/data/Age%20of%20Armour/Subsurface%20Shaders/AoA_Subsurface/AoA_Subsurface.dsf#AoA_Subsurface') 209 | self.assertIsNotNone(result) 210 | result = result.get_url('#?extra/studio%2Fmaterial%2Fdaz_brick') 211 | self.assertIsNotNone(result) 212 | 213 | # This returns a DSONProperty. 214 | assert isinstance(result, DSON.DSONProperty) 215 | 216 | # Force Genesis3Female.dsf to be loaded. 217 | env.get_or_load_file('/data/DAZ 3D/Genesis 3/Female/Genesis3Female.dsf') 218 | 219 | # Test searching the whole scene by scheme. 
220 | result = env.scene.get_url('Genesis3Female:') 221 | assert result 222 | 223 | # Test searching the whole scene by fragment. 224 | result = env.scene.get_url('#Genesis3Female') 225 | assert result 226 | 227 | # Test searching from one node to another, where they're in the same file but not the 228 | # same hierarchy. Fragment searches are local to the file, not the node. 229 | result = env.scene.get_url('/data/DAZ%203D/Genesis%203/Female/Morphs/DAZ%203D/Base%20Pose/eCTRLMouthSmileOpen.dsf#eCTRLMouthSmileOpen') 230 | assert result 231 | result = result.get_url('#eCTRLMouthSmileOpen') 232 | assert result 233 | 234 | # Check the "extra-type" property lookup extension. Note that "value" comes from the base asset, 235 | # which is 1, and "current_value" is overridden to a different value in the scene to 2. 236 | geometry = env.scene.get_url('#Genesis3Female-1') 237 | assert geometry 238 | self.assertEqual(geometry.get_property('extra-type/studio_geometry_channels/channels/SubDRenderLevel/value').value, 1) 239 | self.assertEqual(geometry.get_property('extra-type/studio_geometry_channels/channels/SubDRenderLevel/current_value').value, 2) 240 | 241 | # This won't find the node. The scheme only searches nodes in the scene, not in the library, 242 | # and we haven't loaded a character into the scene with this name. 
243 | # XXX: need better exceptions so we can test this 244 | # result = env.scene.get_url('Genesis3Female:/data/DAZ%203D/Genesis%203/Female/Morphs/DAZ%203D/Base%20Pose/eCTRLMouthSmileOpen.dsf#eCTRLMouthSmileOpen-1') 245 | # assert result is None 246 | 247 | if __name__ == '__main__': 248 | # import cProfile 249 | # import re 250 | # cProfile.run('unittest.main()', sort='tottime') 251 | 252 | unittest.main() 253 | 254 | -------------------------------------------------------------------------------- /scripts/dsonimport/materials/__init__.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/zewt/zDsonImport/ae9beb87e09a6366e5f1b07a598478f20fa101ae/scripts/dsonimport/materials/__init__.py -------------------------------------------------------------------------------- /scripts/dsonimport/materials/create_materials.py: -------------------------------------------------------------------------------- 1 | import logging, os, time 2 | from pprint import pprint, pformat 3 | from dsonimport.dson import DSON 4 | from dsonimport import util 5 | from dsonimport import maya_helpers as mh 6 | from dsonimport.dson.DSONURL import DSONURL 7 | import pymel.core as pm 8 | 9 | log = logging.getLogger('DSONImporter.Materials') 10 | 11 | from texture_manager import TextureManager 12 | import material_viewport 13 | import material_arnold 14 | 15 | material_classes = { 16 | 'Arnold': material_arnold.MaterialArnold, 17 | 'viewport': material_viewport.MaterialViewport, 18 | } 19 | 20 | # The material classes to apply, in order. We don't currently have a UI for this. 21 | material_class_names = ['Arnold', 'viewport'] 22 | 23 | def _create_material_for_shading_group(env, sg_node, texture_manager, material_class, attach_to_viewport): 24 | # Load the materials we'll use for this shadingEngine. 
25 | # 26 | # If there are multiple entries in dson_materials, the first is the actual material, and the 27 | # rest are just references to other materials merged for scatter. They'll only be used to 28 | # get their texture paths. 29 | dson_material_array_attr = sg_node.attr('dson_materials') 30 | material_indices = dson_material_array_attr.get(mi=True) or [] 31 | dson_materials = [] 32 | for idx in material_indices: 33 | # We can't just get_url this URL, since it points inside a user scene and not something in the 34 | # search path. (DSON search paths unfortunately look like absolute URLs but aren't, so we can't 35 | # tell which is which, but these are always pointing at the scene we imported.) To access it, first 36 | # load it with load_user_scene, and then get_url will find it without trying to search for it. 37 | material_url = dson_material_array_attr.elementByLogicalIndex(idx).attr('material_url').get() 38 | parsed_url = DSONURL(material_url) 39 | 40 | env.load_user_scene(parsed_url.path) 41 | 42 | dson_material = env.scene.get_url(material_url) 43 | dson_materials.append(dson_material) 44 | 45 | # If we need to convert textures for this renderer, we need a place to put them. 46 | # 47 | # If multiple shading groups use a texture and they both tile more than one texture, the textures 48 | # might be in a different order. If we put the textures in the same place they'll overwrite each other 49 | # and one of the materials will be wrong, but if we put every material's texture in a different 50 | # directory then we won't share any textures across materials. 51 | # 52 | # If the texture has more than one file, put it in a directory named by the shading group. This will 53 | # usually only happen for scatter materials, and we usually only have one scatter material in the scene. 54 | # The other textures aren't tiled, so we can safely put them in the same directory. 
55 | # Look at 56 | # the material URL, which is normally the user scene that this was imported from, and put 57 | # them in a textures directory underneath it. 58 | # 59 | # XXX: If a material has two color inputs that share some textures and not others 60 | # and they're copied in Mudbox mode, they'll overwrite each other. 61 | if len(dson_materials) == 1: 62 | first_material_path = dson_material_array_attr.elementByLogicalIndex(0).attr('material_url').get() 63 | parsed_url = DSONURL(first_material_path) 64 | path = os.path.dirname(parsed_url.path) 65 | name = os.path.basename(parsed_url.path) 66 | path = '%s/DSONTextures/%s' % (path, name) 67 | else: 68 | path = os.path.dirname(parsed_url.path) 69 | path = '%s/DSONTextures/%s' % (path, sg_node.name()) 70 | texture_manager.set_path(path) 71 | 72 | # The dson_uvsets array is the UV set connections that we need to make for each texture. This 73 | # will often be empty, if we're only using the default UV set. 74 | uvsets = [] 75 | dson_uvsets = sg_node.attr('dson_uvsets') 76 | for idx in dson_uvsets.get(mi=True) or []: 77 | uvset_attr = dson_uvsets.elementByLogicalIndex(idx) 78 | 79 | # If there's no UV set connection on this index, omit the entry. It'll use the 80 | # default UV set, and we don't need to set up a uvChooser. 81 | connections = uvset_attr.listConnections(s=True, d=False, p=True) or [] 82 | assert len(connections) <= 1, connections 83 | if len(connections) == 1: 84 | uvsets.append(connections[0]) 85 | 86 | # Create the material, and attach it to the shadingEngine. 87 | main_dson_material = dson_materials[0] 88 | name = main_dson_material.node_id 89 | 90 | if len(dson_materials) > 1: 91 | # If there's more than one material, these were materials merged due to scatter. Instead 92 | # of naming it after an arbitrarily-selected material, just call it "Scatter". We usually 93 | # won't have more than one scatter material. 
94 | name = 'Scatter' 95 | 96 | material_node = material_class(env=env, name=name, dson_material=main_dson_material, source_dson_materials=dson_materials, uvsets=uvsets, texture_manager=texture_manager, attach_to_viewport=attach_to_viewport) 97 | material_node.create(dson_materials[0], sg_node) 98 | 99 | def _create_materials(progress): 100 | env = DSON.DSONEnv() 101 | 102 | # Create a shared TextureManager. All materials that we create will use this to share 103 | # file nodes. 104 | texture_manager = TextureManager(env.find_file) 105 | 106 | progress.show('Applying materials...', len(material_class_names) + 1) 107 | progress.set_main_progress('Loading materials...') 108 | 109 | shading_groups = set() 110 | selection = pm.ls(sl=True) 111 | if selection: 112 | # Get shading groups used by the selection. 113 | for node in selection: 114 | shapes = node.listRelatives(ad=True, shapes=True) 115 | for shape in shapes: 116 | sg_nodes = pm.listConnections(shape, type='shadingEngine') or [] 117 | shading_groups.update(sg_nodes) 118 | else: 119 | shading_groups = pm.ls(type='shadingEngine') 120 | 121 | # Filter to shadingEngines created by load_and_assign_materials. 122 | shading_groups = [sg for sg in shading_groups if sg.hasAttr('dson_materials')] 123 | log.debug('Shading groups to apply materials to: %s', shading_groups) 124 | 125 | # Make a mapping from each mesh in the scene to its main DSON transform node. 126 | meshes = {} 127 | for sg_node in shading_groups: 128 | for mesh in pm.sets(sg_node, q=True): 129 | mesh = mesh.node() 130 | 131 | if not mesh.hasAttr('dson_transform'): 132 | continue 133 | 134 | transforms = mesh.attr('dson_transform').listConnections(s=True, d=False) 135 | if not transforms: 136 | continue 137 | 138 | transform = transforms[0] 139 | meshes[mesh] = transform 140 | 141 | # The order we create materials matters. 
The renderer materials like MaterialArnold 142 | # will connect to surfaceShader, so they'll be used in the viewport if we don't create 143 | # a viewport-specific material. MaterialViewport will then override that connection. 144 | # If we do it the other way around, the wrong material will end up on surfaceShader. 145 | for material_class_name in material_class_names: 146 | material_class = material_classes[material_class_name] 147 | progress.set_main_progress('Applying %s materials...' % material_class_name) 148 | for idx, sg_node in enumerate(shading_groups): 149 | percent = float(idx) / len(shading_groups) 150 | progress.set_task_progress('Creating material: %s' % sg_node, percent) 151 | log.debug('Creating %s material for %s', material_class_name, sg_node) 152 | 153 | # Only attach to the viewport shader if this is the viewport material, unless we're not creating 154 | # the viewport material. In that case, connect the first material. 155 | attach_to_viewport = material_class_name == 'viewport' 156 | if 'viewport' not in material_class_names and material_class_name == material_class_names[0]: 157 | attach_to_viewport = True 158 | 159 | _create_material_for_shading_group(env, sg_node, texture_manager=texture_manager, material_class=material_class, attach_to_viewport=attach_to_viewport) 160 | 161 | # Let the material class apply renderer-specific mesh properties. This isn't specific to 162 | # a material. 163 | material_class.apply_mesh_properties(env, meshes) 164 | 165 | def run_import(): 166 | progress = mh.ProgressWindowMaya() 167 | 168 | try: 169 | pm.waitCursor(state=True) 170 | 171 | _create_materials(progress) 172 | except util.CancelledException as e: 173 | pass 174 | except BaseException as e: 175 | # Log the exception, then swallow it, since throwing an exception up to Maya will just 176 | # log it again. 
177 | log.exception(e) 178 | finally: 179 | progress.hide() 180 | pm.waitCursor(state=False) 181 | 182 | def go(): 183 | mh.setup_logging() 184 | run_import() 185 | 186 | -------------------------------------------------------------------------------- /scripts/dsonimport/materials/material_base.py: -------------------------------------------------------------------------------- 1 | import logging, math, os, urllib 2 | from pprint import pprint, pformat 3 | from maya import mel 4 | from dsonimport import util 5 | from dsonimport import maya_helpers as mh 6 | import pymel.core as pm 7 | 8 | log = logging.getLogger('DSONImporter.Materials') 9 | 10 | class MaterialBase(object): 11 | """ 12 | This is a helper base class for classes that create and register textures. 13 | """ 14 | def __init__(self, env=None, name=None, dson_material=None, source_dson_materials=None, uvsets=None, texture_manager=None, attach_to_viewport=False): 15 | self.env = env 16 | self.name = name 17 | self.uvsets = uvsets 18 | self.texture_manager = texture_manager 19 | 20 | self.channels, self.images = self._get_combined_material_channels(dson_material, source_dson_materials) 21 | self.horiz_tiles = 1 22 | self.vert_tiles = 1 23 | self.horiz_offset = 0 24 | self.vert_offset = 0 25 | 26 | # We'll only attach to the viewport shader (surfaceShader) if this is true. 27 | self.attach_to_viewport = attach_to_viewport 28 | 29 | @classmethod 30 | def apply_mesh_properties(cls, env, meshes): 31 | pass 32 | 33 | @classmethod 34 | def _get_material_channels(cls, dson_material): 35 | """ 36 | Gather all studio_material_channels properties to make them easier to access. 37 | 38 | Return (channels, images), where channels is a dictionary of keys to values, and 39 | images is a dictionary of keys to texture paths. 
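As an aside on the lookup paths this method builds: channel ids can contain slashes, so they're percent-quoted before being joined into the property path. A minimal sketch of that step (the channel id `SSS/Strength` is hypothetical, and the import shim is only there so the sketch also runs outside this module's Python 2 environment, which calls `urllib.quote` directly):

```python
# Sketch of how a channel id becomes a property path. safe='' forces the
# slash in the id to be quoted, so it isn't mistaken for a path separator.
try:
    from urllib import quote  # Python 2, as used by this module
except ImportError:
    from urllib.parse import quote  # Python 3
channel_id = 'SSS/Strength'  # hypothetical channel id containing a slash
path = 'extra/studio_material_channels/channels/%s' % quote(channel_id, '')
# path == 'extra/studio_material_channels/channels/SSS%2FStrength'
```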
40 | """ 41 | raw_channels = [] 42 | raw_channels.extend(dson_material.get_value('extra/studio_material_channels/channels', [])) 43 | if dson_material.asset: 44 | raw_channels.extend(dson_material.asset.get_value('extra/studio_material_channels/channels', [])) 45 | 46 | channel_ids = {} 47 | for channel in raw_channels: 48 | channel_id = channel['channel']['id'] 49 | 50 | # Some of these channels have slashes in their name, so we have to quote it. 51 | path = 'extra/studio_material_channels/channels/%s' % urllib.quote(channel_id, '') 52 | channel_ids[channel_id] = path 53 | 54 | # Add the top-level properties. 55 | for attr in ['diffuse', 'transparency', 'diffuse_strength', 'specular', 'specular_strength', 'glossiness', 56 | 'ambient', 'ambient_strength', 'reflection', 'reflection_strength', 'refraction', 'refraction_strength', 57 | 'ior', 'bump', 'bump_min', 'bump_max', 'displacement', 'displacement_min', 'displacement_max', 58 | 'normal', 'u_offset', 'u_scale', 'v_offset', 'v_scale']: 59 | channel_ids[attr] = '%s/%s' % (attr, attr) 60 | 61 | channels = {} 62 | images = {} 63 | for channel_id, path in channel_ids.items(): 64 | 65 | # We could call evaluate() here to run modifiers on these channels, but these channels 66 | # are inconsistent with all others: they put colors in a single [0,1,2] array instead 67 | # of putting them in sub-channels. We don't support that right now and I'm not sure 68 | # if modifiers are even actually supported on these channels, so for now we just use 69 | # the static value. 70 | try: 71 | value = dson_material[path].get_value_with_default(apply_limits=True) 72 | except KeyError: 73 | # Values with only maps may have no value. 
74 | value = None 75 | 76 | channels[channel_id] = value 77 | 78 | # XXX: There's also "image" to reference image_library, including layered maps 79 | try: 80 | image_file = dson_material[path].get('image_file') 81 | if image_file: 82 | images[channel_id] = urllib.unquote(image_file.value) 83 | except KeyError: 84 | pass 85 | 86 | return channels, images 87 | 88 | @classmethod 89 | def _get_combined_material_channels(cls, dson_material, source_dson_materials): 90 | channels, images = cls._get_material_channels(dson_material) 91 | 92 | if source_dson_materials is not None: 93 | # If there's a source material set, images is eg. { 'diffuse': ['path1', 'path2'] }. Each element 94 | # in the array is the file for the corresponding source material. If a source material doesn't 95 | # have a file, set that element to None. 96 | images = {} 97 | for idx, source_dson_material in enumerate(source_dson_materials): 98 | _, source_images = cls._get_material_channels(source_dson_material) 99 | for key, image in source_images.iteritems(): 100 | list_for_property = images.setdefault(key, []) 101 | if len(list_for_property) < idx + 1: 102 | list_for_property.extend([None] * (idx + 1 - len(list_for_property))) 103 | list_for_property[idx] = image 104 | 105 | # log.debug('Textures per source material:\n%s', pformat(images)) 106 | 107 | # Replace any remaining paths with [path]. 
108 | for key, path in images.items(): 109 | if isinstance(path, basestring): 110 | images[key] = [path] 111 | 112 | return channels, images 113 | 114 | 115 | def __repr__(self): 116 | result = self.__class__.__name__ 117 | if self.name is not None: 118 | result += '(%s)' % self.name 119 | return result 120 | 121 | def create(self, dson_material, sg_node): 122 | raise NotImplementedError() 123 | 124 | @classmethod 125 | def _get_dson_material_type(cls, dson_material): 126 | for extra in dson_material.iter_get('extra'): 127 | extra_type = extra.get('type') 128 | if not extra_type: 129 | continue 130 | 131 | if not extra_type.value.startswith('studio/material/'): 132 | continue 133 | 134 | return extra_type.value 135 | 136 | return None 137 | 138 | def make_layer_name(self, layer_name='', prefix='Layer'): 139 | if layer_name == '': 140 | return '%s_%s' % (prefix, self.name) 141 | else: 142 | return '%s_%s_%s' % (prefix, self.name, layer_name) 143 | 144 | def find_or_create_texture(self, *args, **kwargs): 145 | """ 146 | Return a Maya texture node for a given path. Track textures that we've already created, 147 | and return an existing texture if one exists with the same parameters. 148 | 149 | """ 150 | kwargs = dict(kwargs) 151 | kwargs.update({ 152 | 'horiz_tiles': self.horiz_tiles, 153 | 'vert_tiles': self.vert_tiles, 154 | 'horiz_offset': self.horiz_offset, 155 | 'vert_offset': self.vert_offset, 156 | }) 157 | 158 | texture = self.texture_manager.find_or_create_texture(*args, **kwargs) 159 | self.register_texture(texture) 160 | 161 | return texture 162 | 163 | def set_tiles(self, horiz_tiles, vert_tiles, horiz_offset, vert_offset): 164 | """ 165 | Set texture tiling. This will affect all future textures loaded by this material. 
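For example (a usage sketch; the values are hypothetical and `material` is assumed to be an instance of a MaterialBase subclass):

```python
# Tile textures created after this call 2x2 across the surface, with no offset.
material.set_tiles(horiz_tiles=2, vert_tiles=2, horiz_offset=0.0, vert_offset=0.0)
```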
166 | """ 167 | self.horiz_tiles = horiz_tiles 168 | self.vert_tiles = vert_tiles 169 | self.horiz_offset = horiz_offset 170 | self.vert_offset = vert_offset 171 | 172 | def set_attr_to_texture_with_color(self, output_attr, *args, **kwargs): 173 | texture_node = self.get_texture_with_color(*args, **kwargs) 174 | mh.set_or_connect(output_attr, texture_node) 175 | 176 | def get_texture_with_color(self, texture, color, mode='rgb', nodeName=None, texture_args={}, srgb_to_linear=True): 177 | """ 178 | If mode is 'rgb', we're connecting the color component of the texture, and color 179 | is (r,g,b). If it's 'alpha', we're connecting the alpha component, and color is 180 | a single float. 181 | 182 | If srgb_to_linear is true, "color" will be converted from SRGB to linear color space. 183 | The texture is unaffected. 184 | """ 185 | texture_args = dict(texture_args) 186 | if color is None: 187 | color = 1 188 | 189 | # Alpha colors are already linear. 190 | if color is not None and srgb_to_linear and mode != 'alpha': 191 | if isinstance(color, tuple) or isinstance(color, list): 192 | color = util.srgb_vector_to_linear(color) 193 | else: 194 | color = util.srgb_to_linear(color) 195 | 196 | if texture is None: 197 | # We have just a constant diffuse color, with no texture. 198 | return color 199 | 200 | if isinstance(texture, pm.PyNode): 201 | texture_node = texture 202 | else: 203 | if mode == 'alpha': 204 | # If we connect to texture.outAlpha then the color space doesn't matter, but if we use .outTransparency 205 | # it does, and we need to explicitly set the color space to raw. 
206 | if 'colorSpace' not in texture_args: 207 | texture_args['colorSpace'] = 'Raw' 208 | 209 | alphaIsLuminance = (mode == 'alpha') 210 | texture_node = self.find_or_create_texture(path=texture, alphaIsLuminance=alphaIsLuminance, **texture_args) 211 | 212 | channels = { 213 | 'rgb': 'outColor', 214 | 'r': 'outColorR', 215 | 'alpha': 'outAlpha', 216 | } 217 | texture_node = texture_node.attr(channels[mode]) 218 | 219 | if color is not None: 220 | # We have both a texture and a static color. Create a multiplyDivide node to combine them. 221 | texture_node = mh.math_op('mult', texture_node, color) 222 | 223 | return texture_node 224 | 225 | def set_attr_to_transparency(self, output_attr, *args, **kwargs): 226 | texture_node = self.get_texture_with_alpha(*args, **kwargs) 227 | mh.set_or_connect(output_attr, texture_node) 228 | 229 | def get_texture_with_alpha(self, texture, alpha, mode='rgb', zero_is_invisible=False): 230 | """ 231 | Set a transparency attribute. See also set_attr_to_texture_with_color. 232 | 233 | DSON transparency looks like everything else in the universe: an alpha value, with 0 being 234 | transparent. Maya is extra special and does it backwards, so we have to handle this 235 | differently. 236 | 237 | If mode is rgb, the output is a color channel and we'll connect texture.transparency. 238 | If mode is r, the output is a numeric attribute and we'll connect texture.transparencyR. 239 | """ 240 | # XXX: zero_is_invisible meant the output's 0 is invisible, eg. this is for opacity and 241 | # not transparency. Since the input was also opacity, this meant "don't invert". This 242 | # is pretty confusing. 243 | if alpha is None: 244 | alpha = 1 245 | 246 | if texture is None: 247 | if not zero_is_invisible: 248 | # alpha -> transparency 249 | alpha = mh.math_op('sub', 1, alpha) 250 | return alpha 251 | 252 | # Enable alphaIsLuminance (which really means "luminance is alpha"), and use alphaGain 253 | # to apply the static alpha multiplier. 
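The opacity handling described in the docstring and comments above reduces, for constant (non-texture) values, to a one-line inversion: DSON stores alpha (0 = transparent) while Maya's transparency attributes are backwards (0 = opaque). A standalone sketch of that conversion (the function name is illustrative, mirroring the constant-value branch of `get_texture_with_alpha`):

```python
def alpha_to_maya_transparency(alpha, zero_is_invisible=False):
    """Convert a DSON-style alpha (0 = transparent) to Maya transparency (0 = opaque)."""
    if alpha is None:
        alpha = 1.0
    if zero_is_invisible:
        # The output already uses opacity semantics, so don't invert.
        return alpha
    return 1.0 - alpha
```

For textures the same inversion is performed by connecting `outTransparency` instead of `outColor`, as the code below does.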
254 | texture_node = self.find_or_create_texture(path=texture, alphaIsLuminance=1, alphaGain=alpha, colorGain=(alpha,alpha,alpha), colorSpace='Raw') 255 | 256 | if zero_is_invisible: 257 | channels = { 258 | 'rgb': 'outColor', 259 | 'r': 'outAlpha', 260 | } 261 | else: 262 | channels = { 263 | 'rgb': 'outTransparency', 264 | 'r': 'outTransparencyR', 265 | } 266 | texture_node_color = texture_node.attr(channels[mode]) 267 | 268 | return texture_node_color 269 | 270 | def set_attr_to_roughness_from_glossiness(self, texture, value, output_attr): 271 | value = self.get_roughness_from_glossiness(texture, value) 272 | mh.set_or_connect(output_attr, value) 273 | 274 | def get_roughness_from_glossiness(self, texture, value): 275 | """ 276 | Set a roughness attribute from glossiness. 277 | """ 278 | if texture is None: 279 | # If we don't have a texture, just set the value directly. 280 | return mh._convert_glossiness_to_roughness(value) 281 | 282 | # Set alphaIsLuminance, since it's required by remap_glossiness_to_roughness_for_texture. Note that 283 | # it'll also convert the constant value, so we don't want to _convert_glossiness_to_roughness 284 | # the constant value in this case. 285 | texture_node = self.find_or_create_texture(path=texture, alphaIsLuminance=True, alphaGain=value) 286 | 287 | return mh.remap_glossiness_to_roughness_for_texture(texture_node) 288 | 289 | @property 290 | def _uses_scatter(self): 291 | """ 292 | Return True if this material uses scatter. 293 | """ 294 | return False 295 | 296 | def register_texture(self, maya_texture_node): 297 | """ 298 | Register that this material uses maya_texture_node. 299 | """ 300 | for uvset_node in self.uvsets: 301 | # Assign UV sets to the texture. 
302 | mh.assign_uvset(uvset_node, maya_texture_node) 303 | 304 | 305 | -------------------------------------------------------------------------------- /scripts/dsonimport/materials/material_viewport.py: -------------------------------------------------------------------------------- 1 | import logging, math, os, urllib 2 | from pprint import pprint, pformat 3 | from dsonimport import maya_helpers as mh 4 | import pymel.core as pm 5 | from material_base import MaterialBase 6 | 7 | log = logging.getLogger('DSONImporter.Materials') 8 | 9 | class MaterialViewport(MaterialBase): 10 | def create(self, dson_material, sg_node): 11 | self.material = pm.shadingNode('lambert', asShader=True) 12 | pm.rename(self.material, 'Mat_Viewport_%s' % mh.cleanup_node_name(self.name)) 13 | self.material.attr('diffuse').set(1) 14 | 15 | material_type = self._get_dson_material_type(dson_material) 16 | 17 | self.set_attr_to_texture_with_color(self.material.attr('color'), self.images.get('diffuse'), self.channels['diffuse'], nodeName=self.name) 18 | 19 | if material_type == 'studio/material/uber_iray': 20 | # Refraction isn't really opacity, but we'll approximate it that way in the viewport. 21 | refraction_opacity = self.get_texture_with_alpha(self.images.get('Refraction Weight'), self.channels['Refraction Weight'], zero_is_invisible=True) 22 | 23 | transparency = self.get_texture_with_alpha(self.images.get('Cutout Opacity'), self.channels['Cutout Opacity']) 24 | 25 | # "Transparency" is a terrible way to represent opacity, because instead of just multiplying 26 | # values to combine them, you have to do 1-((1-t1)*(1-t2)). That gives an ugly shader. 27 | # Cutout opacity and refraction are used for very different types of materials and I've never 28 | # seen them used together, so we cheat here and just add them. 
29 | transparency = mh.math_op('add', transparency, refraction_opacity) 30 | else: 31 | transparency = self.get_texture_with_alpha(self.images.get('transparency'), self.channels['transparency']) 32 | 33 | # If transparency is constant, don't let it be 0. Clamp it to 0.5, so it's not completely 34 | # invisible in the viewport. In the real materials there are usually other things causing 35 | # it to be visible, like reflections or refraction. Hack: don't do this for certain eye 36 | # materials, or it'll wash out eyes. 37 | allow_completely_transparent_shaders = [ 38 | 'EyeMoisture', 39 | 'Cornea', 40 | ] 41 | allow_completely_transparent = any(s in str(dson_material) for s in allow_completely_transparent_shaders) 42 | if not isinstance(transparency, pm.PyNode) and not allow_completely_transparent: 43 | transparency = min(transparency, 0.5) 44 | 45 | mh.set_or_connect(self.material.attr('transparency'), transparency) 46 | 47 | # Connect the material. Force this connection, so if it's already connected to lambert1 48 | # we'll override it. 49 | self.material.attr('outColor').connect(sg_node.attr('surfaceShader'), f=True) 50 | 51 | -------------------------------------------------------------------------------- /scripts/dsonimport/materials/materials.py: -------------------------------------------------------------------------------- 1 | import logging, math, os, urllib 2 | from pprint import pprint, pformat 3 | from time import time 4 | from dsonimport.dson import DSON 5 | from dsonimport import util 6 | from dsonimport import maya_helpers as mh 7 | import pymel.core as pm 8 | from dsonimport import uvset 9 | 10 | log = logging.getLogger('DSONImporter.Materials') 11 | 12 | # DSON files often assign several materials to an object. The materials sometimes differ only by 13 | # textures, and sometimes are entirely separate materials. 
We have a few options for importing this: 14 | # 15 | # - We can import the materials as-is, creating one material per DSON material and assigning them 16 | # separately to the mesh, or creating a single material for several DSON materials. 17 | # - We can split the mesh across material boundaries, and assign each material to a single mesh, or 18 | # leave the mesh as a single piece and assign per face. 19 | # 20 | # Splitting meshes across material boundaries generally gives better meshes, since DSON meshes are 21 | # often not logically split, eg. putting eyeballs in the same mesh as the body and unrelated clothing 22 | # parts in the same mesh. However, splitting a mesh that really is a single seamless piece that 23 | # just happens to have multiple materials can cause seams in lighting, since normals won't interpolate 24 | # cleanly across the split. This is controlled by MeshGrouping. 25 | # 26 | # Scatter materials are annoying: even though they usually come in as multiple materials, eg. "torso" 27 | # and "arms", we need to collapse them down to a single material. Scatter doesn't cross shadingEngines, 28 | # so if we don't do this we'll end up with seams at the boundaries. 29 | # 30 | # This creates shadingEngine nodes, assigns them to meshes, creates UV sets, and then just stores pointers 31 | # to the materials in the shadingEngines. We don't create the actual materials here, instead just pointing 32 | # the shadingEngines at lambert1. This allows materials to be created later, independent from the time- 33 | # consuming main import, by just creating new material networks and connecting to the existing shadingEngines, 34 | # without needing to know anything about the geometry. This makes developing material support much easier, 35 | # allows the user to iterate by tweaking the DUF file and reapplying materials, and allows exporting once, 36 | # then applying materials for different renderers.
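The duplicate-material collapse described above is implemented later by `MaterialSet._collect_materials`, which hashes each material's flattened properties (minus per-instance fields) and maps every material to the first one seen with the same hash. A minimal standalone sketch of that idea, with the real `util.flatten_dictionary`/`util.make_frozen` helpers simplified to a `frozenset` of items, and illustrative data:

```python
def collapse_duplicate_materials(materials, fields_to_ignore=('id', 'url')):
    # materials: list of (name, properties) pairs. Returns a mapping from
    # each name to the first name seen with identical significant properties.
    mapping = {}
    first_seen = {}
    for name, props in materials:
        key = frozenset(item for item in props.items() if item[0] not in fields_to_ignore)
        first_seen.setdefault(key, name)
        mapping[name] = first_seen[key]
    return mapping
```

Since some scenes carry hundreds of per-instance materials that differ only in ignored fields, this collapse is what keeps the importer from creating hundreds of identical shading networks.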
37 | 38 | # We only use this to check which materials are using scatter. 39 | import material_arnold 40 | 41 | class MaterialSet(object): 42 | def __init__(self, dson_material, name, source_dson_materials_to_uv_sets=None): 43 | """ 44 | When we're combining materials into a single material for scatter, source_dson_materials_to_uv_sets 45 | is a dictionary of the underlying DSON materials and their UV sets. We'll read the texture paths 46 | from these and assign them to tiles. 47 | """ 48 | self.dson_material = dson_material 49 | self.name = name 50 | self.uvset_node = None 51 | self.source_dson_materials = source_dson_materials_to_uv_sets 52 | if self.source_dson_materials is not None: 53 | # Sort the material set. The order will determine tile assignments, so this keeps us 54 | # from randomly changing the tile order. 55 | 56 | # XXX: if we support multiple layered materials, we should keep the tiles in sync, so 57 | # tile 0 on each layer is the same thing, even if there's nothing in that tile 58 | self.source_material_uvsets = source_dson_materials_to_uv_sets 59 | self.source_dson_materials = sorted(self.source_dson_materials.keys(), key=lambda item: item.node_id) 60 | 61 | def __repr__(self): 62 | return 'MaterialSet(%s)' % self.name 63 | 64 | @classmethod 65 | def _create_tiled_uv_set(cls, loaded_mesh, material_set, dson_material_to_faces): 66 | log.debug('Check source material set: %s (%s)', material_set, ', '.join(str(s) for s in dson_material_to_faces.keys())) 67 | 68 | # Create an empty UV set. We'll currently only create at most one of these per mesh for 69 | # the shared scatter material, so just call it "scatter". 
70 | empty_uv_indices = [] 71 | for poly in loaded_mesh.polys: 72 | empty_uv_indices.append([0]*len(poly['vertex_indices'])) 73 | new_uvset = uvset.UVSet('Scatter', [(0,0)], empty_uv_indices) 74 | 75 | for tile_idx, source_material in enumerate(material_set.source_dson_materials): 76 | face_list_for_source_material = dson_material_to_faces[source_material] 77 | 78 | # The UV set used by this source material. Due to grafts, source materials may use 79 | # multiple source UV sets. 80 | source_uvset = material_set.source_material_uvsets[source_material] 81 | log.debug('Source %s has %i faces and uses UV set %s', source_material, len(face_list_for_source_material), source_uvset) 82 | 83 | # If this is a combined material, it shares textures from multiple other materials and 84 | # we've assigned tiles according to the order in source_dson_materials_to_uv_sets. 85 | #Create a UV set for this material, with UVs for each material moved to its assigned tile. 86 | # ... if we support layered textures, each layer might come from a different source UV 87 | # set and need a separate generated UV set 88 | 89 | # Get the UV tiles used by these faces. We expect the faces to all lie within the same 90 | # single UV tile. 91 | bounds = source_uvset.get_uv_tile_bounds(face_list_for_source_material) 92 | if bounds[0][0]+1 != bounds[1][0] or bounds[0][1]+1 != bounds[1][1]: 93 | raise RuntimeError('Faces in UV set %s for source %s cross UV tiles. This is unsupported with scatter materials.' % (source_uvset, source_material)) 94 | 95 | # If the UVs in the source UV set are at U = 5 and we're outputting to tile 2, move the 96 | # UVs left by 3. Always move V to 0. 97 | u_offset = tile_idx - bounds[0][0] 98 | v_offset = -bounds[0][1] 99 | 100 | # Add the UVs from the source UV set to this UV set, and offset them by u_offset/v_offset. 
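The u_offset/v_offset arithmetic above shifts each source material's UVs into its assigned horizontal tile, with V always moved into the 0-1 band. In isolation (assuming, as the surrounding code checks, that all UVs for one material lie within a single tile):

```python
import math

def offset_uvs_to_tile(uvs, tile_idx):
    # Lower-left corner of the tile the source UVs currently occupy.
    min_u = min(int(math.floor(u)) for u, v in uvs)
    min_v = min(int(math.floor(v)) for u, v in uvs)
    # E.g. if the UVs sit at U = 5 and the target is tile 2, move left by 3.
    u_offset = tile_idx - min_u
    v_offset = -min_v
    return [(u + u_offset, v + v_offset) for u, v in uvs]
```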
101 | # Note that while we're creating a lot of overlapping UVs here since we're adding all UVs 102 | # and not just the ones we need, we simply won't use the overlapping ones and they'll be 103 | # culled before the UV set is created. 104 | first_uv_idx = len(new_uvset.uv_values) 105 | new_uvset.uv_values.extend(source_uvset.uv_values) 106 | for idx in xrange(first_uv_idx, first_uv_idx+len(source_uvset.uv_values)): 107 | value = new_uvset.uv_values[idx] 108 | new_uvset.uv_values[idx] = (value[0] + u_offset, value[1] + v_offset) 109 | 110 | for face_idx in face_list_for_source_material: 111 | uv_indices_for_face = [idx+first_uv_idx for idx in source_uvset.uv_indices_per_poly[face_idx]] 112 | new_uvset.uv_indices_per_poly[face_idx] = uv_indices_for_face 113 | 114 | # Save the new UV set to the mesh. The key doesn't matter and only needs to be 115 | # unique. Save this as the default UV map, since in the common case we'll split 116 | # skin parts of figures into an isolated mesh, and this will be the only UV set 117 | # we use. The default UV set has fewer problems (UV linking is buggy). 118 | key = object() 119 | loaded_mesh.default_uv_set_dson_node = key 120 | loaded_mesh.uv_sets[key] = new_uvset 121 | 122 | return new_uvset 123 | 124 | @classmethod 125 | def _collect_materials(cls, env): 126 | """ 127 | Collect a list of materials that we may need to create. 128 | 129 | DSON scenes have a material for each object inheriting from their base object. Combine 130 | these into a list of identical materials, so we don't create hundreds of duplicate materials. 131 | 132 | Note that some of these descriptions are over 1 MB of JSON, so we need to be fairly 133 | efficient here. Some scenes have multiple identical base materials and then multiple 134 | identical material instances using each of them, so we do need to be thorough in 135 | collapsing these back down. 
136 | """ 137 | fields_to_ignore = {'id', 'geometry', 'groups', 'uv_set', 'url'} 138 | 139 | # Find all materials, and sort them by node ID, so we consistently use the same nodes. 140 | materials = [node for node in env.scene.depth_first() if node.node_source == 'material'] 141 | materials.sort(key=lambda item: item.node_id) 142 | 143 | material_group_mapping = {} 144 | log.debug('Collecting materials...') 145 | hashed_materials = {} 146 | 147 | for material in materials: 148 | flat = {} 149 | if material.asset: 150 | flat.update(util.flatten_dictionary(material.asset._data)) 151 | 152 | flat.update(util.flatten_dictionary(material._data)) 153 | 154 | for field in fields_to_ignore: 155 | if field in flat: 156 | del flat[field] 157 | 158 | hashed = util.make_frozen(flat) 159 | 160 | if hashed not in hashed_materials: 161 | hashed_materials[hashed] = material 162 | 163 | # Point this material at the first material we found that has the same properties. 164 | material_group_mapping[material] = hashed_materials[hashed] 165 | return material_group_mapping 166 | 167 | def load_and_assign_materials(env, all_mesh_sets, onprogress=None): 168 | """ 169 | Create Maya materials used by LoadedMeshes, and assign them to each LoadedMesh 170 | so they can be assigned when the Maya mesh is created. 171 | """ 172 | # This gives us a mapping from each DSON material to the material we'll use for it, 173 | # which may be a different material with identical properties. Note that not all of 174 | # these materials may actually be used in the scene. This doesn't look at geometry, 175 | # it only finds materials. 176 | material_group_mapping = MaterialSet._collect_materials(env) 177 | 178 | log.debug('Total materials: %i' % len(material_group_mapping)) 179 | log.debug('Unique materials: %i' % len(set(material_group_mapping.values()))) 180 | 181 | # Make a list of material sets. Each material set is a group of materials used by one 182 | # or more mesh. 
All meshes that use the same collection of materials will share a 183 | # material set. 184 | # 185 | # Since this is built from the actual list of meshes, this will only contain materials 186 | # that we'll actually use. 187 | dson_material_to_material_set = {} 188 | loaded_mesh_to_material_sets = {} 189 | material_sets_to_loaded_mesh_to_faces = {} 190 | loaded_mesh_to_dson_material_to_faces = {} 191 | 192 | for mesh_set in all_mesh_sets.values(): 193 | for loaded_mesh in mesh_set.meshes.values(): 194 | if loaded_mesh in loaded_mesh_to_dson_material_to_faces: 195 | # If we've already handled this LoadedMesh, this is a shared mesh used by more 196 | # than one MeshSet. 197 | continue 198 | 199 | # Store an array of face indices that will have each material assigned to it. 200 | # For example: { DSONNode(head): [0,1,2,3,4], DSONNode(body): [5,6,7,8,9] } 201 | dson_material_to_faces = loaded_mesh_to_dson_material_to_faces[loaded_mesh] = {} 202 | for poly_idx, poly in enumerate(loaded_mesh.polys): 203 | dson_geometry = poly['dson_geometry'] 204 | dson_material = poly['dson_material'] 205 | dson_material = material_group_mapping[dson_material] 206 | 207 | poly_list = dson_material_to_faces.setdefault(dson_material, []) 208 | poly_list.append(poly_idx) 209 | 210 | #dson_material_to_dson_uv_set = {material_group_mapping[dson_material]: dson_uv_set 211 | # for dson_material, dson_uv_set in loaded_mesh.dson_material_to_dson_uv_set.items()} 212 | dson_material_to_dson_uv_set = {} 213 | for dson_material, dson_uv_set in loaded_mesh.dson_material_to_dson_uv_set.items(): 214 | actual_dson_material = material_group_mapping[dson_material] 215 | dson_material_to_dson_uv_set[actual_dson_material] = dson_uv_set 216 | 217 | # Any materials with SSS need to be flattened to a single material, since scatter won't cross 218 | # materials. Make a set of materials that use scatter. 
219 | scatter_materials_to_uv_sets = {} 220 | for dson_material in dson_material_to_faces.iterkeys(): 221 | log.debug('-> Check material %s for scatter', dson_material) 222 | if material_arnold.MaterialArnold.grouped_material(dson_material): 223 | dson_uv_set = dson_material_to_dson_uv_set[dson_material] 224 | uv_sets = loaded_mesh.uv_sets[dson_uv_set] 225 | scatter_materials_to_uv_sets[dson_material] = uv_sets 226 | 227 | combined_scatter_material_set = None 228 | if len(scatter_materials_to_uv_sets) > 1: 229 | log.debug('Multiple scatter materials in %s: %s', loaded_mesh, ', '.join(s.node_id for s in scatter_materials_to_uv_sets.iterkeys())) 230 | 231 | # If collapse_multi_materials is on and this MaterialSet has more than one material 232 | # because it uses scatter, reduce it to a single material. To do this: 233 | # - Texture channels create a tiled texture combining all texture inputs, instead of 234 | # using the texture directly. 235 | # - All other texture properties come from one material input. We assume that the 236 | # materials on a shared group are the same. 237 | # - A material may be used by scatter in multiple meshes. For example, fingernails are 238 | # separated from the body, and often have scatter turned on. We won't share the material 239 | # layer in this case: each unique scatter layer will have its own copy of the material. 240 | primary_scatter_dson_material = sorted(scatter_materials_to_uv_sets.keys(), key=lambda key: key.asset_id)[0] 241 | log.debug('Using material %s as the primary scatter material definition', primary_scatter_dson_material) 242 | 243 | # It's hard to pick a good name for this material, since it's coming from a bunch of other 244 | # materials with their own names. Name it after the mesh we're creating it for. This isn't 245 | # correct if there are multiple meshes using the material, but the common case for scatter 246 | # is a single figure. 
247 | name = loaded_mesh.name + '_Scatter' 248 | # name = primary_scatter_dson_material.node_id 249 | combined_scatter_material_set = MaterialSet(primary_scatter_dson_material, name, source_dson_materials_to_uv_sets=scatter_materials_to_uv_sets) 250 | dson_material_to_material_set['scatter'] = combined_scatter_material_set 251 | 252 | # Create non-scatter MaterialSets. 253 | for dson_material in dson_material_to_faces.iterkeys(): 254 | if dson_material in scatter_materials_to_uv_sets and combined_scatter_material_set: 255 | # This material uses combined_scatter_material_set. 256 | continue 257 | 258 | # See if we have a MaterialSet with this set of materials, and create a new one if we don't. 259 | if dson_material in dson_material_to_material_set: 260 | continue 261 | 262 | # XXX: This sometimes gives bad names. For example, if arms and fingernails have the 263 | # same properties and we've lumped them together in material_group_mapping, we'd normally 264 | # create one material using the name of one or the other (arbitrarily). However, if 265 | # arms are a scatter material and we're not going to create a regular material at all 266 | # for the arms, we can still choose "arms" as the name for the fingernails, even if 267 | # the only thing using it is the fingernails. 268 | name = dson_material.node_id 269 | dson_material_to_material_set[dson_material] = MaterialSet(dson_material, name) 270 | 271 | # Create the MaterialSets for this mesh, associated with the faces that'll use it. 272 | for dson_material, faces in dson_material_to_faces.iteritems(): 273 | if dson_material in scatter_materials_to_uv_sets and combined_scatter_material_set: 274 | material_set = combined_scatter_material_set 275 | else: 276 | # The usual case: just a single material in the MaterialSet. 277 | material_set = dson_material_to_material_set[dson_material] 278 | 279 | # Add this MaterialSet/face list to this LoadedMesh. 
280 | loaded_mesh_to_material_sets.setdefault(loaded_mesh, set()).add(material_set) 281 | 282 | loaded_mesh_to_faces = material_sets_to_loaded_mesh_to_faces.setdefault(material_set, {}) 283 | faces_for_loaded_mesh = loaded_mesh_to_faces.setdefault(loaded_mesh, []) 284 | faces_for_loaded_mesh.extend(faces) 285 | 286 | # Create shading groups for each material, and assign them to meshes. 287 | log.debug('Creating shading groups for material sets: %i' % len(dson_material_to_material_set)) 288 | material_set_to_shading_engine = {} 289 | for material_set in dson_material_to_material_set.values(): 290 | # Create the shadingEngine for this material. 291 | sg_node = pm.sets(renderable=True, noSurfaceShader=True, empty=True, name='Shader_%s' % material_set.name) 292 | 293 | # For now, assign lambert1 to the shadingEngines so they have a material. This will be 294 | # replaced with the actual material later. 295 | default_material = pm.ls('lambert1')[0] 296 | default_material.attr('outColor').connect(sg_node.attr('surfaceShader')) 297 | 298 | # Remember the shadingEngine node for this material set. 299 | material_set_to_shading_engine[material_set] = sg_node 300 | 301 | loaded_mesh_to_faces = material_sets_to_loaded_mesh_to_faces[material_set] 302 | 303 | # Assign mesh faces to this shading group. 304 | for loaded_mesh, face_indices in loaded_mesh_to_faces.iteritems(): 305 | if loaded_mesh.maya_mesh is None: 306 | continue 307 | 308 | face_indices.sort() 309 | 310 | # Assign materials to each instance. 311 | for maya_instance in loaded_mesh.maya_mesh.getInstances(): 312 | # Make a ['mesh.f[100:200]', 'mesh.f[300:400]'] list of the faces. PyMel's MeshFace is 313 | # unusably slow, so we don't use it here. 314 | face_ranges = [] 315 | for start, end in mh.get_contiguous_indices(face_indices): 316 | face_ranges.append('%s.f[%i:%i]' % (maya_instance, start, end)) 317 | 318 | # Assign the shading group to the mesh.
319 | pm.sets(sg_node, edit=True, forceElement=face_ranges) 320 | 321 | # Create UV sets for each material on each mesh. 322 | uvset_nodes_per_material_set = {} 323 | processed_loaded_meshes = set() 324 | for mesh_set in all_mesh_sets.values(): 325 | for loaded_mesh in mesh_set.meshes.values(): 326 | # Skip meshes that didn't have a Maya mesh created. 327 | if loaded_mesh.maya_mesh is None: 328 | continue 329 | 330 | material_sets_for_loaded_mesh = loaded_mesh_to_material_sets.get(loaded_mesh) 331 | if material_sets_for_loaded_mesh is None: 332 | # We aren't loading materials for this mesh. 333 | continue 334 | 335 | if loaded_mesh in processed_loaded_meshes: 336 | # This is another instance of a mesh we've already handled. 337 | continue 338 | 339 | processed_loaded_meshes.add(loaded_mesh) 340 | # Find the UV set to use for each material set on this mesh. This can create extra 341 | # UV sets, so we have to do this before calling create_uv_sets(). 342 | 343 | uvset_per_material_set = {} 344 | for material_set in material_sets_for_loaded_mesh: 345 | if material_set.source_dson_materials and len(material_set.source_dson_materials) > 1: 346 | # For material sets with a source material list, create the UV set for the shared 347 | # material (this is only used for scatter materials).
348 | # 349 | # The faces that will be assigned to each material in this MaterialSet: 350 | dson_material_to_faces = loaded_mesh_to_dson_material_to_faces[loaded_mesh] 351 | 352 | uvset = MaterialSet._create_tiled_uv_set(loaded_mesh, material_set, dson_material_to_faces) 353 | else: 354 | dson_material_to_dson_uv_set = {material_group_mapping[dson_material]: dson_uv_set 355 | for dson_material, dson_uv_set in loaded_mesh.dson_material_to_dson_uv_set.items()} 356 | 357 | dson_uv_set = dson_material_to_dson_uv_set[material_set.dson_material] 358 | uvset = loaded_mesh.uv_sets[dson_uv_set] 359 | 360 | uvset_per_material_set[material_set] = uvset 361 | 362 | log.debug('Creating UV sets for %s %s', loaded_mesh, id(loaded_mesh)) 363 | loaded_mesh.create_uv_sets() 364 | 365 | # Collect attributes for all of the UV sets for this mesh. 366 | for material_set in material_sets_for_loaded_mesh: 367 | uv_set = uvset_per_material_set[material_set] 368 | 369 | uvset_node = mh.find_uvset_by_name(loaded_mesh.maya_mesh, uv_set.name, required=True) 370 | default_uv_set = loaded_mesh.maya_mesh.attr('uvSet[0]').attr('uvSetName') 371 | if uvset_node.get() == default_uv_set.get(): 372 | # This UV set is the default, so we don't need to assign it. This avoids creating unneeded 373 | # uvChooser nodes. 
374 | continue 375 | 376 | uvset_nodes_per_material_set.setdefault(material_set, []).append(uvset_node) 377 | 378 | for material_set in dson_material_to_material_set.values(): 379 | sg_node = material_set_to_shading_engine[material_set] 380 | 381 | source_dson_materials = material_set.source_dson_materials 382 | if not source_dson_materials: 383 | source_dson_materials = [material_set.dson_material] 384 | 385 | pm.addAttr(sg_node, longName='dson_materials', niceName='DSON materials', multi=True, numberOfChildren=1, attributeType='compound' ) 386 | pm.addAttr(sg_node, longName='material_url', niceName='Material URL', dt='string', parent='dson_materials') 387 | dson_material_array_attr = sg_node.attr('dson_materials') 388 | 389 | for idx, source_dson_material in enumerate(source_dson_materials): 390 | dson_material_attr = dson_material_array_attr.elementByLogicalIndex(idx) 391 | dson_material_attr.attr('material_url').set(source_dson_material.url) 392 | 393 | pm.addAttr(sg_node, longName='dson_uvsets', niceName='UV sets', at='message', multi=True) 394 | uvset_nodes = uvset_nodes_per_material_set.get(material_set, []) 395 | for idx, uvset_node in enumerate(uvset_nodes): 396 | if uvset_node is None: 397 | continue 398 | uvset_node.connect(sg_node.attr('dson_uvsets').elementByLogicalIndex(idx)) 399 | 400 | -------------------------------------------------------------------------------- /scripts/dsonimport/materials/schlick.py: -------------------------------------------------------------------------------- 1 | import logging, math 2 | import pymel.core as pm 3 | 4 | log = logging.getLogger('DSONImporter.Materials') 5 | 6 | # http://therenderblog.com/fresnel-schlicks-approximation-in-maya/ 7 | # http://therenderblog.com/custom-fresnel-curves-in-maya-part-2/ 8 | 9 | def schlick_aprox(reflt0): 10 | reflt0 = reflt0 11 | theta_deg = 0 12 | 13 | RefltResult = [] 14 | 15 | while theta_deg <= 90: 16 | theta = math.radians(theta_deg) 17 | refVal = reflt0 + (float(1-reflt0)) * 
(1-math.cos(theta))**5 18 | RefltResult.append(round(refVal,6)) 19 | theta_deg += 1 20 | 21 | return RefltResult 22 | 23 | def _vec2d_dist(p1, p2): 24 | return (p1[0] - p2[0])**2 + (p1[1] - p2[1])**2 25 | 26 | def _vec2d_sub(p1, p2): 27 | return (p1[0]-p2[0], p1[1]-p2[1]) 28 | 29 | def _vec2d_mult(p1, p2): 30 | return p1[0]*p2[0] + p1[1]*p2[1] 31 | 32 | def ramerdouglas(line, dist): 33 | if len(line) < 3: 34 | return line 35 | 36 | begin, end = (line[0], line[-1]) if line[0] != line[-1] else (line[0], line[-2]) 37 | 38 | distSq = [] 39 | for curr in line[1:-1]: 40 | tmp = (_vec2d_dist(begin, curr) - _vec2d_mult(_vec2d_sub(end, begin), _vec2d_sub(curr, begin)) ** 2 / _vec2d_dist(begin, end)) 41 | distSq.append(tmp) 42 | 43 | maxdist = max(distSq) 44 | if maxdist < dist ** 2: 45 | return [begin, end] 46 | 47 | pos = distSq.index(maxdist) 48 | return (ramerdouglas(line[:pos + 2], dist) + ramerdouglas(line[pos + 1:], dist)[1:]) 49 | 50 | def create_ramp_for_schlick(normal_reflectivity, max_points=100): 51 | """ 52 | Create a ramp approximating Schlick's approximation. max_points is the maximum number 53 | of points to place on the ramp. 54 | """ 55 | remap_node = pm.shadingNode('remapValue', asUtility=True) 56 | 57 | sampler_info = pm.shadingNode('samplerInfo', asUtility=True) 58 | sampler_info.attr('facingRatio').connect(remap_node.attr('inputValue')) 59 | 60 | # Calculate Fresnel curve 61 | schlick_list = schlick_aprox(normal_reflectivity) 62 | 63 | # Compensate for non-linear facingRatio 64 | linearValues = [float(i)/90 for i in range(91)] 65 | raw_values = [math.sin(linearValues[i]*90*math.pi/180) for i in range(91)] 66 | raw_values.reverse() 67 | 68 | # Keep decreasing precision until we reduce the curve to the maximum number of points. 
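`schlick_aprox` above samples Schlick's approximation of Fresnel reflectance, R(theta) = R0 + (1 - R0) * (1 - cos theta)^5, once per degree from 0 to 90. The core evaluation in isolation:

```python
import math

def schlick(r0, theta_deg):
    # r0 is the reflectance at normal incidence (theta = 0); the curve
    # rises to 1.0 at grazing angles (theta = 90 degrees).
    theta = math.radians(theta_deg)
    return r0 + (1.0 - r0) * (1.0 - math.cos(theta)) ** 5
```

The per-degree samples are then thinned by `ramerdouglas` so the resulting remapValue ramp stays under `max_points` entries.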
69 | myline = zip(raw_values, schlick_list) 70 | precision = 0.00005 71 | simplified = None 72 | while simplified is None or len(simplified) > max_points: 73 | simplified = ramerdouglas(myline, dist=precision) 74 | precision *= 2 75 | 76 | # Remove default values 77 | pm.removeMultiInstance(remap_node.attr('value[0]'), b=1) 78 | pm.removeMultiInstance(remap_node.attr('value[1]'), b=1) 79 | 80 | for i in simplified: 81 | currentSize = pm.getAttr(remap_node.attr('value'), size=1) 82 | 83 | # First and last values with linear interpolation, others with spline interpolation. 84 | if simplified.index(i) == 0 or simplified.index(i) == len(simplified)-1: 85 | interp = 1 86 | else: 87 | interp = 3 88 | 89 | attr = remap_node.attr('value[%i]' % (currentSize+1)) 90 | pm.setAttr(attr, i[0],i[1], interp, type='double3') 91 | 92 | return remap_node.attr('outValue') 93 | 94 | 95 | 96 | -------------------------------------------------------------------------------- /scripts/dsonimport/materials/texture_manager.py: -------------------------------------------------------------------------------- 1 | import logging, os, shutil, subprocess, sys 2 | from pprint import pprint, pformat 3 | from maya import mel 4 | from dsonimport import util 5 | from dsonimport import maya_helpers as mh 6 | import pymel.core as pm 7 | 8 | log = logging.getLogger('DSONImporter.Materials') 9 | 10 | def _remove_ext(path): 11 | parts = path.split('.')[:-1] 12 | if not parts: 13 | return path 14 | return '.'.join(parts) 15 | 16 | def _get_ext(path): 17 | ext = path.split('.')[-1].lower() 18 | return ext 19 | 20 | def _has_src_file_changed(src, dst): 21 | """ 22 | Return true if src is newer than dst, or if dst doesn't exist. 23 | """ 24 | if not os.path.exists(src) or not os.path.exists(dst): 25 | return True 26 | 27 | return os.stat(src).st_mtime > os.stat(dst).st_mtime 28 | 29 | def _convert_image(src, dst): 30 | # Create a temporary filename in the same directory.
31 |     path = os.path.dirname(dst)
32 |     filename = os.path.basename(dst)
33 |     temp_path = '%s/temp_%s' % (path, filename)
34 | 
35 |     # Make sure the temp file doesn't exist, so we can check that it was created below.
36 |     if os.path.exists(temp_path):
37 |         os.unlink(temp_path)
38 | 
39 |     # Use imconvert to convert files. It would be cleaner to do this with something like PIL,
40 |     # but that's not installed with Maya's Python.
41 |     binary_path = os.environ['MAYA_LOCATION']
42 |     exe = '%s/bin/imconvert' % binary_path
43 | 
44 |     startupinfo = subprocess.STARTUPINFO()
45 |     startupinfo.dwFlags = subprocess.STARTF_USESTDHANDLES | subprocess.STARTF_USESHOWWINDOW
46 | 
47 |     p = subprocess.Popen([
48 |         exe,
49 |         '-compress', 'lzw',
50 |         src,
51 |         temp_path],
52 |         startupinfo=startupinfo, stderr=subprocess.PIPE, stdout=subprocess.PIPE)
53 |     out, err = p.communicate()
54 | 
55 |     # This stupid tool doesn't report errors correctly, so we have to see whether the output file
56 |     # exists to try to guess if it actually worked.
57 |     if not os.path.exists(temp_path):
58 |         log.error('Error converting %s -> %s', src, dst)
59 |         log.error('%s%s', out, err)
60 |         raise RuntimeError('Error converting %s -> %s' % (src, dst))
61 |     if os.path.exists(dst): os.unlink(dst) # os.rename can't overwrite an existing file on Windows
62 |     try:
63 |         # Rename the file to its final filename, overwriting any file that was there before.
64 |         os.rename(temp_path, dst)
65 |     except:
66 |         os.unlink(temp_path)
67 |         raise
68 | 
69 | def _copy_images_to_path(path, output_path):
70 |     """
71 |     Given a list of paths, copy them to output_path, converting them to the same file
72 |     type and giving them a Mudbox tile filename pattern.
73 |     """
74 |     assert len(path) > 0
75 | 
76 |     # Pick an arbitrary filename to use as the base for the path. We could base this on the
77 |     # material slot, if we wanted to propagate that down from the caller.
78 |     filename = [_remove_ext(os.path.basename(absolute_path)) for absolute_path in path if absolute_path is not None]
79 |     assert len(filename) > 0
80 |     filename = filename[0]
81 | 
82 |     # Remove single quotes from filenames. It triggers a bug in Arnold TX conversion.
83 |     filename = filename.replace('\'', '')
84 | 
85 |     # Check file extensions. Mudbox patterns can't handle files having different extensions,
86 |     # so if the input files use more than one file type we need to convert some of them.
87 |     extensions = { _get_ext(absolute_path) for absolute_path in path if absolute_path is not None }
88 |     assert len(extensions) > 0
89 |     if len(extensions) == 1 and list(extensions)[0].lower() in ('exr', 'hdr'):
90 |         # Leave 32-bit file types alone.
91 |         ext = list(extensions)[0]
92 |     else:
93 |         # Convert files to TIFF. Just always convert and don't try to be clever, since there
94 |         # are too many edge cases. For example, we might be called twice, first with
95 |         # ['a.jpg', None, 'c.jpg'] and then with ['a.jpg', 'b.tif', 'c.jpg'], and if we
96 |         # copy the files as .jpg the first time around we'll end up with two copies of
97 |         # the file.
98 |         ext = 'tif'
99 | 
100 |     # if len(extensions) == 1:
101 |     #     # Only one file type is used, so we can just copy the files.
102 |     #     ext = list(extensions)[0]
103 |     # else:
104 |     #     # If we have more than one file type, convert all files to TIFF. We could try to be clever
105 |     #     # and convert to the most-used file extension, but different file types support different
106 |     #     # features and this could do unwanted things. We could check for EXR/HDR to see if we need
107 |     #     # to convert to a different format for 32-bit textures. We don't fix bad extensions, like *.TIF
108 |     #     # files named "*.TIFF" (fix them in the source if you want to prevent a conversion).
109 |     #     log.info('Material\'s textures use more than one file type (%s). Converting to TIF.', ', '.join(extensions))
110 |     #     ext = 'tif'
111 | 
112 |     for idx, src_path in enumerate(path):
113 |         if src_path is None:
114 |             continue
115 | 
116 |         dst_path = '%s/%s.%i.1.%s' % (output_path, filename, idx+1, ext)
117 | 
118 |         if not _has_src_file_changed(src_path, dst_path):
119 |             # The target file already exists and the source hasn't changed, so save time and skip
120 |             # the copy/conversion.
121 |             continue
122 | 
123 |         file_ext = _get_ext(src_path)
124 |         if file_ext == ext:
125 |             # The file is already in this format, so just copy it.
126 |             shutil.copyfile(src_path, dst_path)
127 |         else:
128 |             _convert_image(src_path, dst_path)
129 | 
130 |     return '%s/%s.<U>.<V>.%s' % (output_path, filename, ext)
131 | 
132 | class TextureManager(object):
133 |     """
134 |     This class handles creating file nodes for textures.
135 | 
136 |     - Many textures will be used by more than one material. We'll keep track of the nodes we've
137 |       created and reuse them. We don't currently serialize this, so if multiple renderer materials
138 |       are created (eg. viewport and mental ray), we'll still create duplicate nodes.
139 |     - Some renderers don't support explicit tiles. In this case, we need to copy off all textures
140 |       that would use it so they use Mudbox tiling. This can lead to duplicated texture files, if the
141 |       same texture is used in multiple UV positions.
142 |     """
143 |     def __init__(self, find_file):
144 |         """
145 |         find_file is a function that takes a path as a parameter, and returns the absolute
146 |         path to the image.
147 |         """
148 |         self.find_file = find_file
149 |         self.path = None
150 |         self.created_textures = {}
151 | 
152 |     def set_path(self, path):
153 |         """
154 |         Set the directory to store generated/converted textures in.
155 |         """
156 |         self.path = path
157 | 
158 |     def _create_texture(self, path):
159 |         # Get the first non-None path to use as the node name.
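The copy loop above only reconverts a texture when the source is newer than the destination. That staleness check can be exercised outside Maya; this sketch pins mtimes with os.utime so the result is deterministic (paths and names are illustrative):

```python
import os
import tempfile

def has_src_file_changed(src, dst):
    # Mirrors _has_src_file_changed(): True if dst is missing or src is newer.
    if not os.path.exists(src) or not os.path.exists(dst):
        return True
    return os.stat(src).st_mtime > os.stat(dst).st_mtime

work = tempfile.mkdtemp()
src = os.path.join(work, 'src.jpg')
dst = os.path.join(work, 'dst.tif')

open(src, 'w').close()
needs_initial_convert = has_src_file_changed(src, dst)   # dst doesn't exist yet

open(dst, 'w').close()
os.utime(src, (1000, 1000))   # (atime, mtime): src older than dst
os.utime(dst, (2000, 2000))
up_to_date = not has_src_file_changed(src, dst)

os.utime(src, (3000, 3000))   # source re-exported after the last conversion
needs_reconvert = has_src_file_changed(src, dst)
```

This is the same make-style freshness rule: the conversion only runs when its output is missing or stale.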
160 |         first_path = [p for p in path if p is not None][0]
161 | 
162 |         path = [self.find_file(p)[1] if p is not None else None for p in path]
163 |         assert len(path) > 0
164 | 
165 |         texture, place = mh.create_file_2d()
166 | 
167 |         path_node_name = 'tex_' + mh.make_node_name_from_filename(first_path)
168 |         pm.rename(texture, path_node_name)
169 | 
170 |         util.mkdir_p(self.path)
171 | 
172 |         # self.mudbox_tiles = False
173 |         self.mudbox_tiles = True
174 |         if len(path) > 1:
175 |             # There are multiple tiles for this texture.
176 |             if self.mudbox_tiles:
177 |                 texture.attr('uvTilingMode').set(2) # Mudbox (1-based)
178 |                 pattern_path = _copy_images_to_path(path, self.path)
179 |                 pattern_path = pattern_path.replace('\\', '/')
180 | 
181 |                 # This doesn't make much sense, but this is what it expects in Mudbox mode.
182 |                 texture.attr('fileTextureName').set(pattern_path.replace('<U>', '1').replace('<V>', '1'))
183 |                 texture.attr('fileTextureNamePattern').set(pattern_path)
184 |             else:
185 |                 texture.attr('uvTilingMode').set(4) # explicit tiles
186 | 
187 |                 # Entries that are None don't exist for that material. We'll skip over that entry in
188 |                 # explicitUvTilePosition, so the UVs stay lined up, but not in the explicitUvTiles list,
189 |                 # so we don't leave blank file entries that the file node may not understand.
190 |                 for idx, absolute_path in enumerate(path):
191 |                     if absolute_path is None:
192 |                         continue
193 |                     absolute_path = absolute_path.replace('\\', '/')
194 |                     if idx == 0:
195 |                         texture.attr('fileTextureName').set(absolute_path)
196 |                     else:
197 |                         tile = texture.attr('explicitUvTiles').elementByLogicalIndex(idx-1)
198 |                         tile.attr('explicitUvTileName').set(absolute_path)
199 |                         tile.attr('explicitUvTilePosition').set((idx, 0))
200 | 
201 |             mel.eval('generateUvTilePreview %s' % texture.name())
202 |         else:
203 |             # Hack: Arnold in Maya 2017 breaks if texture filenames contain '. In that case, always
204 |             # copy the file as if this is a tiled texture so we can rename it.
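The Mudbox branch above copies each tile out to a numbered filename and hands the file node a tokenized pattern. A sketch of that naming scheme, assuming Maya's <U>/<V> tile tokens (the exact token spelling is an assumption here, and the helper names are illustrative):

```python
def mudbox_tile_filenames(output_path, base, count, ext):
    # One destination file per tile, matching the copy step in
    # _copy_images_to_path(): u starts at 1, v is always 1.
    return ['%s/%s.%i.1.%s' % (output_path, base, idx + 1, ext) for idx in range(count)]

def mudbox_pattern(output_path, base, ext):
    # The pattern handed to the file node, with tile indices replaced by tokens.
    return '%s/%s.<U>.<V>.%s' % (output_path, base, ext)

names = mudbox_tile_filenames('/textures', 'skin', 3, 'tif')
pattern = mudbox_pattern('/textures', 'skin', 'tif')
```

Substituting 1 for each token in the pattern yields the first tile's filename, which is what the fileTextureName attribute gets in Mudbox mode.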
205 | filename = path[0] 206 | if '\'' in filename: 207 | dst_path = '%s/%s' % (self.path, os.path.basename(filename.replace('\'', ''))) 208 | if _has_src_file_changed(filename, dst_path): 209 | log.debug('Copying %s to %s to work around Arnold bug...', filename, dst_path) 210 | shutil.copyfile(filename, dst_path) 211 | 212 | filename = dst_path 213 | 214 | texture.attr('fileTextureName').set(filename) 215 | 216 | return texture, place 217 | 218 | def find_or_create_texture(self, path, alphaIsLuminance=False, alphaGain=1.0, 219 | colorGain=(1,1,1), defaultColor=(0,0,0), colorSpace='sRGB', 220 | horiz_tiles=1, vert_tiles=1, horiz_offset=0, vert_offset=0): 221 | if isinstance(path, list): 222 | path = tuple(path) 223 | assert isinstance(path, tuple), path 224 | 225 | key = { 226 | 'alphaIsLuminance': alphaIsLuminance, 227 | 'alphaGain': alphaGain, 228 | 'colorGain': colorGain, 229 | 'colorSpace': colorSpace, 230 | 'path': path, 231 | 'horiz_tiles': horiz_tiles, 232 | 'vert_tiles': vert_tiles, 233 | 'horiz_offset': horiz_offset, 234 | 'vert_offset': vert_offset, 235 | 'defaultColor': defaultColor, 236 | } 237 | 238 | key = tuple(sorted(key.items())) 239 | if key in self.created_textures: 240 | return self.created_textures[key] 241 | 242 | # We don't have the texture imported. Search the Daz3d path for it and import it. 
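The caching in find_or_create_texture() above normalizes the path and all texture settings into one hashable key, so identical requests reuse a single file node. A minimal sketch of that dedup idea (a plain object() stands in for the Maya node, and the names are illustrative):

```python
_created_textures = {}

def find_or_create(path, **settings):
    # Normalize the path to a hashable tuple and fold the settings into the key,
    # so two requests for the same texture with the same settings share one node.
    if isinstance(path, list):
        path = tuple(path)
    key = (path,) + tuple(sorted(settings.items()))
    if key in _created_textures:
        return _created_textures[key]
    node = object()  # stand-in for the Maya file node
    _created_textures[key] = node
    return node

a = find_or_create(['skin.tif'], colorSpace='sRGB', alphaGain=1.0)
b = find_or_create(('skin.tif',), alphaGain=1.0, colorSpace='sRGB')  # same key, reordered
c = find_or_create(('skin.tif',), colorSpace='Raw', alphaGain=1.0)   # different settings
```

Sorting the settings items makes the key order-independent, which is why the real code builds the key from sorted dict items rather than the dict itself.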
243 | texture, place = self._create_texture(path) 244 | 245 | texture.attr('alphaIsLuminance').set(alphaIsLuminance) 246 | texture.attr('alphaGain').set(alphaGain) 247 | texture.attr('colorGain').set(colorGain) 248 | texture.attr('colorSpace').set(colorSpace) 249 | texture.attr('defaultColor').set(defaultColor) 250 | texture.attr('ignoreColorSpaceFileRules').set(True) 251 | 252 | place.attr('repeatU').set(horiz_tiles) 253 | place.attr('repeatV').set(vert_tiles) 254 | place.attr('offsetU').set(horiz_offset) 255 | place.attr('offsetV').set(vert_offset) 256 | 257 | # log.debug('Created texture %s: %s', texture, key) 258 | self.created_textures[key] = texture 259 | return texture 260 | 261 | 262 | -------------------------------------------------------------------------------- /scripts/dsonimport/mayadson.py: -------------------------------------------------------------------------------- 1 | import logging, weakref 2 | from pymel import core as pm 3 | import maya_helpers as mh 4 | 5 | log = logging.getLogger('DSONImporter') 6 | 7 | def set_maya_transform_attrs(dson_node, mesh_set): 8 | """ 9 | Record where this transform came from. This makes it easier to figure out what nodes are 10 | in later auto-rigging. 11 | """ 12 | if not dson_node.maya_node: 13 | return 14 | 15 | assert dson_node.maya_node.exists(), dson_node 16 | 17 | maya_node = dson_node.maya_node 18 | 19 | if dson_node.is_top_node: 20 | pm.addAttr(maya_node, longName='dson_top_node', at=bool) 21 | pm.setAttr(maya_node.attr('dson_top_node'), True) 22 | 23 | conform_target = dson_node.get('conform_target') 24 | if conform_target is not None: 25 | conform_target = conform_target.load_url() 26 | 27 | # Save this conform in an attribute for reference. 
28 | pm.addAttr(dson_node.maya_node, longName='following_figure', at='message', niceName='Following figure') 29 | conform_target.maya_node.attr('message').connect(dson_node.maya_node.attr('following_figure')) 30 | 31 | # Connect the main DSON transform to each of its meshes, to make them easier to find 32 | # in scripts later. 33 | if mesh_set is not None: 34 | pm.addAttr(dson_node.maya_node, longName='dson_meshes', at='message', niceName='DSON meshes') 35 | for loaded_mesh in mesh_set.meshes.values(): 36 | if loaded_mesh.maya_mesh is None: 37 | continue 38 | 39 | pm.addAttr(loaded_mesh.maya_mesh, longName='dson_transform', at='message', niceName='DSON transform') 40 | dson_node.maya_node.attr('dson_meshes').connect(loaded_mesh.maya_mesh.attr('dson_transform')) 41 | 42 | # Store the node type, to allow distinguishing nodes that represent eg. modifiers. 43 | pm.addAttr(maya_node, longName='dson_type', dt='string', niceName='DSON type') 44 | if dson_node.node_source == 'node': 45 | pm.setAttr(maya_node.attr('dson_type'), dson_node.get_value('type')) 46 | else: 47 | pm.setAttr(maya_node.attr('dson_type'), dson_node.node_source) 48 | 49 | # Store the node's parent's name. For modifiers, this won't be the same as the Maya 50 | # parent. 51 | pm.addAttr(maya_node, longName='dson_parent_name', dt='string', niceName='DSON parent name') 52 | if dson_node.parent and 'name' in dson_node.parent: 53 | maya_node.attr('dson_parent_name').set(dson_node.parent.get_value('name')) 54 | 55 | if dson_node.node_source == 'modifier': 56 | # For modifiers, store a direct connection to the parent. We don't do this for all 57 | # nodes since it doesn't seem useful and adds a ton of DG connections. 
58 | pm.addAttr(maya_node, longName='dson_parent', at='message', niceName='DSON parent') 59 | if dson_node.parent and dson_node.parent.maya_node: 60 | dson_node.parent.maya_node.attr('message').connect(dson_node.maya_node.attr('dson_parent')) 61 | 62 | pm.addAttr(maya_node, longName='dson_url', dt='string', niceName='DSON URL') 63 | pm.setAttr(maya_node.attr('dson_url'), dson_node.url) 64 | 65 | if dson_node.asset: 66 | pm.addAttr(maya_node, longName='dson_asset_name', dt='string', niceName='DSON asset name') 67 | pm.setAttr(maya_node.attr('dson_asset_name'), dson_node.asset.get_value('name')) 68 | 69 | pm.addAttr(maya_node, longName='dson_asset_url', dt='string', niceName='DSON asset URL') 70 | pm.setAttr(maya_node.attr('dson_asset_url'), dson_node.asset_url) 71 | 72 | def get_maya_property_name(prop, ignore_channel=False): 73 | """ 74 | Given a property, return a reasonable Maya name to use for it. 75 | If ignore_channel is True, return the property for the whole vector, eg. return 76 | '.translate' instead of '.translateX'. 77 | 78 | This doesn't create or query anything. It just generates a name to use elsewhere. 79 | """ 80 | prop_parts = prop.path.split('/') 81 | 82 | # Get the property key, without any channel suffixes attached. 83 | prop_key = prop_parts[0] 84 | mapping = { 85 | 'translation': 'translate', 86 | 'rotation': 'rotate', 87 | 'scale': 'scale', 88 | } 89 | maya_key = None 90 | if prop_key in mapping: 91 | prop_key = mapping[prop_key] 92 | 93 | if prop.path.count('/') == 1 and not ignore_channel: 94 | # If we've been given a single channel, eg. rotation/x, return it. 95 | assert len(prop_parts) == 2, prop_parts 96 | assert prop_parts[1] in ('x', 'y', 'z'), prop_parts 97 | return '%s%s' % (prop_key, prop_parts[1].upper()) 98 | else: 99 | # Otherwise, return the vector itself. 
100 | return prop_key 101 | 102 | _transforms_on_node = weakref.WeakKeyDictionary() 103 | def create_transform_with_parents(parent, name, internal_control=False): 104 | """ 105 | Create a transform, and any transforms above it that don't exist. 106 | """ 107 | for part in name.split('|'): 108 | # Keep track of the transforms we create. This way, if Maya renames part of the 109 | # path, we'll use the node for the next node created inside it, rather than trying 110 | # to find the original name and creating a new one. 111 | part = mh.cleanup_node_name(part) 112 | transforms_on_parent = _transforms_on_node.setdefault(parent, {}) 113 | node = transforms_on_parent.get(part) 114 | if node is not None: 115 | parent = node 116 | continue 117 | 118 | # This group doesn't exist. Recurse, to make sure its parent exists. 119 | node = pm.createNode('transform', p=parent, n=part) 120 | 121 | # Hide the transform properties. This is a data node, so they're just noise. 122 | mh.hide_common_attributes(node, transforms=True, visibility=True) 123 | 124 | if internal_control: 125 | mh.config_internal_control(node) 126 | 127 | transforms_on_parent[part] = node 128 | parent = node 129 | 130 | return node 131 | 132 | def _pretty_print_label(label): 133 | # Rename "X Translate" to "Translate X", to match Maya conventions. 134 | parts = label.split() 135 | if len(parts) == 2 and parts[0] in ('X', 'Y', 'Z'): 136 | return ' '.join([parts[1], parts[0]]) 137 | return label 138 | 139 | def _get_placeholder_group_for_property(node, grouping): 140 | """ 141 | Return a Maya node path to store data for this DSON node. For example: 142 | 143 | |Controls|Genesis3Female|eCTRLMouthSmile 144 | """ 145 | ancestors = [] 146 | next_node = node 147 | while next_node: 148 | if next_node.is_root or next_node.is_top_node: 149 | # Don't include the scene/library node. 
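get_maya_property_name() above maps DSON property paths onto Maya attribute names. Restated as a standalone function over plain path strings (a simplification: the real code takes a DSONProperty object, not a string):

```python
def maya_property_name(prop_path, ignore_channel=False):
    # 'rotation/x' -> 'rotateX'; with ignore_channel, the whole vector: 'rotate'.
    # Unrecognized keys (e.g. modifier channels) pass through unchanged.
    mapping = {'translation': 'translate', 'rotation': 'rotate', 'scale': 'scale'}
    parts = prop_path.split('/')
    key = mapping.get(parts[0], parts[0])
    if len(parts) == 2 and not ignore_channel:
        assert parts[1] in ('x', 'y', 'z'), parts
        return '%s%s' % (key, parts[1].upper())
    return key
```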
150 | break 151 | 152 | name = mh.cleanup_node_name(next_node.get_label()) 153 | 154 | ancestors.append(name) 155 | next_node = next_node.parent 156 | 157 | # Group everything inside top_group, eg. "Controls", and then inside the top node. 158 | ancestors.append(grouping) 159 | 160 | ancestors.reverse() 161 | 162 | return '|'.join(ancestors) 163 | 164 | def _get_control_group_for_property(prop): 165 | # There are two groups of modifier properties: ones with regions and ones without. If 166 | # we just concatenate them we get something reasonable, but pretty sparse with a few 167 | # controls scattered among a lot of controls: 168 | # 169 | # |Pose_Controls|Head|Eyes 170 | # |Pose_Controls|Head|Mouth 171 | # |Pose_Controls|Head|Mouth|Lips 172 | # |Eyes|Real_World 173 | # |Mouth|Real_World 174 | # 175 | # Clean this up so it makes more sense in our tree organization. Remove "Pose_Controls", 176 | # moving its contents up. Move the root controls that we know match with groups inside 177 | # Head into Head, so |Eyes|Real_World becomes |Head|Eyes|Real_World. 178 | group_name = '' 179 | region = prop.node.get_value('region', '') 180 | if region: 181 | group_name += region 182 | 183 | group = prop.node.get_value('group', '').replace('/', '|') 184 | if group: 185 | if group.startswith('|'): 186 | group = group[1:] 187 | if group_name: 188 | group_name += '|' 189 | group_name += group 190 | 191 | parts = group_name.split('|') 192 | 193 | # Remove "Actor" prefixes. 194 | if len(parts) > 0 and parts[0] == 'Actor': 195 | parts = parts[1:] 196 | 197 | # Shaping groups (controls not inside Pose Controls) tend to be small, since we usually 198 | # don't create shaping controls for a lot of controls. For example, it's common to have 199 | # "|Eyes|Real World", with nothing inside Eyes. Remove everything beyond the first entry 200 | # if this isn't inside Pose Controls or Morphs. 
201 | if len(parts) > 0 and parts[0] not in ('Pose Controls', 'Morphs'): 202 | parts[1:] = [] 203 | 204 | # Remove the "Pose Controls" or "Morphs" prefix. 205 | if len(parts) > 0 and parts[0] in ('Pose Controls', 'Morphs'): 206 | parts = parts[1:] 207 | 208 | # Move shaping controls in the root to inside the Head group. 209 | if len(parts) > 0 and parts[0] in ('Eyes', 'Mouth', 'Nose', 'Brow'): 210 | parts[0:0] = ['Head'] 211 | 212 | group_name = '|'.join(parts) 213 | if group_name: 214 | group_name = '|' + group_name 215 | return group_name 216 | 217 | def create_attribute_on_control_node(prop, initial_value): 218 | search_node = prop.node.find_top_node() 219 | 220 | # Add the modifier group, if any. We'll usually have a "group", eg. "/Morphs", and 221 | # we may have a region, eg. "Legs". Prefix the region if we have one. 222 | group_name = _get_control_group_for_property(prop) 223 | 224 | # Create the control node for this Maya node if we haven't yet. 225 | control_output_node = create_transform_with_parents(search_node.maya_node, 'Controls'+group_name) 226 | 227 | property_name = mh.cleanup_node_name('%s_%s' % (prop.node.node_id, prop.get_label())) 228 | 229 | # Create the attribute. 230 | if isinstance(prop.value, list): 231 | # This code path isn't currently used. 232 | raise RuntimeError('test me') 233 | 234 | # We're expecting vector3s, eg. translation, rotation, etc. values. Sometimes DSON files contain 235 | # a partial vector, with only one or two axes actually in the array. Check that this actually 236 | # looks like an x/y/z property. 237 | assert len(prop.value) <= 3, 'Property length not handled: %s' % prop 238 | for sub_prop in prop.value: 239 | assert sub_prop['id'] in ('x', 'y', 'z'), 'Array property not handled: %s' % prop 240 | 241 | # This is an array property, which is usually eg. a translation or rotation. We want 242 | # to group these into vector3s, but DSON doesn't have full properties for the group as 243 | # a whole, eg. 
it doesn't have a label. All we have is the top-level key, like "translation". 244 | pm.addAttr(control_output_node, longName=property_name, shortName=property_name, niceName=property_name, at='double3') # niceName=niceName, 245 | # Group the sub-properties by ID. 246 | sub_properties = {sub_prop['id']: sub_prop for sub_prop in prop.value} 247 | 248 | property_names = [] 249 | for axis in ('x', 'y', 'z'): 250 | # We'll set up all three axes, even if some of them aren't listed in the property. 251 | sub_prop = sub_properties.get(axis, { 252 | 'name': axis, 253 | }) 254 | 255 | sub_property_name = '%s%s' % (property_name, axis.upper()) 256 | property_names.append(sub_property_name) 257 | label = _pretty_print_label(sub_prop.get_label()) 258 | 259 | pm.addAttr(control_output_node, shortName=sub_property_name, longName=sub_property_name, at='float', p=property_name, niceName=label) 260 | 261 | # Set up attribute properties. Maya gets confused if we don't add all three sub-attributes 262 | # before doing this. 263 | for sub_property_name in property_names: 264 | pm.setAttr(control_output_node.attr(sub_property_name), e=True, keyable=True) 265 | attr = control_output_node.attr(property_name) 266 | elif isinstance(prop.value, dict): 267 | # This is a modifier channel. 268 | attr = mh.addAttr(control_output_node, longName=property_name, shortName=property_name, niceName=prop.get_label(), at='float') 269 | 270 | if prop.get_value('visible', True): 271 | pm.setAttr(attr, e=True, keyable=True) 272 | else: 273 | pm.setAttr(attr, e=True, cb=False) 274 | 275 | mh.set_or_connect(attr, initial_value) 276 | 277 | # Create an extra hidden node, and pass the user control values through it instead of using them 278 | # directly. Otherwise, all of the outputs will show up in the CB when the control node is 279 | # selected. There can be hundreds of these and they're all our internal math nodes, so it makes 280 | # a mess. 
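The vector branch of create_attribute_on_control_node() above fills in any axes missing from a partial DSON vector, so the Maya double3 attribute always gets x/y/z children. The grouping step sketched standalone (plain dicts stand in for DSON sub-properties; names are illustrative):

```python
def group_axes(sub_props):
    # DSON files sometimes store a partial vector (e.g. only 'x' and 'z').
    # Fill in the missing axes so the attribute always gets three children.
    by_id = {p['id']: p for p in sub_props}
    return [by_id.get(axis, {'id': axis}) for axis in ('x', 'y', 'z')]
```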
281 | control_value_node = create_transform_with_parents(search_node.maya_node, 'ControlValues'+group_name, internal_control=True) 282 | value_attr = mh.addAttr(control_value_node, longName=property_name, shortName=property_name, niceName=prop.get_label(), at='float') 283 | pm.connectAttr(attr, value_attr) 284 | attr = value_attr 285 | else: 286 | raise RuntimeError('unknown property type on %s (%s on %s)' % (prop, prop.path, prop.node)) 287 | 288 | return attr 289 | 290 | # Associated Maya attributes 291 | # 292 | # Properties may have Maya attributes associated to tell us where modifiers will connect to. 293 | # 294 | # A property can have different input and output attributes. This is used for rotations. 295 | # Modifiers that change rotation write directly to the rotation, but modifiers that read 296 | # rotations may have a pose reader inserted to give more predictable values. The input 297 | # attribute is where properties write into the attribute; the output attribute is where 298 | # the attribute outputs to other properties. 299 | # 300 | # We may have a callback registered for a property instead of a Maya property PyNode. This 301 | # allows lazily creating the property, for properties that only need to exist if someone is 302 | # using them. If a node has separate input and output properties, we'll call 303 | class MayaPropertyAssociation(object): 304 | def __init__(self, input_attr, property_map=None): 305 | assert isinstance(input_attr, pm.PyNode), input_attr 306 | self._input_attr = input_attr 307 | self._output_attr = None 308 | self._property_map = property_map 309 | 310 | def _get_channel(self, maya_attr, prop): 311 | # If we have no property list, just use the attribute. 312 | if self._property_map is None: 313 | return maya_attr 314 | 315 | # Get a list of the channels on this Maya attribute. 316 | maya_channels = mh.get_channels_cached(maya_attr) 317 | 318 | # If this is the second channel on this property, eg. 
translate/y, return the second Maya attribute, 319 | # eg. translateY. 320 | maya_channel_idx = self._property_map.index(prop.last_path) 321 | return maya_channels[maya_channel_idx] 322 | 323 | def _get_input_attr(self, prop): 324 | # Look up the Maya property name for this channel. 325 | return self._get_channel(self._input_attr, prop) 326 | 327 | def get_output_attr(self, prop): 328 | # Create the output if we haven't yet. 329 | if self._output_attr is None: 330 | self._output_attr = self._create_output_attr() 331 | 332 | # Look up the Maya property name for this channel. 333 | return self._get_channel(self._output_attr, prop) 334 | 335 | def set_value(self, prop, value): 336 | # The output of the DSONProperty is the input to the Maya attribute. 337 | output_attr = self._get_input_attr(prop) 338 | 339 | mh.set_or_connect(output_attr, value) 340 | 341 | # Mark the property as driven if we've connected something to it, but not if it's constant. 342 | if isinstance(value, pm.PyNode): 343 | mh.config_driven_control(output_attr.node()) 344 | 345 | def _create_output_attr(self): 346 | # By default, the output attribute is the same as the input attribute. 347 | return self._input_attr 348 | 349 | def _add_maya_attribute(prop, assoc): 350 | """ 351 | Assign a Maya property to a DSON property in this node. 352 | 353 | Some DSONProperties have corresponding Maya properties. Properties that do 354 | this will support modifiers. 355 | 356 | maya_path can be a PyNode pointing to a Maya property, or a function. 357 | If a function is given, it'll be called if we turn out to actually need the property. 358 | It should do any work needed to create it, and return the Maya property path. 359 | This can be used to avoid creating nodes when we support a property dynamically, 360 | but no modifiers are actually using it. 
361 | """ 362 | assert isinstance(assoc, MayaPropertyAssociation), assoc 363 | 364 | assert prop.path not in prop.node.property_maya_attributes, 'Property %s is already registered' % prop 365 | prop.node.property_maya_attributes[prop.path] = assoc 366 | 367 | def get_maya_output_attribute(prop): 368 | """ 369 | Given a DSON property, return the Maya attribute path to its value. The attribute 370 | will be created if it doesn't exist. 371 | """ 372 | # See if this is a property with an associated Maya attribute. 373 | assoc = prop.node.property_maya_attributes.get(prop.path) 374 | if assoc is not None: 375 | return assoc.get_output_attr(prop) 376 | 377 | return _create_property_placeholder(prop).get_output_attr(prop) 378 | 379 | def _create_property_placeholder(prop): 380 | # This is a property with no equivalent Maya attribute, such as a modifier value. Create 381 | # an attribute to hold the value. 382 | 383 | if prop.node.maya_node is None: 384 | # This property doesn't have a Maya node, either. Create a placeholder for it, and 385 | # set this as the node's Maya node. 386 | property_node_path = _get_placeholder_group_for_property(prop.node, grouping='Properties') 387 | prop.node.maya_node = create_transform_with_parents(prop.node.find_top_node().maya_node, property_node_path, internal_control=True) 388 | 389 | # Create an attribute to hold the value. 390 | property_name = 'Property_' + get_maya_property_name(prop) 391 | pm.addAttr(prop.node.maya_node, at='float', longName=property_name) 392 | 393 | # Save the new paths to property_maya_attributes, so we'll reuse them in later calls. 394 | value_node = pm.PyNode(prop.node.maya_node).attr(property_name) 395 | assert 'rotati' not in str(prop), prop 396 | assoc = MayaPropertyAssociation(value_node) 397 | _add_maya_attribute(prop, assoc) 398 | 399 | return assoc 400 | # return value_node 401 | 402 | # When we create attributes, we create the input and value properties at the same 403 | # time. 
Keep track of these, so we don't have to go searching for them when we 404 | # look it up again later. 405 | _input_attributes_set = weakref.WeakKeyDictionary() 406 | 407 | def set_final_property_value(prop, value): 408 | """ 409 | value may be a constant value or a PyNode attribute. 410 | """ 411 | # We only set final property values once, when we create formulas for the property. We may 412 | # set values on the underlying Maya attribute earlier, but that's set directly, not through here. 413 | input_attrs = _input_attributes_set.setdefault(prop.node, {}) 414 | assert prop.path not in input_attrs, 'Final property value set more than once: %s, value %s' % (prop, value) 415 | input_attrs[prop.path] = True 416 | 417 | if prop.get_value('clamped', False): 418 | # If this node is clamped, hook up clamping. This is important, since modifier formulas 419 | # may depend on it. We only create the clamp node when somebody actually asks us for 420 | # the input property. If we create this in get_maya_output_attribute for nodes that 421 | # don't need it, we'll leave attributes connected and unchangeable when there's nothing 422 | # connected to the clamp. 423 | min_value = prop.get_value('min') 424 | max_value = prop.get_value('max') 425 | 426 | if isinstance(value, pm.PyNode): 427 | name = mh.cleanup_node_name('Clamp_%s_%s' % (prop.node.node_id, prop['name'].value)) 428 | clamp_node = pm.createNode('clamp', n=name) 429 | 430 | pm.setAttr(clamp_node.attr('minR'), min_value) 431 | pm.setAttr(clamp_node.attr('maxR'), max_value) 432 | pm.connectAttr(value, clamp_node.attr('inputR')) 433 | value = clamp_node.attr('outputR') 434 | else: 435 | # This is just a constant value. Clamp the value directly, so we don't create an unneeded 436 | # clamp node. 437 | value = max(min_value, value) 438 | value = min(max_value, value) 439 | 440 | # Get the MayaPropertyAssociation for this property, creating one if it doesn't exist. 
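When the incoming value is a constant, the clamping above happens directly in Python rather than wiring up a Maya clamp node. That fast path is just:

```python
def clamp_value(value, min_value, max_value):
    # The constant fast path in set_final_property_value(): clamp in Python
    # instead of creating a clamp node that would never change.
    value = max(min_value, value)
    value = min(max_value, value)
    return value
```

Only PyNode (connected) values need the clamp node, since their value can change after import.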
441 | property_association = prop.node.property_maya_attributes.get(prop.path) 442 | if property_association is None: 443 | property_association = _create_property_placeholder(prop) 444 | 445 | property_association.set_value(prop, value) 446 | 447 | def set_attr_to_prop_vector(prop, maya_attr, dson_to_maya_attrs, property_association=None, dynamic=True): 448 | assert isinstance(prop.value, list) 449 | 450 | maya_channels = mh.get_channels_cached(maya_attr) 451 | for maya_channel_idx, channel_prop_name in enumerate(dson_to_maya_attrs): 452 | channel_prop = prop[channel_prop_name] 453 | maya_channel = maya_channels[maya_channel_idx] 454 | set_attr_to_prop(channel_prop, maya_channel, property_association=property_association, dynamic=dynamic) 455 | 456 | def set_attr_to_prop(prop, maya_attr, property_association=None, dynamic=True): 457 | """ 458 | Set a Maya attribute to the value of a DSONProperty, and make it dynamic, so 459 | when modifiers are applied they can affect this property. 460 | 461 | Note that the given node must be both input and output connectable, so modifiers 462 | can use it as both an input and an output. 463 | 464 | If dynamic is False, just set the attribute's property without making it dynamic. 
465 | """ 466 | value = prop.evaluate() 467 | if property_association is None: 468 | pm.setAttr(maya_attr, value) 469 | else: 470 | property_association.set_value(prop, value) 471 | 472 | if not dynamic: 473 | return 474 | 475 | if property_association is None: 476 | property_association = MayaPropertyAssociation(maya_attr) 477 | _add_maya_attribute(prop, property_association) 478 | 479 | -------------------------------------------------------------------------------- /scripts/dsonimport/modifiers_maya.py: -------------------------------------------------------------------------------- 1 | import logging 2 | from dson import modifiers 3 | 4 | import maya_helpers as mh 5 | import pymel.core as pm 6 | import mayadson 7 | 8 | log = logging.getLogger('DSONImporter') 9 | 10 | # This contains the Maya-specific logic for DSONModifier and DSONFormula. 11 | def _get_attr_for_property_value(formula, input_prop): 12 | result = mayadson.get_maya_output_attribute(input_prop) 13 | 14 | # Add the formula to the list of formulas that depend on this property. We don't 15 | # need this for evaluation, but it helps us clean up the scene later. 16 | input_prop.node.formulas_output.setdefault(input_prop.path, []).append(formula) 17 | 18 | return result 19 | 20 | class DSONMayaFormula(object): 21 | def __init__(self, value, is_maya_property): 22 | self.value = value 23 | self.is_maya_property = is_maya_property 24 | 25 | def create_maya_output_for_formula(formula): 26 | """ 27 | Create a Maya node network to evaluate this formula, and return a PyNode for 28 | an attribute path to its result. 29 | 30 | If the value of this expression is a constant, the constant will be returned 31 | instead of a PyNode. 32 | 33 | Note that if this formula simply pushes the value of another property, we'll just 34 | return that property's path and not create any extra nodes. 
35 | """ 36 | stack = [] 37 | 38 | for op in formula.operations: 39 | op_type = op['op'] 40 | if op_type == 'push': 41 | if 'prop' in op: 42 | # Look up the input property. 43 | input_prop = op['prop'] 44 | 45 | # Non-dynamic properties should be optimized out in optimize(), leaving us with only 46 | # dynamic ones. 47 | assert input_prop.is_dynamic 48 | 49 | # This is pointing to another dynamic DSONProperty. Push the attribute that will 50 | # hold the result of that property. Note that calling this will cause the property 51 | # holder node to be created, so we're careful to only do this now when we're really 52 | # creating the network, after optimize() has had a chance to optimize it out. 53 | input_attr = _get_attr_for_property_value(formula, input_prop) 54 | stack.append({ 55 | 'path': input_attr 56 | }) 57 | 58 | elif 'val' in op: 59 | # We're pushing a constant value. 60 | stack.append({ 61 | 'val': op['val'] 62 | }) 63 | 64 | elif op_type == 'mult': 65 | # Pop the top two values from the stack. 66 | assert len(stack) >= 2 67 | vals = [stack[-1], stack[-2]] 68 | stack = stack[:-2] 69 | 70 | # Create a multiplyDivide node to do the multiplication. 71 | math_node = mh.createNewNode('multiplyDivide', nodeName='DSON_Math') 72 | pm.setAttr(math_node.attr('operation'), 1) # multiply 73 | 74 | # Push the resulting multiplyDivide node onto the stack. 75 | stack.append({ 76 | 'path': math_node.attr('outputX'), 77 | }) 78 | 79 | # Set or connect the two inputs. 80 | # If we support more than a couple operations, we should handle the arguments more generically. 81 | for idx in xrange(len(vals)): 82 | op = vals[idx] 83 | if 'val' in op: 84 | value_attr = op['val'] 85 | # pm.setAttr(input_attr, op['val']) 86 | elif 'path' in op: 87 | # This is the Maya path to a node that we created previously. 
88 | value_attr = op['path'] 89 | # pm.connectAttr(op['path'], input_attr) 90 | else: 91 | raise RuntimeError('Unsupported \"push\" operand: %s' % op) 92 | 93 | mh.set_or_connect(math_node.attr('input%iX' % (idx+1)), value_attr) 94 | elif op_type in ('spline_tcb', 'spline_linear', 'spline_constant'): 95 | # The number of values in the array for each keyframe. We don't currently try 96 | # to support TCB values in spline keyframes, since nothing appears to use it. 97 | values_per_key_map = { 98 | 'spline_tcb': 5, 99 | 'spline_linear': 2, 100 | 'spline_constant': 2, 101 | } 102 | values_per_key = values_per_key_map[op_type] 103 | 104 | # The keyframe type on the remapValue node. 105 | types_per_key_map = { 106 | 'spline_tcb': 3, 107 | 'spline_linear': 1, 108 | 'spline_constant': 0, 109 | } 110 | keyframe_type = types_per_key_map[op_type] 111 | 112 | # Keyframes. These aren't supported yet, but parse them out. 113 | # XXX 114 | def pop(): 115 | assert len(stack) >= 1 116 | result = stack[-1] 117 | stack[-1:] = [] 118 | return result 119 | num_keyframes = pop() 120 | 121 | # We only expect to see a constant value for the keyframe count. It doesn't make sense 122 | # to reference a variable value. 123 | assert 'val' in num_keyframes 124 | num_keyframes = num_keyframes['val'] 125 | 126 | keyframes = [] 127 | for idx in xrange(num_keyframes): 128 | keyframe = pop() 129 | 130 | # We expect the keyframes to be constant. 131 | assert 'val' in keyframe, keyframe 132 | assert len(keyframe['val']) == values_per_key, keyframe 133 | keyframes.append(keyframe['val']) 134 | 135 | # The next stack entry is the input into the spline. We should pop it and replace it 136 | # with a reference to the result. Since we don't support this yet, we just pop it and 137 | # replace it with a dummy output value. 
138 | value = pop() 139 | 140 | # We should be using animCurveUU here, like set driven key, but Maya makes keyframes 141 | # unreasonably hard to use: unlike every other Maya node, you can't set properties on 142 | # it with setAttr (even though that's what .ma files do), so we can't create the node 143 | # and return the property like we need to. 144 | math_node = mh.createNewNode('remapValue', nodeName='DSON_Keyframe') 145 | 146 | for idx, keyframe in enumerate(keyframes): 147 | input_value = keyframe[0] 148 | output_value = keyframe[1] 149 | 150 | key = math_node.attr('value').elementByLogicalIndex(idx) 151 | pm.setAttr(key.attr('value_Position'), input_value) 152 | pm.setAttr(key.attr('value_FloatValue'), output_value) 153 | pm.setAttr(key.attr('value_Interp'), keyframe_type) 154 | 155 | # Set the input. 156 | if 'val' in value: 157 | value_attr = value['val'] 158 | elif 'path' in value: 159 | # This is the Maya path to a node that we created previously. 160 | value_attr = value['path'] 161 | else: 162 | raise RuntimeError('Unsupported \"push\" operand: %s' % value) 163 | 164 | mh.set_or_connect(math_node.attr('inputValue'), value_attr) 165 | 166 | stack.append({ 167 | 'path': math_node.attr('outValue'), 168 | }) 169 | else: 170 | raise RuntimeError('Unsupported op: %s' % op_type) 171 | 172 | # We should be left with one result. 
173 |     assert len(stack) == 1, 'Unbalanced formula stack'
174 |     result = stack[0]
175 | 
176 |     if 'val' in result:
177 |         return DSONMayaFormula(result['val'], is_maya_property=False)
178 |     else:
179 |         return DSONMayaFormula(result['path'], is_maya_property=True)
180 | 
181 | 
--------------------------------------------------------------------------------
/scripts/dsonimport/prefs.py:
--------------------------------------------------------------------------------
1 | import errno, logging, json, os
2 | import util
3 | 
4 | log = logging.getLogger('DSONImporter')
5 | 
6 | # The directory where we store settings, caches, and other persistent files. Is there a better
7 | # place to put this?
8 | storage_path = '%s/dsonimport' % os.environ['MAYA_APP_DIR']
9 | 
10 | def _prefs_file():
11 |     return '%s/prefs.js' % storage_path
12 | 
13 | def load_prefs():
14 |     try:
15 |         with open(_prefs_file()) as f:
16 |             data = f.read()
17 |     except IOError as e:
18 |         # Don't warn if the file doesn't exist.
19 |         if e.errno != errno.ENOENT:
20 |             log.warning('Error reading %s: %s', _prefs_file(), e)
21 |         return {}
22 | 
23 |     try:
24 |         return json.loads(data)
25 |     except ValueError as e:
26 |         log.warning('Error parsing %s: %s', _prefs_file(), e)
27 |         return {}
28 | 
29 | def save_prefs(prefs):
30 |     util.mkdir_p(storage_path)
31 |     data = json.dumps(prefs, indent=4)
32 |     try:
33 |         with open(_prefs_file(), 'w') as f:
34 |             f.write(data)
35 |     except IOError as e:
36 |         log.warning('Error saving %s: %s', _prefs_file(), e)
37 | 
38 | 
--------------------------------------------------------------------------------
/scripts/dsonimport/qt/import_window.ui:
--------------------------------------------------------------------------------
[Qt Designer XML; the markup was lost in extraction. Recoverable content: a
729x566 dialog titled "Dialog", containing a tab widget with "Modifiers" and
"Prefs" tabs and a horizontal QDialogButtonBox (Ok/Cancel) whose accepted()
and rejected() signals are connected to the dialog's accept() and reject()
slots.]
--------------------------------------------------------------------------------
/scripts/dsonimport/qt/modifier_list.ui:
--------------------------------------------------------------------------------
[Qt Designer XML; the markup was lost in extraction. Recoverable content: a
693x395 form with a filter field, an "X" clear button, and a tree view
promoted to the custom TreeView widget (a QTreeView subclass declared in
dsonimport/qtpy/qt_widgets/tree_view.h).]
--------------------------------------------------------------------------------
/scripts/dsonimport/qt/prefs_tab.ui:
--------------------------------------------------------------------------------
[Qt Designer XML; the markup was lost in extraction. Recoverable content: a
591x517 form with checkboxes labelled "Straighten pose", "Hide face rig",
"Create end joints", "Conform joints", "Conform meshes", "Geometry",
"Materials", "Morphs", "Skinning", "Modifiers", "Grafts", "Hide internal
controls in outliner", "Bake static morphs" and "Use cvWrap instead of Maya
wrap if available", plus a "Mesh splitting" combo box with the choices
"Smart split", "Don't split props" and "Never split".]
--------------------------------------------------------------------------------
/scripts/dsonimport/qtpy/.gitignore:
--------------------------------------------------------------------------------
1 | *.py
2 | 
--------------------------------------------------------------------------------
/scripts/dsonimport/qtpy/__init__.py:
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/zewt/zDsonImport/ae9beb87e09a6366e5f1b07a598478f20fa101ae/scripts/dsonimport/qtpy/__init__.py -------------------------------------------------------------------------------- /scripts/dsonimport/qtpy/qt_widgets/.gitignore: -------------------------------------------------------------------------------- 1 | # *.py is ignored in the parent directory, since they're generated, but scripts in this directory shouldn't be. 2 | !*.py 3 | -------------------------------------------------------------------------------- /scripts/dsonimport/qtpy/qt_widgets/__init__.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/zewt/zDsonImport/ae9beb87e09a6366e5f1b07a598478f20fa101ae/scripts/dsonimport/qtpy/qt_widgets/__init__.py -------------------------------------------------------------------------------- /scripts/dsonimport/qtpy/qt_widgets/tree_view.py: -------------------------------------------------------------------------------- 1 | import logging 2 | from dsonimport import Qt 3 | 4 | log = logging.getLogger('DSONImporter') 5 | 6 | class TreeView(Qt.QTreeView): 7 | def __init__(self, parent): 8 | super(TreeView, self).__init__(parent) 9 | 10 | self.is_during_mouse_release = False 11 | self.performing_edit_event_index = None 12 | self.performing_edit_event_item = None 13 | 14 | def edit(self, index, trigger, event): 15 | try: 16 | self.performing_edit_event_index = index 17 | if index.model() is not None: 18 | source_index = index.model().mapToSource(index) 19 | source_item = source_index.model().itemFromIndex(source_index) 20 | self.performing_edit_event_item = source_item 21 | 22 | return super(TreeView, self).edit(index, trigger, event) 23 | finally: 24 | self.performing_edit_event_index = None 25 | self.performing_edit_event_item = None 26 | 27 | def mouseReleaseEvent(self, event): 28 | 
self.is_during_mouse_release = True
29 |         try:
30 |             return super(TreeView, self).mouseReleaseEvent(event)
31 |         finally:
32 |             self.is_during_mouse_release = False
33 | 
34 |     def checkbox_toggled(self, item, value):
35 |         """
36 |         This is called by items when their checked state is changed by a call to setData.
37 |         If we're in the middle of an edit on that item, assume that the edit caused the
38 |         toggle and apply the same change to all selected items in the same column.
39 |         """
40 |         if value == Qt.Qt.PartiallyChecked:
41 |             return
42 | 
43 |         if self.performing_edit_event_item is not item:
44 |             return
45 | 
46 |         selected_indexes = item.view.selectedIndexes()
47 |         selected_indexes = [idx.model().mapToSource(idx) for idx in selected_indexes]
48 |         selected_items = [idx.model().itemFromIndex(idx) for idx in selected_indexes]
49 | 
50 |         # Only do this if a selected item is being changed.
51 |         if not any(item is selected_item for selected_item in selected_items):
52 |             return
53 | 
54 |         for other_item in selected_items:
55 |             if other_item is item:
56 |                 continue
57 | 
58 |             # Only change items in the same column as us.
59 |             if other_item.index().column() != item.index().column():
60 |                 continue
61 | 
62 |             if other_item.isCheckable():
63 |                 other_item.setData(value, Qt.Qt.CheckStateRole)
64 | 
65 |     def selectionCommand(self, index, event):
66 |         if self.is_during_mouse_release:
67 |             # Work around Qt. If a click on an item doesn't cause it to be selected
68 |             # (QAbstractItemView::mousePressEvent), QAbstractItemView::mouseReleaseEvent will
69 |             # try to select the item instead. We don't want that. If you click on a checkbox,
70 |             # the edit will eat the click so it won't cause a selection, so if we let this
71 |             # happen, clicking a checkbox when multiple items are selected will select just the
72 |             # clicked item after toggling the checkboxes.
73 |             return Qt.QItemSelectionModel.NoUpdate
74 | 
75 |         # if event.type() == Qt.QEvent.MouseButtonPress:
76 |         #     button = event.button()
77 |         #     modifiers = event.modifiers()
78 |         #     is_right_button = bool(button & Qt.Qt.RightButton)
79 |         #     shift = bool(modifiers & Qt.Qt.ShiftModifier)
80 |         #     control = bool(modifiers & Qt.Qt.ControlModifier)
81 |         #     selected = self.selectionModel().isSelected(index)
82 |         #     return Qt.QItemSelectionModel.ClearAndSelect
83 |         # if event.type() == Qt.QEvent.MouseButtonRelease:
84 |         #     return Qt.QItemSelectionModel.NoUpdate
85 |         return super(TreeView, self).selectionCommand(index, event)
86 | 
--------------------------------------------------------------------------------
/scripts/dsonimport/rig_tools.py:
--------------------------------------------------------------------------------
1 | import math
2 | from maya import cmds, mel
3 | from pymel import core as pm
4 | import pymel.core.datatypes as dt
5 | 
6 | def find_ikfk_node(node):
7 |     for conn in node.attr('message').connections(d=True, s=False):
8 |         if pm.hasAttr(conn, 'nodeType') and conn.attr('nodeType').get() == 'FKIKReferences':
9 |             return conn
10 |     raise Exception('Node %s is not an IK/FK control.' % node)
11 | 
12 | def get_control(node, name):
13 |     return node.attr(name).connections()[0]
14 | 
15 | def dist(p1, p2):
16 |     vec = (p1[0]-p2[0], p1[1]-p2[1], p1[2]-p2[2])
17 |     return math.pow(vec[0]*vec[0] + vec[1]*vec[1] + vec[2]*vec[2], 0.5)
18 | 
19 | def angle_between_quaternions(q1, q2):
20 |     dot = q1.x*q2.x + q1.y*q2.y + q1.z*q2.z + q1.w*q2.w
21 |     v = 2*dot*dot - 1
22 |     if v > .9999:
23 |         return 0
24 |     return math.acos(v)
25 | 
26 | def fk_to_ik(mode='angle'):
27 |     """
28 |     Match IK to the current FK pose.
29 | 
30 |     FK can reach poses that IK can't; in those cases this can give bad pole vector positions.
31 |     """
32 |     n = pm.ls(sl=True)[0]
33 |     node = find_ikfk_node(n)
34 | 
35 |     # Align the IK handle to the last FK control's position and rotation.
36 |     t = pm.xform(get_control(node, 'FK_3'), q=True, ws=True, t=True)
37 |     pm.xform(get_control(node, 'IK_Handle'), ws=True, t=t)
38 | 
39 |     r = pm.xform(get_control(node, 'FK_3'), q=True, ws=True, ro=True)
40 |     pm.xform(get_control(node, 'IK_Handle'), ws=True, ro=r)
41 | 
42 |     # The pole vector is trickier. It only rotates on one axis, so we'll just do a search to find the orientation
43 |     # that most closely matches the FK position, measured by comparing the rotation of the IK_2 joint against the
44 |     # FK_2 joint.
45 |     # no, measure world space distance?
46 |     pole_vector_node = get_control(node, 'Pole_Vector')
47 |     pole_vector = pole_vector_node.attr('rx')
48 |     fk_2 = get_control(node, 'FK_2')
49 |     ik_2 = get_control(node, 'IKJoint_2')
50 | 
51 |     def current_distance():
52 |         # Return the distance between the IK and FK positions. This measures the error in our current pole
53 |         # vector angle.
54 |         #
55 |         # We can use both distance and orientation to decide how close the pose is. Distance doesn't work when
56 |         # the elbow is locked straight, since the pole vector will only rotate the elbow and not move it.
57 |         if mode == 'angle':
58 |             # Read the difference in orientation.
59 |             r1 = pm.xform(fk_2, q=True, ws=True, ro=True)
60 |             r2 = pm.xform(ik_2, q=True, ws=True, ro=True)
61 | 
62 |             q1 = dt.EulerRotation(*r1).asQuaternion()
63 |             q2 = dt.EulerRotation(*r2).asQuaternion()
64 | 
65 |             angle = angle_between_quaternions(q1, q2)
66 |             angle = angle/math.pi*180.0
67 |             return angle
68 |         else:
69 |             # Read the difference in position.
70 |             t1 = pm.xform(fk_2, q=True, ws=True, t=True)
71 |             t2 = pm.xform(ik_2, q=True, ws=True, t=True)
72 | 
73 |             return dist(t1, t2)
74 | 
75 |     def distance_at_angle(angle):
76 |         pole_vector.set(angle)
77 |         return current_distance()
78 | 
79 |     # Do a search to find an enclosing range around the correct value.
80 | best_distance = 999999 81 | start_angle, end_angle = 0, 30 82 | 83 | for angle1 in xrange(-180, 210, 30): 84 | angle2 = angle1 + 30 85 | distance = distance_at_angle(angle1) + distance_at_angle(angle2) 86 | if distance < best_distance: 87 | best_distance = distance 88 | start_angle = angle1 89 | end_angle = angle2 90 | 91 | # Narrow the range until we're within the tolerance. Note that we're rotating the point in an arc, so 92 | # we can overshoot: the distance might get further away before it approaches. 93 | for _ in xrange(25): 94 | half_angle = (start_angle + end_angle) / 2 95 | d1 = distance_at_angle(start_angle) 96 | d2 = distance_at_angle(half_angle) 97 | d3 = distance_at_angle(end_angle) 98 | error1 = abs(d1 - d2) 99 | error2 = abs(d3 - d2) 100 | if error1 < error2: 101 | end_angle = half_angle 102 | else: 103 | start_angle = half_angle 104 | 105 | if abs(error1 - error2) < 0.0001: 106 | break 107 | 108 | pole_vector.set(half_angle) 109 | 110 | def ik_to_fk(): 111 | """ 112 | Match FK to the current IK pose. 113 | """ 114 | n = pm.ls(sl=True)[0] 115 | node = find_ikfk_node(n) 116 | 117 | # Read the IK pose. IKJoint_# points to the current IK transforms, regardless of the current IK/FK weight. 
118 | joint1 = get_control(node, 'IKJoint_1') 119 | joint2 = get_control(node, 'IKJoint_2') 120 | joint3 = get_control(node, 'IKJoint_3') 121 | 122 | r1 = pm.xform(joint1, q=True, ws=True, ro=True) 123 | r2 = pm.xform(joint2, q=True, ws=True, ro=True) 124 | r3 = pm.xform(joint3, q=True, ws=True, ro=True) 125 | 126 | pm.xform(get_control(node, 'FK_1'), ws=True, ro=r1) 127 | pm.xform(get_control(node, 'FK_2'), ws=True, ro=r2) 128 | pm.xform(get_control(node, 'FK_3'), ws=True, ro=r3) 129 | 130 | def find_coordinate_space_node(node): 131 | for conn in node.attr('message').connections(d=True, s=False): 132 | if pm.hasAttr(conn, 'nodeType') and conn.attr('nodeType').get() == 'CoordinateSpaceReferences': 133 | return conn 134 | raise Exception('Node %s doesn\'t have a coordinate space attribute.' % node) 135 | 136 | def get_control(node, name): 137 | return node.attr(name).connections()[0] 138 | 139 | def switch_coordinate_space(n): 140 | """ 141 | Switch the coordinate space of the selected node to the value of changeCoordinateSpace, preserving 142 | the current transform. 143 | 144 | Note that only the transform at the current time will be preserved. If the transform is keyed in 145 | the future, animation will break. The coordinate space can be manually restored at the next keyframe. 146 | """ 147 | # Find the RigReferences node for this node's coordinate space. 148 | node = find_coordinate_space_node(n) 149 | if not pm.hasAttr(n, 'changeCoordinateSpace'): 150 | raise Exception('Node %s doesn\'t have a coordinate space attribute.' % n) 151 | if n.attr('changeCoordinateSpace').get() == n.attr('coordinateSpace').get(): 152 | print 'The coordinate space for %s is already up to date.' 
% n 153 | return 154 | 155 | controlled_transforms = [] 156 | for i in range(10): 157 | try: 158 | control = get_control(node, 'controlledTransform%i' % i) 159 | print control 160 | except pm.MayaAttributeError: 161 | break 162 | controlled_transforms.append(control) 163 | 164 | # Record the current position of the user handles. 165 | translates = [pm.xform(node, q=True, ws=True, t=True) for node in controlled_transforms] 166 | rotates = [pm.xform(node, q=True, ws=True, ro=True) for node in controlled_transforms] 167 | 168 | # Change the coordinate space. 169 | n.attr('coordinateSpace').set(n.attr('changeCoordinateSpace').get()) 170 | 171 | # Restore the position we had before we changed coordinate spaces. 172 | for node, t, r in zip(controlled_transforms, translates, rotates): 173 | if not node.attr('translateX').isLocked(): 174 | pm.xform(node, ws=True, t=t) 175 | 176 | # Do try to set rotations even if rotateX is locked, since pole vector rotations are 177 | # locked on all but Y. 178 | #if not node.attr('rotateX').isLocked() or True: 179 | pm.xform(node, ws=True, ro=r) 180 | 181 | def switch_selected_coordinate_spaces(): 182 | for node in pm.ls(sl=True): 183 | switch_coordinate_space(node) 184 | 185 | -------------------------------------------------------------------------------- /scripts/dsonimport/rigging.py: -------------------------------------------------------------------------------- 1 | import copy, logging, math, os 2 | from pprint import pprint, pformat 3 | import config, util 4 | import maya_helpers as mh 5 | from dson import DSON 6 | 7 | from pymel import core as pm 8 | import pymel.core.datatypes as dt 9 | import pymel 10 | from maya import mel 11 | 12 | log = logging.getLogger('DSONImporter') 13 | 14 | def _create_rotation_rbf(name): 15 | """ 16 | Create an RBF solver. This approximates the rotation on an input plane, with clean falloff 17 | before we flip at 180 degrees. 18 | # 19 | At 1,0, we're at rest. 
The vector is in its original position, so the angle is 0.
20 |     At 0,1, we've rotated 90 degrees. At 0,-1 we've rotated -90 degrees.
21 |     We'll place a number of samples around the unit circle, keying them to the angle.
22 |     #
23 |     We stop before 180 degrees, since 180 degrees could be either 180 or -180 degrees.
24 |     There's no way to figure this out, and if we give duplicate inputs to the solver
25 |     we'll end up with an unsolvable key set. We stop at 165 degrees in either direction.
26 |     If the rotation goes beyond that it'll flip.
27 |     """
28 | 
29 |     mh.load_plugin('zRBF.py')
30 | 
31 |     rbf_node = pm.createNode('zRBF', n=name)
32 |     min_angle = -165
33 |     max_angle = 165
34 |     intervals = 6
35 |     step = (max_angle - min_angle) / float(intervals*2)
36 |     for idx, interval in enumerate(xrange(-intervals, intervals+1)):
37 |         angle = step * interval
38 |         angle = angle * math.pi / 180.0
39 |         x = math.cos(angle)
40 |         y = math.sin(angle)
41 |         point = (x, y, 0)
42 | 
43 |         value_attr = rbf_node.attr('value').elementByLogicalIndex(idx)
44 |         pm.setAttr(value_attr.attr('value_Position'), point)
45 |         pm.setAttr(value_attr.attr('value_Value'), angle)
46 |     return rbf_node
47 | 
48 | def load_plugin(plugin):
49 |     # Don't call loadPlugin if the plugin is already loaded. Even though it doesn't do anything,
50 |     # it takes about half a second.
51 |     if not pm.pluginInfo(plugin, q=True, loaded=True):
52 |         pm.loadPlugin(plugin, quiet=True)
53 | 
54 |     if not pm.pluginInfo(plugin, q=True, registered=True):
55 |         raise RuntimeError('Plugin "%s" isn\'t available.' % plugin)
56 | 
57 | def straighten_poses(env):
58 |     """
59 |     Figures are generally in a relaxed T-pose. Move figures to a full T-pose.
60 |     Note that this doesn't bring the arms parallel to the X axis.
61 |     """
62 | 
63 |     if not config.get('straighten_pose'):
64 |         return
65 | 
66 |     log.debug('Straightening poses')
67 |     for dson_node in env.scene.depth_first():
68 |         if dson_node.node_type != 'figure':
69 |             continue
70 | 
71 |         # Ignore e.g. SkinBindings.
72 |         if dson_node.node_source != 'node':
73 |             continue
74 |         if 'conform_target' in dson_node:
75 |             continue
76 | 
77 |         # The feet in bind pose are usually pointing slightly outwards. Aim them along
78 |         # the Z axis, so they're pointing straight ahead. This is important for HIK floor
79 |         # contact, since its contact planes assume that feet are aligned when in bind pose.
80 |         # The foot joints aren't aligned to the XZ plane, so there's no axis for us to simply
81 |         # align to zero. Instead, look at the world space angle going down to the next joint,
82 |         # and rotate by the inverse of that. Do the same for the arm joints and hands. We
83 |         # want the hands to be square with the world, so floor contact planes are aligned
84 |         # with HIK later.
85 |         joints = [
86 |             # Joint to aim   End joints         Cross, aim, rotate axis   Invert
87 |             ('lShldrBend',   ('lForearmBend',), (1, 0, 2),                False),
88 |             ('rShldrBend',   ('rForearmBend',), (1, 0, 2),                False),
89 |             ('lForeArm',     ('lHand',),        (1, 0, 2),                False),
90 |             ('rForeArm',     ('rHand',),        (1, 0, 2),                False),
91 |             ('lForeArm',     ('lHand',),        (2, 0, 1),                True),
92 |             ('rForeArm',     ('rHand',),        (2, 0, 1),                True),
93 | 
94 |             # Aim the hand towards the average of the middle and ring finger.
95 |             #('lHand',       ('lMid1', 'lRing1'), (2, 0, 1),              True),
96 |             #('rHand',       ('rMid1', 'rRing1'), (2, 0, 1),              True),
97 |             ('lHand',        ('lRing1', ),      (2, 0, 1),                True),
98 |             ('rHand',        ('rRing1', ),      (2, 0, 1),                True),
99 |             ('rFoot',        ('rMetatarsals',), (0, 2, 1),                False),
100 |             ('lFoot',        ('lMetatarsals',), (0, 2, 1),                False),
101 |         ]
102 | 
103 |         # First, find and check all of the joints. If there are problems with any joints, we
104 |         # won't apply any changes.
105 |         for aim_joint, end_joints, (cross_axis_idx, aim_axis_idx, rotate_axis_idx), invert in joints:
106 |             def make_rotations():
107 |                 total_angle = 0
108 |                 try:
109 |                     j1 = dson_node.find_asset_name(aim_joint)
110 |                 except KeyError as e:
111 |                     log.warning('Couldn\'t straighten %s %s: %s', dson_node.node_id, aim_joint, e.message)
112 |                     return
113 | 
114 |                 # Average the angle towards each of the target joints.
115 |                 for end_joint in end_joints:
116 |                     try:
117 |                         j2 = dson_node.find_asset_name(end_joint)
118 |                     except KeyError as e:
119 |                         log.warning('Couldn\'t straighten %s %s: %s', dson_node.node_id, aim_joint, e.message)
120 |                         return
121 | 
122 |                     pos1 = pm.xform(j1.maya_node, q=True, ws=True, t=True)
123 |                     pos2 = pm.xform(j2.maya_node, q=True, ws=True, t=True)
124 |                     if pos2[aim_axis_idx] < pos1[aim_axis_idx]:
125 |                         pos1, pos2 = pos2, pos1
126 | 
127 |                     angle = math.atan2(pos2[cross_axis_idx] - pos1[cross_axis_idx], pos2[aim_axis_idx] - pos1[aim_axis_idx])
128 |                     angle = angle * 180 / math.pi
129 | 
130 |                     if invert:
131 |                         angle = -angle
132 |                     total_angle += angle
133 | 
134 |                 total_angle /= len(end_joints)
135 | 
136 |                 rotate = [0,0,0]
137 |                 rotate[rotate_axis_idx] = -total_angle
138 |                 pm.xform(j1.maya_node, ws=True, r=True, ro=rotate)
139 | 
140 |             make_rotations()
141 | 
142 | 
--------------------------------------------------------------------------------
/scripts/dsonimport/tcb.py:
--------------------------------------------------------------------------------
1 | # Based on:
2 | #
3 | # https://github.com/Kitware/VTK/blob/master/Common/ComputationalGeometry/vtkKochanekSpline.cxx
4 | # https://github.com/bbattey/KochanekBartelsSpline/blob/master/KochanekBartelsSpline.m
5 | 
6 | import bisect
7 | 
8 | class KochanekBartelsSpline(object):
9 |     """
10 |     >>> kbs = KochanekBartelsSpline([(0, 0, 0.5, 0.5, 0.5), \
11 |                                      (1, 10, 0.5, 0.5, 0.5), \
12 |                                      (2, 20, 0.5, 0.5, 0.5)])
13 |     >>> kbs.evaluate(-1)
14 |     0.0
15 |     >>> kbs.evaluate(0)
16 |     0.0
17 |     >>> kbs.evaluate(1)
18 |     10.0
19 |     >>> kbs.evaluate(2)
20 |     20.0
21 |     >>> kbs.evaluate(3)
22 |     20.0
23 |     >>> kbs.evaluate(1.5)
24 |     15.3125
25 |     """
26 | 
27 |     def __init__(self, keys):
28 |         self.input_points = []
29 | 
30 |         for key in keys:
31 |             assert len(key) == 5
32 |             self.input_points.append(tuple(float(x) for x in key))
33 | 
34 |         self.input_points.sort()
35 | 
36 |         DSvectorsX = []
37 |         DDvectorsX = []
38 | 
39 |         # Calculate incoming and outgoing tangent vectors
40 |         # for each point (Kochanek and Bartels Equations 8 & 9)
41 |         self.coef = []
42 |         for idx in xrange(len(self.input_points)):
43 |             next_idx = max(0, min(len(self.input_points)-1, idx + 1))
44 |             prev_idx = max(0, min(len(self.input_points)-1, idx - 1))
45 |             x0, y0, _, _, _ = self.input_points[prev_idx]
46 |             x1, y1, t, c, b = self.input_points[idx]
47 |             x2, y2, _, _, _ = self.input_points[next_idx]
48 | 
49 |             cs = y1-y0
50 |             cd = y2-y1
51 | 
52 |             # tension/continuity/bias equations
53 |             ds = cs*((1-t) * (1-c) * (1+b)) / 2.0 + cd*((1-t) * (1+c) * (1-b)) / 2.0
54 |             dd = cs*((1-t) * (1+c) * (1+b)) / 2.0 + cd*((1-t) * (1-c) * (1-b)) / 2.0
55 | 
56 |             # adjust derivatives for non-uniform spacing between nodes
57 |             n1 = x2 - x1
58 |             n0 = x1 - x0
59 |             ds *= (2 * n0 / (n0 + n1))
60 |             dd *= (2 * n1 / (n0 + n1))
61 | 
62 |             # DS "source derivative" (incoming vector) for this point
63 |             DSvectorsX.append(ds)
64 |             # DD "destination derivative" (outgoing vector) for this point
65 |             DDvectorsX.append(dd)
66 | 
67 |         for idx in xrange(len(self.input_points)-1):
68 |             _, y1, _, _, _ = self.input_points[idx]
69 |             _, y2, _, _, _ = self.input_points[idx+1]
70 | 
71 |             d0 = DDvectorsX[idx]
72 |             d1 = DSvectorsX[idx+1]
73 | 
74 |             coef = []
75 |             coef.append(y1)
76 |             coef.append(d0)
77 |             coef.append(-3.0*y1 + 3.0*y2 - 2.0*d0 - d1)
78 |             coef.append( 2.0*y1 - 2.0*y2 + d0 + d1)
79 |             self.coef.append(coef)
80 | 
81 |     def evaluate(self, f):
82 |         f = float(f)
83 | 
84 |         # Find the keyframes that f lies between.
85 | idx = bisect.bisect([p[0] for p in self.input_points], f)-1 86 | idx = max(0, min(idx, len(self.input_points)-2)) 87 | 88 | x1 = self.input_points[idx][0] 89 | x2 = self.input_points[idx+1][0] 90 | 91 | # Figure out the position between idx and idx+1. 92 | f = (f - x1) / (x2 - x1) 93 | f = min(1, max(0, f)) 94 | 95 | coef = self.coef[idx] 96 | return coef[3]*f*f*f + coef[2]*f*f + coef[1]*f + coef[0] 97 | 98 | if __name__ == "__main__": 99 | import doctest 100 | doctest.testmod() 101 | 102 | #for x in xrange(0, 20): 103 | # f = x / 10.0 104 | # print f, kbs.evaluate(f) 105 | 106 | 107 | # The MIT License (MIT) 108 | # 109 | # Copyright (c) 2015 Bret Battey 110 | # 111 | # Permission is hereby granted, free of charge, to any person obtaining a copy 112 | # of this software and associated documentation files (the "Software"), to deal 113 | # in the Software without restriction, including without limitation the rights 114 | # to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 115 | # copies of the Software, and to permit persons to whom the Software is 116 | # furnished to do so, subject to the following conditions: 117 | # 118 | # The above copyright notice and this permission notice shall be included in all 119 | # copies or substantial portions of the Software. 120 | # 121 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 122 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 123 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 124 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 125 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 126 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 127 | # SOFTWARE. 128 | 129 | # Copyright (c) 1993-2015 Ken Martin, Will Schroeder, Bill Lorensen 130 | # All rights reserved. 
131 | # 132 | # Redistribution and use in source and binary forms, with or without 133 | # modification, are permitted provided that the following conditions are met: 134 | # 135 | # * Redistributions of source code must retain the above copyright notice, 136 | # this list of conditions and the following disclaimer. 137 | # 138 | # * Redistributions in binary form must reproduce the above copyright notice, 139 | # this list of conditions and the following disclaimer in the documentation 140 | # and/or other materials provided with the distribution. 141 | # 142 | # * Neither name of Ken Martin, Will Schroeder, or Bill Lorensen nor the names 143 | # of any contributors may be used to endorse or promote products derived 144 | # from this software without specific prior written permission. 145 | # 146 | # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS ``AS IS'' 147 | # AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE 148 | # IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE 149 | # ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHORS OR CONTRIBUTORS BE LIABLE FOR 150 | # ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL 151 | # DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR 152 | # SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER 153 | # CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, 154 | # OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE 155 | # OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
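The tangent computation in tcb.py above implements Kochanek-Bartels Equations 8 & 9. As a standalone sanity-check sketch (not part of the importer; `tcb_tangents` is a hypothetical helper), note that with tension, continuity and bias all zero both tangents reduce to the Catmull-Rom form (y2-y0)/2:

```python
def tcb_tangents(y0, y1, y2, t, c, b):
    """Incoming (DS) and outgoing (DD) tangents for the middle key,
    per Kochanek-Bartels Equations 8 & 9, before the non-uniform
    spacing adjustment that tcb.py applies afterwards."""
    cs = y1 - y0  # chord from the previous key
    cd = y2 - y1  # chord to the next key
    ds = cs * ((1-t) * (1-c) * (1+b)) / 2.0 + cd * ((1-t) * (1+c) * (1-b)) / 2.0
    dd = cs * ((1-t) * (1+c) * (1+b)) / 2.0 + cd * ((1-t) * (1-c) * (1-b)) / 2.0
    return ds, dd

# With t = c = b = 0 this is the Catmull-Rom tangent (y2 - y0) / 2.
print(tcb_tangents(0.0, 1.0, 4.0, 0, 0, 0))  # (2.0, 2.0)
# Full tension (t = 1) flattens both tangents to zero.
print(tcb_tangents(0.0, 1.0, 4.0, 1, 0, 0))  # (0.0, 0.0)
```

The cubic coefficients built in `__init__` then blend the outgoing tangent of one key with the incoming tangent of the next, which is what lets tension/continuity/bias vary per key.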
156 | 157 | -------------------------------------------------------------------------------- /scripts/dsonimport/util.py: -------------------------------------------------------------------------------- 1 | import bisect, cProfile, errno, logging, math, os, platform, pstats, re 2 | from pprint import pprint, pformat 3 | from contextlib import contextmanager 4 | from StringIO import StringIO 5 | import subprocess 6 | 7 | try: 8 | import scandir 9 | except ImportError: 10 | from wscandir import scandir 11 | 12 | log = logging.getLogger('DSONImporter') 13 | 14 | @contextmanager 15 | def run_profiling(): 16 | pr = cProfile.Profile() 17 | pr.enable() 18 | try: 19 | yield 20 | finally: 21 | pr.disable() 22 | s = StringIO() 23 | 24 | sortby = 'cumulative' 25 | ps = pstats.Stats(pr, stream=s).sort_stats(sortby) 26 | ps.print_stats() 27 | log.debug(s.getvalue()) 28 | 29 | def flatten_dictionary(data): 30 | """ 31 | Flatten a DSON value dictionary. 32 | 33 | >>> result = flatten_dictionary({ 34 | ... 'a': 1, 35 | ... 'b': { 36 | ... 'c': 3, 37 | ... 'd': { 38 | ... 'e': 4 39 | ... }, 40 | ... }, 41 | ... }) 42 | >>> sorted(result.keys()) 43 | ['a', 'b.c', 'b.d.e'] 44 | >>> result['a'] 45 | 1 46 | >>> result['b.c'] 47 | 3 48 | >>> result['b.d.e'] 49 | 4 50 | 51 | >>> result = flatten_dictionary({ 52 | ... 'a': [ 53 | ... 1, 2, {'x': 1} 54 | ... ] 55 | ... }) 56 | >>> result['a'] 57 | [1, 2, {'x': 1}] 58 | 59 | >>> result = flatten_dictionary({ 60 | ... 'extra': [{ 61 | ... 'type': 'test', 62 | ... 'a': 1 63 | ... }] 64 | ... }) 65 | >>> result['extra.test.a'] 66 | 1 67 | >>> result['extra.test.type'] 68 | 'test' 69 | """ 70 | result = {} 71 | queue = [] 72 | def queue_dictionary(item, top_path): 73 | if top_path: 74 | top_path += '.'
75 | 76 | for key, value in item.iteritems(): 77 | queue.append((top_path + key, value)) 78 | 79 | queue_dictionary(data, '') 80 | while len(queue): 81 | top_path, item = queue.pop() 82 | if top_path == 'extra': 83 | for extra in item: 84 | queue.append(('extra.' + extra['type'], extra)) 85 | continue 86 | 87 | if isinstance(item, dict): 88 | queue_dictionary(item, top_path) 89 | else: 90 | result[top_path] = item 91 | return result 92 | 93 | def make_frozen(value): 94 | """ 95 | Recursively convert lists to tuples, sets to frozensets and dictionaries 96 | to tuples of (key, value) tuples. 97 | """ 98 | if isinstance(value, dict): 99 | return tuple((k, make_frozen(v)) for k, v in sorted(value.items())) 100 | elif isinstance(value, set): 101 | return frozenset(make_frozen(v) for v in value) 102 | elif isinstance(value, tuple): 103 | return tuple(make_frozen(v) for v in value) 104 | elif isinstance(value, list): 105 | return tuple(make_frozen(v) for v in value) 106 | else: 107 | return value 108 | 109 | def srgb_to_linear(value): 110 | """ 111 | Convert an SRGB color value to linear. 112 | """ 113 | if value <= 0.03928: 114 | return value / 12.92 115 | else: 116 | return math.pow((value+0.055) / 1.055, 2.4) 117 | 118 | def srgb_vector_to_linear(value): 119 | """ 120 | Convert an SRGB color value vector to linear. 121 | """ 122 | return [srgb_to_linear(v) for v in value] 123 | 124 | def arrays_equal(a, b, tolerance=0.001): 125 | """ 126 | Return true if the values of the lists a and b are the same, within the 127 | specified tolerance. 128 | 129 | This is used with vectors that we expect to be the same size, so an assertion 130 | is thrown if they differ. 
131 | 132 | >>> arrays_equal([0,0,0], [0,0,0]) 133 | True 134 | >>> arrays_equal([0,0,0], [1,0,0]) 135 | False 136 | >>> arrays_equal([0,0,0], [1,0,0], tolerance=1) 137 | True 138 | """ 139 | assert len(a) == len(b) 140 | return all(abs(a[idx] - b[idx]) <= tolerance for idx in xrange(len(a))) 141 | 142 | def mat4_equal(a, b, tolerance=0.001): 143 | """ 144 | Return true if the given matrices are identical within the given tolerance. 145 | 146 | >>> mat4_equal([[0,0],[0,0]], [[0,0],[0,0]]) 147 | True 148 | >>> mat4_equal([[0,0],[0,0]], [[1,0],[0,0]]) 149 | False 150 | >>> mat4_equal([[0,0],[0,0]], [[1,0],[0,0]], tolerance=1) 151 | True 152 | """ 153 | assert len(a) == len(b) 154 | 155 | return all(arrays_equal(a[idx], b[idx], tolerance=tolerance) for idx in xrange(len(a))) 156 | 157 | def delta_list_to_deltas_and_indices(delta_list, tolerance=0.001): 158 | """ 159 | Given a list of vectors, eg. [(0,0,0), (1,0,0), (0,0,0)], return a list of indices 160 | and vectors, filtering out zero vectors: [1], [(1,0,0)]. 161 | 162 | >>> delta_list_to_deltas_and_indices([(1,0,0), (0,0,0.000001), (0,1,0)]) 163 | ([0, 2], [(1, 0, 0), (0, 1, 0)]) 164 | """ 165 | # Make a list of the nonzero indices in the new delta list. 166 | def is_nonzero(value): 167 | return any(abs(v) > tolerance for v in value) 168 | 169 | delta_vertex_idx = [idx for idx, delta in enumerate(delta_list) if is_nonzero(delta)] 170 | deltas = [delta_list[idx] for idx in delta_vertex_idx] 171 | return delta_vertex_idx, deltas 172 | 173 | def blend_merge_shape_deltas(deltas_and_indices): 174 | """ 175 | Given a list of [(indices, deltas), (indices, deltas), ...)], sum the deltas and return 176 | a new (indices, deltas). 
177 | 178 | >>> deltas1 = [[0, 1], [(1,0,0), (0,1,1)]] 179 | >>> blend_merge_shape_deltas([deltas1]) 180 | ([0, 1], [(1, 0, 0), (0, 1, 1)]) 181 | >>> blend_merge_shape_deltas([deltas1, deltas1]) 182 | ([0, 1], [(2, 0, 0), (0, 2, 2)]) 183 | 184 | >>> deltas2 = [[2], [(1,0,0)]] 185 | >>> blend_merge_shape_deltas([deltas1, deltas2]) 186 | ([0, 1, 2], [(1, 0, 0), (0, 1, 1), (1, 0, 0)]) 187 | 188 | >>> deltas3 = [[0], [(-1,0,0)]] 189 | >>> blend_merge_shape_deltas([deltas1, deltas3]) 190 | ([1], [(0, 1, 1)]) 191 | 192 | >>> deltas4 = [[0], [(1,2,3)]] 193 | >>> result = blend_merge_shape_deltas([deltas4]) 194 | >>> result[0] is deltas4[0] 195 | True 196 | >>> result[1] is deltas4[1] 197 | True 198 | """ 199 | # If there's only one entry, there's nothing to merge. Return the entry without making a copy, 200 | # for performance. 201 | if len(deltas_and_indices) == 1: 202 | return (deltas_and_indices[0][0], deltas_and_indices[0][1]) 203 | 204 | max_vertex_idx = max(max(delta_vertex_idx) for delta_vertex_idx, deltas in deltas_and_indices) + 1 205 | merged_deltas = [(0,0,0)] * max_vertex_idx 206 | 207 | for delta_vertex_idx, deltas in deltas_and_indices: 208 | for idx, value in zip(delta_vertex_idx, deltas): 209 | total = (merged_deltas[idx][0] + value[0], merged_deltas[idx][1] + value[1], merged_deltas[idx][2] + value[2]) 210 | merged_deltas[idx] = total 211 | 212 | delta_vertex_idx, deltas = delta_list_to_deltas_and_indices(merged_deltas) 213 | return delta_vertex_idx, deltas 214 | 215 | def make_unique(name, name_list): 216 | """ 217 | Make name unique, given a list of existing names. 218 | 219 | If name ends in a number, the result will always end in a numeric suffix. For 220 | example, "foo0" will result in attributes named "foo0", "foo1", "foo2" and so on, 221 | compared to "foo" which will first give the requested "foo" followed by "foo1". 222 | This is useful for array-like attributes. 
223 | 224 | >>> make_unique('foo', ['abcd']) 225 | 'foo' 226 | >>> make_unique('foo', ['foo']) 227 | 'foo1' 228 | >>> make_unique('foo', ['foo', 'foo1']) 229 | 'foo2' 230 | >>> make_unique('foo', ['foo', 'foo1', 'foo100']) 231 | 'foo101' 232 | >>> make_unique('foo0', []) 233 | 'foo0' 234 | >>> make_unique('foo0', ['foo0']) 235 | 'foo1' 236 | >>> make_unique('foo0', ['foo0', 'foo1']) 237 | 'foo2' 238 | """ 239 | if name not in name_list: 240 | return name 241 | 242 | prefix = re.match('^(.*?)([0-9]*)$', name).group(1) 243 | next_unused_suffix = 1 244 | 245 | for existing in name_list: 246 | m = re.match('^(.*?)([0-9]+)$', existing) 247 | if not m: 248 | continue 249 | if m.group(1) != prefix: 250 | continue 251 | 252 | idx = int(m.group(2)) 253 | next_unused_suffix = max(next_unused_suffix, idx+1) 254 | 255 | return '%s%s' % (prefix, next_unused_suffix) 256 | 257 | class VertexMapping(object): 258 | """ 259 | Manage a simple mapping from vertex indices to vertex indices. 260 | """ 261 | def __init__(self, value=None): 262 | if value is None: 263 | value = {} 264 | self.mapping = value 265 | 266 | def __repr__(self): 267 | return 'VertexMapping(mappings: %i)' % len(self.mapping) 268 | 269 | def __setitem__(self, src, dst): 270 | self.mapping[src] = dst 271 | def __getitem__(self, src): 272 | return self.mapping[src] 273 | def get(self, idx): 274 | return self.mapping.get(idx) 275 | def __iter__(self): 276 | return iter(self.mapping) 277 | def iteritems(self): 278 | return self.mapping.iteritems() 279 | def items(self): 280 | return self.mapping.items() 281 | def __eq__(self, rhs): 282 | if not isinstance(rhs, VertexMapping): 283 | return False 284 | return self.mapping == rhs.mapping 285 | 286 | def __deepcopy__(self, memo): 287 | result = VertexMapping() 288 | result.mapping = dict(self.mapping) 289 | return result 290 | 291 | def invert(self): 292 | """ 293 | Given a {1:10, 2:20} mapping, return {10:1, 20:2}. 294 | 295 | This requires a one-to-one relationship.
This will raise an error if more than one 296 | source vertex is mapped to the same destination vertex, since that can't be inverted 297 | without discarding indices. 298 | 299 | >>> m = VertexMapping({1: 10, 2: 20}) 300 | >>> m2 = m.invert() 301 | >>> m.invert() == m 302 | False 303 | >>> m.invert().invert() == m 304 | True 305 | >>> m2[10] 306 | 1 307 | >>> m2.mapping 308 | {10: 1, 20: 2} 309 | """ 310 | result = VertexMapping() 311 | for orig, new in self.mapping.iteritems(): 312 | assert new not in result.mapping 313 | result.mapping[new] = orig 314 | return result 315 | 316 | @classmethod 317 | def map_destination_indices(cls, src, dst): 318 | """ 319 | Given two mappings, {1:2, 3:4} and {1:100, 3:200}, map the source from each mapping 320 | and return a map from the destination of src to the destination of dst: {2:100, 4:200}. 321 | 322 | >>> m1 = VertexMapping({1: 2, 3: 4}) 323 | >>> m2 = VertexMapping({1: 100, 3: 200}) 324 | >>> VertexMapping.map_destination_indices(m1, m2).mapping 325 | {2: 100, 4: 200} 326 | """ 327 | vertex_mapping = {} 328 | for graft_orig_vertex_idx, graft_new_vertex_idx in src.iteritems(): 329 | target_new_vertex_idx = dst.get(graft_orig_vertex_idx) 330 | if target_new_vertex_idx is None: 331 | continue 332 | vertex_mapping[graft_new_vertex_idx] = target_new_vertex_idx 333 | return VertexMapping(vertex_mapping) 334 | 335 | def remap_indexed_data(self, indices, data): 336 | """ 337 | Given a list of indices and data, remap the indices using this VertexMapping. If an 338 | index isn't present in this mapping, discard it and its accompanying data.
339 | 340 | >>> m = VertexMapping({1: 100, 3: 400}) 341 | >>> indices = [1, 2, 3, 4] 342 | >>> data = ['a', 'b', 'c', 'd'] 343 | >>> m.remap_indexed_data(indices, data) 344 | ([100, 400], ['a', 'c']) 345 | """ 346 | remapped_indices = [] 347 | remapped_data = [] 348 | for idx, delta in zip(indices, data): 349 | if idx not in self.mapping: 350 | continue 351 | 352 | remapped_data.append(delta) 353 | remapped_indices.append(self.mapping[idx]) 354 | 355 | return remapped_indices, remapped_data 356 | 357 | class ComponentList(object): 358 | """ 359 | >>> c = ComponentList(['vtx[1]', 'vtx[10:20]']) 360 | >>> 0 in c 361 | False 362 | >>> 1 in c 363 | True 364 | >>> 2 in c 365 | False 366 | >>> 9 in c 367 | False 368 | >>> 10 in c 369 | True 370 | >>> 15 in c 371 | True 372 | >>> 20 in c 373 | True 374 | >>> 21 in c 375 | False 376 | >>> c.get_flat_list() 377 | [1, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20] 378 | 379 | """ 380 | def __init__(self, components): 381 | # Use a list to store the component ranges, so we can search with bisect. 382 | component_list = [] 383 | 384 | for component in components: 385 | assert component.startswith('vtx[') and component.endswith(']') 386 | component = component[4:-1] 387 | 388 | # We either have a number or a start:end range. 389 | values = component.split(':') 390 | if len(values) == 1: 391 | component = (int(values[0]), int(values[0])+1) 392 | else: 393 | assert len(values) == 2 394 | component = (int(values[0]), int(values[1])+1) 395 | 396 | component_list.append(component) 397 | 398 | self.component_list = component_list 399 | 400 | def __contains__(self, idx): 401 | """ 402 | Return true if idx is in this component list. 
403 | """ 404 | entry_idx = bisect.bisect(self.component_list, (idx,)) 405 | if entry_idx == len(self.component_list): 406 | entry_idx -= 1 407 | 408 | if entry_idx > 0 and idx < self.component_list[entry_idx][0]: 409 | entry_idx -= 1 410 | 411 | entry = self.component_list[entry_idx] 412 | return idx >= entry[0] and idx < entry[1] 413 | 414 | def get_flat_list(self): 415 | """ 416 | Return a simple list of indices in this component list. 417 | """ 418 | result = [] 419 | for entry in self.component_list: 420 | for value in xrange(entry[0], entry[1]): 421 | result.append(value) 422 | return result 423 | 424 | def browse_to_file(path): 425 | """ 426 | Open a path in a file browser. 427 | 428 | This is only implemented for Windows. 429 | """ 430 | # Almost every Windows program accepts forward slashes, but Explorer doesn't here. 431 | path = path.replace('/', '\\') 432 | subprocess.Popen('explorer /select,' + path) 433 | 434 | # http://stackoverflow.com/questions/11557241 435 | def topological_sort(source): 436 | """ 437 | Perform topo sort on elements. 438 | 439 | source is a "{name: {set of dependencies}}" dictionary. 440 | Yield a list of names, with dependencies listed first. 441 | """ 442 | all_names = set(source.keys()) 443 | 444 | pending = [] 445 | for name, deps in source.iteritems(): 446 | # Filter out nonexistent dependencies. 447 | deps = deps & all_names 448 | 449 | pending.append((name, deps)) 450 | 451 | emitted = [] 452 | while pending: 453 | next_pending = [] 454 | next_emitted = [] 455 | for entry in pending: 456 | name, deps = entry 457 | deps.difference_update(emitted) # remove deps we emitted last pass 458 | if deps: # still has deps?
recheck during next pass 459 | next_pending.append(entry) 460 | continue 461 | 462 | yield name 463 | emitted.append(name) # <-- not required, but helps preserve original ordering 464 | next_emitted.append(name) # remember what we emitted for difference_update() in next pass 465 | if not next_emitted: 466 | # We have remaining items but we didn't yield anything. 467 | log.warning('Cyclic or missing dependency detected: %s', pformat(next_pending)) 468 | return 469 | 470 | pending = next_pending 471 | emitted = next_emitted 472 | 473 | # Backslashes for path separators are purely cosmetic. Forward slashes are accepted 474 | # almost everywhere. 475 | if platform.system() == 'Windows': 476 | def _map_path_separators(s): 477 | return s.replace('/', '\\') 478 | else: 479 | def _map_path_separators(s): 480 | return s 481 | 482 | _filename_cache = {} 483 | 484 | def normalize_filename(path): 485 | """ 486 | Find a file case-insensitively and return its actual filename case. 487 | """ 488 | path = path.replace('\\', '/') 489 | if path.endswith('/'): 490 | path = path[:-1] 491 | 492 | directory = os.path.dirname(path) 493 | 494 | # Capitalize drive letters on Windows. 495 | if directory[1:2] == ':': 496 | directory = directory[0].upper() + directory[1:] 497 | 498 | if not directory.endswith('/'): 499 | directory += '/' 500 | 501 | directory_lower = directory.lower() 502 | filename = os.path.basename(path) 503 | filename_lower = filename.lower() 504 | 505 | if not filename: 506 | return directory 507 | 508 | paths = _filename_cache.get(directory_lower) 509 | if paths is None: 510 | # We haven't cached this directory yet.
511 | directory = normalize_filename(directory) 512 | if not directory.endswith('/'): 513 | directory += '/' 514 | 515 | paths = os.listdir(directory) 516 | paths = { path.lower(): '%s%s' % (directory, path) for path in paths } 517 | 518 | _filename_cache[directory_lower] = paths 519 | 520 | normalized_filename = paths.get(filename.lower()) 521 | if normalized_filename: 522 | return normalized_filename 523 | else: 524 | return path 525 | 526 | def remove_redundant_path_slashes(path): 527 | """ 528 | If a relative filename contains multiple consecutive / characters (except at the beginning, 529 | in case of //server/host paths), remove them. 530 | >>> remove_redundant_path_slashes('/test//test2') 531 | '/test/test2' 532 | >>> remove_redundant_path_slashes('//test///test2') 533 | '//test/test2' 534 | >>> remove_redundant_path_slashes('') 535 | '' 536 | """ 537 | # Collapse runs of slashes after the first character. 538 | path_suffix = re.sub(r'//+', '/', path[1:]) 539 | 540 | return path[0:1] + path_suffix 541 | 542 | def normalize_filename_and_relative_path(absolute_path, relative_path): 543 | """ 544 | Given an absolute path and a relative path to the same file, normalize the 545 | absolute path with normalize_filename(), and then make the same adjustment 546 | to relative_path. relative_path must be a suffix of absolute_path. 547 | """ 548 | # If a relative filename contains multiple consecutive / characters (except at the beginning, 549 | # in case of //server/host paths), remove them. normalize_filename will remove them, and we 550 | # need the path lengths to line up. 551 | absolute_path = remove_redundant_path_slashes(absolute_path) 552 | relative_path = remove_redundant_path_slashes(relative_path) 553 | assert absolute_path.endswith(relative_path), (absolute_path, relative_path) 554 | 555 | # Normalize filename case to the actual case on the filesystem.
556 | normalized_absolute_path = normalize_filename(absolute_path) 557 | 558 | # We don't expect the length to change, since we need to map the case adjustment 559 | # back to relative_path. 560 | assert len(normalized_absolute_path) == len(absolute_path), (normalized_absolute_path, absolute_path) 561 | 562 | normalized_relative_path = normalized_absolute_path[-len(relative_path):] 563 | return normalized_absolute_path, normalized_relative_path 564 | 565 | def scandir_walk(top, topdown=True, onerror=None, followlinks=False): 566 | """ 567 | Like scandir.walk, but yield DirEntry instead of pathnames. 568 | 569 | scandir.walk is faster than os.walk, but for some reason it still yields pathnames 570 | instead of DirEntry, and we have to copy over the whole implementation to fix this. 571 | """ 572 | from os.path import join, islink 573 | 574 | dirs = [] 575 | nondirs = [] 576 | 577 | # We may not have read permission for top, in which case we can't 578 | # get a list of the files the directory contains. os.walk 579 | # always suppressed the exception then, rather than blow up for a 580 | # minor reason when (say) a thousand readable directories are still 581 | # left to visit. That logic is copied here. 582 | try: 583 | scandir_it = scandir.scandir(top) 584 | except OSError as error: 585 | if onerror is not None: 586 | onerror(error) 587 | return 588 | 589 | while True: 590 | try: 591 | try: 592 | entry = next(scandir_it) 593 | except StopIteration: 594 | break 595 | except OSError as error: 596 | if onerror is not None: 597 | onerror(error) 598 | return 599 | 600 | try: 601 | is_dir = entry.is_dir() 602 | except OSError: 603 | # If is_dir() raises an OSError, consider that the entry is not 604 | # a directory, same behaviour as os.path.isdir().
605 | is_dir = False 606 | 607 | if is_dir: 608 | dirs.append(entry) 609 | else: 610 | nondirs.append(entry) 611 | 612 | if not topdown and is_dir: 613 | # Bottom-up: recurse into sub-directory, but exclude symlinks to 614 | # directories if followlinks is False 615 | if followlinks: 616 | walk_into = True 617 | else: 618 | try: 619 | is_symlink = entry.is_symlink() 620 | except OSError: 621 | # If is_symlink() raises an OSError, consider that the 622 | # entry is not a symbolic link, same behaviour as 623 | # os.path.islink(). 624 | is_symlink = False 625 | walk_into = not is_symlink 626 | 627 | if walk_into: 628 | for entry in scandir_walk(entry.path, topdown, onerror, followlinks): 629 | yield entry 630 | 631 | # Yield before recursion if going top down 632 | if topdown: 633 | yield top, dirs, nondirs 634 | 635 | # Recurse into sub-directories 636 | for name in dirs: 637 | new_path = join(top, name.name) 638 | # Issue #23605: os.path.islink() is used instead of caching 639 | # entry.is_symlink() result during the loop on os.scandir() because 640 | # the caller can replace the directory entry during the "yield" 641 | # above. 642 | if followlinks or not islink(new_path): 643 | for entry in scandir_walk(new_path, topdown, onerror, followlinks): 644 | yield entry 645 | else: 646 | # Yield after recursion if going bottom up 647 | yield top, dirs, nondirs 648 | 649 | def mkdir_p(path): 650 | """ 651 | Create a directory and its parents if they don't exist. 652 | """ 653 | # Why does makedirs raise an error if the directory exists? Nobody ever wants 654 | # that. 655 | try: 656 | os.makedirs(path) 657 | except OSError as e: 658 | if e.errno != errno.EEXIST: 659 | raise 660 | 661 | # This exception is raised if the operation is cancelled by the user.
662 | class CancelledException(Exception): pass 663 | 664 | class ProgressWindow(object): 665 | def __init__(self): 666 | self._cancel = False 667 | 668 | def show(self, title, total_progress_values): 669 | pass 670 | 671 | def hide(self): 672 | pass 673 | 674 | def cancel(self): 675 | self._cancel = True 676 | 677 | def check_cancellation(self): 678 | if self._cancel: 679 | raise CancelledException() 680 | 681 | def set_main_progress(self, job): 682 | # Check for cancellation when we update progress. 683 | self.check_cancellation() 684 | 685 | def set_task_progress(self, label, percent=None, force=False): 686 | # Check for cancellation when we update progress. 687 | self.check_cancellation() 688 | 689 | if __name__ == "__main__": 690 | import doctest 691 | doctest.testmod() 692 | 693 | -------------------------------------------------------------------------------- /scripts/dsonimport/uvset.py: -------------------------------------------------------------------------------- 1 | import logging, math 2 | from dson import DSON 3 | 4 | log = logging.getLogger('DSONImporter') 5 | 6 | class UVSet(object): 7 | @classmethod 8 | def create_from_dson_uvset(cls, dson_uv_set, polys): 9 | """ 10 | Create a UVSet from a DSON UV set node. 11 | """ 12 | uv_indices = dson_uv_set['polygon_vertex_indices'] 13 | static_uv_indices = dson_uv_set['vertex_count'] 14 | poly_vert_map = { (entry[0], entry[1]): entry[2] for entry in uv_indices.value } 15 | 16 | # DSON compresses UVs by storing a UV per vertex, then having a mapping of vertex/face pairs 17 | # that have a different UV, for where adjacent faces have different UVs. We need to send the 18 | # UVs to Maya per face vertex anyway, and it's more convenient to have a single mapping, so 19 | # we'll just create a per-face-vertex UV list and throw away the compressed data. 
20 | uv_indices_per_poly = [] 21 | for poly_idx, poly in enumerate(polys): 22 | # Unless the UV set remaps an individual face vertex, this is the index in the UV set. 23 | # This will be changed when the mesh is culled or grafted. 24 | # poly['uv_idx'] = poly_idx 25 | uv_indices = [] 26 | source_poly_idx = poly['source_poly_idx'] 27 | for vertex_idx in poly['vertex_indices']: 28 | try: 29 | uv_idx = poly_vert_map[source_poly_idx, vertex_idx] 30 | except KeyError as e: 31 | if vertex_idx < static_uv_indices: 32 | uv_idx = vertex_idx 33 | else: 34 | raise RuntimeError('Out of range index: (%i,%i,%i)' % (poly_idx, vertex_idx, static_uv_indices)) 35 | 36 | uv_indices.append(uv_idx) 37 | uv_indices_per_poly.append(uv_indices) 38 | 39 | return cls(dson_uv_set.get_label(), dson_uv_set['uvs/values'].value, uv_indices_per_poly, dson_uv_set=dson_uv_set) 40 | 41 | def __init__(self, name, uv_values, uv_indices_per_poly, dson_uv_set=None): 42 | if dson_uv_set is not None: 43 | assert isinstance(dson_uv_set, DSON.DSONNode), dson_uv_set 44 | 45 | self.dson_uv_set = dson_uv_set 46 | self.name = name 47 | 48 | # Be sure to make a copy of this, since we can modify it when remapping and we don't 49 | # want to modify it in the source data. 50 | self.uv_values = list(uv_values) 51 | self.uv_indices_per_poly = uv_indices_per_poly 52 | 53 | def __repr__(self): 54 | return 'UVSet(%s)' % self.name 55 | 56 | def cull_unused(self, polys): 57 | """ 58 | Remove unused UVs. polys is a face list from LoadedMesh.polys. 59 | """ 60 | used_uv_indices = [] 61 | 62 | uv_indices_per_poly = self.uv_indices_per_poly 63 | for poly_idx, poly in enumerate(polys): 64 | uv_indices = uv_indices_per_poly[poly_idx] 65 | for idx, vertex_idx in enumerate(poly['vertex_indices']): 66 | used_uv_indices.append(uv_indices[idx]) 67 | 68 | used_uv_indices = list(set(used_uv_indices)) 69 | used_uv_indices.sort() 70 | 71 | # XXX array instead of dict?
72 | old_uv_indices_to_new = {old_uv_idx: new_uv_idx for new_uv_idx, old_uv_idx in enumerate(used_uv_indices)} 73 | new_uv_indices_to_old = {b: a for a, b in old_uv_indices_to_new.items()} 74 | 75 | self.uv_values = [self.uv_values[new_uv_indices_to_old[idx]] for idx in xrange(len(new_uv_indices_to_old))] 76 | 77 | for poly_idx, poly in enumerate(polys): 78 | uv_indices_per_poly[poly_idx] = [old_uv_indices_to_new[uv_idx] for uv_idx in uv_indices_per_poly[poly_idx]] 79 | 80 | def get_uv_bounds(self, face_indices): 81 | """ 82 | Return (min_u, min_v), (max_u, max_v) for UVs used by faces in face_indices. 83 | """ 84 | minimum = [10e10, 10e10] 85 | maximum = [-10e10, -10e10] 86 | for face_idx in face_indices: 87 | uv_indices = self.uv_indices_per_poly[face_idx] 88 | for uv_index in uv_indices: 89 | value = self.uv_values[uv_index] 90 | 91 | minimum[0] = min(minimum[0], value[0]) 92 | minimum[1] = min(minimum[1], value[1]) 93 | maximum[0] = max(maximum[0], value[0]) 94 | maximum[1] = max(maximum[1], value[1]) 95 | 96 | return minimum, maximum 97 | 98 | def get_uv_tile_bounds(self, face_indices): 99 | """ 100 | Return (min_u, min_v), (max_u, max_v) in this UVSet, rounded outwards to an integer. 101 | 102 | If this UVSet only uses UVs within 0-1, return (0,0), (1,1).
103 | """ 104 | minimum, maximum = self.get_uv_bounds(face_indices) 105 | minimum[0] = int(math.floor(minimum[0])) 106 | minimum[1] = int(math.floor(minimum[1])) 107 | maximum[0] = int(math.ceil(maximum[0])) 108 | maximum[1] = int(math.ceil(maximum[1])) 109 | return minimum, maximum 110 | 111 | -------------------------------------------------------------------------------- /scripts/dsonimport/wscandir/__init__.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/zewt/zDsonImport/ae9beb87e09a6366e5f1b07a598478f20fa101ae/scripts/dsonimport/wscandir/__init__.py -------------------------------------------------------------------------------- /scripts/dsonimport/wscandir/_scandir.pyd: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/zewt/zDsonImport/ae9beb87e09a6366e5f1b07a598478f20fa101ae/scripts/dsonimport/wscandir/_scandir.pyd -------------------------------------------------------------------------------- /scripts/dsonimport/wscandir/source.txt: -------------------------------------------------------------------------------- 1 | Built for Windows from https://github.com/benhoyt/scandir v1.2. Building this is annoying: 2 | 3 | - Python modules only build with VC2013, not 2015. 4 | - Install Python 2.7 for Windows. 5 | - Run SET VS90COMNTOOLS=%VS120COMNTOOLS% to work around "Unable to find vcvarsall.bat". 6 | - "C:\Program Files\Python27\python.exe" setup.py build 7 | 8 | On other OS's where this isn't built it'll use the fallback. This optimization is mostly 9 | important on Windows, so that's OK. 10 | 11 | 12 | This is renamed to "wscandir" to avoid confusion with any scandir installed by the system. 13 | --------------------------------------------------------------------------------
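The tile rounding performed by `get_uv_tile_bounds` in uvset.py can be sketched standalone. This is an illustrative helper (`uv_tile_bounds` is hypothetical, not part of the importer): it takes a flat list of (u, v) pairs rather than a UVSet, then rounds the bounding box outward to whole UV tiles the same way:

```python
import math

def uv_tile_bounds(uvs):
    # Hypothetical standalone sketch mirroring UVSet.get_uv_tile_bounds:
    # min/max over all UVs, rounded outward to integer tile boundaries.
    min_u = min(u for u, v in uvs)
    min_v = min(v for u, v in uvs)
    max_u = max(u for u, v in uvs)
    max_v = max(v for u, v in uvs)
    return ((int(math.floor(min_u)), int(math.floor(min_v))),
            (int(math.ceil(max_u)), int(math.ceil(max_v))))

# UVs entirely inside the 0-1 square stay within tile (0,0)-(1,1).
print(uv_tile_bounds([(0.2, 0.3), (0.8, 0.9)]))  # ((0, 0), (1, 1))
# A UV past u=1 expands the bounds into the next tile column.
print(uv_tile_bounds([(0.2, 0.3), (1.5, 0.9)]))  # ((0, 0), (2, 1))
```

Rounding outward rather than truncating is what guarantees the 0-1 case reported in the docstring: any UV inside the unit square yields bounds of (0,0), (1,1).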