├── .gitignore
├── COMP3360_ANI_T1.pdf
├── COMP3360_ANI_T2.pdf
├── COMP3360_ANI_T3.pdf
├── COMP3360_ANI_T4.pdf
├── COMP3360_ANI_T5.pdf
├── COMP3360_ANI_T6.pdf
├── COMP3360_ANI_T7.pdf
├── README.md
├── assignment_1
│   ├── README.md
│   ├── data
│   │   ├── default_skeleton.npy
│   │   ├── materials
│   │   │   ├── Checker.png
│   │   │   └── GroundScene.egg
│   │   └── motion_walking.bvh
│   ├── env_test.py
│   ├── file_io.py
│   ├── task1
│   │   └── hm.obj
│   ├── task2_forward_kinematic.py
│   ├── task3_inverse_kinematic.py
│   └── viewer.py
├── assignment_2
│   ├── README.md
│   ├── Viewer
│   │   ├── controller.py
│   │   ├── viewer.py
│   │   └── visualize_utils.py
│   ├── data
│   │   ├── default_skeleton.npy
│   │   ├── materials
│   │   │   ├── Checker.png
│   │   │   └── GroundScene.egg
│   │   ├── motion_running.bvh
│   │   ├── motion_walking.bvh
│   │   ├── motion_walking_long.bvh
│   │   └── push.bvh
│   ├── env_test.py
│   ├── file_io.py
│   ├── matching_utils.py
│   ├── task1_motion_editing.py
│   └── task2_motion_matching.py
├── assignment_3
│   ├── README.md
│   ├── cat.obj
│   ├── rigid_body_dynamic.py
│   └── sphere.obj
└── assignment_4
    ├── README.md
    ├── cloth.obj
    ├── cloth_pbd.py
    ├── obstacle.obj
    └── torus.obj
/.gitignore:
--------------------------------------------------------------------------------
1 | [Ll]ibrary/
2 | [Tt]emp/
3 | [Oo]bj/
4 | [Bb]uild/
5 | [Bb]uilds/
6 | __pycache__/
7 | Assets/AssetStoreTools*
8 | .DS_Store
9 | # Visual Studio cache directory
10 | .vs/
11 | solutions/
12 | keynotes/
13 |
14 | # Autogenerated VS/MD/Consulo solution and project files
15 | ExportedObj/
16 | .consulo/
17 | *.csproj
18 | *.unityproj
19 | *.sln
20 | *.suo
21 | *.tmp
22 | *.user
23 | *.userprefs
24 | *.pidb
25 | *.booproj
26 | *.svd
27 | *.pdb
28 | *.opendb
29 | *.VC.db
30 |
31 | # Unity3D generated meta files
32 | *.pidb.meta
33 | *.pdb.meta
34 |
35 | # Unity3D Generated File On Crash Reports
36 | sysinfo.txt
37 |
38 | # Builds
39 | *.apk
40 | *.unitypackage
--------------------------------------------------------------------------------
/COMP3360_ANI_T1.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Shimingyi/COMP3360_Data_Driven_Animation/0eddde45ec5f9992fc0d99581d8eff9ac6b67a0a/COMP3360_ANI_T1.pdf
--------------------------------------------------------------------------------
/COMP3360_ANI_T2.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Shimingyi/COMP3360_Data_Driven_Animation/0eddde45ec5f9992fc0d99581d8eff9ac6b67a0a/COMP3360_ANI_T2.pdf
--------------------------------------------------------------------------------
/COMP3360_ANI_T3.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Shimingyi/COMP3360_Data_Driven_Animation/0eddde45ec5f9992fc0d99581d8eff9ac6b67a0a/COMP3360_ANI_T3.pdf
--------------------------------------------------------------------------------
/COMP3360_ANI_T4.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Shimingyi/COMP3360_Data_Driven_Animation/0eddde45ec5f9992fc0d99581d8eff9ac6b67a0a/COMP3360_ANI_T4.pdf
--------------------------------------------------------------------------------
/COMP3360_ANI_T5.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Shimingyi/COMP3360_Data_Driven_Animation/0eddde45ec5f9992fc0d99581d8eff9ac6b67a0a/COMP3360_ANI_T5.pdf
--------------------------------------------------------------------------------
/COMP3360_ANI_T6.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Shimingyi/COMP3360_Data_Driven_Animation/0eddde45ec5f9992fc0d99581d8eff9ac6b67a0a/COMP3360_ANI_T6.pdf
--------------------------------------------------------------------------------
/COMP3360_ANI_T7.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Shimingyi/COMP3360_Data_Driven_Animation/0eddde45ec5f9992fc0d99581d8eff9ac6b67a0a/COMP3360_ANI_T7.pdf
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # COMP3360 Data-Driven Computer Animation
2 |
3 | Welcome to COMP3360 in the 2023-2024 academic year!
4 |
5 | This is the code repository of the HKU course COMP3360. Commits and issues are welcome.
6 |
7 | Instructor: [Prof. Taku Komura](https://www.cs.hku.hk/index.php/people/academic-staff/taku)
8 |
9 | TAs: @[Mingyi Shi](https://rubbly.cn) @[Zhouyingcheng Liao](https://zycliao.com/)
10 |
11 | 
12 |
13 | ## Instruction
14 |
15 | * Get the latest version of this repo
16 |
17 | ```shell
18 | git clone https://github.com/Shimingyi/COMP3360_Data_Driven_Animation.git -b 2024
19 | ```
20 |
21 | * Don't hesitate to seek help through the issue workspace
22 |
23 | ## Assignment 1 - Basic Character Animation
24 |
25 | In this assignment, you will become familiar with fundamental data structures and the animation creation workflow. Your task is to craft an animation clip using the provided infrastructure.
26 | Also, you need to understand the mathematical concepts behind Forward Kinematics (FK) and Inverse Kinematics (IK), and then to interpret the motion capture files and interact with them.
27 |
28 | Details: [[subfolder](./assignment_1)]
29 |
30 | #### Tutorial Slides
31 |
32 | 1. Basic Linear Algebra in Graphics [[slide](./COMP3360_ANI_T1.pdf)]
33 | 2. Forward and Inverse Kinematics [[slide](./COMP3360_ANI_T2.pdf)]
34 |
35 | #### Assessments
36 |
37 | - A rendered video with character animation (Task 1, 40%)
38 | - Core code implementation of Forward Kinematics (Task 2, 25%)
39 | - Core code implementation of Inverse Kinematics - CCD IK (Task 3, 25%)
40 | - Report (10%)
41 |
42 | #### Useful tutorials of Blender
43 |
44 | - [Basic operations of Blender](https://www.youtube.com/watch?v=B0J27sf9N1Y)
45 | - [Character rigging](https://www.youtube.com/watch?v=9dZjcFW3BRY)
46 | - [Keyframing animation](https://youtu.be/yjjLD3h3yRc?si=_-X3Nb3PRaNWeq6h)
47 |
48 | ## Assignment 2 - Animation Processing and Scripting
49 |
50 | This assignment provides a practical introduction to working with animation data through algorithms such as interpolation and concatenation. Additionally, you will learn to select variables from motion data to improve the performance of the motion matching method.
51 |
52 | Details: [[subfolder](./assignment_2)]
53 |
54 | #### Tutorial Slides
55 |
56 | 1. Basic motion processing [[slides](./COMP3360_ANI_T3.pdf)]
57 | 2. Real-time Character Control [[slides](./COMP3360_ANI_T4.pdf)]
58 |
59 | #### Assessments
60 |
61 | * part1_key_framing (30%)
62 | * Linear interpolation (10%); Slerp Interpolation (15%)
63 | * Report how the performance differs with different parameter values (5%)
64 | * part2_concatenation (35%)
65 | * Define the search window (10%) + Calculate the sim_matrix (10%);
66 | * Find the real_i and real_j (10%);
67 | * The shifting of the root joint position (5%)
68 | * part3_motion_matching (25%)
69 | * Fewer variables with better performance (total 15%, 22% - your_variable_num)
70 | * System analysis (10%) covering variable selection, future frame range, etc.
71 | * Report (8%) + 2 videos (2%)
72 | * Include necessary experiment results with *different parameters* (4%) and your *thinking* (4%) on how to produce high-quality motions.
73 |
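For part 1, quaternion keyframe interpolation can be prototyped with SciPy's `Slerp` before wiring it into the assignment files. This is a minimal standalone sketch, not the assignment template; the variable names (`key_times`, `key_rots`) are illustrative only:

```python
import numpy as np
from scipy.spatial.transform import Rotation as R, Slerp

# two rotation keyframes: 0 and 90 degrees about the z axis
key_times = [0.0, 1.0]
key_rots = R.from_euler('z', [0.0, 90.0], degrees=True)

# Slerp interpolates on the shortest arc between the keyframe quaternions
slerp = Slerp(key_times, key_rots)
mid = slerp([0.5])  # sample halfway between the two keyframes

# halfway between 0 and 90 degrees about z is 45 degrees
print(mid.as_euler('zyx', degrees=True)[0][0])
```

Linear (lerp) interpolation of the quaternion components followed by re-normalization is the cheaper alternative; comparing the two on dense vs. sparse keyframes is exactly the kind of experiment the report asks for.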
74 | Please contact zycliao@cs.hku.hk or myshi@cs.hku.hk if you have any questions.
75 |
--------------------------------------------------------------------------------
/assignment_1/README.md:
--------------------------------------------------------------------------------
1 | # Assignment 1 - Basic Character Animation
2 |
3 |
4 | * Submission DDL: Feb. 19th (Monday)
5 |
6 | ## Introduction
7 |
8 | In this assignment, you will learn the basic data structures and the animation creation pipeline, and you are required to create an animation clip with the provided infrastructure. You also need to understand the mathematics behind FK and IK in order to read motion capture files and interact with them.
9 |
10 | The experiment environment is adapted from [GAMES105](https://github.com/GAMES-105/GAMES-105); thanks for open-sourcing it.
11 |
12 | #### Submission
13 |
14 | File format: A compressed file [uid_name_assignment1.zip] containing:
15 |
16 | 1. video_file.mp4
17 | 2. task2_forward_kinematic.py, task3_inverse_kinematic.py
18 | 3. uid_name_1.pdf
19 |
20 | ## Examples
21 |
22 |
23 | Desired results for Task 1 (from students in 2022)
24 |
25 |
26 | 
27 |
28 |
29 | Desired results for Task 2 & 3
30 |
31 |
32 | 
33 |
34 | ## Task 1 - A rendered video with character animation (40%)
35 |
36 | - Download [Blender](https://www.blender.org/download/)
37 | - Import the [provided mesh](https://github.com/Shimingyi/COMP3360_Data_Driven_Animation/blob/2024/assignment_1/task1/hm.obj) (feel free to use your own mesh if you like)
38 | - Define your key joints and skeleton
39 | - Rigging/Skinning
40 | - Design keyframe animation (feel free to use your creativity and add any objects you like)
41 | - Render the video (making use of lights and the camera)
42 |
43 | ## Before Task 2 & 3
44 |
45 | #### Environment Setting
46 |
47 | Tasks 2 & 3 require a Python 3 runtime, and Panda3D is used for rendering.
48 |
49 | ```shell
50 | # recommended: use Anaconda to manage the environment
51 | conda create -n comp3360 python=3.8
52 | conda activate comp3360
53 | conda install numpy scipy
54 | pip install panda3d
55 |
56 | # Environment verification. After running, you should see a skeleton in a 3D space
57 | # If you are using an Apple Silicon chip and meet a black screen problem, check this ISSUE: https://github.com/Shimingyi/COMP3360_Data_Driven_Animation/issues/4
58 | cd ./assignment_1
59 | python env_test.py
60 | ```
61 |
62 | #### Pre-knowledge
63 |
64 | Human Skeleton and Motion
65 |
66 | * As introduced in the tutorial, a body skeleton is a collection of bones representing the connections between body joints. A body joint is a 3D point in space, with (x, y, z) positional information as its primary data. Sometimes the offset between two joints is also used, formulated as the vector (x2-x1, y2-y1, z2-z1).
67 | * When we rotate a bone, a rotation value is applied to the parent joint of that bone. For example, if we have joint_1 at (0, 0) and joint_2 at (1, 0), then a 45-degree anticlockwise rotation on joint_1 will move joint_2 from (1, 0) to (cos45, sin45).
68 | * Global/Local Rotation. The local rotation is always the relative rotation from the parent to the child joint, while the global rotation (orientation) is the rotation relative to the global coordinate frame. In the above case, after the rotation, the orientations of J1 and J2 are both 45 degrees, but the local rotation of J2 stays at 0 degrees; only the local rotation of J1 is updated to 45 degrees.
69 | * A BVH file is a combination of joint offsets and local rotations; you can find more details at this [link](https://research.cs.wisc.edu/graphics/Courses/cs-838-1999/Jeff/BVH.html).
70 | * A quaternion is a rotation representation; you can find the details and its history on [wiki](https://en.wikipedia.org/wiki/Quaternion). In this project, (x, y, z, w) is the default order.
71 |
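The local/global distinction above can be checked numerically with SciPy. This is a toy sketch of the J1/J2 example, not part of the assignment code; all variable names here are illustrative:

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

# J1 at the origin, J2 offset by (1, 0, 0); rotate J1 by 45 degrees about z
local_j1 = R.from_euler('z', 45, degrees=True)
local_j2 = R.identity()            # J2's local rotation stays 0 degrees

orient_j1 = local_j1               # global orientation of J1 is 45 degrees
orient_j2 = orient_j1 * local_j2   # global orientation of J2 is also 45 degrees

# J2's global position: J1's position plus the rotated offset
pos_j2 = orient_j1.apply([1.0, 0.0, 0.0])  # (1, 0) -> (cos45, sin45)
print(np.round(pos_j2, 4))
```

Note that only J1's local rotation changed, yet both joints' orientations are 45 degrees; this is exactly the behavior described in the bullet above.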
72 | Vector and Rotation
73 |
74 | * NumPy and SciPy will be used.
75 | * Vector, Quaternion, Rotation_by_Vector, Euler Rotation, and Matrix definitions:
76 | ```python
77 | vector = np.array([1, 1, 1]) # we use a vector to represent the BONE
78 | quat = R.from_quat([ 0. , 0.80976237, 0.53984158, -0.22990426])
79 | rot_vec = R.from_rotvec([0, 3, 2])
80 | euler_angle = R.from_euler('XYZ', [-109.60643629, -21.85973481, -164.48716608], degrees=True)
81 | matrix = R.from_matrix([[-0.89428806, 0.24822376, -0.37233564],
82 | [-0.24822376, 0.41714214, 0.8742868 ],
83 | [ 0.37233564, 0.8742868 , -0.31143019]])
84 | ```
85 | * Rotate a vector
86 | ```python
87 | orientation = R.from_rotvec([0, 3, 2]) # or any of R.from_quat / R.from_euler / R.from_matrix
88 | orientation.apply(vector)
89 | ```
90 | * For more details, please see: [scipy.spatial.transform.Rotation](https://docs.scipy.org/doc/scipy/reference/generated/scipy.spatial.transform.Rotation.html)
91 |
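Putting the snippets above together, here is a self-contained check of the (x, y, z, w) quaternion order and of composing rotations (a minimal sketch assuming only NumPy and SciPy; the variable names are illustrative):

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

vector = np.array([1.0, 0.0, 0.0])

# 90 degrees about z in SciPy's (x, y, z, w) quaternion order
quat = R.from_quat([0.0, 0.0, np.sin(np.pi / 4), np.cos(np.pi / 4)])
rotated = quat.apply(vector)       # (1, 0, 0) -> (0, 1, 0)

# composition: "first r1, then r2" is written r2 * r1
r1 = R.from_euler('z', 45, degrees=True)
r2 = R.from_euler('z', 45, degrees=True)
combined = r2 * r1                 # also 90 degrees about z

print(np.round(rotated, 4), np.round(combined.apply(vector), 4))
```

Keeping the (x, y, z, w) order straight matters because some other libraries use (w, x, y, z); mixing the two silently produces wrong poses.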
92 | ## Task 2 - Forward Kinematics
93 |
94 | In this task, we will load the data from the BVH file and visualize the skeleton based on two data types - Offset and Rotation. For convenience, most code modules (file parsing, visualization) have been provided, and only the core implementation needs to be filled in.
95 |
96 | You are required to implement two functions in *[task2_forward_kinematic.py](https://github.com/Shimingyi/COMP3360_Data_Driven_Animation/blob/2024/assignment_1/task2_forward_kinematic.py "task2_forward_kinematic.py").*
97 |
98 | 1. Starting from line 8, there is a function called *part1_show_T_pose*. Fill in the code based on the instructions, then uncomment line 131 and run the script to call the function. You should see a T-pose centered at (0, 0, 0).
99 | 2. Starting from line 45, there is a function called *part2_forward_kinametic*. Fill in the code based on the instructions, then uncomment line 134 and run the script to call the function. You should see a walking motion if show_animation is set to True, otherwise a static pose.
100 |
101 | ```python
102 | # Inside the main function, uncomment the corresponding line to test the different functions
103 |
104 | # part1_show_T_pose(viewer, joint_names, joint_parents, joint_offsets)
105 | # part2_forward_kinametic(viewer, joint_names, joint_parents, joint_offsets, local_joint_positions, local_joint_rotations, show_animation=True)
106 | ```
107 |
108 | Screenshots of the T-pose and the walking motion are expected in the assignment report.
109 |
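Before filling in the template, the FK recursion can be sanity-checked on a toy chain. This is a sketch under the same parent-before-child ordering assumption the template makes, not the assignment solution; `parents`, `offsets`, and `local_rots` are toy variables, not the BVH arrays:

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

# toy chain J1 -> J2 -> J3: offsets of (1, 0, 0), 45-degree local z-rotations on J1 and J2
parents = [-1, 0, 1]
offsets = [np.zeros(3), np.array([1.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])]
local_rots = [R.from_euler('z', 45, degrees=True),
              R.from_euler('z', 45, degrees=True),
              R.identity()]

global_pos = [np.zeros(3)] * 3
global_ori = [R.identity()] * 3
for j, p in enumerate(parents):    # parents is topologically sorted, so one pass suffices
    if p == -1:                    # root: global equals local
        global_pos[j] = offsets[j]
        global_ori[j] = local_rots[j]
    else:                          # child: parent position + parent-rotated offset
        global_pos[j] = global_pos[p] + global_ori[p].apply(offsets[j])
        global_ori[j] = global_ori[p] * local_rots[j]

print(np.round(global_pos[2], 4))  # matches the hint's (sin45, 1 + sin45)
```

The assignment's FK is the same recursion, just vectorized over frames and joints.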
110 | ## Task 3 - Inverse Kinematics - CCD IK
111 |
112 | In Task 1, you may have become familiar with controlling the whole body pose through a few control points, but that relies on a built-in Blender function. In this task, you will implement a simple IK algorithm yourself!
113 |
114 | CCD IK is the default option in the code; most code modules are already provided, and you only need to fill in the core algorithm as shown here:
115 |
116 | 
117 |
118 | 1. Given P1, ..., P4, and the target position for P4
119 | 2. In each iteration:
120 |    1. Pick the current cursor joint, for example -> P3
121 |    2. Find the vector from P3 to P4
122 |    3. Find the vector from P3 to the target
123 |    4. Rotate the vector P3->P4 toward the direction of P3->target (be careful here!)
124 |    5. Update the chain orientations
125 |    6. Move the cursor to P2
126 |
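The steps above can be sketched on four 2D points. This is a minimal standalone NumPy version for intuition only; the assignment works with SciPy rotations on the full chain, and `ccd_iteration` is a hypothetical helper, not part of the template:

```python
import numpy as np

def ccd_iteration(points, target):
    """One CCD sweep: rotate each cursor joint so the end effector moves toward target."""
    for i in range(len(points) - 2, -1, -1):   # cursor: P3, then P2, then P1
        to_end = points[-1] - points[i]        # vector from cursor to end effector
        to_tgt = target - points[i]            # vector from cursor to target
        to_end = to_end / np.linalg.norm(to_end)
        to_tgt = to_tgt / np.linalg.norm(to_tgt)
        # angle between the two vectors (clip guards arccos against rounding)
        angle = np.arccos(np.clip(np.dot(to_end, to_tgt), -1.0, 1.0))
        # 2D rotation direction from the z-component of the cross product
        sign = np.sign(to_end[0] * to_tgt[1] - to_end[1] * to_tgt[0])
        c, s = np.cos(sign * angle), np.sin(sign * angle)
        rot = np.array([[c, -s], [s, c]])
        for j in range(i + 1, len(points)):    # rotate everything below the cursor
            points[j] = points[i] + rot @ (points[j] - points[i])
    return points

points = [np.array([float(i), 0.0]) for i in range(4)]  # P1..P4 along the x axis
target = np.array([1.0, 2.0])                            # reachable target
for _ in range(50):
    points = ccd_iteration(points, target)
print(np.round(points[-1], 3))  # end effector close to the target
```

Note that each sweep only rotates joints, so bone lengths are preserved; that invariant is a useful debugging check for your own implementation.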
127 | You are required to implement one function in *[task3_inverse_kinematic.py](https://github.com/Shimingyi/COMP3360_Data_Driven_Animation/blob/2024/assignment_1/task3_inverse_kinematic.py)* and try different IK settings (iteration number / start joint / end joint / target position), then report them with screenshots and a comparison table.
128 |
129 | * The function starts at line 48, and only a few lines need to be filled in.
130 | * Any other IK implementation is welcome.
131 |
132 | ```python
133 | # Inside the main function, uncomment the corresponding line to test different IK configurations
134 |
135 | IK_example(viewer, np.array([0.5, 0.75, 0.5]), 'RootJoint', 'lWrist_end')
136 | # IK_example(viewer, np.array([0.5, 0.75, 0.5]), 'lToeJoint_end', 'lWrist_end')
137 | # IK_interactive(viewer, np.array([0.5, 0.75, 0.5]), 'RootJoint', 'lWrist_end')
138 | # IK_interactive(viewer, np.array([0.5, 0.75, 0.5]), 'lToeJoint_end', 'lWrist_end')
139 | # IK_interactive(viewer, np.array([0.5, 0.75, 0.5]), 'rToeJoint_end', 'lWrist_end')
140 | ```
141 |
142 | ## Task 4 - Report
143 |
144 | * PDF format, no page size requirement, so you can also prepare it with PowerPoint or Keynote
145 | * The first two lines should introduce your NAME and UID.
146 | * Screenshots of each subtask need to be included in the report.
147 |
--------------------------------------------------------------------------------
/assignment_1/data/default_skeleton.npy:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Shimingyi/COMP3360_Data_Driven_Animation/0eddde45ec5f9992fc0d99581d8eff9ac6b67a0a/assignment_1/data/default_skeleton.npy
--------------------------------------------------------------------------------
/assignment_1/data/materials/Checker.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Shimingyi/COMP3360_Data_Driven_Animation/0eddde45ec5f9992fc0d99581d8eff9ac6b67a0a/assignment_1/data/materials/Checker.png
--------------------------------------------------------------------------------
/assignment_1/data/materials/GroundScene.egg:
--------------------------------------------------------------------------------
1 | { Y-up }
2 |
3 | Checker {
4 | "Checker.png"
5 | envtype { MODULATE }
6 | minfilter { LINEAR_MIPMAP_LINEAR }
7 | magfilter { LINEAR_MIPMAP_LINEAR }
8 | wrap { REPEAT }
9 | }
10 | Cube {
11 | {
12 | {
13 | 1.0 0.0 0.0 0.0
14 | 0.0 1.0 0.0 0.0
15 | 0.0 0.0 1.0 0.0
16 | 0.0 0.0 0.0 1.0
17 | }
18 | }
19 |
20 | Cube {
21 | 0 {-1.000000 -1.000000 -1.000000 { 0.0 0.0 } }
22 | 1 {-1.000000 -1.000000 1.000000 { 0.0 1.0 } }
23 | 2 {-1.000000 1.000000 1.000000 { 1.0 1.0 } }
24 | 3 {-1.000000 1.000000 -1.000000 { 1.0 0.0 } }
25 | 4 {-1.000000 1.000000 -1.000000 { 0.0 0.0 } }
26 | 5 {-1.000000 1.000000 1.000000 { 0.0 1.0 } }
27 | 6 { 1.000000 1.000000 1.000000 { 1.0 1.0 } }
28 | 7 { 1.000000 1.000000 -1.000000 { 1.0 0.0 } }
29 | 8 { 1.000000 1.000000 -1.000000 { 0.0 0.0 } }
30 | 9 { 1.000000 1.000000 1.000000 { 0.0 1.0 } }
31 | 10 { 1.000000 -1.000000 1.000000 { 1.0 1.0 } }
32 | 11 { 1.000000 -1.000000 -1.000000 { 1.0 0.0 } }
33 | 12 { 1.000000 -1.000000 -1.000000 { 0.0 0.0 } }
34 | 13 { 1.000000 -1.000000 1.000000 { 0.0 1.0 } }
35 | 14 {-1.000000 -1.000000 1.000000 { 1.0 1.0 } }
36 | 15 {-1.000000 -1.000000 -1.000000 { 1.0 0.0 } }
37 | 16 {-1.000000 1.000000 -1.000000 { 0.0 0.0 } }
38 | 17 { 1.000000 1.000000 -1.000000 { 0.0 1.0 } }
39 | 18 { 1.000000 -1.000000 -1.000000 { 1.0 1.0 } }
40 | 19 {-1.000000 -1.000000 -1.000000 { 1.0 0.0 } }
41 | 20 { 1.000000 1.000000 1.000000 { 0.0 0.0 } }
42 | 21 {-1.000000 1.000000 1.000000 { 0.0 1.0 } }
43 | 22 {-1.000000 -1.000000 1.000000 { 1.0 1.0 } }
44 | 23 { 1.000000 -1.000000 1.000000 { 1.0 0.0 } }
45 | }
46 |
47 |
48 | {
49 | { Checker }
50 | {-1.000000 0.000000 0.000000}
51 | { 0 1 2 3 [ { Cube }}
52 | }
53 | {
54 | { Checker }
55 | {0.000000 1.000000 0.000000}
56 | { 4 5 6 7 ][ { Cube }}
57 | }
58 | {
59 | { Checker }
60 | {1.000000 0.000000 0.000000}
61 | { 8 9 10 11 ][ { Cube }}
62 | }
63 | {
64 | { Checker }
65 | {0.000000 -1.000000 0.000000}
66 | { 12 13 14 15 ][ { Cube }}
67 | }
68 | {
69 | { Checker }
70 | {0.000000 0.000000 -1.000000}
71 | { 16 17 18 19 ][ { Cube }}
72 | }
73 | {
74 | { Checker }
75 | {0.000000 0.000000 1.000000}
76 | { 20 21 22 23 ][ { Cube }}
77 | }
78 | }
79 |
--------------------------------------------------------------------------------
/assignment_1/env_test.py:
--------------------------------------------------------------------------------
1 | from viewer import SimpleViewer
2 |
3 | if __name__ == "__main__":
4 | viewer = SimpleViewer()
5 | viewer.run()
6 |
--------------------------------------------------------------------------------
/assignment_1/file_io.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 | from scipy.spatial.transform import Rotation as R
3 |
4 | def load_meta_data(bvh_path):
5 | with open(bvh_path, 'r') as f:
6 | channels = []
7 | joint_names = []
8 | joint_parents = []
9 | joint_offsets = []
10 | end_sites = []
11 |
12 | parent_stack = [None]
13 | for line in f:
14 | if 'ROOT' in line or 'JOINT' in line:
15 | joint_names.append(line.split()[-1])
16 | joint_parents.append(parent_stack[-1])
17 | channels.append('')
18 | joint_offsets.append([0, 0, 0])
19 |
20 | elif 'End Site' in line:
21 | end_sites.append(len(joint_names))
22 | joint_names.append(parent_stack[-1] + '_end')
23 | joint_parents.append(parent_stack[-1])
24 | channels.append('')
25 | joint_offsets.append([0, 0, 0])
26 |
27 | elif '{' in line:
28 | parent_stack.append(joint_names[-1])
29 |
30 | elif '}' in line:
31 | parent_stack.pop()
32 |
33 | elif 'OFFSET' in line:
34 | joint_offsets[-1] = np.array([float(x) for x in line.split()[-3:]]).reshape(1,3)
35 |
36 | elif 'CHANNELS' in line:
37 | trans_order = []
38 | rot_order = []
39 | for token in line.split():
40 | if 'position' in token:
41 | trans_order.append(token[0])
42 |
43 | if 'rotation' in token:
44 | rot_order.append(token[0])
45 |
46 | channels[-1] = ''.join(trans_order)+ ''.join(rot_order)
47 |
48 | elif 'Frame Time:' in line:
49 | break
50 |
51 | joint_parents = [-1]+ [joint_names.index(i) for i in joint_parents[1:]]
52 | channels = [len(i) for i in channels]
53 | return joint_names, joint_parents, channels, joint_offsets
54 |
55 |
56 | def load_motion_data(bvh_file_path):
57 | with open(bvh_file_path, 'r') as f:
58 | lines = f.readlines()
59 | for i in range(len(lines)):
60 | if lines[i].startswith('Frame Time'):
61 | break
62 | motion_data = []
63 | for line in lines[i+1:]:
64 | data = [float(x) for x in line.split()]
65 | if len(data) == 0:
66 | break
67 | motion_data.append(np.array(data).reshape(1,-1))
68 | motion_data = np.concatenate(motion_data, axis=0)
69 |
70 | joint_names, joint_parents, channels, joint_offsets = load_meta_data(bvh_file_path)
71 | joint_number = len(joint_names)
72 |
73 | local_joint_positions = np.zeros((motion_data.shape[0], joint_number, 3))
74 | local_joint_rotations = np.zeros((motion_data.shape[0], joint_number, 4))
75 | local_joint_rotations[:,:,3] = 1.0
76 |
77 | cur_channel = 0
78 | for i in range(len(joint_names)):
79 | if channels[i] == 0:
80 | local_joint_positions[:,i,:] = joint_offsets[i].reshape(1,3)
81 | continue
82 | elif channels[i] == 3:
83 | local_joint_positions[:,i,:] = joint_offsets[i].reshape(1,3)
84 | rotation = motion_data[:, cur_channel:cur_channel+3]
85 | elif channels[i] == 6:
86 | local_joint_positions[:, i, :] = motion_data[:, cur_channel:cur_channel+3]
87 | rotation = motion_data[:, cur_channel+3:cur_channel+6]
88 | local_joint_rotations[:, i, :] = R.from_euler('XYZ', rotation, degrees=True).as_quat()
89 | cur_channel += channels[i]
90 |
91 | return motion_data, local_joint_positions, local_joint_rotations
--------------------------------------------------------------------------------
/assignment_1/task2_forward_kinematic.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 | from scipy.spatial.transform import Rotation as R
3 |
4 | import file_io as bvh_reader
5 | from viewer import SimpleViewer
6 |
7 |
8 | def part1_show_T_pose(viewer, joint_names, joint_parents, joint_offsets):
9 | '''
10 | A function to show the T-pose of the skeleton
11 | joint_names: Shape - (J) a list to store the name of each joint
12 | joint_parents: Shape - (J) a list to store the parent index of each joint, -1 means no parent
13 | joint_offsets: Shape - (J, 1, 3) an array to store the local offset to the parent joint
14 | '''
15 | global_joint_position = np.zeros((len(joint_names), 3))
16 | for joint_idx, parent_idx in enumerate(joint_parents):
17 | '''
18 | TODO: How to update global_joint_position by OFFSETS?
19 | Toy Sample:
20 | Joint1 in (0, 0)
21 | the offset between J1 and J2 is (1, 1)
22 | the offset between J2 and J3 is (1, 1)
23 | the offset between J3 and J4 is (1, 1)
24 | Parent -> Children: J1 -> J2 -> J3 -> J4
25 | so the global joint position of J4 is (0, 0) + (1, 1) + (1, 1) + (1, 1) = (3, 3)
26 | Hints:
27 | 1. There is a joint tree with parent-child relationship (joint_idx, parent_idx)
28 | 2. The OFFSET between joint_idx and parent_idx is known as *joint_offsets[joint_idx]*
29 | 3. The *joint_parents* variable is a topological sort of the skeleton tree
30 | * Joints listed after the current joint MUST be its descendants or have no connection to it
31 | * One iteration over joint_parents is enough to cover all joint chains
32 | 4. If parent_idx == -1, then there is no parent joint for current joint,
33 | and the global position of current joint is the same as the local position;
34 | else, the current joint global position = the sum of all parent joint offsets
35 | '''
36 | ########## Code Start ############
37 |
38 |
39 | ########## Code End ############
40 | viewer.set_joint_position_by_name(joint_names[joint_idx], global_joint_position[joint_idx])
41 |
42 | viewer.run()
43 |
44 |
45 | def part2_forward_kinametic(viewer, joint_names, joint_parents, joint_offsets, joint_positions, joint_rotations, show_animation=False):
46 | '''
47 | A function to calculate the global joint positions and orientations by FK
48 | F: Frame number; J: Joint number
49 |
50 | joint_names: Shape - (J) a list to store the name of each joint
51 | joint_parents: Shape - (J) a list to store the parent index of each joint, -1 means no parent
52 | joint_offsets: Shape - (J, 1, 3) an array to store the local offset to the parent joint
53 | joint_positions: Shape - (F, J, 3) an array to store the local joint positions
54 | joint_rotations: Shape - (F, J, 4) an array to store the local joint rotation in quaternion representation
55 | '''
56 | joint_number = len(joint_names)
57 | frame_number = joint_rotations.shape[0]
58 |
59 | global_joint_positions = np.zeros((frame_number, joint_number, 3))
60 | global_joint_orientations = np.zeros((frame_number, joint_number, 4))
61 | global_joint_orientations[:, :, 3] = 1.0
62 |
63 | '''
64 | TODO: How to update global_joint_position by rotation and offset?
65 | Sample:
66 | Joint1 in (0, 0)
67 | the offset between J1 and J2 is (1, 0)
68 | the offset between J2 and J3 is (1, 0)
69 | then rotate the joint J1 by 45 degree
70 | then rotate the joint J2 by 45 degree
71 | How to calculate the global position of J3 after two rotation operations?
72 | Tips: The results should be (sin45, 1+sin45)
73 | Hints:
74 | 1. There is a joint chain with parent-child relationship (joint_idx, parent_idx)
75 | 2. The OFFSET between joint_idx and parent_idx is known as *joint_offsets[joint_idx]*
76 | 3. The rotation of a parent joint will affect all of its child joints
77 | 4. The *joint_parents* variable is a topological sort of the skeleton
78 | * Joints listed after the current joint MUST be its descendants
79 | * One iteration over joint_parents is enough to cover all joint chains
80 | More details:
81 | 1. You can use R.from_quat() to represent a rotation in SciPy format
82 | like: parent_orientation = R.from_quat(global_joint_orientations[:, parent_idx, :])
83 | 2. Then R.apply() can apply this rotation to any vector
84 | like: rotated_offset = parent_orientation.apply(offset)
85 | 3. new_joint_position = parent_joint_position + rotated_offset
86 |
87 | '''
88 | ########## Code Start ############
89 |
90 |
91 | ########## Code End ############
92 | if not show_animation:
93 | show_frame_idx = 0
94 | viewer.show_pose(
95 | joint_names, global_joint_positions[show_frame_idx], global_joint_orientations[show_frame_idx])
96 |
97 | else:
98 | class UpdateHandle:
99 | def __init__(self):
100 | self.current_frame = 0
101 |
102 | def update_func(self, viewer_):
103 | cur_joint_position = global_joint_positions[self.current_frame]
104 | cur_joint_orientation = global_joint_orientations[self.current_frame]
105 | viewer.show_pose(
106 | joint_names, cur_joint_position, cur_joint_orientation)
107 | self.current_frame = (self.current_frame + 1) % frame_number
108 |
109 | handle = UpdateHandle()
110 | viewer.update_func = handle.update_func
111 | viewer.run()
112 |
113 |
114 | def main():
115 | viewer = SimpleViewer()
116 | bvh_file_path = "data/motion_walking.bvh"
117 |
118 | '''
119 | Basic data terms in BVH format:
120 | joint_names: Shape - (J) a list to store the name of each joint
121 | joint_parents: Shape - (J) a list to store the parent index of each joint, -1 means no parent
122 | channels: Shape - (J) a list to store the channel number of each joint
123 | joint_offsets: Shape - (J, 1, 3) an array to store the local offset to the parent joint
124 | local_joint_positions: Shape - (F, J, 3) an array to store the local joint positions
125 | local_joint_rotations: Shape - (F, J, 4) an array to store the local joint rotation in quaternion representation
126 | '''
127 | joint_names, joint_parents, channels, joint_offsets = bvh_reader.load_meta_data(bvh_file_path)
128 | _, local_joint_positions, local_joint_rotations = bvh_reader.load_motion_data(bvh_file_path)
129 |
130 | # part 1
131 | part1_show_T_pose(viewer, joint_names, joint_parents, joint_offsets)
132 |
133 | # part 2
134 | # part2_forward_kinametic(viewer, joint_names, joint_parents, joint_offsets, local_joint_positions, local_joint_rotations, show_animation=True)
135 |
136 |
137 | if __name__ == "__main__":
138 | main()
139 |
--------------------------------------------------------------------------------
/assignment_1/task3_inverse_kinematic.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 | from scipy.spatial.transform import Rotation as R
3 |
4 | from viewer import SimpleViewer
5 |
6 |
7 | def norm(v):
8 | return v/np.sqrt(np.vdot(v, v))
9 |
10 |
11 | class MetaData:
12 | '''
13 | A helper class to store meta data of the skeleton
14 | get_path_from_root_to_end: return the path from root to end joint
15 | return:
16 | path: a list of joint indices from root to end joint
17 | path_name: a list of joint names from root to end joint
18 | path1: a list of joint indices from FOOT to ROOT joint
19 | path2: a list of joint indices from ROOT to End joint
20 | '''
21 | def __init__(self, joint_name, joint_parent, joint_initial_position, root_joint, end_joint):
22 | self.joint_name = joint_name
23 | self.joint_parent = joint_parent
24 | self.joint_initial_position = joint_initial_position
25 | self.root_joint = root_joint
26 | self.end_joint = end_joint
27 |
28 | def get_path_from_root_to_end(self):
29 |
30 | path1 = [self.joint_name.index(self.end_joint)]
31 | while self.joint_parent[path1[-1]] != -1:
32 | path1.append(self.joint_parent[path1[-1]])
33 |
34 | path2 = [self.joint_name.index(self.root_joint)]
35 | while self.joint_parent[path2[-1]] != -1:
36 | path2.append(self.joint_parent[path2[-1]])
37 |
38 | while path1 and path2 and path2[-1] == path1[-1]:
39 | path1.pop()
40 | a = path2.pop()
41 |
42 | path2.append(a)
43 | path = path2 + list(reversed(path1))
44 | path_name = [self.joint_name[i] for i in path]
45 | return path, path_name, path1, path2
46 |
47 |
48 | def inverse_kinematics(meta_data, global_joint_positions, global_joint_orientations, target_pose, method='ccd'):
49 | path, path_name, path1, path2 = meta_data.get_path_from_root_to_end()
50 |
51 | '''
52 | Take out all the joint information on the IK chain
53 | The IK operation should be done on this chain
54 | Order: start_joint -> end_joint
55 | chain_positions: a list of joint positions on the IK chain
56 | chain_offsets: a list of joint offsets on the IK chain
57 | chain_orientations: a list of joint orientations on the IK chain
58 | '''
59 | chain_positions = [global_joint_positions[joint_idx] for joint_idx in path]
60 | chain_offsets = [np.array([0., 0., 0.])] + [meta_data.joint_initial_position[path[i + 1]] - meta_data.joint_initial_position[path[i]] for i in range(len(path) - 1)]
61 | chain_orientations = [R.from_quat(global_joint_orientations[path[i]]) for i in range(len(path) - 1)] + [R.identity()]
62 |
63 | # Feel free to implement any other IK methods, bonus will be given
64 | if method == 'ccd':
65 | iteration_num = 20
66 | end_joint_name = meta_data.end_joint
67 | end_idx = path_name.index(end_joint_name)
68 | for _ in range(iteration_num):
69 | for current_idx in range(end_idx - 1, 0, -1):
70 | '''
71 | TODO: How to update chain_orientations by optimizing the chain_positions(CCD)?
72 |
73 | Hints:
74 | 1. The CCD IK is an iterative algorithm, the loop structure is given
75 | 2. The CCD IK algorithm is to update the orientation of each joint on the IK chain
76 | 3. Two vectors are essential for the CCD
77 | * The vector from end joint to current joint
78 | * The vector from target position to current joint
79 | 4. The rotation matrix can be obtained by these two vectors
80 | More details about CCD algorithm can be found in the lecture slides.
81 |
82 | Useful functions:
83 | 1. norm: normalize a vector
84 | 2. the position of one joint: chain_positions[current_idx]
85 | 3. vector_between_J1_J2: get the vector from J1 to J2
86 | like: vec_cur2end = norm(j_pos[end_idx] - j_pos[current_idx])
87 | 4. The angle between two vectors: rot = np.arccos(np.vdot(vec1, vec2)) * Sometimes the rot will be nan, so np.isnan() will be helpful
88 | 5. The axis of rotation: axis = norm(np.cross(vec1, vec2))
89 | 6. The rotation matrix: rot_vec = R.from_rotvec(rot * axis)
90 | 7. Update orientation: new_orien = rot_vec * old_orien
91 | '''
92 |
93 | ########## Code Start ############
94 |
95 |
96 |
97 |
98 |
99 |
100 |
101 |
102 | ########## Code End ############
103 |
104 | chain_local_rotations = [chain_orientations[0]] + [chain_orientations[i].inv() * chain_orientations[i + 1] for i in range(len(path) - 1)]
105 | for j in range(current_idx, end_idx):
106 | chain_positions[j + 1] = chain_positions[j] + chain_orientations[j].apply(chain_offsets[j + 1])
107 | if j + 1 < end_idx:
108 | chain_orientations[j + 1] = chain_orientations[j] * chain_local_rotations[j + 1]
109 | else:
110 | chain_orientations[j + 1] = chain_orientations[j]
111 |
112 | # Update global joint positions and orientations with optimized chain
113 | local_joint_rotations = [R.identity()]*len(meta_data.joint_parent)
114 | for joint_idx, parent_idx in enumerate(meta_data.joint_parent):
115 | if parent_idx == -1:
116 | local_joint_rotations[joint_idx] = R.from_quat(global_joint_orientations[joint_idx])
117 | else:
118 | local_joint_rotations[joint_idx] = R.from_quat(global_joint_orientations[parent_idx]).inv() * R.from_quat(global_joint_orientations[joint_idx])
119 |
120 | for chain_idx, joint_idx in enumerate(path):
121 | global_joint_positions[joint_idx] = chain_positions[chain_idx]
122 | global_joint_orientations[joint_idx] = chain_orientations[chain_idx].as_quat()
123 |
124 | for chain_idx, joint_idx in enumerate(path2[:-1]):
125 | global_joint_orientations[path2[chain_idx+1]] = chain_orientations[chain_idx].as_quat()
126 | global_joint_orientations[path2[-1]] = chain_orientations[len(path2) - 1].as_quat()
127 |
128 | for joint_idx, parent_idx in enumerate(meta_data.joint_parent):
129 | if parent_idx != -1 and meta_data.joint_name[joint_idx] not in path_name:
130 | parent_orientation = global_joint_orientations[parent_idx]
131 | original_offset = meta_data.joint_initial_position[joint_idx] - meta_data.joint_initial_position[parent_idx]
132 | rotated_offset = R.from_quat(parent_orientation).apply(original_offset)
133 | global_joint_positions[joint_idx] = global_joint_positions[parent_idx] + rotated_offset
134 | global_joint_orientations[joint_idx] = (R.from_quat(global_joint_orientations[parent_idx]) * local_joint_rotations[joint_idx]).as_quat()
135 |
136 | return global_joint_positions, global_joint_orientations
137 |
138 |
139 | def IK_example(viewer, target_pos, start_joint, end_joint):
140 | '''
141 | A simple example for inverse kinematics
142 | '''
143 | viewer.create_marker(target_pos, [1, 0, 0, 1])
144 | joint_name, joint_parent, joint_initial_position = viewer.get_meta_data()
145 | meta_data = MetaData(joint_name, joint_parent, joint_initial_position, start_joint, end_joint)
146 | global_joint_position = viewer.get_joint_positions()
147 | global_joint_orientation = viewer.get_joint_orientations()
148 |
149 | joint_position, joint_orientation = inverse_kinematics(meta_data, global_joint_position, global_joint_orientation, target_pos)
150 | viewer.show_pose(joint_name, joint_position, joint_orientation)
151 | viewer.run()
152 | pass
153 |
154 |
155 | def IK_interactive(viewer, target_pos, start_joint, end_joint):
156 | '''
157 | A simple interactive example for inverse kinematics
158 | '''
159 | marker = viewer.create_marker(target_pos, [1, 0, 0, 1])
160 |
161 | joint_name, joint_parent, joint_initial_position = viewer.get_meta_data()
162 | meta_data = MetaData(joint_name, joint_parent, joint_initial_position, start_joint, end_joint)
163 | joint_position = viewer.get_joint_positions()
164 | joint_orientation = viewer.get_joint_orientations()
165 |
166 | class UpdateHandle:
167 | def __init__(self, marker, joint_position, joint_orientation):
168 | self.marker = marker
169 | self.joint_position = joint_position
170 | self.joint_orientation = joint_orientation
171 |
172 | def update_func(self, viewer):
173 | target_pos = np.array(self.marker.getPos())
174 | self.joint_position, self.joint_orientation = inverse_kinematics(meta_data, self.joint_position, self.joint_orientation, target_pos)
175 | viewer.show_pose(joint_name, self.joint_position, self.joint_orientation)
176 | handle = UpdateHandle(marker, joint_position, joint_orientation)
177 | handle.update_func(viewer)
178 | viewer.update_marker_func = handle.update_func
179 | viewer.run()
180 |
181 |
182 | def main():
183 | viewer = SimpleViewer()
184 |
185 | '''
186 | You should try different start and end joints and different target positions
187 | use WASD to move the control points in interactive mode (click the scene to activate the control points)
188 | '''
189 | IK_example(viewer, np.array([0.5, 0.75, 0.5]), 'RootJoint', 'lWrist_end')
190 | # IK_example(viewer, np.array([0.5, 0.75, 0.5]), 'lToeJoint_end', 'lWrist_end')
191 | # IK_interactive(viewer, np.array([0.5, 0.75, 0.5]), 'RootJoint', 'lWrist_end')
192 | # IK_interactive(viewer, np.array([0.5, 0.75, 0.5]), 'lToeJoint_end', 'lWrist_end')
193 | # IK_interactive(viewer, np.array([0.5, 0.75, 0.5]), 'rToeJoint_end', 'lWrist_end')
194 |
195 |
196 | if __name__ == "__main__":
197 | main()
198 |
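
For reference, the CCD update outlined in the hints can be sketched standalone on a toy chain (a minimal sketch, independent of the viewer and skeleton data; `ccd_sweep` and its argument layout are illustrative, not the assignment's required API):

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def norm(v):
    return v / np.linalg.norm(v)

def ccd_sweep(positions, orientations, offsets, target):
    # one CCD sweep: visit joints from the one before the end effector back to the base
    end = len(positions) - 1
    for cur in range(end - 1, -1, -1):
        vec_cur2end = norm(positions[end] - positions[cur])
        vec_cur2target = norm(target - positions[cur])
        angle = np.arccos(np.clip(np.vdot(vec_cur2end, vec_cur2target), -1.0, 1.0))
        if np.isnan(angle) or angle < 1e-8:
            continue
        axis = norm(np.cross(vec_cur2end, vec_cur2target))
        rot = R.from_rotvec(angle * axis)
        # rotate this joint, then run forward kinematics for everything after it
        for j in range(cur, end):
            orientations[j] = rot * orientations[j]
            positions[j + 1] = positions[j] + orientations[j].apply(offsets[j + 1])
    return positions, orientations

# a 3-link chain lying along +x, reaching for a nearby (reachable) target
offsets = [np.zeros(3)] + [np.array([1.0, 0.0, 0.0]) for _ in range(3)]
positions = [np.array([float(i), 0.0, 0.0]) for i in range(4)]
orientations = [R.identity() for _ in range(4)]
target = np.array([1.5, 1.5, 0.0])
for _ in range(20):
    positions, orientations = ccd_sweep(positions, orientations, offsets, target)
# after a few sweeps the end effector should sit very close to the target
```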
--------------------------------------------------------------------------------
/assignment_1/viewer.py:
--------------------------------------------------------------------------------
1 | from direct.showbase.ShowBase import ShowBase
2 | import numpy as np
3 | from panda3d.core import ClockObject
4 | import panda3d.core as pc
5 | import math
6 | from direct.showbase.DirectObject import DirectObject
7 | from direct.gui.DirectGui import *
8 |
9 |
10 | class CameraCtrl(DirectObject):
11 | def __init__(self, base, camera):
12 |         super().__init__()
13 | self.accept('mouse1', self.onMouse1Down)
14 | self.accept('mouse1-up', self.onMouse1Up)
15 | self.accept('mouse2', self.onMouse2Down)
16 | self.accept('mouse2-up', self.onMouse2Up)
17 | self.accept('mouse3', self.onMouse3Down)
18 | self.accept('mouse3-up', self.onMouse3Up)
19 | self.accept('wheel_down', self.onMouseWheelDown)
20 | self.accept('wheel_up', self.onMouseWheelUp)
21 |
22 | self.accept('control-mouse1', self.onMouse1Down)
23 | self.accept('control-mouse1-up', self.onMouse1Up)
24 | self.accept('control-mouse2', self.onMouse2Down)
25 | self.accept('control-mouse2-up', self.onMouse2Up)
26 | self.accept('control-mouse3', self.onMouse3Down)
27 | self.accept('control-mouse3-up', self.onMouse3Up)
28 | self.accept('control-wheel_down', self.onMouseWheelDown)
29 | self.accept('control-wheel_up', self.onMouseWheelUp)
30 |
31 | self.position = pc.LVector3(4, 4, 4)
32 | self.center = pc.LVector3(0, 1, 0)
33 | self.up = pc.LVector3(0, 1, 0)
34 |
35 | self.base = base
36 | base.taskMgr.add(self.onUpdate, 'updateCamera')
37 | self.camera = camera
38 |
39 | self._locked_info = None
40 | self._locked_mouse_pos = None
41 | self._mouse_id = -1
42 |
43 | self.pre_position_x = 0
44 | self.pre_position_y = 0
45 |
46 | self.look()
47 |
48 | def look(self):
49 | self.camera.setPos(self.position)
50 | self.camera.lookAt(self.center, self.up)
51 |
52 | @property
53 | def _mousePos(self):
54 | try:
55 | cur_position_x = self.base.mouseWatcherNode.getMouseX()
56 | cur_position_y = self.base.mouseWatcherNode.getMouseY()
57 | except:
58 | cur_position_x = self.pre_position_x
59 | cur_position_y = self.pre_position_y
60 | self.pre_position_x = cur_position_x
61 | self.pre_position_y = cur_position_y
62 | return pc.LVector2(cur_position_x, cur_position_y)
63 |
64 | def _lockMouseInfo(self):
65 | self._locked_info = (pc.LVector3(self.position),
66 | pc.LVector3(self.center), pc.LVector3(self.up))
67 | self._locked_mouse_pos = self._mousePos
68 |
69 | def onMouse1Down(self):
70 | self._lockMouseInfo()
71 | self._mouse_id = 1
72 |
73 | def onMouse1Up(self):
74 | self._mouse_id = -1
75 |
76 | def onMouse2Down(self):
77 | self._lockMouseInfo()
78 | self._mouse_id = 2
79 |
80 | def onMouse2Up(self):
81 | self._mouse_id = -1
82 |
83 | def onMouse3Down(self):
84 | self._lockMouseInfo()
85 | self._mouse_id = 3
86 |
87 | def onMouse3Up(self):
88 | self._mouse_id = -1
89 |
90 | def onMouseWheelDown(self):
91 | z = self.position - self.center
92 |
93 | scale = 1.1
94 |
95 | if scale < 0.05:
96 | scale = 0.05
97 |
98 | self.position = self.center + z * scale
99 | self.look()
100 |
101 | def onMouseWheelUp(self):
102 | z = self.position - self.center
103 |
104 | scale = 0.9
105 |
106 | if scale < 0.05:
107 | scale = 0.05
108 |
109 | self.position = self.center + z * scale
110 | self.look()
111 |
112 | def onUpdate(self, task):
113 | if self._mouse_id < 0:
114 | return task.cont
115 |
116 | mousePosOff0 = self._mousePos - self._locked_mouse_pos
117 | mousePosOff = self._mousePos - self._locked_mouse_pos
118 |
119 | if self._mouse_id == 1:
120 | z = self._locked_info[0] - self._locked_info[1]
121 |
122 | zDotUp = self._locked_info[2].dot(z)
123 | zMap = z - self._locked_info[2] * zDotUp
124 | angX = math.acos(zMap.length() / z.length()) / math.pi * 180.0
125 |
126 | if zDotUp < 0:
127 | angX = -angX
128 |
129 | angleScale = 200.0
130 |
131 | x = self._locked_info[2].cross(z)
132 | x.normalize()
133 | y = z.cross(x)
134 | y.normalize()
135 |
136 | rot_x_angle = -mousePosOff.getY() * angleScale
137 | rot_x_angle += angX
138 | if rot_x_angle > 85:
139 | rot_x_angle = 85
140 | if rot_x_angle < -85:
141 | rot_x_angle = -85
142 | rot_x_angle -= angX
143 |
144 | rot_y = pc.LMatrix3()
145 | rot_y.setRotateMat(-mousePosOff.getX() *
146 | angleScale, y, pc.CS_yup_right)
147 |
148 | rot_x = pc.LMatrix3()
149 | rot_x.setRotateMat(-rot_x_angle, x, pc.CS_yup_right)
150 |
151 | self.position = self._locked_info[1] + (rot_x * rot_y).xform(z)
152 |
153 | elif self._mouse_id == 2:
154 | z = self._locked_info[0] - self._locked_info[1]
155 |
156 | shiftScale = 0.5 * z.length()
157 |
158 | x = self._locked_info[2].cross(z)
159 | z.normalize()
160 | x.normalize()
161 | y = z.cross(x)
162 |
163 | shift = x * -mousePosOff.getX() + y * -mousePosOff.getY()
164 | shift *= shiftScale
165 | self.position = self._locked_info[0] + shift
166 | self.center = self._locked_info[1] + shift
167 |
168 | elif self._mouse_id == 3:
169 | z = self._locked_info[0] - self._locked_info[1]
170 |
171 | scale = 1
172 | scale = 1.0 + scale * mousePosOff0.getY()
173 |
174 | if scale < 0.05:
175 | scale = 0.05
176 |
177 | self.position = self._locked_info[1] + z * scale
178 |
179 | self.look()
180 |
181 | return task.cont
182 |
183 |
184 | class SimpleViewer(ShowBase):
185 | def __init__(self, fStartDirect=True, windowType=None):
186 | '''
187 | this is only used for my project... lots of assumptions...
188 | '''
189 | super().__init__(fStartDirect, windowType)
190 | self.disableMouse()
191 |
192 | self.camera.lookAt(0, 0.9, 0)
193 | self.setupCameraLight()
194 | self.camera.setHpr(0, 0, 0)
195 |
196 | self.setFrameRateMeter(True)
197 | globalClock.setMode(ClockObject.MLimited)
198 | globalClock.setFrameRate(60)
199 |
200 | self.load_ground()
201 |
202 | xSize = self.pipe.getDisplayWidth()
203 | ySize = self.pipe.getDisplayHeight()
204 | props = pc.WindowProperties()
205 | props.setSize(min(xSize-200, 800), min(ySize-200, 600))
206 | self.win.requestProperties(props)
207 |
208 | # color for links
209 | color = [131/255, 175/255, 155/255, 1]
210 | self.tex = self.create_texture(color, 'link_tex')
211 |
212 | self.load_character()
213 | self.update_func = None
214 | self.add_task(self.update, 'update')
215 | self.update_flag = True
216 | self.accept('space', self.receive_space)
217 | pass
218 |
219 | def receive_space(self):
220 | self.update_flag = not self.update_flag
221 |
222 | def create_texture(self, color, name):
223 | img = pc.PNMImage(32, 32)
224 | img.fill(*color[:3])
225 | img.alphaFill(color[3])
226 | tex = pc.Texture(name)
227 | tex.load(img)
228 | return tex
229 |
230 | def load_ground(self):
231 | self.ground = self.loader.loadModel("data/materials/GroundScene.egg")
232 | self.ground.reparentTo(self.render)
233 | self.ground.setScale(100, 1, 100)
234 | self.ground.setTexScale(pc.TextureStage.getDefault(), 50, 50)
235 | self.ground.setPos(0, -1, 0)
236 |
237 | def setupCameraLight(self):
238 |         # create an orbiting camera
239 | self.cameractrl = CameraCtrl(self, self.cam)
240 | self.cameraRefNode = self.camera # pc.NodePath('camera holder')
241 | self.cameraRefNode.setPos(0, 0, 0)
242 | self.cameraRefNode.setHpr(0, 0, 0)
243 | self.cameraRefNode.reparentTo(self.render)
244 |
245 | self.accept("v", self.bufferViewer.toggleEnable)
246 |
247 | self.d_lights = []
248 | # Create Ambient Light
249 | ambientLight = pc.AmbientLight('ambientLight')
250 | ambientLight.setColor((0.3, 0.3, 0.3, 1))
251 | ambientLightNP = self.render.attachNewNode(ambientLight)
252 | self.render.setLight(ambientLightNP)
253 |
254 | # Directional light 01
255 | directionalLight = pc.DirectionalLight('directionalLight1')
256 | directionalLight.setColor((0.4, 0.4, 0.4, 1))
257 | directionalLightNP = self.render.attachNewNode(directionalLight)
258 | directionalLightNP.setPos(10, 10, 10)
259 | directionalLightNP.lookAt((0, 0, 0), (0, 1, 0))
260 |
261 | directionalLightNP.wrtReparentTo(self.cameraRefNode)
262 | self.render.setLight(directionalLightNP)
263 | self.d_lights.append(directionalLightNP)
264 |
265 | # Directional light 02
266 | directionalLight = pc.DirectionalLight('directionalLight2')
267 | directionalLight.setColor((0.4, 0.4, 0.4, 1))
268 | directionalLightNP = self.render.attachNewNode(directionalLight)
269 | directionalLightNP.setPos(-10, 10, 10)
270 | directionalLightNP.lookAt((0, 0, 0), (0, 1, 0))
271 |
272 | directionalLightNP.wrtReparentTo(self.cameraRefNode)
273 | self.render.setLight(directionalLightNP)
274 | self.d_lights.append(directionalLightNP)
275 |
276 | # Directional light 03
277 | directionalLight = pc.DirectionalLight('directionalLight3')
278 | directionalLight.setColorTemperature(6500)
279 | directionalLightNP = self.render.attachNewNode(directionalLight)
280 | directionalLightNP.setPos(0, 20, -10)
281 | directionalLightNP.lookAt((0, 0, 0), (0, 1, 0))
282 |
283 | directionalLightNP.wrtReparentTo(self.cameraRefNode)
284 | directionalLight.setShadowCaster(True, 2048, 2048)
285 | directionalLight.getLens().setFilmSize((10, 10))
286 | directionalLight.getLens().setNearFar(0.1, 300)
287 |
288 | self.render.setLight(directionalLightNP)
289 | self.d_lights.append(directionalLightNP)
290 |
291 | self.render.setShaderAuto(True)
292 |
293 | def create_joint(self, link_id, position, end_effector=False):
294 | # create a joint
295 | box = self.loader.loadModel("data/materials/GroundScene.egg")
296 | node = self.render.attachNewNode(f"joint{link_id}")
297 | box.reparentTo(node)
298 |
299 | # add texture
300 | box.setTextureOff(1)
301 | if end_effector:
302 | tex = self.create_texture([0, 1, 0, 1], f"joint{link_id}_tex")
303 | box.setTexture(tex, 1)
304 | box.setScale(0.01, 0.01, 0.01)
305 | node.setPos(self.render, *position)
306 | return node
307 |
308 | def create_link(self, link_id, position, scale, rot):
309 | # create a link
310 | box = self.loader.loadModel("data/materials/GroundScene.egg")
311 | node = self.render.attachNewNode(f"link{link_id}")
312 | box.reparentTo(node)
313 |
314 | # add texture
315 | box.setTextureOff(1)
316 | box.setTexture(self.tex, 1)
317 | box.setScale(*scale)
318 |
319 | node.setPos(self.render, *position)
320 | if rot is not None:
321 | node.setQuat(self.render, pc.Quat(*rot[[3, 0, 1, 2]].tolist()))
322 | return node
323 |
324 | def show_axis_frame(self):
325 | pose = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
326 | color = [[1, 0, 0, 1], [0, 1, 0, 1], [0, 0, 1, 1]]
327 | for i in range(3):
328 | box = self.loader.loadModel("data/materials/GroundScene.egg")
329 | box.setScale(0.1, 0.1, 0.1)
330 | box.setPos(*pose[i])
331 | tex = self.create_texture(color[i], f"frame{i}")
332 | box.setTextureOff(1)
333 | box.setTexture(tex, 1)
334 | box.reparentTo(self.render)
335 |
336 | def update(self, task):
337 | if self.update_func and self.update_flag:
338 | self.update_func(self)
339 | return task.cont
340 |
341 | def load_character(self):
342 | info = np.load('data/default_skeleton.npy', allow_pickle=True).item()
343 | joint_pos = info['joint_pos']
344 | body_pos = info['body_pos']
345 | joint_name = info['joint_name']
346 | body_rot = info.get('body_ori', None)
347 |
348 | joint, body = [], []
349 |
350 | thickness = 0.03
351 | # joint_name = ['RootJoint'] + joint_name
352 | name_idx_map = {joint_name[i]: i for i in range(len(joint_name))}
353 | scale = [[thickness]*3 for i in range(len(body_pos))]
354 |
355 | scale[name_idx_map['RootJoint']] = [0.06, 0.06, thickness]
356 | scale[name_idx_map['torso_head']] = [0.05, 0.08, thickness]
357 | scale[name_idx_map['lowerback_torso']] = [thickness, 0.05, thickness]
358 | scale[name_idx_map['pelvis_lowerback']] = [thickness, 0.046, thickness]
359 |
360 | scale[name_idx_map['lHip']] = scale[name_idx_map['rHip']] = [
361 | thickness, 0.2, thickness]
362 | scale[name_idx_map['rKnee']] = scale[name_idx_map['lKnee']] = [
363 | thickness, 0.19, thickness]
364 | scale[name_idx_map['rAnkle']] = scale[name_idx_map['lAnkle']] = [
365 | thickness*1.5, thickness, 0.08]
366 | scale[name_idx_map['rToeJoint']] = scale[name_idx_map['lToeJoint']] = [
367 | thickness*1.5, thickness*0.6, 0.02]
368 |
369 | scale[name_idx_map['rTorso_Clavicle']] = scale[name_idx_map['lTorso_Clavicle']] = [
370 | 0.05, thickness, thickness]
371 | scale[name_idx_map['rShoulder']] = scale[name_idx_map['lShoulder']] = [
372 | 0.11, thickness, thickness]
373 | scale[name_idx_map['rElbow']] = scale[name_idx_map['lElbow']] = [
374 | 0.11, thickness, thickness]
375 | scale[name_idx_map['rWrist']] = scale[name_idx_map['lWrist']] = [
376 | 0.05, thickness*0.6, thickness*1.2]
377 |
378 | # joint_pos = np.concatenate([body_pos[0:1], joint_pos], axis=0)
379 | for i in range(len(joint_pos)):
380 | joint.append(self.create_joint(
381 | i, joint_pos[i], 'end' in joint_name[i]))
382 | if i < body_pos.shape[0]:
383 | body.append(self.create_link(
384 | i, body_pos[i], scale[i], rot=body_rot[i] if body_rot is not None else None))
385 | body[-1].wrtReparentTo(joint[-1])
386 |
387 | self.joints = joint
388 | self.joint_name = joint_name
389 | self.name2idx = name_idx_map
390 | self.parent_index = info['parent']
391 | self.init_joint_pos = self.get_joint_positions()
392 |
393 | def get_joint_positions(self):
394 | pos = [joint.getPos(self.render) for joint in self.joints]
395 | return np.concatenate([pos], axis=0)
396 |
397 | def get_joint_orientations(self):
398 | quat = [joint.getQuat(self.render) for joint in self.joints]
399 | return np.concatenate([quat], axis=0)[..., [1, 2, 3, 0]]
400 |
401 | def get_joint_position_by_name(self, name):
402 | pos = self.joints[self.name2idx[name]].getPos(self.render)
403 | return np.array(pos)
404 |
405 | def get_joint_orientation_by_name(self, name):
406 | quat = self.joints[self.name2idx[name]].getQuat(self.render)
407 | return np.array(quat)[..., [1, 2, 3, 0]]
408 |
409 | def set_joint_position_by_name(self, name, pos):
410 | self.joints[self.name2idx[name]].setPos(self.render, *pos)
411 |
412 | def set_joint_orientation_by_name(self, name, quat):
413 | self.joints[self.name2idx[name]].setQuat(
414 | self.render, pc.Quat(*quat[..., [3, 0, 1, 2]].tolist()))
415 |
416 | def set_joint_position_orientation(self, link_name, pos, quat):
417 | if not link_name in self.name2idx:
418 | return
419 | self.joints[self.name2idx[link_name]].setPos(
420 | self.render, *pos.tolist())
421 | self.joints[self.name2idx[link_name]].setQuat(
422 | self.render, pc.Quat(*quat[..., [3, 0, 1, 2]].tolist()))
423 |
424 | def show_pose(self, joint_name_list, joint_positions, joint_orientations):
425 | length = len(joint_name_list)
426 | assert joint_positions.shape == (length, 3)
427 | assert joint_orientations.shape == (length, 4)
428 |
429 | for i in range(length):
430 | self.set_joint_position_orientation(
431 | joint_name_list[i], joint_positions[i], joint_orientations[i])
432 |
433 | def show_rest_pose(self, joint_name, joint_parent, joint_offset):
434 | length = len(joint_name)
435 | joint_positions = np.zeros((length, 3), dtype=np.float64)
436 | joint_orientations = np.zeros((length, 4), dtype=np.float64)
437 | for i in range(length):
438 | if joint_parent[i] == -1:
439 | joint_positions[i] = joint_offset[i]
440 | else:
441 | joint_positions[i] = joint_positions[joint_parent[i]
442 | ] + joint_offset[i]
443 | joint_orientations[i, 3] = 1.0
444 | self.set_joint_position_orientation(
445 | joint_name[i], joint_positions[i], joint_orientations[i])
446 |
447 | def get_meta_data(self):
448 | return self.joint_name, self.parent_index, self.init_joint_pos
449 |
450 | def move_marker(self, marker, x, y):
451 |
452 | if not self.update_marker_func:
453 | return
454 |
455 | y_axis = self.cameractrl._locked_info[2]
456 | z_axis = self.cameractrl.position - self.cameractrl.center
457 | x_axis = np.cross(y_axis, z_axis)
458 | x_axis = x_axis / np.linalg.norm(x_axis)
459 | pos = np.array(marker.getPos(self.render))
460 | pos += x_axis * x + y_axis * y
461 | marker.setPos(self.render, *pos.tolist())
462 | self.update_marker_func(self)
463 |
464 | def create_marker(self, pos, color):
465 | self.update_marker_func = None
466 | marker = self.loader.loadModel("data/materials/GroundScene.egg")
467 | marker.setScale(0.05, 0.05, 0.05)
468 | marker.setPos(*pos)
469 | tex = self.create_texture(color, "marker")
470 | marker.setTextureOff(1)
471 | marker.setTexture(tex, 1)
472 |
473 | marker.wrtReparentTo(self.render)
474 |
475 | self.accept('w', self.move_marker, [marker, 0, 0.05])
476 | self.accept('s', self.move_marker, [marker, 0, -0.05])
477 | self.accept('a', self.move_marker, [marker, -0.05, 0])
478 | self.accept('d', self.move_marker, [marker, 0.05, 0])
479 |
480 | self.accept('w-repeat', self.move_marker, [marker, 0, 0.05])
481 | self.accept('s-repeat', self.move_marker, [marker, 0, -0.05])
482 | self.accept('a-repeat', self.move_marker, [marker, -0.05, 0])
483 | self.accept('d-repeat', self.move_marker, [marker, 0.05, 0])
484 | return marker
485 |
486 | def create_marker2(self, pos, color):
487 | self.update_marker_func = None
488 | marker = self.loader.loadModel("data/materials/GroundScene.egg")
489 | marker.setScale(0.05, 0.05, 0.05)
490 | marker.setPos(*pos)
491 | tex = self.create_texture(color, "marker")
492 | marker.setTextureOff(1)
493 | marker.setTexture(tex, 1)
494 |
495 | marker.wrtReparentTo(self.render)
496 |
497 | self.accept('arrow_up', self.move_marker, [marker, 0, 0.05])
498 | self.accept('arrow_down', self.move_marker, [marker, 0, -0.05])
499 | self.accept('arrow_left', self.move_marker, [marker, -0.05, 0])
500 | self.accept('arrow_right', self.move_marker, [marker, 0.05, 0])
501 |
502 | self.accept('arrow_up-repeat', self.move_marker, [marker, 0, 0.05])
503 | self.accept('arrow_down-repeat', self.move_marker, [marker, 0, -0.05])
504 | self.accept('arrow_left-repeat', self.move_marker, [marker, -0.05, 0])
505 | self.accept('arrow_right-repeat', self.move_marker, [marker, 0.05, 0])
506 | return marker
507 |
--------------------------------------------------------------------------------
/assignment_2/README.md:
--------------------------------------------------------------------------------
1 | # Assignment 2 - Motion Processing
2 |
3 | ## Highlights
4 |
5 | * Submission DDL: 18th March (Mon)
6 |
7 | ## Introduction
8 |
9 | This assignment will provide a practical introduction to working with animation data through various algorithms such as interpolation and concatenation. Additionally, you will learn to consider various variables from motion data to enhance the performance of the motion matching method.
10 |
11 | The experiment environment is adapted from [GAMES105](https://github.com/GAMES-105/GAMES-105); thanks for open-sourcing it.
12 |
13 | #### Submission
14 |
15 | File format: A compressed file **[uid_name_assignment2.zip]** with:
16 |
17 | 1. video_file1 [motion concatenation], video_file2 [motion matching] (if your video file is too big, you can also prepare a link to Google Drive or OneDrive)
18 | 2. task1_motion_editing.py, task2_motion_matching.py
19 | 3. uid_name_2.pdf
20 |
21 | ## Examples
22 |
23 |
24 | Desired results for Motion Concatenation
25 |
26 |
27 | 
28 |
29 |
30 | Desired results for Motion Matching
31 |
32 |
33 | 
34 |
35 | ## Before Task 1 & 2
36 |
37 | #### Environment Setting
38 |
39 | As in assignment 1, Tasks 1 & 2 require a Python 3 runtime and the Panda3D library.
40 |
41 | ```shell
42 | # recommended: use Anaconda to manage the environment
43 | conda create -n comp3360 python=3.8
44 | conda activate comp3360
45 | conda install numpy scipy
46 | pip install panda3d
47 |
48 | # Environment verification. After running, you should see a skeleton in a 3D space.
49 | # If you are using an Apple Silicon chip and get a black screen, check this issue: https://github.com/Shimingyi/COMP3360_Data_Driven_Animation/issues/4
50 | cd ./assignment_2
51 | python env_test.py
52 | ```
53 |
54 | #### Pre-knowledge
55 |
56 | Human Skeleton and Motion
57 |
58 | * As introduced in the tutorial, a body skeleton is a collection of bones representing the connection of body joints. The body joint is a 3D point in space, with (x, y, z) positional information as primary data. Sometimes, the offset between two joints will also be used, formulated by (x2-x1, y2-y1, z2-z1) with a vector.
59 | * When we rotate a bone, a rotation value is applied to the parent joint of that bone. For example, if joint_1 is at (0, 0) and joint_2 at (1, 0), a 45-degree anticlockwise rotation on joint_1 moves joint_2 from (1, 0) to (cos45°, sin45°).
60 | * Global/Local Rotation. The local rotation is the relative rotation from the parent to the child joint, while the global rotation (orientation) is the rotation relative to the whole 3D space. In the above case, after the rotation the orientations of J1 and J2 are both 45 degrees, but the local rotation of J2 stays at 0 degrees; only the local rotation of J1 is updated to 45 degrees.
61 | * BVH file is a combination of joint offset and local rotations, and you can find more details with this [link](https://research.cs.wisc.edu/graphics/Courses/cs-838-1999/Jeff/BVH.html).
62 | * Quaternion is a rotation representation, you can find the details and its history with [wiki](https://en.wikipedia.org/wiki/Quaternion). In this project, (x, y, z, w) is the default order.
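The two-joint rotation example above can be checked numerically with SciPy (a small sketch; the joint names are illustrative, and quaternions follow the (x, y, z, w) order stated above):

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

local_j1 = R.from_euler('z', 45, degrees=True)  # 45-degree local rotation applied at joint_1
local_j2 = R.identity()                         # joint_2's local rotation stays 0 degrees
orient_j1 = local_j1                            # global orientation of joint_1
orient_j2 = orient_j1 * local_j2                # global orientation of joint_2 is also 45 degrees
offset_j2 = np.array([1.0, 0.0, 0.0])           # joint_2 sits at (1, 0, 0) relative to joint_1
pos_j2 = orient_j1.apply(offset_j2)             # forward kinematics: rotate the offset
# pos_j2 is now (cos45, sin45, 0)
```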
63 |
64 | Interpolation:
65 |
66 | * Linear Interpolation
67 | We assume the offset between every two adjacent items is the same
68 |
69 | ```python
70 | start_value, end_value = 0, 11
71 | between_num = 10
72 | offset = (end_value - start_value)/between_num
73 | betweens = [start_value + offset*i for i in range(between_num)]
74 | ```
75 | * SciPy Interpolation: [https://docs.scipy.org/doc/scipy/reference/interpolate.html](https://docs.scipy.org/doc/scipy/reference/interpolate.html)
76 |
77 | Think of interpolation as a function: we know some x values and their corresponding y values, and pass them to a SciPy function, which returns a fitted function that can then be evaluated at a new_x.
78 |
79 | ```python
80 | from scipy.spatial.transform import Rotation as R, Slerp
81 |
82 | key_quaternions = R.from_quat([q1, q2])  # stack the two key quaternions
83 | keys = [0, 1]
84 | slerp_function = Slerp(keys, key_quaternions)
85 | new_keys = np.linspace(0, 1, 10)  # 10 evenly spaced keys from 0 to 1
86 | interp_quaternions = slerp_function(new_keys)
87 | ```
88 |
89 | More Tips:
90 |
91 | * Numpy: Indexing of numpy array: [https://numpy.org/doc/stable/user/basics.indexing.html](https://numpy.org/doc/stable/user/basics.indexing.html)
92 | * Motion Matching: [https://www.youtube.com/watch?v=KSTn3ePDt50](https://www.youtube.com/watch?v=KSTn3ePDt50)
93 | * VS code debugging feature: [https://www.youtube.com/watch?v=KEdq7gC_RTA](https://www.youtube.com/watch?v=KEdq7gC_RTA)
94 |
95 | ## Task 1 - Keyframing Animation
96 |
97 | For this task, you will be responsible for developing two interpolation functions and implementing them using various keyframing settings. By default, we require that you utilize linear interpolation for position data and slerp for rotation data.
98 |
99 | You are required to implement the interpolation functions at line 43 in *[task1_motion_editing.py](https://github.com/Shimingyi/COMP3360_Data_Driven_Animation/blob/2024/assignment_2/task1_motion_editing.py)*. After doing so, uncomment lines 184-187 one by one and execute the script to call the function. Successful execution should result in a walking motion being displayed.
100 |
101 | From line 53, the primary keyframing process is detailed. First, the motion data is loaded, and then keyframes are taken every N (time_step) frames. Next, the interpolation function is called to generate M (target_step) frames between each pair of keyframes. If M > N, the motion will play out more slowly, and if M < N, it will be faster. No modifications are required for this section of code.
102 |
103 | A screenshot of the walking motion is expected in the assignment report.
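Per pair of keyframes, combining the two interpolation snippets from the pre-knowledge section might look like this (a sketch under assumed array shapes; the actual function signature in task1_motion_editing.py may differ):

```python
import numpy as np
from scipy.spatial.transform import Rotation as R, Slerp

def interpolate_pose(pos0, pos1, quat0, quat1, target_step):
    # linear interpolation for positions, slerp for rotations (quaternions in x, y, z, w order)
    t = np.linspace(0.0, 1.0, target_step, endpoint=False)
    positions = pos0[None] + t[:, None] * (pos1 - pos0)[None]
    slerp = Slerp([0.0, 1.0], R.from_quat([quat0, quat1]))
    rotations = slerp(t).as_quat()
    return positions, rotations

pos, rot = interpolate_pose(np.zeros(3), np.array([0.0, 0.0, 1.0]),
                            np.array([0.0, 0.0, 0.0, 1.0]),                 # identity rotation
                            R.from_euler('y', 90, degrees=True).as_quat(),  # 90 degrees about y
                            4)
```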
104 |
105 | ## Task 2 - Motion Concatenation
106 |
107 | In this task, you will be required to implement a simple method for concatenating two distinct motions: walking and running. These motions are positioned globally in different locations and have distinct motion styles, directions, and velocities. As such, you will need to consider these differences to achieve a realistic concatenation.
108 |
109 | The basic structure of the pipeline is presented at line 92 in *[task1_motion_editing.py](https://github.com/Shimingyi/COMP3360_Data_Driven_Animation/blob/2024/assignment_2/task1_motion_editing.py#L92)*. Upon completing the implementation, you should uncomment line 189 and examine the final result, which will be a combination of the walking and running motions, as shown in the figure above.
110 |
111 | The basic pipeline consists of five steps, each of which will be considered in your grading: 1. Locating the search windows; 2. Calculating the similarity matrix; 3. Determining the closest frames with real frame indices; 4. Shifting motion2 to motion1; 5. Utilizing your Task 1 function for interpolation.
112 |
113 | Additionally, you will be eligible for bonus points if you can find ways to address the following issues:
114 |
115 | 1. If the frame interval is set too high, noticeable on-the-fly motion will occur. How can we determine the optimal number of between frames? Alternatively, is there a way to enhance the realism of root position in these frames?
116 | 2. The current pipeline does not consider velocity. Inertialization is a viable option: https://theorangeduck.com/page/spring-roll-call#inertialization
117 | 3. Any other improvements are also welcome.
118 |
119 | ## Task 3 - Motion Matching
120 |
121 | Motion matching is a sophisticated technology used in real-time character control systems. In this section, using a basic motion matching framework, you will be required to design the matching variables and experiment with how these variables impact the matching quality.
122 |
123 | If you want to modify the variables in the motion matching code *[task2_motion_matching.py](https://github.com/Shimingyi/COMP3360_Data_Driven_Animation/blob/2024/assignment_2/task2_motion_matching.py)*, you should pay attention to the following three parts:
124 |
125 | 1. Line 24 contains a feature mapping variable that records all feature shapes. While some features are already prepared, additional features with specific names can also be included by yourself.
126 | 2. In line 420, you must determine which features to use by filling in their names in the *selected_feature_names* variable and their weights in the *selected_feature_weights* variable. Note that all features used must be present in the mapping variable.
127 | 3. Line 190 requires the implementation of feature calculation based on the variables provided (pos, rot, vel, and avel). Some hints and the necessary helper functions are provided in this class: extract_vel, extract_offset, extract_rotation and extract_future_pos(rot).
128 |
129 | So basically, just remember: **every feature you want to use must be present in the mapping variable**, and its calculation must be implemented around line 190.
130 |
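Conceptually, the matching step is then a weighted nearest-neighbour search over the registered features. A simplified sketch of that search (the dictionary layout and feature names here are illustrative; the real code builds a flat feature vector from the mapping variable):

```python
import numpy as np

def find_best_frame(db_features, query, names, weights, feature_map):
    """Weighted nearest-frame lookup over a feature database.

    db_features: dict name -> (frames, dim) array; feature_map: name -> dim,
    mirroring the mapping variable around line 24 (names here are assumed).
    """
    cost = None
    for name, w in zip(names, weights):
        assert name in feature_map, f"{name} must be registered in the mapping variable"
        diff = db_features[name] - query[name][None, :]
        term = w * np.sum(diff * diff, axis=-1)
        cost = term if cost is None else cost + term
    return int(np.argmin(cost))
```

Raising a feature's weight makes the search track that quantity more closely at the cost of the others, which is exactly the trade-off your report should explore.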
131 | We suggest using as few variables as possible while still achieving realistic performance; the grading of this part depends on how many variables you use.
132 |
133 | ## Assessments
134 |
135 | * part1_key_framing (30%)
136 | * Linear interpolation (10%); Slerp Interpolation (15%)
137 |   * Report how the performance differs with different frame numbers (5%)
138 | * part2_concatenation (35%)
139 | * Define the search window (10%) + Calculate the sim_matrix (10%);
140 | * Find the real_i and real_j (10%);
141 |   * The shifting on the root joint position (5%)
142 | * part3_motion_matching (25%)
143 |   * Fewer variables with better performance (total 15%, computed as 22% - your_variable_num)
144 |   * System analysis (10%) of variable selection, future frame range, etc.
145 | * Report (8%) + 2 videos (2%)
146 |   * Including necessary experiment results from *different parameters* (4%) and your *thinking* (4%) on how to produce high quality motions.
147 |
148 | ## Task 4 - Report
149 |
150 | * PDF format; there is no page size requirement, so you can also prepare it with PowerPoint or Keynote
151 | * The first two lines should introduce your NAME and UID.
152 | * Including necessary experiment results from **different parameters** (4%) and your **thinking** (4%) about how to produce high quality motions
153 |
--------------------------------------------------------------------------------
/assignment_2/Viewer/controller.py:
--------------------------------------------------------------------------------
1 | '''
2 | This code is highly inspired by the code from the following link:
3 | https://github.com/orangeduck/Motion-Matching/blob/main/controller.cpp
4 | '''
5 |
6 | from .viewer import SimpleViewer
7 | import math
8 | import numpy as np
9 | from scipy.spatial.transform import Rotation as R
10 | from .visualize_utils import *
11 | from scipy.spatial.transform import Slerp
12 | from panda3d.core import *
13 | from direct.showbase.ShowBase import ShowBase
14 |
15 | class KeyAndPad():
16 | def __init__(self, viewer) -> None:
17 | self.viewer = viewer
18 | self.is_down = viewer.mouseWatcherNode.is_button_down
19 | self.use_gamepad = False
20 | self.input_vel = np.zeros(3)
21 | try:
22 | self.device = viewer.devices.getDevices(InputDevice.DeviceClass.gamepad)[0]
23 | except IndexError:
24 | self.device = None
25 |
26 | if self.device:
27 | self.use_gamepad = True
28 | self.set_gamepad_map()
29 | else:
30 | self.set_key_map()
31 |
32 | self.gait = 0
33 |
34 | def key_input(self, axis, value):
35 | if axis == 'x':
36 | self.input_vel[0] = value
37 | elif axis == 'z':
38 | self.input_vel[2] = value
39 | elif axis == 'gait':
40 | self.gait = value
41 |
42 | def set_gamepad_map(self):
43 | self.gamepad_map = {
44 | 'x': InputDevice.Axis.left_x,
45 | 'z': InputDevice.Axis.left_y,
46 | }
47 | self.viewer.attachInputDevice(self.device, prefix="gamepad")
48 | self.viewer.taskMgr.add(self.update_gamepad, 'update_gamepad', sort = 1)
49 | self.viewer.accept('gamepad-rshoulder', self.key_input, ['gait', 1])
50 | self.viewer.accept('gamepad-rshoulder-up', self.key_input, ['gait', 0])
51 |
52 | def update_gamepad(self, task):
53 | self.input_vel[0] = -self.device.findAxis(self.gamepad_map['x']).value
54 | self.input_vel[2] = self.device.findAxis(self.gamepad_map['z']).value
55 | if np.linalg.norm(self.input_vel) > 1:
56 | self.input_vel /= np.linalg.norm(self.input_vel)
57 | elif np.linalg.norm(self.input_vel) < 0.2:
58 | # a dead zone
59 | self.input_vel = np.zeros(3)
60 |
61 | right_x = self.device.findAxis(InputDevice.Axis.right_x).value
62 | right_y = self.device.findAxis(InputDevice.Axis.right_y).value
63 | self.viewer.cameractrl.updateGamepad(right_x, right_y, task)
64 | return task.cont
65 |
66 | def set_key_map(self):
67 | key_map = {
68 | ('w', 'arrow_up'): ['z', 1],
69 | ('s', 'arrow_down'): ['z', -1],
70 | ('a', 'arrow_left'): ['x', 1],
71 | ('d', 'arrow_right'): ['x', -1],
72 | ('space', 'space'): ['gait', 1]
73 | }
74 | for key, value in key_map.items():
75 | self.viewer.accept(key[0], self.key_input, [value[0], value[1]])
76 | self.viewer.accept(key[0] + '-up', self.key_input, [value[0], 0])
77 | self.viewer.accept(key[1], self.key_input, [value[0], value[1]])
78 | self.viewer.accept(key[1] + '-up', self.key_input, [value[0], 0])
79 |
80 | def get_input(self):
81 | return self.input_vel
82 |
83 | def from_euler(e):
84 | return R.from_euler('XYZ', e, degrees=True)
85 |
86 | class InterpolationHelper():
87 | @staticmethod
88 | def lerp(a, b, t):
89 | return a + (b - a) * t
90 |
91 | @staticmethod
92 | def halflife2dampling(halflife):
93 |         return 4 * math.log(2) / halflife  # convert a halflife to the spring damping coefficient
94 |
95 |     @staticmethod
    def simulation_positions_update(pos, vel, acc, target_vel, halflife, dt):
96 | d = InterpolationHelper.halflife2dampling(halflife)/2
97 | j0 = vel - target_vel
98 | j1 = acc + d * j0
99 | eydt = math.exp(-d * dt)
100 | pos_prev = pos
101 | tmp1 = j0+j1*dt
102 | tmp2 = j1/(d*d)
103 | pos = eydt * ( -tmp2 -tmp1/d ) + tmp2 + j0/d + target_vel*dt + pos_prev
104 | vel = eydt*tmp1 + target_vel
105 | acc = eydt * (acc - j1*d*dt)
106 | return pos, vel, acc
107 | @staticmethod
108 | def simulation_rotations_update(rot, avel, target_rot, halflife, dt):
109 | d = InterpolationHelper.halflife2dampling(halflife)/2
110 | j0 = R.from_quat(rot) * R.from_quat(target_rot).inv()
111 | j0 = j0.as_rotvec()
112 | j1 = avel + d * j0
113 | eydt = math.exp(-d * dt)
114 | tmp1 = eydt * (j0 + j1 * dt)
115 | rot = R.from_rotvec(tmp1) * R.from_quat(target_rot)
116 | rot = rot.as_quat()
117 | avel = eydt * (avel - j1 * dt * d)
118 | return rot, avel
119 |
120 | @staticmethod
121 | def decay_spring_implicit_damping_rot(rot, avel, halflife, dt):
122 | d = InterpolationHelper.halflife2dampling(halflife)/2
123 | j0 = from_euler(rot).as_rotvec()
124 | j1 = avel + d * j0
125 | eydt = math.exp(-d * dt)
126 | a1 = eydt * (j0+j1*dt)
127 |
128 | rot_res = R.from_rotvec(a1).as_euler('XYZ', degrees=True)
129 | avel_res = eydt * (avel - j1 * dt * d)
130 | return rot_res, avel_res
131 |
132 | @staticmethod
133 | def decay_spring_implicit_damping_pos(pos, vel, halflife, dt):
134 | d = InterpolationHelper.halflife2dampling(halflife)/2
135 | j1 = vel + d * pos
136 | eydt = math.exp(-d * dt)
137 | pos = eydt * (pos+j1*dt)
138 | vel = eydt * (vel - j1 * dt * d)
139 | return pos, vel
140 |
141 | @staticmethod
142 | def inertialize_transition_rot(prev_off_rot, prev_off_avel, src_rot, src_avel, dst_rot, dst_avel):
143 | prev_off_rot, prev_off_avel = InterpolationHelper.decay_spring_implicit_damping_rot(prev_off_rot, prev_off_avel, 1/20, 1/60)
144 | off_rot = from_euler(prev_off_rot) * from_euler(src_rot) * from_euler(dst_rot).inv()
145 | off_avel = prev_off_avel + src_avel - dst_avel
146 | # off_rot = from_euler(src_rot) * from_euler(dst_rot).inv()
147 | # off_avel = src_avel - dst_avel
148 | return off_rot.as_euler('XYZ', degrees=True), off_avel
149 |
150 | @staticmethod
151 | def inertialize_update_rot(prev_off_rot, prev_off_avel, rot, avel, halflife, dt):
152 | off_rot , off_avel = InterpolationHelper.decay_spring_implicit_damping_rot(prev_off_rot, prev_off_avel, halflife, dt)
153 | rot = from_euler(off_rot) * from_euler(rot)
154 | avel = off_avel + avel
155 | return rot.as_euler('XYZ', degrees=True), avel, off_rot, off_avel
156 |
157 | @staticmethod
158 | def inertialize_transition_pos(prev_off_pos, prev_off_vel, src_pos, src_vel, dst_pos, dst_vel):
159 | prev_off_pos, prev_off_vel = InterpolationHelper.decay_spring_implicit_damping_pos(prev_off_pos, prev_off_vel, 1/20, 1/60)
160 | off_pos = prev_off_pos + src_pos - dst_pos
161 | off_vel = prev_off_vel + src_vel - dst_vel
162 | return off_pos, off_vel
163 |
164 | @staticmethod
165 | def inertialize_update_pos(prev_off_pos, prev_off_vel, pos, vel, halflife, dt):
166 | off_pos , off_vel = InterpolationHelper.decay_spring_implicit_damping_pos(prev_off_pos, prev_off_vel, halflife, dt)
167 | pos = off_pos + pos
168 | vel = off_vel + vel
169 | return pos, vel, off_pos, off_vel
170 |
171 | class Controller:
172 | def __init__(self, viewer) -> None:
173 |
174 | self.vel = np.zeros(3)
175 | self.acc = np.zeros(3)
176 | self.avel = np.zeros(3)
177 | self.desired_rotation = np.array([0,0,0,1])
178 |
179 | self.dt = 1/60
180 | self.viewer = viewer
181 | viewer.taskMgr.add(self.update)
182 | self.future_step = 6
183 | self.futures = []
184 |
185 | self.future_vel = []
186 | self.future_avel = []
187 | self.future_pos = []
188 | self.future_rot = []
189 |
190 | self.desired_velocity_change = np.zeros(3)
191 | self.desired_rotation_change = np.zeros(3)
192 |
193 | for i in range(self.future_step):
194 | node = self.viewer.render.attach_new_node('future{i}')
195 | node.setPos(0,0.01,0)
196 |             draw_circle_with_arrow(node, 0.5, (252/255, 173/255, 5/255, 1), with_circle = i == 0)
197 | node.reparentTo(self.viewer.render)
198 | self.futures.append(node)
199 | self._node = self.futures[0]
200 | self.init_key_input()
201 | self.halflife = 0.27
202 | self.move_speed = np.array([1.75, 1.5, 1.25])
203 | @property
204 | def node(self):
205 | return self._node
206 | @property
207 | def rotation(self):
208 |         return np.array(self.node.get_quat())[[1,2,3,0]]  # Panda3D (w,x,y,z) -> scipy (x,y,z,w)
209 | @property
210 | def cameractrl(self):
211 | return self.viewer.cameractrl
212 | @property
213 | def input_vel(self):
214 | return self.input_device.get_input()
215 |
216 |
217 | def desired_velocity_update(self, camera_to_pos, input_vel, simulation_rotation):
218 | camera_to_pos[1] = 0
219 |
220 | fwrd_speed, side_speed, back_speed = self.move_speed
221 |
222 | angle = np.arctan2(camera_to_pos[0], camera_to_pos[2])
223 | rot = R.from_rotvec( angle * np.array([0,1,0]) )
224 | global_direction = rot.apply(input_vel)
225 |
226 | local_desired_direction = R.from_quat(simulation_rotation).inv().apply(global_direction)
227 | local_desired_velocity = np.array([side_speed, 0, fwrd_speed]) * local_desired_direction \
228 | if local_desired_direction[2] > 0 else np.array([side_speed, 0, back_speed]) * local_desired_direction
229 |
230 | desired_velocity = R.from_quat(simulation_rotation).apply(local_desired_velocity)
231 | return desired_velocity
232 |
233 | def desired_rotation_update(self, cur_rotation, desired_velocity):
234 | if np.linalg.norm(desired_velocity) < 1e-5:
235 | # return cur_rotation
236 | return self.rotation
237 | else:
238 | desired_direction = desired_velocity / np.linalg.norm(desired_velocity)
239 | return R.from_rotvec( np.arctan2(desired_direction[0], desired_direction[2]) * np.array([0,1,0]) ).as_quat()
240 |
241 | def init_key_input(self):
242 | self.input_device = KeyAndPad(self.viewer)
243 |
244 | node = self.node.attach_new_node('camera_pos')
245 | node.setPos(0, 3, -5)
246 | self.cameractrl.position = node.getPos(self.viewer.render)
247 | self.camera_ref_pos = node
248 | self.camera_ref_pos.wrtReparentTo(self.node)
249 | self.line = LineSegs()
250 | self.geom = self.line.create(dynamic = True)
251 | self.viewer.render.attach_new_node(self.geom)
252 |
253 | @property
254 | def current_desired_rotation(self):
255 | return np.array(self.futures[0].get_quat())[[1,2,3,0]]
256 |
257 | @property
258 | def current_desired_position(self):
259 | return np.array(self.futures[0].get_pos())
260 |
261 | @property
262 | def gait(self):
263 | return self.input_device.gait
264 |
265 | def update_pos(self):
266 | # self.input_vel = np.array([1,0,0])
267 | init_pos = self.node.get_pos()
268 | init_rot = self.rotation
269 | self.sub_step = 20
270 |
271 | # map input to desired velocity
272 | camera_fwd = self.cameractrl.center - self.cameractrl.position
273 | cur_target_vel = self.desired_velocity_update(camera_fwd, self.input_vel, init_rot)
274 | cur_target_rot = self.desired_rotation_update(self.desired_rotation, cur_target_vel)
275 | self.desired_rotation = cur_target_rot
276 | self.desired_vel = cur_target_vel
277 |
278 | self.desired_velocity_change = (cur_target_vel - self.vel)/self.dt
279 | self.desired_rotation_change = (R.from_quat(cur_target_rot).inv() * R.from_quat(init_rot)).as_rotvec()/self.dt
280 |
281 | # predict future rotations
282 | rotation_trajectory = [init_rot]
283 | new_rot, new_avel = init_rot, self.avel
284 | self.future_avel = [new_avel]
285 | for i in range(self.future_step):
286 | new_rot, new_avel = InterpolationHelper.simulation_rotations_update(new_rot, new_avel, cur_target_rot, self.halflife, self.dt* self.sub_step )
287 | rotation_trajectory.append(new_rot)
288 | self.future_avel.append(new_avel.copy())
289 |
290 | # predict future positions
291 | new_pos, new_vel, new_acc = init_pos, self.vel, self.acc
292 | position_trajectory = [init_pos]
293 | self.future_vel = [new_vel]
294 | for i in range(self.future_step - 1):
295 | new_pos, new_vel, new_acc = InterpolationHelper.simulation_positions_update(new_pos, new_vel, new_acc, cur_target_vel, self.halflife, self.dt* self.sub_step )
296 | position_trajectory.append(new_pos)
297 | self.future_vel.append(new_vel.copy())
298 |
299 | # update current positions
300 | rotation_trajectory[0], self.avel = InterpolationHelper.simulation_rotations_update(init_rot, self.avel, cur_target_rot, self.halflife, self.dt)
301 | position_trajectory[0], self.vel, self.acc = InterpolationHelper.simulation_positions_update(init_pos, self.vel, self.acc, cur_target_vel, self.halflife, self.dt)
302 | rotation_trajectory = np.array(rotation_trajectory).reshape(-1, 4)
303 |
304 | # record the trajectory
305 | self.future_pos = np.array(position_trajectory).reshape(-1, 3)
306 | self.future_rot = rotation_trajectory.copy()
307 | self.future_vel[0] = self.vel.copy()
308 | self.future_vel = np.array(self.future_vel).reshape(-1, 3)
309 | self.future_avel[0] = self.avel.copy()
310 | self.future_avel = np.array(self.future_avel).reshape(-1, 3)
311 |
312 |         rotation_trajectory = rotation_trajectory[...,[3,0,1,2]]  # scipy (x,y,z,w) -> Panda3D (w,x,y,z)
313 | for i in range(self.future_step):
314 | self.futures[i].set_pos(*position_trajectory[i])
315 | self.futures[i].set_quat( Quat(*rotation_trajectory[i]) )
316 |
317 | # update camera positions
318 | delta = position_trajectory[0] - init_pos
319 | delta = LVector3(*delta)
320 | self.cameractrl.position = self.cameractrl.position + delta
321 | self.cameractrl.center = self.cameractrl.center + delta
322 | self.cameractrl.look()
323 |
324 | def draw_future(self):
325 | self.line.reset()
326 | self.line.set_color(240/255,31/255,141/255,0.1)
327 | self.line.setThickness(3)
328 | positions = [np.array(self.futures[i].get_pos()) for i in range(self.future_step)]
329 | self.line.moveTo(*positions[0])
330 | for i in positions[1:]:
331 | self.line.drawTo(i[0], i[1], i[2])
332 | self.geom.remove_all_geoms()
333 | self.line.create(self.geom, True)
334 |
335 | def update(self, task):
336 | self.update_pos()
337 | self.draw_future()
338 | return task.cont
339 |
340 | def set_pos(self, pos):
341 |
342 | init_pos = self.node.get_pos()
343 | pos = pos.copy()
344 | pos[1] = 0.01
345 | self.node.set_pos(*pos)
346 |
347 | delta = pos - init_pos
348 | delta = LVector3(*delta)
349 | self.cameractrl.position = self.cameractrl.position + delta
350 | self.cameractrl.center = self.cameractrl.center + delta
351 | self.cameractrl.look()
352 |
353 |
354 | def set_rot(self, rot):
355 | rot = rot.copy()
356 | facing = R.from_quat(rot).apply(np.array([0,0,1]))
357 | facing_xz = np.array([facing[0], 0, facing[2]])
358 | rot = R.from_rotvec( np.arctan2(facing_xz[0], facing_xz[2]) * np.array([0,1,0]) ).as_quat()
359 | self.node.set_quat(Quat(rot[3], rot[0], rot[1], rot[2]))
360 |
361 | def get_desired_state(self):
362 | return self.future_pos, self.future_rot, self.future_vel, self.future_avel, self.gait
363 |
364 | def main():
365 | viewer = SimpleViewer()
366 | viewer.show_axis_frame()
367 | controller = Controller(viewer)
368 | viewer.run()
369 |
370 |
371 | if __name__=="__main__":
372 | main()
373 |
--------------------------------------------------------------------------------
/assignment_2/Viewer/viewer.py:
--------------------------------------------------------------------------------
1 | from direct.showbase.ShowBase import ShowBase
2 | import numpy as np
3 | from panda3d.core import ClockObject
4 | import panda3d.core as pc
5 | import math
6 | from direct.showbase.DirectObject import DirectObject
7 | from direct.gui.DirectGui import *
8 |
9 | class CameraCtrl(DirectObject):
10 | def __init__(self, base, camera):
11 |         super().__init__()
12 | self.accept('mouse1',self.onMouse1Down)
13 | self.accept('mouse1-up',self.onMouse1Up)
14 | self.accept('mouse2',self.onMouse2Down)
15 | self.accept('mouse2-up',self.onMouse2Up)
16 | self.accept('mouse3',self.onMouse3Down)
17 | self.accept('mouse3-up',self.onMouse3Up)
18 | self.accept('wheel_down',self.onMouseWheelDown)
19 | self.accept('wheel_up',self.onMouseWheelUp)
20 |
21 | self.accept('control-mouse1',self.onMouse1Down)
22 | self.accept('control-mouse1-up',self.onMouse1Up)
23 | self.accept('control-mouse2',self.onMouse2Down)
24 | self.accept('control-mouse2-up',self.onMouse2Up)
25 | self.accept('control-mouse3',self.onMouse3Down)
26 | self.accept('control-mouse3-up',self.onMouse3Up)
27 | self.accept('control-wheel_down',self.onMouseWheelDown)
28 | self.accept('control-wheel_up',self.onMouseWheelUp)
29 |
30 | self.position = pc.LVector3(4,4,4)
31 | self.center = pc.LVector3(0,1,0)
32 | self.up = pc.LVector3(0,1,0)
33 |
34 | self.base = base
35 | base.taskMgr.add(self.onUpdate, 'updateCamera', sort = 2)
36 | self.camera = camera
37 |
38 | self._locked_info = None
39 | self._locked_mouse_pos = None
40 | self._mouse_id = -1
41 |
42 | self.pre_position_x = 0
43 | self.pre_position_y = 0
44 |
45 | self.gamepad_x = 0
46 | self.gamepad_y = 0
47 | self.has_pad = False
48 | self.look()
49 | self._locked_info = (pc.LVector3(self.position), pc.LVector3(self.center), pc.LVector3(self.up))
50 |
51 |
52 | def look(self):
53 | self.camera.setPos(self.position)
54 | self.camera.lookAt(self.center, self.up)
55 |
56 | @property
57 | def _mousePos(self):
58 | try:
59 | cur_position_x = self.base.mouseWatcherNode.getMouseX()
60 | cur_position_y = self.base.mouseWatcherNode.getMouseY()
61 | except:
62 | cur_position_x = self.pre_position_x
63 | cur_position_y = self.pre_position_y
64 | self.pre_position_x = cur_position_x
65 | self.pre_position_y = cur_position_y
66 | return pc.LVector2(cur_position_x, cur_position_y)
67 |
68 | def _lockMouseInfo(self):
69 | self._locked_info = (pc.LVector3(self.position), pc.LVector3(self.center), pc.LVector3(self.up))
70 | self._locked_mouse_pos = self._mousePos
71 |
72 | def onMouse1Down(self):
73 | self._lockMouseInfo()
74 | self._mouse_id = 1
75 |
76 | def onMouse1Up(self):
77 | self._mouse_id = -1
78 |
79 | def onMouse2Down(self):
80 | self._lockMouseInfo()
81 | self._mouse_id = 2
82 |
83 | def onMouse2Up(self):
84 | self._mouse_id = -1
85 |
86 | def onMouse3Down(self):
87 | self._lockMouseInfo()
88 | self._mouse_id = 3
89 |
90 | def onMouse3Up(self):
91 | self._mouse_id = -1
92 |
93 | def onMouseWheelDown(self):
94 | z = self.position - self.center
95 |
96 | scale = 1.1
97 |
98 | if scale < 0.05:
99 | scale = 0.05
100 |
101 | self.position = self.center + z * scale
102 | self.look()
103 |
104 | def onMouseWheelUp(self):
105 | z = self.position - self.center
106 |
107 | scale = 0.9
108 |
109 | if scale < 0.05:
110 | scale = 0.05
111 |
112 | self.position = self.center + z * scale
113 | self.look()
114 |
115 | def updateGamepad(self, x, y, task):
116 | self.gamepad_x = x
117 | self.gamepad_y = y
118 |
119 | self.has_pad = self.gamepad_x**2+self.gamepad_y**2 > 0.04
120 |
121 |
122 | def onUpdate(self, task):
123 | if self._mouse_id < 0 and not self.has_pad:
124 | return task.cont
125 |
126 | if self.has_pad:
127 | mousePosOff = pc.LVector2(self.gamepad_x, self.gamepad_y) * 0.02
128 | else:
129 | mousePosOff0 = self._mousePos - self._locked_mouse_pos
130 | mousePosOff = self._mousePos - self._locked_mouse_pos
131 |
132 | if self._mouse_id == 1 or self.has_pad:
133 | if self.has_pad:
134 | z = self.position - self.center
135 | else:
136 | z = self._locked_info[0] - self._locked_info[1]
137 |
138 | zDotUp = self._locked_info[2].dot(z)
139 | zMap = z - self._locked_info[2] * zDotUp
140 | angX = math.acos(zMap.length() / z.length()) / math.pi * 180.0
141 |
142 | if zDotUp < 0:
143 | angX = -angX
144 |
145 | angleScale = 200.0
146 |
147 | x = self._locked_info[2].cross(z)
148 | x.normalize()
149 | y = z.cross(x)
150 | y.normalize()
151 |
152 | rot_x_angle = -mousePosOff.getY() * angleScale
153 | rot_x_angle += angX
154 | if rot_x_angle > 85:
155 | rot_x_angle = 85
156 | if rot_x_angle < -85:
157 | rot_x_angle = -85
158 | rot_x_angle -= angX
159 |
160 | rot_y = pc.LMatrix3()
161 | rot_y.setRotateMat(-mousePosOff.getX() * angleScale, y, pc.CS_yup_right)
162 |
163 | rot_x = pc.LMatrix3()
164 | rot_x.setRotateMat(-rot_x_angle, x, pc.CS_yup_right)
165 | if not self.has_pad:
166 | self.position = self._locked_info[1] + (rot_x * rot_y).xform(z)
167 | else:
168 | self.position = self.center + (rot_x * rot_y).xform(z)
169 |
170 | elif self._mouse_id == 2:
171 | z = self._locked_info[0] - self._locked_info[1]
172 |
173 | shiftScale = 0.5 * z.length()
174 |
175 | x = self._locked_info[2].cross(z)
176 | z.normalize()
177 | x.normalize()
178 | y = z.cross(x)
179 |
180 | shift = x * -mousePosOff.getX() + y* -mousePosOff.getY()
181 | shift *= shiftScale
182 | self.position = self._locked_info[0] + shift
183 | self.center = self._locked_info[1] + shift
184 |
185 | elif self._mouse_id == 3:
186 | z = self._locked_info[0] - self._locked_info[1]
187 |
188 | scale = 1
189 | scale = 1.0 + scale * mousePosOff0.getY()
190 |
191 | if scale < 0.05:
192 | scale = 0.05
193 |
194 | self.position = self._locked_info[1] + z * scale
195 |
196 | self.look()
197 |
198 | return task.cont
199 |
200 | class SimpleViewer(ShowBase):
201 | def __init__(self, fStartDirect=True, windowType=None):
202 | '''
203 | this is only used for my project... lots of assumptions...
204 | '''
205 | super().__init__(fStartDirect, windowType)
206 | self.disableMouse()
207 |
208 | self.camera.lookAt(0,0.9,0)
209 | self.setupCameraLight()
210 | self.camera.setHpr(0,0,0)
211 |
212 | self.setFrameRateMeter(True)
213 | globalClock.setMode(ClockObject.MLimited)
214 | globalClock.setFrameRate(60)
215 |
216 | self.load_ground()
217 |
218 | xSize = self.pipe.getDisplayWidth()
219 | ySize = self.pipe.getDisplayHeight()
220 | props = pc.WindowProperties()
221 | props.setSize(min(xSize-200, 800), min(ySize-200, 600))
222 | self.win.requestProperties(props)
223 |
224 | # color for links
225 | color = [131/255,175/255,155/255,1]
226 | self.tex = self.create_texture(color, 'link_tex')
227 |
228 | self.load_character()
229 | self.update_func = None
230 | self.add_task(self.update, 'update')
231 | self.update_flag = True
232 | self.accept('space', self.receive_space)
233 | pass
234 |
235 | def receive_space(self):
236 | self.update_flag = not self.update_flag
237 |
238 | def create_texture(self, color, name):
239 | img = pc.PNMImage(32,32)
240 | img.fill(*color[:3])
241 | img.alphaFill(color[3])
242 | tex = pc.Texture(name)
243 | tex.load(img)
244 | return tex
245 |
246 | def load_ground(self):
247 | self.ground = self.loader.loadModel("data/materials/GroundScene.egg")
248 | self.ground.reparentTo(self.render)
249 | self.ground.setScale(100, 1, 100)
250 | self.ground.setTexScale(pc.TextureStage.getDefault(), 50, 50)
251 | self.ground.setPos(0, -1, 0)
252 |
253 | def setupCameraLight(self):
254 | # create a orbiting camera
255 | self.cameractrl = CameraCtrl(self, self.cam)
256 | self.cameraRefNode = self.camera # pc.NodePath('camera holder')
257 | self.cameraRefNode.setPos(0,0,0)
258 | self.cameraRefNode.setHpr(0,0,0)
259 | self.cameraRefNode.reparentTo(self.render)
260 |
261 | self.accept("v", self.bufferViewer.toggleEnable)
262 |
263 | self.d_lights = []
264 | # Create Ambient Light
265 | ambientLight = pc.AmbientLight('ambientLight')
266 | ambientLight.setColor((0.4, 0.4, 0.4, 1))
267 | ambientLightNP = self.render.attachNewNode(ambientLight)
268 | self.render.setLight(ambientLightNP)
269 |
270 | # Directional light 01
271 | directionalLight = pc.DirectionalLight('directionalLight1')
272 | directionalLight.setColor((0.4, 0.4, 0.4, 1))
273 | directionalLightNP = self.render.attachNewNode(directionalLight)
274 | directionalLightNP.setPos(10, 10, 10)
275 | directionalLightNP.lookAt((0, 0, 0), (0, 1, 0))
276 |
277 | directionalLightNP.wrtReparentTo(self.cameraRefNode)
278 | self.render.setLight(directionalLightNP)
279 | self.d_lights.append(directionalLightNP)
280 |
281 | # Directional light 02
282 | directionalLight = pc.DirectionalLight('directionalLight2')
283 | directionalLight.setColor((0.4, 0.4, 0.4, 1))
284 | directionalLightNP = self.render.attachNewNode(directionalLight)
285 | directionalLightNP.setPos(-10, 10, 10)
286 | directionalLightNP.lookAt((0, 0, 0), (0, 1, 0))
287 |
288 | directionalLightNP.wrtReparentTo(self.cameraRefNode)
289 | self.render.setLight(directionalLightNP)
290 | self.d_lights.append(directionalLightNP)
291 |
292 |
293 | # Directional light 03
294 | directionalLight = pc.DirectionalLight('directionalLight3')
295 | directionalLight.setColorTemperature(6500)
296 | directionalLightNP = self.render.attachNewNode(directionalLight)
297 | directionalLightNP.setPos(0, 20, -10)
298 | directionalLightNP.lookAt((0, 0, 0), (0, 1, 0))
299 |
300 | directionalLightNP.wrtReparentTo(self.cameraRefNode)
301 | directionalLight.setShadowCaster(True, 2048, 2048)
302 | directionalLight.getLens().setFilmSize((10,10))
303 | directionalLight.getLens().setNearFar(0.1,300)
304 |
305 | self.render.setLight(directionalLightNP)
306 | self.d_lights.append(directionalLightNP)
307 |
308 | self.render.setShaderAuto(True)
309 |
310 |
311 | def create_joint(self, link_id, position, end_effector=False):
312 | # create a joint
313 | box = self.loader.loadModel("data/materials/GroundScene.egg")
314 | node = self.render.attachNewNode(f"joint{link_id}")
315 | box.reparentTo(node)
316 |
317 | # add texture
318 | box.setTextureOff(1)
319 | if end_effector:
320 | tex = self.create_texture([0,1,0,1], f"joint{link_id}_tex")
321 | box.setTexture(tex, 1)
322 | box.setScale(0.01,0.01,0.01)
323 | node.setPos(self.render, *position)
324 | return node
325 |
326 | def create_link(self, link_id, position, scale, rot):
327 | # create a link
328 | box = self.loader.loadModel("data/materials/GroundScene.egg")
329 | node = self.render.attachNewNode(f"link{link_id}")
330 | box.reparentTo(node)
331 |
332 | # add texture
333 | box.setTextureOff(1)
334 | box.setTexture(self.tex,1)
335 | box.setScale(*scale)
336 |
337 | node.setPos(self.render, *position)
338 | if rot is not None:
339 | node.setQuat(self.render, pc.Quat(*rot[[3,0,1,2]].tolist()))
340 | return node
341 |
342 | def show_axis_frame(self):
343 | pose = [ [1,0,0], [0,1,0], [0,0,1] ]
344 | color = [ [1,0,0,1], [0,1,0,1], [0,0,1,1] ]
345 | for i in range(3):
346 | box = self.loader.loadModel("data/materials/GroundScene.egg")
347 | box.setScale(0.1, 0.1, 0.1)
348 | box.setPos(*pose[i])
349 | tex = self.create_texture(color[i], f"frame{i}")
350 | box.setTextureOff(1)
351 | box.setTexture(tex,1)
352 | box.reparentTo(self.render)
353 |
354 |
355 | def update(self, task):
356 | if self.update_func and self.update_flag:
357 | self.update_func(self)
358 | return task.cont
359 |
360 | def load_character(self):
361 | info = np.load('data/default_skeleton.npy', allow_pickle=True).item()
362 | joint_pos = info['joint_pos']
363 | body_pos = info['body_pos']
364 | joint_name = info['joint_name']
365 | body_rot = info.get('body_ori', None)
366 |
367 | joint, body = [], []
368 |
369 | thickness = 0.03
370 | # joint_name = ['RootJoint'] + joint_name
371 | name_idx_map = {joint_name[i]:i for i in range(len(joint_name))}
372 | scale = [ [thickness]*3 for i in range(len(body_pos))]
373 |
374 | scale[name_idx_map['RootJoint']] = [0.06,0.06,thickness]
375 | scale[name_idx_map['torso_head']] = [0.05,0.08,thickness]
376 | scale[name_idx_map['lowerback_torso']] = [thickness,0.05,thickness]
377 | scale[name_idx_map['pelvis_lowerback']] = [thickness,0.046,thickness]
378 |
379 | scale[name_idx_map['lHip']] = scale[name_idx_map['rHip']] = [thickness,0.2,thickness]
380 | scale[name_idx_map['rKnee']] = scale[name_idx_map['lKnee']] = [thickness,0.19,thickness]
381 | scale[name_idx_map['rAnkle']] = scale[name_idx_map['lAnkle']] = [thickness*1.5,thickness,0.08]
382 | scale[name_idx_map['rToeJoint']] = scale[name_idx_map['lToeJoint']] = [thickness*1.5,thickness*0.6,0.02]
383 |
384 | scale[name_idx_map['rTorso_Clavicle']] = scale[name_idx_map['lTorso_Clavicle']] = [0.05,thickness,thickness]
385 | scale[name_idx_map['rShoulder']] = scale[name_idx_map['lShoulder']] = [0.11,thickness,thickness]
386 | scale[name_idx_map['rElbow']] = scale[name_idx_map['lElbow']] = [0.11,thickness,thickness]
387 | scale[name_idx_map['rWrist']] = scale[name_idx_map['lWrist']] = [0.05,thickness*0.6,thickness*1.2]
388 |
389 | # joint_pos = np.concatenate([body_pos[0:1], joint_pos], axis=0)
390 | for i in range(len(joint_pos)):
391 | joint.append(self.create_joint(i, joint_pos[i], 'end' in joint_name[i]))
392 | if i < body_pos.shape[0]:
393 | body.append(self.create_link(i, body_pos[i], scale[i], rot = body_rot[i] if body_rot is not None else None))
394 | body[-1].wrtReparentTo(joint[-1])
395 |
396 | self.joints = joint
397 | self.joint_name = joint_name
398 | self.name2idx = name_idx_map
399 | self.parent_index = info['parent']
400 | self.init_joint_pos = self.get_joint_positions()
401 |
402 | def get_joint_positions(self):
403 | pos = [joint.getPos(self.render) for joint in self.joints]
404 | return np.concatenate([pos], axis=0)
405 |
406 | def get_joint_orientations(self):
407 | quat = [joint.getQuat(self.render) for joint in self.joints]
408 | return np.concatenate([quat], axis=0)[..., [1,2,3,0]]
409 |
410 | def get_joint_position_by_name(self, name):
411 | pos = self.joints[self.name2idx[name]].getPos(self.render)
412 | return np.array(pos)
413 |
414 | def get_joint_orientation_by_name(self, name):
415 | quat = self.joints[self.name2idx[name]].getQuat(self.render)
416 | return np.array(quat)[..., [1,2,3,0]]
417 |
418 | def set_joint_position_by_name(self, name, pos):
419 | self.joints[self.name2idx[name]].setPos(self.render, *pos)
420 |
421 | def set_joint_orientation_by_name(self, name, quat):
422 | self.joints[self.name2idx[name]].setQuat(self.render, pc.Quat(*quat[...,[3,0,1,2]].tolist()))
423 |
424 | def set_joint_position_orientation(self, link_name, pos, quat):
425 | if not link_name in self.name2idx:
426 | return
427 | self.joints[self.name2idx[link_name]].setPos(self.render, *pos.tolist())
428 | self.joints[self.name2idx[link_name]].setQuat(self.render, pc.Quat(*quat[...,[3,0,1,2]].tolist()))
429 |
430 | def show_pose(self, joint_name_list, joint_positions, joint_orientations):
431 | length = len(joint_name_list)
432 | assert joint_positions.shape == (length, 3)
433 | assert joint_orientations.shape == (length, 4)
434 |
435 | for i in range(length):
436 | self.set_joint_position_orientation(joint_name_list[i], joint_positions[i], joint_orientations[i])
437 | def show_rest_pose(self, joint_name, joint_parent, joint_offset):
438 | length = len(joint_name)
439 | joint_positions = np.zeros((length, 3), dtype=np.float64)
440 | joint_orientations = np.zeros((length, 4), dtype=np.float64)
441 | for i in range(length):
442 | if joint_parent[i] == -1:
443 | joint_positions[i] = joint_offset[i]
444 | else:
445 | joint_positions[i] = joint_positions[joint_parent[i]] + joint_offset[i]
446 | joint_orientations[i, 3] = 1.0
447 | self.set_joint_position_orientation(joint_name[i], joint_positions[i], joint_orientations[i])
448 |
449 | def get_meta_data(self):
450 | return self.joint_name, self.parent_index, self.init_joint_pos
451 |
452 | def move_marker(self, marker, x, y):
453 |
454 | if not self.update_marker_func:
455 | return
456 |
457 | y_axis = self.cameractrl._locked_info[2]
458 | z_axis = self.cameractrl.position - self.cameractrl.center
459 | x_axis = np.cross(y_axis, z_axis)
460 | x_axis = x_axis / np.linalg.norm(x_axis)
461 | pos = np.array(marker.getPos(self.render))
462 | pos += x_axis * x + y_axis * y
463 | marker.setPos(self.render, *pos.tolist())
464 | self.update_marker_func(self)
465 |
466 | def camera_fwd(self):
467 | return self.cameractrl.center - self.cameractrl.position
468 |
469 | def create_marker(self, pos, color):
470 | self.update_marker_func = None
471 | marker = self.loader.loadModel("data/materials/GroundScene.egg")
472 | marker.setScale(0.05,0.05,0.05)
473 | marker.setPos(*pos)
474 | tex = self.create_texture(color, "marker")
475 | marker.setTextureOff(1)
476 | marker.setTexture(tex,1)
477 |
478 | marker.wrtReparentTo(self.render)
479 |
480 | self.accept('w', self.move_marker, [marker, 0, 0.05])
481 | self.accept('s', self.move_marker, [marker, 0, -0.05])
482 | self.accept('a', self.move_marker, [marker, -0.05, 0])
483 | self.accept('d', self.move_marker, [marker, 0.05, 0])
484 |
485 | self.accept('w-repeat', self.move_marker, [marker, 0, 0.05])
486 | self.accept('s-repeat', self.move_marker, [marker, 0, -0.05])
487 | self.accept('a-repeat', self.move_marker, [marker, -0.05, 0])
488 | self.accept('d-repeat', self.move_marker, [marker, 0.05, 0])
489 | return marker
490 |
491 | def create_marker2(self, pos, color):
492 | self.update_marker_func = None
493 | marker = self.loader.loadModel("data/materials/GroundScene.egg")
494 | marker.setScale(0.05,0.05,0.05)
495 | marker.setPos(*pos)
496 | tex = self.create_texture(color, "marker")
497 | marker.setTextureOff(1)
498 | marker.setTexture(tex,1)
499 |
500 | marker.wrtReparentTo(self.render)
501 |
502 | self.accept('arrow_up', self.move_marker, [marker, 0, 0.05])
503 | self.accept('arrow_down', self.move_marker, [marker, 0, -0.05])
504 | self.accept('arrow_left', self.move_marker, [marker, -0.05, 0])
505 | self.accept('arrow_right', self.move_marker, [marker, 0.05, 0])
506 |
507 | self.accept('arrow_up-repeat', self.move_marker, [marker, 0, 0.05])
508 | self.accept('arrow_down-repeat', self.move_marker, [marker, 0, -0.05])
509 | self.accept('arrow_left-repeat', self.move_marker, [marker, -0.05, 0])
510 | self.accept('arrow_right-repeat', self.move_marker, [marker, 0.05, 0])
511 | return marker
512 |
513 | def create_arrow(self, pos, forward_xz = np.array([0,1]), color = [1,0,0,0]):
514 | from .visualize_utils import draw_arrow
515 | arrow = self.render.attachNewNode("arrow")
516 | draw_arrow(arrow, 0.3, 1, color)
517 | arrow.setPos(*pos)
518 |
519 | from scipy.spatial.transform import Rotation as R
520 | axis = np.array([0,1,0])
521 | angle = np.arctan2(forward_xz[0], forward_xz[1])
522 | rot = R.from_rotvec(angle * axis).as_quat()
523 | quat = pc.Quat(rot[3], rot[0], rot[1], rot[2])
524 | arrow.setQuat(quat)
525 | return arrow
526 |
527 |
528 | class ShowBVHUpdate():
529 | def __init__(self, viewer, joint_name, translation, orientation):
530 | self.viewer = viewer
531 | self.cur_frame = 0
532 | self.joint_name = joint_name
533 | self.translation = translation
534 | self.orientation = orientation
535 |
536 | def update(self, task):
537 | if not self.viewer.update_flag:
538 | return task.cont
539 |
540 | speed_inv = 1
541 | for i in range(len(self.joint_name)):
542 | self.viewer.set_joint_position_orientation(
543 | self.joint_name[i],
544 | self.translation[self.cur_frame // speed_inv, i, :],
545 | self.orientation[self.cur_frame // speed_inv, i, :])
546 | self.cur_frame = (self.cur_frame + 1) % (self.translation.shape[0] *
547 | speed_inv)
548 | return task.cont
--------------------------------------------------------------------------------
/assignment_2/Viewer/visualize_utils.py:
--------------------------------------------------------------------------------
1 | from panda3d.core import *
2 | import numpy as np
3 |
4 |
5 | def draw_circle(nodepath, radius, color):
6 | from direct.showutil.Rope import Rope
7 | r = Rope()
8 | w = 0.5
9 | a = 0.866*radius
10 | b = 0.5*radius
11 | c = 1*radius
12 | points = [
13 | (-a, 0, b, 1),
14 | (0, 0, c, w),
15 | (a, 0, b, 1),
16 | (a, 0, -b, w),
17 | (0, 0, -c, 1),
18 | (-a, 0, -b, w),
19 | (-a, 0, b, 1),
20 | ]
21 | verts = [
22 | {
23 | 'node': nodepath,
24 | 'point': point,
25 | 'color': color,
26 | 'thickness': 10,
27 | }
28 | for point in points
29 | ]
30 | r.ropeNode.setUseVertexThickness(1)
31 | r.ropeNode.setUseVertexColor(1)
32 | r.setup(3, verts, knots=[0, 0, 0, 1/3, 1/3, 2/3, 2/3, 1, 1, 1])
33 | r.reparentTo(nodepath)
34 | return r
35 |
36 |
37 | def draw_circle_with_arrow(nodepath, radius, color, with_circle=True):
38 | if with_circle:
39 | draw_circle(nodepath, radius, color)
40 | from direct.showutil import BuildGeometry as BG
41 | node = nodepath.attach_new_node('arrow')
42 | _, _, geo = BG.addArrowGeom(
43 | node, 0.03, 0.3, color=Vec4(*(i/2 for i in color)))
44 | node.setQuat(Quat(0, 0, 1, 0)*Quat(0.707, 0.707, 0, 0))
45 | node.setPos(0, 0, 0.15)
46 | node.wrtReparentTo(nodepath)
47 | return node
48 |
49 |
50 | def draw_arrow(nodepath, width, length, color):
51 | from direct.showutil import BuildGeometry as BG
52 | node = nodepath.attach_new_node('arrow')
53 | _, _, geo = BG.addArrowGeom(
54 | node, width, length, color=Vec4(*(i for i in color)))
55 | node.setQuat(Quat(0, 0, 1, 0)*Quat(0.707, 0.707, 0, 0))
56 | node.setPos(0, 0, length/2)
57 | node.wrtReparentTo(nodepath)
58 | return node
59 |
60 |
61 | def pos_vel_to_beizer(position, velocity, dt):
62 | position = position.reshape(-1, 3)
63 | velocity = velocity.reshape(-1, 3)
64 | prev = position - velocity*dt
65 | post = position + velocity*dt
66 | points = np.concatenate([prev, position, post], axis=-1).reshape(-1, 3)
67 | points = points[1:-1]
68 | return points
69 |
70 |
71 | def draw_beizer(positions, velocity, dt, rope):
72 | points = pos_vel_to_beizer(positions, velocity, dt)
73 | points = [{
74 | 'node': None,
75 | 'point': tuple(point),
76 | 'color': (0, 0, 1, 1),
77 | 'thickness': 10,
78 | } for point in points]
79 | if rope is None:
80 | from direct.showutil.Rope import Rope
81 | rope = Rope()
82 | rope.ropeNode.setUseVertexThickness(1)
83 | rope.ropeNode.setUseVertexColor(1)
84 | rope.setup(3, points)
85 | rope.set_render_mode_thickness(100)
86 | return rope
87 | # rope.verts = points
88 | # rope.recompute()
89 |
--------------------------------------------------------------------------------
/assignment_2/data/default_skeleton.npy:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Shimingyi/COMP3360_Data_Driven_Animation/0eddde45ec5f9992fc0d99581d8eff9ac6b67a0a/assignment_2/data/default_skeleton.npy
--------------------------------------------------------------------------------
/assignment_2/data/materials/Checker.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Shimingyi/COMP3360_Data_Driven_Animation/0eddde45ec5f9992fc0d99581d8eff9ac6b67a0a/assignment_2/data/materials/Checker.png
--------------------------------------------------------------------------------
/assignment_2/data/materials/GroundScene.egg:
--------------------------------------------------------------------------------
1 | <CoordinateSystem> { Y-up }
2 | 
3 | <Texture> Checker {
4 |   "Checker.png"
5 |   <Scalar> envtype { MODULATE }
6 |   <Scalar> minfilter { LINEAR_MIPMAP_LINEAR }
7 |   <Scalar> magfilter { LINEAR_MIPMAP_LINEAR }
8 |   <Scalar> wrap { REPEAT }
9 | }
10 | <Group> Cube {
11 |   <Transform> {
12 |     <Matrix4> {
13 |       1.0 0.0 0.0 0.0
14 |       0.0 1.0 0.0 0.0
15 |       0.0 0.0 1.0 0.0
16 |       0.0 0.0 0.0 1.0
17 |     }
18 |   }
19 | 
20 |   <VertexPool> Cube {
21 |     <Vertex> 0 {-1.000000 -1.000000 -1.000000 <UV> { 0.0 0.0 } }
22 |     <Vertex> 1 {-1.000000 -1.000000 1.000000 <UV> { 0.0 1.0 } }
23 |     <Vertex> 2 {-1.000000 1.000000 1.000000 <UV> { 1.0 1.0 } }
24 |     <Vertex> 3 {-1.000000 1.000000 -1.000000 <UV> { 1.0 0.0 } }
25 |     <Vertex> 4 {-1.000000 1.000000 -1.000000 <UV> { 0.0 0.0 } }
26 |     <Vertex> 5 {-1.000000 1.000000 1.000000 <UV> { 0.0 1.0 } }
27 |     <Vertex> 6 { 1.000000 1.000000 1.000000 <UV> { 1.0 1.0 } }
28 |     <Vertex> 7 { 1.000000 1.000000 -1.000000 <UV> { 1.0 0.0 } }
29 |     <Vertex> 8 { 1.000000 1.000000 -1.000000 <UV> { 0.0 0.0 } }
30 |     <Vertex> 9 { 1.000000 1.000000 1.000000 <UV> { 0.0 1.0 } }
31 |     <Vertex> 10 { 1.000000 -1.000000 1.000000 <UV> { 1.0 1.0 } }
32 |     <Vertex> 11 { 1.000000 -1.000000 -1.000000 <UV> { 1.0 0.0 } }
33 |     <Vertex> 12 { 1.000000 -1.000000 -1.000000 <UV> { 0.0 0.0 } }
34 |     <Vertex> 13 { 1.000000 -1.000000 1.000000 <UV> { 0.0 1.0 } }
35 |     <Vertex> 14 {-1.000000 -1.000000 1.000000 <UV> { 1.0 1.0 } }
36 |     <Vertex> 15 {-1.000000 -1.000000 -1.000000 <UV> { 1.0 0.0 } }
37 |     <Vertex> 16 {-1.000000 1.000000 -1.000000 <UV> { 0.0 0.0 } }
38 |     <Vertex> 17 { 1.000000 1.000000 -1.000000 <UV> { 0.0 1.0 } }
39 |     <Vertex> 18 { 1.000000 -1.000000 -1.000000 <UV> { 1.0 1.0 } }
40 |     <Vertex> 19 {-1.000000 -1.000000 -1.000000 <UV> { 1.0 0.0 } }
41 |     <Vertex> 20 { 1.000000 1.000000 1.000000 <UV> { 0.0 0.0 } }
42 |     <Vertex> 21 {-1.000000 1.000000 1.000000 <UV> { 0.0 1.0 } }
43 |     <Vertex> 22 {-1.000000 -1.000000 1.000000 <UV> { 1.0 1.0 } }
44 |     <Vertex> 23 { 1.000000 -1.000000 1.000000 <UV> { 1.0 0.0 } }
45 |   }
46 | 
47 | 
48 |   <Polygon> {
49 |     <TRef> { Checker }
50 |     <Normal> {-1.000000 0.000000 0.000000}
51 |     <VertexRef> { 0 1 2 3 <Ref> { Cube }}
52 |   }
53 |   <Polygon> {
54 |     <TRef> { Checker }
55 |     <Normal> {0.000000 1.000000 0.000000}
56 |     <VertexRef> { 4 5 6 7 <Ref> { Cube }}
57 |   }
58 |   <Polygon> {
59 |     <TRef> { Checker }
60 |     <Normal> {1.000000 0.000000 0.000000}
61 |     <VertexRef> { 8 9 10 11 <Ref> { Cube }}
62 |   }
63 |   <Polygon> {
64 |     <TRef> { Checker }
65 |     <Normal> {0.000000 -1.000000 0.000000}
66 |     <VertexRef> { 12 13 14 15 <Ref> { Cube }}
67 |   }
68 |   <Polygon> {
69 |     <TRef> { Checker }
70 |     <Normal> {0.000000 0.000000 -1.000000}
71 |     <VertexRef> { 16 17 18 19 <Ref> { Cube }}
72 |   }
73 |   <Polygon> {
74 |     <TRef> { Checker }
75 |     <Normal> {0.000000 0.000000 1.000000}
76 |     <VertexRef> { 20 21 22 23 <Ref> { Cube }}
77 |   }
78 | }
79 | 
--------------------------------------------------------------------------------
/assignment_2/env_test.py:
--------------------------------------------------------------------------------
1 | from Viewer.viewer import SimpleViewer
2 |
3 | if __name__ == "__main__":
4 | viewer = SimpleViewer()
5 | viewer.run()
6 |
--------------------------------------------------------------------------------
/assignment_2/file_io.py:
--------------------------------------------------------------------------------
1 | import copy
2 | import numpy as np
3 | import scipy.signal as signal
4 | from scipy.spatial.transform import Rotation as R
5 |
6 | def load_meta_data(bvh_path):
7 | with open(bvh_path, 'r') as f:
8 | channels = []
9 | joint_names = []
10 | joint_parents = []
11 | joint_offsets = []
12 | end_sites = []
13 |
14 | parent_stack = [None]
15 | for line in f:
16 | if 'ROOT' in line or 'JOINT' in line:
17 | joint_names.append(line.split()[-1])
18 | joint_parents.append(parent_stack[-1])
19 | channels.append('')
20 | joint_offsets.append([0, 0, 0])
21 |
22 | elif 'End Site' in line:
23 | end_sites.append(len(joint_names))
24 | joint_names.append(parent_stack[-1] + '_end')
25 | joint_parents.append(parent_stack[-1])
26 | channels.append('')
27 | joint_offsets.append([0, 0, 0])
28 |
29 | elif '{' in line:
30 | parent_stack.append(joint_names[-1])
31 |
32 | elif '}' in line:
33 | parent_stack.pop()
34 |
35 | elif 'OFFSET' in line:
36 | joint_offsets[-1] = np.array([float(x) for x in line.split()[-3:]]).reshape(1,3)
37 |
38 | elif 'CHANNELS' in line:
39 | trans_order = []
40 | rot_order = []
41 | for token in line.split():
42 | if 'position' in token:
43 | trans_order.append(token[0])
44 |
45 | if 'rotation' in token:
46 | rot_order.append(token[0])
47 |
48 | channels[-1] = ''.join(trans_order)+ ''.join(rot_order)
49 |
50 | elif 'Frame Time:' in line:
51 | break
52 |
53 | joint_parents = [-1]+ [joint_names.index(i) for i in joint_parents[1:]]
54 | channels = [len(i) for i in channels]
55 | return joint_names, joint_parents, channels, joint_offsets
56 |
57 |
58 | def load_motion_data(bvh_file_path):
59 | with open(bvh_file_path, 'r') as f:
60 | lines = f.readlines()
61 | for i in range(len(lines)):
62 | if lines[i].startswith('Frame Time'):
63 | break
64 | motion_data = []
65 | for line in lines[i+1:]:
66 | data = [float(x) for x in line.split()]
67 | if len(data) == 0:
68 | break
69 | motion_data.append(np.array(data).reshape(1,-1))
70 | motion_data = np.concatenate(motion_data, axis=0)
71 |
72 | joint_names, joint_parents, channels, joint_offsets = load_meta_data(bvh_file_path)
73 | joint_number = len(joint_names)
74 |
75 | local_joint_positions = np.zeros((motion_data.shape[0], joint_number, 3))
76 | local_joint_rotations = np.zeros((motion_data.shape[0], joint_number, 4))
77 | local_joint_rotations[:,:,3] = 1.0
78 |
79 | cur_channel = 0
80 | for i in range(len(joint_names)):
81 | if channels[i] == 0:
82 | local_joint_positions[:,i,:] = joint_offsets[i].reshape(1,3)
83 | continue
84 | elif channels[i] == 3:
85 | local_joint_positions[:,i,:] = joint_offsets[i].reshape(1,3)
86 | rotation = motion_data[:, cur_channel:cur_channel+3]
87 | elif channels[i] == 6:
88 | local_joint_positions[:, i, :] = motion_data[:, cur_channel:cur_channel+3]
89 | rotation = motion_data[:, cur_channel+3:cur_channel+6]
90 | local_joint_rotations[:, i, :] = R.from_euler('XYZ', rotation,degrees=True).as_quat()
91 | cur_channel += channels[i]
92 |
93 | return motion_data, local_joint_positions, local_joint_rotations
94 |
95 |
96 | class BVHMotion():
97 | def __init__(self, bvh_file_name = None) -> None:
98 |
99 | self.joint_name = []
100 | self.joint_channel = []
101 | self.joint_parent = []
102 |
103 |         self.local_joint_positions = None # (N,M,3) ndarray, local translations
104 |         self.local_joint_rotations = None # (N,M,4) ndarray, local rotations as quaternions
105 |
106 | if bvh_file_name is not None:
107 | self.load_motion(bvh_file_name)
108 | pass
109 |
110 | def load_motion(self, bvh_file_path):
111 | self.joint_name, self.joint_parent, self.joint_channel, self.joint_offset = \
112 | load_meta_data(bvh_file_path)
113 |
114 | self.motion_data, self.local_joint_positions, self.local_joint_rotations = load_motion_data(bvh_file_path)
115 |
116 | def batch_forward_kinematics(self, joint_position=None, joint_rotation=None):
117 | if joint_position is None:
118 | joint_position = self.local_joint_positions
119 | if joint_rotation is None:
120 | joint_rotation = self.local_joint_rotations
121 |
122 | if len(joint_position.shape) == 2:
123 | joint_position = joint_position.reshape(1, -1, 3).copy()
124 | joint_rotation = joint_rotation.reshape(1, -1, 4).copy()
125 |
126 | joint_translation = np.zeros_like(joint_position)
127 | joint_orientation = np.zeros_like(joint_rotation)
128 | joint_orientation[:,:,3] = 1.0
129 |
130 | for i in range(len(self.joint_name)):
131 |             pi = self.joint_parent[i]  # -1 for the root: index -1 reads the last joint, which is still identity here, so the root gets an identity parent
132 | parent_orientation = R.from_quat(joint_orientation[:,pi,:])
133 | joint_translation[:, i, :] = joint_translation[:, pi, :] + \
134 | parent_orientation.apply(joint_position[:, i, :])
135 | joint_orientation[:, i, :] = (parent_orientation * R.from_quat(joint_rotation[:, i, :])).as_quat()
136 | return joint_translation, joint_orientation
137 |
138 | def raw_copy(self):
139 | return copy.deepcopy(self)
140 |
141 | def sub_sequence(self, start, end):
142 | res = self.raw_copy()
143 | res.local_joint_positions = res.local_joint_positions[start:end,:,:]
144 | res.local_joint_rotations = res.local_joint_rotations[start:end,:,:]
145 | return res
146 |
147 | def adjust_joint_name(self, target_joint_name):
148 | idx = [
149 | self.joint_name.index(joint_name)
150 | for joint_name in target_joint_name
151 | ]
152 | idx_inv = [
153 | target_joint_name.index(joint_name)
154 | for joint_name in self.joint_name
155 | ]
156 | self.joint_name = [self.joint_name[i] for i in idx]
157 | self.joint_parent = [idx_inv[self.joint_parent[i]] for i in idx]
158 | self.joint_parent[0] = -1
159 | self.joint_channel = [self.joint_channel[i] for i in idx]
160 | self.local_joint_positions = self.local_joint_positions[:, idx, :]
161 | self.local_joint_rotations = self.local_joint_rotations[:, idx, :]
162 | pass
--------------------------------------------------------------------------------
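The parent-chain accumulation in `batch_forward_kinematics` above can be sketched as a standalone function. This is a minimal illustrative sketch (the name `forward_kinematics` is hypothetical), assuming (x, y, z, w) quaternions; unlike the class version, which relies on index `-1` wrapping to a still-identity last joint, it handles the root explicitly:

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def forward_kinematics(parents, local_pos, local_rot):
    """Accumulate local offsets/rotations down the parent chain."""
    num_joints = len(parents)
    world_pos = np.zeros((num_joints, 3))
    world_rot = np.zeros((num_joints, 4))
    world_rot[:, 3] = 1.0  # identity quaternions in (x, y, z, w) order
    for i in range(num_joints):
        p = parents[i]
        if p == -1:  # root: world transform equals local transform
            world_pos[i] = local_pos[i]
            world_rot[i] = local_rot[i]
        else:
            parent = R.from_quat(world_rot[p])
            # child position = parent position + parent-rotated local offset
            world_pos[i] = world_pos[p] + parent.apply(local_pos[i])
            # child orientation = parent orientation composed with local rotation
            world_rot[i] = (parent * R.from_quat(local_rot[i])).as_quat()
    return world_pos, world_rot
```

Parents must appear before their children in the joint list, which is how BVH files order joints.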
/assignment_2/matching_utils.py:
--------------------------------------------------------------------------------
1 | import math
2 | import numpy as np
3 | from scipy.spatial.transform import Rotation as R
4 |
5 | def from_euler(e):
6 | return R.from_euler('XYZ', e, degrees=True)
7 |
8 | def lerp(a, b, t):
9 | return a + (b - a) * t
10 |
11 |
12 | def align_quat(qt: np.ndarray, inplace: bool):
13 |     '''Flip signs so that q_n and q_{n+1} lie in the same hemisphere.
14 | 
15 |     The first axis of qt should be time.
16 |     '''
17 | qt = np.asarray(qt)
18 | if qt.shape[-1] != 4:
19 |         raise ValueError('qt has to be an array of quaternions')
20 |
21 | if not inplace:
22 | qt = qt.copy()
23 |
24 |     if qt.size == 4: # nothing to align: only a single quaternion
25 | return qt
26 |
27 | sign = np.sum(qt[:-1] * qt[1:], axis=-1)
28 | sign[sign < 0] = -1
29 | sign[sign >= 0] = 1
30 | sign = np.cumprod(sign, axis=0, )
31 |
32 | qt[1:][sign < 0] *= -1
33 |
34 | return qt
35 |
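A quick runnable demonstration of the hemisphere alignment above (a condensed restatement of `align_quat` so the example is self-contained; `align_quat_demo` is a hypothetical name). Since `q` and `-q` encode the same rotation, a mid-sequence sign flip is spurious and gets removed:

```python
import numpy as np

def align_quat_demo(qt):
    """Flip signs so consecutive quaternions share a hemisphere."""
    qt = np.asarray(qt, dtype=float).copy()
    sign = np.sum(qt[:-1] * qt[1:], axis=-1)   # dot product of neighbours
    sign = np.where(sign < 0, -1.0, 1.0)
    sign = np.cumprod(sign, axis=0)            # propagate flips forward in time
    qt[1:][sign < 0] *= -1
    return qt

# a sequence where the middle quaternion was stored with the opposite sign
q = np.array([0.0, 0.0, 0.0, 1.0])
aligned = align_quat_demo(np.stack([q, -q, q]))
```

Alignment like this is a common preprocessing step before differencing or interpolating quaternion sequences, e.g. in `quat_to_avel` below.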
36 |
37 | def quat_to_avel(rot, dt):
38 | quat_diff = (rot[1:] - rot[:-1])/dt
39 | quat_diff[...,-1] = (1 - np.sum(quat_diff[...,:-1]**2, axis=-1)).clip(min = 0)**0.5
40 | quat_tmp = rot[:-1].copy()
41 | quat_tmp[...,:3] *= -1
42 | shape = quat_diff.shape[:-1]
43 | rot_tmp = R.from_quat(quat_tmp.reshape(-1, 4)) * R.from_quat(quat_diff.reshape(-1, 4))
44 | return 2 * rot_tmp.as_quat().reshape( shape + (4, ) )[...,:3]
45 |
46 |
47 | def forward_kinematics_with_channel(joint_parent, joint_channel, joint_offset, motion_data):
48 | num = len(joint_parent)
49 | joint_positions = np.zeros((motion_data.shape[0], num, 3))
50 | joint_orientations = np.zeros((motion_data.shape[0], num, 4))
51 | joint_orientations[...,3] = 1
52 |
53 | j = 0
54 | for i in range(num):
55 |
56 | if joint_channel[i] == 6:
57 | joint_positions[:, i] = joint_offset[i] + motion_data[:, j:j+3]
58 | rot = R.from_euler('XYZ', motion_data[:, j+3:j+6], degrees=True)
59 | if joint_parent[i] != -1:
60 | joint_positions[:, i] += joint_positions[:, joint_parent[i]]
61 | rot = R.from_quat(joint_orientations[:, joint_parent[i]])*rot
62 | joint_orientations[:, i] = rot.as_quat()
63 | j += 6
64 | else:
65 | parent_rot = R.from_quat(joint_orientations[:, joint_parent[i]])
66 | joint_positions[:, i] = joint_positions[:, joint_parent[i]] + parent_rot.apply(joint_offset[i])
67 | if joint_channel[i] == 3:
68 | joint_orientations[:, i] = (parent_rot*R.from_euler('XYZ', motion_data[:, j:j+3], degrees=True)).as_quat()
69 | j += 3
70 | return joint_positions, joint_orientations
71 |
72 |
73 | class InterpolationHelper():
74 |
75 | @staticmethod
76 | def halflife2dampling(halflife):
77 | return 4 * math.log(2) / halflife
78 |     @staticmethod
79 | def simulation_positions_update(pos, vel, acc, target_vel, halflife, dt):
80 | d = InterpolationHelper.halflife2dampling(halflife)/2
81 | j0 = vel - target_vel
82 | j1 = acc + d * j0
83 | eydt = math.exp(-d * dt)
84 | pos_prev = pos
85 | tmp1 = j0+j1*dt
86 | tmp2 = j1/(d*d)
87 | pos = eydt * ( -tmp2 -tmp1/d ) + tmp2 + j0/d + target_vel*dt + pos_prev
88 | vel = eydt*tmp1 + target_vel
89 | acc = eydt * (acc - j1*d*dt)
90 | return pos, vel, acc
91 |
92 | @staticmethod
93 | def simulation_rotations_update(rot, avel, target_rot, halflife, dt):
94 | d = InterpolationHelper.halflife2dampling(halflife)/2
95 | j0 = R.from_quat(rot) * R.from_quat(target_rot).inv()
96 | j0 = j0.as_rotvec()
97 | j1 = avel + d * j0
98 | eydt = math.exp(-d * dt)
99 | tmp1 = eydt * (j0 + j1 * dt)
100 | rot = R.from_rotvec(tmp1) * R.from_quat(target_rot)
101 | rot = rot.as_quat()
102 | avel = eydt * (avel - j1 * dt * d)
103 | return rot, avel
104 |
105 | @staticmethod
106 | def decay_spring_implicit_damping_rot(rot, avel, halflife, dt):
107 | d = InterpolationHelper.halflife2dampling(halflife)/2
108 | j0 = from_euler(rot).as_rotvec()
109 | j1 = avel + d * j0
110 | eydt = math.exp(-d * dt)
111 | a1 = eydt * (j0+j1*dt)
112 |
113 | rot_res = R.from_rotvec(a1).as_euler('XYZ', degrees=True)
114 | avel_res = eydt * (avel - j1 * dt * d)
115 | return rot_res, avel_res
116 |
117 | @staticmethod
118 | def decay_spring_implicit_damping_pos(pos, vel, halflife, dt):
119 | d = InterpolationHelper.halflife2dampling(halflife)/2
120 | j1 = vel + d * pos
121 | eydt = math.exp(-d * dt)
122 | pos = eydt * (pos+j1*dt)
123 | vel = eydt * (vel - j1 * dt * d)
124 | return pos, vel
125 |
126 | @staticmethod
127 | def inertialize_transition_rot(prev_off_rot, prev_off_avel, src_rot, src_avel, dst_rot, dst_avel):
128 | prev_off_rot, prev_off_avel = InterpolationHelper.decay_spring_implicit_damping_rot(prev_off_rot, prev_off_avel, 1/20, 1/60)
129 | off_rot = from_euler(prev_off_rot) * from_euler(src_rot) * from_euler(dst_rot).inv()
130 | off_avel = prev_off_avel + src_avel - dst_avel
131 | # off_rot = from_euler(src_rot) * from_euler(dst_rot).inv()
132 | # off_avel = src_avel - dst_avel
133 | return off_rot.as_euler('XYZ', degrees=True), off_avel
134 |
135 | @staticmethod
136 | def inertialize_update_rot(prev_off_rot, prev_off_avel, rot, avel, halflife, dt):
137 | off_rot , off_avel = InterpolationHelper.decay_spring_implicit_damping_rot(prev_off_rot, prev_off_avel, halflife, dt)
138 | rot = from_euler(off_rot) * from_euler(rot)
139 | avel = off_avel + avel
140 | return rot.as_euler('XYZ', degrees=True), avel, off_rot, off_avel
141 |
142 | @staticmethod
143 | def inertialize_transition_pos(prev_off_pos, prev_off_vel, src_pos, src_vel, dst_pos, dst_vel):
144 | prev_off_pos, prev_off_vel = InterpolationHelper.decay_spring_implicit_damping_pos(prev_off_pos, prev_off_vel, 1/20, 1/60)
145 | off_pos = prev_off_pos + src_pos - dst_pos
146 | off_vel = prev_off_vel + src_vel - dst_vel
147 | return off_pos, off_vel
148 |
149 | @staticmethod
150 | def inertialize_update_pos(prev_off_pos, prev_off_vel, pos, vel, halflife, dt):
151 | off_pos , off_vel = InterpolationHelper.decay_spring_implicit_damping_pos(prev_off_pos, prev_off_vel, halflife, dt)
152 | pos = off_pos + pos
153 | vel = off_vel + vel
154 | return pos, vel, off_pos, off_vel
155 |
--------------------------------------------------------------------------------
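The damped-spring update used throughout `InterpolationHelper` above (and behind the inertialization bonus in task 1) steps a critically damped spring that decays an offset toward zero at a roughly halflife-controlled rate. A minimal self-contained sketch, with the hypothetical names `halflife_to_damping` and `decay_spring_pos`:

```python
import math

def halflife_to_damping(halflife):
    # damping constant chosen so the offset roughly halves every `halflife` seconds
    return 4.0 * math.log(2) / halflife

def decay_spring_pos(pos, vel, halflife, dt):
    """One exact step of a critically damped spring driving (pos, vel) to zero."""
    d = halflife_to_damping(halflife) / 2.0
    j1 = vel + d * pos
    eydt = math.exp(-d * dt)
    # closed-form solution x(t) = e^(-d t) (x0 + (v0 + d x0) t), advanced by dt
    return eydt * (pos + j1 * dt), eydt * (vel - j1 * dt * d)

# decay a unit offset over one second in 60 steps
pos, vel = 1.0, 0.0
for _ in range(60):
    pos, vel = decay_spring_pos(pos, vel, 0.2, 1.0 / 60.0)
```

During an inertialized transition, the pose offset between the old and new clip is fed through this decay so the discontinuity fades smoothly instead of popping.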
/assignment_2/task1_motion_editing.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 | from scipy.spatial.transform import Rotation as R
3 | from scipy.spatial.transform import Slerp
4 |
5 | from file_io import BVHMotion
6 | from Viewer.controller import SimpleViewer
7 | from Viewer.viewer import ShowBVHUpdate
8 |
9 | '''
10 | Find the in-between data between left_data and right_data
11 | Parameters:
12 |     - left_data: the left keyframe data
13 |     - right_data: the right keyframe data
14 |     - t: the number of in-between frames
15 |     - method: the interpolation method, 'linear' or 'slerp'
16 | Return: an array of the in-between data;
17 |     if return_first_key=True, the known left keyframe is included,
18 |     e.g. for t=9 the result is 9 frames, with the left keyframe first
19 | '''
20 | def interpolation(left_data, right_data, t, method='linear', return_first_key=True):
21 | res = [left_data] if return_first_key else []
22 | '''
23 |     TODO: Implement the following function
24 | Hints:
25 | 1. The shape of data is (num_joints, 3) for position and (num_joints, 4) for rotation
26 | 2. The rotation data is in quaternion format
27 | 3. We use linear interpolation for position and slerp for rotation
28 | * The linear interpolation can be obtained by the following formula:
29 | data_between = left_data + (right_data - left_data) * (i / t)
30 | 4. We use scipy.spatial.transform.Slerp to do slerp interpolation (https://docs.scipy.org/doc/scipy/reference/generated/scipy.spatial.transform.Slerp.html)
31 | * The Slerp can be obtained by the following code:
32 | key_rots = R.from_quat([q1, q2])
33 |             key_times = [0, 1]
34 |             slerp = Slerp(key_times, key_rots)
35 |             new_key_times = np.linspace(0, 1, t+1)
36 |             interp_rots = slerp(new_key_times)
37 | * The slerp in scipy doesn't support quaternion with shape (num_joints, 4),
38 | so you need to take the quaternion for each joint in a loop with joint_num,
39 | like left_data[joint_idx], then combine them after the loop.
40 | * Don't take the last frame of slerp, because it is the same as the right_data
41 | '''
42 | ########## Code Start ############
43 | if method == 'linear':
44 |
45 |
46 | return res
47 | elif method == 'slerp':
48 |
49 |
50 | return res
51 | ########## Code End ############
52 |
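The hints above (linear blend for positions, per-joint Slerp for rotations, dropping the final key) can be sketched as follows. This is an illustrative reference under those hints, not the assignment solution, and `interpolate_sketch` is a hypothetical name:

```python
import numpy as np
from scipy.spatial.transform import Rotation as R
from scipy.spatial.transform import Slerp

def interpolate_sketch(left, right, t, method='linear'):
    # Returns t frames including the left key and excluding the right key.
    if method == 'linear':
        # positions: straight linear blend, frame i sits at parameter i/t
        return [left + (right - left) * (i / t) for i in range(t)]
    elif method == 'slerp':
        # rotations: scipy's Slerp handles one joint at a time,
        # so loop over joints and stack the results per frame
        times = np.linspace(0, 1, t + 1)[:-1]  # drop the last key (== right)
        per_joint = []
        for j in range(left.shape[0]):
            slerp = Slerp([0, 1], R.from_quat([left[j], right[j]]))
            per_joint.append(slerp(times).as_quat())     # shape (t, 4)
        quats = np.stack(per_joint, axis=1)              # shape (t, num_joints, 4)
        return [quats[i] for i in range(t)]
```

Note the frame-count convention matches the docstring: for t=9 you get 9 frames, the left key plus 8 in-betweens.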
53 | def part1_key_framing(viewer, time_step, target_step):
54 | motion = BVHMotion('data/motion_walking.bvh')
55 |
56 |     motion_length = motion.local_joint_positions.shape[0]
57 |     keyframes = np.arange(0, motion_length, time_step)
58 |
59 | new_motion_local_positions, new_motion_local_rotations = [], []
60 |
61 | previous_frame_idx = 0
62 | for current_frame_idx in keyframes[1:]:
63 | between_local_pos = interpolation(motion.local_joint_positions[previous_frame_idx],
64 | motion.local_joint_positions[current_frame_idx],
65 | target_step - 1, 'linear')
66 | between_local_rot = interpolation(motion.local_joint_rotations[previous_frame_idx],
67 | motion.local_joint_rotations[current_frame_idx],
68 | target_step - 1, 'slerp')
69 | new_motion_local_positions.append(between_local_pos)
70 | new_motion_local_rotations.append(between_local_rot)
71 | previous_frame_idx = current_frame_idx
72 |
73 | res_motion = motion.raw_copy()
74 | res_motion.local_joint_positions = np.concatenate(new_motion_local_positions)
75 | res_motion.local_joint_rotations = np.concatenate(new_motion_local_rotations)
76 |
77 | translation, orientation = res_motion.batch_forward_kinematics()
78 | task = ShowBVHUpdate(viewer, res_motion.joint_name, translation, orientation)
79 | viewer.addTask(task.update)
80 |
81 | '''
82 | Combine two different motions into one motion
83 | Parameters:
84 |     - motion1: the first motion
85 |     - motion2: the second motion
86 |     - last_frame_index: the last frame index of the first motion
87 |     - start_frame_indx: the start frame index of the second motion
88 |     - between_frames: the number of in-between frames between the two motions
89 |     - searching_frames: the size of the search window used to find the closest frames
90 |     - method: the interpolation method, 'interpolation' or 'inertialization'
91 | '''
92 | def concatenate_two_motions(motion1, motion2, last_frame_index, start_frame_indx, between_frames, searching_frames=20, method='interpolation'):
93 | '''
94 |     TODO: Implement the following function
95 | Hints:
96 |     1. We should operate on the local joint positions and rotations
97 | motion.local_joint_positions: (num_frames, num_joints, 3)
98 | motion.local_joint_rotations: (num_frames, num_joints, 4)
99 | 2. There are five steps in the concatenation:
100 | i) get the searching windows for motion 1 and motion 2
101 | win_1 = motion1.local_joint_rotations[last_frame_index - searching_frames:last_frame_index + searching_frames]
102 | win_2 = motion2.local_joint_rotations[max(0, start_frame_indx - searching_frames):start_frame_indx + searching_frames]
103 | ii) find the closest frame in motion 1 searching window and motion 2 searching window
104 | * You can use similarity matrix in DTW (Dynamic Time Warping) to find the closest frame (motion editing slides)
105 | * sim_matrix is a matrix with shape (win_1.shape[0], win_2.shape[0])
106 | * sim_matrix[i, j] = np.linalg.norm(search_source[i] - search_target[j])
107 | * Find the minimum value in sim_matrix and get the corresponding index i and j
108 | iii) the i and j is the index for the searching window, so convert it to the index in the original motion sequence
109 | iv) we have the pose in motion_1(real_i) and motion_2(real_j), then we can do the interpolation
110 | * The interpolation should be done for the *positions* and *rotations* both
111 | * (Shifting) You must align the root positions of motion_2(real_j) to the root positions of motion_1(real_i)
112 | # The shifting is done by updating variable: motion2.local_joint_positions = motion2.local_joint_positions - ?
113 | v) combine the motion1, betweening, motion2 into one motion (have been provided)
114 | 3. You can get all marks if you done above steps correctly, but bonus will be given if you can do any one of them:
115 | (bonus) There are N between_frames, but the root position in these frames are not considered
116 | (bonus) The velocity of two motions are not the same, it can give different weight to the two motions interpolation
117 | (bonus) The inertialization method provides the best results ref:https://theorangeduck.com/page/spring-roll-call#inertialization
118 | (bonus) Any way to produce more smooth and natural transitions
119 |
120 | Useful functions:
121 |     1. The difference between two vectors: np.linalg.norm(a - b)
122 |     2. The index of the minimal value in a matrix: np.argmin(dtw)
123 | 3. local_joint_rotations = motion.local_joint_rotations
124 | 4. local_joint_positions = motion.local_joint_positions
125 | 5. Visualize the dtw matrix:
126 | import matplotlib.pyplot as plt
127 | plt.imshow(dtw)
128 | plt.show() # You can use plt.savefig('dtw.png') to save the image
129 | '''
130 |
131 | ########## Code Start ############
132 | # search_win1 =
133 | # search_win2 =
134 |
135 | # sim_matrix =
136 | # min_idx =
137 |     # i, j = min_idx // sim_matrix.shape[1], min_idx % sim_matrix.shape[1]
138 | # real_i, real_j =
139 |
140 | # motion2.local_joint_positions = motion2.local_joint_positions - ?
141 |
142 | # between_local_pos = interpolation(?, ?, between_frames, 'linear')
143 | # between_local_rot = interpolation(?, ?, between_frames, 'slerp')
144 |
145 | ########## Code End ############
146 |
147 | res = motion1.raw_copy()
148 | res.local_joint_positions = np.concatenate([motion1.local_joint_positions[:real_i],
149 | between_local_pos,
150 | motion2.local_joint_positions[real_j:]],
151 | axis=0)
152 | res.local_joint_rotations = np.concatenate([motion1.local_joint_rotations[:real_i],
153 | between_local_rot,
154 | motion2.local_joint_rotations[real_j:]],
155 | axis=0)
156 | return res
157 |
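The closest-frame search described in steps i)–iii) above can be sketched as a brute-force pairwise-distance matrix. An illustrative sketch, not the assignment solution; `find_closest_frames` is a hypothetical helper name:

```python
import numpy as np

def find_closest_frames(win_1, win_2):
    # win_1, win_2: (frames, num_joints, 4) rotation windows.
    # Flatten each frame to a single pose vector, then build the
    # similarity matrix sim_matrix[i, j] = ||win_1[i] - win_2[j]||.
    a = win_1.reshape(win_1.shape[0], -1)
    b = win_2.reshape(win_2.shape[0], -1)
    sim_matrix = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    # np.argmin gives a flat index; unravel it into a (row, column) pair
    i, j = np.unravel_index(np.argmin(sim_matrix), sim_matrix.shape)
    return i, j, sim_matrix
```

The returned i and j are window-relative; converting them to `real_i`/`real_j` means adding back each window's start offset in the original motion sequences.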
158 |
159 | def part2_concatenate(viewer, between_frames, do_interp=True):
160 | walk_forward = BVHMotion('data/motion_walking.bvh')
161 | run_forward = BVHMotion('data/motion_running.bvh')
162 | run_forward.adjust_joint_name(walk_forward.joint_name)
163 |
164 | last_frame_index = 40
165 | start_frame_indx = 0
166 |
167 | if do_interp:
168 | motion = concatenate_two_motions(walk_forward, run_forward, last_frame_index, start_frame_indx, between_frames, method='interpolation')
169 | else:
170 | motion = walk_forward.raw_copy()
171 | motion.local_joint_positions = np.concatenate([walk_forward.local_joint_positions[:last_frame_index],
172 | run_forward.local_joint_positions[start_frame_indx:]],
173 | axis=0)
174 | motion.local_joint_rotations = np.concatenate([walk_forward.local_joint_rotations[:last_frame_index],
175 | run_forward.local_joint_rotations[start_frame_indx:]],
176 | axis=0)
177 |
178 | translation, orientation = motion.batch_forward_kinematics()
179 | task = ShowBVHUpdate(viewer, motion.joint_name, translation, orientation)
180 | viewer.addTask(task.update)
181 | pass
182 |
183 |
184 | def main():
185 | viewer = SimpleViewer()
186 |
187 | part1_key_framing(viewer, 10, 10)
188 | # part1_key_framing(viewer, 10, 5)
189 | # part1_key_framing(viewer, 10, 20)
190 | # part1_key_framing(viewer, 10, 30)
191 | # part2_concatenate(viewer, between_frames=8, do_interp=False)
192 | # part2_concatenate(viewer, between_frames=8)
193 | viewer.run()
194 |
195 |
196 | if __name__ == '__main__':
197 | main()
198 |
--------------------------------------------------------------------------------
/assignment_2/task2_motion_matching.py:
--------------------------------------------------------------------------------
1 | from file_io import BVHMotion
2 | from matching_utils import *
3 | from Viewer.controller import SimpleViewer, Controller
4 |
5 | import numpy as np
6 | import scipy.signal as signal
7 | from scipy.spatial import cKDTree
8 | from scipy.spatial.transform import Rotation as R
9 |
10 |
11 | feature_mapping = {
12 | 'lFootPos': 3,
13 | 'rFootPos': 3,
14 | 'lFootVel': 3,
15 | 'rFootVel': 3,
16 | 'lFootRot': 3,
17 | 'rFootRot': 3,
18 | 'lHandPos': 3,
19 | 'rHandPos': 3,
20 | 'lHandVel': 3,
21 | 'rHandVel': 3,
22 | 'lHandRot': 3,
23 | 'rHandRot': 3,
24 | 'rHipPos': 3,
25 | 'lHipPos': 3,
26 | 'rHipVel': 3,
27 | 'lHipVel': 3,
28 | 'rHipRot': 3,
29 | 'lHipRot': 3,
30 | 'hipPos': 3,
31 | 'hipRot': 3,
32 | 'hipVel': 3,
33 | 'trajectoryPos2D': 6,
34 | 'trajectoryRot2D': 6,
35 | 'lKneePos': 3,
36 |
37 | }
38 |
39 | class Database():
40 | def __init__(self, motions, feature_mapping, selected_feature_names, selected_feature_weights) -> None:
41 |
42 | self.dt = 1/60
43 |
44 | # Feature Definition
45 | self.feature_dims = [feature_mapping[i] for i in selected_feature_names]
46 | self.feature_names = selected_feature_names
47 | self.feature_weights = []
48 |
49 | self.feature_start_idxs = [0]
50 | for idx, dim_num in enumerate(self.feature_dims):
51 | self.feature_start_idxs.append(self.feature_start_idxs[-1]+dim_num)
52 | self.feature_weights.extend([selected_feature_weights[idx]]*dim_num)
53 |
54 | self.feature_weights = np.array(self.feature_weights)
55 |
56 | self.joint_name, self.joint_parent, self.joint_offset = motions[0].joint_name, motions[0].joint_parent, motions[0].joint_offset
57 | self.joint_name = ['Simulation_Bone'] + self.joint_name
58 | self.joint_parent = [-1] + [i+1 for i in self.joint_parent]
59 | self.joint_offset = np.concatenate([[np.zeros((1,3))], self.joint_offset], axis=0)[:, 0]
60 | self.joint_channel = np.array([3 if 'end' not in name else 0 for name in self.joint_name])
61 | self.joint_channel[:2] = 6
62 | self.joint_channel_offset = np.cumsum([0] + list(self.joint_channel))
63 |
64 | # Load Motion Data
65 | self.motion_data = []
66 | self.features = []
67 | self.local_velocities = []
68 | self.local_a_velocities = []
69 |
70 | for bvh_item in motions:
71 | joint_name = bvh_item.joint_name
72 | motion_data = bvh_item.motion_data
73 | true_motion_data = np.zeros((motion_data.shape[0], self.joint_channel_offset[-1]))
74 | joint_channel = np.array([3 if 'end' not in name else 0 for name in joint_name])
75 | joint_channel[0] = 6
76 | joint_channel_offset = np.cumsum([0] + list(joint_channel))
77 |
78 | for idx, name in enumerate(joint_name):
79 | if joint_channel[idx] == 0:
80 | continue
81 | true_idx = self.joint_name.index(name)
82 | true_motion_data[:, self.joint_channel_offset[true_idx]:self.joint_channel_offset[true_idx+1]] = motion_data[:, joint_channel_offset[idx]:joint_channel_offset[idx+1]]
83 |
84 | # Add a simulation bone
85 | sim_position_joint = self.joint_name.index('lowerback_torso')
86 | sim_rotation_joint = self.joint_name.index('RootJoint')
87 | tmp_pos, tmp_rot = forward_kinematics_with_channel(self.joint_parent, self.joint_channel, self.joint_offset, true_motion_data)
88 | sim_position = tmp_pos[:, sim_position_joint, :]
89 |
90 | sim_position = np.array([1, 0, 1]) * sim_position
91 | sim_rotation = R.from_quat(tmp_rot[:, sim_rotation_joint, :])
92 | fwd_direction = np.array([1,0,1]) * sim_rotation.apply(np.array([0, 0, 1]))
93 | fwd_direction = fwd_direction / np.linalg.norm(fwd_direction, axis=1, keepdims=True)
94 |
95 | filtered_sim_position = signal.savgol_filter(sim_position, 31, 3, axis=0, mode='interp')
96 | filtered_fwd_direction = signal.savgol_filter(fwd_direction, 61, 3, axis=0, mode='interp')
97 | filtered_fwd_direction = filtered_fwd_direction / np.linalg.norm(filtered_fwd_direction, axis=1, keepdims=True)
98 |
99 | angle = np.arctan2(filtered_fwd_direction[:, 0], filtered_fwd_direction[:, 2])
100 | filtered_sim_rotation = R.from_rotvec(np.array([0,1,0]).reshape(1,3) * angle[:, np.newaxis])
101 | true_motion_data[:,:3] = filtered_sim_position
102 | true_motion_data[:,3:6] = filtered_sim_rotation.as_euler('XYZ', degrees=True)
103 | true_motion_data[:, 6:9] -= filtered_sim_position
104 | true_motion_data[:, 9:12] = (filtered_sim_rotation.inv() * R.from_euler('XYZ', true_motion_data[:, 9:12], degrees=True)).as_euler('XYZ', degrees=True)
105 |
106 | self.motion_data.append(true_motion_data)
107 |
108 | # Extract the data terms
109 | pos, rot = forward_kinematics_with_channel(self.joint_parent, self.joint_channel, self.joint_offset, true_motion_data)
110 | rot = align_quat(rot, False)
111 | vel = np.zeros_like(pos)
112 | vel[1:] = (pos[1:] - pos[:-1])/self.dt
113 | vel[0] = vel[-1]
114 | avel = np.zeros_like(vel)
115 | avel[1:] = quat_to_avel(rot, self.dt)
116 | avel[0] = avel[-1]
117 |
118 | # Extract the features
119 | features = self.extract_features(pos[:, 0], rot[:, 0], pos, rot, vel, avel)
120 | self.features.append(features)
121 |
122 | # Store the local variables
123 | local_rot = R.from_euler('XYZ', true_motion_data.reshape(-1,3), degrees=True).as_quat().reshape(true_motion_data.shape[0], -1, 4)
124 | local_rot = align_quat(local_rot, False)
125 | avel = np.zeros_like(true_motion_data)
126 | avel[1:] = quat_to_avel(local_rot, self.dt).reshape(-1, avel.shape[-1])
127 | avel[0] = avel[-1]
128 | self.local_a_velocities.append(avel)
129 |
130 | vel = np.zeros_like(true_motion_data)
131 | vel[1:] = (true_motion_data[1:] - true_motion_data[:-1])/self.dt
132 | vel[0] = vel[-1]
133 | self.local_velocities.append(vel)
134 |
135 | self.motion_data = np.concatenate(self.motion_data, axis=0)
136 | self.features = np.concatenate(self.features, axis=0)
137 | self.local_a_velocities = np.concatenate(self.local_a_velocities, axis=0)
138 | self.local_velocities = np.concatenate(self.local_velocities, axis=0)
139 |
140 | self.frame_num, self.joint_num = self.motion_data.shape[0], self.local_a_velocities.shape[1] // 3
141 |
142 | self.features_mean, self.features_std = np.mean(self.features, axis=0), np.std(self.features, axis=0)
143 | self.features_std[self.features_std <= 0.1] = 0.1
144 |
145 | self.normalized_feature = (self.features - self.features_mean) / self.features_std
146 | self.query_tree = cKDTree(self.normalized_feature * self.feature_weights)
147 |
148 | print('Feature Extraction Done')
149 |
150 | def extract_features(self, root_pos, root_rot, pos, rot, vel, avel):
151 | features = []
152 | for feature_name in self.feature_names:
153 | if feature_name == 'lFootPos':
154 | features.append(self.extract_offset(root_pos, root_rot, pos[:, self.joint_name.index('lToeJoint')]))
155 | elif feature_name == 'rFootPos':
156 | features.append(self.extract_offset(root_pos, root_rot, pos[:, self.joint_name.index('rToeJoint')]))
157 | elif feature_name == 'lFootRot':
158 | features.append(self.extract_rotation(root_rot, rot[:, self.joint_name.index('lToeJoint')]))
159 | elif feature_name == 'rFootRot':
160 | features.append(self.extract_rotation(root_rot, rot[:, self.joint_name.index('rToeJoint')]))
161 | elif feature_name == 'lFootVel':
162 | features.append(self.extract_vel(root_rot, vel[:, self.joint_name.index('lToeJoint')]))
163 | elif feature_name == 'rFootVel':
164 | features.append(self.extract_vel(root_rot, vel[:, self.joint_name.index('rToeJoint')]))
165 | elif feature_name == 'lHandPos':
166 | features.append(self.extract_offset(root_pos, root_rot, pos[:, self.joint_name.index('lWrist')]))
167 | elif feature_name == 'rHandPos':
168 | features.append(self.extract_offset(root_pos, root_rot, pos[:, self.joint_name.index('rWrist')]))
169 | elif feature_name == 'lHandRot':
170 | features.append(self.extract_rotation(root_rot, rot[:, self.joint_name.index('lWrist')]))
171 | elif feature_name == 'rHandRot':
172 | features.append(self.extract_rotation(root_rot, rot[:, self.joint_name.index('rWrist')]))
173 | elif feature_name == 'lHandVel':
174 | features.append(self.extract_vel(root_rot, vel[:, self.joint_name.index('lWrist')]))
175 | elif feature_name == 'rHandVel':
176 | features.append(self.extract_vel(root_rot, vel[:, self.joint_name.index('rWrist')]))
177 |             elif feature_name == 'rHipPos':
178 |                 features.append(self.extract_offset(root_pos, root_rot, pos[:, self.joint_name.index('rHip')]))
179 |             elif feature_name == 'lHipPos':
180 |                 features.append(self.extract_offset(root_pos, root_rot, pos[:, self.joint_name.index('lHip')]))
181 |             elif feature_name == 'rHipRot':
182 |                 features.append(self.extract_rotation(root_rot, rot[:, self.joint_name.index('rHip')]))
183 |             elif feature_name == 'lHipRot':
184 |                 features.append(self.extract_rotation(root_rot, rot[:, self.joint_name.index('lHip')]))
185 |             elif feature_name == 'rHipVel':
186 |                 features.append(self.extract_vel(root_rot, vel[:, self.joint_name.index('rHip')]))
187 |             elif feature_name == 'lHipVel':
188 |                 features.append(self.extract_vel(root_rot, vel[:, self.joint_name.index('lHip')]))
189 | elif feature_name == 'hipPos':
190 | features.append(self.extract_offset(root_pos, root_rot, pos[:, self.joint_name.index('RootJoint')]))
191 | elif feature_name == 'hipRot':
192 | features.append(self.extract_rotation(root_rot, rot[:, self.joint_name.index('RootJoint')]))
193 | elif feature_name == 'hipVel':
194 | features.append(self.extract_vel(root_rot, vel[:, self.joint_name.index('RootJoint')]))
195 | elif feature_name == 'trajectoryPos2D':
196 | features.append(self.extract_future_pos(root_pos, root_rot, pos[:, 0]))
197 | elif feature_name == 'trajectoryRot2D':
198 | features.append(self.extract_future_rot(root_rot, rot[:, 0]))
199 | return np.concatenate(features, axis=-1)
200 |
201 | def calculate_statistics(self):
202 | self.features_mean = np.mean(self.features, axis=0)
203 | self.features_std = np.std(self.features, axis=0)
204 | self.features_std[self.features_std <= 0.1] = 0.1
205 |
206 | def normalize_features(self, features):
207 | normalized_feature = (features - self.features_mean) / self.features_std
208 | return normalized_feature
209 |
210 | def extract_vel(self, root_rot, bone_vel):
211 | rot = R.from_quat(root_rot).inv()
212 | return rot.apply(bone_vel)
213 |
214 | def extract_offset(self, root_pos, root_rot, bone_pos):
215 | rot = R.from_quat(root_rot).inv()
216 | return rot.apply(bone_pos - root_pos)
217 |
218 | def extract_rotation(self, root_rot, bone_rot):
219 | rot = R.from_quat(root_rot).inv()
220 | return (rot * R.from_quat(bone_rot)).as_euler('XYZ', degrees=True)
221 |
222 | def extract_future_pos(self, root_pos, root_rot, bone_pos, frames = [20, 40, 60]):
223 | rot = R.from_quat(root_rot).inv()
224 | res = []
225 | for t in frames:
226 | idx = np.arange(bone_pos.shape[0]) + t
227 | idx[idx >= bone_pos.shape[0]] = bone_pos.shape[0] - 1
228 | pos = rot.apply(bone_pos[idx] - root_pos)
229 | res.append(pos[:,[0,2]])
230 | return np.concatenate(res, axis = 1)
231 |
232 |
233 | def extract_future_rot(self, root_rot, bone_rot, frames = [20, 40, 60]):
234 | rot = R.from_quat(root_rot).inv()
235 | res = []
236 | for t in frames:
237 | idx = np.arange(bone_rot.shape[0]) + t
238 | idx[idx >= bone_rot.shape[0]] = bone_rot.shape[0] - 1
239 | direction = (rot*R.from_quat(bone_rot[idx])).apply(np.array([0,0,1]))
240 | res.append(direction[:,[0,2]])
241 | return np.concatenate(res, axis = 1)
242 |
243 | class CharacterController:
244 | def __init__(self, viewer, controller, selected_feature_names, selected_feature_weights):
245 | self.viewer = viewer
246 |
247 | self.motions = []
248 | self.motions.append(BVHMotion('data/motion_walking_long.bvh'))
249 | self.motions.append(BVHMotion('data/push.bvh'))
250 |
251 | self.db = Database(self.motions, feature_mapping, selected_feature_names, selected_feature_weights)
252 |
253 | self.controller = controller
254 | self.cur_frame = 0
255 | self.global_bone_pos = np.zeros(3)
256 | self.global_bone_rot = R.identity()
257 | self.search_time = 0.1
258 | self.search_timer = -1
259 | self.desired_vel_change = np.zeros(3)
260 | self.desired_rot_change = np.zeros(3)
261 |
262 | self.pre_motion_data = None
263 | self.pre_avel = np.zeros((self.db.joint_num, 3))
264 | self.rot_offset = np.zeros((self.db.joint_num, 3))
265 | self.avel_offset = np.zeros((self.db.joint_num, 3))
266 |
267 | self.pos_offset = np.zeros((2,3))
268 | self.vel_offset = np.zeros((2,3))
269 | self.pre_vel = np.zeros((2,3))
270 |
271 | def update_state(self, desired_pos_list, desired_rot_list, desired_vel_list, desired_avel_list, current_gait):
272 |
273 | should_search = False
274 | if self.search_timer <= 0 or self.cur_frame == self.db.frame_num:
275 | should_search = True
276 | eps = 30
277 |
278 | if np.linalg.norm(self.desired_vel_change) < eps and np.linalg.norm(self.controller.desired_velocity_change) > eps:
279 | should_search = True
280 |
281 | if np.linalg.norm(self.desired_rot_change) < eps and np.linalg.norm(self.controller.desired_rotation_change) > eps:
282 | should_search = True
283 |
284 | if self.cur_frame == self.db.frame_num - 1:
285 | should_search = True
286 |
287 | self.desired_vel_change = self.controller.desired_velocity_change
288 | self.desired_rot_change = self.controller.desired_rotation_change
289 |
290 | if should_search:
291 | cur_feature = self.db.features[self.cur_frame]
292 | query_feature = cur_feature.copy()
293 |
294 | if 'trajectoryPos2D' in self.db.feature_names:
295 | future_pos = np.concatenate([np.array(i).reshape(1,3) for i in desired_pos_list[1:4]], axis=0)
296 | future_pos = self.global_bone_rot.inv().apply(future_pos - self.global_bone_pos.reshape(1,3))
297 | start_idx = self.db.feature_start_idxs[self.db.feature_names.index('trajectoryPos2D')]
298 | end_idx = start_idx + self.db.feature_dims[self.db.feature_names.index('trajectoryPos2D')]
299 | query_feature[start_idx:end_idx] = (future_pos[:,[0,2]]).flatten()
300 |
301 | if 'trajectoryRot2D' in self.db.feature_names:
302 | future_rot = np.concatenate([np.array(i).reshape(1,4) for i in desired_rot_list[1:4]], axis=0)
303 | future_rot = (self.global_bone_rot.inv() * R.from_quat(future_rot[:,[1,2,3,0]])).apply(np.array([0,0,1]))
304 | start_idx = self.db.feature_start_idxs[self.db.feature_names.index('trajectoryRot2D')]
305 | end_idx = start_idx + self.db.feature_dims[self.db.feature_names.index('trajectoryRot2D')]
306 | query_feature[start_idx:end_idx] = (future_rot[:,[0,2]]).flatten()
307 |
308 | normalized_query = self.db.normalize_features(query_feature)
309 |
310 | # Do the query
311 | idx = self.db.query_tree.query(normalized_query.reshape(1,-1), k=1)[1][0]
312 |
313 | self.search_timer = self.search_time
314 | if self.pre_motion_data is not None:
315 | target_rot = self.db.motion_data[idx].reshape(-1,3).copy()
316 | target_rot[1] = R.from_quat(desired_rot_list[0]).as_euler('XYZ', degrees=True)
317 | target_avel = self.db.local_a_velocities[idx].reshape(-1,3).copy()
318 | self.rot_offset, self.avel_offset = InterpolationHelper.inertialize_transition_rot(
319 | self.rot_offset, self.avel_offset,
320 | self.pre_motion_data, self.pre_avel,
321 | target_rot, target_avel)
322 |
323 | target_pos = np.zeros_like(self.pos_offset)
324 | target_pos[0] = desired_pos_list[0]
325 | target_pos[0] = lerp( self.global_bone_pos , desired_pos_list[0], 0.5)
326 | target_pos[1] = self.db.motion_data[idx][6:9].copy()
327 | target_vel = self.db.local_velocities[idx].reshape(-1,3)[[0,2]].copy()
328 | target_vel[0] = desired_vel_list[0]
329 | target_vel[1] = 0
330 | self.pos_offset, self.vel_offset = InterpolationHelper.inertialize_transition_pos(
331 | self.pos_offset, self.vel_offset,
332 | self.pre_motion_data[[0,2]], self.pre_vel,
333 | target_pos, target_vel)
334 | else:
335 | idx = self.cur_frame
336 | self.search_timer -= self.db.dt
337 |
338 | idx += 1
339 | if idx >= self.db.motion_data.shape[0]:
340 | idx = self.cur_frame -1
341 |
342 | motion_data = self.db.motion_data[idx].copy()
343 |
344 | if self.pre_motion_data is not None:
345 | target_rot = self.db.motion_data[idx].reshape(-1,3).copy()
346 | target_rot[1] = R.from_quat(desired_rot_list[0]).as_euler('XYZ', degrees=True)
347 | target_avel = self.db.local_a_velocities[idx].reshape(-1,3).copy()
348 | target_avel[0] = desired_avel_list[1]
349 | rot, self.pre_avel, self.rot_offset, self.avel_offset = InterpolationHelper.inertialize_update_rot(
350 | self.rot_offset, self.avel_offset, target_rot,
351 | target_avel,
352 | 0.05, 1/60)
353 |
354 | target_pos = np.zeros_like(self.pos_offset)
355 | target_pos[0] = lerp( self.global_bone_pos , self.controller.current_desired_position, 0.9999)
356 | target_pos[1] = self.db.motion_data[idx][6:9]
357 | target_vel = self.db.local_velocities[idx].reshape(-1,3)[[0,2]].copy()
358 | target_vel[0] = self.controller.vel
359 | target_vel[1] = 0
360 | pos, self.pre_vel, self.pos_offset, self.vel_offset = InterpolationHelper.inertialize_update_pos(
361 | self.pos_offset, self.vel_offset,
362 | target_pos, target_vel,
363 | 0.05, 1/60)
364 |
365 | motion_data = rot.flatten()
366 | motion_data[0:3] = pos[0]
367 | motion_data[6:9] = pos[1]
368 | else:
369 | motion_data[0:3] = self.global_bone_pos
370 | motion_data[3:6] = self.global_bone_rot.as_euler('XYZ', degrees=True)
371 |
372 | self.global_bone_pos = motion_data[0:3]
373 | self.global_bone_rot = R.from_euler('XYZ', motion_data[3:6], degrees=True)
374 | self.pre_motion_data = motion_data.reshape(-1,3)
375 |
376 | joint_translation, joint_orientation = forward_kinematics_with_channel(self.db.joint_parent, self.db.joint_channel, self.db.joint_offset, motion_data.reshape(1,-1))
377 | joint_translation, joint_orientation = joint_translation[0], joint_orientation[0]
378 | for name, p, r in zip(self.db.joint_name, joint_translation, joint_orientation):
379 | self.viewer.set_joint_position_orientation(name, p, r)
380 | self.cur_frame = idx
381 | return
382 |
383 | def sync_controller_and_character(self, controller, character_state):
384 | self.cur_root_pos = character_state[1][0]
385 | self.cur_root_rot = character_state[2][0]
386 | controller.set_pos(self.cur_root_pos)
387 | controller.set_rot(self.cur_root_rot)
388 |
389 | character_state = (self.db.joint_name, character_state[1], character_state[2])
390 | return character_state
391 |
392 |
393 | class InteractiveUpdate:
394 | def __init__(self, viewer, controller, character_controller):
395 | self.viewer = viewer
396 | self.controller = controller
397 | self.character_controller = character_controller
398 |
399 | def update(self, task):
400 | desired_pos_list, desired_rot_list, desired_vel_list, desired_avel_list, current_gait = self.controller.get_desired_state()
401 | self.character_controller.update_state(
402 | desired_pos_list, desired_rot_list,
403 | desired_vel_list, desired_avel_list, current_gait
404 | )
405 | return task.cont
406 |
407 |
408 | def main():
409 | viewer = SimpleViewer()
410 | controller = Controller(viewer)
411 |
412 | selected_feature_names = ['trajectoryPos2D', 'trajectoryRot2D']
413 | selected_feature_weights = [1, 1]
414 |
415 | # selected_feature_names = ['lFootPos', 'rFootPos']
416 | # selected_feature_weights = [1, 1]
417 |
418 | assert len(selected_feature_names) == len(selected_feature_weights)
419 |
420 | character_controller = CharacterController(viewer, controller, selected_feature_names, selected_feature_weights)
421 | task = InteractiveUpdate(viewer, controller, character_controller)
422 | viewer.addTask(task.update)
423 | viewer.run()
424 | pass
425 |
426 |
427 | if __name__ == '__main__':
428 | main()
429 |
--------------------------------------------------------------------------------
/assignment_3/README.md:
--------------------------------------------------------------------------------
1 | # Assignment 3
2 |
3 | ## Highlights
4 |
5 | ### Submission Due:
6 |
7 | - Apr. 8th, 23:59
8 |
9 | ### Submission
10 |
11 | File format: A compressed file **[uid_name_assignment3.zip]** with:
12 |
13 | 1. your `rigid_body_dynamic.py`
14 | 2. a recorded video of your simulator
15 | 3. your report `uid_name_3.pdf`
16 | 4. your own mesh.obj (if any)
17 |
18 | ## Introduction
19 |
20 | Rigid body simulation is a powerful tool used in a variety of fields, from robotics and engineering to computer graphics and animation. It involves modeling and simulating the motion of solid objects that maintain their shape and size, even when subjected to external forces and torques.
21 |
22 | In this assignment, we will explore how to implement a simple rigid body simulation using Python and Taichi. Taichi is a high-performance programming language designed for numerical computing and machine learning, and it provides a user-friendly interface for writing efficient simulations that can run on both CPUs and GPUs.
23 |
24 | We will begin by reviewing the basic principles of rigid body dynamics, including the equations of motion and the kinematics of rotation and translation.
25 |
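As a quick refresher, these principles can be summarized by the standard Newton-Euler equations of motion, written here with the same state variables used by the skeleton code (center-of-mass position $x$, velocity $v$, orientation quaternion $q$, angular momentum $L$):

```latex
\dot{x}(t) = v(t), \qquad m\,\dot{v}(t) = F(t)
\dot{q}(t) = \tfrac{1}{2}\,\omega(t)\,q(t), \qquad \dot{L}(t) = \tau(t)
I(t) = R(t)\, I_{\mathrm{body}}\, R(t)^{\top}, \qquad \omega(t) = I(t)^{-1} L(t)
```

Here $\omega(t)\,q(t)$ denotes quaternion multiplication with the pure quaternion $(0, \omega)$.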
26 | Using these tools, we will develop a Python program that simulates the motion of a rigid body in response to external forces and torques, such as gravity, friction, and collisions.
27 |
28 | By the end of this assignment, you will have gained a solid understanding of the principles of rigid body simulation and how to implement them using Python and Taichi. You will also have developed practical skills in numerical computing, programming, and visualization that can be applied to a wide range of scientific and engineering problems.
29 |
30 | ## Examples
31 |
32 | Here are some example results from a successfully implemented simulator.
33 |
34 | Ball Dropping without changing any of the simulation parameters.
35 |
36 | https://user-images.githubusercontent.com/43705353/231971484-f043cb9c-b53b-43cc-9220-c69b3c60fc4c.mov
37 |
38 |
39 | Ball Dropping with initial velocity [3, 0, 0].
40 |
41 | https://user-images.githubusercontent.com/43705353/231971536-5b57bb7c-e12c-49bd-806f-c7208e933eb1.mov
42 |
43 |
44 | Ball Dropping with initial velocity [3, 0, 0] and collision damping stiffness 1e4, friction coefficient 0.5 on the ground.
45 |
46 | https://user-images.githubusercontent.com/43705353/231973213-6caa0c81-3eab-4b0f-bbbf-e878ca193215.mov
47 |
48 |
49 |
50 | ## Environment
51 |
52 | ### New environment
53 |
54 | ```bash
55 | # recommended: use Anaconda to manage the environment
56 | $ conda create -n comp3360 python=3.10
57 | $ conda activate comp3360
58 | $ conda install numpy scipy
59 | $ pip install panda3d taichi
60 |
61 | $ cd ./assignment_3
62 | $ python rigid_body_dynamic.py
63 | ```
64 |
65 | ### From the existing environment for Assignment1&2
66 |
67 | ```bash
68 | $ conda activate comp3360
69 | $ pip install taichi
70 | ```
71 |
72 | ## Task 1 - Basic Rigid Body Simulator (90%)
73 |
74 | You are required to implement the **7** `TODO` items that update the rigid body state in `rigid_body_dynamic.py`. Please check the hints in the comments.
75 |
76 | Feel free to use your own mesh or to change the simulation parameters (`initial_velocity`, etc.).
77 |
78 | You are strongly encouraged to spend 30-60 minutes going through https://www.cs.cmu.edu/~baraff/sigcourse/notesd1.pdf before starting your implementation.
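The `TODO`s follow the standard integration pattern from those notes. As a generic NumPy sketch of one substep for a free rigid body, independent of the Taichi skeleton (all names here are illustrative, not the required solution):

```python
import numpy as np

def quat_mul(p, q):
    """Hamilton product of quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def quat_to_matrix(q):
    """Rotation matrix from a unit quaternion (w, x, y, z)."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def rigid_step(x, v, q, L, I_body_inv, mass, force, torque, dt):
    """One semi-implicit Euler step for a free rigid body."""
    v = v + dt * force / mass                 # linear velocity update
    x = x + dt * v                            # center-of-mass position update
    L = L + dt * torque                       # angular momentum update
    R = quat_to_matrix(q)
    I_inv = R @ I_body_inv @ R.T              # world-frame inverse inertia
    omega = I_inv @ L                         # angular velocity
    q = q + dt * 0.5 * quat_mul(np.concatenate(([0.0], omega)), q)
    q = q / np.linalg.norm(q)                 # renormalize against drift
    return x, v, q, L
```

Semi-implicit Euler (velocity before position) is sketched here because it tends to be more stable than explicit Euler at the default time step.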
79 |
80 | ## Task 2 (Bonus)
81 |
82 | Once you have implemented the basic simulator, you can enhance it with additional features. For instance, consider implementing damping and friction to improve the simulation quality.
83 |
84 | If you are unable to implement additional features, you may still propose ideas for improving the simulator in your report. For example, one possible way to model kinetic friction is the term $\mu N v_d$, where $\mu$ is the friction coefficient, $N$ is the normal force, and $v_d$ is the direction of velocity.
85 |
86 | If your proposal is deemed reasonable, you may be rewarded with a bonus.
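To make the friction suggestion concrete, here is a minimal NumPy sketch of such a Coulomb-style term (the function name and arguments are illustrative, not part of the provided skeleton):

```python
import numpy as np

def friction_force(mu, normal_force, velocity, eps=1e-8):
    """Kinetic friction with magnitude mu * N, opposing the sliding direction."""
    speed = np.linalg.norm(velocity)
    if speed < eps:
        return np.zeros(3)           # no sliding, so no kinetic friction
    v_d = velocity / speed           # unit sliding direction
    return -mu * normal_force * v_d  # force opposes the motion
```

In the simulator this could be accumulated into `particle_force[i]` whenever a particle is in contact, using the magnitude of the penalty collision force as the normal force $N$.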
87 |
88 | ## Task 3 Report (10%)
89 |
90 | - PDF format with no page-size requirement, so you can also prepare it with PowerPoint, Keynote, or Markdown
91 | - The first two lines should introduce your NAME and UID.
92 | - Should include
93 | - Your simulation parameters. (initial_velocity, initial_angular_velocity, etc.)
94 | - Task 2, Your idea to improve the simulator. (If any)
95 |
--------------------------------------------------------------------------------
/assignment_3/rigid_body_dynamic.py:
--------------------------------------------------------------------------------
1 | import taichi as ti
2 | import numpy as np
3 |
4 | # Set up Taichi
5 | ti.init(arch=ti.cpu, debug=True)
6 |
7 |
8 | # Function to read an OBJ file
9 | def read_obj_file(file_path, scale=1.0):
10 | vertices = []
11 | faces = []
12 |
13 | with open(file_path, 'r') as file:
14 | for line in file:
15 | line = line.strip()
16 | if not line or line.startswith('#'):
17 | continue
18 |
19 | elements = line.split()
20 | if elements[0] == 'v':
21 | vertices.append([scale * float(e) for e in elements[1:]])
22 | elif elements[0] == 'f':
23 | faces.append([int(e.split('/')[0]) - 1 for e in elements[1:]])
24 |
25 | return np.array(vertices, dtype=np.float32), np.array(faces, dtype=np.int32)
26 |
27 |
28 | # Read an OBJ file, feel free to change the file path to use your own mesh
29 | file_path = "sphere.obj"
30 | vertices_np, faces_np = read_obj_file(file_path, scale=0.3)
31 |
32 | faces = ti.field(dtype=ti.i32, shape=faces_np.shape)
33 | # Particle state
34 | particle_vertices = ti.Vector.field(3, dtype=ti.f32, shape=vertices_np.shape[0])
35 | particle_origin_vertices = ti.Vector.field(3, dtype=ti.f32, shape=vertices_np.shape[0])
36 | particle_velocities = ti.Vector.field(3, dtype=ti.f32, shape=vertices_np.shape[0])
37 | particle_force = ti.Vector.field(3, dtype=float, shape=vertices_np.shape[0])
38 |
39 | # Body state
40 | # IMPORTANT: if you wish to access the following fields in the kernel function, use field_name[None]
41 | body_cm_position = ti.Vector.field(3, dtype=float, shape=())
42 | body_origin_cm_position = ti.Vector.field(3, dtype=float, shape=())
43 | body_velocity = ti.Vector.field(3, dtype=float, shape=())
44 | body_angular_velocity = ti.Vector.field(3, dtype=float, shape=())
45 | body_rotation = ti.Matrix.field(3, 3, dtype=float, shape=())
46 | body_rotation_quaternion = ti.Vector.field(4, dtype=float, shape=())
47 | body_angular_momentum = ti.Vector.field(3, dtype=float, shape=())
48 | body_origin_inverse_inertia = ti.Matrix.field(3, 3, dtype=float, shape=())
49 | body_mass = ti.field(float, shape=())
50 |
51 | # Simulation parameters, feel free to change them
52 | # We assume all particles have the same mass
53 | particle_mass = 1
54 | initial_velocity = ti.Vector([0.0, 0.0, 0.0])
55 | initial_angular_velocity = ti.Vector([0.0, 0, 0.0])
56 | gravity = ti.Vector([0.0, -9.8, 0.0])
57 | # stiffness of the collision
58 | collision_stiffness = 1e4
59 | velocity_damping_stiffness = 1e3
60 | friction_stiffness = 0.1
61 | # simulation integration time step
62 | dt = 1e-3
63 |
64 | # Initialize the fields
65 | # Copy the vertices and faces numpy data to Taichi Fields
66 | particle_vertices.from_numpy(vertices_np)
67 | particle_origin_vertices.from_numpy(vertices_np)
68 | faces.from_numpy(faces_np)
69 |
70 | # Indices field for rendering
71 | indices = ti.field(int, shape=3 * faces_np.shape[0])
72 | for i in range(faces_np.shape[0]):
73 | indices[3 * i] = faces[i, 0]
74 | indices[3 * i + 1] = faces[i, 1]
75 | indices[3 * i + 2] = faces[i, 2]
76 |
77 |
78 | @ti.kernel
79 | def initial():
80 | # Initialize the body and particle state
81 |
82 | # Compute the center of mass and mass of the body
83 | for i in ti.grouped(particle_vertices):
84 | body_mass[None] += particle_mass
85 | body_cm_position[None] += particle_mass * particle_vertices[i]
86 | body_cm_position[None] /= body_mass[None]
87 | body_origin_cm_position[None] = body_cm_position[None]
88 |
89 | # Compute the inertia of the body
90 | inertia = ti.Matrix([[0.0, 0.0, 0.0], [0.0, 0.0, 0.0], [0.0, 0.0, 0.0]])
91 |
92 | # TODO 1: Compute the inertia tensor of the body
93 | # Hint: You can use the function ti.Matrix.outer_product to compute v*v^T
94 | # Hint: You can use the function ti.Matrix.dot to compute v^T*v
95 | # Hint: You can use the function ti.Matrix.identity(float, 3) to get a 3x3 identity matrix
96 | for i in ti.grouped(particle_vertices):
97 | # inertia += particle_mass * ((particle_vertices[i] - body_cm_position[None]).dot(
98 | pass
99 |
100 | # Compute the inverse inertia of the body and store it in the field
101 | body_origin_inverse_inertia[None] = inertia.inverse()
102 |
103 | # Initialize the particle velocities
104 | for i in ti.grouped(particle_vertices):
105 | particle_velocities[i] = initial_velocity
106 |
107 | # Initialize the body state
108 | body_velocity[None] = initial_velocity
109 | body_angular_velocity[None] = initial_angular_velocity
110 | body_angular_momentum[None] = inertia @ initial_angular_velocity
111 |
112 | # Initialize the rotation matrix and quaternion
113 | body_rotation[None] = ti.Matrix([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
114 | body_rotation_quaternion[None] = ti.Vector([1.0, 0.0, 0.0, 0.0])
115 |
116 |
117 | initial()
118 |
119 |
120 | # quaternion multiplication, this is used to update the rotation quaternion
121 | @ti.func
122 | def quaternion_multiplication(p: ti.template(), q: ti.template()) -> ti.template():
123 | return ti.Vector([p[0] * q[0] - p[1] * q[1] - p[2] * q[2] - p[3] * q[3],
124 | p[0] * q[1] + p[1] * q[0] + p[2] * q[3] - p[3] * q[2],
125 | p[0] * q[2] - p[1] * q[3] + p[2] * q[0] + p[3] * q[1],
126 | p[0] * q[3] + p[1] * q[2] - p[2] * q[1] + p[3] * q[0]])
127 |
128 |
129 | # quaternion to rotation matrix
130 | @ti.func
131 | def quaternion_to_matrix(q: ti.template()) -> ti.template():
132 | qw = q[0]
133 | qx = q[1]
134 | qy = q[2]
135 | qz = q[3]
136 | return ti.Matrix([[qw * qw + qx * qx - qy * qy - qz * qz, 2 * (qx * qy - qw * qz), 2 * (qx * qz + qw * qy)],
137 | [2 * (qx * qy + qw * qz), (qw * qw - qx * qx + qy * qy - qz * qz), 2 * (qy * qz - qw * qx)],
138 | [2 * (qx * qz - qw * qy), 2 * (qy * qz + qw * qx), qw * qw - qx * qx - qy * qy + qz * qz]])
139 |
140 |
141 | @ti.kernel
142 | def substep():
143 |     # compute the force on each particle
144 | for i in ti.grouped(particle_vertices):
145 | # TODO 2: gravity
146 | # particle_force[i] =
147 |
148 |         # Collision force: we use a spring (penalty) model to simulate the collision
149 | if particle_vertices[i][1] < -1:
150 | f_collision = collision_stiffness * (-1 - particle_vertices[i][1])
151 | particle_force[i] += ti.Vector([0, f_collision, 0])
152 | if particle_vertices[i][0] < -1:
153 | f_collision = collision_stiffness * (-1 - particle_vertices[i][0])
154 | particle_force[i] += ti.Vector([f_collision, 0, 0])
155 | if particle_vertices[i][0] > 1:
156 | f_collision = collision_stiffness * (1 - particle_vertices[i][0])
157 | particle_force[i] += ti.Vector([f_collision, 0, 0])
158 | if particle_vertices[i][2] < -1:
159 | f_collision = collision_stiffness * (-1 - particle_vertices[i][2])
160 | particle_force[i] += ti.Vector([0, 0, f_collision])
161 | if particle_vertices[i][2] > 1:
162 | f_collision = collision_stiffness * (1 - particle_vertices[i][2])
163 | particle_force[i] += ti.Vector([0, 0, f_collision])
164 |
165 |     # compute the total force on the rigid body
166 | body_force = ti.Vector([0.0, 0.0, 0.0])
167 | for i in ti.grouped(particle_vertices):
168 | # TODO 3: compute the force for rigid body
169 | # body_force +=
170 | pass
171 |
172 |     # compute the total torque on the rigid body
173 | body_torque = ti.Vector([0.0, 0.0, 0.0])
174 | for i in ti.grouped(particle_vertices):
175 | # TODO 4: compute the torque for rigid body
176 | # Hint: use ti.math.cross(v1, v2) to compute the cross product
177 | # body_torque +=
178 | pass
179 |
180 | # update the rigid body
181 | # TODO 5: update the center of mass position and velocity
182 | # body_velocity[None] +=
183 | # body_cm_position[None] +=
184 |
185 | # TODO 6: update the rotation quaternion
186 | # d_q = 0.5 * quaternion_multiplication(ti.Vector([0, ?, ?, ?]), body_rotation_quaternion[None])
187 | # body_rotation_quaternion[None] +=
188 |
189 | # normalize the quaternion to avoid numerical error
190 | body_rotation_quaternion[None] /= body_rotation_quaternion[None].norm()
191 | body_rotation[None] = quaternion_to_matrix(body_rotation_quaternion[None])
192 |
193 | # TODO 7: update the angular momentum, inertia tensor and angular velocity
194 | # hint: use A @ B to do matrix multiplication, use A.transpose() to get the transpose of A
195 | # body_angular_momentum[None] =
196 | # body_inverse_inertia =
197 | # body_angular_velocity[None] =
198 |
199 | # update the particles
200 | for i in ti.grouped(particle_vertices):
201 | ri = body_rotation[None] @ (particle_origin_vertices[i] - body_origin_cm_position[None])
202 | particle_vertices[i] = ri + body_cm_position[None]
203 | particle_velocities[i] = body_velocity[None] + ti.math.cross(body_angular_velocity[None], ri)
204 |
205 |
206 | # GUI stuff
207 | # draw a cube frame
208 | frame_vertices = ti.Vector.field(3, dtype=float, shape=24)
209 | vertices_list = [
210 | [-1, -1, 0], [1, -1, 0], [1, -1, 0], [1, 1, 0],
211 | [1, 1, 0], [-1, 1, 0], [-1, 1, 0], [-1, -1, 0],
212 | [-1, -1, 1], [1, -1, 1], [1, -1, 1], [1, 1, 1],
213 | [1, 1, 1], [-1, 1, 1], [-1, 1, 1], [-1, -1, 1],
214 | [-1, -1, 0], [-1, -1, 1], [1, -1, 0], [1, -1, 1],
215 | [1, 1, 0], [1, 1, 1], [-1, 1, 0], [-1, 1, 1]
216 | ]
217 | for i in range(len(vertices_list)):
218 | frame_vertices[i] = ti.Vector(vertices_list[i])
219 |
220 | window = ti.ui.Window("Rigid Body Simulation", (1024, 1024),
221 | vsync=True)
222 | canvas = window.get_canvas()
223 | canvas.set_background_color((1, 1, 1))
224 | scene = ti.ui.Scene()
225 | camera = ti.ui.Camera()
226 | # rendering frame interval is 1/60 s, so run enough substeps to cover it
227 | substeps = int(1 / 60 // dt)
228 | current_t = 0.0
229 |
230 | while window.running:
231 | for i in range(substeps):
232 | substep()
233 | current_t += dt
234 |
235 | camera.position(0.0, 0.0, 3)
236 | camera.lookat(0.0, 0.0, 0)
237 | scene.set_camera(camera)
238 |
239 | scene.point_light(pos=(0, 1, 2), color=(1, 1, 1))
240 | scene.ambient_light((0.5, 0.5, 0.5))
241 |
242 | scene.mesh(particle_vertices,
243 | indices=indices,
244 | two_sided=True,
245 | show_wireframe=True)
246 |
247 | scene.lines(frame_vertices, color=(1, 0, 0), width=1)
248 | canvas.scene(scene)
249 | window.show()
250 |
--------------------------------------------------------------------------------
/assignment_3/sphere.obj:
--------------------------------------------------------------------------------
1 | # Blender v3.2.2 OBJ File: ''
2 | # www.blender.org
3 | o Icosphere
4 | v 0.000000 -1.000000 0.000000
5 | v 0.723607 -0.447220 0.525725
6 | v -0.276388 -0.447220 0.850649
7 | v -0.894426 -0.447216 0.000000
8 | v -0.276388 -0.447220 -0.850649
9 | v 0.723607 -0.447220 -0.525725
10 | v 0.276388 0.447220 0.850649
11 | v -0.723607 0.447220 0.525725
12 | v -0.723607 0.447220 -0.525725
13 | v 0.276388 0.447220 -0.850649
14 | v 0.894426 0.447216 0.000000
15 | v 0.000000 1.000000 0.000000
16 | v -0.162456 -0.850654 0.499995
17 | v 0.425323 -0.850654 0.309011
18 | v 0.262869 -0.525738 0.809012
19 | v 0.850648 -0.525736 0.000000
20 | v 0.425323 -0.850654 -0.309011
21 | v -0.525730 -0.850652 0.000000
22 | v -0.688189 -0.525736 0.499997
23 | v -0.162456 -0.850654 -0.499995
24 | v -0.688189 -0.525736 -0.499997
25 | v 0.262869 -0.525738 -0.809012
26 | v 0.951058 0.000000 0.309013
27 | v 0.951058 0.000000 -0.309013
28 | v 0.000000 0.000000 1.000000
29 | v 0.587786 0.000000 0.809017
30 | v -0.951058 0.000000 0.309013
31 | v -0.587786 0.000000 0.809017
32 | v -0.587786 0.000000 -0.809017
33 | v -0.951058 0.000000 -0.309013
34 | v 0.587786 0.000000 -0.809017
35 | v 0.000000 0.000000 -1.000000
36 | v 0.688189 0.525736 0.499997
37 | v -0.262869 0.525738 0.809012
38 | v -0.850648 0.525736 0.000000
39 | v -0.262869 0.525738 -0.809012
40 | v 0.688189 0.525736 -0.499997
41 | v 0.162456 0.850654 0.499995
42 | v 0.525730 0.850652 0.000000
43 | v -0.425323 0.850654 0.309011
44 | v -0.425323 0.850654 -0.309011
45 | v 0.162456 0.850654 -0.499995
46 | vt 0.181819 0.000000
47 | vt 0.227273 0.078731
48 | vt 0.136365 0.078731
49 | vt 0.272728 0.157461
50 | vt 0.318182 0.078731
51 | vt 0.363637 0.157461
52 | vt 0.909091 0.000000
53 | vt 0.954545 0.078731
54 | vt 0.863636 0.078731
55 | vt 0.727273 0.000000
56 | vt 0.772727 0.078731
57 | vt 0.681818 0.078731
58 | vt 0.545455 0.000000
59 | vt 0.590909 0.078731
60 | vt 0.500000 0.078731
61 | vt 0.318182 0.236191
62 | vt 0.090910 0.157461
63 | vt 0.181819 0.157461
64 | vt 0.136365 0.236191
65 | vt 0.818182 0.157461
66 | vt 0.909091 0.157461
67 | vt 0.863636 0.236191
68 | vt 0.636364 0.157461
69 | vt 0.727273 0.157461
70 | vt 0.681818 0.236191
71 | vt 0.454546 0.157461
72 | vt 0.545455 0.157461
73 | vt 0.500000 0.236191
74 | vt 0.227273 0.236191
75 | vt 0.045455 0.236191
76 | vt 0.772727 0.236191
77 | vt 0.590909 0.236191
78 | vt 0.409092 0.236191
79 | vt 0.181819 0.314921
80 | vt 0.272728 0.314921
81 | vt 0.227273 0.393651
82 | vt 0.000000 0.314921
83 | vt 0.090910 0.314921
84 | vt 0.045455 0.393651
85 | vt 0.727273 0.314921
86 | vt 0.818182 0.314921
87 | vt 0.772727 0.393651
88 | vt 0.545455 0.314921
89 | vt 0.636364 0.314921
90 | vt 0.590909 0.393651
91 | vt 0.363637 0.314921
92 | vt 0.454546 0.314921
93 | vt 0.409092 0.393651
94 | vt 0.500000 0.393651
95 | vt 0.454546 0.472382
96 | vt 0.681818 0.393651
97 | vt 0.636364 0.472382
98 | vt 0.863636 0.393651
99 | vt 0.818182 0.472382
100 | vt 0.909091 0.314921
101 | vt 0.136365 0.393651
102 | vt 0.090910 0.472382
103 | vt 0.318182 0.393651
104 | vt 0.272728 0.472382
105 | vt 0.954545 0.236191
106 | vt 1.000000 0.157461
107 | vt 0.409092 0.078731
108 | vt 0.363637 0.000000
109 | vn 0.1024 -0.9435 0.3151
110 | vn 0.7002 -0.6617 0.2680
111 | vn -0.2680 -0.9435 0.1947
112 | vn -0.2680 -0.9435 -0.1947
113 | vn 0.1024 -0.9435 -0.3151
114 | vn 0.9050 -0.3304 0.2680
115 | vn 0.0247 -0.3304 0.9435
116 | vn -0.8897 -0.3304 0.3151
117 | vn -0.5746 -0.3304 -0.7488
118 | vn 0.5346 -0.3304 -0.7779
119 | vn 0.8026 -0.1256 0.5831
120 | vn -0.3066 -0.1256 0.9435
121 | vn -0.9921 -0.1256 0.0000
122 | vn -0.3066 -0.1256 -0.9435
123 | vn 0.8026 -0.1256 -0.5831
124 | vn 0.4089 0.6617 0.6284
125 | vn -0.4713 0.6617 0.5831
126 | vn -0.7002 0.6617 -0.2680
127 | vn 0.0385 0.6617 -0.7488
128 | vn 0.7240 0.6617 -0.1947
129 | vn 0.2680 0.9435 -0.1947
130 | vn 0.4911 0.7947 -0.3568
131 | vn 0.4089 0.6617 -0.6284
132 | vn -0.1024 0.9435 -0.3151
133 | vn -0.1876 0.7947 -0.5773
134 | vn -0.4713 0.6617 -0.5831
135 | vn -0.3313 0.9435 0.0000
136 | vn -0.6071 0.7947 0.0000
137 | vn -0.7002 0.6617 0.2680
138 | vn -0.1024 0.9435 0.3151
139 | vn -0.1876 0.7947 0.5773
140 | vn 0.0385 0.6617 0.7488
141 | vn 0.2680 0.9435 0.1947
142 | vn 0.4911 0.7947 0.3568
143 | vn 0.7240 0.6617 0.1947
144 | vn 0.8897 0.3304 -0.3151
145 | vn 0.7947 0.1876 -0.5773
146 | vn 0.5746 0.3304 -0.7488
147 | vn -0.0247 0.3304 -0.9435
148 | vn -0.3035 0.1876 -0.9342
149 | vn -0.5346 0.3304 -0.7779
150 | vn -0.9050 0.3304 -0.2680
151 | vn -0.9822 0.1876 0.0000
152 | vn -0.9050 0.3304 0.2680
153 | vn -0.5346 0.3304 0.7779
154 | vn -0.3035 0.1876 0.9342
155 | vn -0.0247 0.3304 0.9435
156 | vn 0.5746 0.3304 0.7488
157 | vn 0.7947 0.1876 0.5773
158 | vn 0.8897 0.3304 0.3151
159 | vn 0.3066 0.1256 -0.9435
160 | vn 0.3035 -0.1876 -0.9342
161 | vn 0.0247 -0.3304 -0.9435
162 | vn -0.8026 0.1256 -0.5831
163 | vn -0.7947 -0.1876 -0.5773
164 | vn -0.8897 -0.3304 -0.3151
165 | vn -0.8026 0.1256 0.5831
166 | vn -0.7947 -0.1876 0.5773
167 | vn -0.5746 -0.3304 0.7488
168 | vn 0.3066 0.1256 0.9435
169 | vn 0.3035 -0.1876 0.9342
170 | vn 0.5346 -0.3304 0.7779
171 | vn 0.9921 0.1256 0.0000
172 | vn 0.9822 -0.1876 0.0000
173 | vn 0.9050 -0.3304 -0.2680
174 | vn 0.4713 -0.6617 -0.5831
175 | vn 0.1876 -0.7947 -0.5773
176 | vn -0.0385 -0.6617 -0.7488
177 | vn -0.4089 -0.6617 -0.6284
178 | vn -0.4911 -0.7947 -0.3568
179 | vn -0.7240 -0.6617 -0.1947
180 | vn -0.7240 -0.6617 0.1947
181 | vn -0.4911 -0.7947 0.3568
182 | vn -0.4089 -0.6617 0.6284
183 | vn 0.7002 -0.6617 -0.2680
184 | vn 0.6071 -0.7947 0.0000
185 | vn 0.3313 -0.9435 0.0000
186 | vn -0.0385 -0.6617 0.7488
187 | vn 0.1876 -0.7947 0.5773
188 | vn 0.4713 -0.6617 0.5831
189 | s off
190 | f 1/1/1 14/2/1 13/3/1
191 | f 2/4/2 14/5/2 16/6/2
192 | f 1/7/3 13/8/3 18/9/3
193 | f 1/10/4 18/11/4 20/12/4
194 | f 1/13/5 20/14/5 17/15/5
195 | f 2/4/6 16/6/6 23/16/6
196 | f 3/17/7 15/18/7 25/19/7
197 | f 4/20/8 19/21/8 27/22/8
198 | f 5/23/9 21/24/9 29/25/9
199 | f 6/26/10 22/27/10 31/28/10
200 | f 2/4/11 23/16/11 26/29/11
201 | f 3/17/12 25/19/12 28/30/12
202 | f 4/20/13 27/22/13 30/31/13
203 | f 5/23/14 29/25/14 32/32/14
204 | f 6/26/15 31/28/15 24/33/15
205 | f 7/34/16 33/35/16 38/36/16
206 | f 8/37/17 34/38/17 40/39/17
207 | f 9/40/18 35/41/18 41/42/18
208 | f 10/43/19 36/44/19 42/45/19
209 | f 11/46/20 37/47/20 39/48/20
210 | f 39/48/21 42/49/21 12/50/21
211 | f 39/48/22 37/47/22 42/49/22
212 | f 37/47/23 10/43/23 42/49/23
213 | f 42/45/24 41/51/24 12/52/24
214 | f 42/45/25 36/44/25 41/51/25
215 | f 36/44/26 9/40/26 41/51/26
216 | f 41/42/27 40/53/27 12/54/27
217 | f 41/42/28 35/41/28 40/53/28
218 | f 35/41/29 8/55/29 40/53/29
219 | f 40/39/30 38/56/30 12/57/30
220 | f 40/39/31 34/38/31 38/56/31
221 | f 34/38/32 7/34/32 38/56/32
222 | f 38/36/33 39/58/33 12/59/33
223 | f 38/36/34 33/35/34 39/58/34
224 | f 33/35/35 11/46/35 39/58/35
225 | f 24/33/36 37/47/36 11/46/36
226 | f 24/33/37 31/28/37 37/47/37
227 | f 31/28/38 10/43/38 37/47/38
228 | f 32/32/39 36/44/39 10/43/39
229 | f 32/32/40 29/25/40 36/44/40
230 | f 29/25/41 9/40/41 36/44/41
231 | f 30/31/42 35/41/42 9/40/42
232 | f 30/31/43 27/22/43 35/41/43
233 | f 27/22/44 8/55/44 35/41/44
234 | f 28/30/45 34/38/45 8/37/45
235 | f 28/30/46 25/19/46 34/38/46
236 | f 25/19/47 7/34/47 34/38/47
237 | f 26/29/48 33/35/48 7/34/48
238 | f 26/29/49 23/16/49 33/35/49
239 | f 23/16/50 11/46/50 33/35/50
240 | f 31/28/51 32/32/51 10/43/51
241 | f 31/28/52 22/27/52 32/32/52
242 | f 22/27/53 5/23/53 32/32/53
243 | f 29/25/54 30/31/54 9/40/54
244 | f 29/25/55 21/24/55 30/31/55
245 | f 21/24/56 4/20/56 30/31/56
246 | f 27/22/57 28/60/57 8/55/57
247 | f 27/22/58 19/21/58 28/60/58
248 | f 19/21/59 3/61/59 28/60/59
249 | f 25/19/60 26/29/60 7/34/60
250 | f 25/19/61 15/18/61 26/29/61
251 | f 15/18/62 2/4/62 26/29/62
252 | f 23/16/63 24/33/63 11/46/63
253 | f 23/16/64 16/6/64 24/33/64
254 | f 16/6/65 6/26/65 24/33/65
255 | f 17/15/66 22/27/66 6/26/66
256 | f 17/15/67 20/14/67 22/27/67
257 | f 20/14/68 5/23/68 22/27/68
258 | f 20/12/69 21/24/69 5/23/69
259 | f 20/12/70 18/11/70 21/24/70
260 | f 18/11/71 4/20/71 21/24/71
261 | f 18/9/72 19/21/72 4/20/72
262 | f 18/9/73 13/8/73 19/21/73
263 | f 13/8/74 3/61/74 19/21/74
264 | f 16/6/75 17/62/75 6/26/75
265 | f 16/6/76 14/5/76 17/62/76
266 | f 14/5/77 1/63/77 17/62/77
267 | f 13/3/78 15/18/78 3/17/78
268 | f 13/3/79 14/2/79 15/18/79
269 | f 14/2/80 2/4/80 15/18/80
270 |
--------------------------------------------------------------------------------
/assignment_4/README.md:
--------------------------------------------------------------------------------
1 | # Assignment 4
2 |
3 | ## Submission
4 | ### Due
5 | - Apr. 26th, 23:59
6 |
7 | ### Submission Contents
8 |
9 | File format: A compressed file **[uid_name_assignment4.zip]** with:
10 |
11 | 1. your `cloth_pbd.py`
12 | 2. a recorded video of your simulator
13 | 3. your report `uid_name_4.pdf`
14 | 4. your own mesh.obj (if any)
15 |
16 | ## Introduction
17 | In this assignment, you need to implement a basic PBD simulator based on the Jacobi approach,
18 | which offers a parallelizable method for solving position-based constraints, enabling fast simulation.
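The Jacobi approach can be illustrated with a minimal NumPy sketch (hypothetical, not the Taichi code you will write in `cloth_pbd.py`): every distance constraint computes its correction from the same snapshot of positions, the per-particle corrections are accumulated, then averaged and applied in one batch, which is what makes the method parallelizable:

```python
import numpy as np

def jacobi_solve(x, edges, rest, iters=50):
    """Jacobi-style projection of distance constraints (illustrative sketch)."""
    for _ in range(iters):
        dx = np.zeros_like(x)        # accumulated per-particle corrections
        cnt = np.zeros(len(x))       # number of constraints touching each particle
        for (i, j), l0 in zip(edges, rest):
            d = x[j] - x[i]
            dist = np.linalg.norm(d)
            corr = 0.5 * (dist - l0) * d / dist   # equal-mass split of the violation
            dx[i] += corr
            dx[j] -= corr
            cnt[i] += 1
            cnt[j] += 1
        x += dx / np.maximum(cnt, 1)[:, None]     # averaged Jacobi update
    return x

# a stretched 3-particle chain with rest length 1 relaxes back toward length 1
x = np.array([[0.0, 0.0], [2.0, 0.0], [4.0, 0.0]])
edges, rest = [(0, 1), (1, 2)], [1.0, 1.0]
x = jacobi_solve(x, edges, rest)
```

In contrast, a Gauss-Seidel solver would apply each correction immediately, so constraints within one iteration depend on each other's updates and cannot be processed in parallel.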
19 |
20 |
21 | ## Examples
22 | A successful implementation of the PBD simulator (with the default parameters) should produce a video similar to the following:
23 |
24 | https://github.com/Shimingyi/COMP3360_Data_Driven_Animation/assets/25116439/fc6930aa-959a-4751-80af-6bbc7a023779
25 |
26 |
27 | ## Environment
28 | Same as Assignment 3. If you want to save the video directly by Taichi, you need to install ffmpeg.
29 |
30 | ```bash
31 | $ conda activate comp3360
32 | $ conda install ffmpeg
33 | ```
34 |
35 | ## Task 1 - PBD Simulator based on Jacobi Approach (80%)
36 |
37 | You are required to implement **3** `TODO` in `cloth_pbd.py`. Please check the hints in the comments.
38 |
39 | Note that to fix the corners of the cloth, you need to keep vertex 0 and vertex 1 fixed by not updating their positions.
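An equivalent way to pin vertices, shown here as a hedged sketch (the names are illustrative, not from `cloth_pbd.py`), is to give them zero inverse mass, since every PBD correction is scaled by the vertex's inverse mass:

```python
num_vertices = 4                        # illustrative count
inv_mass = [1.0] * num_vertices
inv_mass[0] = inv_mass[1] = 0.0         # pinned corners: zero inverse mass

positions = [0.0, 1.0, 2.0, 3.0]
correction = 0.5
# scaling each correction by inverse mass freezes the pinned vertices
positions = [p + w * correction for p, w in zip(positions, inv_mass)]
```

This avoids a special-case branch for the fixed vertices inside the solver loop.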
40 |
41 | ## Task 2 - Collision with a Mesh (Bonus)
42 |
43 | Once you have implemented the basic simulator, you can enhance it with extra features.
44 | For example, support collision with a mesh (example meshes `torus.obj` and `obstacle.obj` are provided, but feel free to import your own mesh) instead of a simple sphere.
45 |
46 | The idea is similar to the sphere collision, i.e., detect whether a point is inside the mesh and then project it out.
47 | An easy way is to first compute the vertex normals of the obstacle mesh.
48 | Then, for every point of the cloth, find its closest point on the obstacle mesh.
49 | If the displacement from the obstacle point to the cloth point has a negative dot product with the normal, the point is inside the obstacle.
50 | In this case, you can project the point out along the normal direction.
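The steps above can be sketched as follows (a hedged NumPy illustration; `obstacle_v` / `obstacle_n` are assumed names for the obstacle's vertices and unit vertex normals, not part of the provided code):

```python
import numpy as np

def project_out(p, obstacle_v, obstacle_n):
    """Push a cloth point out of the obstacle using the closest-vertex normal test."""
    k = np.argmin(np.linalg.norm(obstacle_v - p, axis=1))  # closest obstacle vertex
    d = p - obstacle_v[k]                                  # displacement obstacle -> cloth
    depth = d @ obstacle_n[k]
    if depth < 0:                  # negative dot product with the normal => inside
        p = p - depth * obstacle_n[k]   # remove the penetrating component along the normal
    return p

# toy obstacle: a few unit-sphere samples whose outward normals equal their positions
v = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0], [-1.0, 0.0, 0.0]])
n = v.copy()
p_out = project_out(np.array([0.5, 0.0, 0.0]), v, n)   # interior point gets pushed out
```

With a dense, watertight obstacle mesh this per-vertex test is a cheap approximation; it can miss thin features, which motivates the SDF approach below.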
51 |
52 | A better way is to use a signed distance field (SDF) of the mesh to compute the penetration depth and direction.
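For an analytic shape the SDF is closed-form; for a triangle mesh you would typically precompute it on a grid and interpolate. A minimal sketch with a sphere SDF (assumed center and radius, not the assignment's obstacle):

```python
import numpy as np

def sdf_sphere(p, center, radius):
    # signed distance: negative inside the sphere, zero on the surface
    return np.linalg.norm(p - center) - radius

def resolve(p, center, radius):
    phi = sdf_sphere(p, center, radius)
    if phi < 0:                                           # penetrating
        grad = (p - center) / np.linalg.norm(p - center)  # SDF gradient points outward
        p = p - phi * grad                                # move by the penetration depth
    return p

p_res = resolve(np.array([0.3, 0.0, 0.0]), np.zeros(3), 1.0)  # projected to the surface
```

The penetration depth is just the (negative) SDF value and the push-out direction is its gradient, so the resolution step needs no closest-point search.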
53 |
54 | ## Task 3 - Report (20%)
55 |
56 | - PDF format, no page size requirement, so you can also prepare it with PowerPoint, Keynote, or Markdown
57 | - The first two lines should introduce your NAME and UID.
58 |
59 | You're expected to briefly introduce your implementation, and Task 2 if you have done it.
60 | You also need to experiment with the parameters that affect the material behavior, i.e., the iteration number and alpha,
61 | and analyze the results.
62 |
--------------------------------------------------------------------------------
/assignment_4/cloth.obj:
--------------------------------------------------------------------------------
1 | # Blender 3.6.10
2 | # www.blender.org
3 | o Plane
4 | v -1.000000 1.500000 0.900000
5 | v 1.000000 1.500000 0.900000
6 | v -1.000000 1.500000 -1.100000
7 | v 1.000000 1.500000 -1.100000
8 | v -1.000000 1.500000 -0.975000
9 | v -1.000000 1.500000 -0.850000
10 | v -1.000000 1.500000 -0.725000
11 | v -1.000000 1.500000 -0.600000
12 | v -1.000000 1.500000 -0.475000
13 | v -1.000000 1.500000 -0.350000
14 | v -1.000000 1.500000 -0.225000
15 | v -1.000000 1.500000 -0.100000
16 | v -1.000000 1.500000 0.025000
17 | v -1.000000 1.500000 0.150000
18 | v -1.000000 1.500000 0.275000
19 | v -1.000000 1.500000 0.400000
20 | v -1.000000 1.500000 0.525000
21 | v -1.000000 1.500000 0.650000
22 | v -1.000000 1.500000 0.775000
23 | v -0.875000 1.500000 0.900000
24 | v -0.750000 1.500000 0.900000
25 | v -0.625000 1.500000 0.900000
26 | v -0.500000 1.500000 0.900000
27 | v -0.375000 1.500000 0.900000
28 | v -0.250000 1.500000 0.900000
29 | v -0.125000 1.500000 0.900000
30 | v 0.000000 1.500000 0.900000
31 | v 0.125000 1.500000 0.900000
32 | v 0.250000 1.500000 0.900000
33 | v 0.375000 1.500000 0.900000
34 | v 0.500000 1.500000 0.900000
35 | v 0.625000 1.500000 0.900000
36 | v 0.750000 1.500000 0.900000
37 | v 0.875000 1.500000 0.900000
38 | v 1.000000 1.500000 0.775000
39 | v 1.000000 1.500000 0.650000
40 | v 1.000000 1.500000 0.525000
41 | v 1.000000 1.500000 0.400000
42 | v 1.000000 1.500000 0.275000
43 | v 1.000000 1.500000 0.150000
44 | v 1.000000 1.500000 0.025000
45 | v 1.000000 1.500000 -0.100000
46 | v 1.000000 1.500000 -0.225000
47 | v 1.000000 1.500000 -0.350000
48 | v 1.000000 1.500000 -0.475000
49 | v 1.000000 1.500000 -0.600000
50 | v 1.000000 1.500000 -0.725000
51 | v 1.000000 1.500000 -0.850000
52 | v 1.000000 1.500000 -0.975000
53 | v 0.875000 1.500000 -1.100000
54 | v 0.750000 1.500000 -1.100000
55 | v 0.625000 1.500000 -1.100000
56 | v 0.500000 1.500000 -1.100000
57 | v 0.375000 1.500000 -1.100000
58 | v 0.250000 1.500000 -1.100000
59 | v 0.125000 1.500000 -1.100000
60 | v 0.000000 1.500000 -1.100000
61 | v -0.125000 1.500000 -1.100000
62 | v -0.250000 1.500000 -1.100000
63 | v -0.375000 1.500000 -1.100000
64 | v -0.500000 1.500000 -1.100000
65 | v -0.625000 1.500000 -1.100000
66 | v -0.750000 1.500000 -1.100000
67 | v -0.875000 1.500000 -1.100000
68 | v -0.875000 1.500000 0.775000
69 | v -0.750000 1.500000 0.775000
70 | v -0.625000 1.500000 0.775000
71 | v -0.500000 1.500000 0.775000
72 | v -0.375000 1.500000 0.775000
73 | v -0.250000 1.500000 0.775000
74 | v -0.125000 1.500000 0.775000
75 | v 0.000000 1.500000 0.775000
76 | v 0.125000 1.500000 0.775000
77 | v 0.250000 1.500000 0.775000
78 | v 0.375000 1.500000 0.775000
79 | v 0.500000 1.500000 0.775000
80 | v 0.625000 1.500000 0.775000
81 | v 0.750000 1.500000 0.775000
82 | v 0.875000 1.500000 0.775000
83 | v -0.875000 1.500000 0.650000
84 | v -0.750000 1.500000 0.650000
85 | v -0.625000 1.500000 0.650000
86 | v -0.500000 1.500000 0.650000
87 | v -0.375000 1.500000 0.650000
88 | v -0.250000 1.500000 0.650000
89 | v -0.125000 1.500000 0.650000
90 | v 0.000000 1.500000 0.650000
91 | v 0.125000 1.500000 0.650000
92 | v 0.250000 1.500000 0.650000
93 | v 0.375000 1.500000 0.650000
94 | v 0.500000 1.500000 0.650000
95 | v 0.625000 1.500000 0.650000
96 | v 0.750000 1.500000 0.650000
97 | v 0.875000 1.500000 0.650000
98 | v -0.875000 1.500000 0.525000
99 | v -0.750000 1.500000 0.525000
100 | v -0.625000 1.500000 0.525000
101 | v -0.500000 1.500000 0.525000
102 | v -0.375000 1.500000 0.525000
103 | v -0.250000 1.500000 0.525000
104 | v -0.125000 1.500000 0.525000
105 | v 0.000000 1.500000 0.525000
106 | v 0.125000 1.500000 0.525000
107 | v 0.250000 1.500000 0.525000
108 | v 0.375000 1.500000 0.525000
109 | v 0.500000 1.500000 0.525000
110 | v 0.625000 1.500000 0.525000
111 | v 0.750000 1.500000 0.525000
112 | v 0.875000 1.500000 0.525000
113 | v -0.875000 1.500000 0.400000
114 | v -0.750000 1.500000 0.400000
115 | v -0.625000 1.500000 0.400000
116 | v -0.500000 1.500000 0.400000
117 | v -0.375000 1.500000 0.400000
118 | v -0.250000 1.500000 0.400000
119 | v -0.125000 1.500000 0.400000
120 | v 0.000000 1.500000 0.400000
121 | v 0.125000 1.500000 0.400000
122 | v 0.250000 1.500000 0.400000
123 | v 0.375000 1.500000 0.400000
124 | v 0.500000 1.500000 0.400000
125 | v 0.625000 1.500000 0.400000
126 | v 0.750000 1.500000 0.400000
127 | v 0.875000 1.500000 0.400000
128 | v -0.875000 1.500000 0.275000
129 | v -0.750000 1.500000 0.275000
130 | v -0.625000 1.500000 0.275000
131 | v -0.500000 1.500000 0.275000
132 | v -0.375000 1.500000 0.275000
133 | v -0.250000 1.500000 0.275000
134 | v -0.125000 1.500000 0.275000
135 | v 0.000000 1.500000 0.275000
136 | v 0.125000 1.500000 0.275000
137 | v 0.250000 1.500000 0.275000
138 | v 0.375000 1.500000 0.275000
139 | v 0.500000 1.500000 0.275000
140 | v 0.625000 1.500000 0.275000
141 | v 0.750000 1.500000 0.275000
142 | v 0.875000 1.500000 0.275000
143 | v -0.875000 1.500000 0.150000
144 | v -0.750000 1.500000 0.150000
145 | v -0.625000 1.500000 0.150000
146 | v -0.500000 1.500000 0.150000
147 | v -0.375000 1.500000 0.150000
148 | v -0.250000 1.500000 0.150000
149 | v -0.125000 1.500000 0.150000
150 | v 0.000000 1.500000 0.150000
151 | v 0.125000 1.500000 0.150000
152 | v 0.250000 1.500000 0.150000
153 | v 0.375000 1.500000 0.150000
154 | v 0.500000 1.500000 0.150000
155 | v 0.625000 1.500000 0.150000
156 | v 0.750000 1.500000 0.150000
157 | v 0.875000 1.500000 0.150000
158 | v -0.875000 1.500000 0.025000
159 | v -0.750000 1.500000 0.025000
160 | v -0.625000 1.500000 0.025000
161 | v -0.500000 1.500000 0.025000
162 | v -0.375000 1.500000 0.025000
163 | v -0.250000 1.500000 0.025000
164 | v -0.125000 1.500000 0.025000
165 | v 0.000000 1.500000 0.025000
166 | v 0.125000 1.500000 0.025000
167 | v 0.250000 1.500000 0.025000
168 | v 0.375000 1.500000 0.025000
169 | v 0.500000 1.500000 0.025000
170 | v 0.625000 1.500000 0.025000
171 | v 0.750000 1.500000 0.025000
172 | v 0.875000 1.500000 0.025000
173 | v -0.875000 1.500000 -0.100000
174 | v -0.750000 1.500000 -0.100000
175 | v -0.625000 1.500000 -0.100000
176 | v -0.500000 1.500000 -0.100000
177 | v -0.375000 1.500000 -0.100000
178 | v -0.250000 1.500000 -0.100000
179 | v -0.125000 1.500000 -0.100000
180 | v 0.000000 1.500000 -0.100000
181 | v 0.125000 1.500000 -0.100000
182 | v 0.250000 1.500000 -0.100000
183 | v 0.375000 1.500000 -0.100000
184 | v 0.500000 1.500000 -0.100000
185 | v 0.625000 1.500000 -0.100000
186 | v 0.750000 1.500000 -0.100000
187 | v 0.875000 1.500000 -0.100000
188 | v -0.875000 1.500000 -0.225000
189 | v -0.750000 1.500000 -0.225000
190 | v -0.625000 1.500000 -0.225000
191 | v -0.500000 1.500000 -0.225000
192 | v -0.375000 1.500000 -0.225000
193 | v -0.250000 1.500000 -0.225000
194 | v -0.125000 1.500000 -0.225000
195 | v 0.000000 1.500000 -0.225000
196 | v 0.125000 1.500000 -0.225000
197 | v 0.250000 1.500000 -0.225000
198 | v 0.375000 1.500000 -0.225000
199 | v 0.500000 1.500000 -0.225000
200 | v 0.625000 1.500000 -0.225000
201 | v 0.750000 1.500000 -0.225000
202 | v 0.875000 1.500000 -0.225000
203 | v -0.875000 1.500000 -0.350000
204 | v -0.750000 1.500000 -0.350000
205 | v -0.625000 1.500000 -0.350000
206 | v -0.500000 1.500000 -0.350000
207 | v -0.375000 1.500000 -0.350000
208 | v -0.250000 1.500000 -0.350000
209 | v -0.125000 1.500000 -0.350000
210 | v 0.000000 1.500000 -0.350000
211 | v 0.125000 1.500000 -0.350000
212 | v 0.250000 1.500000 -0.350000
213 | v 0.375000 1.500000 -0.350000
214 | v 0.500000 1.500000 -0.350000
215 | v 0.625000 1.500000 -0.350000
216 | v 0.750000 1.500000 -0.350000
217 | v 0.875000 1.500000 -0.350000
218 | v -0.875000 1.500000 -0.475000
219 | v -0.750000 1.500000 -0.475000
220 | v -0.625000 1.500000 -0.475000
221 | v -0.500000 1.500000 -0.475000
222 | v -0.375000 1.500000 -0.475000
223 | v -0.250000 1.500000 -0.475000
224 | v -0.125000 1.500000 -0.475000
225 | v -0.000000 1.500000 -0.475000
226 | v 0.125000 1.500000 -0.475000
227 | v 0.250000 1.500000 -0.475000
228 | v 0.375000 1.500000 -0.475000
229 | v 0.500000 1.500000 -0.475000
230 | v 0.625000 1.500000 -0.475000
231 | v 0.750000 1.500000 -0.475000
232 | v 0.875000 1.500000 -0.475000
233 | v -0.875000 1.500000 -0.600000
234 | v -0.750000 1.500000 -0.600000
235 | v -0.625000 1.500000 -0.600000
236 | v -0.500000 1.500000 -0.600000
237 | v -0.375000 1.500000 -0.600000
238 | v -0.250000 1.500000 -0.600000
239 | v -0.125000 1.500000 -0.600000
240 | v 0.000000 1.500000 -0.600000
241 | v 0.125000 1.500000 -0.600000
242 | v 0.250000 1.500000 -0.600000
243 | v 0.375000 1.500000 -0.600000
244 | v 0.500000 1.500000 -0.600000
245 | v 0.625000 1.500000 -0.600000
246 | v 0.750000 1.500000 -0.600000
247 | v 0.875000 1.500000 -0.600000
248 | v -0.875000 1.500000 -0.725000
249 | v -0.750000 1.500000 -0.725000
250 | v -0.625000 1.500000 -0.725000
251 | v -0.500000 1.500000 -0.725000
252 | v -0.375000 1.500000 -0.725000
253 | v -0.250000 1.500000 -0.725000
254 | v -0.125000 1.500000 -0.725000
255 | v 0.000000 1.500000 -0.725000
256 | v 0.125000 1.500000 -0.725000
257 | v 0.250000 1.500000 -0.725000
258 | v 0.375000 1.500000 -0.725000
259 | v 0.500000 1.500000 -0.725000
260 | v 0.625000 1.500000 -0.725000
261 | v 0.750000 1.500000 -0.725000
262 | v 0.875000 1.500000 -0.725000
263 | v -0.875000 1.500000 -0.850000
264 | v -0.750000 1.500000 -0.850000
265 | v -0.625000 1.500000 -0.850000
266 | v -0.500000 1.500000 -0.850000
267 | v -0.375000 1.500000 -0.850000
268 | v -0.250000 1.500000 -0.850000
269 | v -0.125000 1.500000 -0.850000
270 | v 0.000000 1.500000 -0.850000
271 | v 0.125000 1.500000 -0.850000
272 | v 0.250000 1.500000 -0.850000
273 | v 0.375000 1.500000 -0.850000
274 | v 0.500000 1.500000 -0.850000
275 | v 0.625000 1.500000 -0.850000
276 | v 0.750000 1.500000 -0.850000
277 | v 0.875000 1.500000 -0.850000
278 | v -0.875000 1.500000 -0.975000
279 | v -0.750000 1.500000 -0.975000
280 | v -0.625000 1.500000 -0.975000
281 | v -0.500000 1.500000 -0.975000
282 | v -0.375000 1.500000 -0.975000
283 | v -0.250000 1.500000 -0.975000
284 | v -0.125000 1.500000 -0.975000
285 | v 0.000000 1.500000 -0.975000
286 | v 0.125000 1.500000 -0.975000
287 | v 0.250000 1.500000 -0.975000
288 | v 0.375000 1.500000 -0.975000
289 | v 0.500000 1.500000 -0.975000
290 | v 0.625000 1.500000 -0.975000
291 | v 0.750000 1.500000 -0.975000
292 | v 0.875000 1.500000 -0.975000
293 | s 0
294 | f 66 80 65
295 | f 67 81 66
296 | f 68 82 67
297 | f 69 83 68
298 | f 70 84 69
299 | f 71 85 70
300 | f 72 86 71
301 | f 73 87 72
302 | f 74 88 73
303 | f 75 89 74
304 | f 76 90 75
305 | f 77 91 76
306 | f 78 92 77
307 | f 79 93 78
308 | f 81 95 80
309 | f 82 96 81
310 | f 83 97 82
311 | f 84 98 83
312 | f 85 99 84
313 | f 86 100 85
314 | f 87 101 86
315 | f 88 102 87
316 | f 89 103 88
317 | f 90 104 89
318 | f 91 105 90
319 | f 92 106 91
320 | f 93 107 92
321 | f 94 108 93
322 | f 96 110 95
323 | f 97 111 96
324 | f 98 112 97
325 | f 99 113 98
326 | f 100 114 99
327 | f 101 115 100
328 | f 102 116 101
329 | f 103 117 102
330 | f 104 118 103
331 | f 105 119 104
332 | f 106 120 105
333 | f 107 121 106
334 | f 108 122 107
335 | f 109 123 108
336 | f 111 125 110
337 | f 112 126 111
338 | f 113 127 112
339 | f 114 128 113
340 | f 115 129 114
341 | f 116 130 115
342 | f 117 131 116
343 | f 118 132 117
344 | f 119 133 118
345 | f 120 134 119
346 | f 121 135 120
347 | f 122 136 121
348 | f 123 137 122
349 | f 124 138 123
350 | f 126 140 125
351 | f 127 141 126
352 | f 128 142 127
353 | f 129 143 128
354 | f 130 144 129
355 | f 131 145 130
356 | f 132 146 131
357 | f 133 147 132
358 | f 134 148 133
359 | f 135 149 134
360 | f 136 150 135
361 | f 137 151 136
362 | f 138 152 137
363 | f 139 153 138
364 | f 141 155 140
365 | f 142 156 141
366 | f 143 157 142
367 | f 144 158 143
368 | f 145 159 144
369 | f 146 160 145
370 | f 147 161 146
371 | f 148 162 147
372 | f 149 163 148
373 | f 150 164 149
374 | f 151 165 150
375 | f 152 166 151
376 | f 153 167 152
377 | f 154 168 153
378 | f 156 170 155
379 | f 157 171 156
380 | f 158 172 157
381 | f 159 173 158
382 | f 160 174 159
383 | f 161 175 160
384 | f 162 176 161
385 | f 163 177 162
386 | f 164 178 163
387 | f 165 179 164
388 | f 166 180 165
389 | f 167 181 166
390 | f 168 182 167
391 | f 169 183 168
392 | f 171 185 170
393 | f 172 186 171
394 | f 173 187 172
395 | f 174 188 173
396 | f 175 189 174
397 | f 176 190 175
398 | f 177 191 176
399 | f 178 192 177
400 | f 179 193 178
401 | f 180 194 179
402 | f 181 195 180
403 | f 182 196 181
404 | f 183 197 182
405 | f 184 198 183
406 | f 186 200 185
407 | f 187 201 186
408 | f 188 202 187
409 | f 189 203 188
410 | f 190 204 189
411 | f 191 205 190
412 | f 192 206 191
413 | f 193 207 192
414 | f 194 208 193
415 | f 195 209 194
416 | f 196 210 195
417 | f 197 211 196
418 | f 198 212 197
419 | f 199 213 198
420 | f 201 215 200
421 | f 202 216 201
422 | f 203 217 202
423 | f 204 218 203
424 | f 205 219 204
425 | f 206 220 205
426 | f 207 221 206
427 | f 208 222 207
428 | f 209 223 208
429 | f 210 224 209
430 | f 211 225 210
431 | f 212 226 211
432 | f 213 227 212
433 | f 214 228 213
434 | f 216 230 215
435 | f 217 231 216
436 | f 218 232 217
437 | f 219 233 218
438 | f 220 234 219
439 | f 221 235 220
440 | f 222 236 221
441 | f 223 237 222
442 | f 224 238 223
443 | f 225 239 224
444 | f 226 240 225
445 | f 227 241 226
446 | f 228 242 227
447 | f 229 243 228
448 | f 231 245 230
449 | f 232 246 231
450 | f 233 247 232
451 | f 234 248 233
452 | f 235 249 234
453 | f 236 250 235
454 | f 237 251 236
455 | f 238 252 237
456 | f 239 253 238
457 | f 240 254 239
458 | f 241 255 240
459 | f 242 256 241
460 | f 243 257 242
461 | f 244 258 243
462 | f 246 260 245
463 | f 247 261 246
464 | f 248 262 247
465 | f 249 263 248
466 | f 250 264 249
467 | f 251 265 250
468 | f 252 266 251
469 | f 253 267 252
470 | f 254 268 253
471 | f 255 269 254
472 | f 256 270 255
473 | f 257 271 256
474 | f 258 272 257
475 | f 259 273 258
476 | f 261 275 260
477 | f 262 276 261
478 | f 263 277 262
479 | f 264 278 263
480 | f 265 279 264
481 | f 266 280 265
482 | f 267 281 266
483 | f 268 282 267
484 | f 269 283 268
485 | f 270 284 269
486 | f 271 285 270
487 | f 272 286 271
488 | f 273 287 272
489 | f 274 288 273
490 | f 20 19 1
491 | f 21 65 20
492 | f 22 66 21
493 | f 23 67 22
494 | f 24 68 23
495 | f 25 69 24
496 | f 26 70 25
497 | f 27 71 26
498 | f 28 72 27
499 | f 29 73 28
500 | f 30 74 29
501 | f 31 75 30
502 | f 32 76 31
503 | f 33 77 32
504 | f 34 78 33
505 | f 2 79 34
506 | f 35 94 79
507 | f 36 109 94
508 | f 37 124 109
509 | f 38 139 124
510 | f 39 154 139
511 | f 40 169 154
512 | f 41 184 169
513 | f 42 199 184
514 | f 43 214 199
515 | f 44 229 214
516 | f 45 244 229
517 | f 46 259 244
518 | f 47 274 259
519 | f 48 289 274
520 | f 49 50 289
521 | f 289 51 288
522 | f 288 52 287
523 | f 287 53 286
524 | f 286 54 285
525 | f 285 55 284
526 | f 284 56 283
527 | f 283 57 282
528 | f 282 58 281
529 | f 281 59 280
530 | f 280 60 279
531 | f 279 61 278
532 | f 278 62 277
533 | f 277 63 276
534 | f 276 64 275
535 | f 275 3 5
536 | f 260 5 6
537 | f 245 6 7
538 | f 230 7 8
539 | f 215 8 9
540 | f 200 9 10
541 | f 185 10 11
542 | f 170 11 12
543 | f 155 12 13
544 | f 140 13 14
545 | f 125 14 15
546 | f 110 15 16
547 | f 95 16 17
548 | f 80 17 18
549 | f 65 18 19
550 | f 66 81 80
551 | f 67 82 81
552 | f 68 83 82
553 | f 69 84 83
554 | f 70 85 84
555 | f 71 86 85
556 | f 72 87 86
557 | f 73 88 87
558 | f 74 89 88
559 | f 75 90 89
560 | f 76 91 90
561 | f 77 92 91
562 | f 78 93 92
563 | f 79 94 93
564 | f 81 96 95
565 | f 82 97 96
566 | f 83 98 97
567 | f 84 99 98
568 | f 85 100 99
569 | f 86 101 100
570 | f 87 102 101
571 | f 88 103 102
572 | f 89 104 103
573 | f 90 105 104
574 | f 91 106 105
575 | f 92 107 106
576 | f 93 108 107
577 | f 94 109 108
578 | f 96 111 110
579 | f 97 112 111
580 | f 98 113 112
581 | f 99 114 113
582 | f 100 115 114
583 | f 101 116 115
584 | f 102 117 116
585 | f 103 118 117
586 | f 104 119 118
587 | f 105 120 119
588 | f 106 121 120
589 | f 107 122 121
590 | f 108 123 122
591 | f 109 124 123
592 | f 111 126 125
593 | f 112 127 126
594 | f 113 128 127
595 | f 114 129 128
596 | f 115 130 129
597 | f 116 131 130
598 | f 117 132 131
599 | f 118 133 132
600 | f 119 134 133
601 | f 120 135 134
602 | f 121 136 135
603 | f 122 137 136
604 | f 123 138 137
605 | f 124 139 138
606 | f 126 141 140
607 | f 127 142 141
608 | f 128 143 142
609 | f 129 144 143
610 | f 130 145 144
611 | f 131 146 145
612 | f 132 147 146
613 | f 133 148 147
614 | f 134 149 148
615 | f 135 150 149
616 | f 136 151 150
617 | f 137 152 151
618 | f 138 153 152
619 | f 139 154 153
620 | f 141 156 155
621 | f 142 157 156
622 | f 143 158 157
623 | f 144 159 158
624 | f 145 160 159
625 | f 146 161 160
626 | f 147 162 161
627 | f 148 163 162
628 | f 149 164 163
629 | f 150 165 164
630 | f 151 166 165
631 | f 152 167 166
632 | f 153 168 167
633 | f 154 169 168
634 | f 156 171 170
635 | f 157 172 171
636 | f 158 173 172
637 | f 159 174 173
638 | f 160 175 174
639 | f 161 176 175
640 | f 162 177 176
641 | f 163 178 177
642 | f 164 179 178
643 | f 165 180 179
644 | f 166 181 180
645 | f 167 182 181
646 | f 168 183 182
647 | f 169 184 183
648 | f 171 186 185
649 | f 172 187 186
650 | f 173 188 187
651 | f 174 189 188
652 | f 175 190 189
653 | f 176 191 190
654 | f 177 192 191
655 | f 178 193 192
656 | f 179 194 193
657 | f 180 195 194
658 | f 181 196 195
659 | f 182 197 196
660 | f 183 198 197
661 | f 184 199 198
662 | f 186 201 200
663 | f 187 202 201
664 | f 188 203 202
665 | f 189 204 203
666 | f 190 205 204
667 | f 191 206 205
668 | f 192 207 206
669 | f 193 208 207
670 | f 194 209 208
671 | f 195 210 209
672 | f 196 211 210
673 | f 197 212 211
674 | f 198 213 212
675 | f 199 214 213
676 | f 201 216 215
677 | f 202 217 216
678 | f 203 218 217
679 | f 204 219 218
680 | f 205 220 219
681 | f 206 221 220
682 | f 207 222 221
683 | f 208 223 222
684 | f 209 224 223
685 | f 210 225 224
686 | f 211 226 225
687 | f 212 227 226
688 | f 213 228 227
689 | f 214 229 228
690 | f 216 231 230
691 | f 217 232 231
692 | f 218 233 232
693 | f 219 234 233
694 | f 220 235 234
695 | f 221 236 235
696 | f 222 237 236
697 | f 223 238 237
698 | f 224 239 238
699 | f 225 240 239
700 | f 226 241 240
701 | f 227 242 241
702 | f 228 243 242
703 | f 229 244 243
704 | f 231 246 245
705 | f 232 247 246
706 | f 233 248 247
707 | f 234 249 248
708 | f 235 250 249
709 | f 236 251 250
710 | f 237 252 251
711 | f 238 253 252
712 | f 239 254 253
713 | f 240 255 254
714 | f 241 256 255
715 | f 242 257 256
716 | f 243 258 257
717 | f 244 259 258
718 | f 246 261 260
719 | f 247 262 261
720 | f 248 263 262
721 | f 249 264 263
722 | f 250 265 264
723 | f 251 266 265
724 | f 252 267 266
725 | f 253 268 267
726 | f 254 269 268
727 | f 255 270 269
728 | f 256 271 270
729 | f 257 272 271
730 | f 258 273 272
731 | f 259 274 273
732 | f 261 276 275
733 | f 262 277 276
734 | f 263 278 277
735 | f 264 279 278
736 | f 265 280 279
737 | f 266 281 280
738 | f 267 282 281
739 | f 268 283 282
740 | f 269 284 283
741 | f 270 285 284
742 | f 271 286 285
743 | f 272 287 286
744 | f 273 288 287
745 | f 274 289 288
746 | f 20 65 19
747 | f 21 66 65
748 | f 22 67 66
749 | f 23 68 67
750 | f 24 69 68
751 | f 25 70 69
752 | f 26 71 70
753 | f 27 72 71
754 | f 28 73 72
755 | f 29 74 73
756 | f 30 75 74
757 | f 31 76 75
758 | f 32 77 76
759 | f 33 78 77
760 | f 34 79 78
761 | f 2 35 79
762 | f 35 36 94
763 | f 36 37 109
764 | f 37 38 124
765 | f 38 39 139
766 | f 39 40 154
767 | f 40 41 169
768 | f 41 42 184
769 | f 42 43 199
770 | f 43 44 214
771 | f 44 45 229
772 | f 45 46 244
773 | f 46 47 259
774 | f 47 48 274
775 | f 48 49 289
776 | f 49 4 50
777 | f 289 50 51
778 | f 288 51 52
779 | f 287 52 53
780 | f 286 53 54
781 | f 285 54 55
782 | f 284 55 56
783 | f 283 56 57
784 | f 282 57 58
785 | f 281 58 59
786 | f 280 59 60
787 | f 279 60 61
788 | f 278 61 62
789 | f 277 62 63
790 | f 276 63 64
791 | f 275 64 3
792 | f 260 275 5
793 | f 245 260 6
794 | f 230 245 7
795 | f 215 230 8
796 | f 200 215 9
797 | f 185 200 10
798 | f 170 185 11
799 | f 155 170 12
800 | f 140 155 13
801 | f 125 140 14
802 | f 110 125 15
803 | f 95 110 16
804 | f 80 95 17
805 | f 65 80 18
806 |
--------------------------------------------------------------------------------
/assignment_4/cloth_pbd.py:
--------------------------------------------------------------------------------
1 | import taichi as ti
2 | import numpy as np
3 |
4 |
5 | ti.init(arch=ti.cpu, debug=False)
6 |
7 |
8 | # Function to read an OBJ file
9 | def read_obj_file(file_path, scale=1.0):
10 | vertices = []
11 | faces = []
12 |
13 | with open(file_path, 'r') as file:
14 | for line in file:
15 | line = line.strip()
16 | if not line or line.startswith('#'):
17 | continue
18 |
19 | elements = line.split()
20 | if elements[0] == 'v':
21 | vertices.append([scale * float(e) for e in elements[1:]])
22 | elif elements[0] == 'f':
23 | faces.append([int(e.split('/')[0]) - 1 for e in elements[1:]])
24 |
25 | return np.array(vertices, dtype=np.float32), np.array(faces, dtype=np.int32)
26 |
27 |
28 | def init_edges(vertices, faces):
29 | # Generate vertex index pairs for edges, assuming triangular faces.
30 | edges = np.hstack([faces, np.roll(faces, -1, axis=1)]).reshape(-1, 2)
31 |
32 | # Ensure each edge is ordered (smaller index first).
33 | edges = np.sort(edges, axis=1)
34 |
35 | # Remove duplicate edges.
36 | edges = np.unique(edges, axis=0)
37 |
38 | # Compute edge lengths.
39 | edge_lengths = np.sqrt(np.sum(np.square(vertices[edges[:, 0]] - vertices[edges[:, 1]]), -1))
40 |
41 | return edges, edge_lengths
42 |
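The `np.hstack`/`np.roll` trick in `init_edges` expands each triangle `(a, b, c)` into the pairs `(a,b)`, `(b,c)`, `(c,a)`; sorting each pair and calling `np.unique` then collapses an edge shared by two triangles into a single entry. A minimal sketch on a two-triangle quad (the quad itself is made-up illustrative data, not part of the assignment meshes):

```python
import numpy as np

# Two triangles sharing the diagonal 0-2 of a unit quad (illustrative data).
faces = np.array([[0, 1, 2], [0, 2, 3]])
vertices = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0]], dtype=np.float32)

# Per face: (a, b, c) -> pairs (a,b), (b,c), (c,a).
edges = np.hstack([faces, np.roll(faces, -1, axis=1)]).reshape(-1, 2)
edges = np.sort(edges, axis=1)    # order each pair so duplicates line up
edges = np.unique(edges, axis=0)  # the shared diagonal (0, 2) is kept only once

edge_lengths = np.sqrt(np.sum(np.square(vertices[edges[:, 0]] - vertices[edges[:, 1]]), -1))
# 6 raw pairs collapse to 5 unique edges; the diagonal has length sqrt(2).
```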
43 |
44 | # Read an OBJ file; feel free to change the file path to use your own mesh
45 | file_path = "cloth.obj"
46 | vertices_np, faces_np = read_obj_file(file_path, scale=1)
47 | edges_np, edge_lengths_np = init_edges(vertices_np, faces_np)
48 |
49 | indices = ti.field(int, shape=3 * faces_np.shape[0])
50 | indices.from_numpy(faces_np.flatten())
51 |
52 |
53 | num_verts = vertices_np.shape[0]
54 | num_edges = edges_np.shape[0]
55 |
56 | # Simulation parameters
57 | fps = 60
58 | subsample = 1
59 | t = 1.0 / fps
60 | damping = 0.99
61 | alpha = 0.2
62 | gravity = ti.Vector([0, -9.8, 0])
63 |
64 | # Collision body
65 | c = ti.Vector.field(3, dtype=ti.f32, shape=1)
66 | c.from_numpy(np.array([[0.0, 0.0, 0.0]], dtype=np.float32))
67 | r = 0.5
68 |
69 | prev_X = ti.Vector.field(3, dtype=ti.f32, shape=num_verts) # Previous position
70 | X = ti.Vector.field(3, dtype=ti.f32, shape=num_verts) # Current position
71 | E = ti.Vector.field(2, dtype=ti.i32, shape=num_edges) # Edge indices
72 | L = ti.field(dtype=ti.f32, shape=num_edges) # Rest edge length
73 | sum_X = ti.Vector.field(3, dtype=ti.f32, shape=num_verts)
74 | sum_N = ti.field(dtype=ti.i32, shape=num_verts)
75 |
76 | # Copy data to Taichi fields
77 | X.from_numpy(vertices_np)
78 | prev_X.from_numpy(vertices_np)
79 | E.from_numpy(edges_np)
80 | L.from_numpy(edge_lengths_np)
81 |
82 |
83 | @ti.kernel
84 | def strain_limiting():
85 | sum_X.fill(0.0)
86 | sum_N.fill(0)
87 | # TODO 1: Implement the projection function
88 | # There will be two loops. The first iterates over all edges to compute the correction vectors.
89 | # The second iterates over all vertices and linearly blends the accumulated corrections with the current positions.
90 | # Variables you can use: X (vertex position), E (edges, i.e., indices of two vertices each edge connects),
91 | # L (edge length), sum_X, sum_N, alpha (blend factor)
92 | # There is no need to update velocity here. You only need to update the position.
93 | # To fix the corners of the cloth, keep vertex 0 and vertex 1 fixed.
94 | pass
95 |
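TODO 1 describes a Jacobi-style PBD edge projection: each edge proposes endpoint positions that restore its rest length, and each vertex then blends its accumulated proposals with its current position using `alpha`. As a hedged reference for the math only (plain single-threaded NumPy, not the Taichi kernel you are asked to write; `strain_limiting_numpy` and its `fixed` argument are illustrative names):

```python
import numpy as np

def strain_limiting_numpy(X, E, L, alpha=0.2, fixed=(0, 1)):
    """Jacobi-style PBD edge projection (NumPy sketch of TODO 1)."""
    sum_X = np.zeros_like(X)
    sum_N = np.zeros(len(X), dtype=int)
    # Loop 1: each edge proposes positions that restore its rest length.
    for (i, j), rest in zip(E, L):
        d = X[i] - X[j]
        mid = 0.5 * (X[i] + X[j])
        half = 0.5 * rest * d / np.linalg.norm(d)
        sum_X[i] += mid + half
        sum_N[i] += 1
        sum_X[j] += mid - half
        sum_N[j] += 1
    # Loop 2: blend each vertex's accumulated proposals with its current position.
    X_new = X.copy()
    m = sum_N > 0
    X_new[m] = (alpha * X[m] + sum_X[m]) / (alpha + sum_N[m])[:, None]
    for v in fixed:  # pinned corners stay put
        X_new[v] = X[v]
    return X_new
```

For a single edge this update moves the length error toward zero by a factor of `alpha / (1 + alpha)` per call, which is why `substep` below runs the projection many times per frame.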
96 | @ti.kernel
97 | def collision_handling():
98 | # TODO 2: Implement collision handling
99 | # When a cloth vertex falls inside the collision sphere, push it back to the sphere surface.
100 | # Simply find the closest point on the sphere surface and move the vertex to that point
101 | # Variables you can use: X (vertex position), c (collision sphere position), r (collision sphere radius)
102 | # To fix the corners of the cloth, keep vertex 0 and vertex 1 fixed.
103 | pass
104 |
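TODO 2 is a pointwise projection: a vertex at `x` inside the sphere moves to the closest surface point `c + r * (x - c) / |x - c|`. A NumPy sketch under the same assumptions (the function name is illustrative, and the degenerate case `x == c` is ignored here):

```python
import numpy as np

def project_to_sphere(X, c, r, fixed=(0, 1)):
    """Push vertices that fall inside the sphere back onto its surface."""
    X_new = X.copy()
    for i in range(len(X)):
        if i in fixed:
            continue  # pinned corners are never moved
        d = X_new[i] - c
        dist = np.linalg.norm(d)
        if dist < r:  # inside the sphere: closest surface point lies along d
            X_new[i] = c + r * d / dist
    return X_new
```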
105 |
106 | @ti.kernel
107 | def update():
108 | for i in range(num_verts):
109 | if i != 0 and i != 1:
110 | v = (X[i] - prev_X[i]) / t
111 | prev_X[i] = X[i]
112 | # TODO 3
113 | # 1. Update the velocity (a. multiply by damping, b. add gravity scaled by the time step t)
114 | # 2. Update the position X using the velocity
115 | # To fix the corners of the cloth, keep vertex 0 and vertex 1 fixed.
116 | pass
117 |
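TODO 3 is a damped Verlet-style step: the velocity is recovered by finite differences from the previous position, damped, accelerated by gravity, and used to advance the position. A NumPy sketch of the same update (names are illustrative; `dt` plays the role of `t` in the script):

```python
import numpy as np

def update_numpy(X, prev_X, dt=1.0 / 60, damping=0.99, fixed=(0, 1)):
    """Damped Verlet-style integration (NumPy sketch of TODO 3)."""
    g = np.array([0.0, -9.8, 0.0])
    v = (X - prev_X) / dt          # finite-difference velocity
    v = v * damping + g * dt       # 1a. damp, 1b. add gravity over the step
    new_prev = X.copy()
    X_new = X + v * dt             # 2. advance the position
    for i in fixed:                # pinned corners stay put
        X_new[i] = X[i]
        new_prev[i] = prev_X[i]
    return X_new, new_prev
```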
118 |
119 | def substep():
120 | update()
121 | for _ in range(32):
122 | strain_limiting()
123 | collision_handling()
124 |
125 |
126 | headless = False # if True, no window is shown and the video is saved locally
127 | window = ti.ui.Window("PBD Model", (800, 800), fps_limit=fps // subsample, vsync=True,
128 | show_window=not headless)
129 | canvas = window.get_canvas()
130 | canvas.set_background_color((1, 1, 1))
131 | scene = ti.ui.Scene()
132 | camera = ti.ui.make_camera()
133 | camera.position(-4, 0, -2)
134 | camera.lookat(0, 0, 0.2)
135 | camera.up(0, 1, 0)
136 |
137 | if headless:
138 | video_manager = ti.tools.VideoManager(output_dir='.', framerate=fps // subsample)
139 | else:
140 | video_manager = None
141 |
142 | idx = 0
143 | while window.running:
144 | for _ in range(subsample):
145 | substep()
146 |
147 | scene.set_camera(camera)
148 | scene.point_light(pos=(-4, 0, -2), color=(1, 1, 1))
149 | scene.ambient_light((0.2, 0.2, 0.2))
150 |
151 | # Since the cloth's resolution is low,
152 | # we render the sphere slightly smaller than the actual collision sphere to prevent visible intersection.
153 | scene.particles(c, radius=r-0.02, color=(0, 1, 1))
154 | scene.mesh(X, indices=indices, color=(1, 0.5, 0.7), show_wireframe=False, two_sided=True)
155 |
156 | canvas.scene(scene)
157 |
158 | idx += 1
159 |
160 | if headless:
161 | img = window.get_image_buffer_as_numpy()
162 | video_manager.write_frame(img)
163 | if idx >= 600:
164 | break
165 | else:
166 | window.show()
167 |
168 | if headless:
169 | video_manager.make_video(gif=False, mp4=True)
170 |
--------------------------------------------------------------------------------
/assignment_4/obstacle.obj:
--------------------------------------------------------------------------------
1 | # Blender 3.6.10
2 | # www.blender.org
3 | o Icosphere
4 | v -0.078632 0.048312 0.111859
5 | v 0.270227 0.314813 0.365316
6 | v -0.211881 0.314813 0.521965
7 | v -0.509843 0.314815 0.111859
8 | v -0.211881 0.314813 -0.298248
9 | v 0.270227 0.314813 -0.141599
10 | v 0.054618 0.746031 0.521965
11 | v -0.427490 0.746031 0.365316
12 | v -0.427490 0.746031 -0.141599
13 | v 0.054618 0.746031 -0.298248
14 | v 0.352580 0.746029 0.111859
15 | v -0.078632 1.012532 0.111859
16 | v -0.190877 0.213425 0.457321
17 | v -0.156953 0.120313 0.352911
18 | v -0.116046 0.063764 0.227012
19 | v 0.019324 0.063764 0.183027
20 | v 0.126421 0.120313 0.260836
21 | v 0.215237 0.213425 0.325364
22 | v 0.177822 0.288257 0.440519
23 | v 0.048100 0.276958 0.501891
24 | v -0.092921 0.288257 0.528490
25 | v 0.313193 0.288258 -0.030478
26 | v 0.331474 0.276959 0.111859
27 | v 0.313193 0.288258 0.254196
28 | v 0.019324 0.063764 0.040691
29 | v 0.126421 0.120313 -0.037119
30 | v 0.215237 0.213425 -0.101647
31 | v -0.441873 0.213427 0.111859
32 | v -0.332091 0.120314 0.111859
33 | v -0.199712 0.063764 0.111859
34 | v -0.311959 0.288257 0.457322
35 | v -0.410415 0.276959 0.352912
36 | v -0.479289 0.288259 0.227012
37 | v -0.190877 0.213425 -0.233604
38 | v -0.156953 0.120313 -0.129194
39 | v -0.116046 0.063764 -0.003295
40 | v -0.479289 0.288259 -0.003295
41 | v -0.410415 0.276959 -0.129195
42 | v -0.311959 0.288257 -0.233604
43 | v -0.092921 0.288257 -0.304773
44 | v 0.048100 0.276958 -0.278174
45 | v 0.177822 0.288257 -0.216802
46 | v 0.382567 0.651503 0.183027
47 | v 0.379883 0.530422 0.260837
48 | v 0.336319 0.409339 0.325365
49 | v 0.336319 0.409339 -0.101647
50 | v 0.379883 0.530422 -0.037119
51 | v 0.382567 0.651503 0.040690
52 | v -0.003801 0.651504 0.572477
53 | v -0.078632 0.530422 0.593969
54 | v -0.153462 0.409339 0.572477
55 | v 0.252654 0.409339 0.440520
56 | v 0.204746 0.530422 0.501894
57 | v 0.131572 0.651505 0.528492
58 | v -0.493582 0.651504 0.325365
59 | v -0.537146 0.530422 0.260837
60 | v -0.539830 0.409340 0.183027
61 | v -0.288835 0.409339 0.528492
62 | v -0.362009 0.530422 0.501894
63 | v -0.409918 0.651505 0.440520
64 | v -0.409918 0.651505 -0.216803
65 | v -0.362009 0.530422 -0.278176
66 | v -0.288835 0.409339 -0.304775
67 | v -0.539830 0.409340 0.040690
68 | v -0.537146 0.530422 -0.037119
69 | v -0.493582 0.651504 -0.101647
70 | v 0.131572 0.651505 -0.304775
71 | v 0.204746 0.530422 -0.278176
72 | v 0.252654 0.409339 -0.216803
73 | v -0.153462 0.409339 -0.348760
74 | v -0.078632 0.530422 -0.370251
75 | v -0.003801 0.651504 -0.348760
76 | v 0.322026 0.772585 0.227012
77 | v 0.253151 0.783884 0.352912
78 | v 0.154696 0.772586 0.457322
79 | v -0.064342 0.772586 0.528490
80 | v -0.205363 0.783885 0.501891
81 | v -0.335086 0.772586 0.440519
82 | v -0.470456 0.772586 0.254196
83 | v -0.488737 0.783884 0.111859
84 | v -0.470456 0.772586 -0.030478
85 | v -0.335086 0.772586 -0.216802
86 | v -0.205363 0.783885 -0.278174
87 | v -0.064342 0.772586 -0.304773
88 | v 0.154696 0.772586 -0.233604
89 | v 0.253151 0.783884 -0.129195
90 | v 0.322026 0.772585 -0.003295
91 | v -0.041217 0.997080 0.227012
92 | v -0.000310 0.940531 0.352911
93 | v 0.033614 0.847418 0.457321
94 | v 0.284610 0.847416 0.111859
95 | v 0.174828 0.940529 0.111859
96 | v 0.042449 0.997079 0.111859
97 | v -0.176587 0.997080 0.183027
98 | v -0.283684 0.940531 0.260836
99 | v -0.372500 0.847418 0.325364
100 | v -0.176587 0.997080 0.040691
101 | v -0.283684 0.940531 -0.037119
102 | v -0.372500 0.847418 -0.101647
103 | v -0.041217 0.997080 -0.003295
104 | v -0.000310 0.940531 -0.129194
105 | v 0.033614 0.847418 -0.233604
106 | v 0.095796 0.961635 -0.014870
107 | v 0.229048 0.879281 -0.014871
108 | v 0.136972 0.879282 -0.141600
109 | v -0.145258 0.961635 -0.093192
110 | v -0.104082 0.879282 -0.219922
111 | v -0.253061 0.879282 -0.171515
112 | v -0.294236 0.961635 0.111859
113 | v -0.402039 0.879282 0.033537
114 | v -0.402039 0.879282 0.190181
115 | v -0.145258 0.961635 0.316909
116 | v -0.253061 0.879282 0.395232
117 | v -0.104082 0.879282 0.443639
118 | v 0.095796 0.961635 0.238587
119 | v 0.136972 0.879282 0.365317
120 | v 0.229048 0.879281 0.238588
121 | v 0.336853 0.663675 -0.093193
122 | v 0.311405 0.530422 -0.171517
123 | v 0.244778 0.663676 -0.219924
124 | v -0.145258 0.663676 -0.346654
125 | v -0.227611 0.530421 -0.346655
126 | v -0.294238 0.663676 -0.298248
127 | v -0.535293 0.663675 0.033536
128 | v -0.560741 0.530422 0.111859
129 | v -0.535292 0.663675 0.190181
130 | v -0.294239 0.663675 0.521965
131 | v -0.227612 0.530421 0.570372
132 | v -0.145259 0.663675 0.570372
133 | v 0.244778 0.663675 0.443642
134 | v 0.311404 0.530421 0.395235
135 | v 0.336853 0.663674 0.316911
136 | v 0.070349 0.530422 -0.346655
137 | v 0.136976 0.397168 -0.298247
138 | v -0.012004 0.397168 -0.346654
139 | v -0.468667 0.530422 -0.171518
140 | v -0.402040 0.397168 -0.219925
141 | v -0.494115 0.397168 -0.093194
142 | v -0.468667 0.530422 0.395235
143 | v -0.494115 0.397168 0.316912
144 | v -0.402040 0.397168 0.443642
145 | v 0.070349 0.530422 0.570372
146 | v -0.012004 0.397168 0.570372
147 | v 0.136976 0.397168 0.521965
148 | v 0.403478 0.530422 0.111859
149 | v 0.378029 0.397168 0.190181
150 | v 0.378029 0.397168 0.033536
151 | v 0.095798 0.181561 -0.171516
152 | v -0.012006 0.099209 -0.093193
153 | v -0.053181 0.181562 -0.219923
154 | v -0.294236 0.181561 -0.141600
155 | v -0.253060 0.099209 -0.014870
156 | v -0.386312 0.181562 -0.014870
157 | v -0.386312 0.181562 0.238588
158 | v -0.253060 0.099209 0.238588
159 | v -0.294236 0.181562 0.365318
160 | v 0.244776 0.181562 0.033537
161 | v 0.244776 0.181562 0.190181
162 | v 0.136973 0.099209 0.111859
163 | v -0.053181 0.181561 0.443640
164 | v -0.012005 0.099209 0.316910
165 | v 0.095798 0.181562 0.395233
166 | s 0
167 | f 1 16 15
168 | f 2 18 24
169 | f 1 15 30
170 | f 1 30 36
171 | f 1 36 25
172 | f 2 24 45
173 | f 3 21 51
174 | f 4 33 57
175 | f 5 39 63
176 | f 6 42 69
177 | f 2 45 52
178 | f 3 51 58
179 | f 4 57 64
180 | f 5 63 70
181 | f 6 69 46
182 | f 7 75 90
183 | f 8 78 96
184 | f 9 81 99
185 | f 10 84 102
186 | f 11 87 91
187 | f 93 100 12
188 | f 92 103 93
189 | f 91 104 92
190 | f 93 103 100
191 | f 103 101 100
192 | f 92 104 103
193 | f 104 105 103
194 | f 103 105 101
195 | f 105 102 101
196 | f 91 87 104
197 | f 87 86 104
198 | f 104 86 105
199 | f 86 85 105
200 | f 105 85 102
201 | f 85 10 102
202 | f 100 97 12
203 | f 101 106 100
204 | f 102 107 101
205 | f 100 106 97
206 | f 106 98 97
207 | f 101 107 106
208 | f 107 108 106
209 | f 106 108 98
210 | f 108 99 98
211 | f 102 84 107
212 | f 84 83 107
213 | f 107 83 108
214 | f 83 82 108
215 | f 108 82 99
216 | f 82 9 99
217 | f 97 94 12
218 | f 98 109 97
219 | f 99 110 98
220 | f 97 109 94
221 | f 109 95 94
222 | f 98 110 109
223 | f 110 111 109
224 | f 109 111 95
225 | f 111 96 95
226 | f 99 81 110
227 | f 81 80 110
228 | f 110 80 111
229 | f 80 79 111
230 | f 111 79 96
231 | f 79 8 96
232 | f 94 88 12
233 | f 95 112 94
234 | f 96 113 95
235 | f 94 112 88
236 | f 112 89 88
237 | f 95 113 112
238 | f 113 114 112
239 | f 112 114 89
240 | f 114 90 89
241 | f 96 78 113
242 | f 78 77 113
243 | f 113 77 114
244 | f 77 76 114
245 | f 114 76 90
246 | f 76 7 90
247 | f 88 93 12
248 | f 89 115 88
249 | f 90 116 89
250 | f 88 115 93
251 | f 115 92 93
252 | f 89 116 115
253 | f 116 117 115
254 | f 115 117 92
255 | f 117 91 92
256 | f 90 75 116
257 | f 75 74 116
258 | f 116 74 117
259 | f 74 73 117
260 | f 117 73 91
261 | f 73 11 91
262 | f 48 87 11
263 | f 47 118 48
264 | f 46 119 47
265 | f 48 118 87
266 | f 118 86 87
267 | f 47 119 118
268 | f 119 120 118
269 | f 118 120 86
270 | f 120 85 86
271 | f 46 69 119
272 | f 69 68 119
273 | f 119 68 120
274 | f 68 67 120
275 | f 120 67 85
276 | f 67 10 85
277 | f 72 84 10
278 | f 71 121 72
279 | f 70 122 71
280 | f 72 121 84
281 | f 121 83 84
282 | f 71 122 121
283 | f 122 123 121
284 | f 121 123 83
285 | f 123 82 83
286 | f 70 63 122
287 | f 63 62 122
288 | f 122 62 123
289 | f 62 61 123
290 | f 123 61 82
291 | f 61 9 82
292 | f 66 81 9
293 | f 65 124 66
294 | f 64 125 65
295 | f 66 124 81
296 | f 124 80 81
297 | f 65 125 124
298 | f 125 126 124
299 | f 124 126 80
300 | f 126 79 80
301 | f 64 57 125
302 | f 57 56 125
303 | f 125 56 126
304 | f 56 55 126
305 | f 126 55 79
306 | f 55 8 79
307 | f 60 78 8
308 | f 59 127 60
309 | f 58 128 59
310 | f 60 127 78
311 | f 127 77 78
312 | f 59 128 127
313 | f 128 129 127
314 | f 127 129 77
315 | f 129 76 77
316 | f 58 51 128
317 | f 51 50 128
318 | f 128 50 129
319 | f 50 49 129
320 | f 129 49 76
321 | f 49 7 76
322 | f 54 75 7
323 | f 53 130 54
324 | f 52 131 53
325 | f 54 130 75
326 | f 130 74 75
327 | f 53 131 130
328 | f 131 132 130
329 | f 130 132 74
330 | f 132 73 74
331 | f 52 45 131
332 | f 45 44 131
333 | f 131 44 132
334 | f 44 43 132
335 | f 132 43 73
336 | f 43 11 73
337 | f 67 72 10
338 | f 68 133 67
339 | f 69 134 68
340 | f 67 133 72
341 | f 133 71 72
342 | f 68 134 133
343 | f 134 135 133
344 | f 133 135 71
345 | f 135 70 71
346 | f 69 42 134
347 | f 42 41 134
348 | f 134 41 135
349 | f 41 40 135
350 | f 135 40 70
351 | f 40 5 70
352 | f 61 66 9
353 | f 62 136 61
354 | f 63 137 62
355 | f 61 136 66
356 | f 136 65 66
357 | f 62 137 136
358 | f 137 138 136
359 | f 136 138 65
360 | f 138 64 65
361 | f 63 39 137
362 | f 39 38 137
363 | f 137 38 138
364 | f 38 37 138
365 | f 138 37 64
366 | f 37 4 64
367 | f 55 60 8
368 | f 56 139 55
369 | f 57 140 56
370 | f 55 139 60
371 | f 139 59 60
372 | f 56 140 139
373 | f 140 141 139
374 | f 139 141 59
375 | f 141 58 59
376 | f 57 33 140
377 | f 33 32 140
378 | f 140 32 141
379 | f 32 31 141
380 | f 141 31 58
381 | f 31 3 58
382 | f 49 54 7
383 | f 50 142 49
384 | f 51 143 50
385 | f 49 142 54
386 | f 142 53 54
387 | f 50 143 142
388 | f 143 144 142
389 | f 142 144 53
390 | f 144 52 53
391 | f 51 21 143
392 | f 21 20 143
393 | f 143 20 144
394 | f 20 19 144
395 | f 144 19 52
396 | f 19 2 52
397 | f 43 48 11
398 | f 44 145 43
399 | f 45 146 44
400 | f 43 145 48
401 | f 145 47 48
402 | f 44 146 145
403 | f 146 147 145
404 | f 145 147 47
405 | f 147 46 47
406 | f 45 24 146
407 | f 24 23 146
408 | f 146 23 147
409 | f 23 22 147
410 | f 147 22 46
411 | f 22 6 46
412 | f 27 42 6
413 | f 26 148 27
414 | f 25 149 26
415 | f 27 148 42
416 | f 148 41 42
417 | f 26 149 148
418 | f 149 150 148
419 | f 148 150 41
420 | f 150 40 41
421 | f 25 36 149
422 | f 36 35 149
423 | f 149 35 150
424 | f 35 34 150
425 | f 150 34 40
426 | f 34 5 40
427 | f 34 39 5
428 | f 35 151 34
429 | f 36 152 35
430 | f 34 151 39
431 | f 151 38 39
432 | f 35 152 151
433 | f 152 153 151
434 | f 151 153 38
435 | f 153 37 38
436 | f 36 30 152
437 | f 30 29 152
438 | f 152 29 153
439 | f 29 28 153
440 | f 153 28 37
441 | f 28 4 37
442 | f 28 33 4
443 | f 29 154 28
444 | f 30 155 29
445 | f 28 154 33
446 | f 154 32 33
447 | f 29 155 154
448 | f 155 156 154
449 | f 154 156 32
450 | f 156 31 32
451 | f 30 15 155
452 | f 15 14 155
453 | f 155 14 156
454 | f 14 13 156
455 | f 156 13 31
456 | f 13 3 31
457 | f 22 27 6
458 | f 23 157 22
459 | f 24 158 23
460 | f 22 157 27
461 | f 157 26 27
462 | f 23 158 157
463 | f 158 159 157
464 | f 157 159 26
465 | f 159 25 26
466 | f 24 18 158
467 | f 18 17 158
468 | f 158 17 159
469 | f 17 16 159
470 | f 159 16 25
471 | f 16 1 25
472 | f 13 21 3
473 | f 14 160 13
474 | f 15 161 14
475 | f 13 160 21
476 | f 160 20 21
477 | f 14 161 160
478 | f 161 162 160
479 | f 160 162 20
480 | f 162 19 20
481 | f 15 16 161
482 | f 16 17 161
483 | f 161 17 162
484 | f 17 18 162
485 | f 162 18 19
486 | f 18 2 19
487 |
--------------------------------------------------------------------------------
]