├── README.md
├── figures
│   ├── FLAME.jpg
│   ├── pipeline.jpg
│   └── teaser.jpg
├── matlab
│   ├── MR_ambient_occlusion.m
│   ├── MR_poisson_blending.m
│   ├── MR_sample_image.m
│   ├── README.md
│   ├── demo_blending.m
│   └── sample_data
│       ├── FV.mat
│       ├── im1.jpg
│       ├── im2.jpg
│       └── im3.jpg
└── scala
    ├── .gitignore
    ├── LICENSE.txt
    ├── README.md
    ├── Screenshot.png
    ├── build.sbt
    ├── contributors.txt
    ├── fitting
    │   ├── Bob_Stoops_0005.png
    │   ├── Bob_Stoops_0005_face0.tlms
    │   └── results
    │       ├── fit-best.rps
    │       ├── fitter-best-diffuse.png
    │       ├── fitter-best-specular.png
    │       ├── fitter-best.png
    │       ├── fitter-best_Gamma.png
    │       ├── fitter-lminit.png
    │       ├── fitter-lminit.rps
    │       ├── target.png
    │       ├── target.tlms
    │       └── target_Gamma.png
    ├── project
    │   ├── assembly.sbt
    │   └── build.properties
    └── src
        ├── main
        │   └── scala
        │       └── faces
        │           ├── apps
        │           │   ├── AlbedoModelBuilder.scala
        │           │   ├── AlbedoModelFit.scala
        │           │   └── AlbedoModelViewer.scala
        │           └── lib
        │               ├── AlbedoMoMo.scala
        │               ├── AlbedoMoMoIO.scala
        │               ├── AlbedoMoMoRenderer.scala
        │               ├── AlbedoModelFitScript.scala
        │               ├── AlbedoModelHelpers.scala
        │               ├── ParametricAlbedoModel.scala
        │               ├── SpecularProposal.scala
        │               └── VertexAlbedoMesh.scala
        └── test
            └── scala
                └── faces
                    ├── FacesTestSuite.scala
                    └── lib
                        └── AlbedoMoMoTest.scala
/README.md:
--------------------------------------------------------------------------------
1 | # [A Morphable Face Albedo Model](http://openaccess.thecvf.com/content_CVPR_2020/papers/Smith_A_Morphable_Face_Albedo_Model_CVPR_2020_paper.pdf)
2 |
3 | [William A. P. Smith](https://www-users.cs.york.ac.uk/wsmith) 1, [Alassane Seck](https://www.linkedin.com/in/alassane-seck-67508365) 2,3, [Hannah Dee](http://users.aber.ac.uk/hmd1/) 3, [Bernard Tiddeman](http://users.aber.ac.uk/bpt/) 3, [Joshua Tenenbaum](http://web.mit.edu/cocosci/josh.html) 4 and [Bernhard Egger](https://eggerbernhard.ch/) 4
4 |
5 | 1 University of York, UK 6 |
7 | 2 ARM Ltd, UK 8 |
9 | 3 Aberystwyth University, UK 10 |
11 | 4 MIT, USA 12 |
13 | #### [CVPR2020] 14 | 15 |
16 | 17 |

18 |
19 | ## Abstract
20 |
21 | In this paper, we bring together two divergent strands of research: photometric face capture and statistical 3D face appearance modelling. We propose a novel lightstage capture and processing pipeline for acquiring ear-to-ear, truly intrinsic diffuse and specular albedo maps that fully factor out the effects of illumination, camera and geometry. Using this pipeline, we capture a dataset of 50 scans and combine them with the only existing publicly available albedo dataset (3DRFE) of 23 scans. This allows us to build the first morphable face albedo model. We believe this is the first statistical analysis of the variability of facial specular albedo maps. This model can be used as a plug-in replacement for the texture model of the Basel Face Model and we make our new albedo model publicly available. We ensure careful spectral calibration such that our model is built in a linear sRGB space, suitable for inverse rendering of images taken by typical cameras. We demonstrate our model in a state-of-the-art analysis-by-synthesis 3DMM fitting pipeline, are the first to integrate specular map estimation, and outperform the Basel Face Model in albedo reconstruction.
22 |
23 | ## Oral CVPR 2020 presentation
24 | [![Oral CVPR 2020 presentation](https://img.youtube.com/vi/2Xrlzj7UAyQ/0.jpg)](https://www.youtube.com/watch?v=2Xrlzj7UAyQ)
25 |
26 | ## Scala code for loading, visualising and fitting the model
27 | We make available [Scala code](scala/README.md) for loading the statistical model, visualising its principal components and fitting it to an image in an inverse rendering pipeline. This code also makes it possible to combine the albedo model with the Basel Face Model to build a joint model file.
28 |
29 | ## Matlab code for sampling and Poisson blending textures
30 |

31 |
32 | In our capture pipeline, we acquire three photometric views of the head and a mesh to which we fit template geometry. We have developed [Matlab code](matlab/README.md) for sampling and blending the different views into a seamless per-vertex texture. We also make available a Matlab implementation of per-vertex ambient occlusion.
33 |
34 | ## Loading the model in Matlab
35 |
36 | If you wish to use the model in Matlab, download the h5 file from the release folder and use the following code:
37 | ```matlab
38 | texMU = h5read('albedoModel2020_bfm_albedoPart.h5','/diffuseAlbedo/model/mean')';
39 | texPC = h5read('albedoModel2020_bfm_albedoPart.h5','/diffuseAlbedo/model/pcaBasis')';
40 | texEV = h5read('albedoModel2020_bfm_albedoPart.h5','/diffuseAlbedo/model/pcaVariance')';
41 | ```
42 |
43 | ## FLAME topology model
44 |
45 |

46 | 47 | We also make a version of our model available in the topology of the [FLAME model](https://flame.is.tue.mpg.de/). See the official release which contains a compressed numpy file containing the mean, principal components and variances for the diffuse and specular models. Big thanks to [Timo Bolkart](https://sites.google.com/site/bolkartt/) for registering our meshes with the FLAME model. 48 | 49 | ## Raw data 50 | 51 | Some of the participants gave additional permission for their scans to be distributed. We will provide the raw data captured in our scanner, multiview stereo models and the final registered, processed albedo maps on the template mesh. We hope to make this available very soon. 52 | 53 | ## License 54 | 55 | We give permission to use the model and code only for academic research purposes. Anyone wishing to use the model or code for commercial purposes should contact [William Smith](mailto:william.smith@york.ac.uk) in the first instance. 56 | 57 | ## Citation 58 | 59 | If you use the model or the code in your research, please cite the following paper: 60 | 61 | William A. P. Smith, Alassane Seck, Hannah Dee, Bernard Tiddeman, Joshua Tenenbaum and Bernhard Egger. "A Morphable Face Albedo Model". In Proc. of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2020. 62 | [https://arxiv.org/abs/2004.02711](https://arxiv.org/abs/2004.02711) 63 | 64 | Bibtex: 65 | 66 | @inproceedings{smith2020morphable, 67 | title={A Morphable Face Albedo Model}, 68 | author={Smith, William A. P. and Seck, Alassane and Dee, Hannah and Tiddeman, Bernard and Tenenbaum, Joshua and Egger, Bernhard}, 69 | booktitle={Proc. of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR)}, 70 | pages={5011--5020}, 71 | year={2020} 72 | } 73 | 74 | In addition, if you use the model, you should cite the following paper since the model is partly derived from the data in the 3DRFE dataset: 75 | 76 | Stratou, Giota, Abhijeet Ghosh, Paul Debevec, and Louis-Philippe Morency. "Effect of illumination on automatic expression recognition: a novel 3D relightable facial database." In Proc. Face and Gesture 2011, pp. 611-618. 2011. 77 | 78 | Bibtex: 79 | 80 | @inproceedings{stratou2011effect, 81 | title={Effect of illumination on automatic expression recognition: a novel {3D} relightable facial database}, 82 | author={Stratou, Giota and Ghosh, Abhijeet and Debevec, Paul and Morency, Louis-Philippe}, 83 | booktitle={Proc. 
International Conference on Automatic Face and Gesture Recognition},
84 | pages={611--618},
85 | year={2011}
86 | }
87 |
--------------------------------------------------------------------------------
/figures/FLAME.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/waps101/AlbedoMM/c7b0ae61683b5010fb728b025f5a52335b706364/figures/FLAME.jpg
--------------------------------------------------------------------------------
/figures/pipeline.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/waps101/AlbedoMM/c7b0ae61683b5010fb728b025f5a52335b706364/figures/pipeline.jpg
--------------------------------------------------------------------------------
/figures/teaser.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/waps101/AlbedoMM/c7b0ae61683b5010fb728b025f5a52335b706364/figures/teaser.jpg
--------------------------------------------------------------------------------
/matlab/MR_ambient_occlusion.m:
--------------------------------------------------------------------------------
1 | function AO = MR_ambient_occlusion(V,F,N,subdiv)
2 | %MR_AMBIENT_OCCLUSION Per-vertex ambient occlusion for a mesh
3 | % Inputs:
4 | % V - nverts x 3 matrix of vertex positions
5 | % F - nfaces x 3 matrix of face indices
6 | % N - nverts x 3 matrix of per-vertex surface normals
7 | % subdiv - number of subdivisions of icosahedron to use for view
8 | % sampling sphere. 1 is very coarse, 2 is better, 3 is very
9 | % accurate (42, 162, 642 samples respectively)
10 | %
11 | % Outputs:
12 | % AO - nverts x 1 per-vertex ambient occlusion values (0..1)
13 | %
14 | % An extension to the Matlab Renderer
15 | % (https://github.com/waps101/MatlabRenderer)
16 | %
17 | % This code was written for the following paper which you should cite if
18 | % you use the code in your research:
19 | %
20 | % William A. P. Smith, Alassane Seck, Hannah Dee, Bernard Tiddeman, Joshua
21 | % Tenenbaum and Bernhard Egger. A Morphable Face Albedo Model. In Proc.
22 | % CVPR, 2020.
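%
% Example usage (an illustrative sketch, not part of the original file; the
% mesh variables V and F and the choice of subdiv=2 are assumptions). Given a
% mesh, per-vertex normals can be obtained with MR_vertex_normals (as used in
% MR_sample_image.m below) and passed to this function:
%
%   N = MR_vertex_normals(F,V);
%   AO = MR_ambient_occlusion(V,F,N,2);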
23 | % 24 | % William Smith 25 | % University of York 26 | % 2020 27 | 28 | 29 | [dirs,~]=icosphere(subdiv); 30 | 31 | cameraparams.type = 'scaledorthographic'; 32 | cameraparams.scale = 1; 33 | 34 | AO = zeros(size(V,1),1); 35 | count = zeros(size(AO)); 36 | for j=1:size(dirs,1) 37 | weights = dirs(j,1).*N(:,1) + dirs(j,2).*N(:,2) + dirs(j,3).*N(:,3); 38 | weights(weights<0)=0; 39 | axis = cross(dirs(j,:),[0;0;-1]); 40 | if norm(axis)==0 41 | R = eye(3); 42 | else 43 | axis = axis./norm(axis); 44 | angle = acos(dot(dirs(j,:),[0;0;-1])); 45 | R = axang2rotm(axis.*angle); 46 | end 47 | t = [0;0;0]; 48 | cameraparams.T = [R t]; 49 | 50 | [Vxy,Vcam] = MR_project(V,cameraparams); 51 | scale = 500 / max( max(Vxy(:,1))-min(Vxy(:,1)), max(Vxy(:,2))-min(Vxy(:,2)) ); 52 | Vxy = Vxy.*scale; 53 | Vxy(:,1) = Vxy(:,1) - min(Vxy(:,1)); 54 | Vxy(:,2) = Vxy(:,2) - min(Vxy(:,2)); 55 | 56 | cameraparams.w = ceil(max(Vxy(:,1))); 57 | cameraparams.h = ceil(max(Vxy(:,2))); 58 | 59 | [zbuffer,fbuffer,wbuffer,Vz] = MR_rasterise_mesh_mex(uint32(F), Vcam, Vxy, cameraparams.w, cameraparams.h); 60 | visibility = MR_vertex_visibility(Vxy,Vz,zbuffer,fbuffer,F); 61 | AO = AO + visibility.*weights; 62 | 63 | count = count+(weights>0); 64 | %disp(['Direction ' num2str(j) ' of ' num2str(size(dirs,1))]); 65 | end 66 | AO = AO./count; 67 | AO = min(1,2.*AO); 68 | 69 | end 70 | 71 | function [vv,ff] = icosphere(varargin) 72 | %ICOSPHERE Generate icosphere. 73 | % Create a unit geodesic sphere created by subdividing a regular 74 | % icosahedron with normalised vertices. 75 | % 76 | % [V,F] = ICOSPHERE(N) generates to matrices containing vertex and face 77 | % data so that patch('Faces',F,'Vertices',V) produces a unit icosphere 78 | % with N subdivisions. 79 | % 80 | % FV = ICOSPHERE(N) generates an FV structure for using with patch. 81 | % 82 | % ICOSPHERE(N) and just ICOSPHERE display the icosphere as a patch on the 83 | % current axes and does not return anything. 84 | % 85 | % ICOSPHERE uses N = 3. 86 | % 87 | % ICOSPHERE(AX,...) plots into AX instead of GCA. 88 | % 89 | % See also SPHERE. 90 | % 91 | % Based on C# code by Andres Kahler 92 | % http://blog.andreaskahler.com/2009/06/creating-icosphere-mesh-in-code.html 93 | % 94 | % Wil O.C. 
Ward 19/03/2015 95 | % University of Nottingham, UK 96 | 97 | % Parse possible axes input 98 | if nargin > 2 99 | error('Too many input variables, must be 0, 1 or 2.'); 100 | end 101 | [cax,args,nargs] = axescheck(varargin{:}); 102 | 103 | n = 3; % default number of sub-divisions 104 | if nargs > 0, n = args{1}; end % override based on input 105 | 106 | % generate regular unit icosahedron (20 faced polyhedron) 107 | [v,f] = icosahedron(); % size(v) = [12,3]; size(f) = [20,3]; 108 | 109 | % recursively subdivide triangle faces 110 | for gen = 1:n 111 | f_ = zeros(size(f,1)*4,3); 112 | for i = 1:size(f,1) % for each triangle 113 | tri = f(i,:); 114 | % calculate mid points (add new points to v) 115 | [a,v] = getMidPoint(tri(1),tri(2),v); 116 | [b,v] = getMidPoint(tri(2),tri(3),v); 117 | [c,v] = getMidPoint(tri(3),tri(1),v); 118 | % generate new subdivision triangles 119 | nfc = [tri(1),a,c; 120 | tri(2),b,a; 121 | tri(3),c,b; 122 | a,b,c]; 123 | % replace triangle with subdivision 124 | idx = 4*(i-1)+1:4*i; 125 | f_(idx,:) = nfc; 126 | end 127 | f = f_; % update 128 | end 129 | 130 | % remove duplicate vertices 131 | [v,b,ix] = unique(v,'rows'); clear b % b dummy / compatibility 132 | % reassign faces to trimmed vertex list and remove any duplicate faces 133 | f = unique(ix(f),'rows'); 134 | 135 | switch(nargout) 136 | case 0 % no output 137 | cax = newplot(cax); % draw to given axis (or gca) 138 | showSphere(cax,f,v); 139 | case 1 % return fv structure for patch 140 | vv = struct('Vertices',v,'Faces',f,... 141 | 'VertexNormals',v,'FaceVertexCData',v(:,3)); 142 | case 2 % return vertices and faces 143 | vv = v; ff = f; 144 | otherwise 145 | error('Too many output variables, must be 0, 1 or 2.'); 146 | end 147 | 148 | end 149 | 150 | function [i,v] = getMidPoint(t1,t2,v) 151 | %GETMIDPOINT calculates point between two vertices 152 | % Calculate new vertex in sub-division and normalise to unit length 153 | % then find or add it to v and return index 154 | % 155 | % Wil O.C. Ward 19/03/2015 156 | % University of Nottingham, UK 157 | 158 | % get vertice positions 159 | p1 = v(t1,:); p2 = v(t2,:); 160 | % calculate mid point (on unit sphere) 161 | pm = (p1 + p2) ./ 2; 162 | pm = pm./norm(pm); 163 | % add to vertices list, return index 164 | i = size(v,1) + 1; 165 | v = [v;pm]; 166 | 167 | end 168 | 169 | function [v,f] = icosahedron() 170 | %ICOSAHEDRON creates unit regular icosahedron 171 | % Returns 12 vertex and 20 face values. 172 | % 173 | % Wil O.C. 
Ward 19/03/2015 174 | % University of Nottingham, UK 175 | t = (1+sqrt(5)) / 2; 176 | % create vertices 177 | v = [-1, t, 0; % v1 178 | 1, t, 0; % v2 179 | -1,-t, 0; % v3 180 | 1,-t, 0; % v4 181 | 0,-1, t; % v5 182 | 0, 1, t; % v6 183 | 0,-1,-t; % v7 184 | 0, 1,-t; % v8 185 | t, 0,-1; % v9 186 | t, 0, 1; % v10 187 | -t, 0,-1; % v11 188 | -t, 0, 1];% v12 189 | % normalise vertices to unit size 190 | v = v./norm(v(1,:)); 191 | 192 | % create faces 193 | f = [ 1,12, 6; % f1 194 | 1, 6, 2; % f2 195 | 1, 2, 8; % f3 196 | 1, 8,11; % f4 197 | 1,11,12; % f5 198 | 2, 6,10; % f6 199 | 6,12, 5; % f7 200 | 12,11, 3; % f8 201 | 11, 8, 7; % f9 202 | 8, 2, 9; % f10 203 | 4,10, 5; % f11 204 | 4, 5, 3; % f12 205 | 4, 3, 7; % f13 206 | 4, 7, 9; % f14 207 | 4, 9,10; % f15 208 | 5,10, 6; % f16 209 | 3, 5,12; % f17 210 | 7, 3,11; % f18 211 | 9, 7, 8; % f19 212 | 10, 9, 2];% f20 213 | end -------------------------------------------------------------------------------- /matlab/MR_poisson_blending.m: -------------------------------------------------------------------------------- 1 | function pervertexcolour = MR_poisson_blending(V,F,samples,lambda) 2 | %MR_POISSON_BLENDING Screened Poisson blending on a mesh 3 | % Inputs: 4 | % V - nverts x 3 matrix of vertex positions 5 | % F - nfaces x 3 matrix of face indices 6 | % samples - nviews x 1 structure with: 7 | % samples(i).pervertexcolour - nverts x 3 matrix of sampled colours 8 | % samples(i).weights - nverts x 3 matrix of blending weight 9 | % Note: samples(1).pervertexcolour acts as the screening term - in 10 | % other words the overall unknown colour offset is solved to 11 | % minimise error to these colours (ignoring vertices with zero 12 | % weight). This is used instead of a boundary condition. 13 | % lambda - regularisation weight 14 | % 15 | % Outputs: 16 | % pervertexcolour - nverts x 3 per-vertex blended colours 17 | % 18 | % Triangles where at least one vertex has no valid colour samples are 19 | % assigned zero gradient. The effect is to smoothly inpaint these regions 20 | % based on the surrounding colours. 21 | % 22 | % An extension to the Matlab Renderer 23 | % (https://github.com/waps101/MatlabRenderer) 24 | % 25 | % This code was written for the following paper which you should cite if 26 | % you use the code in your research: 27 | % 28 | % William A. P. Smith, Alassane Seck, Hannah Dee, Bernard Tiddeman, Joshua 29 | % Tenenbaum and Bernhard Egger. A Morphable Face Albedo Model. In Proc. 30 | % CVPR, 2020. 31 | % 32 | % William Smith 33 | % University of York 34 | % 2020 35 | 36 | X = V'; 37 | F = F'; 38 | 39 | n = size(X,2); 40 | m = size(F,2); 41 | 42 | % Callback to get the coordinates of all the vertex of index i=1,2,3 in all faces 43 | XF = @(i)X(:,F(i,:)); 44 | 45 | % Compute un-normalized normal through the formula e1xe2 where ei are the edges. 46 | Na = cross( XF(2)-XF(1), XF(3)-XF(1) ); 47 | 48 | % Compute the area of each face as half the norm of the cross product. 49 | amplitude = @(X)sqrt( sum( X.^2 ) ); 50 | A = amplitude(Na)/2; 51 | 52 | % Compute the set of unit-norm normals to each face. 
53 | normalize = @(X)X ./ repmat(amplitude(X), [3 1]);
54 | N = normalize(Na);
55 |
56 | % Populate the sparse entries of the matrices for the operator implementing sum_{i in f} u_i (N_f^e_i)
57 | I = []; J = []; V = []; % indexes to build the sparse matrices
58 | for i=1:3
59 | % opposite edge e_i indexes
60 | s = mod(i,3)+1;
61 | t = mod(i+1,3)+1;
62 | % vector N_f^e_i
63 | wi = cross(XF(t)-XF(s),N);
64 | % update the index listing
65 | I = [I, 1:m];
66 | J = [J, F(i,:)];
67 | V = [V, wi];
68 | end
69 |
70 | % Sparse matrix with entries 1/(2Af)
71 | dA = spdiags(1./(2*A(:)),0,m,m);
72 |
73 | % Compute gradient.
74 | GradMat = {};
75 | for k=1:3
76 | GradMat{k} = dA*sparse(I,J,V(k,:),m,n);
77 | end
78 |
79 | V = X';
80 | F = F';
81 |
82 |
83 | y = zeros(size(F,1)*3,3);
84 |
85 | Tweight = zeros(size(F));
86 | for im=1:length(samples)
87 | Tweight(:,im) = min([samples(im).weight(F(:,1)) samples(im).weight(F(:,2)) samples(im).weight(F(:,3))],[],2);
88 | rhs{im} = [GradMat{1}; GradMat{2}; GradMat{3}]*samples(im).pervertexcolour;
89 | end
90 |
91 | [~,idx] = max(Tweight,[],2);
92 |
93 | for im=1:length(samples)
94 | mask = (idx==im) & (Tweight(:,im)>0);
95 | mask = [mask; mask; mask];
96 | y(mask,:) = rhs{im}(mask,:);
97 | end
98 |
99 | A = [GradMat{1}; GradMat{2}; GradMat{3}; sparse(1:size(V,1),1:size(V,1),lambda.*(samples(1).weight>0),size(V,1),size(V,1))];
100 |
101 | y = [y; lambda.*samples(1).pervertexcolour];
102 |
103 | pervertexcolour = A\y;
104 |
105 | end
106 |
107 |
--------------------------------------------------------------------------------
/matlab/MR_sample_image.m:
--------------------------------------------------------------------------------
1 | function [pervertextex,weight] = MR_sample_image(V,F,Vxy,cameraparams,visibility,im,fbuffer,boundarydistT)
2 | %MR_SAMPLE_IMAGE Sample an image onto a mesh given rasterisation
3 | % Inputs:
4 | % V - nverts x 3 matrix of vertex positions
5 | % F - nfaces x 3 matrix of face indices
6 | % Vxy - nverts x 2 projected 2D vertex positions
7 | % cameraparams - structure containing camera parameters
8 | % visibility - nverts x 1 per-vertex binary visibility
9 | % im - H x W x 3 image
10 | % fbuffer - face buffer from rasteriser
11 | % boundarydistT - distance to boundary threshold (in pixels)
12 | %
13 | % Outputs:
14 | % pervertextex - nverts x 3 per-vertex sampled colours
15 | % weight - per-vertex weight, 0 is not visible, else cosine of
16 | % angle between viewer and surface normal
17 | %
18 | % An extension to the Matlab Renderer
19 | % (https://github.com/waps101/MatlabRenderer)
20 | %
21 | % This code was written for the following paper which you should cite if
22 | % you use the code in your research:
23 | %
24 | % William A. P. Smith, Alassane Seck, Hannah Dee, Bernard Tiddeman, Joshua
25 | % Tenenbaum and Bernhard Egger. A Morphable Face Albedo Model. In Proc.
26 | % CVPR, 2020.
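%
% Example usage (an illustrative sketch, not part of the original file; the
% per-view inputs Vxy{i}, cameraparams{i}, visibility{i}, im{i} and fbuffer{i}
% are assumed to come from the MatlabRenderer rasteriser, and the boundary
% threshold 10 and regularisation weight 0.1 are arbitrary example values).
% The per-view samples can then be blended with MR_poisson_blending:
%
%   for i=1:nviews
%       [samples(i).pervertexcolour,samples(i).weight] = MR_sample_image( ...
%           V,F,Vxy{i},cameraparams{i},visibility{i},im{i},fbuffer{i},10);
%   end
%   pervertexcolour = MR_poisson_blending(V,F,samples,0.1);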
27 | %
28 | % William Smith
29 | % University of York
30 |
31 | % Compute camera centre
32 | c = -cameraparams.T(1:3,1:3)'*cameraparams.T(1:3,4);
33 | % Compute per-vertex view vectors
34 | Views(:,1) = c(1)-V(:,1);
35 | Views(:,2) = c(2)-V(:,2);
36 | Views(:,3) = c(3)-V(:,3);
37 | %Views = -Views;
38 | Views = Views./repmat(sqrt(sum(Views.^2,2)),[1 3]);
39 |
40 | % Sample image onto vertices
41 | pervertextex(:,1) = interp2(im(:,:,1),Vxy(:,1),Vxy(:,2));
42 | pervertextex(:,2) = interp2(im(:,:,2),Vxy(:,1),Vxy(:,2));
43 | pervertextex(:,3) = interp2(im(:,:,3),Vxy(:,1),Vxy(:,2));
44 |
45 | dist2boundary = bwdist(~imfill(fbuffer~=0,'holes'));
46 | pervertexd2b = interp2(dist2boundary,Vxy(:,1),Vxy(:,2));
47 |
48 | % Compute per vertex normals
49 | pervertexnormals = MR_vertex_normals(F,V);
50 |
51 | % Calculate weight as clamped dot product between normal and viewer, masked
52 | % by visibility
53 | weight = max(0,sum(pervertexnormals.*Views,2)).*visibility;
54 |
55 | weight(isnan(pervertextex(:,1)))=0;
56 |
57 | weight(pervertexd2b<boundarydistT)=0;
132 |
133 | ## Dependencies
134 |
135 | - [scalismo-faces](https://github.com/unibas-gravis/scalismo-faces) `0.10.1+`
136 |
--------------------------------------------------------------------------------
/scala/Screenshot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/waps101/AlbedoMM/c7b0ae61683b5010fb728b025f5a52335b706364/scala/Screenshot.png
--------------------------------------------------------------------------------
/scala/build.sbt:
--------------------------------------------------------------------------------
1 | name := """albedoMorphableModel"""
2 | version := "1.0"
3 |
4 | scalaVersion := "2.11.12"
5 |
6 | scalacOptions := Seq("-unchecked", "-deprecation", "-encoding", "utf8")
7 |
8 | resolvers += Resolver.jcenterRepo
9 |
10 | resolvers += Resolver.bintrayRepo("unibas-gravis", "maven")
11 |
12 | libraryDependencies += "ch.unibas.cs.gravis" %% "scalismo-faces" % "0.10.1+"
13 | libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.0" % "test"
14 |
15 | mainClass in assembly := Some("faces.apps.AlbedoModelViewer")
16 |
17 | assemblyJarName in assembly := "assembly.jar"
18 |
--------------------------------------------------------------------------------
/scala/contributors.txt:
--------------------------------------------------------------------------------
1 | Bernhard Egger
2 |
--------------------------------------------------------------------------------
/scala/fitting/Bob_Stoops_0005.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/waps101/AlbedoMM/c7b0ae61683b5010fb728b025f5a52335b706364/scala/fitting/Bob_Stoops_0005.png
--------------------------------------------------------------------------------
/scala/fitting/Bob_Stoops_0005_face0.tlms:
--------------------------------------------------------------------------------
1 | left.eyebrow.bend.lower 1 166.2 103.6
2 | left.eyebrow.inner_lower 1 138.2 100.8
3 | right.eyebrow.inner_lower 1 110.2 106.4
4 | right.eyebrow.bend.lower 1 90.6 112
5 | left.eye.corner_outer 1 155 112
6 | left.eye.corner_inner 1 138.2 112
7 | right.eye.corner_inner 1 113 117.6
8 | right.eye.corner_outer 1 96.2 117.6
9 | left.nose.wing.tip 1 138.2 140
10 | center.nose.tip 1 127 140
11 | right.nose.wing.tip 1 118.6 142.8
12 | left.lips.corner 1 146.6 156.8
13 | center.lips.upper.outer 1 129.8 159.6
14 | center.lips.lower.outer 1 129.8 165.2
15 | right.lips.corner 1 115.8 159.6
16 | center.chin.tip 1 135.4 193.2 17 | -------------------------------------------------------------------------------- /scala/fitting/results/fit-best.rps: -------------------------------------------------------------------------------- 1 | { 2 | "momo": { 3 | "shape": [-0.19309327131149293, 0.1407885353053634, -1.5352041402982002, -1.3505845566353214, -0.5062011358213976, 0.1157564506387255, 0.1861530099935135, 1.1187404738215816, -0.6548358532151486, -1.1959873415818538, 0.31646928802368324, -1.693239317931563, -0.2628868509051711, -0.16785752345829297, -1.382854352131294, -1.0913404825646706, -0.09183072813881285, -0.05500243835239794, -0.39330154903515163, 0.1300884536355031, -0.1122143255979745, 0.0793970308132228, 0.0506934716350203, -0.21169194207152625, -0.23641599738355043, -0.3484296108921833, -0.18660391387486056, -0.08928369814509214, 0.7567838063537765, -0.2811617292320389, -0.9732160519895866, -0.7334378203903531, -0.693784238930635, -0.8790013722819999, -0.7686732872682317, -1.3094140053205727, 0.3854817211874558, -1.2656019034027106, 0.961480533805653, -0.025576613056629997, -0.07805688287964668, -0.12848929786686905, 0.026456558436803623, -0.23158392466864222, 0.5073319480746473, -0.4322230345418748, -0.8373463037234968, -0.6273326709513076, -0.006686712309351039, 0.38571292575763727], 4 | "color": [-0.5923265368953831, -0.6513190801096681, 0.5932560359254291, -0.495390266187016, -1.051633512314428, -0.2602150660919913, 0.10571626000514663, 1.0884829937443095, -0.48666879477116237, -0.5778384306130029, 0.6295796422373253, -0.28933435411208575, -0.9484086621146467, 1.286135468772064, -0.4648082739533657, -1.2921435337997758, 0.24410573961807425, 0.3262699058902697, 0.011632241405730909, -0.45593532771877043, 0.7595822649825951, -0.26152998174915587, 0.1477803808323942, 0.07480051396307462, -1.7564969287414018, -0.03421823944203076, 0.29963613382043774, -0.07557125055159361, -0.744376328701486, 0.32604877275318966, -1.2635510091370588, -0.21102230109688763, -0.37084525495816695, -1.1818355046723907, 0.3820019502483453, -0.7752978607359646, 0.6076727245892595, -0.06529618734005299, -0.05872503639849794, -0.6827511521735168, -0.10737664153661353, 1.305622795881482, 0.36131070215413374, 0.7298855390188344, 0.08621203070613809, -0.27068995074529995, -0.7803045134160066, -1.0223628588041196, -0.49385947981923634, 0.007898426107997309], 5 | "expression": [-1.2820511824542038, -1.1053451782003367, -1.6358216840383573, -0.9474557738770832, -0.23175481661906552], 6 | "modelURI": "" 7 | }, 8 | "imageSize": { 9 | "height": 250, 10 | "width": 250 11 | }, 12 | "colorTransform": { 13 | "gain": [1.0, 1.0, 1.0], 14 | "colorContrast": 1.0, 15 | "offset": [0.0141877744958488, -0.004362358732503625, -0.00105158230756267] 16 | }, 17 | "camera": { 18 | "principalPoint": [0.11395619828635972, 0.020775486153173595], 19 | "sensorSize": [2.0, 2.0], 20 | "far": 1000000.0, 21 | "orthographic": false, 22 | "near": 10.0, 23 | "focalLength": 134.13737280655158 24 | }, 25 | "directionalLight": { 26 | "direction": [0.0, 0.0, 1.0], 27 | "diffuse": [0.0, 0.0, 0.0], 28 | "specular": [0.0, 0.0, 0.0], 29 | "@type": "DirectionalLight", 30 | "ambient": [0.0, 0.0, 0.0], 31 | "shininess": 10.0 32 | }, 33 | "version": "V4.0", 34 | "environmentMap": { 35 | "coefficients": [[0.3651534407751903, 0.3376333381145733, 0.36495961799525084], [0.007990016235064148, -0.01947828264105806, -0.01185043802727937], [0.3976321911057917, 0.2911150906892497, 0.34425637349873983], [-0.09691594107657349, -0.13003985834744192, 
-0.14277620749687647], [-0.009295643554440614, -0.00995449165788486, -0.028240518162071702], [0.22203578956107534, 0.18958437378380355, 0.21639024344847047], [0.1356527342676411, 0.11405236112218055, 0.12836746301201624], [0.132981652710949, 0.16016764643359788, 0.11886831538219021], [-0.19450284471770923, -0.20242904934640313, -0.20942734640139188]], 36 | "@type": "SphericalHarmonicsLight" 37 | }, 38 | "pose": { 39 | "scaling": 1.0, 40 | "roll": 0.1100176118292774, 41 | "pitch": 0.0814491143227416, 42 | "yaw": -0.10922502660961991, 43 | "translation": [-4.384559612266215, -15.241813603569915, -24833.09472154316] 44 | }, 45 | "view": { 46 | "translation": [0.0, 0.0, 0.0], 47 | "roll": 0.0, 48 | "yaw": 0.0, 49 | "pitch": 0.0 50 | } 51 | } -------------------------------------------------------------------------------- /scala/fitting/results/fitter-best-diffuse.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/waps101/AlbedoMM/c7b0ae61683b5010fb728b025f5a52335b706364/scala/fitting/results/fitter-best-diffuse.png -------------------------------------------------------------------------------- /scala/fitting/results/fitter-best-specular.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/waps101/AlbedoMM/c7b0ae61683b5010fb728b025f5a52335b706364/scala/fitting/results/fitter-best-specular.png -------------------------------------------------------------------------------- /scala/fitting/results/fitter-best.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/waps101/AlbedoMM/c7b0ae61683b5010fb728b025f5a52335b706364/scala/fitting/results/fitter-best.png -------------------------------------------------------------------------------- /scala/fitting/results/fitter-best_Gamma.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/waps101/AlbedoMM/c7b0ae61683b5010fb728b025f5a52335b706364/scala/fitting/results/fitter-best_Gamma.png -------------------------------------------------------------------------------- /scala/fitting/results/fitter-lminit.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/waps101/AlbedoMM/c7b0ae61683b5010fb728b025f5a52335b706364/scala/fitting/results/fitter-lminit.png -------------------------------------------------------------------------------- /scala/fitting/results/fitter-lminit.rps: -------------------------------------------------------------------------------- 1 | { 2 | "momo": { 3 | "shape": [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 4 | "color": [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 5 | "expression": [0.0, 0.0, 0.0, 0.0, 0.0], 6 | "modelURI": "" 7 | }, 8 | "imageSize": { 9 | "height": 250, 10 | "width": 250 11 | }, 12 | "colorTransform": { 13 | "gain": [1.0, 1.0, 1.0], 14 | "colorContrast": 1.0, 15 | "offset": [0.0, 0.0, 0.0] 16 | }, 17 | "camera": { 18 | "principalPoint": [0.09949994160901748, 0.0150549430171892], 19 
| "sensorSize": [2.0, 2.0], 20 | "far": 1000000.0, 21 | "orthographic": false, 22 | "near": 10.0, 23 | "focalLength": 120.05606183108446 24 | }, 25 | "directionalLight": { 26 | "direction": [0.0, 0.0, 1.0], 27 | "diffuse": [0.0, 0.0, 0.0], 28 | "specular": [0.0, 0.0, 0.0], 29 | "@type": "DirectionalLight", 30 | "ambient": [0.0, 0.0, 0.0], 31 | "shininess": 10.0 32 | }, 33 | "version": "V4.0", 34 | "environmentMap": { 35 | "coefficients": [[0.9027032341391218, 0.9027032341391218, 0.9027032341391218], [0.0, 0.0, 0.0], [0.19544101431611297, 0.19544101431611297, 0.19544101431611297], [0.0, 0.0, 0.0], [0.0, 0.0, 0.0], [0.0, 0.0, 0.0], [0.0, 0.0, 0.0], [0.0, 0.0, 0.0], [0.0, 0.0, 0.0]], 36 | "@type": "SphericalHarmonicsLight" 37 | }, 38 | "pose": { 39 | "scaling": 1.0, 40 | "roll": 0.09159006724552202, 41 | "pitch": 0.055660865172980786, 42 | "yaw": -0.08672018522768452, 43 | "translation": [-4.384559612266215, -15.241813603569915, -21750.1582358068] 44 | }, 45 | "view": { 46 | "translation": [0.0, 0.0, 0.0], 47 | "roll": 0.0, 48 | "yaw": 0.0, 49 | "pitch": 0.0 50 | } 51 | } -------------------------------------------------------------------------------- /scala/fitting/results/target.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/waps101/AlbedoMM/c7b0ae61683b5010fb728b025f5a52335b706364/scala/fitting/results/target.png -------------------------------------------------------------------------------- /scala/fitting/results/target.tlms: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/waps101/AlbedoMM/c7b0ae61683b5010fb728b025f5a52335b706364/scala/fitting/results/target.tlms -------------------------------------------------------------------------------- /scala/fitting/results/target_Gamma.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/waps101/AlbedoMM/c7b0ae61683b5010fb728b025f5a52335b706364/scala/fitting/results/target_Gamma.png -------------------------------------------------------------------------------- /scala/project/assembly.sbt: -------------------------------------------------------------------------------- 1 | addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.1") 2 | -------------------------------------------------------------------------------- /scala/project/build.properties: -------------------------------------------------------------------------------- 1 | sbt.version=0.13.17 -------------------------------------------------------------------------------- /scala/src/main/scala/faces/apps/AlbedoModelBuilder.scala: -------------------------------------------------------------------------------- 1 | package faces.apps 2 | 3 | import java.io.File 4 | 5 | import faces.lib.{AlbedoMoMo, AlbedoMoMoIO} 6 | 7 | import scalismo.common.UnstructuredPointsDomain 8 | import scalismo.faces.io.MoMoIO 9 | import scalismo.faces.momo.PancakeDLRGP 10 | import scalismo.geometry.{Point, _3D} 11 | import scalismo.utils.Random 12 | 13 | object AlbedoModelBuilder extends App{ 14 | scalismo.initialize() 15 | val seed = 1024L 16 | implicit val rnd: Random = Random(seed) 17 | 18 | val bfmAlbedo = AlbedoMoMoIO.read(new File("albedoModel2020_bfm_albedoPart.h5")).get 19 | val face12Albedo = AlbedoMoMoIO.read(new File("albedoModel2020_face12_albedoPart.h5")).get 20 | 21 | val bfmShape = MoMoIO.read(new File("model2017-1_bfm_nomouth.h5")).get 22 | val face12Shape = MoMoIO.read(new 
File("model2017-1_face12_nomouth.h5")).get 23 | 24 | val bfmCombined = AlbedoMoMo(bfmShape.referenceMesh, bfmShape.neutralModel.shape, bfmAlbedo.neutralModel.diffuseAlbedo, bfmAlbedo.neutralModel.specularAlbedo, bfmShape.expressionModel.get.expression, bfmShape.landmarks) 25 | val face12Combined = AlbedoMoMo(face12Shape.referenceMesh, face12Shape.neutralModel.shape, face12Albedo.neutralModel.diffuseAlbedo, face12Albedo.neutralModel.specularAlbedo, face12Shape.expressionModel.get.expression, face12Shape.landmarks) 26 | 27 | 28 | AlbedoMoMoIO.write(bfmCombined, new File("albedoModel2020_bfm.h5")) 29 | AlbedoMoMoIO.write(face12Combined, new File("albedoModel2020_face12.h5")) 30 | } 31 | -------------------------------------------------------------------------------- /scala/src/main/scala/faces/apps/AlbedoModelFit.scala: -------------------------------------------------------------------------------- 1 | package faces.apps 2 | 3 | import java.io.File 4 | 5 | import faces.lib.{AlbedoMoMoIO, AlbedoMoMoRenderer, AlbedoModelFitScript} 6 | import scalismo.utils.Random 7 | 8 | object AlbedoModelFit extends App{ 9 | 10 | scalismo.initialize() 11 | val seed = 1024L 12 | implicit val rnd: Random = Random(seed) 13 | 14 | 15 | val model = AlbedoMoMoIO.read(new File("albedoModel2020_face12.h5")).get 16 | val renderer = AlbedoMoMoRenderer(model, 10.0).cached(5) 17 | 18 | val targetFn = "fitting/Bob_Stoops_0005.png" 19 | val lmFn = "fitting/Bob_Stoops_0005_face0.tlms" 20 | val outPutDir = "fitting/results" 21 | 22 | 23 | AlbedoModelFitScript.fit(targetFn, lmFn, outPutDir, renderer, true, true) 24 | } 25 | -------------------------------------------------------------------------------- /scala/src/main/scala/faces/apps/AlbedoModelViewer.scala: -------------------------------------------------------------------------------- 1 | /* 2 | * Licensed under the Apache License, Version 2.0 (the "License"); 3 | * you may not use this file except in compliance with the License. 4 | * You may obtain a copy of the License at 5 | * 6 | * http://www.apache.org/licenses/LICENSE-2.0 7 | * 8 | * Unless required by applicable law or agreed to in writing, software 9 | * distributed under the License is distributed on an "AS IS" BASIS, 10 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 11 | * See the License for the specific language governing permissions and 12 | * limitations under the License. 
13 | * 14 | * This code was adapted from 15 | * https://github.com/unibas-gravis/basel-face-model-viewer 16 | * Copyright University of Basel, Graphics and Vision Research Group 17 | */ 18 | package faces.apps 19 | 20 | import java.awt.Dimension 21 | import java.io.{File, IOException} 22 | 23 | import javax.swing._ 24 | import javax.swing.event.{ChangeEvent, ChangeListener} 25 | import breeze.linalg.min 26 | import breeze.numerics.pow 27 | import faces.lib.{AlbedoMoMo, AlbedoMoMoIO, AlbedoMoMoRenderer} 28 | import scalismo.color.{RGB, RGBA} 29 | import scalismo.faces.gui.{GUIBlock, GUIFrame, ImagePanel} 30 | import scalismo.faces.gui.GUIBlock._ 31 | import scalismo.faces.parameters.{RenderParameter, SphericalHarmonicsLight} 32 | import scalismo.faces.io.{MeshIO, PixelImageIO, RenderParameterIO} 33 | import scalismo.faces.image.PixelImage 34 | import scalismo.faces.parameters.SphericalHarmonicsLight.fromAmbientDiffuse 35 | import scalismo.geometry.EuclideanVector3D 36 | import scalismo.utils.Random 37 | 38 | import scala.reflect.io.Path 39 | import scala.util.{Failure, Try} 40 | 41 | object AlbedoModelViewer extends App { 42 | 43 | final val DEFAULT_DIR = new File(".") 44 | 45 | val modelFile: Option[File] = getModelFile(args) 46 | modelFile.map(SimpleAlbedoModelViewer(_)) 47 | 48 | private def getModelFile(args: Seq[String]): Option[File] = { 49 | if (args.nonEmpty) { 50 | val path = Path(args.head) 51 | if (path.isFile) return Some(path.jfile) 52 | if (path.isDirectory) return askUserForModelFile(path.jfile) 53 | } 54 | askUserForModelFile(DEFAULT_DIR) 55 | } 56 | 57 | private def askUserForModelFile(dir: File): Option[File] = { 58 | val jFileChooser = new JFileChooser(dir) 59 | if (jFileChooser.showOpenDialog(null) == JFileChooser.APPROVE_OPTION) { 60 | Some(jFileChooser.getSelectedFile) 61 | } else { 62 | println("No model select...") 63 | None 64 | } 65 | } 66 | } 67 | 68 | case class SimpleAlbedoModelViewer( 69 | modelFile: File, 70 | imageWidth: Int = 512, 71 | imageHeight: Int = 512, 72 | maximalSliderValue: Int = 2, 73 | maximalShapeRank: Option[Int] = None, 74 | maximalColorRank: Option[Int] = None, 75 | maximalExpressionRank: Option[Int] = None 76 | ) { 77 | 78 | scalismo.initialize() 79 | val seed = 1024L 80 | implicit val rnd: Random = Random(seed) 81 | 82 | 83 | val g = 1.0 / 2.2 // gamma 84 | 85 | def applyGamma(i: PixelImage[RGBA]): PixelImage[RGBA] = i.map(c => RGBA(pow(c.r, g), pow(c.g, g), pow(c.b, g))) 86 | 87 | val model: AlbedoMoMo = AlbedoMoMoIO.read(modelFile, "").get 88 | var showExpressionModel: Boolean = model.hasExpressions 89 | 90 | val shapeRank: Int = maximalShapeRank match { 91 | case Some(rank) => min(model.neutralModel.shape.rank, rank) 92 | case _ => model.neutralModel.shape.rank 93 | } 94 | 95 | val colorRank: Int = maximalColorRank match { 96 | case Some(rank) => min(model.neutralModel.diffuseAlbedo.rank, rank) 97 | case _ => model.neutralModel.diffuseAlbedo.rank 98 | } 99 | 100 | val expRank: Int = maximalExpressionRank match { 101 | case Some(rank) => try{min(model.expressionModel.get.expression.rank, rank)} catch {case _: Exception => 0} 102 | case _ => try{model.expressionModel.get.expression.rank} catch {case _: Exception => 0} 103 | } 104 | 105 | var renderer: AlbedoMoMoRenderer = AlbedoMoMoRenderer(model, 10.0).cached(5) 106 | 107 | val light = fromAmbientDiffuse(RGB(0.6), RGB(0.1), EuclideanVector3D.unitZ).withNumberOfBands(2) 108 | val initDefault: RenderParameter = RenderParameter.defaultSquare.fitToImageSize(imageWidth, 
imageHeight).withEnvironmentMap(light) 109 | val init10: RenderParameter = initDefault.copy( 110 | momo = initDefault.momo.withNumberOfCoefficients(shapeRank, colorRank, expRank) 111 | ) 112 | var init: RenderParameter = init10 113 | 114 | var changingSliders = false 115 | 116 | val sliderSteps = 1000 117 | var maximalSigma: Int = maximalSliderValue 118 | var maximalSigmaSpinner: JSpinner = { 119 | val spinner = new JSpinner(new SpinnerNumberModel(maximalSigma,0,999,1)) 120 | spinner.addChangeListener( new ChangeListener() { 121 | override def stateChanged(e: ChangeEvent): Unit = { 122 | val newMaxSigma = spinner.getModel.asInstanceOf[SpinnerNumberModel].getNumber.intValue() 123 | maximalSigma = math.abs(newMaxSigma) 124 | setShapeSliders() 125 | setColorSliders() 126 | setExpSliders() 127 | } 128 | }) 129 | spinner.setToolTipText("maximal slider value") 130 | spinner 131 | } 132 | 133 | 134 | def sliderToParam(value: Int): Double = { 135 | maximalSigma * value.toDouble/sliderSteps 136 | } 137 | 138 | def paramToSlider(value: Double): Int = { 139 | (value / maximalSigma * sliderSteps).toInt 140 | } 141 | 142 | val bg = PixelImage(imageWidth, imageHeight, (_, _) => RGBA.Black) 143 | 144 | val imageWindow = ImagePanel(renderWithBG(init)) 145 | 146 | //--- SHAPE ----- 147 | val shapeSlider: IndexedSeq[JSlider] = for (n <- 0 until shapeRank) yield { 148 | GUIBlock.slider(-sliderSteps, sliderSteps, 0, f => { 149 | updateShape(n, f) 150 | updateImage() 151 | }) 152 | } 153 | 154 | val shapeSliderView: JPanel = GUIBlock.shelf(shapeSlider.zipWithIndex.map(s => GUIBlock.stack(s._1, new JLabel("" + s._2))): _*) 155 | val shapeScrollPane = new JScrollPane(shapeSliderView) 156 | val shapeScrollBar: JScrollBar = shapeScrollPane.createVerticalScrollBar() 157 | shapeScrollPane.setSize(800, 300) 158 | shapeScrollPane.setPreferredSize(new Dimension(800, 300)) 159 | 160 | val rndShapeButton: JButton = GUIBlock.button("random", { 161 | randomShape(); updateImage() 162 | }) 163 | val resetShapeButton: JButton = GUIBlock.button("reset", { 164 | resetShape(); updateImage() 165 | }) 166 | rndShapeButton.setToolTipText("draw each shape parameter at random from a standard normal distribution") 167 | resetShapeButton.setToolTipText("set all shape parameters to zero") 168 | 169 | def updateShape(n: Int, value: Int): Unit = { 170 | init = init.copy(momo = init.momo.copy(shape = { 171 | val current = init.momo.shape 172 | current.zipWithIndex.map { case (v, i) => if (i == n) sliderToParam(value) else v } 173 | })) 174 | } 175 | 176 | def randomShape(): Unit = { 177 | init = init.copy(momo = init.momo.copy(shape = { 178 | val current = init.momo.shape 179 | current.zipWithIndex.map { 180 | case (_, _) => 181 | rnd.scalaRandom.nextGaussian 182 | } 183 | 184 | })) 185 | setShapeSliders() 186 | } 187 | 188 | def resetShape(): Unit = { 189 | init = init.copy(momo = init.momo.copy( 190 | shape = IndexedSeq.fill(shapeRank)(0.0) 191 | )) 192 | setShapeSliders() 193 | } 194 | 195 | def setShapeSliders(): Unit = { 196 | changingSliders = true 197 | (0 until shapeRank).foreach(i => { 198 | shapeSlider(i).setValue(paramToSlider(init.momo.shape(i))) 199 | }) 200 | changingSliders = false 201 | } 202 | 203 | //--- COLOR ----- 204 | val colorSlider: IndexedSeq[JSlider] = for (n <- 0 until colorRank) yield { 205 | GUIBlock.slider(-sliderSteps, sliderSteps, 0, f => { 206 | updateColor(n, f) 207 | updateImage() 208 | }) 209 | } 210 | 211 | val colorSliderView: JPanel = GUIBlock.shelf(colorSlider.zipWithIndex.map(s => 
GUIBlock.stack(s._1, new JLabel("" + s._2))): _*) 212 | val colorScrollPane = new JScrollPane(colorSliderView) 213 | val colorScrollBar: JScrollBar = colorScrollPane.createHorizontalScrollBar() 214 | colorScrollPane.setSize(800, 300) 215 | colorScrollPane.setPreferredSize(new Dimension(800, 300)) 216 | 217 | val rndColorButton: JButton = GUIBlock.button("random", { 218 | randomColor(); updateImage() 219 | }) 220 | 221 | val resetColorButton: JButton = GUIBlock.button("reset", { 222 | resetColor(); updateImage() 223 | }) 224 | rndColorButton.setToolTipText("draw each color parameter at random from a standard normal distribution") 225 | resetColorButton.setToolTipText("set all color parameters to zero") 226 | 227 | def updateColor(n: Int, value: Int): Unit = { 228 | init = init.copy(momo = init.momo.copy(color = { 229 | val current = init.momo.color 230 | current.zipWithIndex.map { case (v, i) => if (i == n) sliderToParam(value) else v } 231 | })) 232 | } 233 | 234 | def randomColor(): Unit = { 235 | init = init.copy(momo = init.momo.copy(color = { 236 | val current = init.momo.color 237 | current.zipWithIndex.map { 238 | case (_, _) => 239 | rnd.scalaRandom.nextGaussian 240 | } 241 | 242 | })) 243 | setColorSliders() 244 | } 245 | 246 | def resetColor(): Unit = { 247 | init = init.copy(momo = init.momo.copy( 248 | color = IndexedSeq.fill(colorRank)(0.0) 249 | )) 250 | setColorSliders() 251 | } 252 | 253 | def setColorSliders(): Unit = { 254 | changingSliders = true 255 | (0 until colorRank).foreach(i => { 256 | colorSlider(i).setValue(paramToSlider(init.momo.color(i))) 257 | }) 258 | changingSliders = false 259 | } 260 | 261 | //--- EXPRESSION ----- 262 | val expSlider: IndexedSeq[JSlider] = for (n <- 0 until expRank)yield { 263 | GUIBlock.slider(-sliderSteps, sliderSteps, 0, f => { 264 | updateExpression(n, f) 265 | updateImage() 266 | }) 267 | } 268 | 269 | val expSliderView: JPanel = GUIBlock.shelf(expSlider.zipWithIndex.map(s => GUIBlock.stack(s._1, new JLabel("" + s._2))): _*) 270 | val expScrollPane = new JScrollPane(expSliderView) 271 | val expScrollBar: JScrollBar = expScrollPane.createVerticalScrollBar() 272 | expScrollPane.setSize(800, 300) 273 | expScrollPane.setPreferredSize(new Dimension(800, 300)) 274 | 275 | val rndExpButton: JButton = GUIBlock.button("random", { 276 | randomExpression(); updateImage() 277 | }) 278 | val resetExpButton: JButton = GUIBlock.button("reset", { 279 | resetExpression(); updateImage() 280 | }) 281 | 282 | rndExpButton.setToolTipText("draw each expression parameter at random from a standard normal distribution") 283 | resetExpButton.setToolTipText("set all expression parameters to zero") 284 | 285 | def updateExpression(n: Int, value: Int): Unit = { 286 | init = init.copy(momo = init.momo.copy(expression = { 287 | val current = init.momo.expression 288 | current.zipWithIndex.map { case (v, i) => if (i == n) sliderToParam(value) else v } 289 | })) 290 | } 291 | 292 | def randomExpression(): Unit = { 293 | init = init.copy(momo = init.momo.copy(expression = { 294 | val current = init.momo.expression 295 | current.zipWithIndex.map { 296 | case (_, _) => 297 | rnd.scalaRandom.nextGaussian 298 | } 299 | 300 | })) 301 | setExpSliders() 302 | } 303 | 304 | def resetExpression(): Unit = { 305 | init = init.copy(momo = init.momo.copy( 306 | expression = IndexedSeq.fill(expRank)(0.0) 307 | )) 308 | setExpSliders() 309 | } 310 | 311 | def setExpSliders(): Unit = { 312 | changingSliders = true 313 | (0 until expRank).foreach(i => { 314 | 
expSlider(i).setValue(paramToSlider(init.momo.expression(i))) 315 | }) 316 | changingSliders = false 317 | } 318 | 319 | 320 | //--- ALL TOGETHER ----- 321 | val randomButton: JButton = GUIBlock.button("random", { 322 | randomShape(); randomColor(); randomExpression(); updateImage() 323 | }) 324 | val resetButton: JButton = GUIBlock.button("reset", { 325 | resetShape(); resetColor(); resetExpression(); updateImage() 326 | }) 327 | 328 | val toggleExpressionButton: JButton = GUIBlock.button("expressions off", { 329 | if ( model.hasExpressions ) { 330 | if ( showExpressionModel ) renderer = AlbedoMoMoRenderer(model.neutralModel, 10.0).cached(5) 331 | else renderer = AlbedoMoMoRenderer(model, 10.0).cached(5) 332 | 333 | showExpressionModel = !showExpressionModel 334 | updateToggleExpressioButton() 335 | addRemoveExpressionTab() 336 | updateImage() 337 | } 338 | }) 339 | 340 | def updateToggleExpressioButton(): Unit = { 341 | if ( showExpressionModel ) toggleExpressionButton.setText("expressions off") 342 | else toggleExpressionButton.setText("expressions on") 343 | } 344 | 345 | randomButton.setToolTipText("draw each model parameter at random from a standard normal distribution") 346 | resetButton.setToolTipText("set all model parameters to zero") 347 | toggleExpressionButton.setToolTipText("toggle expression part of model on and off") 348 | 349 | //function to export the current shown face as a .ply file 350 | def exportShapeDiffuse (): Try[Unit] ={ 351 | 352 | def askToOverwrite(file: File): Boolean = { 353 | val dialogButton = JOptionPane.YES_NO_OPTION 354 | JOptionPane.showConfirmDialog(null, s"Would you like to overwrite the existing file: $file?","Warning",dialogButton) == JOptionPane.YES_OPTION 355 | } 356 | 357 | val VCM3D = if (model.hasExpressions && !showExpressionModel) model.neutralModel.instance(init.momo.coefficients) 358 | else model.instance(init.momo.coefficients) 359 | 360 | val fc = new JFileChooser() 361 | fc.setFileSelectionMode(JFileChooser.FILES_AND_DIRECTORIES) 362 | fc.setDialogTitle("Select a folder to store the .ply file and name it") 363 | if (fc.showSaveDialog(null) == JFileChooser.APPROVE_OPTION) { 364 | var file = fc.getSelectedFile 365 | if (file.isDirectory) file = new File(file,"instance.ply") 366 | if ( !file.getName.endsWith(".ply")) file = new File( file+".ply") 367 | if (!file.exists() || askToOverwrite(file)) { 368 | MeshIO.write(VCM3D.diffuseAlbedoMesh(), file) 369 | } else { 370 | Failure(new IOException(s"Something went wrong when writing to file the file $file.")) 371 | } 372 | } else { 373 | Failure(new Exception("User aborted save dialog.")) 374 | } 375 | } 376 | 377 | def exportShapeSpecular (): Try[Unit] ={ 378 | 379 | def askToOverwrite(file: File): Boolean = { 380 | val dialogButton = JOptionPane.YES_NO_OPTION 381 | JOptionPane.showConfirmDialog(null, s"Would you like to overwrite the existing file: $file?","Warning",dialogButton) == JOptionPane.YES_OPTION 382 | } 383 | 384 | val VCM3D = if (model.hasExpressions && !showExpressionModel) model.neutralModel.instance(init.momo.coefficients) 385 | else model.instance(init.momo.coefficients) 386 | 387 | val fc = new JFileChooser() 388 | fc.setFileSelectionMode(JFileChooser.FILES_AND_DIRECTORIES) 389 | fc.setDialogTitle("Select a folder to store the .ply file and name it") 390 | if (fc.showSaveDialog(null) == JFileChooser.APPROVE_OPTION) { 391 | var file = fc.getSelectedFile 392 | if (file.isDirectory) file = new File(file,"instance.ply") 393 | if ( !file.getName.endsWith(".ply")) file = new File( 
file+".ply") 394 | if (!file.exists() || askToOverwrite(file)) { 395 | MeshIO.write(VCM3D.specularAlbedoMesh(), file) 396 | } else { 397 | Failure(new IOException(s"Something went wrong when writing to file the file $file.")) 398 | } 399 | } else { 400 | Failure(new Exception("User aborted save dialog.")) 401 | } 402 | } 403 | 404 | //function to export the current shown face as a .ply file 405 | def exportImage (): Try[Unit] ={ 406 | 407 | def askToOverwrite(file: File): Boolean = { 408 | val dialogButton = JOptionPane.YES_NO_OPTION 409 | JOptionPane.showConfirmDialog(null, s"Would you like to overwrite the existing file: $file?","Warning",dialogButton) == JOptionPane.YES_OPTION 410 | } 411 | 412 | val img = applyGamma(renderer.renderImage(init)) 413 | 414 | val fc = new JFileChooser() 415 | fc.setFileSelectionMode(JFileChooser.FILES_AND_DIRECTORIES) 416 | fc.setDialogTitle("Select a folder to store the .png file and name it") 417 | if (fc.showSaveDialog(null) == JFileChooser.APPROVE_OPTION) { 418 | var file = fc.getSelectedFile 419 | if (file.isDirectory) file = new File(file,"instance.png") 420 | if ( !file.getName.endsWith(".png")) file = new File( file+".png") 421 | if (!file.exists() || askToOverwrite(file)) { 422 | PixelImageIO.write(img, file) 423 | } else { 424 | Failure(new IOException(s"Something went wrong when writing to file the file $file.")) 425 | } 426 | } else { 427 | Failure(new Exception("User aborted save dialog.")) 428 | } 429 | } 430 | 431 | //exportShape button and its tooltip 432 | val exportShapeDiffuseButton: JButton = GUIBlock.button("export Diffuse PLY", 433 | { 434 | exportShapeDiffuse() 435 | } 436 | ) 437 | exportShapeDiffuseButton.setToolTipText("export the current shape and texture as .ply") 438 | 439 | val exportShapeSpecularButton: JButton = GUIBlock.button("export Specular PLY", 440 | { 441 | exportShapeSpecular() 442 | } 443 | ) 444 | exportShapeSpecularButton.setToolTipText("export the current shape and texture as .ply") 445 | 446 | //exportImage button and its tooltip 447 | val exportImageButton: JButton = GUIBlock.button("export PNG", 448 | { 449 | exportImage() 450 | } 451 | ) 452 | exportImageButton.setToolTipText("export the current image as .png") 453 | 454 | 455 | //loads parameters from file 456 | //TODO: load other parameters than the momo shape, expr and color 457 | 458 | def askUserForRPSFile(dir: File): Option[File] = { 459 | val jFileChooser = new JFileChooser(dir) 460 | if (jFileChooser.showOpenDialog(null) == JFileChooser.APPROVE_OPTION) { 461 | Some(jFileChooser.getSelectedFile) 462 | } else { 463 | println("No Parameters select...") 464 | None 465 | } 466 | } 467 | 468 | def resizeParameterSequence(params: IndexedSeq[Double], length: Int, fill: Double): IndexedSeq[Double] = { 469 | val zeros = IndexedSeq.fill[Double](length)(fill) 470 | (params ++ zeros).slice(0, length) //brute force 471 | } 472 | 473 | def updateModelParameters(params: RenderParameter): Unit = { 474 | val newShape = resizeParameterSequence(params.momo.shape, shapeRank, 0) 475 | val newColor = resizeParameterSequence(params.momo.color, colorRank, 0) 476 | val newExpr = resizeParameterSequence(params.momo.expression, expRank, 0) 477 | println("Loaded Parameters") 478 | 479 | init = init.copy(momo = init.momo.copy(shape = newShape, color = newColor, expression = newExpr)) 480 | setShapeSliders() 481 | setColorSliders() 482 | setExpSliders() 483 | updateImage() 484 | } 485 | 486 | val loadButton: JButton = GUIBlock.button( 487 | "load RPS", 488 | { 489 | for {rpsFile <- 
askUserForRPSFile(new File(".")) 490 | rpsParams <- RenderParameterIO.read(rpsFile)} { 491 | val maxSigma = (rpsParams.momo.shape ++ rpsParams.momo.color ++ rpsParams.momo.expression).map(math.abs).max 492 | if ( maxSigma > maximalSigma ) { 493 | maximalSigma = math.ceil(maxSigma).toInt 494 | maximalSigmaSpinner.setValue(maximalSigma) 495 | setShapeSliders() 496 | setColorSliders() 497 | setExpSliders() 498 | } 499 | updateModelParameters(rpsParams) 500 | } 501 | } 502 | ) 503 | 504 | 505 | //---- update the image 506 | def updateImage(): Unit = { 507 | if (!changingSliders) 508 | imageWindow.updateImage(renderWithBG(init)) 509 | } 510 | 511 | def renderWithBG(init: RenderParameter): PixelImage[RGB] = { 512 | val fg = applyGamma(renderer.renderImage(init)) 513 | fg.zip(bg).map { case (f, b) => b.toRGB.blend(f) } 514 | // fg.map(_.toRGB) 515 | } 516 | 517 | //--- COMPOSE FRAME ------ 518 | val controls = new JTabbedPane() 519 | controls.addTab("color", GUIBlock.stack(colorScrollPane, GUIBlock.shelf(rndColorButton, resetColorButton))) 520 | controls.addTab("shape", GUIBlock.stack(shapeScrollPane, GUIBlock.shelf(rndShapeButton, resetShapeButton))) 521 | if ( model.hasExpressions) 522 | controls.addTab("expression", GUIBlock.stack(expScrollPane, GUIBlock.shelf(rndExpButton, resetExpButton))) 523 | def addRemoveExpressionTab(): Unit = { 524 | if ( showExpressionModel ) { 525 | controls.addTab("expression", GUIBlock.stack(expScrollPane, GUIBlock.shelf(rndExpButton, resetExpButton))) 526 | } else { 527 | val idx = controls.indexOfTab("expression") 528 | if ( idx >= 0) controls.remove(idx) 529 | } 530 | } 531 | 532 | val guiFrame: GUIFrame = GUIBlock.stack( 533 | GUIBlock.shelf(imageWindow, 534 | GUIBlock.stack(controls, 535 | if (model.hasExpressions) { 536 | GUIBlock.shelf(maximalSigmaSpinner, randomButton, resetButton, toggleExpressionButton, loadButton, exportShapeDiffuseButton, exportShapeSpecularButton, exportImageButton) 537 | } else { 538 | GUIBlock.shelf(maximalSigmaSpinner, randomButton, resetButton, loadButton, exportShapeDiffuseButton, exportShapeSpecularButton, exportImageButton) 539 | } 540 | ) 541 | ) 542 | ).displayIn("AlbedoMoMo-Viewer") 543 | 544 | 545 | 546 | //--- ROTATION CONTROLS ------ 547 | 548 | import java.awt.event._ 549 | 550 | var lookAt = false 551 | imageWindow.requestFocusInWindow() 552 | 553 | imageWindow.addKeyListener(new KeyListener { 554 | override def keyTyped(e: KeyEvent): Unit = { 555 | } 556 | 557 | override def keyPressed(e: KeyEvent): Unit = { 558 | if (e.getKeyCode == KeyEvent.VK_CONTROL) lookAt = true 559 | } 560 | 561 | override def keyReleased(e: KeyEvent): Unit = { 562 | if (e.getKeyCode == KeyEvent.VK_CONTROL) lookAt = false 563 | } 564 | }) 565 | 566 | imageWindow.addMouseListener(new MouseListener { 567 | override def mouseExited(e: MouseEvent): Unit = {} 568 | 569 | override def mouseClicked(e: MouseEvent): Unit = { 570 | imageWindow.requestFocusInWindow() 571 | } 572 | 573 | override def mouseEntered(e: MouseEvent): Unit = {} 574 | 575 | override def mousePressed(e: MouseEvent): Unit = {} 576 | 577 | override def mouseReleased(e: MouseEvent): Unit = {} 578 | }) 579 | 580 | imageWindow.addMouseMotionListener(new MouseMotionListener { 581 | override def mouseMoved(e: MouseEvent): Unit = { 582 | if (lookAt) { 583 | val x = e.getX 584 | val y = e.getY 585 | val yawPose = math.Pi / 2 * (x - imageWidth * 0.5) / (imageWidth / 2) 586 | val pitchPose = math.Pi / 2 * (y - imageHeight * 0.5) / (imageHeight / 2) 587 | 588 | init = init.copy(pose = 
init.pose.copy(yaw = yawPose, pitch = pitchPose)) 589 | updateImage() 590 | } 591 | } 592 | 593 | override def mouseDragged(e: MouseEvent): Unit = {} 594 | }) 595 | 596 | } 597 | -------------------------------------------------------------------------------- /scala/src/main/scala/faces/lib/AlbedoMoMoIO.scala: -------------------------------------------------------------------------------- 1 | package faces.lib 2 | 3 | /* 4 | * 5 | * Licensed under the Apache License, Version 2.0 (the "License"); 6 | * you may not use this file except in compliance with the License. 7 | * You may obtain a copy of the License at 8 | * 9 | * http://www.apache.org/licenses/LICENSE-2.0 10 | * 11 | * Unless required by applicable law or agreed to in writing, software 12 | * distributed under the License is distributed on an "AS IS" BASIS, 13 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | * See the License for the specific language governing permissions and 15 | * limitations under the License. 16 | * 17 | * This code was adapted from 18 | * https://github.com/unibas-gravis/scalismo-faces 19 | * Copyright University of Basel, Graphics and Vision Research Group 20 | 21 | */ 22 | 23 | import java.io.{ByteArrayInputStream, ByteArrayOutputStream, File} 24 | import java.net.URI 25 | import java.nio.charset.StandardCharsets 26 | import java.util.Calendar 27 | import java.util.Map.Entry 28 | 29 | import breeze.linalg.{DenseMatrix, DenseVector} 30 | import scalismo.common.{DiscreteDomain, PointId, UnstructuredPointsDomain, Vectorizer} 31 | import scalismo.color.RGB 32 | import scalismo.faces.io.TLMSLandmarksIO 33 | import scalismo.faces.momo.PancakeDLRGP 34 | import scalismo.faces.utils.ResourceManagement 35 | import scalismo.geometry.{EuclideanVector, Landmark, Point, _3D} 36 | import scalismo.io.{HDF5File, HDF5Utils, LandmarkIO, NDArray} 37 | import scalismo.mesh.{TriangleCell, TriangleList, TriangleMesh3D} 38 | import scalismo.statisticalmodel.{AlbedoModelHelpers, DiscreteLowRankGaussianProcess} 39 | 40 | import scala.io.Source 41 | import scala.util.{Failure, Success, Try} 42 | 43 | /** 44 | * IO for Morphable Models 45 | * 46 | */ 47 | object AlbedoMoMoIO { 48 | 49 | /** cache to hold open models, identified by their URI */ 50 | private val cacheSizeHint = 10 51 | private val openMoMos = new java.util.LinkedHashMap[URI, AlbedoMoMo](cacheSizeHint, 0.75f, false) { 52 | override def removeEldestEntry(eldest: Entry[URI, AlbedoMoMo]): Boolean = size() > cacheSizeHint 53 | } 54 | 55 | /** 56 | * clears all cached models 57 | * */ 58 | def clearURICache(): Unit = openMoMos.synchronized{ openMoMos.clear() } 59 | 60 | /** 61 | * load a Morphable Model from an URI (cached) 62 | * 63 | * @param uri URI of model to open, must be a file currently, use "#" to indicate the path inside the file 64 | * @return 65 | */ 66 | def read(uri: URI): Try[AlbedoMoMo] = Try { 67 | if (uri.getScheme != "file") throw new RuntimeException("Only supports loading from file:// URI") 68 | 69 | // separate model file and model path inside file 70 | val modelURI = new URI(uri.getScheme + ":" + uri.getSchemeSpecificPart) 71 | val modelPath = Option(uri.getFragment).getOrElse("/") 72 | 73 | // map access 74 | var cachedModel: Option[AlbedoMoMo] = None 75 | 76 | // look in cache 77 | openMoMos.synchronized { 78 | cachedModel = Option(openMoMos.get(uri)) 79 | } 80 | 81 | // open if necessary 82 | cachedModel.getOrElse { 83 | val model = read(new File(modelURI), modelPath).get 84 | openMoMos.synchronized { 85 | 
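/* cache the freshly loaded model; the LinkedHashMap above evicts its eldest entry once more than cacheSizeHint models are open */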
openMoMos.put(uri, model) 86 | } 87 | model 88 | } 89 | } 90 | 91 | /** 92 | * load a Morphable Model from a file 93 | * 94 | * @param file file to open 95 | * @param modelPath path of the model inside file 96 | * @return 97 | */ 98 | def read(file: File, modelPath: String = "/"): Try[AlbedoMoMo] = { 99 | // open HDF5 at path 100 | ResourceManagement.usingTry(HDF5Utils.openFileForReading(file)) { h5file: HDF5File => 101 | readFromHDF5(h5file, modelPath) 102 | } 103 | } 104 | 105 | /** 106 | * load a Morphable Model from an open HDF5 file 107 | * 108 | * @param h5file HDF5 file 109 | * @param modelPath path of the model inside file 110 | * @return 111 | */ 112 | def readFromHDF5(h5file: HDF5File, modelPath: String): Try[AlbedoMoMo] = Try { 113 | val path = AlbedoMoMoPathBuilder(modelPath) 114 | 115 | val shapeMesh = readGravisModelRepresenter(h5file, path.shape).get 116 | val shapeModel = readStatisticalModel3D[Point[_3D]](h5file, path.shape, shapeMesh.pointSet).get 117 | 118 | val diffuseAlbedoMesh = readGravisModelRepresenter(h5file, path.diffuseAlbedo).get 119 | val diffuseAlbedoModel = readStatisticalModel3D[RGB](h5file, path.diffuseAlbedo, diffuseAlbedoMesh.pointSet).get 120 | 121 | val specularAlbedoMesh = readGravisModelRepresenter(h5file, path.specularAlbedo).get 122 | val specularAlbedoModel = readStatisticalModel3D[RGB](h5file, path.specularAlbedo, specularAlbedoMesh.pointSet).get 123 | 124 | 125 | // check shape, color and expression compatibility 126 | if (shapeMesh.pointSet != diffuseAlbedoMesh.pointSet) 127 | throw new Exception("shape and diffuse model do not share a domain, different underlying point sets") 128 | 129 | if (shapeMesh.pointSet != specularAlbedoMesh.pointSet) 130 | throw new Exception("shape and specular model do not share a domain, different underlying point sets") 131 | 132 | // extract landmarks 133 | val landmarks = readLandmarks(h5file, path.landmarks).getOrElse(Map.empty[String, Landmark[_3D]]) 134 | 135 | // expression model defaults to empty model 136 | if (h5file.exists(path.expression)) { 137 | val expressionMesh = readGravisModelRepresenter(h5file, path.expression).get 138 | val expressionModel = readStatisticalModel3D[EuclideanVector[_3D]](h5file, path.expression, expressionMesh.pointSet).get 139 | if (shapeMesh.pointSet != expressionMesh.pointSet) 140 | throw new Exception("expression model does not share a domain, different underlying point sets") 141 | AlbedoMoMo(shapeMesh, shapeModel, diffuseAlbedoModel, specularAlbedoModel, expressionModel, landmarks) 142 | } 143 | else 144 | AlbedoMoMo(shapeMesh, shapeModel, diffuseAlbedoModel, specularAlbedoModel, landmarks) 145 | } 146 | 147 | /** 148 | * write a Morphable Model to a file, creates a HDF5 file 149 | * 150 | * @param file File object pointing to new file, will be created 151 | * @param modelPath path inside HDF5 file to put the model */ 152 | def write(model: AlbedoMoMo, file: File, modelPath: String = "/"): Try[Unit] = Try { 153 | ResourceManagement.using(HDF5Utils.createFile(file).get) { h5file: HDF5File => 154 | writeToHDF5(model, h5file, modelPath).get 155 | } 156 | } 157 | 158 | /** 159 | * write a Morphable Model into an open HDF5 file 160 | * 161 | * @param h5file File object pointing to open HDF5 file 162 | * @param modelPath path inside HDF5 file to put model (leave empty for default) */ 163 | def writeToHDF5(momo: AlbedoMoMo, h5file: HDF5File, modelPath: String): Try[Unit] = Try { 164 | val path = AlbedoMoMoPathBuilder(modelPath) 165 | 166 | // version information, 0.9 is our C++ 
standard format 167 | h5file.createGroup(path.version).get 168 | h5file.writeInt(path.majorVersion, 0).get 169 | h5file.writeInt(path.minorVersion, 9).get 170 | 171 | // catalog entries 172 | writeCatalog(momo, h5file, path).get 173 | 174 | // write model 175 | momo match { 176 | case albedoMoMoExpress: AlbedoMoMoExpress => writeAlbedoMoMoExpress(albedoMoMoExpress, h5file, path).get 177 | case albedoMoMo: AlbedoMoMoBasic => writeAlbedoMoMoBasic(albedoMoMo, h5file, path).get 178 | case modelType => throw new IllegalArgumentException("cannot write model of type: " + modelType.getClass.getName) 179 | } 180 | 181 | // write meta data 182 | h5file.createGroup(path.metadata).get 183 | 184 | // write landmarks 185 | val landmarksPath = path.landmarks 186 | h5file.createGroup(landmarksPath).get 187 | writeLandmarks(momo.landmarks, h5file, landmarksPath).get 188 | } 189 | 190 | // -- private detail writers / readers -- 191 | 192 | private def writeCatalog(momo: AlbedoMoMo, h5file: HDF5File, path: AlbedoMoMoPathBuilder): Try[Unit] = Try { 193 | h5file.createGroup(path.catalog).get 194 | // Morphable Model entry 195 | val catalogMorphableModelPath : String = path.catalog + "/MorphableModel" 196 | h5file.createGroup(catalogMorphableModelPath).get 197 | h5file.writeString(catalogMorphableModelPath + "/modelPath", "/").get 198 | h5file.writeString(catalogMorphableModelPath + "/modelType", "CUSTOM_MODEL").get 199 | 200 | // Shape model entry 201 | val catalogShapePath = catalogMorphableModelPath + ".shape" 202 | h5file.createGroup(catalogShapePath).get 203 | h5file.writeString(catalogShapePath + "/modelType", "POLYGON_MESH_MODEL").get 204 | h5file.writeString(catalogShapePath + "/modelPath", path.shape).get 205 | 206 | // Diffuse Albedo model entry 207 | val catalogDiffuseAlbedoPath = catalogMorphableModelPath + ".diffuseAlbedo" 208 | h5file.createGroup(catalogDiffuseAlbedoPath).get 209 | h5file.writeString(catalogDiffuseAlbedoPath + "/modelType", "POLYGON_MESH_DATA_MODEL").get 210 | h5file.writeString(catalogDiffuseAlbedoPath + "/modelPath", "/diffuseAlbedo").get 211 | 212 | // Specular Albedo model entry 213 | val catalogSpecularAlbedoPath = catalogMorphableModelPath + ".specularAlbedo" 214 | h5file.createGroup(catalogSpecularAlbedoPath).get 215 | h5file.writeString(catalogSpecularAlbedoPath + "/modelType", "POLYGON_MESH_DATA_MODEL").get 216 | h5file.writeString(catalogSpecularAlbedoPath + "/modelPath", "/specularAlbedo").get 217 | 218 | 219 | if (momo.hasExpressions) { 220 | // catalog: expression model entry 221 | val catalogExpressionPath = path.catalog + "/MorphableModel.expression" 222 | h5file.createGroup(catalogExpressionPath).get 223 | h5file.writeString(catalogExpressionPath + "/modelType", "POLYGON_MESH_DATA_MODEL").get 224 | h5file.writeString(catalogExpressionPath + "/modelPath", "/expression").get 225 | } 226 | } 227 | 228 | /** write a Morphable Model with expression into an opened HDF5 file */ 229 | private def writeAlbedoMoMoExpress(momo: AlbedoMoMoExpress, h5file: HDF5File, path: AlbedoMoMoPathBuilder): Try[Unit] = Try { 230 | // write shape model 231 | val shapePath = path.shape 232 | h5file.createGroup(shapePath).get 233 | writeStatisticalModel(momo.shape, h5file, shapePath).get 234 | writeShapeRepresenter(momo.referenceMesh, h5file, shapePath).get 235 | 236 | // write diffuse albedo model 237 | val diffuseAlbedoPath = path.diffuseAlbedo 238 | h5file.createGroup(diffuseAlbedoPath).get 239 | writeStatisticalModel(momo.diffuseAlbedo, h5file, diffuseAlbedoPath).get 240 | 
writeDiffuseAlbedoRepresenter(momo.referenceMesh, h5file, diffuseAlbedoPath).get 241 | 242 | // write specular albedo model 243 | val specularAlbedoPath = path.specularAlbedo 244 | h5file.createGroup(specularAlbedoPath).get 245 | writeStatisticalModel(momo.specularAlbedo, h5file, specularAlbedoPath).get 246 | writeSpecularAlbedoRepresenter(momo.referenceMesh, h5file, specularAlbedoPath).get 247 | 248 | // expression model 249 | val expressionPath = path.expression 250 | h5file.createGroup(expressionPath).get 251 | writeStatisticalModel(momo.expression, h5file, expressionPath).get 252 | writeExpressionRepresenter(momo.referenceMesh, h5file, expressionPath).get 253 | } 254 | 255 | /** write a Morphable Model without expressions into an opened HDF5 file */ 256 | private def writeAlbedoMoMoBasic(momo: AlbedoMoMoBasic, h5file: HDF5File, path: AlbedoMoMoPathBuilder): Try[Unit] = Try { 257 | // write shape model 258 | val shapePath = path.shape 259 | h5file.createGroup(shapePath).get 260 | writeStatisticalModel(momo.shape, h5file, shapePath).get 261 | writeShapeRepresenter(momo.referenceMesh, h5file, shapePath).get 262 | 263 | // write diffuse albedo model 264 | val diffuseAlbedoPath = path.diffuseAlbedo 265 | h5file.createGroup(diffuseAlbedoPath).get 266 | writeStatisticalModel(momo.diffuseAlbedo, h5file, diffuseAlbedoPath).get 267 | writeDiffuseAlbedoRepresenter(momo.referenceMesh, h5file, diffuseAlbedoPath).get 268 | 269 | // write specular albedo model 270 | val specularAlbedoPath = path.specularAlbedo 271 | h5file.createGroup(specularAlbedoPath).get 272 | writeStatisticalModel(momo.specularAlbedo, h5file, specularAlbedoPath).get 273 | writeSpecularAlbedoRepresenter(momo.referenceMesh, h5file, specularAlbedoPath).get 274 | } 275 | 276 | private def readGravisModelRepresenter(h5file: HDF5File, modelPath: String): Try[TriangleMesh3D] = Try { 277 | val representerPath = representerPathBuilder(modelPath) 278 | 279 | val representerName = h5file.readStringAttribute(representerPath, "name").get 280 | // read mesh according to type given in representer 281 | val mesh: Try[TriangleMesh3D] = representerName match { 282 | case "gravis::MeshShapeRepresenter" => readPolygonRepresenterMesh(h5file, modelPath) 283 | case "gravis::MeshDiffuseAlbedoRepresenter" => readPolygonRepresenterMesh(h5file, modelPath) 284 | case "gravis::MeshSpecularAlbedoRepresenter" => readPolygonRepresenterMesh(h5file, modelPath) 285 | case "gravis::MeshExpressionRepresenter" => readPolygonRepresenterMesh(h5file, modelPath) 286 | case "gravis::Mesh Shape Representer" => readLegacyRepresenterMesh(h5file, modelPath) 287 | case _ => h5file.readStringAttribute(representerPath, "datasetType") match { 288 | case Success("POLYGON_MESH") => readPolygonRepresenterMesh(h5file, modelPath) 289 | case Success("POLYGON_MESH_DATA") => readPolygonRepresenterMesh(h5file, modelPath) 290 | case Success(datasetType) => Failure(new Exception(s"can only read model of datasetType POLYGON_MESH or POLYGON_MESH_DATA. 
Got $datasetType instead")) 291 | case Failure(exc) => Failure(new RuntimeException("unknown representer format (no datasetType attribute)")) 292 | } 293 | } 294 | mesh.get 295 | } 296 | 297 | private def readLegacyRepresenterMesh(h5file: HDF5File, modelPath: String): Try[TriangleMesh3D] = Try { 298 | val representerPath = representerPathBuilder(modelPath) 299 | 300 | val vertArray = h5file.readNDArray[Float]("%s/reference-mesh/vertex-coordinates".format(representerPath)).get 301 | if (vertArray.dims(1) != 3) 302 | throw new Exception("the representer points are not 3D points") 303 | 304 | val vertMat = ndFloatArrayToMatrix(vertArray) 305 | val points = for (i <- 0 until vertMat.rows) yield Point(vertMat(i, 0), vertMat(i, 1), vertMat(i, 2)) 306 | 307 | val cellArray = h5file.readNDArray[Int]("%s/reference-mesh/triangle-list".format(representerPath)).get 308 | if (cellArray.dims(1) != 3) 309 | throw new Exception("the representer cells are not triangles") 310 | 311 | val cellMat = ndIntArrayToMatrix(cellArray) 312 | val cells = for (i <- 0 until cellMat.rows) yield TriangleCell(PointId(cellMat(i, 0)), PointId(cellMat(i, 1)), PointId(cellMat(i, 2))) 313 | 314 | TriangleMesh3D(points, TriangleList(cells)) 315 | } 316 | 317 | private def readPolygonRepresenterMesh(h5file: HDF5File, modelPath: String): Try[TriangleMesh3D] = Try { 318 | val representerPath = representerPathBuilder(modelPath) 319 | 320 | val vertArray = h5file.readNDArray[Float]("%s/points".format(representerPath)).get 321 | if (vertArray.dims(0) != 3) 322 | throw new Exception("the representer points are not 3D points") 323 | 324 | val vertMat = ndFloatArrayToMatrix(vertArray) 325 | val points = for (i <- 0 until vertMat.cols) yield Point(vertMat(0, i), vertMat(1, i), vertMat(2, i)) 326 | 327 | val cellArray = h5file.readNDArray[Int]("%s/cells".format(representerPath)).get 328 | if (cellArray.dims(0) != 3) 329 | throw new Exception("the representer cells are not triangles") 330 | 331 | val cellMat = ndIntArrayToMatrix(cellArray) 332 | val cells = for (i <- 0 until cellMat.cols) yield TriangleCell(PointId(cellMat(0, i)), PointId(cellMat(1, i)), PointId(cellMat(2, i))) 333 | 334 | TriangleMesh3D(points, TriangleList(cells)) 335 | } 336 | 337 | /** read a statistical statismo model from a HDF5 file at path modelPath */ 338 | private def readStatisticalModel3D[A](h5file: HDF5File, 339 | modelPath: String, 340 | referencePoints: UnstructuredPointsDomain[_3D]) 341 | (implicit vectorizer: Vectorizer[A]): Try[PancakeDLRGP[_3D, UnstructuredPointsDomain[_3D], A]] = { 342 | val path = StatisticalModelPathBuilder(modelPath) 343 | 344 | for { 345 | meanArray <- h5file.readNDArray[Float](path.mean) 346 | meanVector = DenseVector(meanArray.data.map{_.toDouble}) 347 | pcaBasisArray <- h5file.readNDArray[Float](path.pcaBasis) 348 | majorVersion <- h5file.readInt(path.majorVersion).recover { case (e: Exception) => 0 } 349 | minorVersion <- h5file.readInt(path.minorVersion).recover { case (e: Exception) => 8 } 350 | pcaVarianceArray <- h5file.readNDArray[Float](path.pcaVariance) 351 | pcaVarianceVector = DenseVector(pcaVarianceArray.data.map{_.toDouble}) 352 | noiseVarianceArray <- h5file.readArray[Float](path.noiseVariance) 353 | noiseVariance = noiseVarianceArray(0).toDouble 354 | pcaBasisMatrix = ndFloatArrayToMatrix(pcaBasisArray).map{_.toDouble} 355 | pcaBasis <- (majorVersion, minorVersion) match { 356 | case (1, _) => Success(pcaBasisMatrix) 357 | case (0, 9) => Success(pcaBasisMatrix) 358 | case (0, 8) => 
Success(extractOrthonormalPCABasisMatrix(pcaBasisMatrix, pcaVarianceVector)) // an old statismo version 359 | case v => Failure(new RuntimeException(s"Unsupported version ${v._1}.${v._2}")) 360 | } 361 | } yield { 362 | val dlgp: DiscreteLowRankGaussianProcess[_3D, UnstructuredPointsDomain[_3D], A] = AlbedoModelHelpers.buildFrom(referencePoints, meanVector, pcaVarianceVector, pcaBasis) 363 | PancakeDLRGP(dlgp, noiseVariance) 364 | } 365 | } 366 | 367 | private def readLandmarks(h5file: HDF5File, path: String): Try[Map[String, Landmark[_3D]]] = Try { 368 | if (h5file.exists(s"$path/json")) { 369 | // json reader, modern format 370 | val jsonString = h5file.readString(s"$path/json").get 371 | val lmList = LandmarkIO.readLandmarksJsonFromSource[_3D](Source.fromString(jsonString)).get 372 | lmList.map(lm => lm.id -> lm).toMap 373 | } else if (h5file.exists(s"$path/text")) { 374 | // legacy tlms text format 375 | val tlmsString = h5file.readString(s"$path/text").get 376 | val lmList = TLMSLandmarksIO.read3DFromStream(new ByteArrayInputStream(tlmsString.getBytes(StandardCharsets.UTF_8))).get 377 | lmList.map(lm => lm.id -> Landmark[_3D](lm.id, lm.point, None, None)).toMap 378 | } else 379 | throw new Exception("No landmarks present") 380 | } 381 | 382 | private def writeShapeRepresenter(mesh: TriangleMesh3D, h5file: HDF5File, modelPath: String): Try[Unit] = Try { 383 | val representerPath = representerPathBuilder(modelPath) 384 | h5file.createGroup(representerPath).get 385 | h5file.writeStringAttribute(representerPath, "name", "gravis::MeshShapeRepresenter").get 386 | h5file.writeStringAttribute(representerPath, "datasetType", "POLYGON_MESH").get 387 | writeTriangleMesh3D(mesh, h5file, representerPath).get 388 | } 389 | 390 | private def writeDiffuseAlbedoRepresenter(mesh: TriangleMesh3D, h5file: HDF5File, modelPath: String): Try[Unit] = Try { 391 | val representerPath = representerPathBuilder(modelPath) 392 | h5file.createGroup(representerPath).get 393 | h5file.writeStringAttribute(representerPath, "name", "gravis::MeshDiffuseAlbedoRepresenter").get 394 | h5file.writeStringAttribute(representerPath, "datasetType", "POLYGON_MESH").get 395 | h5file.writeString(representerPath + "/colorspace", "RGB" + '\u0000').get // ugly 0-termination string hack because statismo in C++ stores strings including 0 termination 396 | writeTriangleMesh3D(mesh, h5file, representerPath).get 397 | } 398 | 399 | private def writeSpecularAlbedoRepresenter(mesh: TriangleMesh3D, h5file: HDF5File, modelPath: String): Try[Unit] = Try { 400 | val representerPath = representerPathBuilder(modelPath) 401 | h5file.createGroup(representerPath).get 402 | h5file.writeStringAttribute(representerPath, "name", "gravis::MeshSpecularAlbedoRepresenter").get 403 | h5file.writeStringAttribute(representerPath, "datasetType", "POLYGON_MESH").get 404 | h5file.writeString(representerPath + "/colorspace", "RGB" + '\u0000').get // ugly 0-termination string hack because statismo in C++ stores strings including 0 termination 405 | writeTriangleMesh3D(mesh, h5file, representerPath).get 406 | } 407 | 408 | private def writeExpressionRepresenter(mesh: TriangleMesh3D, h5file: HDF5File, modelPath: String): Try[Unit] = Try { 409 | val representerPath = representerPathBuilder(modelPath) 410 | h5file.createGroup(representerPath).get 411 | h5file.writeStringAttribute(representerPath, "name", "gravis::MeshExpressionRepresenter").get 412 | h5file.writeStringAttribute(representerPath, "datasetType", "POLYGON_MESH").get 413 | writeTriangleMesh3D(mesh, h5file, 
representerPath).get 414 | } 415 | 416 | private def writeTriangleMesh3D(mesh: TriangleMesh3D, h5file: HDF5File, path: String): Try[Unit] = Try { 417 | h5file.createGroup(s"$path").get 418 | val points = mesh.pointSet.points.toIndexedSeq 419 | val cells = mesh.cells 420 | val pointData: IndexedSeq[Double] = points.map(_.x) ++ points.map(_.y) ++ points.map(_.z) 421 | h5file.writeNDArray[Float](s"$path/points", NDArray(IndexedSeq(3, points.size), pointData.toArray.map(_.toFloat))).get 422 | val cellData: IndexedSeq[PointId] = cells.map(_.ptId1) ++ cells.map(_.ptId2) ++ cells.map(_.ptId3) 423 | h5file.writeNDArray[Int](s"$path/cells", NDArray(IndexedSeq(3, cells.size), cellData.toArray.map(_.id))).get 424 | } 425 | 426 | private def writeStatisticalModel[A](model: PancakeDLRGP[_3D, UnstructuredPointsDomain[_3D], A], h5file: HDF5File, modelPath: String): Try[Unit] = Try { 427 | val path = StatisticalModelPathBuilder(modelPath) 428 | val mean = model.meanVector.map(_.toFloat) 429 | val variance = model.variance.map(_.toFloat) 430 | val pcaBasis = model.basisMatrix.map(_.toFloat) 431 | 432 | h5file.createGroup(path.version).get 433 | h5file.writeInt(path.majorVersion, 0).get 434 | h5file.writeInt(path.minorVersion, 9).get 435 | 436 | h5file.writeArray[Float](path.mean, mean.toArray).get 437 | h5file.writeArray[Float](path.noiseVariance, Array(model.noiseVariance.toFloat)).get 438 | h5file.writeNDArray[Float](path.pcaBasis, NDArray(IndexedSeq(pcaBasis.rows, pcaBasis.cols), pcaBasis.t.flatten(false).toArray)).get 439 | h5file.writeArray[Float](path.pcaVariance, variance.toArray).get 440 | h5file.writeString(path.buildTime, Calendar.getInstance.getTime.toString).get 441 | h5file.writeNDArray[Float](path.scores, NDArray(IndexedSeq(1, 1), Array(0.0f))).get 442 | } 443 | 444 | private def writeLandmarks(landmarks: Map[String, Landmark[_3D]], h5file: HDF5File, path: String): Try[Unit] = Try { 445 | val jsonStream = new ByteArrayOutputStream() 446 | LandmarkIO.writeLandmarksJsonToStream(landmarks.values.toList, jsonStream) 447 | h5file.writeString(s"$path/json", jsonStream.toString("UTF-8")) 448 | } 449 | 450 | 451 | /* 452 | Path builder to ensure we have the same paths when reading and writing 453 | */ 454 | private case class AlbedoMoMoPathBuilder(modelPath: String) { 455 | def catalog = s"$modelPath/catalog" 456 | 457 | def version = s"$modelPath/version" 458 | 459 | def majorVersion: String = "%s/majorVersion".format(version) 460 | 461 | def minorVersion: String = "%s/minorVersion".format(version) 462 | 463 | def shape = s"$modelPath/shape" 464 | 465 | def diffuseAlbedo = s"$modelPath/diffuseAlbedo" 466 | 467 | def specularAlbedo = s"$modelPath/specularAlbedo" 468 | 469 | def expression = s"$modelPath/expression" 470 | 471 | def metadata = s"$modelPath/metadata" 472 | 473 | def landmarks: String = "%s/landmarks".format(metadata) 474 | } 475 | 476 | private case class StatisticalModelPathBuilder(modelPath: String) { 477 | 478 | def version = s"$modelPath/version" 479 | 480 | def majorVersion: String = "%s/majorVersion".format(version) 481 | 482 | def minorVersion: String = "%s/minorVersion".format(version) 483 | 484 | def model = s"$modelPath/model" 485 | 486 | def mean: String = "%s/mean".format(model) 487 | 488 | def noiseVariance: String = "%s/noiseVariance".format(model) 489 | 490 | def pcaBasis: String = "%s/pcaBasis".format(model) 491 | 492 | def pcaVariance: String = "%s/pcaVariance".format(model) 493 | 494 | def modelinfo = s"$modelPath/modelinfo" 495 | 496 | def buildTime: String = 
"%s/build-time".format(modelinfo) 497 | 498 | def scores: String = "%s/scores".format(modelinfo) 499 | } 500 | 501 | private def representerPathBuilder(modelPath: String) = s"$modelPath/representer" 502 | 503 | /* 504 | Helpers to handle matrices and arrays. 505 | */ 506 | private def ndFloatArrayToMatrix(array: NDArray[Float]) = { 507 | // the data in ndarray is stored row-major, but DenseMatrix stores it column major. We therefore 508 | // do switch dimensions and transpose 509 | DenseMatrix.create(array.dims(1).toInt, array.dims(0).toInt, array.data).t 510 | } 511 | 512 | private def ndDoubleArrayToMatrix(array: NDArray[Double]) = { 513 | // the data in ndarray is stored row-major, but DenseMatrix stores it column major. We therefore 514 | // do switch dimensions and transpose 515 | DenseMatrix.create(array.dims(1).toInt, array.dims(0).toInt, array.data).t 516 | } 517 | 518 | private def ndIntArrayToMatrix(array: NDArray[Int]) = { 519 | // the data in ndarray is stored row-major, but DenseMatrix stores it column major. We therefore 520 | // do switch dimensions and transpose 521 | DenseMatrix.create(array.dims(1).toInt, array.dims(0).toInt, array.data).t 522 | } 523 | 524 | /* 525 | Helper function to handle old statismo format. 526 | */ 527 | private def extractOrthonormalPCABasisMatrix(pcaBasisMatrix: DenseMatrix[Double], pcaVarianceVector: DenseVector[Double]): DenseMatrix[Double] = { 528 | // this is an old statismo format, that has the pcaVariance directly stored in the PCA matrix, 529 | // i.e. pcaBasis = U * sqrt(lmbda), where U is a matrix of eigenvectors and lmbda the corresponding eigenvalues. 530 | // We recover U from it by normalizing all eigenvectors 531 | val U = DenseMatrix.zeros[Double](pcaBasisMatrix.rows, pcaBasisMatrix.cols) 532 | for (i <- 0 until pcaBasisMatrix.cols) { 533 | val pcVec: DenseVector[Double] = pcaBasisMatrix(::, i).toDenseVector 534 | U(::, i) := breeze.linalg.normalize(pcVec) 535 | } 536 | U 537 | } 538 | 539 | } 540 | 541 | -------------------------------------------------------------------------------- /scala/src/main/scala/faces/lib/AlbedoMoMoRenderer.scala: -------------------------------------------------------------------------------- 1 | package faces.lib 2 | 3 | /* 4 | * 5 | * Licensed under the Apache License, Version 2.0 (the "License"); 6 | * you may not use this file except in compliance with the License. 7 | * You may obtain a copy of the License at 8 | * 9 | * http://www.apache.org/licenses/LICENSE-2.0 10 | * 11 | * Unless required by applicable law or agreed to in writing, software 12 | * distributed under the License is distributed on an "AS IS" BASIS, 13 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | * See the License for the specific language governing permissions and 15 | * limitations under the License. 
16 | * 17 | * This code was adapted from 18 | * https://github.com/unibas-gravis/scalismo-faces 19 | * Copyright University of Basel, Graphics and Vision Research Group 20 | 21 | */ 22 | 23 | 24 | import java.io.File 25 | import java.net.URI 26 | 27 | import breeze.linalg.DenseVector 28 | import breeze.linalg.DenseVector 29 | import breeze.numerics.pow 30 | import scalismo.color.RGBA 31 | import scalismo.faces.parameters.{Camera, ColorTransform, DirectionalLight, Illumination, ImageSize, MoMoInstance, ParametricRenderer, Pose, RenderObject, RenderParameter, SphericalHarmonicsLight, ViewParameter} 32 | import scalismo.faces.sampling.face.{ParametricImageRenderer, ParametricLandmarksRenderer, ParametricMaskRenderer, ParametricMeshRenderer, ParametricModel} 33 | import scalismo.geometry.{EuclideanVector, EuclideanVector3D, Point, Point2D, Point3D, _3D} 34 | import scalismo.mesh.{BarycentricCoordinates, MeshSurfaceProperty, SurfacePointProperty, TriangleId, TriangleMesh3D, VertexColorMesh3D} 35 | import scalismo.utils.Memoize 36 | import scalismo.faces.image.{PixelImage, PixelImageOperations} 37 | import scalismo.faces.render.{Affine3D, PixelShader, PixelShaders, PointShader, TriangleFilters, TriangleRenderer, ZBuffer} 38 | import scalismo.faces.io.{MeshIO, MoMoIO, PixelImageIO, RenderParameterIO} 39 | import scalismo.faces.landmarks.TLMSLandmark2D 40 | import scalismo.faces.mesh.ColorNormalMesh3D 41 | import scalismo.faces.numerics.SphericalHarmonics 42 | import scalismo.faces.render.PixelShaders.{SphericalHarmonicsLambertShader, SphericalHarmonicsSpecularShader} 43 | 44 | import scala.reflect.ClassTag 45 | 46 | object SpecularRenderingTest extends App{ 47 | scalismo.initialize() 48 | 49 | val model = AlbedoMoMoIO.read(new File("/media/data/work/albedomodel/pipeline-data/data/modelbuilding/model/yamm2020_bfm_nomouth_combinedModel.h5")).get 50 | 51 | val renderer = AlbedoMoMoRenderer(model) 52 | 53 | val bip = new File("/media/data/work/Ilker/reducedBIP/data/parameters/").listFiles().take(10) 54 | 55 | bip.zipWithIndex.foreach(i =>{ 56 | 57 | val ill = RenderParameterIO.read(i._1).get 58 | val img = renderer.renderImage(RenderParameter.default.withMoMo(MoMoInstance(IndexedSeq(0.0), IndexedSeq(0.0), IndexedSeq(0.0), new URI(""))).fitToImageSize(1000,1000).withEnvironmentMap(ill.environmentMap)) 59 | PixelImageIO.write(img, new File("/media/temp/speculartest/test_"+i._2+".png")) 60 | }) 61 | val img = renderer.renderImage(RenderParameter.default.withMoMo(MoMoInstance(IndexedSeq(0.0), IndexedSeq(0.0), IndexedSeq(0.0), new URI(""))).fitToImageSize(1000,1000)) 62 | PixelImageIO.write(img, new File("/media/temp/speculartest/test.png")) 63 | } 64 | 65 | 66 | /** parametric renderer for a Morphable Model, implements all useful Parameteric*Renderer interfaces */ 67 | class AlbedoMoMoRenderer(val model: AlbedoMoMo, val specularExponent: Double, val clearColor: RGBA) 68 | extends ParametricImageRenderer[RGBA] 69 | with ParametricLandmarksRenderer 70 | with ParametricMaskRenderer 71 | with ParametricMeshRenderer 72 | with ParametricAlbedoModel { 73 | 74 | /** pad a coefficient vector if it is too short, basis with single vector */ 75 | private def padCoefficients(coefficients: DenseVector[Double], rank: Int): DenseVector[Double] = { 76 | require(coefficients.length <= rank, "too many coefficients for model") 77 | if (coefficients.length == rank) 78 | coefficients 79 | else 80 | DenseVector(coefficients.toArray ++ Array.fill(rank - coefficients.length)(0.0)) 81 | } 82 | 83 | /** create an instance of the 
model, in the original model's object coordinates */ 84 | override def instance(parameters: RenderParameter): VertexAlbedoMesh3D = { 85 | model.instance(parameters.momo.coefficients) 86 | } 87 | 88 | 89 | /** render the image described by the parameters */ 90 | override def renderImage(parameters: RenderParameter): PixelImage[RGBA] = { 91 | val inst = instance(parameters) 92 | val imgSpec =renderImageSpecular(parameters, inst) 93 | 94 | val imgDiffuse = renderImageDiffuse(parameters, inst) 95 | 96 | imgDiffuse.zip(imgSpec).map(p => (p._1+p._2).clamped) 97 | } 98 | 99 | def renderImageDiffuse(parameters: RenderParameter): PixelImage[RGBA] = { 100 | val inst = instance(parameters) 101 | ParametricRenderer.renderParameterVertexColorMesh( 102 | parameters, 103 | inst.diffuseAlbedoMesh(), 104 | clearColor) 105 | } 106 | 107 | def renderImageSpecular(parameters: RenderParameter): PixelImage[RGBA] = { 108 | val inst = instance(parameters) 109 | SpecularParametricRenderer.specularRenderParameterVertexColorMesh( 110 | parameters, 111 | inst, 112 | specularExponent, 113 | clearColor) 114 | } 115 | 116 | def renderImageDiffuse(parameters: RenderParameter, inst: VertexAlbedoMesh3D): PixelImage[RGBA] = { 117 | ParametricRenderer.renderParameterVertexColorMesh( 118 | parameters, 119 | inst.diffuseAlbedoMesh(), 120 | clearColor) 121 | } 122 | 123 | def renderImageSpecular(parameters: RenderParameter, inst: VertexAlbedoMesh3D): PixelImage[RGBA] = { 124 | SpecularParametricRenderer.specularRenderParameterVertexColorMesh( 125 | parameters, 126 | inst, 127 | specularExponent, 128 | clearColor) 129 | } 130 | 131 | 132 | /** render the mesh described by the parameters, draws instance from model and places properly in the world (world coordinates) */ 133 | override def renderMesh(parameters: RenderParameter): VertexColorMesh3D = { 134 | val t = parameters.pose.transform 135 | val mesh = instance(parameters) 136 | VertexColorMesh3D( 137 | mesh.shape.transform(p => t(p)), 138 | mesh.diffuseAlbedo 139 | ) 140 | } 141 | 142 | /** render landmark position in the image */ 143 | override def renderLandmark(lmId: String, parameter: RenderParameter): Option[TLMSLandmark2D] = { 144 | val renderer = parameter.renderTransform 145 | for { 146 | ptId <- model.landmarkPointId(lmId) 147 | lm3d <- Some(model.instanceAtPoint(parameter.momo.coefficients, ptId)._1) 148 | lmImage <- Some(renderer(lm3d)) 149 | } yield TLMSLandmark2D(lmId, Point(lmImage.x, lmImage.y), visible = true) 150 | } 151 | 152 | /** checks the availability of a named landmark */ 153 | override def hasLandmarkId(lmId: String): Boolean = model.landmarkPointId(lmId).isDefined 154 | 155 | 156 | /** get all available landmarks */ 157 | override def allLandmarkIds: IndexedSeq[String] = model.landmarks.keySet.toIndexedSeq 158 | 159 | 160 | /** render a mask defined on the model to image space */ 161 | override def renderMask(parameters: RenderParameter, mask: MeshSurfaceProperty[Int]): PixelImage[Int] = { 162 | val inst = instance(parameters) 163 | val maskImage = SpecularParametricRenderer.renderPropertyImage(parameters, inst.shape, mask) 164 | maskImage.map(_.getOrElse(0)) // 0 - invalid, outside rendering area 165 | } 166 | 167 | /** get a cached version of this renderer */ 168 | def cached(cacheSize: Int) = new AlbedoMoMoRenderer(model, specularExponent, clearColor) { 169 | private val imageRenderer = Memoize(super.renderImage, cacheSize) 170 | private val meshRenderer = Memoize(super.renderMesh, cacheSize) 171 | private val maskRenderer = 
Memoize((super.renderMask _).tupled, cacheSize) 172 | private val lmRenderer = Memoize((super.renderLandmark _).tupled, cacheSize * allLandmarkIds.length) 173 | private val instancer = Memoize(super.instance, cacheSize) 174 | 175 | override def renderImage(parameters: RenderParameter): PixelImage[RGBA] = imageRenderer(parameters) 176 | override def renderLandmark(lmId: String, parameter: RenderParameter): Option[TLMSLandmark2D] = lmRenderer((lmId, parameter)) 177 | override def renderMesh(parameters: RenderParameter): VertexColorMesh3D = meshRenderer(parameters) 178 | override def instance(parameters: RenderParameter): VertexAlbedoMesh3D = instancer(parameters) 179 | override def renderMask(parameters: RenderParameter, mask: MeshSurfaceProperty[Int]): PixelImage[Int] = maskRenderer((parameters, mask)) 180 | } 181 | } 182 | 183 | object AlbedoMoMoRenderer { 184 | def apply(model: AlbedoMoMo, specularExponent: Double, clearColor: RGBA) = new AlbedoMoMoRenderer(model, specularExponent, clearColor) 185 | def apply(model: AlbedoMoMo, specularExponent: Double) = new AlbedoMoMoRenderer(model, specularExponent, RGBA.BlackTransparent) 186 | def apply(model: AlbedoMoMo) = new AlbedoMoMoRenderer(model, 20.0, RGBA.BlackTransparent) 187 | 188 | } 189 | 190 | 191 | 192 | 193 | 194 | object SpecularParametricRenderer { 195 | /** 196 | * render a mesh with specified colors and normals according to scene description parameter 197 | * 198 | * @param parameter scene description 199 | * @param mesh mesh to render, has positions, colors and normals 200 | * @param clearColor background color of buffer 201 | * @return 202 | */ 203 | def specularRenderParameterMesh(parameter: RenderParameter, 204 | mesh: ColorNormalMesh3D, 205 | specular: SurfacePointProperty[RGBA], 206 | specularExp: Double, 207 | clearColor: RGBA = RGBA.BlackTransparent): PixelImage[RGBA] = { 208 | val buffer = ZBuffer(parameter.imageSize.width, parameter.imageSize.height, clearColor) 209 | 210 | val worldMesh = mesh.transform(parameter.modelViewTransform) 211 | val backfaceCullingFilter = TriangleFilters.backfaceCullingFilter(worldMesh.shape, parameter.view.eyePosition) 212 | 213 | def pixelShader(mesh: ColorNormalMesh3D, 214 | specular: SurfacePointProperty[RGBA], 215 | specularExp: Double 216 | ): PixelShader[RGBA] = { 217 | val worldMesh = mesh.transform(parameter.pose.transform) 218 | val ctRGB = parameter.colorTransform.transform 219 | val ct = (c: RGBA) => ctRGB(c.toRGB).toRGBA 220 | /* // default: both illumination sources active 221 | (environmentMap.shader(worldMesh, view.eyePosition) + directionalLight.shader(worldMesh, view.eyePosition)).map(ct) 222 | // old compatibility 223 | if (environmentMap.nonEmpty)*/ 224 | val environmentMap = SpecularSphericalHarmonicsLight(parameter.environmentMap.coefficients) 225 | environmentMap.shader(worldMesh, parameter.view.eyePosition, specular, specularExp).map(ct) 226 | /* else 227 | directionalLight.shader(worldMesh, view.eyePosition).map(ct)*/ 228 | } 229 | 230 | TriangleRenderer.renderMesh( 231 | mesh.shape, 232 | backfaceCullingFilter, 233 | parameter.pointShader, 234 | parameter.imageSize.screenTransform, 235 | pixelShader(mesh, specular, specularExp), 236 | buffer).toImage 237 | } 238 | 239 | /** 240 | * render according to parameters, convenience for vertex color mesh with vertex normals 241 | * 242 | * @param parameter scene description 243 | * @param mesh mesh to render, vertex color, vertex normals 244 | * @param clearColor background color of buffer 245 | * @return 246 | */ 247 | def 
specularRenderParameterVertexColorMesh(parameter: RenderParameter, 248 | mesh: VertexAlbedoMesh3D, 249 | specularExp: Double, 250 | clearColor: RGBA = RGBA.BlackTransparent): PixelImage[RGBA] = { 251 | specularRenderParameterMesh(parameter, ColorNormalMesh3D(VertexColorMesh3D(mesh.shape, mesh.diffuseAlbedo)), mesh.specularAlbedo, specularExp, clearColor) 252 | } 253 | 254 | 255 | /** 256 | * render an arbitrary property on the mesh into buffer (rasterization) 257 | * 258 | * @param renderParameter scene description 259 | * @param mesh mesh to render, positions 260 | * @param property surface property to rasterize 261 | * @tparam A type of surface property 262 | * @return 263 | */ 264 | def renderPropertyImage[A: ClassTag](renderParameter: RenderParameter, 265 | mesh: TriangleMesh3D, 266 | property: MeshSurfaceProperty[A]): PixelImage[Option[A]] = { 267 | TriangleRenderer.renderPropertyImage(mesh, 268 | renderParameter.pointShader, 269 | property, 270 | renderParameter.imageSize.width, 271 | renderParameter.imageSize.height 272 | ) 273 | } 274 | 275 | } 276 | 277 | 278 | case class SpecularSphericalHarmonicsLight(coefficients: IndexedSeq[EuclideanVector[_3D]]) { 279 | require(coefficients.isEmpty || SphericalHarmonics.totalCoefficients(bands) == coefficients.length, "invalid length of coefficients to build SphericalHarmonicsLight") 280 | 281 | def shader(worldMesh: ColorNormalMesh3D, eyePosition: Point[_3D], specular: SurfacePointProperty[RGBA], specularExp: Double): SphericalHarmonicsSpecularMapsShader = shader(worldMesh, specular, specularExp) 282 | 283 | /** SH shader for a given mesh and this environment map */ 284 | def shader(worldMesh: ColorNormalMesh3D, specular: SurfacePointProperty[RGBA], specularExp: Double): SphericalHarmonicsSpecularMapsShader = { 285 | val positionsWorld = SurfacePointProperty[Point[_3D]](worldMesh.shape.triangulation, worldMesh.shape.pointSet.points.toIndexedSeq) 286 | 287 | SphericalHarmonicsSpecularMapsShader(specular, coefficients, worldMesh.normals, positionsWorld, specularExp ) 288 | } 289 | 290 | /** number of bands */ 291 | def bands: Int = SphericalHarmonics.numberOfBandsForCoefficients(coefficients.size) 292 | 293 | } 294 | 295 | case class SphericalHarmonicsSpecularMapsShader(specularAlbedo: MeshSurfaceProperty[RGBA], 296 | environmentMap: IndexedSeq[EuclideanVector[_3D]], 297 | normalsWorld: MeshSurfaceProperty[EuclideanVector[_3D]], 298 | positionsWorld: MeshSurfaceProperty[Point[_3D]], specularExp: Double 299 | ) extends PixelShader[RGBA] { 300 | override def apply(triangleId: TriangleId, 301 | worldBCC: BarycentricCoordinates, 302 | screenCoordinates: Point[_3D]): RGBA = { 303 | SphericalHarmonicsSpecularMapsShader.specularPart(positionsWorld(triangleId, worldBCC), normalsWorld(triangleId, worldBCC), specularAlbedo(triangleId, worldBCC), environmentMap, specularExp) 304 | } 305 | } 306 | 307 | object SphericalHarmonicsSpecularMapsShader { 308 | def specularPart(posWorld: Point[_3D], normalWorld: EuclideanVector[_3D], specularAlbedo: RGBA, environmentMap: IndexedSeq[EuclideanVector[_3D]], specularExp: Double): RGBA = { 309 | /** BRDF: find outgoing vector for given view vector (reflect) */ 310 | val vecToEye: EuclideanVector3D = (Point3D(0, 0, 0) - posWorld).normalize 311 | val viewAdjusted = vecToEye * vecToEye.dot(normalWorld) 312 | val reflected = normalWorld + (normalWorld - viewAdjusted) 313 | /** Read out value in environment map in out direction for specularity */ 314 | val lightInOutDirection = 
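/* evaluate the spherical harmonics environment map in the reflected direction; the Lambert shader is reused here as a plain SH evaluator with the specular albedo */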
SphericalHarmonicsLambertShader.shade(specularAlbedo, reflected, environmentMap) 315 | val specularFactor = math.pow(vecToEye.dot(reflected).toDouble, specularExp) 316 | specularFactor *: lightInOutDirection 317 | } 318 | } 319 | 320 | 321 | -------------------------------------------------------------------------------- /scala/src/main/scala/faces/lib/AlbedoModelFitScript.scala: -------------------------------------------------------------------------------- 1 | package faces.lib 2 | 3 | /* 4 | * 5 | * Licensed under the Apache License, Version 2.0 (the "License"); 6 | * you may not use this file except in compliance with the License. 7 | * You may obtain a copy of the License at 8 | * 9 | * http://www.apache.org/licenses/LICENSE-2.0 10 | * 11 | * Unless required by applicable law or agreed to in writing, software 12 | * distributed under the License is distributed on an "AS IS" BASIS, 13 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | * See the License for the specific language governing permissions and 15 | * limitations under the License. 16 | * 17 | * This code was adapted from 18 | * https://github.com/unibas-gravis/basel-face-model-viewer 19 | * Copyright University of Basel, Graphics and Vision Research Group 20 | */ 21 | 22 | 23 | import java.io.File 24 | 25 | import breeze.numerics.pow 26 | import faces.lib.ParameterProposals.implicits._ 27 | import scalismo.color.{RGB, RGBA} 28 | import scalismo.faces.image.PixelImage 29 | import scalismo.faces.io.{PixelImageIO, RenderParameterIO, TLMSLandmarksIO} 30 | import scalismo.faces.parameters.RenderParameter 31 | import scalismo.faces.sampling.face.evaluators.PixelEvaluators._ 32 | import scalismo.faces.sampling.face.evaluators.PointEvaluators.IsotropicGaussianPointEvaluator 33 | import scalismo.faces.sampling.face.evaluators.PriorEvaluators.{GaussianShapePrior, GaussianTexturePrior} 34 | import scalismo.faces.sampling.face.evaluators._ 35 | import scalismo.faces.sampling.face.loggers._ 36 | import scalismo.faces.sampling.face.proposals.ImageCenteredProposal.implicits._ 37 | import scalismo.faces.sampling.face.proposals.SphericalHarmonicsLightProposals._ 38 | import scalismo.faces.sampling.face.proposals._ 39 | import scalismo.faces.sampling.face.{ParametricLandmarksRenderer, ParametricModel} 40 | import scalismo.geometry.{EuclideanVector, EuclideanVector3D, _2D} 41 | import scalismo.sampling.algorithms.MetropolisHastings 42 | import scalismo.sampling.evaluators.ProductEvaluator 43 | import scalismo.sampling.loggers.BestSampleLogger 44 | import scalismo.sampling.loggers.ChainStateLogger.implicits._ 45 | import scalismo.sampling.loggers.ChainStateLoggerContainer.implicits._ 46 | import scalismo.sampling.proposals.MixtureProposal.implicits._ 47 | import scalismo.sampling.proposals.{MetropolisFilterProposal, MixtureProposal} 48 | import scalismo.sampling.{ProposalGenerator, TransitionProbability} 49 | import scalismo.utils.Random 50 | 51 | /* This Fitscript with its evaluators and the proposal distribution follows closely the proposed setting of: 52 | 53 | Markov Chain Monte Carlo for Automated Face Image Analysis 54 | Sandro Schonborn, Bernhard Egger, Andreas Morel-Forster and Thomas Vetter 55 | International Journal of Computer Vision 123(2), 160-183 , June 2017 56 | DOI: http://dx.doi.org/10.1007/s11263-016-0967-5 57 | 58 | To understand the concepts behind the fitscript and the underlying methods there is a tutorial on: 59 | http://gravis.dmi.unibas.ch/pmm/ 60 | 61 | */ 62 | 63 | object 
AlbedoModelFitScript { 64 | 65 | /* Collection of all pose related proposals */ 66 | def defaultPoseProposal(lmRenderer: ParametricLandmarksRenderer)(implicit rnd: Random): 67 | ProposalGenerator[RenderParameter] with TransitionProbability[RenderParameter] = { 68 | import MixtureProposal.implicits._ 69 | 70 | val yawProposalC = GaussianRotationProposal(EuclideanVector3D.unitY, 0.75f) 71 | val yawProposalI = GaussianRotationProposal(EuclideanVector3D.unitY, 0.10f) 72 | val yawProposalF = GaussianRotationProposal(EuclideanVector3D.unitY, 0.01f) 73 | val rotationYaw = MixtureProposal(0.1 *: yawProposalC + 0.4 *: yawProposalI + 0.5 *: yawProposalF) 74 | 75 | val pitchProposalC = GaussianRotationProposal(EuclideanVector3D.unitX, 0.75f) 76 | val pitchProposalI = GaussianRotationProposal(EuclideanVector3D.unitX, 0.10f) 77 | val pitchProposalF = GaussianRotationProposal(EuclideanVector3D.unitX, 0.01f) 78 | val rotationPitch = MixtureProposal(0.1 *: pitchProposalC + 0.4 *: pitchProposalI + 0.5 *: pitchProposalF) 79 | 80 | val rollProposalC = GaussianRotationProposal(EuclideanVector3D.unitZ, 0.75f) 81 | val rollProposalI = GaussianRotationProposal(EuclideanVector3D.unitZ, 0.10f) 82 | val rollProposalF = GaussianRotationProposal(EuclideanVector3D.unitZ, 0.01f) 83 | val rotationRoll = MixtureProposal(0.1 *: rollProposalC + 0.4 *: rollProposalI + 0.5 *: rollProposalF) 84 | 85 | val rotationProposal = MixtureProposal(0.5 *: rotationYaw + 0.3 *: rotationPitch + 0.2 *: rotationRoll).toParameterProposal 86 | 87 | val translationC = GaussianTranslationProposal(EuclideanVector(300f, 300f)).toParameterProposal 88 | val translationF = GaussianTranslationProposal(EuclideanVector(50f, 50f)).toParameterProposal 89 | val translationHF = GaussianTranslationProposal(EuclideanVector(10f, 10f)).toParameterProposal 90 | val translationProposal = MixtureProposal(0.2 *: translationC + 0.2 *: translationF + 0.6 *: translationHF) 91 | 92 | val distanceProposalC = GaussianDistanceProposal(500f, compensateScaling = true).toParameterProposal 93 | val distanceProposalF = GaussianDistanceProposal(50f, compensateScaling = true).toParameterProposal 94 | val distanceProposalHF = GaussianDistanceProposal(5f, compensateScaling = true).toParameterProposal 95 | val distanceProposal = MixtureProposal(0.2 *: distanceProposalC + 0.6 *: distanceProposalF + 0.2 *: distanceProposalHF) 96 | 97 | val scalingProposalC = GaussianScalingProposal(0.15f).toParameterProposal 98 | val scalingProposalF = GaussianScalingProposal(0.05f).toParameterProposal 99 | val scalingProposalHF = GaussianScalingProposal(0.01f).toParameterProposal 100 | val scalingProposal = MixtureProposal(0.2 *: scalingProposalC + 0.6 *: scalingProposalF + 0.2 *: scalingProposalHF) 101 | 102 | val poseMovingNoTransProposal = MixtureProposal(rotationProposal + distanceProposal + scalingProposal) 103 | val centerREyeProposal = poseMovingNoTransProposal.centeredAt("right.eye.corner_outer", lmRenderer).get 104 | val centerLEyeProposal = poseMovingNoTransProposal.centeredAt("left.eye.corner_outer", lmRenderer).get 105 | val centerRLipsProposal = poseMovingNoTransProposal.centeredAt("right.lips.corner", lmRenderer).get 106 | val centerLLipsProposal = poseMovingNoTransProposal.centeredAt("left.lips.corner", lmRenderer).get 107 | 108 | MixtureProposal(centerREyeProposal + centerLEyeProposal + centerRLipsProposal + centerLLipsProposal + 0.2 *: translationProposal) 109 | } 110 | 111 | 112 | /* Collection of all illumination related proposals */ 113 | def 
defaultIlluminationProposalNoOptimizer()(implicit rnd: Random): 114 | ProposalGenerator[RenderParameter] with TransitionProbability[RenderParameter] = { 115 | 116 | 117 | val lightSHPert = SHLightPerturbationProposal(0.001f, fixIntensity = true) 118 | val lightSHIntensity = SHLightIntensityProposal(0.1f) 119 | val lightSHBandMixter = SHLightBandEnergyMixer(0.1f) 120 | val lightSHSpatial = SHLightSpatialPerturbation(0.05f) 121 | val lightSHColor = SHLightColorProposal(0.01f) 122 | 123 | MixtureProposal(lightSHSpatial + lightSHBandMixter + lightSHIntensity + lightSHPert + lightSHColor).toParameterProposal 124 | } 125 | 126 | /* Collection of all illumination related proposals */ 127 | def defaultIlluminationProposalNoOptimizerExponentHack(modelRenderer: ParametricModel, target: PixelImage[RGBA])(implicit rnd: Random): 128 | ProposalGenerator[RenderParameter] with TransitionProbability[RenderParameter] = { 129 | 130 | val lightSHPert = SHLightPerturbationProposal(0.001f, fixIntensity = true) 131 | val lightSHIntensity = SHLightIntensityProposal(0.1f) 132 | val lightSHBandMixter = SHLightBandEnergyMixer(0.1f) 133 | val lightSHSpatial = SHLightSpatialPerturbation(0.05f) 134 | val lightSHColor = SHLightColorProposal(0.01f) 135 | 136 | MixtureProposal(MixtureProposal(lightSHSpatial + lightSHBandMixter + lightSHIntensity + lightSHPert + lightSHColor).toParameterProposal) 137 | } 138 | 139 | 140 | /* Collection of all statistical model (shape, texture) related proposals */ 141 | def neutralMorphableModelProposal(implicit rnd: Random): 142 | ProposalGenerator[RenderParameter] with TransitionProbability[RenderParameter] = { 143 | 144 | val shapeC = GaussianMoMoShapeProposal(0.2f) 145 | val shapeF = GaussianMoMoShapeProposal(0.1f) 146 | val shapeHF = GaussianMoMoShapeProposal(0.025f) 147 | val shapeScaleProposal = GaussianMoMoShapeCaricatureProposal(0.2f) 148 | val shapeProposal = MixtureProposal(0.1f *: shapeC + 0.5f *: shapeF + 0.2f *: shapeHF + 0.2f *: shapeScaleProposal).toParameterProposal 149 | 150 | val textureC = GaussianMoMoColorProposal(0.2f) 151 | val textureF = GaussianMoMoColorProposal(0.1f) 152 | val textureHF = GaussianMoMoColorProposal(0.025f) 153 | val textureScale = GaussianMoMoColorCaricatureProposal(0.2f) 154 | val textureProposal = MixtureProposal(0.1f *: textureC + 0.5f *: textureF + 0.2 *: textureHF + 0.2f *: textureScale).toParameterProposal 155 | 156 | MixtureProposal(shapeProposal + textureProposal) 157 | } 158 | 159 | /* Collection of all statistical model (shape, texture, expression) related proposals */ 160 | def defaultMorphableModelProposal(implicit rnd: Random): 161 | ProposalGenerator[RenderParameter] with TransitionProbability[RenderParameter] = { 162 | 163 | 164 | val expressionC = GaussianMoMoExpressionProposal(0.2f) 165 | val expressionF = GaussianMoMoExpressionProposal(0.1f) 166 | val expressionHF = GaussianMoMoExpressionProposal(0.025f) 167 | val expressionScaleProposal = GaussianMoMoExpressionCaricatureProposal(0.2f) 168 | val expressionProposal = MixtureProposal(0.1f *: expressionC + 0.5f *: expressionF + 0.2f *: expressionHF + 0.2f *: expressionScaleProposal).toParameterProposal 169 | 170 | 171 | MixtureProposal(neutralMorphableModelProposal + expressionProposal) 172 | } 173 | 174 | /* Collection of all color transform proposals */ 175 | def defaultColorProposal(implicit rnd: Random): 176 | ProposalGenerator[RenderParameter] with TransitionProbability[RenderParameter] = { 177 | val colorC = GaussianColorProposal(RGB(0.01f, 0.01f, 0.01f), 0.01f, RGB(1e-4f, 
1e-4f, 1e-4f)) 178 | val colorF = GaussianColorProposal(RGB(0.001f, 0.001f, 0.001f), 0.01f, RGB(1e-4f, 1e-4f, 1e-4f)) 179 | val colorHF = GaussianColorProposal(RGB(0.0005f, 0.0005f, 0.0005f), 0.01f, RGB(1e-4f, 1e-4f, 1e-4f)) 180 | 181 | MixtureProposal(0.2f *: colorC + 0.6f *: colorF + 0.2f *: colorHF).toParameterProposal 182 | } 183 | 184 | 185 | def fit(targetFn: String, lmFn: String, outputDir: String, modelRenderer: AlbedoMoMoRenderer, expression: Boolean = true, gamma: Boolean = true)(implicit rnd: Random): RenderParameter = { 186 | 187 | val g = 1.0 / 2.2 // gamma 188 | 189 | def applyGamma(i: PixelImage[RGBA]): PixelImage[RGBA] = i.map(c => RGBA(pow(c.r, g), pow(c.g, g), pow(c.b, g))) 190 | 191 | def inverseGamma(i: PixelImage[RGBA]): PixelImage[RGBA] = i.map(c => RGBA(pow(c.r, 1.0 / g), pow(c.g, 1.0 / g), pow(c.b, 1.0 / g))) 192 | 193 | val loadimage = PixelImageIO.read[RGBA](new File(targetFn)).get 194 | val target = if (gamma) 195 | inverseGamma(loadimage) 196 | else 197 | loadimage 198 | 199 | val targetLM = TLMSLandmarksIO.read2D(new File(lmFn)).get.filter(lm => lm.visible) 200 | 201 | PixelImageIO.write(target, new File(s"$outputDir/target.png")).get 202 | 203 | if (gamma) 204 | PixelImageIO.write(applyGamma(target), new File(s"$outputDir/target_Gamma.png")).get 205 | 206 | 207 | val init: RenderParameter = RenderParameter.defaultSquare.fitToImageSize(target.width, target.height) 208 | 209 | 210 | val sdev = 0.043f 211 | 212 | /* Foreground Evaluator */ 213 | val pixEval = IsotropicGaussianPixelEvaluator(sdev) 214 | 215 | /* Background Evaluator */ 216 | val histBGEval = HistogramRGB.fromImageRGBA(target, 25) 217 | 218 | /* Pixel Evaluator */ 219 | val imgEval = IndependentPixelEvaluator(pixEval, histBGEval) 220 | 221 | /* Prior Evaluator */ 222 | val priorEval = ProductEvaluator(GaussianShapePrior(0, 1), GaussianTexturePrior(0, 1)) 223 | 224 | /* Image Evaluator */ 225 | val allEval = ImageRendererEvaluator(modelRenderer, imgEval.toDistributionEvaluator(target)) 226 | 227 | /* Landmarks Evaluator */ 228 | val pointEval = IsotropicGaussianPointEvaluator[_2D](4.0) //lm click uncertainty in pixel! 
-> should be related to image/face size 229 | val landmarksEval = LandmarkPointEvaluator(targetLM, pointEval, modelRenderer) 230 | 231 | 232 | //logging 233 | val imageLogger = ImageRenderLogger(modelRenderer, new File(s"$outputDir/"), "mc-").withBackground(target) 234 | 235 | // Metropolis logger 236 | val printLogger = PrintLogger[RenderParameter](Console.out, "").verbose 237 | val mhLogger = printLogger 238 | 239 | 240 | // keep track of best sample 241 | val bestFileLogger = ParametersFileBestLogger(allEval, new File(s"$outputDir/fit-best.rps")) 242 | val bestSampleLogger = BestSampleLogger(allEval) 243 | val parametersLogger = ParametersFileLogger(new File(s"$outputDir/"), "mc-") 244 | 245 | val fitLogger = bestFileLogger :+ bestSampleLogger 246 | 247 | // pose proposal 248 | val totalPose = defaultPoseProposal(modelRenderer) 249 | 250 | //light proposals 251 | val lightProposal = defaultIlluminationProposalNoOptimizer 252 | 253 | //color proposals 254 | val colorProposal = defaultColorProposal 255 | 256 | //Morphable Model proposals 257 | val momoProposal = if (expression) defaultMorphableModelProposal else neutralMorphableModelProposal 258 | 259 | 260 | // full proposal filtered by the landmark and prior Evaluator 261 | val proposal = MetropolisFilterProposal(MetropolisFilterProposal(MixtureProposal(totalPose + colorProposal + 3f *: momoProposal + 2f *: lightProposal), landmarksEval), priorEval) 262 | 263 | //pose and image chains 264 | val imageFitter = MetropolisHastings(proposal, allEval) 265 | val poseFitter = MetropolisHastings(totalPose, landmarksEval) 266 | 267 | 268 | println("everything set up. starting fitter ...") 269 | 270 | 271 | //landmark chain for initialisation 272 | val initDefault: RenderParameter = RenderParameter.defaultSquare.fitToImageSize(target.width, target.height) 273 | val init10 = initDefault.withMoMo(init.momo.withNumberOfCoefficients(50, 50, 5)) 274 | val initLMSamples: IndexedSeq[RenderParameter] = poseFitter.iterator(init10, mhLogger).take(5000).toIndexedSeq 275 | 276 | val lmScores = initLMSamples.map(rps => (landmarksEval.logValue(rps), rps)) 277 | 278 | val bestLM = lmScores.maxBy(_._1)._2 279 | RenderParameterIO.write(bestLM, new File(s"$outputDir/fitter-lminit.rps")).get 280 | 281 | val imgLM = modelRenderer.renderImage(bestLM) 282 | PixelImageIO.write(imgLM, new File(s"$outputDir/fitter-lminit.png")).get 283 | 284 | def printer(sample: RenderParameter): RenderParameter = { 285 | println(s"${sample.momo.shape} ${sample.momo.color} ${sample.momo.expression}") 286 | sample 287 | } 288 | 289 | // image chain, fitting 290 | val fitsamples = imageFitter.iterator(bestLM, mhLogger).loggedWith(fitLogger).take(10000).toIndexedSeq 291 | val best = bestSampleLogger.currentBestSample().get 292 | 293 | val imgBest = modelRenderer.renderImage(best) 294 | PixelImageIO.write(imgBest, new File(s"$outputDir/fitter-best.png")).get 295 | 296 | if (gamma) 297 | PixelImageIO.write(applyGamma(imgBest), new File(s"$outputDir/fitter-best_Gamma.png")).get 298 | 299 | 300 | val imgBestDiffuse = modelRenderer.renderImageDiffuse(best) 301 | PixelImageIO.write(imgBestDiffuse, new File(s"$outputDir/fitter-best-diffuse.png")).get 302 | 303 | val imgBestSpecular = modelRenderer.renderImageSpecular(best) 304 | PixelImageIO.write(imgBestSpecular, new File(s"$outputDir/fitter-best-specular.png")).get 305 | best 306 | } 307 | } -------------------------------------------------------------------------------- /scala/src/main/scala/faces/lib/AlbedoModelHelpers.scala: 
-------------------------------------------------------------------------------- 1 | /* 2 | * 3 | * Licensed under the Apache License, Version 2.0 (the "License"); 4 | * you may not use this file except in compliance with the License. 5 | * You may obtain a copy of the License at 6 | * 7 | * http://www.apache.org/licenses/LICENSE-2.0 8 | * 9 | * Unless required by applicable law or agreed to in writing, software 10 | * distributed under the License is distributed on an "AS IS" BASIS, 11 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 12 | * See the License for the specific language governing permissions and 13 | * limitations under the License. 14 | * This code was adapted from 15 | * https://github.com/unibas-gravis/scalismo-faces 16 | * Copyright University of Basel, Graphics and Vision Research Group 17 | */ 18 | 19 | package scalismo.statisticalmodel 20 | 21 | import breeze.linalg.svd.SVD 22 | import breeze.linalg.{*, DenseMatrix, DenseVector} 23 | import faces.lib.AlbedoMoMo 24 | import scalismo.common._ 25 | import scalismo.faces.mesh.BinaryMask 26 | import scalismo.faces.momo.PancakeDLRGP 27 | import scalismo.geometry._ 28 | import scalismo.mesh.TriangleMesh 29 | import scalismo.statisticalmodel.DiscreteLowRankGaussianProcess 30 | 31 | import scala.collection.immutable 32 | import scala.util.{Failure, Success, Try} 33 | 34 | object AlbedoModelHelpers { 35 | 36 | /** 37 | * Converts a deformation model (DLRGP for EuclideanVector[_3D]) to a point distribution model (DLRGP for Point[_3D]). 38 | * 39 | * @param model DLRGP EuclideanVector[_3D] model 40 | * @param reference Reference used to map the deformation model to a point model. 41 | * @return DLRGP Point[_3D] model 42 | */ 43 | def vectorToPointDLRGP(model: DiscreteLowRankGaussianProcess[_3D, UnstructuredPointsDomain[_3D], EuclideanVector[_3D]], 44 | reference: TriangleMesh[_3D]) 45 | : DiscreteLowRankGaussianProcess[_3D, UnstructuredPointsDomain[_3D], Point[_3D]] = { 46 | def vectorFieldToPointField( pf: DiscreteField[_3D, UnstructuredPointsDomain[_3D], EuclideanVector[_3D]], 47 | f: (EuclideanVector[_3D], PointId) => Point[_3D] 48 | ) = new DiscreteField[_3D, UnstructuredPointsDomain[_3D], Point[_3D]]( 49 | pf.domain, 50 | pf.valuesWithIds.map{ case (v,i) =>f(v, i)}.toIndexedSeq 51 | ) 52 | 53 | val newKLBasis = model.klBasis.map( b => 54 | DiscreteLowRankGaussianProcess.Eigenpair[_3D, UnstructuredPointsDomain[_3D], Point[_3D]]( 55 | b.eigenvalue, 56 | vectorFieldToPointField( b.eigenfunction, (v: EuclideanVector[_3D], _) => v.toPoint ) 57 | ) 58 | ) 59 | val newMeanField = vectorFieldToPointField(model.mean, (v: EuclideanVector[_3D], i: PointId) => reference.pointSet.point(i)+v) 60 | 61 | DiscreteLowRankGaussianProcess[_3D, UnstructuredPointsDomain[_3D], Point[_3D]](newMeanField, newKLBasis) 62 | } 63 | 64 | /** 65 | * Converts a point distribution model (DLRGP for Point[_3D]) to a deformation model (DLRGP for EuclideanVector[_3D]). 66 | * @param model DLRGP Point[_3D] model 67 | * @param reference Reference used to map the point model to a deformation model. 
68 | * @return DLRGP EuclideanVector[_3D] model 69 | */ 70 | def pointToVectorDLRGP(model: DiscreteLowRankGaussianProcess[_3D, UnstructuredPointsDomain[_3D], Point[_3D]], reference: TriangleMesh[_3D]): DiscreteLowRankGaussianProcess[_3D, UnstructuredPointsDomain[_3D], EuclideanVector[_3D]] = { 71 | def pointFieldToVectorField( pf: DiscreteField[_3D, UnstructuredPointsDomain[_3D], Point[_3D]], 72 | f: (Point[_3D], PointId) => EuclideanVector[_3D] 73 | ) = new DiscreteField[_3D, UnstructuredPointsDomain[_3D], EuclideanVector[_3D]]( 74 | pf.domain, 75 | pf.valuesWithIds.map{ case (v,i) =>f(v, i)}.toIndexedSeq 76 | ) 77 | 78 | val newKLBasis = model.klBasis.map( b => 79 | DiscreteLowRankGaussianProcess.Eigenpair[_3D, UnstructuredPointsDomain[_3D], EuclideanVector[_3D]]( 80 | b.eigenvalue, 81 | pointFieldToVectorField( b.eigenfunction, (v: Point[_3D], _) => v.toVector ) 82 | ) 83 | ) 84 | val newMeanField = pointFieldToVectorField(model.mean, (p: Point[_3D], i: PointId) => p-reference.pointSet.point(i) ) 85 | 86 | DiscreteLowRankGaussianProcess[_3D, UnstructuredPointsDomain[_3D], EuclideanVector[_3D]](newMeanField, newKLBasis) 87 | } 88 | 89 | /** 90 | * Helper function to build a DLRGP. Simply provides access to the constructor. 91 | */ 92 | def buildFrom[D <: Dim : NDSpace, DDomain <: DiscreteDomain[D], Value](domain: DDomain, meanVec: DenseVector[Double], d2: DenseVector[Double], U: DenseMatrix[Double]) 93 | (implicit vectorizer: Vectorizer[Value]) 94 | : DiscreteLowRankGaussianProcess[D, DDomain, Value] = { 95 | new DiscreteLowRankGaussianProcess[D, DDomain, Value](domain, meanVec, d2, U) 96 | } 97 | 98 | 99 | /** 100 | * Creates a discrete low rank GP from samples. This method assumes that the training data as DiscreteFields are in dense correspondence with the DiscreteDomain as reference. 101 | * 102 | * @param domain The domain where the model is defined. 103 | * @param discreteFields The training samples. 104 | * @param threshold The minimal value to keep the basis. 105 | * @return A discrete low rank GP learned from the samples. 106 | */ 107 | def createUsingPCA[D <: Dim: NDSpace, DDomain <: DiscreteDomain[D], Value]( 108 | domain: DDomain, 109 | discreteFields: Seq[DiscreteField[D, DDomain, Value]], 110 | threshold: Double = 1.0e-8 111 | )( 112 | implicit vectorizer: Vectorizer[Value] 113 | ): DiscreteLowRankGaussianProcess[D, DDomain, Value] = { 114 | val X = buildDataMatrixWithSamplesInCols[D, DDomain, Value](domain, discreteFields) 115 | val (basis, variance, mean) = calculatePPCABasis(X,0.0,threshold) 116 | DiscreteLowRankGaussianProcess(domain,mean,variance,basis) 117 | } 118 | 119 | /** 120 | * Method to build a reconstructive model. 121 | * The statistics is only calculated in the region indicated by the mask. Then a full 122 | * model is calculated based on the completion predicted by the input data. 
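 * (Concretely, the eigen-decomposition is computed from the masked rows of the mean-free data matrix
 * only; the resulting sample-space coefficients are then applied to the full data matrix, so the
 * returned basis and mean cover the whole domain.)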
123 | * @param domain domain of the full model 124 | * @param discreteFields data to build the full model from 125 | * @param mask indicates the region over which the statistic is calculated 126 | * @param threshold removes eigenvalue/eigenvector pairs if the eigenvalue is below the threshold 127 | */ 128 | def createReconstructiveUsingPCA[D <: Dim: NDSpace, DDomain <: DiscreteDomain[D], Value]( 129 | domain: DDomain, 130 | discreteFields: Seq[DiscreteField[D, DDomain, Value]], 131 | mask: BinaryMask, 132 | threshold: Double = 1.0e-8 133 | )( 134 | implicit vectorizer: Vectorizer[Value] 135 | ): DiscreteLowRankGaussianProcess[D, DDomain, Value] = { 136 | val X = buildDataMatrixWithSamplesInCols[D, DDomain, Value](domain, discreteFields) 137 | val M = buildColumnIndexingVectorForMask[D, DDomain, Value](domain, mask) 138 | val (basis, variance, mean) = calculateMaskedPPCABasis(X,M,0.0,threshold) 139 | 140 | DiscreteLowRankGaussianProcess(domain, mean, variance, basis) 141 | } 142 | 143 | 144 | /** 145 | * Creates a discrete low rank GP from samples. This method assumes that the training data as DiscreteFields are in dense correspondence with the DiscreteDomain as reference. 146 | * 147 | * @param domain The domain where the model is defined. 148 | * @param discreteFields The training samples. 149 | * @return A discrete low rank GP learned from the samples. 150 | */ 151 | def createUsingPPCA[D <: Dim: NDSpace, DDomain <: DiscreteDomain[D], Value](domain: DDomain, 152 | discreteFields: Seq[DiscreteField[D, DDomain, Value]], 153 | noiseVariance: Double, 154 | threshold: Double = 1.0e-8) 155 | (implicit vectorizer: Vectorizer[Value]) 156 | : PancakeDLRGP[D, DDomain, Value] = { 157 | 158 | val X = buildDataMatrixWithSamplesInCols[D, DDomain, Value](domain, discreteFields) 159 | val (basis, variance, mean) = calculatePPCABasis(X,noiseVariance,threshold) 160 | PancakeDLRGP(DiscreteLowRankGaussianProcess(domain,mean,variance,basis), noiseVariance) 161 | } 162 | 163 | /** 164 | * Create a masked model using a set of PointIds. 165 | * Please note that the procedure can return a model with less PointId's because of the maskPoints operation. 166 | * @param momo The model to be masked. 167 | * @param pointIds The PointIds of the reference to be kept. 168 | * @return the masked model. 
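 * @note The reference mesh is masked first; shape, diffuse albedo and specular albedo (and, if
 *       present, the expression model) are then marginalised to the surviving point ids.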
169 | */ 170 | 171 | def maskAlbedoMoMo(momo : AlbedoMoMo, pointIds: Seq[PointId], strict: Boolean = true): Try[AlbedoMoMo] = { 172 | 173 | val referenceMesh = momo.referenceMesh 174 | val op = referenceMesh.operations.maskPoints(id => pointIds.contains(id)) 175 | val maskedMesh = op.transformedMesh 176 | 177 | if (strict && (maskedMesh.pointSet.numberOfPoints != pointIds.distinct.size) ) { 178 | return Failure(new Exception( 179 | "Masking the model would remove additonal points not specified in the provided list of point ids.\n"+ 180 | "Either provide a different list of point ids or set the parameter stict to false to mask the model.")) 181 | } 182 | 183 | val remainingPtIds = maskedMesh.pointSet.pointIds.map(id => op.pointBackMap(id)).toIndexedSeq 184 | val maskedModelShape = momo.neutralModel.shape.marginal(remainingPtIds) 185 | val maskedModelDiffuseAlbedo = momo.neutralModel.diffuseAlbedo.marginal(remainingPtIds) 186 | val maskedModelSpecularAlbedo = momo.neutralModel.specularAlbedo.marginal(remainingPtIds) 187 | 188 | if(momo.hasExpressions){ 189 | val maskedModelExpressions = momo.expressionModel.get.expression.marginal(remainingPtIds) 190 | Success(AlbedoMoMo(maskedMesh, maskedModelShape, maskedModelDiffuseAlbedo, maskedModelSpecularAlbedo, maskedModelExpressions, momo.landmarks)) 191 | } else { 192 | Success(AlbedoMoMo(maskedMesh, maskedModelShape, maskedModelDiffuseAlbedo, maskedModelSpecularAlbedo, momo.landmarks)) 193 | } 194 | 195 | } 196 | 197 | /** 198 | * Create a masked model using a masked reference mesh. 199 | * The input maskMesh for masking needs to be a subset of the reference mesh. 200 | * @param momo The model to be masked. 201 | * @param maskMesh The masked mesh, which must be a subset of the original reference mesh. It will be the new reference mesh for the output MoMo. 202 | * @return the masked model. 203 | */ 204 | 205 | def maskMoMo(momo : AlbedoMoMo, maskMesh: TriangleMesh[_3D]): Try[AlbedoMoMo] = { 206 | 207 | val remainingPtIds = maskMesh.pointSet.points.map(p => momo.referenceMesh.pointSet.findClosestPoint(p).id).toIndexedSeq 208 | val maskedReference = momo.referenceMesh.operations.maskPoints(pid => remainingPtIds.contains(pid)).transformedMesh 209 | if (maskMesh == maskedReference) { 210 | val maskedModelShape = momo.neutralModel.shape.marginal(remainingPtIds) 211 | val maskedModelDiffuseAlbedo = momo.neutralModel.diffuseAlbedo.marginal(remainingPtIds) 212 | val maskedModelSpecularAlbedo = momo.neutralModel.specularAlbedo.marginal(remainingPtIds) 213 | 214 | 215 | if (momo.hasExpressions) { 216 | val maskedModelExpressions = momo.expressionModel.get.expression.marginal(remainingPtIds) 217 | Success(AlbedoMoMo(maskMesh, maskedModelShape, maskedModelDiffuseAlbedo, maskedModelSpecularAlbedo, maskedModelExpressions, momo.landmarks)) 218 | } else { 219 | Success(AlbedoMoMo(maskMesh, maskedModelShape, maskedModelDiffuseAlbedo, maskedModelSpecularAlbedo, momo.landmarks)) 220 | } 221 | } else { 222 | Failure(new Exception("The mesh you provided does not seem to be a masked version of the reference.")) 223 | } 224 | 225 | } 226 | 227 | /** 228 | * 229 | * @param X Data matrix. 230 | * @param noiseVariance External estimate of the noise variance. 231 | * @param threshold Threshold for keeping components. 
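 * (The noise variance is subtracted from every raw eigenvalue and only components whose corrected
 * variance stays above the threshold are kept, following the probabilistic PCA formulation; the
 * Gram-matrix route is taken when there are fewer samples than dimensions.)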
232 | * @return (basis,variance,mean) 233 | */ 234 | def calculatePPCABasis( 235 | X: DenseMatrix[Double], 236 | noiseVariance: Double, 237 | threshold: Double = 1.0e-8 238 | ): (DenseMatrix[Double], DenseVector[Double], DenseVector[Double]) = { 239 | val m = X.rows 240 | val n = X.cols 241 | 242 | val (x0, meanVec) = removeColMean(X) 243 | 244 | // decide what to do depending on the dimensions 245 | val (eVec, eVal) = if (n < m) { 246 | decomposeGramMatrix(x0,threshold) 247 | } else { 248 | decomposeCovarianceMatrix(x0,threshold) 249 | } 250 | 251 | val rVal = eVal.map(_ - noiseVariance) 252 | val rank = rVal.toArray.count(_ > threshold) 253 | 254 | (eVec(::, 0 until rank), rVal(0 until rank), meanVec) 255 | } 256 | 257 | 258 | /** 259 | * @param X Data matrix. 260 | * @param M selection or rows to perform the PPCA on it 261 | * @param noiseVariance External estimate of the noise variance. 262 | * @param threshold Threshold for keeping components. 263 | * @return (basis,variance,mean) 264 | */ 265 | def calculateMaskedPPCABasis( 266 | X: DenseMatrix[Double], 267 | M: IndexedSeq[Int], 268 | noiseVariance: Double, 269 | threshold: Double = 1.0e-8 270 | ): (DenseMatrix[Double], DenseVector[Double], DenseVector[Double]) = { 271 | val m = M.size 272 | val n = X.cols 273 | 274 | val (x0, meanVec) = removeColMean(X) 275 | 276 | // decide what to do depending on the dimensions 277 | val (eVec, eVal) = if (n < m) { 278 | decomposeMaskedGramMatrix(x0,M,threshold) 279 | } else { 280 | ??? 281 | } 282 | 283 | val rVal = eVal.map(_ - noiseVariance) 284 | val rank = rVal.toArray.count(_ > threshold) 285 | 286 | (eVec(::, 0 until rank), rVal(0 until rank), meanVec) 287 | } 288 | 289 | /** 290 | * Calculate the orthonormal basis and the variances of XX' using the Gram matrix X'X. 291 | * 292 | * @return The cols of the matrix hold the orthonormal eigenvectors and the vector the eigenvalues. 293 | */ 294 | def decomposeGramMatrix(X: DenseMatrix[Double], threshold: Double = 1.0e-8): (DenseMatrix[Double], DenseVector[Double]) = { 295 | val n = X.cols 296 | 297 | val G = X.t * X 298 | 299 | val SVD(u, s, _) = breeze.linalg.svd(G * (1.0 / (n-1))) 300 | 301 | // calculate factor for eigenvectors 302 | val S = s.map(t => Math.sqrt(t * n)) 303 | val SInv = s.map(t => if (Math.sqrt(t) > 1.0e-14) 1.0 / Math.sqrt(t * (n-1)) else 0.0) 304 | 305 | // a Matrix with the scaled eigenvectors 306 | val U = X * u * breeze.linalg.diag(SInv) 307 | 308 | val rank = s.toArray.count(_ > threshold) 309 | 310 | (U(::, 0 until rank), s(0 until rank)) 311 | } 312 | 313 | /** 314 | * Calculate the orthonormal basis and the variances of XX' using the Gram matrix X'X. 315 | * 316 | * @return The cols of the matrix hold the orthonormal eigenvectors and the vector the eigenvalues. 
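 * This relies on the snapshot-PCA identity: if (X0'X0/(n-1)) u = s u for the masked rows X0 = X(M,::),
 * then X * u extends the corresponding eigenvector to all rows of X, and scaling by 1/sqrt(s*(n-1))
 * makes it unit length on the masked rows; this is what U = X * u * diag(SInv) computes below.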
317 | */ 318 | def decomposeMaskedGramMatrix( 319 | X: DenseMatrix[Double], 320 | M: IndexedSeq[Int], 321 | threshold: Double = 1.0e-8 322 | ): (DenseMatrix[Double], DenseVector[Double]) = { 323 | 324 | val n = X.cols 325 | 326 | val X0 = X(M,::).toDenseMatrix 327 | val G = X0.t * X0 328 | 329 | val SVD(u, s, vt) = breeze.linalg.svd(G * (1.0 / (n-1))) 330 | 331 | // calculate factor for eigenvectors 332 | val S = s.map(t => Math.sqrt(t * n)) 333 | val SInv = s.map(t => if (Math.sqrt(t) > 1.0e-14) 1.0 / Math.sqrt(t * (n-1)) else 0.0) 334 | 335 | // a Matrix with the scaled eigenvectors 336 | val U = X * u * breeze.linalg.diag(SInv) 337 | 338 | val rank = s.toArray.count(_ > threshold) 339 | 340 | (U(::, 0 until rank), s(0 until rank)) 341 | } 342 | 343 | /** 344 | * Calculate the orthonormal basis and the variances of the covariance matrix XX'. 345 | * 346 | * @return The cols of the matrix hold the orthonormal eigenvectors and the vector the eigenvalues. 347 | */ 348 | def decomposeCovarianceMatrix( 349 | X: DenseMatrix[Double], 350 | threshold: Double = 1.0e-8 351 | ): (DenseMatrix[Double], DenseVector[Double]) = { 352 | 353 | val n = X.cols 354 | 355 | val Cov = X * X.t * (1.0 / (n-1)) 356 | 357 | val SVD(u, s, _) = breeze.linalg.svd(Cov) 358 | val rank = s.toArray.count(_ > threshold) 359 | 360 | (u(::, 0 until rank), s(0 until rank)) 361 | } 362 | 363 | 364 | /** 365 | * Removes row-mean from each row. 366 | * 367 | * @param X 368 | * @return 369 | */ 370 | def removeRowMean(X: DenseMatrix[Double]): (DenseMatrix[Double], DenseVector[Double]) = { 371 | 372 | val X0 = X 373 | val m: DenseVector[Double] = breeze.stats.mean(X0(::, *)).t.toDenseVector 374 | for (i <- 0 until X0.rows) { 375 | X0(i, ::) := X0(i, ::) - m.t 376 | } 377 | (X0, m) 378 | } 379 | 380 | /** 381 | * Removes col-mean from each col. 382 | * 383 | * @param X 384 | * @return 385 | */ 386 | def removeColMean(X: DenseMatrix[Double]): (DenseMatrix[Double], DenseVector[Double]) = { 387 | val X0 = X 388 | val m: DenseVector[Double] = breeze.stats.mean(X0(*, ::)).toDenseVector 389 | for (j <- 0 until X0.cols) { 390 | X0(::, j) := X0(::, j) - m 391 | } 392 | (X0, m) 393 | } 394 | 395 | /** 396 | * Build matrix with the samples stored in rows. 397 | */ 398 | def buildDataMatrixWithSamplesInRows[D <: Dim : NDSpace, DDomain <: DiscreteDomain[D], Value]( 399 | domain: DiscreteDomain[D], 400 | discreteFields: Seq[DiscreteField[D, DDomain, Value]] 401 | )( 402 | implicit vectorizer: Vectorizer[Value] 403 | ): DenseMatrix[Double] = { 404 | 405 | val dim = vectorizer.dim 406 | val n = discreteFields.size 407 | val p = domain.numberOfPoints 408 | 409 | // create the data matrix 410 | val X = DenseMatrix.zeros[Double](n, p * dim) 411 | for (f <- discreteFields.zipWithIndex.par) { 412 | val i = f._2 413 | val field = f._1 414 | field.data.zipWithIndex.map { p => 415 | val j = p._2 416 | val value = p._1 417 | val ux = vectorizer.vectorize(value) 418 | val colRange = j * dim until (j + 1) * dim 419 | X(i, colRange) := ux.t 420 | } 421 | } 422 | X 423 | } 424 | 425 | /** 426 | * Build matrix with the samples stored in cols. 
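 * Each discrete field is vectorized (dim values per domain point) into one column, so the result has
 * numberOfPoints * dim rows and one column per sample.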
427 | */ 428 | def buildDataMatrixWithSamplesInCols[D <: Dim : NDSpace, DDomain <: DiscreteDomain[D], Value]( 429 | domain: DDomain, 430 | discreteFields: Seq[DiscreteField[D, DDomain, Value]] 431 | )( 432 | implicit vectorizer: Vectorizer[Value] 433 | ): DenseMatrix[Double] = { 434 | 435 | val dim = vectorizer.dim 436 | val n = discreteFields.size 437 | val p = domain.numberOfPoints 438 | val m = p * dim 439 | 440 | // create the data matrix 441 | val X = DenseMatrix.zeros[Double](m, n) 442 | for (f <- discreteFields.zipWithIndex.par) { 443 | val j = f._2 444 | val field = f._1 445 | field.data.zipWithIndex.map { p => 446 | val i = p._2 447 | val value = p._1 448 | val ux = vectorizer.vectorize(value) 449 | val rowRange = i * dim until (i + 1) * dim 450 | X(rowRange, j) := ux 451 | } 452 | } 453 | X 454 | } 455 | 456 | def buildColumnIndexingVectorForMask[D <: Dim : NDSpace, DDomain <: DiscreteDomain[D], Value]( 457 | domain: DDomain, 458 | mask: BinaryMask 459 | )( 460 | implicit vectorizer: Vectorizer[Value] 461 | ): IndexedSeq[Int] = { 462 | 463 | val dim = vectorizer.dim 464 | val p = domain.numberOfPoints 465 | val m = p * dim 466 | 467 | val q = mask.entries.zipWithIndex.filter(_._1).flatMap { case (b,i) => 468 | val s = i * dim until (i + 1) * dim 469 | s 470 | } 471 | q 472 | } 473 | } 474 | -------------------------------------------------------------------------------- /scala/src/main/scala/faces/lib/ParametricAlbedoModel.scala: -------------------------------------------------------------------------------- 1 | /* 2 | * 3 | * Licensed under the Apache License, Version 2.0 (the "License"); 4 | * you may not use this file except in compliance with the License. 5 | * You may obtain a copy of the License at 6 | * 7 | * http://www.apache.org/licenses/LICENSE-2.0 8 | * 9 | * Unless required by applicable law or agreed to in writing, software 10 | * distributed under the License is distributed on an "AS IS" BASIS, 11 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 12 | * See the License for the specific language governing permissions and 13 | * limitations under the License. 
14 | * 15 | * This code was adapted from 16 | * https://github.com/unibas-gravis/scalismo-faces 17 | * Copyright University of Basel, Graphics and Vision Research Group 18 | 19 | */ 20 | 21 | package faces.lib 22 | 23 | import breeze.linalg.DenseVector 24 | import faces.lib.{AlbedoMoMo, VertexAlbedoMesh3D} 25 | import scalismo.faces.parameters.{MoMoInstance, RenderParameter} 26 | 27 | 28 | /** generates a model instance in original model coordinates */ 29 | trait ParametricAlbedoModel { 30 | def instance(parameters: RenderParameter): VertexAlbedoMesh3D 31 | } -------------------------------------------------------------------------------- /scala/src/main/scala/faces/lib/SpecularProposal.scala: -------------------------------------------------------------------------------- 1 | package faces.lib 2 | 3 | import scalismo.faces.parameters.{Camera, ColorTransform, DirectionalLight, MoMoInstance, Pose, RenderParameter, SphericalHarmonicsLight} 4 | import scalismo.faces.sampling.evaluators.LogNormalDistribution 5 | import scalismo.faces.sampling.face.proposals.ParameterProposals 6 | import scalismo.faces.sampling.face.proposals.ParameterProposals.implicits.PartialToFullParameterConverter 7 | import scalismo.sampling.{ProposalGenerator, SymmetricTransitionRatio, TransitionProbability} 8 | import scalismo.utils.Random 9 | 10 | 11 | 12 | object ParameterProposals { 13 | /** implicit conversions for promoting partial BetterRenderParameters to full BetterRenderParameter proposals, just use "proposalGenerator.toParameterProposal" */ 14 | object implicits { 15 | import scala.languageFeature.implicitConversions 16 | 17 | /** implicit wrapper class for partial parameter proposals */ 18 | implicit class NakedPartialParameterProposal[A](proposal: ProposalGenerator[A])(implicit converter: PartialToFullParameterConverter[A]) { 19 | def toParameterProposal: ProposalGenerator[RenderParameter] = new ProposalGenerator[RenderParameter] { 20 | override def propose(current: RenderParameter): RenderParameter = { 21 | val prop = proposal.propose(converter.partialParameter(current)) 22 | converter.fullParameter(prop, current) 23 | } 24 | 25 | override def toString: String = proposal.toString 26 | } 27 | } 28 | 29 | /** implicit wrapper class for partial parameter proposals */ 30 | implicit class PartialParameterProposal[A](proposal: ProposalGenerator[A] with TransitionProbability[A])(implicit converter: PartialToFullParameterConverter[A]) { 31 | def toParameterProposal: ProposalGenerator[RenderParameter] with TransitionProbability[RenderParameter] = new ProposalGenerator[RenderParameter] with TransitionProbability[RenderParameter] { 32 | override def propose(current: RenderParameter): RenderParameter = { 33 | val prop = proposal.propose(converter.partialParameter(current)) 34 | converter.fullParameter(prop, current) 35 | } 36 | 37 | override def logTransitionProbability(from: RenderParameter, to: RenderParameter): Double = proposal.logTransitionProbability(converter.partialParameter(from), converter.partialParameter(to)) 38 | 39 | override def toString: String = proposal.toString 40 | } 41 | } 42 | 43 | /** implicit wrapper class for symmetric partial parameter proposals */ 44 | implicit class PartialSymmetricParameterProposal[A](proposal: ProposalGenerator[A] with SymmetricTransitionRatio[A])(implicit converter: PartialToFullParameterConverter[A]) { 45 | def toParameterProposal: ProposalGenerator[RenderParameter] with SymmetricTransitionRatio[RenderParameter] = new ProposalGenerator[RenderParameter] with 
SymmetricTransitionRatio[RenderParameter] { 46 | override def propose(current: RenderParameter): RenderParameter = { 47 | val prop = proposal.propose(converter.partialParameter(current)) 48 | converter.fullParameter(prop, current) 49 | } 50 | 51 | override def toString: String = proposal.toString 52 | } 53 | } 54 | 55 | /** implicit wrapper class for symmtric partial parameter proposals with a transition probability */ 56 | implicit class PartialTransitionSymmetricParameterProposal[A](proposal: ProposalGenerator[A] with SymmetricTransitionRatio[A] with TransitionProbability[A])(implicit converter: PartialToFullParameterConverter[A]) { 57 | def toParameterProposal: ProposalGenerator[RenderParameter] with 58 | SymmetricTransitionRatio[RenderParameter] with 59 | TransitionProbability[RenderParameter] = 60 | new ProposalGenerator[RenderParameter] with 61 | SymmetricTransitionRatio[RenderParameter] with 62 | TransitionProbability[RenderParameter] { 63 | override def propose(current: RenderParameter): RenderParameter = { 64 | val prop = proposal.propose(converter.partialParameter(current)) 65 | converter.fullParameter(prop, current) 66 | } 67 | 68 | override def logTransitionProbability(from: RenderParameter, to: RenderParameter): Double = proposal.logTransitionProbability(converter.partialParameter(from), converter.partialParameter(to)) 69 | 70 | override def toString: String = proposal.toString 71 | } 72 | } 73 | 74 | // all implicits use the same promotion converters 75 | 76 | trait PartialToFullParameterConverter[A] { 77 | def fullParameter(partial: A, blueprint: RenderParameter): RenderParameter 78 | 79 | def partialParameter(full: RenderParameter): A 80 | } 81 | 82 | implicit object ParameterAsFullParameter extends PartialToFullParameterConverter[RenderParameter] { 83 | override def fullParameter(partial: RenderParameter, blueprint: RenderParameter): RenderParameter = partial 84 | 85 | override def partialParameter(full: RenderParameter): RenderParameter = full 86 | } 87 | 88 | implicit object PoseAsFullParameter extends PartialToFullParameterConverter[Pose] { 89 | override def fullParameter(partial: Pose, blueprint: RenderParameter): RenderParameter = blueprint.copy(pose = partial) 90 | 91 | override def partialParameter(full: RenderParameter): Pose = full.pose 92 | } 93 | 94 | implicit object CameraAsFullParameter extends PartialToFullParameterConverter[Camera] { 95 | override def fullParameter(partial: Camera, blueprint: RenderParameter): RenderParameter = blueprint.copy(camera = partial) 96 | 97 | override def partialParameter(full: RenderParameter): Camera = full.camera 98 | } 99 | 100 | implicit object ColorAsFullParameter extends PartialToFullParameterConverter[ColorTransform] { 101 | override def fullParameter(partial: ColorTransform, blueprint: RenderParameter): RenderParameter = blueprint.copy(colorTransform = partial) 102 | 103 | override def partialParameter(full: RenderParameter): ColorTransform = full.colorTransform 104 | } 105 | 106 | implicit object MoMoInstanceAsFullParameter extends PartialToFullParameterConverter[MoMoInstance] { 107 | override def fullParameter(partial: MoMoInstance, blueprint: RenderParameter): RenderParameter = { 108 | val modelPart = blueprint.momo.copy( 109 | shape = partial.shape, 110 | color = partial.color, 111 | expression = partial.expression 112 | ) 113 | blueprint.copy(momo = modelPart) 114 | } 115 | override def partialParameter(full: RenderParameter): MoMoInstance = full.momo 116 | } 117 | 118 | implicit object 
SphericalHarmonicsLightAsFullParameter extends PartialToFullParameterConverter[SphericalHarmonicsLight] { 119 | override def fullParameter(partial: SphericalHarmonicsLight, blueprint: RenderParameter): RenderParameter = blueprint.withEnvironmentMap(partial) 120 | 121 | override def partialParameter(full: RenderParameter): SphericalHarmonicsLight = full.environmentMap 122 | } 123 | 124 | implicit object DirectionalLightAsFullParameter extends PartialToFullParameterConverter[DirectionalLight] { 125 | override def fullParameter(partial: DirectionalLight, blueprint: RenderParameter): RenderParameter = blueprint.withDirectionalLight(partial) 126 | 127 | override def partialParameter(full: RenderParameter): DirectionalLight = full.directionalLight 128 | } 129 | } 130 | } 131 | -------------------------------------------------------------------------------- /scala/src/main/scala/faces/lib/VertexAlbedoMesh.scala: -------------------------------------------------------------------------------- 1 | package faces.lib 2 | 3 | /* 4 | * 5 | * Licensed under the Apache License, Version 2.0 (the "License"); 6 | * you may not use this file except in compliance with the License. 7 | * You may obtain a copy of the License at 8 | * 9 | * http://www.apache.org/licenses/LICENSE-2.0 10 | * 11 | * Unless required by applicable law or agreed to in writing, software 12 | * distributed under the License is distributed on an "AS IS" BASIS, 13 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | * See the License for the specific language governing permissions and 15 | * limitations under the License. 16 | * 17 | * This code was adapted from 18 | * https://github.com/unibas-gravis/scalismo-faces 19 | * Copyright University of Basel, Graphics and Vision Research Group 20 | */ 21 | 22 | 23 | import scalismo.color.RGBA 24 | import scalismo.geometry.{Point, _3D} 25 | import scalismo.mesh.{SurfacePointProperty, TriangleMesh3D, VertexColorMesh3D} 26 | 27 | /** 28 | * colored mesh with RGBA specular and diffuse Albedo per vertex 29 | * @param shape positions 30 | * @param diffuseAlbedo diffuse albedo of mesh surface, per point 31 | * @param specularAlbedo specular of mesh surface, per point 32 | */ 33 | case class VertexAlbedoMesh3D(shape: TriangleMesh3D, diffuseAlbedo: SurfacePointProperty[RGBA], specularAlbedo: SurfacePointProperty[RGBA]) { 34 | require(shape.triangulation == diffuseAlbedo.triangulation) 35 | require(shape.triangulation == specularAlbedo.triangulation) 36 | 37 | def transform(trafo: Point[_3D] => Point[_3D]): VertexAlbedoMesh3D = { 38 | val s = shape.transform { trafo } 39 | copy(shape = s) 40 | } 41 | 42 | def diffuseAlbedoMesh(): VertexColorMesh3D ={ 43 | VertexColorMesh3D(shape, diffuseAlbedo) 44 | } 45 | 46 | def specularAlbedoMesh(): VertexColorMesh3D ={ 47 | VertexColorMesh3D(shape, specularAlbedo) 48 | } 49 | } -------------------------------------------------------------------------------- /scala/src/test/scala/faces/FacesTestSuite.scala: -------------------------------------------------------------------------------- 1 | /* 2 | * 3 | * Licensed under the Apache License, Version 2.0 (the "License"); 4 | * you may not use this file except in compliance with the License. 
5 | * You may obtain a copy of the License at 6 | * 7 | * http://www.apache.org/licenses/LICENSE-2.0 8 | * 9 | * Unless required by applicable law or agreed to in writing, software 10 | * distributed under the License is distributed on an "AS IS" BASIS, 11 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 12 | * See the License for the specific language governing permissions and 13 | * limitations under the License. 14 | * 15 | * This code was adapted from 16 | * https://github.com/unibas-gravis/scalismo-faces 17 | * Copyright University of Basel, Graphics and Vision Research Group 18 | */ 19 | 20 | package faces 21 | 22 | import java.net.URI 23 | 24 | import breeze.linalg.{DenseMatrix, DenseVector, qr} 25 | import faces.lib 26 | import faces.lib.AlbedoMoMo 27 | import org.scalatest._ 28 | import scalismo.color.{RGB, RGBA} 29 | import scalismo.common.{PointId, UnstructuredPointsDomain} 30 | import scalismo.faces.image.{AccessMode, PixelImage, PixelImageDomain} 31 | import scalismo.faces.mesh.{ColorNormalMesh3D, TextureMappedProperty} 32 | import scalismo.faces.momo.PancakeDLRGP 33 | import scalismo.geometry._ 34 | import scalismo.mesh._ 35 | import scalismo.statisticalmodel.AlbedoModelHelpers 36 | import scalismo.utils.Random 37 | 38 | import scala.collection.mutable.ArrayBuffer 39 | 40 | class FacesTestSuite extends FunSpec with Matchers { 41 | scalismo.initialize() 42 | 43 | implicit val rnd = Random(43) 44 | 45 | def randomRGB(implicit rnd: Random) = RGB(rnd.scalaRandom.nextDouble(), rnd.scalaRandom.nextDouble(), rnd.scalaRandom.nextDouble()) 46 | 47 | def randomRGBA(implicit rnd: Random) = RGBA(rnd.scalaRandom.nextDouble(), rnd.scalaRandom.nextDouble(), rnd.scalaRandom.nextDouble(), rnd.scalaRandom.nextDouble()) 48 | 49 | def randomVector3D(implicit rnd: Random) = EuclideanVector3D(rnd.scalaRandom.nextDouble(), rnd.scalaRandom.nextDouble(), rnd.scalaRandom.nextDouble()) 50 | 51 | def randomDouble(implicit rnd: Random): Double = rnd.scalaRandom.nextDouble() 52 | 53 | def randomImageRGB(w: Int, h: Int)(implicit rnd: Random): PixelImage[RGB] = { 54 | val imageSeq = IndexedSeq.fill[RGB](w * h)(randomRGB) 55 | val domain = PixelImageDomain(w, h) 56 | PixelImage(domain, (x, y) => imageSeq(domain.index(x, y))).withAccessMode(AccessMode.Strict()) 57 | } 58 | 59 | def randomImageRGBA(w: Int, h: Int)(implicit rnd: Random): PixelImage[RGBA] = { 60 | val imageSeq = IndexedSeq.fill[RGBA](w * h)(randomRGBA) 61 | val domain = PixelImageDomain(w, h) 62 | PixelImage(domain, (x, y) => imageSeq(domain.index(x, y))).withAccessMode(AccessMode.Strict()) 63 | } 64 | 65 | 66 | def randomInt(max: Int)(implicit rnd: Random): Int = rnd.scalaRandom.nextInt(max) 67 | 68 | def randomDirection(implicit rnd: Random): EuclideanVector[_2D] = EuclideanVector.fromPolar(1.0, rnd.scalaRandom.nextDouble() * 2.0 * math.Pi) 69 | 70 | def randomGridMesh(cols: Int = 3, rows: Int = 3, sdev: Double = 0.25)(implicit rnd: Random): VertexColorMesh3D = { 71 | val n = cols * rows 72 | 73 | val points = IndexedSeq.tabulate(n)(i => Point(i % cols, i / cols, 0f)).map(p => p + EuclideanVector(rnd.scalaRandom.nextGaussian(), rnd.scalaRandom.nextGaussian(), rnd.scalaRandom.nextGaussian()) * sdev) 74 | val colors = IndexedSeq.fill(n)(randomRGBA) 75 | // mesh structure 76 | val triangles = new ArrayBuffer[TriangleCell](n * 2) 77 | for (i <- 0 until n) { 78 | val c = i % cols 79 | val r = i / cols 80 | if (c < cols - 1 && i < n - cols - 1) { 81 | triangles += TriangleCell(PointId(i), PointId(i + 1), PointId(i + 
cols)) 82 | triangles += TriangleCell(PointId(i + cols), PointId(i + 1), PointId(i + cols + 1)) 83 | } 84 | } 85 | val triangleList = TriangleList(triangles.toIndexedSeq) 86 | // mesh 87 | VertexColorMesh3D( 88 | TriangleMesh3D(points, triangleList), 89 | SurfacePointProperty(triangleList, colors) 90 | ) 91 | } 92 | 93 | def randomGridMeshWithTexture(cols: Int = 3, rows: Int = 3, sdev: Double = 0.25)(implicit rnd: Random): ColorNormalMesh3D = { 94 | val vcMesh = randomGridMesh(cols, rows, sdev) 95 | val centerPoints: IndexedSeq[Point[_2D]] = IndexedSeq.tabulate(cols * rows)(i => Point(i % cols, i / cols)) 96 | val uvCoords: IndexedSeq[Point[_2D]] = centerPoints.map { pt => TextureMappedProperty.imageCoordinatesToUV(pt, cols, rows) } 97 | 98 | val texMap: SurfacePointProperty[Point[_2D]] = SurfacePointProperty(vcMesh.shape.triangulation, uvCoords) 99 | val texture = TextureMappedProperty(vcMesh.shape.triangulation, texMap, randomImageRGBA(cols, rows)) 100 | ColorNormalMesh3D( 101 | vcMesh.shape, 102 | texture, 103 | vcMesh.shape.vertexNormals 104 | ) 105 | } 106 | 107 | def randomString(length: Int)(implicit rnd: Random): String = rnd.scalaRandom.alphanumeric.take(length).mkString 108 | 109 | def randomURI(length: Int): URI = { 110 | val string = randomString(length).filter { 111 | _.isLetterOrDigit 112 | } 113 | new URI(string) 114 | } 115 | 116 | def randomGridModel(rank: Int = 10, sdev: Double = 5.0, noise: Double = 0.05, cols: Int = 5, rows: Int = 5, orthogonalExpressions: Boolean = false)(implicit rnd: Random): AlbedoMoMo = { 117 | 118 | val reference = randomGridMesh(cols, rows) 119 | 120 | val compN = 3 * cols * rows 121 | val fullShapeComponents = qr(DenseMatrix.fill[Double](compN, 2 * rank)(rnd.scalaRandom.nextGaussian)).q 122 | 123 | val shapeN = compN 124 | val shapeMean = DenseVector.fill[Double](shapeN)(rnd.scalaRandom.nextGaussian) 125 | val shapePCABases = fullShapeComponents(::, 0 until rank) 126 | val shapeVariance = DenseVector.fill[Double](rank)(rnd.scalaRandom.nextDouble * sdev) 127 | assert(shapeMean.length == shapePCABases.rows, "rows is not correct") 128 | assert(shapePCABases.cols == rank, "model is of incorrect rank") 129 | assert(shapeVariance.length == shapePCABases.cols, "wrong number of variances") 130 | val shape = AlbedoModelHelpers.buildFrom[_3D, UnstructuredPointsDomain[_3D], Point[_3D]](reference.shape.pointSet, shapeMean, shapeVariance, shapePCABases) 131 | 132 | val diffuseN = compN 133 | val diffuseMean = DenseVector.fill[Double](diffuseN)(rnd.scalaRandom.nextGaussian) 134 | val diffusePCABases = qr.reduced(DenseMatrix.fill[Double](diffuseN, rank)(rnd.scalaRandom.nextGaussian)).q 135 | val diffuseVariance = DenseVector.fill[Double](rank)(rnd.scalaRandom.nextDouble * sdev) 136 | assert(diffuseMean.length == diffusePCABases.rows, "rows is not correct") 137 | assert(diffusePCABases.cols == rank, "model is of incorrect rank") 138 | assert(diffuseVariance.length == diffusePCABases.cols, "wrong number of variances") 139 | val diffuse = AlbedoModelHelpers.buildFrom[_3D, UnstructuredPointsDomain[_3D], RGB](reference.shape.pointSet, diffuseMean, diffuseVariance, diffusePCABases) 140 | 141 | val specularN = compN 142 | val specularMean = DenseVector.fill[Double](specularN)(rnd.scalaRandom.nextGaussian) 143 | val specularPCABases = qr.reduced(DenseMatrix.fill[Double](specularN, rank)(rnd.scalaRandom.nextGaussian)).q 144 | val specularVariance = DenseVector.fill[Double](rank)(rnd.scalaRandom.nextDouble * sdev) 145 | assert(specularMean.length == 
specularPCABases.rows, "rows is not correct") 146 | assert(specularPCABases.cols == rank, "model is of incorrect rank") 147 | assert(specularVariance.length == specularPCABases.cols, "wrong number of variances") 148 | val specular = AlbedoModelHelpers.buildFrom[_3D, UnstructuredPointsDomain[_3D], RGB](reference.shape.pointSet, specularMean, specularVariance, specularPCABases) 149 | 150 | val expressionN = compN 151 | val expressionMean = DenseVector.fill[Double](expressionN)(rnd.scalaRandom.nextGaussian) 152 | val expressionPCABases = 153 | if (orthogonalExpressions) 154 | fullShapeComponents(::, rank until 2 * rank) 155 | else 156 | qr.reduced(DenseMatrix.fill[Double](expressionN, rank)(rnd.scalaRandom.nextGaussian)).q 157 | val expressionVariance = DenseVector.fill[Double](rank)(rnd.scalaRandom.nextDouble * sdev) 158 | assert(expressionMean.length == expressionPCABases.rows, "rows is not correct") 159 | assert(expressionPCABases.cols == rank, "model is of incorrect rank") 160 | assert(expressionVariance.length == expressionPCABases.cols, "wrong number of variances") 161 | val expression = AlbedoModelHelpers.buildFrom[_3D, UnstructuredPointsDomain[_3D], EuclideanVector[_3D]](reference.shape.pointSet, expressionMean, expressionVariance, expressionPCABases) 162 | 163 | def randomName = randomString(10) 164 | def randomPointOnRef = reference.shape.pointSet.points.toIndexedSeq(rnd.scalaRandom.nextInt(reference.shape.pointSet.numberOfPoints)) 165 | def randomLM: Landmark[_3D] = Landmark(randomName, randomPointOnRef) 166 | val randomLandmarks = for (i <- 0 until 5) yield randomName -> randomLM 167 | 168 | lib.AlbedoMoMo(reference.shape, 169 | PancakeDLRGP(shape, noise), 170 | PancakeDLRGP(diffuse, noise), 171 | PancakeDLRGP(specular, noise), 172 | PancakeDLRGP(expression, noise), 173 | randomLandmarks.toMap) 174 | } 175 | 176 | def randomGridModelExpress(rank: Int = 10, sdev: Double = 5.0, noise: Double = 0.05, cols: Int = 5, rows: Int = 5)(implicit rnd: Random): PancakeDLRGP[_3D, UnstructuredPointsDomain[_3D], EuclideanVector[_3D]] = { 177 | 178 | val reference = randomGridMesh(cols, rows) 179 | val expressionN = 3 * cols * rows 180 | val expressionMean = DenseVector.fill[Double](expressionN)(rnd.scalaRandom.nextGaussian) 181 | val expressionDecomposition = qr(DenseMatrix.fill[Double](expressionN, rank)(rnd.scalaRandom.nextGaussian)) 182 | val expressionPCABases = expressionDecomposition.q 183 | val expressionVariance = DenseVector.fill[Double](expressionN)(rnd.scalaRandom.nextDouble * sdev) 184 | val expressionNoise = rnd.scalaRandom.nextDouble * noise 185 | PancakeDLRGP(AlbedoModelHelpers.buildFrom[_3D, UnstructuredPointsDomain[_3D], EuclideanVector[_3D]](reference.shape.pointSet, expressionMean, expressionVariance, expressionPCABases)) 186 | 187 | } 188 | } 189 | -------------------------------------------------------------------------------- /scala/src/test/scala/faces/lib/AlbedoMoMoTest.scala: -------------------------------------------------------------------------------- 1 | /* 2 | * 3 | * Licensed under the Apache License, Version 2.0 (the "License"); 4 | * you may not use this file except in compliance with the License. 5 | * You may obtain a copy of the License at 6 | * 7 | * http://www.apache.org/licenses/LICENSE-2.0 8 | * 9 | * Unless required by applicable law or agreed to in writing, software 10 | * distributed under the License is distributed on an "AS IS" BASIS, 11 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
12 | * See the License for the specific language governing permissions and 13 | * limitations under the License. 14 | * 15 | * This code was adapted from 16 | * https://github.com/unibas-gravis/scalismo-faces 17 | * Copyright University of Basel, Graphics and Vision Research Group 18 | */ 19 | package faces.lib 20 | 21 | import java.io._ 22 | 23 | import breeze.linalg.{DenseVector, norm, sum} 24 | import breeze.stats.distributions.Gaussian 25 | import faces.FacesTestSuite 26 | import scalismo.faces.momo.{MoMoCoefficients, PancakeDLRGP} 27 | 28 | class AlbedoMoMoTests extends FacesTestSuite { 29 | 30 | def meshDist(mesh1: VertexAlbedoMesh3D, mesh2: VertexAlbedoMesh3D): Double = { 31 | val shp1: DenseVector[Double] = DenseVector(mesh1.shape.pointSet.points.toIndexedSeq.flatMap(p => IndexedSeq(p.x, p.y, p.z)).toArray) 32 | val shp2: DenseVector[Double] = DenseVector(mesh2.shape.pointSet.points.toIndexedSeq.flatMap(p => IndexedSeq(p.x, p.y, p.z)).toArray) 33 | val dif1: DenseVector[Double] = DenseVector(mesh1.diffuseAlbedo.pointData.flatMap(p => IndexedSeq(p.r, p.g, p.b)).toArray) 34 | val dif2: DenseVector[Double] = DenseVector(mesh2.diffuseAlbedo.pointData.flatMap(p => IndexedSeq(p.r, p.g, p.b)).toArray) 35 | val spe1: DenseVector[Double] = DenseVector(mesh1.specularAlbedo.pointData.flatMap(p => IndexedSeq(p.r, p.g, p.b)).toArray) 36 | val spe2: DenseVector[Double] = DenseVector(mesh2.specularAlbedo.pointData.flatMap(p => IndexedSeq(p.r, p.g, p.b)).toArray) 37 | val shapeDistSq = sum((shp1 - shp2).map(v => v * v)) 38 | val diffuseDistSq = sum((dif1 - dif2).map(v => v * v)) 39 | val specularDistSq = sum((spe1 - spe2).map(v => v * v)) 40 | math.sqrt(shapeDistSq + diffuseDistSq + specularDistSq) 41 | } 42 | 43 | def coeffsDist(coeffs1: MoMoCoefficients, coeffs2: MoMoCoefficients): Double = { 44 | require(coeffs1.shape.length == coeffs2.shape.length) 45 | require(coeffs1.color.length == coeffs2.color.length) 46 | require(coeffs1.expression.length == coeffs2.expression.length) 47 | norm(coeffs1.shape - coeffs2.shape) + norm(coeffs1.color - coeffs2.color) + norm(coeffs1.expression - coeffs2.expression) 48 | } 49 | 50 | describe("A AlbedoMoMo") { 51 | 52 | // Build random PCA model for projection tests and write it to disk 53 | lazy val randomAlbedoMoMo = randomGridModel(10, 5, 0.1, 5, 5, orthogonalExpressions = false) 54 | val rf = File.createTempFile("AlbedoMoMo-gravisAlbedoMoMoio-test", ".h5") 55 | rf.deleteOnExit() 56 | AlbedoMoMoIO.write(randomAlbedoMoMo, rf) 57 | 58 | // load hdf5 default model 59 | lazy val fullAlbedoMoMo = AlbedoMoMoIO.read(new File(getClass.getResource("/random-AlbedoMoMo.h5").getPath)).get.expressionModel.get 60 | lazy val fullShapeCoeffs = for (_ <- 0 until fullAlbedoMoMo.shape.rank) yield Gaussian(0, 1).draw() 61 | //since coeffs coupled only diffuse 62 | lazy val fullDiffuseAlbedoCoeffs = for (_ <- 0 until fullAlbedoMoMo.diffuseAlbedo.rank) yield Gaussian(0, 1).draw() 63 | lazy val fullExpressCoeffs = for (_ <- 0 until fullAlbedoMoMo.expression.rank) yield Gaussian(0, 1).draw() 64 | lazy val fullAlbedoMoMoCoeffs = MoMoCoefficients(fullShapeCoeffs, fullDiffuseAlbedoCoeffs, fullExpressCoeffs) 65 | 66 | lazy val fullSample = fullAlbedoMoMo.instance(fullAlbedoMoMoCoeffs) 67 | 68 | // PCA model needed for projection tests 69 | lazy val albedoMoMo = AlbedoMoMoIO.read(rf).get.expressionModel.get 70 | 71 | val AlbedoMoMoPCA = AlbedoMoMo( 72 | albedoMoMo.referenceMesh, 73 | PancakeDLRGP(albedoMoMo.shape.gpModel), 74 | PancakeDLRGP(albedoMoMo.diffuseAlbedo.gpModel), 75 | 
PancakeDLRGP(albedoMoMo.specularAlbedo.gpModel), 76 | PancakeDLRGP(albedoMoMo.expression.gpModel), 77 | albedoMoMo.landmarks) 78 | 79 | lazy val shapeCoeffs = for (_ <- 0 until albedoMoMo.shape.rank) yield Gaussian(0, 1).draw() 80 | //since coupled only diffuse 81 | lazy val diffuseCoeffs = for (_ <- 0 until albedoMoMo.diffuseAlbedo.rank) yield Gaussian(0, 1).draw() 82 | lazy val expressCoeffs = for (_ <- 0 until albedoMoMo.expression.rank) yield Gaussian(0, 1).draw() 83 | lazy val AlbedoMoMoCoeffs = MoMoCoefficients(shapeCoeffs, diffuseCoeffs, expressCoeffs) 84 | lazy val AlbedoMoMoCoeffsNoEx = AlbedoMoMoCoeffs.copy(expression = DenseVector.zeros[Double](0)) 85 | lazy val sample = albedoMoMo.instance(AlbedoMoMoCoeffs) 86 | lazy val sampleNoEx = albedoMoMo.neutralModel.instance(AlbedoMoMoCoeffsNoEx) 87 | 88 | val distThres = 0.01 * sample.shape.pointSet.numberOfPoints + 0.01 * sample.diffuseAlbedo.triangulation.pointIds.size+ 0.01 * sample.specularAlbedo.triangulation.pointIds.size 89 | 90 | it("can load from disk") { 91 | albedoMoMo.shape.rank should be > 0 92 | } 93 | 94 | it("should create shape samples") { 95 | sample.shape.pointSet.numberOfPoints should be > 0 96 | } 97 | 98 | it("should create color samples") { 99 | sample.diffuseAlbedo.triangulation.pointIds.size should be > 0 100 | sample.specularAlbedo.triangulation.pointIds.size should be > 0 101 | } 102 | 103 | it("can generate random samples") { 104 | albedoMoMo.sample().shape.triangulation.pointIds should be (albedoMoMo.referenceMesh.triangulation.pointIds) 105 | } 106 | 107 | it("can generate random samples (PCA)") { 108 | AlbedoMoMoPCA.sample().shape.triangulation.pointIds should be (AlbedoMoMoPCA.referenceMesh.triangulation.pointIds) 109 | } 110 | 111 | it("can create samples with fewer parameters set") { 112 | val AlbedoMoMoRed = AlbedoMoMoCoeffs.copy( 113 | shape = DenseVector(AlbedoMoMoCoeffs.shape.toArray.take(1)), 114 | color = DenseVector(AlbedoMoMoCoeffs.color.toArray.take(1)), 115 | expression = DenseVector(AlbedoMoMoCoeffs.expression.toArray.take(1)) 116 | ) 117 | val sample = albedoMoMo.instance(AlbedoMoMoRed) 118 | val AlbedoMoMoRedFull = AlbedoMoMoCoeffs.copy( 119 | shape = DenseVector(AlbedoMoMoCoeffs.shape.toArray.take(1) ++ Array.fill(albedoMoMo.shape.rank - 1)(0.0)), 120 | color = DenseVector(AlbedoMoMoCoeffs.color.toArray.take(1) ++ Array.fill(albedoMoMo.shape.rank - 1)(0.0)), 121 | expression = DenseVector(AlbedoMoMoCoeffs.expression.toArray.take(1) ++ Array.fill(albedoMoMo.expression.rank - 1)(0.0)) 122 | ) 123 | val sampleFull = albedoMoMo.instance(AlbedoMoMoRedFull) 124 | meshDist(sample, sampleFull) should be < distThres 125 | } 126 | 127 | it("should creates instances identical to underlying GP model (no noise on instance)") { 128 | val instPCA = AlbedoMoMoPCA.instance(AlbedoMoMoCoeffs) 129 | meshDist(instPCA, sample) should be < distThres 130 | } 131 | 132 | it("should not alter a sample through projection (PCA model only, no expressions)") { 133 | val projected = AlbedoMoMoPCA.neutralModel.project(sampleNoEx) 134 | meshDist(projected, sampleNoEx) should be < distThres 135 | } 136 | 137 | it("should not alter a sample through projection (PCA model only, with expression)") { 138 | val projected = AlbedoMoMoPCA.project(sample) 139 | meshDist(projected, sample) should be < distThres 140 | } 141 | 142 | it("should regularize a sample through projection (closer to mean, no expression)") { 143 | val projected = albedoMoMo.neutralModel.project(sampleNoEx) 144 | meshDist(projected, 
albedoMoMo.neutralModel.mean) should be < meshDist(sampleNoEx, albedoMoMo.neutralModel.mean) 145 | } 146 | 147 | it("should regularize the coefficients of a sample (closer to mean)") { 148 | val projCoeffs = albedoMoMo.coefficients(sample) 149 | val projCoeffPCA = AlbedoMoMoPCA.coefficients(sample) 150 | norm(projCoeffs.shape) + norm(projCoeffs.expression) should be < (norm(projCoeffPCA.shape) + norm(projCoeffPCA.expression)) 151 | } 152 | 153 | it("should regularize the coefficients of a sample of the neutral model (closer to mean)") { 154 | val projCoeffs = albedoMoMo.neutralModel.coefficients(sampleNoEx) 155 | val projCoeffPCA = AlbedoMoMoPCA.neutralModel.coefficients(sampleNoEx) 156 | norm(projCoeffs.shape) should be < norm(projCoeffPCA.shape) 157 | } 158 | 159 | it("should yield the same shape coefficients used to draw a sample (PCA model only)") { 160 | val projCoeffs = AlbedoMoMoPCA.coefficients(sample) 161 | val pC = albedoMoMo.coefficients(sample) 162 | norm(projCoeffs.shape - AlbedoMoMoCoeffs.shape) should be < 0.01 * AlbedoMoMoCoeffs.shape.length 163 | norm(projCoeffs.color - AlbedoMoMoCoeffsNoEx.color) should be < 0.01 * AlbedoMoMoCoeffs.color.length 164 | norm(projCoeffs.expression - AlbedoMoMoCoeffs.expression) should be < 0.01 * AlbedoMoMoCoeffs.expression.length 165 | } 166 | 167 | it("should yield proper coefficients for the mean sample") { 168 | val meanSample = albedoMoMo.mean 169 | val meanCoeffs = albedoMoMo.coefficients(meanSample) 170 | norm(meanCoeffs.shape) + norm(meanCoeffs.color) should be < 0.01 * AlbedoMoMoCoeffs.shape.length + 0.01 * AlbedoMoMoCoeffs.color.length 171 | } 172 | 173 | val f = File.createTempFile("AlbedoMoMo", ".h5") 174 | f.deleteOnExit() 175 | AlbedoMoMoIO.write(albedoMoMo, f).get 176 | val loadedAlbedoMoMo = AlbedoMoMoIO.read(f).get.expressionModel.get 177 | 178 | describe("should save to and load from disk unaltered") { 179 | it("reference") { 180 | loadedAlbedoMoMo.referenceMesh should be(albedoMoMo.referenceMesh) 181 | } 182 | 183 | it("shape") { 184 | loadedAlbedoMoMo.shape should be(albedoMoMo.shape) 185 | } 186 | 187 | it("color") { 188 | loadedAlbedoMoMo.diffuseAlbedo should be(albedoMoMo.diffuseAlbedo) 189 | loadedAlbedoMoMo.specularAlbedo should be(albedoMoMo.specularAlbedo) 190 | } 191 | 192 | it("landmarks") { 193 | loadedAlbedoMoMo.landmarks should be(albedoMoMo.landmarks) 194 | } 195 | 196 | it("complete AlbedoMoMo") { 197 | loadedAlbedoMoMo should be(albedoMoMo) 198 | } 199 | } 200 | 201 | /* it("can load model built in c++ from disk") { 202 | val loadedCPP = AlbedoMoMoIO.read(new File(getClass.getResource("/random-l4.h5").getPath)).get.neutralModel 203 | loadedCPP.shape.rank shouldBe 19 204 | loadedCPP.diffuseAlbedo.rank shouldBe 19 205 | loadedCPP.specularAlbedo.rank shouldBe 19 206 | }*/ 207 | 208 | lazy val reducedAlbedoMoMo = albedoMoMo.truncate(5, 5, 5) 209 | lazy val reducedShapeCoeffs = for (_ <- 0 until reducedAlbedoMoMo.shape.rank) yield Gaussian(0, 1).draw() 210 | //since they are coupled only diffuse 211 | lazy val reducedDiffuseAlbedoCoeffs = for (_ <- 0 until reducedAlbedoMoMo.diffuseAlbedo.rank) yield Gaussian(0, 1).draw() 212 | lazy val reducedExpressCoeffs = for (_ <- 0 until reducedAlbedoMoMo.expression.rank) yield Gaussian(0, 1).draw() 213 | lazy val reducedAlbedoMoMoCoeffs = MoMoCoefficients(reducedShapeCoeffs, reducedDiffuseAlbedoCoeffs, reducedExpressCoeffs) 214 | lazy val reducedAlbedoMoMoCoeffsNoEx = MoMoCoefficients(reducedShapeCoeffs, reducedDiffuseAlbedoCoeffs, IndexedSeq.empty) 215 | lazy val 
reducedSample = reducedAlbedoMoMo.instance(reducedAlbedoMoMoCoeffs) 216 | lazy val reducedSampleNoEx = reducedAlbedoMoMo.neutralModel.instance(reducedAlbedoMoMoCoeffs) 217 | 218 | 219 | it("can be reduced to 5 shape components") { 220 | reducedAlbedoMoMo.shape.rank should be(5) 221 | } 222 | 223 | it("can be reduced to 5 color components") { 224 | reducedAlbedoMoMo.diffuseAlbedo.rank should be(5) 225 | reducedAlbedoMoMo.specularAlbedo.rank should be(5) 226 | } 227 | 228 | it("can be reduced to 5 expression components") { 229 | reducedAlbedoMoMo.expression.rank should be(5) 230 | } 231 | 232 | it("should create shape samples (reduced)") { 233 | reducedSample.shape.pointSet.numberOfPoints should be > 0 234 | } 235 | 236 | it("should create color samples (reduced)") { 237 | reducedSample.diffuseAlbedo.pointData.length should be > 0 238 | reducedSample.specularAlbedo.pointData.length should be > 0 239 | } 240 | 241 | it("should not alter a sample through projection (reduced)") { 242 | val projected = reducedAlbedoMoMo.project(reducedSample) 243 | meshDist(projected, reducedSample) should be < 0.1 * reducedSample.shape.pointSet.numberOfPoints + 0.01 * reducedSample.diffuseAlbedo.pointData.length + 0.01 * reducedSample.specularAlbedo.pointData.length 244 | } 245 | 246 | // it("should yield the same coefficients used to draw a sample (reduced)") { 247 | // val projCoeffs = reducedAlbedoMoMo.coefficients(reducedSample) 248 | // coeffsDist(projCoeffs, reducedAlbedoMoMoCoeffs) should be < 0.01 * reducedAlbedoMoMoCoeffs.shape.length + 0.01 * reducedAlbedoMoMoCoeffs.color.length + 0.01 * reducedAlbedoMoMoCoeffs.expression.length 249 | // } 250 | 251 | it("can be written to disk and be read again (and be equal)") { 252 | val f = File.createTempFile("reduced-model", ".h5") 253 | f.deleteOnExit() 254 | AlbedoMoMoIO.write(albedoMoMo, f).get 255 | val loadedAlbedoMoMo = AlbedoMoMoIO.read(f).get 256 | loadedAlbedoMoMo should be(albedoMoMo) 257 | } 258 | /* 259 | it("supports loading using an URI") { 260 | val modelFile = new File(getClass.getResource("/random-AlbedoMoMo.h5").getPath) 261 | val AlbedoMoMo = AlbedoMoMoIO.read(modelFile, "").get 262 | val uri = modelFile.toURI 263 | val uriAlbedoMoMo = AlbedoMoMoIO.read(uri).get 264 | uriAlbedoMoMo shouldBe albedoMoMo 265 | } 266 | 267 | it("supports cached loading using an URI") { 268 | val uri = new File(getClass.getResource("/random-AlbedoMoMo.h5").getPath).toURI 269 | val uriAlbedoMoMo1 = AlbedoMoMoIO.read(uri).get 270 | val uriAlbedoMoMo2 = AlbedoMoMoIO.read(uri).get 271 | // ensure same model instance: caching serves the identical instance twice 272 | assert(uriAlbedoMoMo1 eq uriAlbedoMoMo2) 273 | } 274 | */ 275 | it("it prevents drawing an instance with an empty parameter vector") { 276 | val emptyVector = MoMoCoefficients(IndexedSeq.empty, IndexedSeq.empty, IndexedSeq.empty) 277 | intercept[IllegalArgumentException](albedoMoMo.instance(emptyVector)) 278 | } 279 | 280 | it("can handle an empty expression model") { 281 | val neutralCoeffs = AlbedoMoMoCoeffs.copy(expression = DenseVector.zeros(0)) 282 | noException should be thrownBy albedoMoMo.neutralModel.instance(neutralCoeffs) 283 | } 284 | } 285 | } --------------------------------------------------------------------------------
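For reference, a compact sketch of the instance/coefficients round trip exercised by the tests above. It is illustrative only: albedoMoMo stands for a loaded expression model (e.g. AlbedoMoMoIO.read(modelFile).get.expressionModel.get) and the coefficient values are arbitrary draws.

import breeze.linalg.norm
import breeze.stats.distributions.Gaussian
import scalismo.faces.momo.MoMoCoefficients

val coeffs = MoMoCoefficients(
  IndexedSeq.fill(albedoMoMo.shape.rank)(Gaussian(0, 1).draw()),          // shape
  IndexedSeq.fill(albedoMoMo.diffuseAlbedo.rank)(Gaussian(0, 1).draw()),  // diffuse (specular is coupled)
  IndexedSeq.fill(albedoMoMo.expression.rank)(Gaussian(0, 1).draw()))     // expression
val sample    = albedoMoMo.instance(coeffs)      // mesh with shape plus diffuse and specular albedo
val recovered = albedoMoMo.coefficients(sample)  // project the mesh back into the model
// for a pure PCA model (no pancake noise) the recovered shape coefficients match closely
println(norm(recovered.shape - coeffs.shape))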