├── .gitignore
├── README.md
├── assets
├── VU-VRM-elf.vrm
├── VU-VRM.gif
├── VU-VRM.jpg
└── VU-VRM.vrm
├── index.html
├── script.js
└── style.css
/.gitignore:
--------------------------------------------------------------------------------
1 |
2 | .DS_Store
3 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # VU-VRM
2 | A lip-sync VRM avatar client for zero-webcam mic-based vtubing:
3 | [automattic.github.io/VU-VRM/](https://automattic.github.io/VU-VRM/)
4 |
5 | 
6 |
7 | # Why?
8 | Because multitasking. Because sometimes you need to run an avatar without a webcam. Because vtubers are disabled too. Because everyone gets webcam fatigue. Because it's not always essential to bring your face to work. Because an avatar that folks associate with you can be more personable than a little green light when you're on a voice call. Because I made this for use at work, and it's turned out real handy. Because what if [VRM](https://vrm.dev/en/), but with PNGtuber rules?
9 |
10 |
11 | # Usage
12 | While this works just fine for testing if you [visit its GitHub Pages URL](https://automattic.github.io/VU-VRM/) (and allow mic access), it's intended for use in [OBS](https://obsproject.com) as a browser source.
13 |
14 | Use the interface (or drag and drop) to load a local .vrm file, set your levels, dismiss the UI, and you're good to lip-sync in a kinda-lifelike way for streams, or as a virtual webcam for other chat apps.
15 |
16 | Plays nice with VRMs created in [VroidStudio](https://vroid.com/en/studio) and other standards-compliant VRMs.
17 |
18 | # Interface
19 | Minimal; intended for setting a mic volume threshold at which the avatar's mouth and body move, adjusting gain as needed, and then being dismissed.
20 |
21 | This volume = movement behaviour is what makes this avatar client literally a form of VU (volume unit) meter, hence its name.
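The mapping behind that statement is simple enough to show directly. A minimal sketch of the gate-then-scale logic from `script.js` (the function name `mouthOpen` is illustrative; the constants and the doubled-threshold gate are the ones the script uses):

```javascript
// Sketch of the VU-style gate-then-scale mapping from script.js:
// the averaged mic level (roughly 0-255) becomes a mouth-open blendshape weight.
// voweldamp, vowelmin, and the (threshold * 2) gate mirror the script;
// the function name mouthOpen is illustrative.
function mouthOpen(average, mouththreshold, mouthboost) {
  const voweldamp = 53;
  const vowelmin = 12;
  if (average > mouththreshold * 2) {
    return ((average - vowelmin) / voweldamp) * (mouthboost / 10);
  }
  return 0; // below the gate the mouth stays closed
}
```

Below the gate the mouth stays shut; above it, the averaged level is shifted, damped, and scaled by the boost slider.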
22 |
23 | # OBS launch specifics
24 | To allow browser sources in OBS (like this one) to receive mic input, OBS needs to be launched with these arguments:
25 |
26 | `--use-fake-ui-for-media-stream --allow-file-access-from-files`
27 |
28 | - macOS Terminal: `/Applications/OBS.app/Contents/MacOS/obs --use-fake-ui-for-media-stream --allow-file-access-from-files`
29 | - Windows: create a shortcut to OBS and add the arguments to the *Target* field in the shortcut's properties.
30 | - Linux users don't need hints to launch things with arguments ;)
31 |
32 | VU-VRM can then be added as an OBS browser source [from a URL](https://automattic.github.io/VU-VRM/) or as a [local file](https://github.com/Automattic/VU-VRM/archive/refs/heads/trunk.zip), and its background is intentionally transparent for that purpose.
33 |
34 | # ToDo
35 | - Mic input selector
36 | - Background controls
37 | - localStorage use
38 | - Smoother, more natural state-to-state eased body movement
39 | - Use expression blendshapes, easing between low percentages of each, for more facial motion
40 | - Use all available vowel blendshapes
41 | - Separate the frequency response array into vowel sounds and sibilants to cue the appropriate shapes
42 | - A less basic default pose
43 | - Migrate from ScriptProcessorNode method to AudioWorkletNode
44 | - Hook it to chat app APIs for group VRM chats!
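For the eased-movement items above, a sketch of what "ease-to-target" could mean here. This is a hypothetical helper, not current behaviour; the current `script.js` random-walks bone rotations and divides by a springback factor each audio tick:

```javascript
// Hypothetical per-frame easing step for the ToDo above: move a value a
// fixed fraction toward its target each frame (exponential ease-out),
// instead of random-walk-plus-springback.
function easeToward(current, target, rate) {
  return current + (target - current) * rate;
}

// e.g. bone.rotation.x = easeToward(bone.rotation.x, targetX, 0.1);
```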
45 |
46 | # Changelog
47 | - Vertical-slider-based interface
48 | - Input level VU
49 | - Two-expression crossfade with random wander and bias slider
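The bias slider in that last item works by splitting one 0–100 value into complementary caps for the two expressions. A sketch of the arithmetic as it appears in `script.js` (`expressionLimits` is an illustrative name; `0.75` is the script's `expressionintensity`):

```javascript
// One slider value (0-100) sets complementary caps for the "yay" (Fun)
// and "oof" (Angry) blendshapes, scaled by a fixed overall intensity.
function expressionLimits(expression, intensity = 0.75) {
  return {
    yay: (expression / 100) * intensity,
    oof: ((100 - expression) / 100) * intensity,
  };
}
```

The two caps always sum to the fixed intensity, so biasing toward "yay" takes headroom away from "oof".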
50 |
--------------------------------------------------------------------------------
/assets/VU-VRM-elf.vrm:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Automattic/VU-VRM/db1ed7ee1dbd4d5ec1ab4cc6e5faf4dd9ed0f708/assets/VU-VRM-elf.vrm
--------------------------------------------------------------------------------
/assets/VU-VRM.gif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Automattic/VU-VRM/db1ed7ee1dbd4d5ec1ab4cc6e5faf4dd9ed0f708/assets/VU-VRM.gif
--------------------------------------------------------------------------------
/assets/VU-VRM.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Automattic/VU-VRM/db1ed7ee1dbd4d5ec1ab4cc6e5faf4dd9ed0f708/assets/VU-VRM.jpg
--------------------------------------------------------------------------------
/assets/VU-VRM.vrm:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Automattic/VU-VRM/db1ed7ee1dbd4d5ec1ab4cc6e5faf4dd9ed0f708/assets/VU-VRM.vrm
--------------------------------------------------------------------------------
/index.html:
--------------------------------------------------------------------------------
[extraction residue: the HTML markup was stripped, leaving only text content; the jump in line numbers (50 → 80) indicates missing lines. Recoverable text:]
VU-VRM
A simple mic-driven VRM client
Mic access required / Designed for use in OBS
Built for hands-free webcam-free inclusivity
v1.0 by itsTallulahhh
an Automattic Empathy
Microphone permission required
Click to dismiss
Launch OBS with these arguments to use this as a Browser Source:
--use-fake-ui-for-media-stream --allow-file-access-from-files
❌
--------------------------------------------------------------------------------
/script.js:
--------------------------------------------------------------------------------
1 | // setup
2 |
3 | //expression setup
4 | var expressionyay = 0;
5 | var expressionoof = 0;
6 | var expressionlimityay = 0.5;
7 | var expressionlimitoof = 0.5;
8 | var expressionease = 100;
9 | var expressionintensity = 0.75;
10 |
11 | //interface values
12 | if (localStorage.localvalues) {
13 | var initvalues = true;
14 | var mouththreshold = Number(localStorage.mouththreshold) ;
15 | var mouthboost = Number(localStorage.mouthboost) ;
16 | var bodythreshold = Number(localStorage.bodythreshold) ;
17 | var bodymotion = Number(localStorage.bodymotion) ;
18 | var expression = Number(localStorage.expression) ;
19 | } else {
20 | var mouththreshold = 10;
21 | var mouthboost = 10;
22 | var bodythreshold = 10;
23 | var bodymotion = 10;
24 | var expression = 80;
25 | }
26 |
27 | // setup three-vrm
28 |
29 | // renderer
30 | const renderer = new THREE.WebGLRenderer({ alpha: true , antialias: true ,powerPreference: "low-power" });
31 | renderer.setSize(window.innerWidth, window.innerHeight);
32 | renderer.setPixelRatio(window.devicePixelRatio);
33 | document.body.appendChild(renderer.domElement);
34 |
35 | // camera
36 | const camera = new THREE.PerspectiveCamera( 30.0, window.innerWidth / window.innerHeight, 0.1, 20.0 );
37 | camera.position.set(0.0, 1.45, 0.75);
38 |
39 | // camera controls
40 | const controls = new THREE.OrbitControls(camera, renderer.domElement);
41 | controls.screenSpacePanning = true;
42 | controls.target.set(0.0, 1.45, 0.0);
43 | controls.update();
44 |
45 | // scene
46 | const scene = new THREE.Scene();
47 |
48 | // light
49 | const light = new THREE.DirectionalLight(0xffffff);
50 | light.position.set(1.0, 1.0, 1.0).normalize();
51 | scene.add(light);
52 |
53 | // lookat target
54 | const lookAtTarget = new THREE.Object3D();
55 | camera.add(lookAtTarget);
56 |
57 | // gltf and vrm
58 | let currentVrm = undefined;
59 | const loader = new THREE.GLTFLoader();
60 |
61 | function load( url ) {
62 |
63 | loader.crossOrigin = 'anonymous';
64 | loader.load(
65 |
66 | url,
67 |
68 | ( gltf ) => {
69 |
70 | //THREE.VRMUtils.removeUnnecessaryVertices( gltf.scene ); Vroid VRM can't handle this for some reason
71 | THREE.VRMUtils.removeUnnecessaryJoints( gltf.scene );
72 |
73 | THREE.VRM.from( gltf ).then( ( vrm ) => {
74 |
75 | if ( currentVrm ) {
76 |
77 | scene.remove( currentVrm.scene );
78 | currentVrm.dispose();
79 |
80 | }
81 |
82 | currentVrm = vrm;
83 | scene.add( vrm.scene );
84 |
85 | vrm.humanoid.getBoneNode( THREE.VRMSchema.HumanoidBoneName.Hips ).rotation.y = Math.PI;
86 | vrm.springBoneManager.reset();
87 |
88 | // un-T-pose
89 |
90 | vrm.humanoid.getBoneNode(
91 | THREE.VRMSchema.HumanoidBoneName.RightUpperArm
92 | ).rotation.z = 250; // radians; wraps to roughly -76 degrees, lowering the arm
93 |
94 | vrm.humanoid.getBoneNode(
95 | THREE.VRMSchema.HumanoidBoneName.RightLowerArm
96 | ).rotation.z = -0.2;
97 |
98 | vrm.humanoid.getBoneNode(
99 | THREE.VRMSchema.HumanoidBoneName.LeftUpperArm
100 | ).rotation.z = -250;
101 |
102 | vrm.humanoid.getBoneNode(
103 | THREE.VRMSchema.HumanoidBoneName.LeftLowerArm
104 | ).rotation.z = 0.2;
105 |
106 | // randomise init positions
107 |
108 | function randomsomesuch() {
109 | return (Math.random() - 0.5) / 10;
110 | }
111 |
112 | vrm.humanoid.getBoneNode(
113 | THREE.VRMSchema.HumanoidBoneName.Head
114 | ).rotation.x = randomsomesuch();
115 | vrm.humanoid.getBoneNode(
116 | THREE.VRMSchema.HumanoidBoneName.Head
117 | ).rotation.y = randomsomesuch();
118 | vrm.humanoid.getBoneNode(
119 | THREE.VRMSchema.HumanoidBoneName.Head
120 | ).rotation.z = randomsomesuch();
121 |
122 | vrm.humanoid.getBoneNode(
123 | THREE.VRMSchema.HumanoidBoneName.Neck
124 | ).rotation.x = randomsomesuch();
125 | vrm.humanoid.getBoneNode(
126 | THREE.VRMSchema.HumanoidBoneName.Neck
127 | ).rotation.y = randomsomesuch();
128 | vrm.humanoid.getBoneNode(
129 | THREE.VRMSchema.HumanoidBoneName.Neck
130 | ).rotation.z = randomsomesuch();
131 |
132 | vrm.humanoid.getBoneNode(
133 | THREE.VRMSchema.HumanoidBoneName.Spine
134 | ).rotation.x = randomsomesuch();
135 | vrm.humanoid.getBoneNode(
136 | THREE.VRMSchema.HumanoidBoneName.Spine
137 | ).rotation.y = randomsomesuch();
138 | vrm.humanoid.getBoneNode(
139 | THREE.VRMSchema.HumanoidBoneName.Spine
140 | ).rotation.z = randomsomesuch();
141 |
142 | vrm.lookAt.target = lookAtTarget;
143 | vrm.springBoneManager.reset();
144 |
145 | console.log(vrm);
146 | } );
147 |
148 | },
149 |
150 | ( progress ) => console.log( 'Loading model...', 100.0 * ( progress.loaded / progress.total ), '%' ),
151 |
152 | ( error ) => console.error( error )
153 |
154 | );
155 |
156 | }
157 |
158 | // beware of CORS errors when using this locally; if you can't serve over https, import the required libraries locally.
159 | load( 'https://automattic.github.io/VU-VRM/assets/VU-VRM-elf.vrm' );
160 |
161 | // grid / axis helpers
162 | // const gridHelper = new THREE.GridHelper( 10, 10 );
163 | // scene.add( gridHelper );
164 | // const axesHelper = new THREE.AxesHelper( 5 );
165 | // scene.add( axesHelper );
166 |
167 | // animate
168 |
169 | const clock = new THREE.Clock();
170 |
171 | function animate() {
172 | requestAnimationFrame(animate);
173 |
174 | const deltaTime = clock.getDelta();
175 |
176 | if (currentVrm) {
177 | // update vrm
178 | currentVrm.update(deltaTime);
179 | }
180 |
181 | renderer.render(scene, camera);
182 | }
183 |
184 | animate();
185 |
186 | // mic listener - get a value
187 | navigator.mediaDevices
188 | .getUserMedia({
189 | audio: true
190 | })
191 | .then(
192 | function (stream) {
193 | const audioContext = new AudioContext();
194 | const analyser = audioContext.createAnalyser();
195 | const microphone = audioContext.createMediaStreamSource(stream);
196 | const javascriptNode = audioContext.createScriptProcessor(256, 1, 1);
197 |
198 | analyser.smoothingTimeConstant = 0.5;
199 | analyser.fftSize = 1024;
200 |
201 | microphone.connect(analyser);
202 | analyser.connect(javascriptNode);
203 | javascriptNode.connect(audioContext.destination);
204 |
205 | javascriptNode.onaudioprocess = function () {
206 | var array = new Uint8Array(analyser.frequencyBinCount);
207 | analyser.getByteFrequencyData(array);
208 | var values = 0;
209 |
210 | var length = array.length;
211 | for (var i = 0; i < length; i++) {
212 | values += array[i];
213 | }
214 |
215 | // audio in expressed as one number
216 | var average = values / length;
217 | var inputvolume = average;
218 |
219 | // audio in spectrum expressed as array
220 | // console.log(array.toString());
221 | // useful for mouth shape variance
222 |
223 | // move the interface slider
224 | document.getElementById("inputlevel").value = inputvolume;
225 |
226 |
227 |
228 | // mic based / endless animations (do stuff)
229 |
230 | if (currentVrm != undefined){ //best to be sure
231 |
232 | // talk
233 |
234 | if (talktime == true){
235 | // todo: more vowelshapes
236 | var voweldamp = 53;
237 | var vowelmin = 12;
238 | if (inputvolume > (mouththreshold * 2)) {
239 | currentVrm.blendShapeProxy.setValue(
240 | THREE.VRMSchema.BlendShapePresetName.A,
241 | (
242 | (average - vowelmin) / voweldamp) * (mouthboost/10)
243 | );
244 |
245 | }else{
246 | currentVrm.blendShapeProxy.setValue(
247 | THREE.VRMSchema.BlendShapePresetName.A, 0
248 | );
249 | }}
250 |
251 |
252 | // move body
253 |
254 | // todo: replace with ease-to-target behaviour
255 | var damping = 750/(bodymotion/10);
256 | var springback = 1.001;
257 |
258 | if (average > (1 * bodythreshold)) {
259 | currentVrm.humanoid.getBoneNode(
260 | THREE.VRMSchema.HumanoidBoneName.Head
261 | ).rotation.x += (Math.random() - 0.5) / damping;
262 | currentVrm.humanoid.getBoneNode(
263 | THREE.VRMSchema.HumanoidBoneName.Head
264 | ).rotation.x /= springback;
265 | currentVrm.humanoid.getBoneNode(
266 | THREE.VRMSchema.HumanoidBoneName.Head
267 | ).rotation.y += (Math.random() - 0.5) / damping;
268 | currentVrm.humanoid.getBoneNode(
269 | THREE.VRMSchema.HumanoidBoneName.Head
270 | ).rotation.y /= springback;
271 | currentVrm.humanoid.getBoneNode(
272 | THREE.VRMSchema.HumanoidBoneName.Head
273 | ).rotation.z += (Math.random() - 0.5) / damping;
274 | currentVrm.humanoid.getBoneNode(
275 | THREE.VRMSchema.HumanoidBoneName.Head
276 | ).rotation.z /= springback;
277 |
278 | currentVrm.humanoid.getBoneNode(
279 | THREE.VRMSchema.HumanoidBoneName.Neck
280 | ).rotation.x += (Math.random() - 0.5) / damping;
281 | currentVrm.humanoid.getBoneNode(
282 | THREE.VRMSchema.HumanoidBoneName.Neck
283 | ).rotation.x /= springback;
284 | currentVrm.humanoid.getBoneNode(
285 | THREE.VRMSchema.HumanoidBoneName.Neck
286 | ).rotation.y += (Math.random() - 0.5) / damping;
287 | currentVrm.humanoid.getBoneNode(
288 | THREE.VRMSchema.HumanoidBoneName.Neck
289 | ).rotation.y /= springback;
290 | currentVrm.humanoid.getBoneNode(
291 | THREE.VRMSchema.HumanoidBoneName.Neck
292 | ).rotation.z += (Math.random() - 0.5) / damping;
293 | currentVrm.humanoid.getBoneNode(
294 | THREE.VRMSchema.HumanoidBoneName.Neck
295 | ).rotation.z /= springback;
296 |
297 | currentVrm.humanoid.getBoneNode(
298 | THREE.VRMSchema.HumanoidBoneName.UpperChest
299 | ).rotation.x += (Math.random() - 0.5) / damping;
300 | currentVrm.humanoid.getBoneNode(
301 | THREE.VRMSchema.HumanoidBoneName.UpperChest
302 | ).rotation.x /= springback;
303 | currentVrm.humanoid.getBoneNode(
304 | THREE.VRMSchema.HumanoidBoneName.UpperChest
305 | ).rotation.y += (Math.random() - 0.5) / damping;
306 | currentVrm.humanoid.getBoneNode(
307 | THREE.VRMSchema.HumanoidBoneName.UpperChest
308 | ).rotation.y /= springback;
309 | currentVrm.humanoid.getBoneNode(
310 | THREE.VRMSchema.HumanoidBoneName.UpperChest
311 | ).rotation.z += (Math.random() - 0.5) / damping;
312 | currentVrm.humanoid.getBoneNode(
313 | THREE.VRMSchema.HumanoidBoneName.UpperChest
314 | ).rotation.z /= springback;
315 |
316 | currentVrm.humanoid.getBoneNode(
317 | THREE.VRMSchema.HumanoidBoneName.RightShoulder
318 | ).rotation.x += (Math.random() - 0.5) / damping;
319 | currentVrm.humanoid.getBoneNode(
320 | THREE.VRMSchema.HumanoidBoneName.RightShoulder
321 | ).rotation.x /= springback;
322 | currentVrm.humanoid.getBoneNode(
323 | THREE.VRMSchema.HumanoidBoneName.RightShoulder
324 | ).rotation.y += (Math.random() - 0.5) / damping;
325 | currentVrm.humanoid.getBoneNode(
326 | THREE.VRMSchema.HumanoidBoneName.RightShoulder
327 | ).rotation.y /= springback;
328 | currentVrm.humanoid.getBoneNode(
329 | THREE.VRMSchema.HumanoidBoneName.RightShoulder
330 | ).rotation.z += (Math.random() - 0.5) / damping;
331 | currentVrm.humanoid.getBoneNode(
332 | THREE.VRMSchema.HumanoidBoneName.RightShoulder
333 | ).rotation.z /= springback;
334 |
335 | currentVrm.humanoid.getBoneNode(
336 | THREE.VRMSchema.HumanoidBoneName.LeftShoulder
337 | ).rotation.x += (Math.random() - 0.5) / damping;
338 | currentVrm.humanoid.getBoneNode(
339 | THREE.VRMSchema.HumanoidBoneName.LeftShoulder
340 | ).rotation.x /= springback;
341 | currentVrm.humanoid.getBoneNode(
342 | THREE.VRMSchema.HumanoidBoneName.LeftShoulder
343 | ).rotation.y += (Math.random() - 0.5) / damping;
344 | currentVrm.humanoid.getBoneNode(
345 | THREE.VRMSchema.HumanoidBoneName.LeftShoulder
346 | ).rotation.y /= springback;
347 | currentVrm.humanoid.getBoneNode(
348 | THREE.VRMSchema.HumanoidBoneName.LeftShoulder
349 | ).rotation.z += (Math.random() - 0.5) / damping;
350 | currentVrm.humanoid.getBoneNode(
351 | THREE.VRMSchema.HumanoidBoneName.LeftShoulder
352 | ).rotation.z /= springback;
353 |
354 | }
355 |
356 | // yay/oof expression drift
357 | expressionyay += (Math.random() - 0.5) / expressionease;
358 | if(expressionyay > expressionlimityay){expressionyay=expressionlimityay};
359 | if(expressionyay < 0){expressionyay=0};
360 | currentVrm.blendShapeProxy.setValue(THREE.VRMSchema.BlendShapePresetName.Fun, expressionyay);
361 | expressionoof += (Math.random() - 0.5) / expressionease;
362 | if(expressionoof > expressionlimitoof){expressionoof=expressionlimitoof};
363 | if(expressionoof < 0){expressionoof=0};
364 | currentVrm.blendShapeProxy.setValue(THREE.VRMSchema.BlendShapePresetName.Angry, expressionoof);
365 |
366 | }
367 |
368 |
369 |
370 |
371 |
372 | //look at camera is more efficient on blink
373 | lookAtTarget.position.x = camera.position.x;
374 | lookAtTarget.position.y = (-camera.position.y / 2) + 0.5; // same as (y - y - y) / 2 + 0.5
375 |
376 | }; // end fn stream
377 | },
378 | function (err) {
379 | console.log("The following error occurred: " + err.name);
380 | }
381 | );
382 |
383 |
384 | // blink
385 |
386 | function blink() {
387 | var blinktimeout = Math.floor(Math.random() * 250) + 50;
388 | lookAtTarget.position.y =
389 | camera.position.y - camera.position.y * 2 + 1.25;
390 |
391 | setTimeout(() => {
392 | currentVrm.blendShapeProxy.setValue(
393 | THREE.VRMSchema.BlendShapePresetName.BlinkL,
394 | 0
395 | );
396 | currentVrm.blendShapeProxy.setValue(
397 | THREE.VRMSchema.BlendShapePresetName.BlinkR,
398 | 0
399 | );
400 | }, blinktimeout);
401 |
402 | currentVrm.blendShapeProxy.setValue(
403 | THREE.VRMSchema.BlendShapePresetName.BlinkL,
404 | 1
405 | );
406 | currentVrm.blendShapeProxy.setValue(
407 | THREE.VRMSchema.BlendShapePresetName.BlinkR,
408 | 1
409 | );
410 | }
411 |
412 |
413 |
414 | // loop blink timing
415 | (function loop() {
416 | var rand = Math.round(Math.random() * 10000) + 1000;
417 | setTimeout(function () {
418 | blink();
419 | loop();
420 | }, rand);
421 | })();
422 |
423 | // drag and drop + file handler
424 | window.addEventListener( 'dragover', function( event ) {
425 | event.preventDefault();
426 | } );
427 |
428 | window.addEventListener( 'drop', function( event ) {
429 | event.preventDefault();
430 |
431 | // read given file then convert it to blob url
432 | const files = event.dataTransfer.files;
433 | if ( !files ) { return; }
434 | const file = files[0];
435 | if ( !file ) { return; }
436 | const blob = new Blob( [ file ], { type: "application/octet-stream" } );
437 | const url = URL.createObjectURL( blob );
438 | load( url );
439 | } );
440 |
441 |
442 | // handle window resizes
443 |
444 |
445 | window.addEventListener( 'resize', onWindowResize, false );
446 |
447 | function onWindowResize(){
448 |
449 | camera.aspect = window.innerWidth / window.innerHeight;
450 | camera.updateProjectionMatrix();
451 |
452 | renderer.setSize( window.innerWidth, window.innerHeight );
453 |
454 | }
455 | // interface handling
456 |
457 | var talktime = true;
458 |
459 | function interface() {
460 |
461 | if (initvalues == true){
462 | if (localStorage.localvalues) {
463 | initvalues = false;
464 | document.getElementById("mouththreshold").value = mouththreshold;
465 | document.getElementById("mouthboost").value = mouthboost;
466 | document.getElementById("bodythreshold").value = bodythreshold;
467 | document.getElementById("bodymotion").value = bodymotion;
468 | document.getElementById("expression").value = expression;
469 | }}
470 |
471 | mouththreshold = document.getElementById("mouththreshold").value;
472 | mouthboost = document.getElementById("mouthboost").value;
473 | bodythreshold = document.getElementById("bodythreshold").value;
474 | bodymotion = document.getElementById("bodymotion").value;
475 |
476 | expression = document.getElementById("expression").value;
477 | expressionlimityay = (expression);
478 | expressionlimitoof = (100 - expression);
479 | expressionlimityay = expressionlimityay/100;
480 | expressionlimitoof = expressionlimitoof/100;
481 | expressionlimityay = expressionlimityay*expressionintensity;
482 | expressionlimitoof = expressionlimitoof*expressionintensity;
483 |
484 | console.log("Expression " + expressionyay + " yay / " + expressionoof + " oof");
485 | console.log("Expression mix " + expressionlimityay + " yay / " + expressionlimitoof + " oof");
486 |
487 | // store it too
488 | localStorage.localvalues = 1;
489 | localStorage.mouththreshold = mouththreshold;
490 | localStorage.mouthboost = mouthboost;
491 | localStorage.bodythreshold = bodythreshold;
492 | localStorage.bodymotion = bodymotion;
493 | localStorage.expression = expression;
494 |
495 | }
496 |
497 | // click to dismiss non-vrm divs
498 | function hideinterface() {
499 |
500 | var a = document.getElementById("backplate");
501 | var b = document.getElementById("interface");
502 | var x = document.getElementById("infobar");
503 | var y = document.getElementById("credits");
504 | a.style.display = "none";
505 | b.style.display = "none";
506 | x.style.display = "none";
507 | y.style.display = "none";
508 |
509 | }
510 |
511 | // click to dismiss non-interface divs
512 | function hideinfo() {
513 |
514 | var a = document.getElementById("backplate");
515 | var x = document.getElementById("infobar");
516 | var y = document.getElementById("credits");
517 | a.style.display = "none";
518 | x.style.display = "none";
519 | y.style.display = "none";
520 |
521 | }
522 |
523 | // load file from user picker
524 | function dofile(){
525 | var file = document.querySelector('input[type=file]').files[0];
526 | if ( !file ) { return; }
527 | const blob = new Blob( [ file ], { type: "application/octet-stream" } );
528 | const url = URL.createObjectURL( blob );
529 | load( url );
530 | }
531 | // end
532 |
533 | // wait to trigger interface and load init values
534 |
535 | setTimeout(() => { interface(); }, 500);
536 |
537 | //ok
538 |
--------------------------------------------------------------------------------
/style.css:
--------------------------------------------------------------------------------
1 | * {
2 | margin: 0;
3 | padding: 0;
4 | border: 0;
5 | overflow: hidden;
6 | }
7 |
8 | body {
9 | background-color:black;
10 | }
11 |
12 | canvas {
13 | position:absolute;
14 | width:100vw !important;
15 | height:100vh !important;
16 | /*cursor: none;*/
17 | z-index:1;
18 | }
19 |
20 | .backplate{
21 | position:absolute;
22 | font-family: sans-serif;
23 | color:white;
24 | margin:10px;
25 | font-size:14px;
26 | opacity:0.75;
27 | background-color:#2F5164;
28 | padding:10px;
29 | border-radius:5px;
30 | }
31 |
32 |
33 | .credits{
34 | position:absolute;
35 | right:0px;
36 | font-family: sans-serif;
37 | color:white;
38 | margin:10px;
39 | font-size:14px;
40 | text-align:right;
41 | opacity:0.75;
42 | background-color:#2F5164;
43 | padding:10px;
44 | border-radius:5px;
45 | z-index:1;
46 | }
47 |
48 |
49 | .infobar{
50 | position:absolute;
51 | bottom:0px;
52 | left:0px;
53 | font-family: sans-serif;
54 | background-color:#2F5164;
55 | margin:10px;
56 | font-size:14px;
57 | z-index:998;
58 | color:white;
59 | padding:5px;
60 | border-radius:5px;
61 | opacity:0.75;
62 | }
63 |
64 | .interface{
65 | position:absolute;
66 | bottom:0px;
67 | right:0px;
68 | font-family: sans-serif;
69 | background-color:#2F5164;
70 | margin:10px;
71 | font-size:13px;
72 | z-index:999;
73 | color:white;
74 | padding:10px;
75 | border-radius:5px;
76 | width:200px;
77 | opacity:0.75;
78 | }
79 |
80 | .interface .sliders{
81 | }
82 |
83 |
84 | .interface .fileinput {
85 | margin-top:10px;
86 | }
87 |
88 |
89 | .interface .slider{
90 | appearance: none;
91 | width: 100%;
92 | height: 13px;
93 | background: #000;
94 | opacity:1;
95 | margin-top:3px;
96 | margin-bottom:3px;
97 | overflow:hidden;
98 | border-radius:30px;
99 | }
100 |
101 | .slider::-webkit-slider-thumb {
102 | -webkit-appearance: none;
103 | appearance: none;
104 | background: #fff;
105 | width: 18px;
106 | height: 18px;
107 | border-radius:30px;
108 | }
109 |
110 |
111 | .interface .file{
112 | background-color:#2F5164;
113 | }
114 |
115 |
116 | .interface .closebtn{
117 | color:red;
118 | text-align:right;
119 | margin-top: -5px;
120 | margin-bottom: -15px;
121 | font-size:14px;
122 | }
123 |
124 | .interface input[type=number]::-webkit-inner-spin-button,
125 | .interface input[type=number]::-webkit-outer-spin-button {
126 | opacity: 1;
127 | }
128 |
129 | .highlight{
130 | background-color : black;
131 | border-radius : 5px;
132 | padding: 1px;
133 | padding-left:5px;
134 | padding-right:5px;
135 | color: white;
136 | font-weight:bold;
137 | font-size:12px;
138 | line-height:15px;
139 | }
140 |
141 |
--------------------------------------------------------------------------------