├── LICENSE
├── README.md
├── gst-jack-janus.sh
├── images
│   ├── image1.png
│   ├── image2.png
│   └── image3.png
├── web
│   ├── css
│   │   ├── demo.css
│   │   └── piano.css
│   ├── favicon.ico
│   ├── index.html
│   ├── index.js
│   └── janus.js
└── webrtc-piano.lua
/LICENSE:
--------------------------------------------------------------------------------
1 | MIT License
2 |
3 | Copyright (c) 2020 Lorenzo Miniero
4 |
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 |
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 |
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
22 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | webrtc-piano
2 | ============
3 |
4 | This is the code I originally wrote for the "Dangerous Demo" session at ClueCon '19 a few months ago.
5 |
6 | The short demo (about 3 minutes) can be seen on [YouTube](https://youtu.be/8Hzg4hSJMsQ?t=790), and was basically an attempt to use WebRTC and Jack together for music purposes. More specifically:
7 |
8 | * I used this cool CSS [piano keyboard](https://codepen.io/felipefialho/pen/oDEki) snippet by [Felipe Fialho](https://piano.felipefialho.com/) to allow multiple web participants to "play" the same keyboard.
9 | * I then hooked the key hits to a data channel, connected to the [Janus WebRTC Server](https://github.com/meetecho/janus-gateway/).
10 | * A Lua script in Janus translates the key hits to MIDI events, which are passed to "Midi Through" in Jack.
11 | * A custom Gstreamer pipeline then listens for the output, encodes it to Opus, and sends it via RTP to a [Streaming mountpoint](https://janus.conf.meetecho.com/docs/streaming) in Janus.
12 | * All web users connected to the demo subscribe to the mountpoint, and so can hear the output via WebRTC in real-time.
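
Concretely, each key hit travels over the data channel as a small JSON message; these are the three shapes used by `web/index.js` (the note value 60 below is just an example key identifier):

```javascript
// Sent once, right after the DataChannel opens
function registerMessage(name, color) {
  return JSON.stringify({ action: "register", name: name, color: color });
}
// Sent when a key is pressed...
function playMessage(note) {
  return JSON.stringify({ action: "play", note: note });
}
// ...and when it's released
function stopMessage(note) {
  return JSON.stringify({ action: "stop", note: note });
}

console.log(playMessage(60));	// {"action":"play","note":60}
```

The Lua script is what turns these `play`/`stop` messages into actual MIDI note-on/note-off events.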
13 |
14 | While of course not perfect, I thought this was a cool experiment, so I finally decided to update it and clean it up a bit, and share it here. Hopefully, it will help foster more usages of WebRTC for music-related purposes, which as I wrote in [this post](https://linuxmusicians.com/viewtopic.php?f=28&t=21617) is something I'd really love to see happening more. Feel free to do with it whatever you want!
15 |
16 | ## Requirements
17 |
18 | The demo requires a working installation of Janus, and of the Lua plugin: please refer to the Janus README for instructions on how to install both properly. Besides, you'll need to install the [midialsa](https://www.pjb.com.au/comp/lua/midialsa.html) library for Lua, as the script provided in this repo makes use of it.
19 |
20 | You'll also need an installation of gstreamer that supports [jackaudiosrc](https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-good/html/gst-plugins-good-plugins-jackaudiosrc.html), besides Opus and RTP, since the script provided in this repo uses them all.
21 |
22 | Of course, you'll need a working [Jack](https://jackaudio.org/) setup as well. How you'll configure that (e.g., what to use to render the MIDI data) will be up to you, and explained later.
23 |
24 | ## Configuring Janus
25 |
26 | Besides the usual configuration, you'll need to do two things:
27 |
28 | 1. configure the Lua plugin to use the script provided here;
29 | 2. create a Streaming plugin mountpoint that can receive the stream encoded by the gstreamer pipeline in this repo.
30 |
31 | The former is quite easy, and can be done by editing the `janus.plugin.lua.jcfg` configuration file, e.g.:
32 |
33 | ```
34 | general: {
35 | path = "/opt/janus/share/janus/lua"
36 | script = "/home/lminiero/webrtc-piano/webrtc-piano.lua"
37 | }
38 | ```
39 |
40 | If everything's working, you should see something like this in the logs when you launch Janus:
41 |
42 | ```
43 | Loading plugin 'libjanus_lua.so'...
44 | [webrtc-piano.lua] Loading...
45 | [webrtc-piano.lua] Loaded
46 | [webrtc-piano.lua] Initialized
47 | Janus Lua plugin initialized!
48 | ```
49 |
50 | Creating the Streaming plugin mountpoint is quite straightforward too. All you need to do is create an audio-only mountpoint that listens on the port the gstreamer pipeline is configured to send to: of course, feel free to choose a different port and update the gstreamer pipeline accordingly. This is a simple example you can add to the `janus.plugin.streaming.jcfg` configuration file:
51 |
52 | ```
53 | webrtc-piano: {
54 | type = "rtp"
55 | id = "2019"
56 | audio = true
57 | audioport = 20190
58 | audiopt = 111
59 | audiortpmap = "opus/48000/2"
60 | secret = "adminpwd"
61 | }
62 | ```
63 |
64 | If everything's working, when you start the gstreamer script (`gst-jack-janus.sh`) you should see something like this in the Janus logs:
65 |
66 | ```
67 | [webrtc-piano] New audio stream! (ssrc=1980195523)
68 | ```
69 |
70 | which means Janus is correctly receiving data from it.
71 |
72 | ## Playing with Jack
73 |
74 | After you start Janus and launch the gstreamer script, your Jack graph should look something like this:
75 |
76 | ![Jack graph](images/image1.png)
77 |
78 | As you can see in the screenshot above, the gstreamer pipeline should be ready to receive incoming audio via Jack, while the Lua script in Janus (which registers as "Janus WebRTC Piano client") should automatically connect to the MIDI through node via ALSA. I chose to have it connected there to make it easier to switch instruments dynamically, but you're of course free to edit the Lua script to implement a different behaviour, or change the connections to your liking after the fact.
79 |
80 | After that, it should be straightforward to figure out the next steps: all you need to do is launch whatever you want to receive the MIDI input and render it to audio, and connect things accordingly. In the screenshot below, I used a Yoshimi instance for the purpose, but it should work with anything else. In case the application you want to send MIDI to doesn't support ALSA, but only Jack, you'll have to use something like [a2jmidid](https://github.com/linuxaudio/a2jmidid/) to bridge them.
81 |
82 | ![Jack graph with Yoshimi](images/image2.png)
83 |
84 | I simply connected the MIDI through node to Yoshimi (so that it receives the notes played by the web users), and then connected the output of Yoshimi itself to the gstreamer pipeline (which means that whatever Yoshimi plays will be encoded and sent in real-time to Janus, and from there to users). At this point, the demo should be ready to be tested.
85 |
86 | ## Using the demo
87 |
88 | To try the demo, you need to serve the contents of the `web` folder with a web server. While that's not strictly needed locally (localhost), you may have to serve the page via HTTPS if the user is on a different machine: in that case, make sure you follow the [deployment guidelines](https://janus.conf.meetecho.com/docs/deploy) for Janus to serve the Janus API and the pages through the same server (e.g., via nginx), and remember to update the `server` variable in `index.js` accordingly.
89 |
90 | **IMPORTANT:** browsers currently don't support Jack, which means that if you launch the demo on the same machine Janus and Jack are started on, you will NOT hear the output coming via WebRTC through the browser. You'll need a browser launched on a separate machine for this.
91 |
92 | That said, using the demo should be quite straightforward. After opening the page, you should be prompted for a name, which will be how you'll be seen in the demo: a random color is picked for you as well, and the same will happen for other users joining the session.
93 |
94 | ![Demo page](images/image3.png)
95 |
96 | Playing on the keyboard should result in you hearing the audio (in my case rendered by Yoshimi) through your browser. When you hit a key, the event is relayed to all the other users in the session as well: this means that, if another participant plays a note, you'll see the related key change to the color associated with that participant, and they'll see the same when it's you playing. Chances are this is very buggy (I didn't test it much), but for demo purposes it's more than enough.
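
The "key takes the color of whoever pressed it" behaviour is handled in `web/index.js` with a small per-key stack of participant colors: the most recent color wins, and the original background is restored once nobody holds the key anymore. A condensed, self-contained sketch of that bookkeeping (the note and color values below are just examples):

```javascript
// Per-key state: the key's original background plus a stack of the
// colors of the participants currently holding that key down
var bgs = {};

function keyPressed(note, color, original) {
  if(!bgs[note])
    bgs[note] = { original: original, colors: [] };
  bgs[note].colors.push(color);
  return color;	// Color to display: the newest press wins
}

function keyReleased(note, color) {
  var bg = bgs[note];
  if(!bg)
    return undefined;	// Stray release, nothing to do
  var index = bg.colors.indexOf(color);
  if(index > -1)
    bg.colors.splice(index, 1);
  if(bg.colors.length === 0) {
    // Last participant released the key: restore the original background
    var original = bg.original;
    delete bgs[note];
    return original;
  }
  // Someone is still holding the key: show the most recent color
  return bg.colors[bg.colors.length - 1];
}
```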
97 |
98 | Hope you'll enjoy this little toy! Please feel free to join the [Janus community](https://groups.google.com/forum/#!forum/meetecho-janus) in case you have trouble installing Janus, or getting it to run, while playing with this.
99 |
--------------------------------------------------------------------------------
/gst-jack-janus.sh:
--------------------------------------------------------------------------------
1 | #!/bin/sh
2 |
3 | # Capture audio from Jack (without auto-connecting any ports), encode it
4 | # to Opus at 20kbps, and send it via RTP to the Janus Streaming
5 | # mountpoint listening on localhost:20190
6 | gst-launch-1.0 \
7 | jackaudiosrc name="Janus Streaming mountpoint" connect=0 ! \
8 | audioconvert ! opusenc bitrate=20000 ! rtpopuspay ! \
9 | udpsink host=127.0.0.1 port=20190
10 |
--------------------------------------------------------------------------------
/images/image1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/lminiero/webrtc-piano/dbcb9329d1626dfdf354355c1f7c4619a66bec07/images/image1.png
--------------------------------------------------------------------------------
/images/image2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/lminiero/webrtc-piano/dbcb9329d1626dfdf354355c1f7c4619a66bec07/images/image2.png
--------------------------------------------------------------------------------
/images/image3.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/lminiero/webrtc-piano/dbcb9329d1626dfdf354355c1f7c4619a66bec07/images/image3.png
--------------------------------------------------------------------------------
/web/css/demo.css:
--------------------------------------------------------------------------------
1 | .rounded {
2 | border-radius: 5px;
3 | }
4 |
5 | .centered {
6 | display: block;
7 | margin: auto;
8 | }
9 |
10 | .relative {
11 | position: relative;
12 | }
13 |
14 | .navbar-brand {
15 | margin-left: 0px !important;
16 | }
17 |
18 | .navbar-default {
19 | -webkit-box-shadow: 0px 3px 5px rgba(100, 100, 100, 0.49);
20 | -moz-box-shadow: 0px 3px 5px rgba(100, 100, 100, 0.49);
21 | box-shadow: 0px 3px 5px rgba(100, 100, 100, 0.49);
22 | }
23 |
24 | .navbar-header {
25 | padding-left: 40px;
26 | }
27 |
28 | .margin-sm {
29 | margin: 5px !important;
30 | }
31 | .margin-md {
32 | margin: 10px !important;
33 | }
34 | .margin-xl {
35 | margin: 20px !important;
36 | }
37 | .margin-bottom-sm {
38 | margin-bottom: 5px !important;
39 | }
40 | .margin-bottom-md {
41 | margin-bottom: 10px !important;
42 | }
43 | .margin-bottom-xl {
44 | margin-bottom: 20px !important;
45 | }
46 |
47 | .divider {
48 | width: 100%;
49 | text-align: center;
50 | }
51 |
52 | .divider hr {
53 | margin-left: auto;
54 | margin-right: auto;
55 | width: 45%;
56 | }
57 |
58 | .fa-2 {
59 | font-size: 2em !important;
60 | }
61 | .fa-3 {
62 | font-size: 4em !important;
63 | }
64 | .fa-4 {
65 | font-size: 7em !important;
66 | }
67 | .fa-5 {
68 | font-size: 12em !important;
69 | }
70 | .fa-6 {
71 | font-size: 20em !important;
72 | }
73 |
74 | div.no-video-container {
75 | position: relative;
76 | }
77 |
78 | .no-video-icon {
79 | width: 100%;
80 | height: 240px;
81 | text-align: center;
82 | }
83 |
84 | .no-video-text {
85 | text-align: center;
86 | position: absolute;
87 | bottom: 0px;
88 | right: 0px;
89 | left: 0px;
90 | font-size: 24px;
91 | }
92 |
93 | .meetecho-logo {
94 | padding: 12px !important;
95 | }
96 |
97 | .meetecho-logo > img {
98 | height: 26px;
99 | }
100 |
101 | pre {
102 | white-space: pre-wrap;
103 | white-space: -moz-pre-wrap;
104 | white-space: -pre-wrap;
105 | white-space: -o-pre-wrap;
106 | word-wrap: break-word;
107 | }
108 |
109 | .progress {
110 | position: relative;
111 | cursor: pointer;
112 | }
113 |
114 | .progress span {
115 | position: absolute;
116 | display: block;
117 | width: 100%;
118 | color: black;
119 | }
120 |
121 | textarea {
122 | font-family: monospace;
123 | background-color: black;
124 | color: white;
125 | font-size: large;
126 | width: 100%;
127 | }
128 |
--------------------------------------------------------------------------------
/web/css/piano.css:
--------------------------------------------------------------------------------
1 | .piano {
2 | background: -webkit-linear-gradient(-65deg, #000, #222, #000, #666, #222 75%);
3 | background: -moz-linear-gradient(-65deg, #000, #222, #000, #666, #222 75%);
4 | background: -o-linear-gradient(-65deg, #000, #222, #000, #666, #222 75%);
5 | background: linear-gradient(-65deg, #000, #222, #000, #666, #222 75%);
6 | border-top: 2px solid #111;
7 | box-shadow: inset 0 -1px 1px rgba(255, 255, 255, 0.5), inset 0 -4px 5px #000;
8 | margin: 0;
9 | padding: 0 1% 3%;
10 | text-align: center;
11 | }
12 | .key {
13 | display: inline-block;
14 | position: relative;
15 | margin: 0 2px;
16 | width: 6%;
17 | max-width: 85px;
18 | }
19 | .key:active .black-key,
20 | .key.active .black-key {
21 | top: -5px;
22 | }
23 | .key .white-key {
24 | background: -webkit-linear-gradient(-30deg, #f8f8f8, #fff);
25 | background: -moz-linear-gradient(-30deg, #f8f8f8, #fff);
26 | background: -o-linear-gradient(-30deg, #f8f8f8, #fff);
27 | background: linear-gradient(-30deg, #f8f8f8, #fff);
28 | box-shadow: inset 0 1px 0px #ffffff, inset 0 -1px 0px #ffffff, inset 1px 0px 0px #ffffff, inset -1px 0px 0px #ffffff, 0 4px 3px rgba(0, 0, 0, 0.7), inset 0 -1px 0px #ffffff, inset 1px 0px 0px #ffffff, inset -1px -1px 15px rgba(0, 0, 0, 0.5), -3px 4px 6px rgba(0, 0, 0, 0.5);
29 | display: block;
30 | height: 300px;
31 | }
32 | .key .white-key:active,
33 | .key .white-key.active {
34 | box-shadow: inset 0 1px 0px #ffffff, inset 0 -1px 0px #ffffff, inset 1px 0px 0px #ffffff, inset -1px 0px 0px #ffffff, 0 4px 3px rgba(0, 0, 0, 0.7), inset 0 -1px 0px #ffffff, inset 1px 0px 0px #ffffff, inset -1px -1px 15px #000000, -3px 4px 6px rgba(0, 0, 0, 0.5);
35 | position: relative;
36 | top: -5px;
37 | height: 295px;
38 | }
39 | .key .black-key {
40 | content: "";
41 | box-shadow: inset 0px -1px 2px rgba(255, 255, 255, 0.4), 0 2px 3px rgba(0, 0, 0, 0.4);
42 | background: -webkit-linear-gradient(-20deg, #222, #000, #222);
43 | background: -moz-linear-gradient(-20deg, #222, #000, #222);
44 | background: -o-linear-gradient(-20deg, #222, #000, #222);
45 | background: linear-gradient(-20deg, #222, #000, #222);
46 | border-width: 1px 3px 8px;
47 | border-style: solid;
48 | border-color: #666 #222 #111 #555;
49 | height: 160px;
50 | position: absolute;
51 | top: 0px;
52 | right: -40%;
53 | width: 70%;
54 | z-index: 10;
55 | }
56 | .key .black-key:active,
57 | .key .black-key.active {
58 | border-bottom-width: 3px;
59 | top: 0;
60 | }
61 | @media (max-width: 767px) {
62 | .key {
63 | width: 8%;
64 | }
65 | .key:nth-child(1),
66 | .key:nth-child(2),
67 | .key:nth-child(3) {
68 | display: none;
69 | }
70 | }
71 | @media (max-width: 480px) {
72 | .key {
73 | width: 12%;
74 | }
75 | .key:nth-child(11),
76 | .key:nth-child(12),
77 | .key:nth-child(13),
78 | .key:nth-child(14) {
79 | display: none;
80 | }
81 | }
82 |
--------------------------------------------------------------------------------
/web/favicon.ico:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/lminiero/webrtc-piano/dbcb9329d1626dfdf354355c1f7c4619a66bec07/web/favicon.ico
--------------------------------------------------------------------------------
/web/index.html:
--------------------------------------------------------------------------------
[index.html: the HTML markup was lost in extraction; only this text survives]
Page title: "Janus WebRTC Piano Demo"
Demo details: "Originally written as a Dangerous Demo for ClueCon 2019 a few months ago:
having fun with WebRTC and music! Press the Start button above to launch the demo."
Panel title: "Panel title"
--------------------------------------------------------------------------------
/web/index.js:
--------------------------------------------------------------------------------
1 | // We make use of this 'server' variable to provide the address of the
2 | // REST Janus API. By default, in this example we assume that Janus is
3 | // co-located with the web server hosting the HTML pages but listening
4 | // on a different port (8088, the default for HTTP in Janus), which is
5 | // why we make use of the 'window.location.hostname' base address. Since
6 | // Janus can also do HTTPS, and considering we don't really want to make
7 | // use of HTTP for Janus if your demos are served on HTTPS, we also rely
8 | // on the 'window.location.protocol' prefix to build the variable, in
9 | // particular to also change the port used to contact Janus (8088 for
10 | // HTTP and 8089 for HTTPS, if enabled).
11 | // In case you place Janus behind an Apache frontend (as we did on the
12 | // online demos at http://janus.conf.meetecho.com) you can just use a
13 | // relative path for the variable, e.g.:
14 | //
15 | // var server = "/janus";
16 | //
17 | // which will take care of this on its own.
18 | //
19 | //
20 | // If you want to use the WebSockets frontend to Janus, instead, you'll
21 | // have to pass a different kind of address, e.g.:
22 | //
23 | // var server = "ws://" + window.location.hostname + ":8188";
24 | //
25 | // Of course this assumes that support for WebSockets has been built in
26 | // when compiling the server. WebSockets support has not been tested
27 | // as much as the REST API, so handle with care!
28 | //
29 | //
30 | // If you have multiple options available, and want to let the library
31 | // autodetect the best way to contact your server (or pool of servers),
32 | // you can also pass an array of servers, e.g., to provide alternative
33 | // means of access (e.g., try WebSockets first and, if that fails, fall
34 | // back to plain HTTP) or just have failover servers:
35 | //
36 | // var server = [
37 | // "ws://" + window.location.hostname + ":8188",
38 | // "/janus"
39 | // ];
40 | //
41 | // This will tell the library to try connecting to each of the servers
42 | // in the presented order. The first working server will be used for
43 | // the whole session.
44 | //
45 | var server = null;
46 | if(window.location.protocol === 'http:')
47 | server = "http://" + window.location.hostname + ":8088/janus";
48 | else
49 | server = "https://" + window.location.hostname + ":8089/janus";
50 |
51 | var janus = null;
52 | var streaming = null, controller = null;
53 | var opaqueId = "webrtc-piano-"+Janus.randomString(12);
54 |
55 | var myname = null, mycolor = createRandomColor();
56 |
57 | var streamId = 2019;
58 | var notes = {}, bgs = {};
59 |
60 | $(document).ready(function() {
61 | // Initialize the library (all console debuggers enabled)
62 | Janus.init({debug: "all", callback: function() {
63 | // Use a button to start the demo
64 | $('#start').one('click', function() {
65 | $(this).attr('disabled', true).unbind('click');
66 | // Make sure the browser supports WebRTC
67 | if(!Janus.isWebrtcSupported()) {
68 | bootbox.alert("No WebRTC support... ");
69 | return;
70 | }
71 | // Create session
72 | janus = new Janus(
73 | {
74 | server: server,
75 | success: function() {
76 | // Prompt for a name
77 | askForName();
78 | },
79 | error: function(error) {
80 | Janus.error(error);
81 | bootbox.alert(error, function() {
82 | window.location.reload();
83 | });
84 | },
85 | destroyed: function() {
86 | window.location.reload();
87 | }
88 | });
89 | });
90 | }});
91 | });
92 |
93 | function askForName() {
94 | bootbox.prompt("What's your name?", function(result) {
95 | if(!result || result === "") {
96 | askForName();
97 | return;
98 | }
99 | myname = result;
100 | // Create the MIDI controller first
101 | createController();
102 | // Start the streaming mountpoint as well
103 | receiveAudio();
104 | });
105 | }
106 |
107 | function createController() {
108 | // Attach to the Lua plugin
109 | janus.attach(
110 | {
111 | plugin: "janus.plugin.lua",
112 | opaqueId: opaqueId,
113 | success: function(pluginHandle) {
114 | controller = pluginHandle;
115 | Janus.log("Plugin attached! (" + controller.getPlugin() + ", id=" + controller.getId() + ")");
116 | // Setup the DataChannel
117 | var body = { request: "setup" };
118 | Janus.debug("Sending message (" + JSON.stringify(body) + ")");
119 | controller.send({ message: body });
120 | },
121 | error: function(error) {
122 | console.error(" -- Error attaching plugin...", error);
123 | bootbox.alert("Error attaching plugin... " + error);
124 | },
125 | webrtcState: function(on) {
126 | Janus.log("Janus says our controller WebRTC PeerConnection is " + (on ? "up" : "down") + " now");
127 | },
128 | onmessage: function(msg, jsep) {
129 | Janus.debug(" ::: Got a message :::", msg);
130 | if(msg["error"]) {
131 | bootbox.alert(msg["error"]);
132 | }
133 | if(jsep) {
134 | // Answer
135 | controller.createAnswer(
136 | {
137 | jsep: jsep,
138 | media: { audio: false, video: false, data: true }, // We only use datachannels
139 | success: function(jsep) {
140 | Janus.debug("Got SDP!", jsep);
141 | var body = { request: "ack" };
142 | controller.send({ message: body, jsep: jsep });
143 | },
144 | error: function(error) {
145 | Janus.error("WebRTC error:", error);
146 | bootbox.alert("WebRTC error... " + error.message);
147 | }
148 | });
149 | }
150 | },
151 | ondataopen: function(data) {
152 | Janus.log("The DataChannel is available!");
153 | $('#details').remove();
154 | $('#start').removeAttr('disabled').html("Stop")
155 | .click(function() {
156 | $(this).attr('disabled', true);
157 | janus.destroy();
158 | });
159 | // Register the name+color
160 | sendDataMessage({ action: "register", name: myname, color: mycolor });
161 | // Show the piano
162 | $('#piano').removeClass('hide');
163 | $('.white-key, .black-key')
164 | .on('mousedown', function(ev) {
165 | notes[ev.target.dataset.key] = true;
166 | sendDataMessage({ action: "play", note: parseInt(ev.target.dataset.key) });
167 | })
168 | .on('mouseup mouseleave', function(ev) {
169 | if(notes[ev.target.dataset.key] === true) {
170 | delete notes[ev.target.dataset.key];
171 | sendDataMessage({ action: "stop", note: parseInt(ev.target.dataset.key) });
172 | }
173 | });
174 | },
175 | ondata: function(data) {
176 | Janus.debug("We got data from the DataChannel!", data);
177 | handleResponse(JSON.parse(data));
178 | },
179 | oncleanup: function() {
180 | Janus.log(" ::: Got a cleanup notification :::");
181 | }
182 | });
183 | }
184 |
185 | function sendDataMessage(note) {
186 | controller.data({
187 | text: JSON.stringify(note),
188 | error: function(reason) {
189 | bootbox.alert(reason);
190 | },
191 | success: function() {
192 | // TODO
193 | }
194 | });
195 | }
196 |
197 | function handleResponse(data) {
198 | Janus.debug(data);
199 | if(data["response"] === "error") {
200 | bootbox.alert(data["error"]);
201 | } else if(data["event"]) {
202 | if(data["event"] === "join") {
203 | var id = data["id"];
204 | var name = data["name"];
205 | var color = data["color"];
206 | $('#players').append('<span id="p' + id + '" style="background: ' + color + ';">' + name + '</span>');
207 | } else if(data["event"] === "leave") {
208 | var id = data["id"];
209 | $('#p' + id).remove();
210 | } else if(data["event"] === "play") {
211 | var note = data["note"];
212 | var name = data["name"];
213 | var color = data["color"];
214 | if(!bgs[note])
215 | bgs[note] = { original: $('[data-key=' + note + ']').css('background'), count: 0, colors: [] };
216 | bgs[note].count++;
217 | bgs[note].colors.push(color);
218 | $('[data-key=' + note + ']').css('background', color);
219 | console.log(bgs);
220 | } else if(data["event"] === "stop") {
221 | var note = data["note"];
222 | var name = data["name"], color = data["color"];
223 | if(!bgs[note])
224 | return;
225 | var index = bgs[note].colors.indexOf(color);
226 | if(index > -1)
227 | bgs[note].colors.splice(index, 1);
228 | if(bgs[note].colors.length === 0)
229 | $('[data-key=' + note + ']').css('background', bgs[note].original);
230 | else
231 | $('[data-key=' + note + ']').css('background', bgs[note].colors[bgs[note].colors.length-1]);
232 | bgs[note].count--;
233 | if(bgs[note].count === 0) {
234 | $('[data-key=' + note + ']').css('background', bgs[note].original);
235 | delete bgs[note];
236 | }
237 | console.log(bgs);
238 | }
239 | }
240 | }
241 |
242 | function receiveAudio() {
243 | // Attach to streaming plugin
244 | janus.attach(
245 | {
246 | plugin: "janus.plugin.streaming",
247 | opaqueId: opaqueId,
248 | success: function(pluginHandle) {
249 | streaming = pluginHandle;
250 | var body = { request: "watch", id: streamId };
251 | streaming.send({ message: body });
252 | },
253 | error: function(error) {
254 | Janus.error(" -- Error attaching plugin... ", error);
255 | bootbox.alert("Error attaching plugin... " + error);
256 | },
257 | webrtcState: function(on) {
258 | Janus.log("Janus says our streaming WebRTC PeerConnection is " + (on ? "up" : "down") + " now");
259 | },
260 | onmessage: function(msg, jsep) {
261 | Janus.debug(" ::: Got a message :::", msg);
262 | if(msg["error"]) {
263 | bootbox.alert(msg["error"]);
264 | return;
265 | }
266 | if(jsep) {
267 | Janus.debug("Handling SDP as well...", jsep);
268 | // Offer from the plugin, let's answer
269 | streaming.createAnswer(
270 | {
271 | jsep: jsep,
272 | // We only want recvonly audio
273 | media: { audioSend: false, videoSend: false, data: false },
274 | success: function(jsep) {
275 | Janus.debug("Got SDP!");
276 | Janus.debug(jsep);
277 | var body = { request: "start" };
278 | streaming.send({ message: body, jsep: jsep });
279 | },
280 | error: function(error) {
281 | Janus.error("WebRTC error:", error);
282 | bootbox.alert("WebRTC error... " + error.message);
283 | }
284 | });
285 | }
286 | },
287 | onremotestream: function(stream) {
288 | Janus.debug(" ::: Got a remote stream :::");
289 | Janus.debug(stream);
290 | if($('#remoteaudio').length === 0) {
291 | // Add the audio element
292 | $('#piano').append('<audio id="remoteaudio" autoplay/>');
293 | }
294 | Janus.attachMediaStream($('#remoteaudio').get(0), stream);
295 | },
296 | oncleanup: function() {
297 | Janus.log(" ::: Got a cleanup notification :::");
298 | }
299 | });
300 | }
301 |
302 | function createRandomColor() {
303 | return "hsla(" + ~~(360 * Math.random()) + "," + "70%," + "80%,1)";
304 | }
305 |
--------------------------------------------------------------------------------
/web/janus.js:
--------------------------------------------------------------------------------
1 | /*
2 | The MIT License (MIT)
3 |
4 | Copyright (c) 2016 Meetecho
5 |
6 | Permission is hereby granted, free of charge, to any person obtaining
7 | a copy of this software and associated documentation files (the "Software"),
8 | to deal in the Software without restriction, including without limitation
9 | the rights to use, copy, modify, merge, publish, distribute, sublicense,
10 | and/or sell copies of the Software, and to permit persons to whom the
11 | Software is furnished to do so, subject to the following conditions:
12 |
13 | The above copyright notice and this permission notice shall be included
14 | in all copies or substantial portions of the Software.
15 |
16 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
17 | OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
18 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL
19 | THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR
20 | OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE,
21 | ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
22 | OTHER DEALINGS IN THE SOFTWARE.
23 | */
24 |
25 | // List of sessions
26 | Janus.sessions = {};
27 |
28 | Janus.isExtensionEnabled = function() {
29 | if(navigator.mediaDevices && navigator.mediaDevices.getDisplayMedia) {
30 | // No need for the extension, getDisplayMedia is supported
31 | return true;
32 | }
33 | if(window.navigator.userAgent.match('Chrome')) {
34 | var chromever = parseInt(window.navigator.userAgent.match(/Chrome\/(.*) /)[1], 10);
35 | var maxver = 33;
36 | if(window.navigator.userAgent.match('Linux'))
37 | maxver = 35; // "known" crash in chrome 34 and 35 on linux
38 | if(chromever >= 26 && chromever <= maxver) {
39 | // Older versions of Chrome don't support this extension-based approach, so lie
40 | return true;
41 | }
42 | return Janus.extension.isInstalled();
43 | } else {
44 | // Firefox or others, no need for the extension (but this doesn't mean it will work)
45 | return true;
46 | }
47 | };
48 |
49 | var defaultExtension = {
50 | // Screensharing Chrome Extension ID
51 | extensionId: 'hapfgfdkleiggjjpfpenajgdnfckjpaj',
52 | isInstalled: function() { return document.querySelector('#janus-extension-installed') !== null; },
53 | getScreen: function (callback) {
54 | var pending = window.setTimeout(function () {
55 | var error = new Error('NavigatorUserMediaError');
56 | error.name = 'The required Chrome extension is not installed: click here to install it. (NOTE: this will need you to refresh the page)';
57 | return callback(error);
58 | }, 1000);
59 | this.cache[pending] = callback;
60 | window.postMessage({ type: 'janusGetScreen', id: pending }, '*');
61 | },
62 | init: function () {
63 | var cache = {};
64 | this.cache = cache;
65 | // Wait for events from the Chrome Extension
66 | window.addEventListener('message', function (event) {
67 | if(event.origin != window.location.origin)
68 | return;
69 | if(event.data.type == 'janusGotScreen' && cache[event.data.id]) {
70 | var callback = cache[event.data.id];
71 | delete cache[event.data.id];
72 |
73 | if (event.data.sourceId === '') {
74 | // user canceled
75 | var error = new Error('NavigatorUserMediaError');
76 | error.name = 'You cancelled the request for permission, giving up...';
77 | callback(error);
78 | } else {
79 | callback(null, event.data.sourceId);
80 | }
81 | } else if (event.data.type == 'janusGetScreenPending') {
82 | console.log('clearing ', event.data.id);
83 | window.clearTimeout(event.data.id);
84 | }
85 | });
86 | }
87 | };
88 |
89 | Janus.useDefaultDependencies = function (deps) {
90 | var f = (deps && deps.fetch) || fetch;
91 | var p = (deps && deps.Promise) || Promise;
92 | var socketCls = (deps && deps.WebSocket) || WebSocket;
93 |
94 | return {
95 | newWebSocket: function(server, proto) { return new socketCls(server, proto); },
96 | extension: (deps && deps.extension) || defaultExtension,
97 | isArray: function(arr) { return Array.isArray(arr); },
98 | webRTCAdapter: (deps && deps.adapter) || adapter,
99 | httpAPICall: function(url, options) {
100 | var fetchOptions = {
101 | method: options.verb,
102 | headers: {
103 | 'Accept': 'application/json, text/plain, */*'
104 | },
105 | cache: 'no-cache'
106 | };
107 | if(options.verb === "POST") {
108 | fetchOptions.headers['Content-Type'] = 'application/json';
109 | }
110 | if(options.withCredentials !== undefined) {
111 | fetchOptions.credentials = options.withCredentials === true ? 'include' : (options.withCredentials ? options.withCredentials : 'omit');
112 | }
113 | if(options.body) {
114 | fetchOptions.body = JSON.stringify(options.body);
115 | }
116 |
117 | var fetching = f(url, fetchOptions).catch(function(error) {
118 | return p.reject({message: 'Probably a network error, is the server down?', error: error});
119 | });
120 |
121 | /*
122 | * fetch() does not natively support timeouts.
123 | * Work around this by starting a timeout manually, and racing it against the fetch() to see which thing resolves first.
124 | */
125 |
126 | if(options.timeout) {
127 | var timeout = new p(function(resolve, reject) {
128 | var timerId = setTimeout(function() {
129 | clearTimeout(timerId);
130 | return reject({message: 'Request timed out', timeout: options.timeout});
131 | }, options.timeout);
132 | });
133 | fetching = p.race([fetching, timeout]);
134 | }
135 |
136 | fetching.then(function(response) {
137 | if(response.ok) {
138 | if(typeof(options.success) === typeof(Janus.noop)) {
139 | return response.json().then(function(parsed) {
140 | options.success(parsed);
141 | }).catch(function(error) {
142 | return p.reject({message: 'Failed to parse response body', error: error, response: response});
143 | });
144 | }
145 | }
146 | else {
147 | return p.reject({message: 'API call failed', response: response});
148 | }
149 | }).catch(function(error) {
150 | if(typeof(options.error) === typeof(Janus.noop)) {
151 | options.error(error.message || '<< internal error >>', error);
152 | }
153 | });
154 |
155 | return fetching;
156 | }
157 | }
158 | };
159 |
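The `httpAPICall` above works around `fetch()`'s lack of a native timeout by racing the request against a manually started timer. A standalone sketch of that pattern (the `withTimeout` name is illustrative, not part of janus.js):

```javascript
// Race an async operation against a timer, mirroring the
// fetch()-timeout workaround used in httpAPICall above.
function withTimeout(promise, ms) {
	var timeout = new Promise(function(resolve, reject) {
		setTimeout(function() {
			// Reject with the same shape janus.js uses for timeouts
			reject({ message: 'Request timed out', timeout: ms });
		}, ms);
	});
	return Promise.race([promise, timeout]);
}

// Example: an operation resolving after 50ms, raced against a 10ms timeout
withTimeout(new Promise(function(resolve) {
	setTimeout(function() { resolve('ok'); }, 50);
}), 10).catch(function(err) {
	console.log(err.message); // "Request timed out"
});
```

Note the losing promise is not cancelled; like the original, this only stops waiting for it.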
160 | Janus.useOldDependencies = function (deps) {
161 | var jq = (deps && deps.jQuery) || jQuery;
162 | var socketCls = (deps && deps.WebSocket) || WebSocket;
163 | return {
164 | newWebSocket: function(server, proto) { return new socketCls(server, proto); },
165 | isArray: function(arr) { return jq.isArray(arr); },
166 | extension: (deps && deps.extension) || defaultExtension,
167 | webRTCAdapter: (deps && deps.adapter) || adapter,
168 | httpAPICall: function(url, options) {
169 | var payload = options.body !== undefined ? {
170 | contentType: 'application/json',
171 | data: JSON.stringify(options.body)
172 | } : {};
173 | var credentials = options.withCredentials !== undefined ? {xhrFields: {withCredentials: options.withCredentials}} : {};
174 |
175 | return jq.ajax(jq.extend(payload, credentials, {
176 | url: url,
177 | type: options.verb,
178 | cache: false,
179 | dataType: 'json',
180 | async: options.async,
181 | timeout: options.timeout,
182 | success: function(result) {
183 | if(typeof(options.success) === typeof(Janus.noop)) {
184 | options.success(result);
185 | }
186 | },
187 | error: function(xhr, status, err) {
188 | if(typeof(options.error) === typeof(Janus.noop)) {
189 | options.error(status, err);
190 | }
191 | }
192 | }));
193 | }
194 | };
195 | };
196 |
197 | Janus.noop = function() {};
198 |
199 | Janus.dataChanDefaultLabel = "JanusDataChannel";
200 |
201 | // Note: in the future we may want to change this, e.g., as was
202 | // attempted in https://github.com/meetecho/janus-gateway/issues/1670
203 | Janus.endOfCandidates = null;
204 |
205 | // Stop all tracks from a given stream
206 | Janus.stopAllTracks = function(stream) {
207 | try {
208 | // Try a MediaStreamTrack.stop() for each track
209 | var tracks = stream.getTracks();
210 | for(var mst of tracks) {
211 | Janus.log(mst);
212 | if(mst) {
213 | mst.stop();
214 | }
215 | }
216 | } catch(e) {
217 | // Do nothing if this fails
218 | }
219 | };
220 |
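`Janus.stopAllTracks` just calls `MediaStreamTrack.stop()` on every track of a stream, swallowing any error. Since `MediaStream` is browser-only, this node-runnable sketch uses a hypothetical stand-in object with the same `getTracks()`/`stop()` shape:

```javascript
// Stop every track on a stream, ignoring failures — the same logic
// as Janus.stopAllTracks, exercised against a fake stream object.
function stopAllTracks(stream) {
	try {
		for (var track of stream.getTracks()) {
			if (track) track.stop();
		}
	} catch (e) {
		// Do nothing if this fails
	}
}

// Fake stream: two tracks that record whether stop() was called
var tracks = [
	{ stopped: false, stop: function() { this.stopped = true; } },
	{ stopped: false, stop: function() { this.stopped = true; } }
];
stopAllTracks({ getTracks: function() { return tracks; } });
console.log(tracks.every(function(t) { return t.stopped; })); // true
```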
221 | // Initialization
222 | Janus.init = function(options) {
223 | options = options || {};
224 | options.callback = (typeof options.callback == "function") ? options.callback : Janus.noop;
225 | if(Janus.initDone) {
226 | // Already initialized
227 | options.callback();
228 | } else {
229 | if(typeof console == "undefined" || typeof console.log == "undefined") {
230 | console = { log: function() {} };
231 | }
232 | // Console logging (all debugging disabled by default)
233 | Janus.trace = Janus.noop;
234 | Janus.debug = Janus.noop;
235 | Janus.vdebug = Janus.noop;
236 | Janus.log = Janus.noop;
237 | Janus.warn = Janus.noop;
238 | Janus.error = Janus.noop;
239 | if(options.debug === true || options.debug === "all") {
240 | // Enable all debugging levels
241 | Janus.trace = console.trace.bind(console);
242 | Janus.debug = console.debug.bind(console);
243 | Janus.vdebug = console.debug.bind(console);
244 | Janus.log = console.log.bind(console);
245 | Janus.warn = console.warn.bind(console);
246 | Janus.error = console.error.bind(console);
247 | } else if(Array.isArray(options.debug)) {
248 | for(var d of options.debug) {
249 | switch(d) {
250 | case "trace":
251 | Janus.trace = console.trace.bind(console);
252 | break;
253 | case "debug":
254 | Janus.debug = console.debug.bind(console);
255 | break;
256 | case "vdebug":
257 | Janus.vdebug = console.debug.bind(console);
258 | break;
259 | case "log":
260 | Janus.log = console.log.bind(console);
261 | break;
262 | case "warn":
263 | Janus.warn = console.warn.bind(console);
264 | break;
265 | case "error":
266 | Janus.error = console.error.bind(console);
267 | break;
268 | default:
269 | console.error("Unknown debugging option '" + d + "' (supported: 'trace', 'debug', 'vdebug', 'log', 'warn', 'error')");
270 | break;
271 | }
272 | }
273 | }
274 | Janus.log("Initializing library");
275 |
276 | var usedDependencies = options.dependencies || Janus.useDefaultDependencies();
277 | Janus.isArray = usedDependencies.isArray;
278 | Janus.webRTCAdapter = usedDependencies.webRTCAdapter;
279 | Janus.httpAPICall = usedDependencies.httpAPICall;
280 | Janus.newWebSocket = usedDependencies.newWebSocket;
281 | Janus.extension = usedDependencies.extension;
282 | Janus.extension.init();
283 |
284 | // Helper method to enumerate devices
285 | Janus.listDevices = function(callback, config) {
286 | callback = (typeof callback == "function") ? callback : Janus.noop;
287 | if (config == null) config = { audio: true, video: true };
288 | if(Janus.isGetUserMediaAvailable()) {
289 | navigator.mediaDevices.getUserMedia(config)
290 | .then(function(stream) {
291 | navigator.mediaDevices.enumerateDevices().then(function(devices) {
292 | Janus.debug(devices);
293 | callback(devices);
294 | // Get rid of the now useless stream
295 | Janus.stopAllTracks(stream)
296 | });
297 | })
298 | .catch(function(err) {
299 | Janus.error(err);
300 | callback([]);
301 | });
302 | } else {
303 | Janus.warn("navigator.mediaDevices unavailable");
304 | callback([]);
305 | }
306 | };
307 | // Helper methods to attach/reattach a stream to a video element (previously part of adapter.js)
308 | Janus.attachMediaStream = function(element, stream) {
309 | try {
310 | element.srcObject = stream;
311 | } catch (e) {
312 | try {
313 | element.src = URL.createObjectURL(stream);
314 | } catch (e) {
315 | Janus.error("Error attaching stream to element");
316 | }
317 | }
318 | };
319 | Janus.reattachMediaStream = function(to, from) {
320 | try {
321 | to.srcObject = from.srcObject;
322 | } catch (e) {
323 | try {
324 | to.src = from.src;
325 | } catch (e) {
326 | Janus.error("Error reattaching stream to element");
327 | }
328 | }
329 | };
330 | // Detect tab close: make sure we don't lose existing onbeforeunload handlers
331 | // (note: for iOS we need to subscribe to a different event, 'pagehide', see
332 | // https://gist.github.com/thehunmonkgroup/6bee8941a49b86be31a787fe8f4b8cfe)
333 | var iOS = ['iPad', 'iPhone', 'iPod'].indexOf(navigator.platform) >= 0;
334 | var eventName = iOS ? 'pagehide' : 'beforeunload';
335 | var oldOBF = window["on" + eventName];
336 | window.addEventListener(eventName, function(event) {
337 | Janus.log("Closing window");
338 | for(var s in Janus.sessions) {
339 | if(Janus.sessions[s] && Janus.sessions[s].destroyOnUnload) {
340 | Janus.log("Destroying session " + s);
341 | Janus.sessions[s].destroy({unload: true, notifyDestroyed: false});
342 | }
343 | }
344 | if(oldOBF && typeof oldOBF == "function") {
345 | oldOBF();
346 | }
347 | });
348 | // If this is a Safari Technology Preview, check if VP8 is supported
349 | Janus.safariVp8 = false;
350 | if(Janus.webRTCAdapter.browserDetails.browser === 'safari' &&
351 | Janus.webRTCAdapter.browserDetails.version >= 605) {
352 | // Let's see if RTCRtpSender.getCapabilities() is there
353 | if(RTCRtpSender && RTCRtpSender.getCapabilities && RTCRtpSender.getCapabilities("video") &&
354 | RTCRtpSender.getCapabilities("video").codecs && RTCRtpSender.getCapabilities("video").codecs.length) {
355 | for(var codec of RTCRtpSender.getCapabilities("video").codecs) {
356 | if(codec && codec.mimeType && codec.mimeType.toLowerCase() === "video/vp8") {
357 | Janus.safariVp8 = true;
358 | break;
359 | }
360 | }
361 | if(Janus.safariVp8) {
362 | Janus.log("This version of Safari supports VP8");
363 | } else {
364 | Janus.warn("This version of Safari does NOT support VP8: if you're using a Technology Preview, " +
365 | "try enabling the 'WebRTC VP8 codec' setting in the 'Experimental Features' Develop menu");
366 | }
367 | } else {
368 | // We do it in a very ugly way, as there's no alternative...
369 | // We create a PeerConnection to see if VP8 is in an offer
370 | var testpc = new RTCPeerConnection({});
371 | testpc.createOffer({offerToReceiveVideo: true}).then(function(offer) {
372 | Janus.safariVp8 = offer.sdp.indexOf("VP8") !== -1;
373 | if(Janus.safariVp8) {
374 | Janus.log("This version of Safari supports VP8");
375 | } else {
376 | Janus.warn("This version of Safari does NOT support VP8: if you're using a Technology Preview, " +
377 | "try enabling the 'WebRTC VP8 codec' setting in the 'Experimental Features' Develop menu");
378 | }
379 | testpc.close();
380 | testpc = null;
381 | });
382 | }
383 | }
384 | // Check if this browser supports Unified Plan and transceivers
385 | // Based on https://codepen.io/anon/pen/ZqLwWV?editors=0010
386 | Janus.unifiedPlan = false;
387 | if(Janus.webRTCAdapter.browserDetails.browser === 'firefox' &&
388 | Janus.webRTCAdapter.browserDetails.version >= 59) {
389 | // Firefox definitely does, starting from version 59
390 | Janus.unifiedPlan = true;
391 | } else if(Janus.webRTCAdapter.browserDetails.browser === 'chrome' &&
392 | Janus.webRTCAdapter.browserDetails.version < 72) {
393 | // Chrome does, but it's only usable from version 72 on
394 | Janus.unifiedPlan = false;
395 | } else if(!window.RTCRtpTransceiver || !('currentDirection' in RTCRtpTransceiver.prototype)) {
396 | // Safari supports addTransceiver() but not Unified Plan when
397 | // currentDirection is not defined (see codepen above).
398 | Janus.unifiedPlan = false;
399 | } else {
400 | // Check if addTransceiver() throws an exception
401 | var tempPc = new RTCPeerConnection();
402 | try {
403 | tempPc.addTransceiver('audio');
404 | Janus.unifiedPlan = true;
405 | } catch (e) {}
406 | tempPc.close();
407 | }
408 | Janus.initDone = true;
409 | options.callback();
410 | }
411 | };
412 |
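`Janus.init` maps each entry of `options.debug` onto the matching `console` method, leaving everything else a no-op. The same dispatch can be sketched standalone (the `buildLoggers` helper is illustrative, not a janus.js API):

```javascript
// Map an array of debug levels onto console methods, leaving the
// rest as no-ops — the same scheme Janus.init uses above.
function buildLoggers(debug) {
	var noop = function() {};
	var loggers = { trace: noop, debug: noop, vdebug: noop, log: noop, warn: noop, error: noop };
	if (debug === true || debug === 'all') {
		debug = ['trace', 'debug', 'vdebug', 'log', 'warn', 'error'];
	}
	if (Array.isArray(debug)) {
		for (var d of debug) {
			// 'vdebug' (verbose debug) shares console.debug, as in janus.js
			var method = (d === 'vdebug') ? 'debug' : d;
			if (typeof console[method] === 'function') {
				loggers[d] = console[method].bind(console);
			}
		}
	}
	return loggers;
}

var loggers = buildLoggers(['warn', 'error']);
loggers.log('suppressed');      // no output: 'log' was not enabled
loggers.warn('this is shown');  // forwarded to console.warn
```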
413 | // Helper method to check whether WebRTC is supported by this browser
414 | Janus.isWebrtcSupported = function() {
415 | return !!window.RTCPeerConnection;
416 | };
417 | // Helper method to check whether devices can be accessed by this browser (e.g., not possible via plain HTTP)
418 | Janus.isGetUserMediaAvailable = function() {
419 | return navigator.mediaDevices && navigator.mediaDevices.getUserMedia;
420 | };
421 |
422 | // Helper method to create random identifiers (e.g., transaction)
423 | Janus.randomString = function(len) {
424 | var charSet = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789';
425 | var randomString = '';
426 | for (var i = 0; i < len; i++) {
427 | var randomPoz = Math.floor(Math.random() * charSet.length);
428 | randomString += charSet.substring(randomPoz,randomPoz+1);
429 | }
430 | return randomString;
431 | };
432 |
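`Janus.randomString` draws transaction identifiers from a fixed alphanumeric charset via `Math.random()` (fine for transaction IDs, though not cryptographically secure). The loop is reproduced here so the sketch runs standalone, outside the library:

```javascript
// Same charset and loop as Janus.randomString above, reproduced so
// this example is runnable without loading janus.js.
function randomString(len) {
	var charSet = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789';
	var out = '';
	for (var i = 0; i < len; i++) {
		out += charSet.charAt(Math.floor(Math.random() * charSet.length));
	}
	return out;
}

// 12 is the length janus.js uses for "transaction" fields
var transaction = randomString(12);
console.log(transaction.length); // 12
console.log(/^[A-Za-z0-9]{12}$/.test(transaction)); // true
```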
433 | function Janus(gatewayCallbacks) {
434 | gatewayCallbacks = gatewayCallbacks || {};
435 | gatewayCallbacks.success = (typeof gatewayCallbacks.success == "function") ? gatewayCallbacks.success : Janus.noop;
436 | gatewayCallbacks.error = (typeof gatewayCallbacks.error == "function") ? gatewayCallbacks.error : Janus.noop;
437 | gatewayCallbacks.destroyed = (typeof gatewayCallbacks.destroyed == "function") ? gatewayCallbacks.destroyed : Janus.noop;
438 | if(!Janus.initDone) {
439 | gatewayCallbacks.error("Library not initialized");
440 | return {};
441 | }
442 | if(!Janus.isWebrtcSupported()) {
443 | gatewayCallbacks.error("WebRTC not supported by this browser");
444 | return {};
445 | }
446 | Janus.log("Library initialized: " + Janus.initDone);
447 | if(!gatewayCallbacks.server) {
448 | gatewayCallbacks.error("Invalid server url");
449 | return {};
450 | }
451 | var websockets = false;
452 | var ws = null;
453 | var wsHandlers = {};
454 | var wsKeepaliveTimeoutId = null;
455 | var servers = null;
456 | var serversIndex = 0;
457 | var server = gatewayCallbacks.server;
458 | if(Janus.isArray(server)) {
459 | Janus.log("Multiple servers provided (" + server.length + "), will use the first that works");
460 | server = null;
461 | servers = gatewayCallbacks.server;
462 | Janus.debug(servers);
463 | } else {
464 | if(server.indexOf("ws") === 0) {
465 | websockets = true;
466 | Janus.log("Using WebSockets to contact Janus: " + server);
467 | } else {
468 | websockets = false;
469 | Janus.log("Using REST API to contact Janus: " + server);
470 | }
471 | }
472 | var iceServers = gatewayCallbacks.iceServers || [{urls: "stun:stun.l.google.com:19302"}];
473 | var iceTransportPolicy = gatewayCallbacks.iceTransportPolicy;
474 | var bundlePolicy = gatewayCallbacks.bundlePolicy;
475 | // Whether IPv6 candidates should be gathered
476 | var ipv6Support = (gatewayCallbacks.ipv6 === true);
477 | // Whether we should enable the withCredentials flag for XHR requests
478 | var withCredentials = false;
479 | if(gatewayCallbacks.withCredentials !== undefined && gatewayCallbacks.withCredentials !== null)
480 | withCredentials = gatewayCallbacks.withCredentials === true;
481 | // Optional max events
482 | var maxev = 10;
483 | if(gatewayCallbacks.max_poll_events !== undefined && gatewayCallbacks.max_poll_events !== null)
484 | maxev = gatewayCallbacks.max_poll_events;
485 | if(maxev < 1)
486 | maxev = 1;
487 | // Token to use (only if the token based authentication mechanism is enabled)
488 | var token = null;
489 | if(gatewayCallbacks.token !== undefined && gatewayCallbacks.token !== null)
490 | token = gatewayCallbacks.token;
491 | // API secret to use (only if the shared API secret is enabled)
492 | var apisecret = null;
493 | if(gatewayCallbacks.apisecret !== undefined && gatewayCallbacks.apisecret !== null)
494 | apisecret = gatewayCallbacks.apisecret;
495 | // Whether we should destroy this session when onbeforeunload is called
496 | this.destroyOnUnload = true;
497 | if(gatewayCallbacks.destroyOnUnload !== undefined && gatewayCallbacks.destroyOnUnload !== null)
498 | this.destroyOnUnload = (gatewayCallbacks.destroyOnUnload === true);
499 | // Some timeout-related values
500 | var keepAlivePeriod = 25000;
501 | if(gatewayCallbacks.keepAlivePeriod !== undefined && gatewayCallbacks.keepAlivePeriod !== null)
502 | keepAlivePeriod = gatewayCallbacks.keepAlivePeriod;
503 | if(isNaN(keepAlivePeriod))
504 | keepAlivePeriod = 25000;
505 | var longPollTimeout = 60000;
506 | if(gatewayCallbacks.longPollTimeout !== undefined && gatewayCallbacks.longPollTimeout !== null)
507 | longPollTimeout = gatewayCallbacks.longPollTimeout;
508 | if(isNaN(longPollTimeout))
509 | longPollTimeout = 60000;
510 |
511 | // overrides for default maxBitrate values for simulcasting
512 | function getMaxBitrates(simulcastMaxBitrates) {
513 | var maxBitrates = {
514 | high: 900000,
515 | medium: 300000,
516 | low: 100000,
517 | };
518 |
519 | if (simulcastMaxBitrates !== undefined && simulcastMaxBitrates !== null) {
520 | if (simulcastMaxBitrates.high)
521 | maxBitrates.high = simulcastMaxBitrates.high;
522 | if (simulcastMaxBitrates.medium)
523 | maxBitrates.medium = simulcastMaxBitrates.medium;
524 | if (simulcastMaxBitrates.low)
525 | maxBitrates.low = simulcastMaxBitrates.low;
526 | }
527 |
528 | return maxBitrates;
529 | }
530 |
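`getMaxBitrates` merges optional per-layer overrides onto fixed simulcast defaults. The defaulting can be exercised standalone (logic copied from the function above so the example runs without the library):

```javascript
// Defaults for the three simulcast layers, overridable per layer —
// the same merge getMaxBitrates performs above.
function getMaxBitrates(simulcastMaxBitrates) {
	var maxBitrates = { high: 900000, medium: 300000, low: 100000 };
	if (simulcastMaxBitrates) {
		if (simulcastMaxBitrates.high) maxBitrates.high = simulcastMaxBitrates.high;
		if (simulcastMaxBitrates.medium) maxBitrates.medium = simulcastMaxBitrates.medium;
		if (simulcastMaxBitrates.low) maxBitrates.low = simulcastMaxBitrates.low;
	}
	return maxBitrates;
}

console.log(getMaxBitrates(null));                   // { high: 900000, medium: 300000, low: 100000 }
console.log(getMaxBitrates({ high: 1500000 }).high); // 1500000
console.log(getMaxBitrates({ high: 1500000 }).low);  // 100000 (untouched default)
```

Note the truthiness checks mean an explicit override of `0` is ignored and the default kept.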
531 | var connected = false;
532 | var sessionId = null;
533 | var pluginHandles = {};
534 | var that = this;
535 | var retries = 0;
536 | var transactions = {};
537 | createSession(gatewayCallbacks);
538 |
539 | // Public methods
540 | this.getServer = function() { return server; };
541 | this.isConnected = function() { return connected; };
542 | this.reconnect = function(callbacks) {
543 | callbacks = callbacks || {};
544 | callbacks.success = (typeof callbacks.success == "function") ? callbacks.success : Janus.noop;
545 | callbacks.error = (typeof callbacks.error == "function") ? callbacks.error : Janus.noop;
546 | callbacks["reconnect"] = true;
547 | createSession(callbacks);
548 | };
549 | this.getSessionId = function() { return sessionId; };
550 | this.destroy = function(callbacks) { destroySession(callbacks); };
551 | this.attach = function(callbacks) { createHandle(callbacks); };
552 |
553 | function eventHandler() {
554 | if(sessionId == null)
555 | return;
556 | Janus.debug('Long poll...');
557 | if(!connected) {
558 | Janus.warn("Is the server down? (connected=false)");
559 | return;
560 | }
561 | var longpoll = server + "/" + sessionId + "?rid=" + new Date().getTime();
562 | if(maxev)
563 | longpoll = longpoll + "&maxev=" + maxev;
564 | if(token)
565 | longpoll = longpoll + "&token=" + encodeURIComponent(token);
566 | if(apisecret)
567 | longpoll = longpoll + "&apisecret=" + encodeURIComponent(apisecret);
568 | Janus.httpAPICall(longpoll, {
569 | verb: 'GET',
570 | withCredentials: withCredentials,
571 | success: handleEvent,
572 | timeout: longPollTimeout,
573 | error: function(textStatus, errorThrown) {
574 | Janus.error(textStatus + ":", errorThrown);
575 | retries++;
576 | if(retries > 3) {
577 | // Did we just lose the server? :-(
578 | connected = false;
579 | gatewayCallbacks.error("Lost connection to the server (is it down?)");
580 | return;
581 | }
582 | eventHandler();
583 | }
584 | });
585 | }
586 |
587 | // Private event handler: this will trigger plugin callbacks, if set
588 | function handleEvent(json, skipTimeout) {
589 | retries = 0;
590 | if(!websockets && sessionId !== undefined && sessionId !== null && skipTimeout !== true)
591 | eventHandler();
592 | if(!websockets && Janus.isArray(json)) {
593 | // We got an array: it means we passed a maxev > 1, iterate on all objects
594 | for(var i=0; i<json.length; i++) {
1456 | Janus.log('State change on <' + label + '> data channel: ' + dcState);
1457 | if(dcState === 'open') {
1458 | // Any pending messages to send?
1459 | if(config.dataChannel[label].pending && config.dataChannel[label].pending.length > 0) {
1460 | Janus.log("Sending pending messages on <" + label + ">:", config.dataChannel[label].pending.length);
1461 | for(var data of config.dataChannel[label].pending) {
1462 | Janus.log("Sending data on data channel <" + label + ">");
1463 | Janus.debug(data);
1464 | config.dataChannel[label].send(data);
1465 | }
1466 | config.dataChannel[label].pending = [];
1467 | }
1468 | // Notify the open data channel
1469 | pluginHandle.ondataopen(label, protocol);
1470 | }
1471 | };
1472 | var onDataChannelError = function(error) {
1473 | Janus.error('Got error on data channel:', error);
1474 | // TODO
1475 | };
1476 | if(!incoming) {
1477 | // FIXME Add options (ordered, maxRetransmits, etc.)
1478 | var dcoptions = { ordered: true };
1479 | if(dcprotocol)
1480 | dcoptions.protocol = dcprotocol;
1481 | config.dataChannel[dclabel] = config.pc.createDataChannel(dclabel, dcoptions);
1482 | } else {
1483 | // The channel was created by Janus
1484 | config.dataChannel[dclabel] = incoming;
1485 | }
1486 | config.dataChannel[dclabel].onmessage = onDataChannelMessage;
1487 | config.dataChannel[dclabel].onopen = onDataChannelStateChange;
1488 | config.dataChannel[dclabel].onclose = onDataChannelStateChange;
1489 | config.dataChannel[dclabel].onerror = onDataChannelError;
1490 | config.dataChannel[dclabel].pending = [];
1491 | if(pendingData)
1492 | config.dataChannel[dclabel].pending.push(pendingData);
1493 | }
1494 |
1495 | // Private method to send a data channel message
1496 | function sendData(handleId, callbacks) {
1497 | callbacks = callbacks || {};
1498 | callbacks.success = (typeof callbacks.success == "function") ? callbacks.success : Janus.noop;
1499 | callbacks.error = (typeof callbacks.error == "function") ? callbacks.error : Janus.noop;
1500 | var pluginHandle = pluginHandles[handleId];
1501 | if(!pluginHandle || !pluginHandle.webrtcStuff) {
1502 | Janus.warn("Invalid handle");
1503 | callbacks.error("Invalid handle");
1504 | return;
1505 | }
1506 | var config = pluginHandle.webrtcStuff;
1507 | var data = callbacks.text || callbacks.data;
1508 | if(!data) {
1509 | Janus.warn("Invalid data");
1510 | callbacks.error("Invalid data");
1511 | return;
1512 | }
1513 | var label = callbacks.label ? callbacks.label : Janus.dataChanDefaultLabel;
1514 | if(!config.dataChannel[label]) {
1515 | // Create new data channel and wait for it to open
1516 | createDataChannel(handleId, label, callbacks.protocol, false, data, callbacks.protocol);
1517 | callbacks.success();
1518 | return;
1519 | }
1520 | if(config.dataChannel[label].readyState !== "open") {
1521 | config.dataChannel[label].pending.push(data);
1522 | callbacks.success();
1523 | return;
1524 | }
1525 | Janus.log("Sending data on data channel <" + label + ">");
1526 | Janus.debug(data);
1527 | config.dataChannel[label].send(data);
1528 | callbacks.success();
1529 | }
1530 |
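`sendData` queues messages on a data channel that is not yet open, and the `onDataChannelStateChange` handler flushes that `pending` array once the channel reports `open`. A standalone sketch of this queue-until-open pattern, using a hypothetical stand-in for the browser-only `RTCDataChannel`:

```javascript
// Queue-until-open pattern used by sendData/createDataChannel above,
// with a minimal stand-in for RTCDataChannel.
function makeChannel() {
	return {
		readyState: 'connecting',
		sent: [],
		pending: [],
		send: function(data) { this.sent.push(data); },
		open: function() {
			this.readyState = 'open';
			// Flush anything queued while the channel was connecting
			for (var data of this.pending) this.send(data);
			this.pending = [];
		}
	};
}

function sendOnChannel(channel, data) {
	if (channel.readyState !== 'open') {
		channel.pending.push(data);
		return;
	}
	channel.send(data);
}

var dc = makeChannel();
sendOnChannel(dc, 'hello');  // queued: channel still connecting
dc.open();                   // onopen fires, queue is flushed
sendOnChannel(dc, 'world');  // sent immediately
console.log(dc.sent); // [ 'hello', 'world' ]
```

This preserves message order across the open transition, which is why the library pushes to `pending` rather than dropping early sends.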
1531 | // Private method to send a DTMF tone
1532 | function sendDtmf(handleId, callbacks) {
1533 | callbacks = callbacks || {};
1534 | callbacks.success = (typeof callbacks.success == "function") ? callbacks.success : Janus.noop;
1535 | callbacks.error = (typeof callbacks.error == "function") ? callbacks.error : Janus.noop;
1536 | var pluginHandle = pluginHandles[handleId];
1537 | if(!pluginHandle || !pluginHandle.webrtcStuff) {
1538 | Janus.warn("Invalid handle");
1539 | callbacks.error("Invalid handle");
1540 | return;
1541 | }
1542 | var config = pluginHandle.webrtcStuff;
1543 | if(!config.dtmfSender) {
1544 | // Create the DTMF sender the proper way, if possible
1545 | if(config.pc) {
1546 | var senders = config.pc.getSenders();
1547 | var audioSender = senders.find(function(sender) {
1548 | return sender.track && sender.track.kind === 'audio';
1549 | });
1550 | if(!audioSender) {
1551 | Janus.warn("Invalid DTMF configuration (no audio track)");
1552 | callbacks.error("Invalid DTMF configuration (no audio track)");
1553 | return;
1554 | }
1555 | config.dtmfSender = audioSender.dtmf;
1556 | if(config.dtmfSender) {
1557 | Janus.log("Created DTMF Sender");
1558 | config.dtmfSender.ontonechange = function(tone) { Janus.debug("Sent DTMF tone: " + tone.tone); };
1559 | }
1560 | }
1561 | if(!config.dtmfSender) {
1562 | Janus.warn("Invalid DTMF configuration");
1563 | callbacks.error("Invalid DTMF configuration");
1564 | return;
1565 | }
1566 | }
1567 | var dtmf = callbacks.dtmf;
1568 | if(!dtmf) {
1569 | Janus.warn("Invalid DTMF parameters");
1570 | callbacks.error("Invalid DTMF parameters");
1571 | return;
1572 | }
1573 | var tones = dtmf.tones;
1574 | if(!tones) {
1575 | Janus.warn("Invalid DTMF string");
1576 | callbacks.error("Invalid DTMF string");
1577 | return;
1578 | }
1579 | var duration = (typeof dtmf.duration === 'number') ? dtmf.duration : 500; // We choose 500ms as the default duration for a tone
1580 | var gap = (typeof dtmf.gap === 'number') ? dtmf.gap : 50; // We choose 50ms as the default gap between tones
1581 | Janus.debug("Sending DTMF string " + tones + " (duration " + duration + "ms, gap " + gap + "ms)");
1582 | config.dtmfSender.insertDTMF(tones, duration, gap);
1583 | callbacks.success();
1584 | }
1585 |
1586 | // Private method to destroy a plugin handle
1587 | function destroyHandle(handleId, callbacks) {
1588 | callbacks = callbacks || {};
1589 | callbacks.success = (typeof callbacks.success == "function") ? callbacks.success : Janus.noop;
1590 | callbacks.error = (typeof callbacks.error == "function") ? callbacks.error : Janus.noop;
1591 | var noRequest = (callbacks.noRequest === true);
1592 | Janus.log("Destroying handle " + handleId + " (only-locally=" + noRequest + ")");
1593 | cleanupWebrtc(handleId);
1594 | var pluginHandle = pluginHandles[handleId];
1595 | if(!pluginHandle || pluginHandle.detached) {
1596 | // Plugin was already detached by Janus, calling detach again will return a handle not found error, so just exit here
1597 | delete pluginHandles[handleId];
1598 | callbacks.success();
1599 | return;
1600 | }
1601 | if(noRequest) {
1602 | // We're only removing the handle locally
1603 | delete pluginHandles[handleId];
1604 | callbacks.success();
1605 | return;
1606 | }
1607 | if(!connected) {
1608 | Janus.warn("Is the server down? (connected=false)");
1609 | callbacks.error("Is the server down? (connected=false)");
1610 | return;
1611 | }
1612 | var request = { "janus": "detach", "transaction": Janus.randomString(12) };
1613 | if(pluginHandle.token)
1614 | request["token"] = pluginHandle.token;
1615 | if(apisecret)
1616 | request["apisecret"] = apisecret;
1617 | if(websockets) {
1618 | request["session_id"] = sessionId;
1619 | request["handle_id"] = handleId;
1620 | ws.send(JSON.stringify(request));
1621 | delete pluginHandles[handleId];
1622 | callbacks.success();
1623 | return;
1624 | }
1625 | Janus.httpAPICall(server + "/" + sessionId + "/" + handleId, {
1626 | verb: 'POST',
1627 | withCredentials: withCredentials,
1628 | body: request,
1629 | success: function(json) {
1630 | Janus.log("Destroyed handle:");
1631 | Janus.debug(json);
1632 | if(json["janus"] !== "success") {
1633 | Janus.error("Ooops: " + json["error"].code + " " + json["error"].reason); // FIXME
1634 | }
1635 | delete pluginHandles[handleId];
1636 | callbacks.success();
1637 | },
1638 | error: function(textStatus, errorThrown) {
1639 | Janus.error(textStatus + ":", errorThrown); // FIXME
1640 | // We cleanup anyway
1641 | delete pluginHandles[handleId];
1642 | callbacks.success();
1643 | }
1644 | });
1645 | }
1646 |
1647 | // WebRTC stuff
1648 | function streamsDone(handleId, jsep, media, callbacks, stream) {
1649 | var pluginHandle = pluginHandles[handleId];
1650 | if(!pluginHandle || !pluginHandle.webrtcStuff) {
1651 | Janus.warn("Invalid handle");
1652 | // Close all tracks if the given stream has been created internally
1653 | if(!callbacks.stream) {
1654 | Janus.stopAllTracks(stream);
1655 | }
1656 | callbacks.error("Invalid handle");
1657 | return;
1658 | }
1659 | var config = pluginHandle.webrtcStuff;
1660 | Janus.debug("streamsDone:", stream);
1661 | if(stream) {
1662 | Janus.debug(" -- Audio tracks:", stream.getAudioTracks());
1663 | Janus.debug(" -- Video tracks:", stream.getVideoTracks());
1664 | }
1665 | // We're now capturing the new stream: check if we're updating or if it's a new thing
1666 | var addTracks = false;
1667 | if(!config.myStream || !media.update || config.streamExternal) {
1668 | config.myStream = stream;
1669 | addTracks = true;
1670 | } else {
1671 | // We only need to update the existing stream
1672 | if(((!media.update && isAudioSendEnabled(media)) || (media.update && (media.addAudio || media.replaceAudio))) &&
1673 | stream.getAudioTracks() && stream.getAudioTracks().length) {
1674 | config.myStream.addTrack(stream.getAudioTracks()[0]);
1675 | if(Janus.unifiedPlan) {
1676 | // Use Transceivers
1677 | Janus.log((media.replaceAudio ? "Replacing" : "Adding") + " audio track:", stream.getAudioTracks()[0]);
1678 | var audioTransceiver = null;
1679 | var transceivers = config.pc.getTransceivers();
1680 | if(transceivers && transceivers.length > 0) {
1681 | for(var t of transceivers) {
1682 | if((t.sender && t.sender.track && t.sender.track.kind === "audio") ||
1683 | (t.receiver && t.receiver.track && t.receiver.track.kind === "audio")) {
1684 | audioTransceiver = t;
1685 | break;
1686 | }
1687 | }
1688 | }
1689 | if(audioTransceiver && audioTransceiver.sender) {
1690 | audioTransceiver.sender.replaceTrack(stream.getAudioTracks()[0]);
1691 | } else {
1692 | config.pc.addTrack(stream.getAudioTracks()[0], stream);
1693 | }
1694 | } else {
1695 | Janus.log((media.replaceAudio ? "Replacing" : "Adding") + " audio track:", stream.getAudioTracks()[0]);
1696 | config.pc.addTrack(stream.getAudioTracks()[0], stream);
1697 | }
1698 | }
1699 | if(((!media.update && isVideoSendEnabled(media)) || (media.update && (media.addVideo || media.replaceVideo))) &&
1700 | stream.getVideoTracks() && stream.getVideoTracks().length) {
1701 | config.myStream.addTrack(stream.getVideoTracks()[0]);
1702 | if(Janus.unifiedPlan) {
1703 | // Use Transceivers
1704 | Janus.log((media.replaceVideo ? "Replacing" : "Adding") + " video track:", stream.getVideoTracks()[0]);
1705 | var videoTransceiver = null;
1706 | var transceivers = config.pc.getTransceivers();
1707 | if(transceivers && transceivers.length > 0) {
1708 | for(var t of transceivers) {
1709 | if((t.sender && t.sender.track && t.sender.track.kind === "video") ||
1710 | (t.receiver && t.receiver.track && t.receiver.track.kind === "video")) {
1711 | videoTransceiver = t;
1712 | break;
1713 | }
1714 | }
1715 | }
1716 | if(videoTransceiver && videoTransceiver.sender) {
1717 | videoTransceiver.sender.replaceTrack(stream.getVideoTracks()[0]);
1718 | } else {
1719 | config.pc.addTrack(stream.getVideoTracks()[0], stream);
1720 | }
1721 | } else {
1722 | Janus.log((media.replaceVideo ? "Replacing" : "Adding") + " video track:", stream.getVideoTracks()[0]);
1723 | config.pc.addTrack(stream.getVideoTracks()[0], stream);
1724 | }
1725 | }
1726 | }
1727 | // If we still need to create a PeerConnection, let's do that
1728 | if(!config.pc) {
1729 | var pc_config = {"iceServers": iceServers, "iceTransportPolicy": iceTransportPolicy, "bundlePolicy": bundlePolicy};
1730 | if(Janus.webRTCAdapter.browserDetails.browser === "chrome") {
1731 | // For Chrome versions before 72 we force plan-b semantics, and unified-plan otherwise
1732 | pc_config["sdpSemantics"] = (Janus.webRTCAdapter.browserDetails.version < 72) ? "plan-b" : "unified-plan";
1733 | }
1734 | var pc_constraints = {
1735 | "optional": [{"DtlsSrtpKeyAgreement": true}]
1736 | };
1737 | if(ipv6Support) {
1738 | pc_constraints.optional.push({"googIPv6":true});
1739 | }
1740 | // Any custom constraint to add?
1741 | if(callbacks.rtcConstraints && typeof callbacks.rtcConstraints === 'object') {
1742 | Janus.debug("Adding custom PeerConnection constraints:", callbacks.rtcConstraints);
1743 | for(var i in callbacks.rtcConstraints) {
1744 | pc_constraints.optional.push(callbacks.rtcConstraints[i]);
1745 | }
1746 | }
1747 | if(Janus.webRTCAdapter.browserDetails.browser === "edge") {
1748 | // This is Edge, enable BUNDLE explicitly
1749 | pc_config.bundlePolicy = "max-bundle";
1750 | }
1751 | // Check if a sender or receiver transform has been provided
1752 | if(RTCRtpSender && RTCRtpSender.prototype.createEncodedAudioStreams &&
1753 | RTCRtpSender.prototype.createEncodedVideoStreams &&
1754 | (callbacks.senderTransforms || callbacks.receiverTransforms)) {
1755 | config.senderTransforms = callbacks.senderTransforms;
1756 | config.receiverTransforms = callbacks.receiverTransforms;
1757 | pc_config["forceEncodedAudioInsertableStreams"] = true;
1758 | pc_config["forceEncodedVideoInsertableStreams"] = true;
1759 | pc_config["encodedInsertableStreams"] = true;
1760 | }
1761 | Janus.log("Creating PeerConnection");
1762 | Janus.debug(pc_constraints);
1763 | config.pc = new RTCPeerConnection(pc_config, pc_constraints);
1764 | Janus.debug(config.pc);
1765 | if(config.pc.getStats) { // FIXME
1766 | config.volume = {};
1767 | config.bitrate.value = "0 kbits/sec";
1768 | }
1769 | Janus.log("Preparing local SDP and gathering candidates (trickle=" + config.trickle + ")");
1770 | config.pc.oniceconnectionstatechange = function(e) {
1771 | if(config.pc)
1772 | pluginHandle.iceState(config.pc.iceConnectionState);
1773 | };
1774 | config.pc.onicecandidate = function(event) {
1775 | if (!event.candidate ||
1776 | (Janus.webRTCAdapter.browserDetails.browser === 'edge' && event.candidate.candidate.indexOf('endOfCandidates') > 0)) {
1777 | Janus.log("End of candidates.");
1778 | config.iceDone = true;
1779 | if(config.trickle === true) {
1780 | // Notify end of candidates
1781 | sendTrickleCandidate(handleId, {"completed": true});
1782 | } else {
1783 | // No trickle, time to send the complete SDP (including all candidates)
1784 | sendSDP(handleId, callbacks);
1785 | }
1786 | } else {
1787 | // JSON.stringify doesn't work on some WebRTC objects anymore
1788 | // See https://code.google.com/p/chromium/issues/detail?id=467366
1789 | var candidate = {
1790 | "candidate": event.candidate.candidate,
1791 | "sdpMid": event.candidate.sdpMid,
1792 | "sdpMLineIndex": event.candidate.sdpMLineIndex
1793 | };
1794 | if(config.trickle === true) {
1795 | // Send candidate
1796 | sendTrickleCandidate(handleId, candidate);
1797 | }
1798 | }
1799 | };
1800 | config.pc.ontrack = function(event) {
1801 | Janus.log("Handling Remote Track");
1802 | Janus.debug(event);
1803 | if(!event.streams)
1804 | return;
1805 | config.remoteStream = event.streams[0];
1806 | pluginHandle.onremotestream(config.remoteStream);
1807 | if(event.track.onended)
1808 | return;
1809 | if(config.receiverTransforms) {
1810 | var receiverStreams = null;
1811 | if(event.track.kind === "audio" && config.receiverTransforms["audio"]) {
1812 | receiverStreams = event.receiver.createEncodedAudioStreams();
1813 | } else if(event.track.kind === "video" && config.receiverTransforms["video"]) {
1814 | receiverStreams = event.receiver.createEncodedVideoStreams();
1815 | }
1816 | if(receiverStreams) {
1817 | receiverStreams.readableStream
1818 | .pipeThrough(config.receiverTransforms[event.track.kind])
1819 | .pipeTo(receiverStreams.writableStream);
1820 | }
1821 | }
1822 | var trackMutedTimeoutId = null;
1823 | Janus.log("Adding onended callback to track:", event.track);
1824 | event.track.onended = function(ev) {
1825 | Janus.log("Remote track removed:", ev);
1826 | if(config.remoteStream) {
1827 | clearTimeout(trackMutedTimeoutId);
1828 | config.remoteStream.removeTrack(ev.target);
1829 | pluginHandle.onremotestream(config.remoteStream);
1830 | }
1831 | };
1832 | event.track.onmute = function(ev) {
1833 | Janus.log("Remote track muted:", ev);
1834 | if(config.remoteStream && trackMutedTimeoutId == null) {
1835 | trackMutedTimeoutId = setTimeout(function() {
1836 | Janus.log("Removing remote track");
1837 | if (config.remoteStream) {
1838 | config.remoteStream.removeTrack(ev.target);
1839 | pluginHandle.onremotestream(config.remoteStream);
1840 | }
1841 | trackMutedTimeoutId = null;
1842 | // Chrome seems to raise mute events only at multiples of ~834ms;
1843 | // use a timeout of three times that interval (rounding 834 up to 840)
1844 | }, 3 * 840);
1845 | }
1846 | };
1847 | event.track.onunmute = function(ev) {
1848 | Janus.log("Remote track flowing again:", ev);
1849 | if(trackMutedTimeoutId != null) {
1850 | clearTimeout(trackMutedTimeoutId);
1851 | trackMutedTimeoutId = null;
1852 | } else {
1853 | try {
1854 | config.remoteStream.addTrack(ev.target);
1855 | pluginHandle.onremotestream(config.remoteStream);
1856 | } catch(e) {
1857 | Janus.error(e);
1858 | }
1859 | }
1860 | };
1861 | };
1862 | }
1863 | if(addTracks && stream) {
1864 | Janus.log('Adding local stream');
1865 | var simulcast2 = (callbacks.simulcast2 === true);
1866 | stream.getTracks().forEach(function(track) {
1867 | Janus.log('Adding local track:', track);
1868 | if(!simulcast2) {
1869 | var sender = config.pc.addTrack(track, stream);
1870 | // Check if insertable streams are involved
1871 | if(sender && config.senderTransforms) {
1872 | var senderStreams = null;
1873 | if(sender.track.kind === "audio" && config.senderTransforms["audio"]) {
1874 | senderStreams = sender.createEncodedAudioStreams();
1875 | } else if(sender.track.kind === "video" && config.senderTransforms["video"]) {
1876 | senderStreams = sender.createEncodedVideoStreams();
1877 | }
1878 | if(senderStreams) {
1879 | senderStreams.readableStream
1880 | .pipeThrough(config.senderTransforms[sender.track.kind])
1881 | .pipeTo(senderStreams.writableStream);
1882 | }
1883 | }
1884 | } else {
1885 | if(track.kind === "audio") {
1886 | config.pc.addTrack(track, stream);
1887 | } else {
1888 | Janus.log('Enabling rid-based simulcasting:', track);
1889 | var maxBitrates = getMaxBitrates(callbacks.simulcastMaxBitrates);
1890 | config.pc.addTransceiver(track, {
1891 | direction: "sendrecv",
1892 | streams: [stream],
1893 | sendEncodings: [
1894 | { rid: "h", active: true, maxBitrate: maxBitrates.high },
1895 | { rid: "m", active: true, maxBitrate: maxBitrates.medium, scaleResolutionDownBy: 2 },
1896 | { rid: "l", active: true, maxBitrate: maxBitrates.low, scaleResolutionDownBy: 4 }
1897 | ]
1898 | });
1899 | }
1900 | }
1901 | });
1902 | }
1903 | // Any data channel to create?
1904 | if(isDataEnabled(media) && !config.dataChannel[Janus.dataChanDefaultLabel]) {
1905 | Janus.log("Creating default data channel");
1906 | createDataChannel(handleId, Janus.dataChanDefaultLabel, null, false);
1907 | config.pc.ondatachannel = function(event) {
1908 | Janus.log("Data channel created by Janus:", event);
1909 | createDataChannel(handleId, event.channel.label, event.channel.protocol, event.channel);
1910 | };
1911 | }
1912 | // If there's a new local stream, let's notify the application
1913 | if(config.myStream) {
1914 | pluginHandle.onlocalstream(config.myStream);
1915 | }
1916 | // Create offer/answer now
1917 | if(!jsep) {
1918 | createOffer(handleId, media, callbacks);
1919 | } else {
1920 | config.pc.setRemoteDescription(jsep)
1921 | .then(function() {
1922 | Janus.log("Remote description accepted!");
1923 | config.remoteSdp = jsep.sdp;
1924 | // Any trickle candidate we cached?
1925 | if(config.candidates && config.candidates.length > 0) {
1926 | for(var i = 0; i < config.candidates.length; i++) {
1927 | var candidate = config.candidates[i];
1928 | Janus.debug("Adding remote candidate:", candidate);
1929 | if(!candidate || candidate.completed === true) {
1930 | // end-of-candidates
1931 | config.pc.addIceCandidate(Janus.endOfCandidates);
1932 | } else {
1933 | // New candidate
1934 | config.pc.addIceCandidate(candidate);
1935 | }
1936 | }
1937 | config.candidates = [];
1938 | }
1939 | // Create the answer now
1940 | createAnswer(handleId, media, callbacks);
1941 | }, callbacks.error);
1942 | }
1943 | }
1944 |
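`streamsDone` above flushes any trickle candidates that were cached before the remote description was applied (`config.candidates`), replaying them in order once `setRemoteDescription()` resolves. As a minimal standalone sketch of that caching pattern (a hypothetical helper, not part of janus.js):

```javascript
// Hypothetical helper illustrating the candidate-caching pattern used by
// streamsDone/prepareWebrtcPeer: remote candidates arriving before the
// remote description is set are queued, then replayed in order afterwards.
function makeCandidateQueue(pc) {
  let queue = [];
  let remoteSet = false;
  return {
    // Called for every remote trickle candidate
    add(candidate) {
      if(!remoteSet) {
        queue.push(candidate);
        return;
      }
      pc.addIceCandidate(candidate);
    },
    // Called once setRemoteDescription() has resolved
    flush() {
      remoteSet = true;
      for(const c of queue)
        pc.addIceCandidate(c);
      queue = [];
    }
  };
}
```

janus.js does the same inline, additionally translating a cached `{ completed: true }` entry into the end-of-candidates marker (`Janus.endOfCandidates`).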
1945 | function prepareWebrtc(handleId, offer, callbacks) {
1946 | callbacks = callbacks || {};
1947 | callbacks.success = (typeof callbacks.success == "function") ? callbacks.success : Janus.noop;
1948 | callbacks.error = (typeof callbacks.error == "function") ? callbacks.error : webrtcError;
1949 | var jsep = callbacks.jsep;
1950 | if(offer && jsep) {
1951 | Janus.error("Provided a JSEP to a createOffer");
1952 | callbacks.error("Provided a JSEP to a createOffer");
1953 | return;
1954 | } else if(!offer && (!jsep || !jsep.type || !jsep.sdp)) {
1955 | Janus.error("A valid JSEP is required for createAnswer");
1956 | callbacks.error("A valid JSEP is required for createAnswer");
1957 | return;
1958 | }
1959 | /* Check that callbacks.media is a (not null) Object */
1960 | callbacks.media = (typeof callbacks.media === 'object' && callbacks.media) ? callbacks.media : { audio: true, video: true };
1961 | var media = callbacks.media;
1962 | var pluginHandle = pluginHandles[handleId];
1963 | if(!pluginHandle || !pluginHandle.webrtcStuff) {
1964 | Janus.warn("Invalid handle");
1965 | callbacks.error("Invalid handle");
1966 | return;
1967 | }
1968 | var config = pluginHandle.webrtcStuff;
1969 | config.trickle = isTrickleEnabled(callbacks.trickle);
1970 | // Are we updating a session?
1971 | if(!config.pc) {
1972 | // Nope, new PeerConnection
1973 | media.update = false;
1974 | media.keepAudio = false;
1975 | media.keepVideo = false;
1976 | } else {
1977 | Janus.log("Updating existing media session");
1978 | media.update = true;
1979 | // Check if there's anything to add/remove/replace, or if we
1980 | // can go directly to preparing the new SDP offer or answer
1981 | if(callbacks.stream) {
1982 | // External stream: is this the same as the one we were using before?
1983 | if(callbacks.stream !== config.myStream) {
1984 | Janus.log("Renegotiation involves a new external stream");
1985 | }
1986 | } else {
1987 | // Check if there are changes on audio
1988 | if(media.addAudio) {
1989 | media.keepAudio = false;
1990 | media.replaceAudio = false;
1991 | media.removeAudio = false;
1992 | media.audioSend = true;
1993 | if(config.myStream && config.myStream.getAudioTracks() && config.myStream.getAudioTracks().length) {
1994 | Janus.error("Can't add audio stream, there already is one");
1995 | callbacks.error("Can't add audio stream, there already is one");
1996 | return;
1997 | }
1998 | } else if(media.removeAudio) {
1999 | media.keepAudio = false;
2000 | media.replaceAudio = false;
2001 | media.addAudio = false;
2002 | media.audioSend = false;
2003 | } else if(media.replaceAudio) {
2004 | media.keepAudio = false;
2005 | media.addAudio = false;
2006 | media.removeAudio = false;
2007 | media.audioSend = true;
2008 | }
2009 | if(!config.myStream) {
2010 | // No media stream: if we were asked to replace, it's actually an "add"
2011 | if(media.replaceAudio) {
2012 | media.keepAudio = false;
2013 | media.replaceAudio = false;
2014 | media.addAudio = true;
2015 | media.audioSend = true;
2016 | }
2017 | if(isAudioSendEnabled(media)) {
2018 | media.keepAudio = false;
2019 | media.addAudio = true;
2020 | }
2021 | } else {
2022 | if(!config.myStream.getAudioTracks() || config.myStream.getAudioTracks().length === 0) {
2023 | // No audio track: if we were asked to replace, it's actually an "add"
2024 | if(media.replaceAudio) {
2025 | media.keepAudio = false;
2026 | media.replaceAudio = false;
2027 | media.addAudio = true;
2028 | media.audioSend = true;
2029 | }
2030 | if(isAudioSendEnabled(media)) {
2031 | media.keepAudio = false;
2032 | media.addAudio = true;
2033 | }
2034 | } else {
2035 | // We have an audio track: should we keep it as it is?
2036 | if(isAudioSendEnabled(media) &&
2037 | !media.removeAudio && !media.replaceAudio) {
2038 | media.keepAudio = true;
2039 | }
2040 | }
2041 | }
2042 | // Check if there are changes on video
2043 | if(media.addVideo) {
2044 | media.keepVideo = false;
2045 | media.replaceVideo = false;
2046 | media.removeVideo = false;
2047 | media.videoSend = true;
2048 | if(config.myStream && config.myStream.getVideoTracks() && config.myStream.getVideoTracks().length) {
2049 | Janus.error("Can't add video stream, there already is one");
2050 | callbacks.error("Can't add video stream, there already is one");
2051 | return;
2052 | }
2053 | } else if(media.removeVideo) {
2054 | media.keepVideo = false;
2055 | media.replaceVideo = false;
2056 | media.addVideo = false;
2057 | media.videoSend = false;
2058 | } else if(media.replaceVideo) {
2059 | media.keepVideo = false;
2060 | media.addVideo = false;
2061 | media.removeVideo = false;
2062 | media.videoSend = true;
2063 | }
2064 | if(!config.myStream) {
2065 | // No media stream: if we were asked to replace, it's actually an "add"
2066 | if(media.replaceVideo) {
2067 | media.keepVideo = false;
2068 | media.replaceVideo = false;
2069 | media.addVideo = true;
2070 | media.videoSend = true;
2071 | }
2072 | if(isVideoSendEnabled(media)) {
2073 | media.keepVideo = false;
2074 | media.addVideo = true;
2075 | }
2076 | } else {
2077 | if(!config.myStream.getVideoTracks() || config.myStream.getVideoTracks().length === 0) {
2078 | // No video track: if we were asked to replace, it's actually an "add"
2079 | if(media.replaceVideo) {
2080 | media.keepVideo = false;
2081 | media.replaceVideo = false;
2082 | media.addVideo = true;
2083 | media.videoSend = true;
2084 | }
2085 | if(isVideoSendEnabled(media)) {
2086 | media.keepVideo = false;
2087 | media.addVideo = true;
2088 | }
2089 | } else {
2090 | // We have a video track: should we keep it as it is?
2091 | if(isVideoSendEnabled(media) && !media.removeVideo && !media.replaceVideo) {
2092 | media.keepVideo = true;
2093 | }
2094 | }
2095 | }
2096 | // Data channels can only be added
2097 | if(media.addData) {
2098 | media.data = true;
2099 | }
2100 | }
2101 | // If we're updating and keeping all tracks, let's skip the getUserMedia part
2102 | if((isAudioSendEnabled(media) && media.keepAudio) &&
2103 | (isVideoSendEnabled(media) && media.keepVideo)) {
2104 | pluginHandle.consentDialog(false);
2105 | streamsDone(handleId, jsep, media, callbacks, config.myStream);
2106 | return;
2107 | }
2108 | }
2109 | // If we're updating, check if we need to remove/replace one of the tracks
2110 | if(media.update && !config.streamExternal) {
2111 | if(media.removeAudio || media.replaceAudio) {
2112 | if(config.myStream && config.myStream.getAudioTracks() && config.myStream.getAudioTracks().length) {
2113 | var at = config.myStream.getAudioTracks()[0];
2114 | Janus.log("Removing audio track:", at);
2115 | config.myStream.removeTrack(at);
2116 | try {
2117 | at.stop();
2118 | } catch(e) {}
2119 | }
2120 | if(config.pc.getSenders() && config.pc.getSenders().length) {
2121 | var ra = true;
2122 | if(media.replaceAudio && Janus.unifiedPlan) {
2123 | // We can use replaceTrack
2124 | ra = false;
2125 | }
2126 | if(ra) {
2127 | for(var asnd of config.pc.getSenders()) {
2128 | if(asnd && asnd.track && asnd.track.kind === "audio") {
2129 | Janus.log("Removing audio sender:", asnd);
2130 | config.pc.removeTrack(asnd);
2131 | }
2132 | }
2133 | }
2134 | }
2135 | }
2136 | if(media.removeVideo || media.replaceVideo) {
2137 | if(config.myStream && config.myStream.getVideoTracks() && config.myStream.getVideoTracks().length) {
2138 | var vt = config.myStream.getVideoTracks()[0];
2139 | Janus.log("Removing video track:", vt);
2140 | config.myStream.removeTrack(vt);
2141 | try {
2142 | vt.stop();
2143 | } catch(e) {}
2144 | }
2145 | if(config.pc.getSenders() && config.pc.getSenders().length) {
2146 | var rv = true;
2147 | if(media.replaceVideo && Janus.unifiedPlan) {
2148 | // We can use replaceTrack
2149 | rv = false;
2150 | }
2151 | if(rv) {
2152 | for(var vsnd of config.pc.getSenders()) {
2153 | if(vsnd && vsnd.track && vsnd.track.kind === "video") {
2154 | Janus.log("Removing video sender:", vsnd);
2155 | config.pc.removeTrack(vsnd);
2156 | }
2157 | }
2158 | }
2159 | }
2160 | }
2161 | }
2162 | // Was a MediaStream object passed, or do we need to take care of that?
2163 | if(callbacks.stream) {
2164 | var stream = callbacks.stream;
2165 | Janus.log("MediaStream provided by the application");
2166 | Janus.debug(stream);
2167 | // If this is an update, let's check if we need to release the previous stream
2168 | if(media.update) {
2169 | if(config.myStream && config.myStream !== callbacks.stream && !config.streamExternal) {
2170 | // We're replacing a stream we captured ourselves with an external one
2171 | Janus.stopAllTracks(config.myStream);
2172 | config.myStream = null;
2173 | }
2174 | }
2175 | // Skip the getUserMedia part
2176 | config.streamExternal = true;
2177 | pluginHandle.consentDialog(false);
2178 | streamsDone(handleId, jsep, media, callbacks, stream);
2179 | return;
2180 | }
2181 | if(isAudioSendEnabled(media) || isVideoSendEnabled(media)) {
2182 | if(!Janus.isGetUserMediaAvailable()) {
2183 | callbacks.error("getUserMedia not available");
2184 | return;
2185 | }
2186 | var constraints = { mandatory: {}, optional: []};
2187 | pluginHandle.consentDialog(true);
2188 | var audioSupport = isAudioSendEnabled(media);
2189 | if(audioSupport && media && typeof media.audio === 'object')
2190 | audioSupport = media.audio;
2191 | var videoSupport = isVideoSendEnabled(media);
2192 | if(videoSupport && media) {
2193 | var simulcast = (callbacks.simulcast === true);
2194 | var simulcast2 = (callbacks.simulcast2 === true);
2195 | if((simulcast || simulcast2) && !jsep && !media.video)
2196 | media.video = "hires";
2197 | if(media.video && media.video != 'screen' && media.video != 'window') {
2198 | if(typeof media.video === 'object') {
2199 | videoSupport = media.video;
2200 | } else {
2201 | var width = 0;
2202 | var height = 0, maxHeight = 0;
2203 | if(media.video === 'lowres') {
2204 | // Small resolution, 4:3
2205 | height = 240;
2206 | maxHeight = 240;
2207 | width = 320;
2208 | } else if(media.video === 'lowres-16:9') {
2209 | // Small resolution, 16:9
2210 | height = 180;
2211 | maxHeight = 180;
2212 | width = 320;
2213 | } else if(media.video === 'hires' || media.video === 'hires-16:9' || media.video === 'hdres') {
2214 | // High (HD) resolution is only 16:9
2215 | height = 720;
2216 | maxHeight = 720;
2217 | width = 1280;
2218 | } else if(media.video === 'fhdres') {
2219 | // Full HD resolution is only 16:9
2220 | height = 1080;
2221 | maxHeight = 1080;
2222 | width = 1920;
2223 | } else if(media.video === '4kres') {
2224 | // 4K resolution is only 16:9
2225 | height = 2160;
2226 | maxHeight = 2160;
2227 | width = 3840;
2228 | } else if(media.video === 'stdres') {
2229 | // Normal resolution, 4:3
2230 | height = 480;
2231 | maxHeight = 480;
2232 | width = 640;
2233 | } else if(media.video === 'stdres-16:9') {
2234 | // Normal resolution, 16:9
2235 | height = 360;
2236 | maxHeight = 360;
2237 | width = 640;
2238 | } else {
2239 | Janus.log("Default video setting is stdres 4:3");
2240 | height = 480;
2241 | maxHeight = 480;
2242 | width = 640;
2243 | }
2244 | Janus.log("Adding media constraint:", media.video);
2245 | videoSupport = {
2246 | 'height': {'ideal': height},
2247 | 'width': {'ideal': width}
2248 | };
2249 | Janus.log("Adding video constraint:", videoSupport);
2250 | }
2251 | } else if(media.video === 'screen' || media.video === 'window') {
2252 | if(navigator.mediaDevices && navigator.mediaDevices.getDisplayMedia) {
2253 | // The new experimental getDisplayMedia API is available, so let's use that
2254 | // https://groups.google.com/forum/#!topic/discuss-webrtc/Uf0SrR4uxzk
2255 | // https://webrtchacks.com/chrome-screensharing-getdisplaymedia/
2256 | constraints.video = {};
2257 | if(media.screenshareFrameRate) {
2258 | constraints.video.frameRate = media.screenshareFrameRate;
2259 | }
2260 | if(media.screenshareHeight) {
2261 | constraints.video.height = media.screenshareHeight;
2262 | }
2263 | if(media.screenshareWidth) {
2264 | constraints.video.width = media.screenshareWidth;
2265 | }
2266 | constraints.audio = media.captureDesktopAudio;
2267 | navigator.mediaDevices.getDisplayMedia(constraints)
2268 | .then(function(stream) {
2269 | pluginHandle.consentDialog(false);
2270 | if(isAudioSendEnabled(media) && !media.keepAudio) {
2271 | navigator.mediaDevices.getUserMedia({ audio: true, video: false })
2272 | .then(function (audioStream) {
2273 | stream.addTrack(audioStream.getAudioTracks()[0]);
2274 | streamsDone(handleId, jsep, media, callbacks, stream);
2275 | })
2276 | } else {
2277 | streamsDone(handleId, jsep, media, callbacks, stream);
2278 | }
2279 | }, function (error) {
2280 | pluginHandle.consentDialog(false);
2281 | callbacks.error(error);
2282 | });
2283 | return;
2284 | }
2285 | // We're going to try and use the extension for Chrome 34+, the old approach
2286 | // for older versions of Chrome, or the experimental support in Firefox 33+
2287 | function callbackUserMedia (error, stream) {
2288 | pluginHandle.consentDialog(false);
2289 | if(error) {
2290 | callbacks.error(error);
2291 | } else {
2292 | streamsDone(handleId, jsep, media, callbacks, stream);
2293 | }
2294 | }
2295 | function getScreenMedia(constraints, gsmCallback, useAudio) {
2296 | Janus.log("Adding media constraint (screen capture)");
2297 | Janus.debug(constraints);
2298 | navigator.mediaDevices.getUserMedia(constraints)
2299 | .then(function(stream) {
2300 | if(useAudio) {
2301 | navigator.mediaDevices.getUserMedia({ audio: true, video: false })
2302 | .then(function (audioStream) {
2303 | stream.addTrack(audioStream.getAudioTracks()[0]);
2304 | gsmCallback(null, stream);
2305 | })
2306 | } else {
2307 | gsmCallback(null, stream);
2308 | }
2309 | })
2310 | .catch(function(error) { pluginHandle.consentDialog(false); gsmCallback(error); });
2311 | }
2312 | if(Janus.webRTCAdapter.browserDetails.browser === 'chrome') {
2313 | var chromever = Janus.webRTCAdapter.browserDetails.version;
2314 | var maxver = 33;
2315 | if(window.navigator.userAgent.match('Linux'))
2316 | maxver = 35; // "known" crash in chrome 34 and 35 on linux
2317 | if(chromever >= 26 && chromever <= maxver) {
2318 | // Chrome 26->33 requires some awkward chrome://flags manipulation
2319 | constraints = {
2320 | video: {
2321 | mandatory: {
2322 | googLeakyBucket: true,
2323 | maxWidth: window.screen.width,
2324 | maxHeight: window.screen.height,
2325 | minFrameRate: media.screenshareFrameRate,
2326 | maxFrameRate: media.screenshareFrameRate,
2327 | chromeMediaSource: 'screen'
2328 | }
2329 | },
2330 | audio: isAudioSendEnabled(media) && !media.keepAudio
2331 | };
2332 | getScreenMedia(constraints, callbackUserMedia);
2333 | } else {
2334 | // Chrome 34+ requires an extension
2335 | Janus.extension.getScreen(function (error, sourceId) {
2336 | if (error) {
2337 | pluginHandle.consentDialog(false);
2338 | return callbacks.error(error);
2339 | }
2340 | constraints = {
2341 | audio: false,
2342 | video: {
2343 | mandatory: {
2344 | chromeMediaSource: 'desktop',
2345 | maxWidth: window.screen.width,
2346 | maxHeight: window.screen.height,
2347 | minFrameRate: media.screenshareFrameRate,
2348 | maxFrameRate: media.screenshareFrameRate,
2349 | },
2350 | optional: [
2351 | {googLeakyBucket: true},
2352 | {googTemporalLayeredScreencast: true}
2353 | ]
2354 | }
2355 | };
2356 | constraints.video.mandatory.chromeMediaSourceId = sourceId;
2357 | getScreenMedia(constraints, callbackUserMedia,
2358 | isAudioSendEnabled(media) && !media.keepAudio);
2359 | });
2360 | }
2361 | } else if(Janus.webRTCAdapter.browserDetails.browser === 'firefox') {
2362 | if(Janus.webRTCAdapter.browserDetails.version >= 33) {
2363 | // Firefox 33+ has experimental support for screen sharing
2364 | constraints = {
2365 | video: {
2366 | mozMediaSource: media.video,
2367 | mediaSource: media.video
2368 | },
2369 | audio: isAudioSendEnabled(media) && !media.keepAudio
2370 | };
2371 | getScreenMedia(constraints, function (err, stream) {
2372 | callbackUserMedia(err, stream);
2373 | // Workaround for https://bugzilla.mozilla.org/show_bug.cgi?id=1045810
2374 | if (!err) {
2375 | var lastTime = stream.currentTime;
2376 | var polly = window.setInterval(function () {
2377 | if(!stream)
2378 | window.clearInterval(polly);
2379 | if(stream.currentTime == lastTime) {
2380 | window.clearInterval(polly);
2381 | if(stream.onended) {
2382 | stream.onended();
2383 | }
2384 | }
2385 | lastTime = stream.currentTime;
2386 | }, 500);
2387 | }
2388 | });
2389 | } else {
2390 | var error = new Error('Your version of Firefox does not support screen sharing, please install Firefox 33 (or more recent versions)');
2391 | error.name = 'NavigatorUserMediaError';
2392 | pluginHandle.consentDialog(false);
2393 | callbacks.error(error);
2394 | return;
2395 | }
2396 | }
2397 | return;
2398 | }
2399 | }
2400 | // If we got here, we're not screensharing
2401 | if(!media || media.video !== 'screen') {
2402 | // Check whether all media sources are actually available or not
2403 | navigator.mediaDevices.enumerateDevices().then(function(devices) {
2404 | var audioExist = devices.some(function(device) {
2405 | return device.kind === 'audioinput';
2406 | }),
2407 | videoExist = isScreenSendEnabled(media) || devices.some(function(device) {
2408 | return device.kind === 'videoinput';
2409 | });
2410 |
2411 | // Check whether a missing device is really a problem
2412 | var audioSend = isAudioSendEnabled(media);
2413 | var videoSend = isVideoSendEnabled(media);
2414 | var needAudioDevice = isAudioSendRequired(media);
2415 | var needVideoDevice = isVideoSendRequired(media);
2416 | if(audioSend || videoSend || needAudioDevice || needVideoDevice) {
2417 | // We need to send either audio or video
2418 | var haveAudioDevice = audioSend ? audioExist : false;
2419 | var haveVideoDevice = videoSend ? videoExist : false;
2420 | if(!haveAudioDevice && !haveVideoDevice) {
2421 | // FIXME Should we really give up, or just assume recvonly for both?
2422 | pluginHandle.consentDialog(false);
2423 | callbacks.error('No capture device found');
2424 | return false;
2425 | } else if(!haveAudioDevice && needAudioDevice) {
2426 | pluginHandle.consentDialog(false);
2427 | callbacks.error('Audio capture is required, but no capture device found');
2428 | return false;
2429 | } else if(!haveVideoDevice && needVideoDevice) {
2430 | pluginHandle.consentDialog(false);
2431 | callbacks.error('Video capture is required, but no capture device found');
2432 | return false;
2433 | }
2434 | }
2435 |
2436 | var gumConstraints = {
2437 | audio: (audioExist && !media.keepAudio) ? audioSupport : false,
2438 | video: (videoExist && !media.keepVideo) ? videoSupport : false
2439 | };
2440 | Janus.debug("getUserMedia constraints", gumConstraints);
2441 | if (!gumConstraints.audio && !gumConstraints.video) {
2442 | pluginHandle.consentDialog(false);
2443 | streamsDone(handleId, jsep, media, callbacks, stream);
2444 | } else {
2445 | navigator.mediaDevices.getUserMedia(gumConstraints)
2446 | .then(function(stream) {
2447 | pluginHandle.consentDialog(false);
2448 | streamsDone(handleId, jsep, media, callbacks, stream);
2449 | }).catch(function(error) {
2450 | pluginHandle.consentDialog(false);
2451 | callbacks.error({code: error.code, name: error.name, message: error.message});
2452 | });
2453 | }
2454 | })
2455 | .catch(function(error) {
2456 | pluginHandle.consentDialog(false);
2457 | callbacks.error(error);
2458 | });
2459 | }
2460 | } else {
2461 | // No need to do a getUserMedia, create offer/answer right away
2462 | streamsDone(handleId, jsep, media, callbacks);
2463 | }
2464 | }
2465 |
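`prepareWebrtc` above maps the shorthand video presets (`lowres`, `stdres`, `hires`, …) to ideal width/height getUserMedia constraints through a chain of if/else branches. The same mapping can be expressed as a lookup table; this is a sketch of that logic, not an API janus.js exposes:

```javascript
// Hypothetical table form of the preset ladder in prepareWebrtc; each entry
// becomes { height: { ideal }, width: { ideal } } for getUserMedia.
const videoPresets = {
  'lowres':      { width: 320,  height: 240  }, // small, 4:3
  'lowres-16:9': { width: 320,  height: 180  }, // small, 16:9
  'stdres':      { width: 640,  height: 480  }, // normal, 4:3 (default)
  'stdres-16:9': { width: 640,  height: 360  }, // normal, 16:9
  'hires':       { width: 1280, height: 720  }, // HD is 16:9 only
  'hires-16:9':  { width: 1280, height: 720  },
  'hdres':       { width: 1280, height: 720  },
  'fhdres':      { width: 1920, height: 1080 }, // Full HD
  '4kres':       { width: 3840, height: 2160 }  // 4K
};

function presetToConstraints(preset) {
  // Unknown presets fall back to stdres 4:3, as in the original code
  const p = videoPresets[preset] || videoPresets['stdres'];
  return { height: { ideal: p.height }, width: { ideal: p.width } };
}
```

Note that the branch above only applies this when `media.video` is a string; a `media.video` object is passed to getUserMedia as-is, and `screen`/`window` take the screen-capture path instead.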
2466 | function prepareWebrtcPeer(handleId, callbacks) {
2467 | callbacks = callbacks || {};
2468 | callbacks.success = (typeof callbacks.success == "function") ? callbacks.success : Janus.noop;
2469 | callbacks.error = (typeof callbacks.error == "function") ? callbacks.error : webrtcError;
2470 | var jsep = callbacks.jsep;
2471 | var pluginHandle = pluginHandles[handleId];
2472 | if(!pluginHandle || !pluginHandle.webrtcStuff) {
2473 | Janus.warn("Invalid handle");
2474 | callbacks.error("Invalid handle");
2475 | return;
2476 | }
2477 | var config = pluginHandle.webrtcStuff;
2478 | if(jsep) {
2479 | if(!config.pc) {
2480 | Janus.warn("No PeerConnection: if this is an answer, use createAnswer and not handleRemoteJsep");
2481 | callbacks.error("No PeerConnection: if this is an answer, use createAnswer and not handleRemoteJsep");
2482 | return;
2483 | }
2484 | config.pc.setRemoteDescription(jsep)
2485 | .then(function() {
2486 | Janus.log("Remote description accepted!");
2487 | config.remoteSdp = jsep.sdp;
2488 | // Any trickle candidate we cached?
2489 | if(config.candidates && config.candidates.length > 0) {
2490 | for(var i = 0; i < config.candidates.length; i++) {
2491 | var candidate = config.candidates[i];
2492 | Janus.debug("Adding remote candidate:", candidate);
2493 | if(!candidate || candidate.completed === true) {
2494 | // end-of-candidates
2495 | config.pc.addIceCandidate(Janus.endOfCandidates);
2496 | } else {
2497 | // New candidate
2498 | config.pc.addIceCandidate(candidate);
2499 | }
2500 | }
2501 | config.candidates = [];
2502 | }
2503 | // Done
2504 | callbacks.success();
2505 | }, callbacks.error);
2506 | } else {
2507 | callbacks.error("Invalid JSEP");
2508 | }
2509 | }
2510 |
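`createOffer`, which follows, picks a direction for each audio/video transceiver from the send/recv flags. The decision table it implements reduces to a small helper (a hypothetical sketch; janus.js inlines this per m-line):

```javascript
// Hypothetical reduction of the per-m-line direction logic in createOffer:
// the transceiver direction follows directly from the send/recv flags.
function transceiverDirection(send, recv) {
  if(send && recv) return 'sendrecv';
  if(send) return 'sendonly';
  if(recv) return 'recvonly';
  return 'inactive';
}
```

Beyond this table, janus.js also falls back to assigning `transceiver.direction` directly when `setDirection()` is unavailable, and only creates a brand-new transceiver in the recvonly case, since that is the only one where no local track exists to have created one implicitly.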
2511 | function createOffer(handleId, media, callbacks) {
2512 | callbacks = callbacks || {};
2513 | callbacks.success = (typeof callbacks.success == "function") ? callbacks.success : Janus.noop;
2514 | callbacks.error = (typeof callbacks.error == "function") ? callbacks.error : Janus.noop;
2515 | callbacks.customizeSdp = (typeof callbacks.customizeSdp == "function") ? callbacks.customizeSdp : Janus.noop;
2516 | var pluginHandle = pluginHandles[handleId];
2517 | if(!pluginHandle || !pluginHandle.webrtcStuff) {
2518 | Janus.warn("Invalid handle");
2519 | callbacks.error("Invalid handle");
2520 | return;
2521 | }
2522 | var config = pluginHandle.webrtcStuff;
2523 | var simulcast = (callbacks.simulcast === true);
2524 | if(!simulcast) {
2525 | Janus.log("Creating offer (iceDone=" + config.iceDone + ")");
2526 | } else {
2527 | Janus.log("Creating offer (iceDone=" + config.iceDone + ", simulcast=" + simulcast + ")");
2528 | }
2529 | // https://code.google.com/p/webrtc/issues/detail?id=3508
2530 | var mediaConstraints = {};
2531 | if(Janus.unifiedPlan) {
2532 | // We can use Transceivers
2533 | var audioTransceiver = null, videoTransceiver = null;
2534 | var transceivers = config.pc.getTransceivers();
2535 | if(transceivers && transceivers.length > 0) {
2536 | for(var t of transceivers) {
2537 | if((t.sender && t.sender.track && t.sender.track.kind === "audio") ||
2538 | (t.receiver && t.receiver.track && t.receiver.track.kind === "audio")) {
2539 | if(!audioTransceiver) {
2540 | audioTransceiver = t;
2541 | }
2542 | continue;
2543 | }
2544 | if((t.sender && t.sender.track && t.sender.track.kind === "video") ||
2545 | (t.receiver && t.receiver.track && t.receiver.track.kind === "video")) {
2546 | if(!videoTransceiver) {
2547 | videoTransceiver = t;
2548 | }
2549 | continue;
2550 | }
2551 | }
2552 | }
2553 | // Handle audio (and related changes, if any)
2554 | var audioSend = isAudioSendEnabled(media);
2555 | var audioRecv = isAudioRecvEnabled(media);
2556 | if(!audioSend && !audioRecv) {
2557 | // Audio disabled: have we removed it?
2558 | if(media.removeAudio && audioTransceiver) {
2559 | if (audioTransceiver.setDirection) {
2560 | audioTransceiver.setDirection("inactive");
2561 | } else {
2562 | audioTransceiver.direction = "inactive";
2563 | }
2564 | Janus.log("Setting audio transceiver to inactive:", audioTransceiver);
2565 | }
2566 | } else {
2567 | // Take care of audio m-line
2568 | if(audioSend && audioRecv) {
2569 | if(audioTransceiver) {
2570 | if (audioTransceiver.setDirection) {
2571 | audioTransceiver.setDirection("sendrecv");
2572 | } else {
2573 | audioTransceiver.direction = "sendrecv";
2574 | }
2575 | Janus.log("Setting audio transceiver to sendrecv:", audioTransceiver);
2576 | }
2577 | } else if(audioSend && !audioRecv) {
2578 | if(audioTransceiver) {
2579 | if (audioTransceiver.setDirection) {
2580 | audioTransceiver.setDirection("sendonly");
2581 | } else {
2582 | audioTransceiver.direction = "sendonly";
2583 | }
2584 | Janus.log("Setting audio transceiver to sendonly:", audioTransceiver);
2585 | }
2586 | } else if(!audioSend && audioRecv) {
2587 | if(audioTransceiver) {
2588 | if (audioTransceiver.setDirection) {
2589 | audioTransceiver.setDirection("recvonly");
2590 | } else {
2591 | audioTransceiver.direction = "recvonly";
2592 | }
2593 | Janus.log("Setting audio transceiver to recvonly:", audioTransceiver);
2594 | } else {
2595 | // In theory, this is the only case where we might not have a transceiver yet
2596 | audioTransceiver = config.pc.addTransceiver("audio", { direction: "recvonly" });
2597 | Janus.log("Adding recvonly audio transceiver:", audioTransceiver);
2598 | }
2599 | }
2600 | }
2601 | // Handle video (and related changes, if any)
2602 | var videoSend = isVideoSendEnabled(media);
2603 | var videoRecv = isVideoRecvEnabled(media);
2604 | if(!videoSend && !videoRecv) {
2605 | // Video disabled: have we removed it?
2606 | if(media.removeVideo && videoTransceiver) {
2607 | if (videoTransceiver.setDirection) {
2608 | videoTransceiver.setDirection("inactive");
2609 | } else {
2610 | videoTransceiver.direction = "inactive";
2611 | }
2612 | Janus.log("Setting video transceiver to inactive:", videoTransceiver);
2613 | }
2614 | } else {
2615 | // Take care of video m-line
2616 | if(videoSend && videoRecv) {
2617 | if(videoTransceiver) {
2618 | if (videoTransceiver.setDirection) {
2619 | videoTransceiver.setDirection("sendrecv");
2620 | } else {
2621 | videoTransceiver.direction = "sendrecv";
2622 | }
2623 | Janus.log("Setting video transceiver to sendrecv:", videoTransceiver);
2624 | }
2625 | } else if(videoSend && !videoRecv) {
2626 | if(videoTransceiver) {
2627 | if (videoTransceiver.setDirection) {
2628 | videoTransceiver.setDirection("sendonly");
2629 | } else {
2630 | videoTransceiver.direction = "sendonly";
2631 | }
2632 | Janus.log("Setting video transceiver to sendonly:", videoTransceiver);
2633 | }
2634 | } else if(!videoSend && videoRecv) {
2635 | if(videoTransceiver) {
2636 | if (videoTransceiver.setDirection) {
2637 | videoTransceiver.setDirection("recvonly");
2638 | } else {
2639 | videoTransceiver.direction = "recvonly";
2640 | }
2641 | Janus.log("Setting video transceiver to recvonly:", videoTransceiver);
2642 | } else {
2643 | // In theory, this is the only case where we might not have a transceiver yet
2644 | videoTransceiver = config.pc.addTransceiver("video", { direction: "recvonly" });
2645 | Janus.log("Adding recvonly video transceiver:", videoTransceiver);
2646 | }
2647 | }
2648 | }
2649 | } else {
2650 | mediaConstraints["offerToReceiveAudio"] = isAudioRecvEnabled(media);
2651 | mediaConstraints["offerToReceiveVideo"] = isVideoRecvEnabled(media);
2652 | }
2653 | var iceRestart = (callbacks.iceRestart === true);
2654 | if(iceRestart) {
2655 | mediaConstraints["iceRestart"] = true;
2656 | }
2657 | Janus.debug(mediaConstraints);
2658 | // Check if this is Firefox and we've been asked to do simulcasting
2659 | var sendVideo = isVideoSendEnabled(media);
2660 | if(sendVideo && simulcast && Janus.webRTCAdapter.browserDetails.browser === "firefox") {
2661 | // FIXME Based on https://gist.github.com/voluntas/088bc3cc62094730647b
2662 | Janus.log("Enabling Simulcasting for Firefox (RID)");
2663 | var sender = config.pc.getSenders().find(function(s) {return s.track.kind === "video"});
2664 | if(sender) {
2665 | var parameters = sender.getParameters();
2666 | if(!parameters) {
2667 | parameters = {};
2668 | }
2669 | var maxBitrates = getMaxBitrates(callbacks.simulcastMaxBitrates);
2670 | parameters.encodings = [
2671 | { rid: "h", active: true, maxBitrate: maxBitrates.high },
2672 | { rid: "m", active: true, maxBitrate: maxBitrates.medium, scaleResolutionDownBy: 2 },
2673 | { rid: "l", active: true, maxBitrate: maxBitrates.low, scaleResolutionDownBy: 4 }
2674 | ];
2675 | sender.setParameters(parameters);
2676 | }
2677 | }
2678 | config.pc.createOffer(mediaConstraints)
2679 | .then(function(offer) {
2680 | Janus.debug(offer);
2681 | // JSON.stringify doesn't work on some WebRTC objects anymore
2682 | // See https://code.google.com/p/chromium/issues/detail?id=467366
2683 | var jsep = {
2684 | "type": offer.type,
2685 | "sdp": offer.sdp
2686 | };
2687 | callbacks.customizeSdp(jsep);
2688 | offer.sdp = jsep.sdp;
2689 | Janus.log("Setting local description");
2690 | if(sendVideo && simulcast) {
2691 | // This SDP munging only works with Chrome (Safari STP may support it too)
2692 | if(Janus.webRTCAdapter.browserDetails.browser === "chrome" ||
2693 | Janus.webRTCAdapter.browserDetails.browser === "safari") {
2694 | Janus.log("Enabling Simulcasting for Chrome (SDP munging)");
2695 | offer.sdp = mungeSdpForSimulcasting(offer.sdp);
2696 | } else if(Janus.webRTCAdapter.browserDetails.browser !== "firefox") {
2697 | Janus.warn("simulcast=true, but this is not Chrome nor Firefox, ignoring");
2698 | }
2699 | }
2700 | config.mySdp = offer.sdp;
2701 | config.pc.setLocalDescription(offer)
2702 | .catch(callbacks.error);
2703 | config.mediaConstraints = mediaConstraints;
2704 | if(!config.iceDone && !config.trickle) {
2705 | // Don't do anything until we have all candidates
2706 | Janus.log("Waiting for all candidates...");
2707 | return;
2708 | }
2709 | // If transforms are present, notify Janus that the media is end-to-end encrypted
2710 | if(config.senderTransforms || config.receiverTransforms) {
2711 | offer["e2ee"] = true;
2712 | }
2713 | callbacks.success(offer);
2714 | }, callbacks.error);
2715 | }
2716 |
2717 | function createAnswer(handleId, media, callbacks) {
2718 | callbacks = callbacks || {};
2719 | callbacks.success = (typeof callbacks.success == "function") ? callbacks.success : Janus.noop;
2720 | callbacks.error = (typeof callbacks.error == "function") ? callbacks.error : Janus.noop;
2721 | callbacks.customizeSdp = (typeof callbacks.customizeSdp == "function") ? callbacks.customizeSdp : Janus.noop;
2722 | var pluginHandle = pluginHandles[handleId];
2723 | if(!pluginHandle || !pluginHandle.webrtcStuff) {
2724 | Janus.warn("Invalid handle");
2725 | callbacks.error("Invalid handle");
2726 | return;
2727 | }
2728 | var config = pluginHandle.webrtcStuff;
2729 | var simulcast = (callbacks.simulcast === true);
2730 | if(!simulcast) {
2731 | Janus.log("Creating answer (iceDone=" + config.iceDone + ")");
2732 | } else {
2733 | Janus.log("Creating answer (iceDone=" + config.iceDone + ", simulcast=" + simulcast + ")");
2734 | }
2735 | var mediaConstraints = null;
2736 | if(Janus.unifiedPlan) {
2737 | // We can use Transceivers
2738 | mediaConstraints = {};
2739 | var audioTransceiver = null, videoTransceiver = null;
2740 | var transceivers = config.pc.getTransceivers();
2741 | if(transceivers && transceivers.length > 0) {
2742 | for(var t of transceivers) {
2743 | if((t.sender && t.sender.track && t.sender.track.kind === "audio") ||
2744 | (t.receiver && t.receiver.track && t.receiver.track.kind === "audio")) {
2745 | if(!audioTransceiver)
2746 | audioTransceiver = t;
2747 | continue;
2748 | }
2749 | if((t.sender && t.sender.track && t.sender.track.kind === "video") ||
2750 | (t.receiver && t.receiver.track && t.receiver.track.kind === "video")) {
2751 | if(!videoTransceiver)
2752 | videoTransceiver = t;
2753 | continue;
2754 | }
2755 | }
2756 | }
2757 | // Handle audio (and related changes, if any)
2758 | var audioSend = isAudioSendEnabled(media);
2759 | var audioRecv = isAudioRecvEnabled(media);
2760 | if(!audioSend && !audioRecv) {
2761 | // Audio disabled: have we removed it?
2762 | if(media.removeAudio && audioTransceiver) {
2763 | try {
2764 | if (audioTransceiver.setDirection) {
2765 | audioTransceiver.setDirection("inactive");
2766 | } else {
2767 | audioTransceiver.direction = "inactive";
2768 | }
2769 | Janus.log("Setting audio transceiver to inactive:", audioTransceiver);
2770 | } catch(e) {
2771 | Janus.error(e);
2772 | }
2773 | }
2774 | } else {
2775 | // Take care of audio m-line
2776 | if(audioSend && audioRecv) {
2777 | if(audioTransceiver) {
2778 | try {
2779 | if (audioTransceiver.setDirection) {
2780 | audioTransceiver.setDirection("sendrecv");
2781 | } else {
2782 | audioTransceiver.direction = "sendrecv";
2783 | }
2784 | Janus.log("Setting audio transceiver to sendrecv:", audioTransceiver);
2785 | } catch(e) {
2786 | Janus.error(e);
2787 | }
2788 | }
2789 | } else if(audioSend && !audioRecv) {
2790 | try {
2791 | if(audioTransceiver) {
2792 | if (audioTransceiver.setDirection) {
2793 | audioTransceiver.setDirection("sendonly");
2794 | } else {
2795 | audioTransceiver.direction = "sendonly";
2796 | }
2797 | Janus.log("Setting audio transceiver to sendonly:", audioTransceiver);
2798 | }
2799 | } catch(e) {
2800 | Janus.error(e);
2801 | }
2802 | } else if(!audioSend && audioRecv) {
2803 | if(audioTransceiver) {
2804 | try {
2805 | if (audioTransceiver.setDirection) {
2806 | audioTransceiver.setDirection("recvonly");
2807 | } else {
2808 | audioTransceiver.direction = "recvonly";
2809 | }
2810 | Janus.log("Setting audio transceiver to recvonly:", audioTransceiver);
2811 | } catch(e) {
2812 | Janus.error(e);
2813 | }
2814 | } else {
2815 | // In theory, this is the only case where we might not have a transceiver yet
2816 | audioTransceiver = config.pc.addTransceiver("audio", { direction: "recvonly" });
2817 | Janus.log("Adding recvonly audio transceiver:", audioTransceiver);
2818 | }
2819 | }
2820 | }
2821 | // Handle video (and related changes, if any)
2822 | var videoSend = isVideoSendEnabled(media);
2823 | var videoRecv = isVideoRecvEnabled(media);
2824 | if(!videoSend && !videoRecv) {
2825 | // Video disabled: have we removed it?
2826 | if(media.removeVideo && videoTransceiver) {
2827 | try {
2828 | if (videoTransceiver.setDirection) {
2829 | videoTransceiver.setDirection("inactive");
2830 | } else {
2831 | videoTransceiver.direction = "inactive";
2832 | }
2833 | Janus.log("Setting video transceiver to inactive:", videoTransceiver);
2834 | } catch(e) {
2835 | Janus.error(e);
2836 | }
2837 | }
2838 | } else {
2839 | // Take care of video m-line
2840 | if(videoSend && videoRecv) {
2841 | if(videoTransceiver) {
2842 | try {
2843 | if (videoTransceiver.setDirection) {
2844 | videoTransceiver.setDirection("sendrecv");
2845 | } else {
2846 | videoTransceiver.direction = "sendrecv";
2847 | }
2848 | Janus.log("Setting video transceiver to sendrecv:", videoTransceiver);
2849 | } catch(e) {
2850 | Janus.error(e);
2851 | }
2852 | }
2853 | } else if(videoSend && !videoRecv) {
2854 | if(videoTransceiver) {
2855 | try {
2856 | if (videoTransceiver.setDirection) {
2857 | videoTransceiver.setDirection("sendonly");
2858 | } else {
2859 | videoTransceiver.direction = "sendonly";
2860 | }
2861 | Janus.log("Setting video transceiver to sendonly:", videoTransceiver);
2862 | } catch(e) {
2863 | Janus.error(e);
2864 | }
2865 | }
2866 | } else if(!videoSend && videoRecv) {
2867 | if(videoTransceiver) {
2868 | try {
2869 | if (videoTransceiver.setDirection) {
2870 | videoTransceiver.setDirection("recvonly");
2871 | } else {
2872 | videoTransceiver.direction = "recvonly";
2873 | }
2874 | Janus.log("Setting video transceiver to recvonly:", videoTransceiver);
2875 | } catch(e) {
2876 | Janus.error(e);
2877 | }
2878 | } else {
2879 | // In theory, this is the only case where we might not have a transceiver yet
2880 | videoTransceiver = config.pc.addTransceiver("video", { direction: "recvonly" });
2881 | Janus.log("Adding recvonly video transceiver:", videoTransceiver);
2882 | }
2883 | }
2884 | }
2885 | } else {
2886 | if(Janus.webRTCAdapter.browserDetails.browser === "firefox" || Janus.webRTCAdapter.browserDetails.browser === "edge") {
2887 | mediaConstraints = {
2888 | offerToReceiveAudio: isAudioRecvEnabled(media),
2889 | offerToReceiveVideo: isVideoRecvEnabled(media)
2890 | };
2891 | } else {
2892 | mediaConstraints = {
2893 | mandatory: {
2894 | OfferToReceiveAudio: isAudioRecvEnabled(media),
2895 | OfferToReceiveVideo: isVideoRecvEnabled(media)
2896 | }
2897 | };
2898 | }
2899 | }
2900 | Janus.debug(mediaConstraints);
2901 | // Check if this is Firefox and we've been asked to do simulcasting
2902 | var sendVideo = isVideoSendEnabled(media);
2903 | if(sendVideo && simulcast && Janus.webRTCAdapter.browserDetails.browser === "firefox") {
2904 | // FIXME Based on https://gist.github.com/voluntas/088bc3cc62094730647b
2905 | Janus.log("Enabling Simulcasting for Firefox (RID)");
2906 | var sender = config.pc.getSenders()[1];
2907 | Janus.log(sender);
2908 | var parameters = sender.getParameters();
2909 | Janus.log(parameters);
2910 |
2911 | var maxBitrates = getMaxBitrates(callbacks.simulcastMaxBitrates);
2912 | sender.setParameters({encodings: [
2913 | { rid: "high", active: true, priority: "high", maxBitrate: maxBitrates.high },
2914 | { rid: "medium", active: true, priority: "medium", maxBitrate: maxBitrates.medium },
2915 | { rid: "low", active: true, priority: "low", maxBitrate: maxBitrates.low }
2916 | ]});
2917 | }
2918 | config.pc.createAnswer(mediaConstraints)
2919 | .then(function(answer) {
2920 | Janus.debug(answer);
2921 | // JSON.stringify doesn't work on some WebRTC objects anymore
2922 | // See https://code.google.com/p/chromium/issues/detail?id=467366
2923 | var jsep = {
2924 | "type": answer.type,
2925 | "sdp": answer.sdp
2926 | };
2927 | callbacks.customizeSdp(jsep);
2928 | answer.sdp = jsep.sdp;
2929 | Janus.log("Setting local description");
2930 | if(sendVideo && simulcast) {
2931 | // This SDP munging only works with Chrome
2932 | if(Janus.webRTCAdapter.browserDetails.browser === "chrome") {
2933 | // FIXME Apparently trying to simulcast when answering breaks video in Chrome...
2934 | //~ Janus.log("Enabling Simulcasting for Chrome (SDP munging)");
2935 | //~ answer.sdp = mungeSdpForSimulcasting(answer.sdp);
2936 | Janus.warn("simulcast=true, but this is an answer, and video breaks in Chrome if we enable it");
2937 | } else if(Janus.webRTCAdapter.browserDetails.browser !== "firefox") {
2938 | Janus.warn("simulcast=true, but this is not Chrome nor Firefox, ignoring");
2939 | }
2940 | }
2941 | config.mySdp = answer.sdp;
2942 | config.pc.setLocalDescription(answer)
2943 | .catch(callbacks.error);
2944 | config.mediaConstraints = mediaConstraints;
2945 | if(!config.iceDone && !config.trickle) {
2946 | // Don't do anything until we have all candidates
2947 | Janus.log("Waiting for all candidates...");
2948 | return;
2949 | }
2950 | // If transforms are present, notify Janus that the media is end-to-end encrypted
2951 | if(config.senderTransforms || config.receiverTransforms) {
2952 | answer["e2ee"] = true;
2953 | }
2954 | callbacks.success(answer);
2955 | }, callbacks.error);
2956 | }
2957 |
2958 | function sendSDP(handleId, callbacks) {
2959 | callbacks = callbacks || {};
2960 | callbacks.success = (typeof callbacks.success == "function") ? callbacks.success : Janus.noop;
2961 | callbacks.error = (typeof callbacks.error == "function") ? callbacks.error : Janus.noop;
2962 | var pluginHandle = pluginHandles[handleId];
2963 | if(!pluginHandle || !pluginHandle.webrtcStuff) {
2964 | Janus.warn("Invalid handle, not sending anything");
2965 | return;
2966 | }
2967 | var config = pluginHandle.webrtcStuff;
2968 | Janus.log("Sending offer/answer SDP...");
2969 | if(!config.mySdp) {
2970 | Janus.warn("Local SDP instance is invalid, not sending anything...");
2971 | return;
2972 | }
2973 | config.mySdp = {
2974 | "type": config.pc.localDescription.type,
2975 | "sdp": config.pc.localDescription.sdp
2976 | };
2977 | if(config.trickle === false)
2978 | config.mySdp["trickle"] = false;
2979 | Janus.debug(callbacks);
2980 | config.sdpSent = true;
2981 | callbacks.success(config.mySdp);
2982 | }
2983 |
2984 | function getVolume(handleId, remote) {
2985 | var pluginHandle = pluginHandles[handleId];
2986 | if(!pluginHandle || !pluginHandle.webrtcStuff) {
2987 | Janus.warn("Invalid handle");
2988 | return 0;
2989 | }
2990 | var stream = remote ? "remote" : "local";
2991 | var config = pluginHandle.webrtcStuff;
2992 | if(!config.volume[stream])
2993 | config.volume[stream] = { value: 0 };
2994 | // Start getting the volume, if audioLevel in getStats is supported (apparently
2995 | // they're only available in Chrome/Safari right now: https://webrtc-stats.callstats.io/)
2996 | if(config.pc.getStats && (Janus.webRTCAdapter.browserDetails.browser === "chrome" ||
2997 | Janus.webRTCAdapter.browserDetails.browser === "safari")) {
2998 | if(remote && !config.remoteStream) {
2999 | Janus.warn("Remote stream unavailable");
3000 | return 0;
3001 | } else if(!remote && !config.myStream) {
3002 | Janus.warn("Local stream unavailable");
3003 | return 0;
3004 | }
3005 | if(!config.volume[stream].timer) {
3006 | Janus.log("Starting " + stream + " volume monitor");
3007 | config.volume[stream].timer = setInterval(function() {
3008 | config.pc.getStats()
3009 | .then(function(stats) {
3010 | stats.forEach(function (res) {
3011 | if(!res || res.kind !== "audio")
3012 | return;
3013 | if((remote && !res.remoteSource) || (!remote && res.type !== "media-source"))
3014 | return;
3015 | config.volume[stream].value = (res.audioLevel ? res.audioLevel : 0);
3016 | });
3017 | });
3018 | }, 200);
3019 | return 0; // We don't have a volume to return yet
3020 | }
3021 | return config.volume[stream].value;
3022 | } else {
3023 | // audioInputLevel and audioOutputLevel seem only available in Chrome? audioLevel
3024 | // seems to be available on Chrome and Firefox, but they don't seem to work
3025 | Janus.warn("Getting the " + stream + " volume unsupported by browser");
3026 | return 0;
3027 | }
3028 | }
3029 |
3030 | function isMuted(handleId, video) {
3031 | var pluginHandle = pluginHandles[handleId];
3032 | if(!pluginHandle || !pluginHandle.webrtcStuff) {
3033 | Janus.warn("Invalid handle");
3034 | return true;
3035 | }
3036 | var config = pluginHandle.webrtcStuff;
3037 | if(!config.pc) {
3038 | Janus.warn("Invalid PeerConnection");
3039 | return true;
3040 | }
3041 | if(!config.myStream) {
3042 | Janus.warn("Invalid local MediaStream");
3043 | return true;
3044 | }
3045 | if(video) {
3046 | // Check video track
3047 | if(!config.myStream.getVideoTracks() || config.myStream.getVideoTracks().length === 0) {
3048 | Janus.warn("No video track");
3049 | return true;
3050 | }
3051 | return !config.myStream.getVideoTracks()[0].enabled;
3052 | } else {
3053 | // Check audio track
3054 | if(!config.myStream.getAudioTracks() || config.myStream.getAudioTracks().length === 0) {
3055 | Janus.warn("No audio track");
3056 | return true;
3057 | }
3058 | return !config.myStream.getAudioTracks()[0].enabled;
3059 | }
3060 | }
3061 |
3062 | function mute(handleId, video, mute) {
3063 | var pluginHandle = pluginHandles[handleId];
3064 | if(!pluginHandle || !pluginHandle.webrtcStuff) {
3065 | Janus.warn("Invalid handle");
3066 | return false;
3067 | }
3068 | var config = pluginHandle.webrtcStuff;
3069 | if(!config.pc) {
3070 | Janus.warn("Invalid PeerConnection");
3071 | return false;
3072 | }
3073 | if(!config.myStream) {
3074 | Janus.warn("Invalid local MediaStream");
3075 | return false;
3076 | }
3077 | if(video) {
3078 | // Mute/unmute video track
3079 | if(!config.myStream.getVideoTracks() || config.myStream.getVideoTracks().length === 0) {
3080 | Janus.warn("No video track");
3081 | return false;
3082 | }
3083 | config.myStream.getVideoTracks()[0].enabled = !mute;
3084 | return true;
3085 | } else {
3086 | // Mute/unmute audio track
3087 | if(!config.myStream.getAudioTracks() || config.myStream.getAudioTracks().length === 0) {
3088 | Janus.warn("No audio track");
3089 | return false;
3090 | }
3091 | config.myStream.getAudioTracks()[0].enabled = !mute;
3092 | return true;
3093 | }
3094 | }
3095 |
3096 | function getBitrate(handleId) {
3097 | var pluginHandle = pluginHandles[handleId];
3098 | if(!pluginHandle || !pluginHandle.webrtcStuff) {
3099 | Janus.warn("Invalid handle");
3100 | return "Invalid handle";
3101 | }
3102 | var config = pluginHandle.webrtcStuff;
3103 | if(!config.pc)
3104 | return "Invalid PeerConnection";
3105 | // Start getting the bitrate, if getStats is supported
3106 | if(config.pc.getStats) {
3107 | if(!config.bitrate.timer) {
3108 | Janus.log("Starting bitrate timer (via getStats)");
3109 | config.bitrate.timer = setInterval(function() {
3110 | config.pc.getStats()
3111 | .then(function(stats) {
3112 | stats.forEach(function (res) {
3113 | if(!res)
3114 | return;
3115 | var inStats = false;
3116 | // Check if these are statistics on incoming media
3117 | if((res.mediaType === "video" || res.id.toLowerCase().indexOf("video") > -1) &&
3118 | res.type === "inbound-rtp" && res.id.indexOf("rtcp") < 0) {
3119 | // New stats
3120 | inStats = true;
3121 | } else if(res.type === "ssrc" && res.bytesReceived &&
3122 | (res.googCodecName === "VP8" || res.googCodecName === "")) {
3123 | // Older Chrome versions
3124 | inStats = true;
3125 | }
3126 | // Parse stats now
3127 | if(inStats) {
3128 | config.bitrate.bsnow = res.bytesReceived;
3129 | config.bitrate.tsnow = res.timestamp;
3130 | if(config.bitrate.bsbefore === null || config.bitrate.tsbefore === null) {
3131 | // Skip this round
3132 | config.bitrate.bsbefore = config.bitrate.bsnow;
3133 | config.bitrate.tsbefore = config.bitrate.tsnow;
3134 | } else {
3135 | // Calculate bitrate
3136 | var timePassed = config.bitrate.tsnow - config.bitrate.tsbefore;
3137 | if(Janus.webRTCAdapter.browserDetails.browser === "safari")
3138 | timePassed = timePassed/1000; // Apparently the timestamp is in microseconds, in Safari
3139 | var bitRate = Math.round((config.bitrate.bsnow - config.bitrate.bsbefore) * 8 / timePassed);
3140 | if(Janus.webRTCAdapter.browserDetails.browser === "safari")
3141 | bitRate = parseInt(bitRate/1000);
3142 | config.bitrate.value = bitRate + ' kbits/sec';
3143 | //~ Janus.log("Estimated bitrate is " + config.bitrate.value);
3144 | config.bitrate.bsbefore = config.bitrate.bsnow;
3145 | config.bitrate.tsbefore = config.bitrate.tsnow;
3146 | }
3147 | }
3148 | });
3149 | });
3150 | }, 1000);
3151 | return "0 kbits/sec"; // We don't have a bitrate value yet
3152 | }
3153 | return config.bitrate.value;
3154 | } else {
3155 | Janus.warn("Getting the video bitrate unsupported by browser");
3156 | return "Feature unsupported by browser";
3157 | }
3158 | }
3159 |
3160 | function webrtcError(error) {
3161 | Janus.error("WebRTC error:", error);
3162 | }
3163 |
3164 | function cleanupWebrtc(handleId, hangupRequest) {
3165 | Janus.log("Cleaning WebRTC stuff");
3166 | var pluginHandle = pluginHandles[handleId];
3167 | if(!pluginHandle) {
3168 | // Nothing to clean
3169 | return;
3170 | }
3171 | var config = pluginHandle.webrtcStuff;
3172 | if(config) {
3173 | if(hangupRequest === true) {
3174 | // Send a hangup request (we don't really care about the response)
3175 | var request = { "janus": "hangup", "transaction": Janus.randomString(12) };
3176 | if(pluginHandle.token)
3177 | request["token"] = pluginHandle.token;
3178 | if(apisecret)
3179 | request["apisecret"] = apisecret;
3180 | Janus.debug("Sending hangup request (handle=" + handleId + "):");
3181 | Janus.debug(request);
3182 | if(websockets) {
3183 | request["session_id"] = sessionId;
3184 | request["handle_id"] = handleId;
3185 | ws.send(JSON.stringify(request));
3186 | } else {
3187 | Janus.httpAPICall(server + "/" + sessionId + "/" + handleId, {
3188 | verb: 'POST',
3189 | withCredentials: withCredentials,
3190 | body: request
3191 | });
3192 | }
3193 | }
3194 | // Cleanup stack
3195 | config.remoteStream = null;
3196 | if(config.volume) {
3197 | if(config.volume["local"] && config.volume["local"].timer)
3198 | clearInterval(config.volume["local"].timer);
3199 | if(config.volume["remote"] && config.volume["remote"].timer)
3200 | clearInterval(config.volume["remote"].timer);
3201 | }
3202 | config.volume = {};
3203 | if(config.bitrate.timer)
3204 | clearInterval(config.bitrate.timer);
3205 | config.bitrate.timer = null;
3206 | config.bitrate.bsnow = null;
3207 | config.bitrate.bsbefore = null;
3208 | config.bitrate.tsnow = null;
3209 | config.bitrate.tsbefore = null;
3210 | config.bitrate.value = null;
3211 | if(!config.streamExternal && config.myStream) {
3212 | Janus.log("Stopping local stream tracks");
3213 | Janus.stopAllTracks(config.myStream);
3214 | }
3215 | config.streamExternal = false;
3216 | config.myStream = null;
3217 | // Close PeerConnection
3218 | try {
3219 | config.pc.close();
3220 | } catch(e) {
3221 | // Do nothing
3222 | }
3223 | config.pc = null;
3224 | config.candidates = null;
3225 | config.mySdp = null;
3226 | config.remoteSdp = null;
3227 | config.iceDone = false;
3228 | config.dataChannel = {};
3229 | config.dtmfSender = null;
3230 | config.senderTransforms = null;
3231 | config.receiverTransforms = null;
3232 | }
3233 | pluginHandle.oncleanup();
3234 | }
3235 |
3236 | // Helper method to munge an SDP to enable simulcasting (Chrome only)
3237 | function mungeSdpForSimulcasting(sdp) {
3238 | // Let's munge the SDP to add the attributes for enabling simulcasting
3239 | // (based on https://gist.github.com/ggarber/a19b4c33510028b9c657)
3240 | var lines = sdp.split("\r\n");
3241 | var video = false;
3242 | var ssrc = [ -1 ], ssrc_fid = [ -1 ];
3243 | var cname = null, msid = null, mslabel = null, label = null;
3244 | var insertAt = -1;
3245 | for(var i=0; i<lines.length; i++) {
3246 | var mline = lines[i].match(/m=(\w+) */);
3247 | if(mline) {
3248 | var medium = mline[1];
3249 | if(medium === "video") {
3250 | // New video m-line: make sure it's the first one
3251 | if(ssrc[0] < 0) {
3252 | video = true;
3253 | } else {
3254 | // We're done, let's add the new attributes here
3255 | insertAt = i;
3256 | break;
3257 | }
3258 | } else {
3259 | // New non-video m-line: do we have what we were looking for?
3260 | if(ssrc[0] > -1) {
3261 | // We're done, let's add the new attributes here
3262 | insertAt = i;
3263 | break;
3264 | }
3265 | }
3266 | continue;
3267 | }
3268 | if(!video)
3269 | continue;
3270 | var fid = lines[i].match(/a=ssrc-group:FID (\d+) (\d+)/);
3271 | if(fid) {
3272 | ssrc[0] = fid[1];
3273 | ssrc_fid[0] = fid[2];
3274 | lines.splice(i, 1); i--;
3275 | continue;
3276 | }
3277 | if(ssrc[0]) {
3278 | var match = lines[i].match('a=ssrc:' + ssrc[0] + ' cname:(.+)')
3279 | if(match) {
3280 | cname = match[1];
3281 | }
3282 | match = lines[i].match('a=ssrc:' + ssrc[0] + ' msid:(.+)')
3283 | if(match) {
3284 | msid = match[1];
3285 | }
3286 | match = lines[i].match('a=ssrc:' + ssrc[0] + ' mslabel:(.+)')
3287 | if(match) {
3288 | mslabel = match[1];
3289 | }
3290 | match = lines[i].match('a=ssrc:' + ssrc[0] + ' label:(.+)')
3291 | if(match) {
3292 | label = match[1];
3293 | }
3294 | if(lines[i].indexOf('a=ssrc:' + ssrc_fid[0]) === 0) {
3295 | lines.splice(i, 1); i--;
3296 | continue;
3297 | }
3298 | if(lines[i].indexOf('a=ssrc:' + ssrc[0]) === 0) {
3299 | lines.splice(i, 1); i--;
3300 | continue;
3301 | }
3302 | }
3303 | if(lines[i].length === 0) {
3304 | lines.splice(i, 1); i--;
3305 | continue;
3306 | }
3307 | }
3308 | if(ssrc[0] < 0) {
3309 | // Couldn't find a FID attribute, let's just take the first video SSRC we find
3310 | insertAt = -1;
3311 | video = false;
3312 | for(var i=0; i<lines.length; i++) {
3313 | var mline = lines[i].match(/m=(\w+) */);
3314 | if(mline) {
3315 | var medium = mline[1];
3316 | if(medium === "video") {
3317 | // New video m-line: make sure it's the first one
3318 | if(ssrc[0] < 0) {
3319 | video = true;
3320 | } else {
3321 | // We're done, let's add the new attributes here
3322 | insertAt = i;
3323 | break;
3324 | }
3325 | } else {
3326 | // New non-video m-line: do we have what we were looking for?
3327 | if(ssrc[0] > -1) {
3328 | // We're done, let's add the new attributes here
3329 | insertAt = i;
3330 | break;
3331 | }
3332 | }
3333 | continue;
3334 | }
3335 | if(!video)
3336 | continue;
3337 | if(ssrc[0] < 0) {
3338 | var value = lines[i].match(/a=ssrc:(\d+)/);
3339 | if(value) {
3340 | ssrc[0] = value[1];
3341 | lines.splice(i, 1); i--;
3342 | continue;
3343 | }
3344 | } else {
3345 | var match = lines[i].match('a=ssrc:' + ssrc[0] + ' cname:(.+)')
3346 | if(match) {
3347 | cname = match[1];
3348 | }
3349 | match = lines[i].match('a=ssrc:' + ssrc[0] + ' msid:(.+)')
3350 | if(match) {
3351 | msid = match[1];
3352 | }
3353 | match = lines[i].match('a=ssrc:' + ssrc[0] + ' mslabel:(.+)')
3354 | if(match) {
3355 | mslabel = match[1];
3356 | }
3357 | match = lines[i].match('a=ssrc:' + ssrc[0] + ' label:(.+)')
3358 | if(match) {
3359 | label = match[1];
3360 | }
3361 | if(lines[i].indexOf('a=ssrc:' + ssrc_fid[0]) === 0) {
3362 | lines.splice(i, 1); i--;
3363 | continue;
3364 | }
3365 | if(lines[i].indexOf('a=ssrc:' + ssrc[0]) === 0) {
3366 | lines.splice(i, 1); i--;
3367 | continue;
3368 | }
3369 | }
3370 | if(lines[i].length === 0) {
3371 | lines.splice(i, 1); i--;
3372 | continue;
3373 | }
3374 | }
3375 | }
3376 | if(ssrc[0] < 0) {
3377 | // Still nothing, let's just return the SDP we were asked to munge
3378 | Janus.warn("Couldn't find the video SSRC, simulcasting NOT enabled");
3379 | return sdp;
3380 | }
3381 | if(insertAt < 0) {
3382 | // Append at the end
3383 | insertAt = lines.length;
3384 | }
3385 | // Generate a couple of SSRCs (for retransmissions too)
3386 | // Note: should we check if there are conflicts, here?
3387 | ssrc[1] = Math.floor(Math.random()*0xFFFFFFFF);
3388 | ssrc[2] = Math.floor(Math.random()*0xFFFFFFFF);
3389 | ssrc_fid[1] = Math.floor(Math.random()*0xFFFFFFFF);
3390 | ssrc_fid[2] = Math.floor(Math.random()*0xFFFFFFFF);
3391 | // Add attributes to the SDP
3392 | for(var i=0; i<lines.length; i++) {