├── .gitignore
├── JamRTC.glade
├── LICENSE.md
├── Makefile
├── README.md
├── images
│   ├── architecture.jpg
│   ├── gui-1.png
│   ├── gui-2.png
│   ├── jack-1.png
│   ├── jack-2.png
│   ├── jack-3.png
│   └── jack-4.png
└── src
    ├── debug.h
    ├── jamrtc.c
    ├── mutex.h
    ├── refcount.h
    ├── webrtc.c
    └── webrtc.h
/.gitignore:
--------------------------------------------------------------------------------
1 | src/*.o
2 | JamRTC
3 |
--------------------------------------------------------------------------------
/JamRTC.glade:
--------------------------------------------------------------------------------
(contents not captured in this dump — a GTK Builder UI definition of roughly 307 lines)
--------------------------------------------------------------------------------
/LICENSE.md:
--------------------------------------------------------------------------------
1 | GNU General Public License
2 | ==========================
3 |
4 | _Version 3, 29 June 2007_
5 | _Copyright © 2007 Free Software Foundation, Inc. <>_
6 |
7 | Everyone is permitted to copy and distribute verbatim copies of this license
8 | document, but changing it is not allowed.
9 |
10 | ## Preamble
11 |
12 | The GNU General Public License is a free, copyleft license for software and other
13 | kinds of works.
14 |
15 | The licenses for most software and other practical works are designed to take away
16 | your freedom to share and change the works. By contrast, the GNU General Public
17 | License is intended to guarantee your freedom to share and change all versions of a
18 | program--to make sure it remains free software for all its users. We, the Free
19 | Software Foundation, use the GNU General Public License for most of our software; it
20 | applies also to any other work released this way by its authors. You can apply it to
21 | your programs, too.
22 |
23 | When we speak of free software, we are referring to freedom, not price. Our General
24 | Public Licenses are designed to make sure that you have the freedom to distribute
25 | copies of free software (and charge for them if you wish), that you receive source
26 | code or can get it if you want it, that you can change the software or use pieces of
27 | it in new free programs, and that you know you can do these things.
28 |
29 | To protect your rights, we need to prevent others from denying you these rights or
30 | asking you to surrender the rights. Therefore, you have certain responsibilities if
31 | you distribute copies of the software, or if you modify it: responsibilities to
32 | respect the freedom of others.
33 |
34 | For example, if you distribute copies of such a program, whether gratis or for a fee,
35 | you must pass on to the recipients the same freedoms that you received. You must make
36 | sure that they, too, receive or can get the source code. And you must show them these
37 | terms so they know their rights.
38 |
39 | Developers that use the GNU GPL protect your rights with two steps: **(1)** assert
40 | copyright on the software, and **(2)** offer you this License giving you legal permission
41 | to copy, distribute and/or modify it.
42 |
43 | For the developers' and authors' protection, the GPL clearly explains that there is
44 | no warranty for this free software. For both users' and authors' sake, the GPL
45 | requires that modified versions be marked as changed, so that their problems will not
46 | be attributed erroneously to authors of previous versions.
47 |
48 | Some devices are designed to deny users access to install or run modified versions of
49 | the software inside them, although the manufacturer can do so. This is fundamentally
50 | incompatible with the aim of protecting users' freedom to change the software. The
51 | systematic pattern of such abuse occurs in the area of products for individuals to
52 | use, which is precisely where it is most unacceptable. Therefore, we have designed
53 | this version of the GPL to prohibit the practice for those products. If such problems
54 | arise substantially in other domains, we stand ready to extend this provision to
55 | those domains in future versions of the GPL, as needed to protect the freedom of
56 | users.
57 |
58 | Finally, every program is threatened constantly by software patents. States should
59 | not allow patents to restrict development and use of software on general-purpose
60 | computers, but in those that do, we wish to avoid the special danger that patents
61 | applied to a free program could make it effectively proprietary. To prevent this, the
62 | GPL assures that patents cannot be used to render the program non-free.
63 |
64 | The precise terms and conditions for copying, distribution and modification follow.
65 |
66 | ## TERMS AND CONDITIONS
67 |
68 | ### 0. Definitions
69 |
70 | “This License” refers to version 3 of the GNU General Public License.
71 |
72 | “Copyright” also means copyright-like laws that apply to other kinds of
73 | works, such as semiconductor masks.
74 |
75 | “The Program” refers to any copyrightable work licensed under this
76 | License. Each licensee is addressed as “you”. “Licensees” and
77 | “recipients” may be individuals or organizations.
78 |
79 | To “modify” a work means to copy from or adapt all or part of the work in
80 | a fashion requiring copyright permission, other than the making of an exact copy. The
81 | resulting work is called a “modified version” of the earlier work or a
82 | work “based on” the earlier work.
83 |
84 | A “covered work” means either the unmodified Program or a work based on
85 | the Program.
86 |
87 | To “propagate” a work means to do anything with it that, without
88 | permission, would make you directly or secondarily liable for infringement under
89 | applicable copyright law, except executing it on a computer or modifying a private
90 | copy. Propagation includes copying, distribution (with or without modification),
91 | making available to the public, and in some countries other activities as well.
92 |
93 | To “convey” a work means any kind of propagation that enables other
94 | parties to make or receive copies. Mere interaction with a user through a computer
95 | network, with no transfer of a copy, is not conveying.
96 |
97 | An interactive user interface displays “Appropriate Legal Notices” to the
98 | extent that it includes a convenient and prominently visible feature that **(1)**
99 | displays an appropriate copyright notice, and **(2)** tells the user that there is no
100 | warranty for the work (except to the extent that warranties are provided), that
101 | licensees may convey the work under this License, and how to view a copy of this
102 | License. If the interface presents a list of user commands or options, such as a
103 | menu, a prominent item in the list meets this criterion.
104 |
105 | ### 1. Source Code
106 |
107 | The “source code” for a work means the preferred form of the work for
108 | making modifications to it. “Object code” means any non-source form of a
109 | work.
110 |
111 | A “Standard Interface” means an interface that either is an official
112 | standard defined by a recognized standards body, or, in the case of interfaces
113 | specified for a particular programming language, one that is widely used among
114 | developers working in that language.
115 |
116 | The “System Libraries” of an executable work include anything, other than
117 | the work as a whole, that **(a)** is included in the normal form of packaging a Major
118 | Component, but which is not part of that Major Component, and **(b)** serves only to
119 | enable use of the work with that Major Component, or to implement a Standard
120 | Interface for which an implementation is available to the public in source code form.
121 | A “Major Component”, in this context, means a major essential component
122 | (kernel, window system, and so on) of the specific operating system (if any) on which
123 | the executable work runs, or a compiler used to produce the work, or an object code
124 | interpreter used to run it.
125 |
126 | The “Corresponding Source” for a work in object code form means all the
127 | source code needed to generate, install, and (for an executable work) run the object
128 | code and to modify the work, including scripts to control those activities. However,
129 | it does not include the work's System Libraries, or general-purpose tools or
130 | generally available free programs which are used unmodified in performing those
131 | activities but which are not part of the work. For example, Corresponding Source
132 | includes interface definition files associated with source files for the work, and
133 | the source code for shared libraries and dynamically linked subprograms that the work
134 | is specifically designed to require, such as by intimate data communication or
135 | control flow between those subprograms and other parts of the work.
136 |
137 | The Corresponding Source need not include anything that users can regenerate
138 | automatically from other parts of the Corresponding Source.
139 |
140 | The Corresponding Source for a work in source code form is that same work.
141 |
142 | ### 2. Basic Permissions
143 |
144 | All rights granted under this License are granted for the term of copyright on the
145 | Program, and are irrevocable provided the stated conditions are met. This License
146 | explicitly affirms your unlimited permission to run the unmodified Program. The
147 | output from running a covered work is covered by this License only if the output,
148 | given its content, constitutes a covered work. This License acknowledges your rights
149 | of fair use or other equivalent, as provided by copyright law.
150 |
151 | You may make, run and propagate covered works that you do not convey, without
152 | conditions so long as your license otherwise remains in force. You may convey covered
153 | works to others for the sole purpose of having them make modifications exclusively
154 | for you, or provide you with facilities for running those works, provided that you
155 | comply with the terms of this License in conveying all material for which you do not
156 | control copyright. Those thus making or running the covered works for you must do so
157 | exclusively on your behalf, under your direction and control, on terms that prohibit
158 | them from making any copies of your copyrighted material outside their relationship
159 | with you.
160 |
161 | Conveying under any other circumstances is permitted solely under the conditions
162 | stated below. Sublicensing is not allowed; section 10 makes it unnecessary.
163 |
164 | ### 3. Protecting Users' Legal Rights From Anti-Circumvention Law
165 |
166 | No covered work shall be deemed part of an effective technological measure under any
167 | applicable law fulfilling obligations under article 11 of the WIPO copyright treaty
168 | adopted on 20 December 1996, or similar laws prohibiting or restricting circumvention
169 | of such measures.
170 |
171 | When you convey a covered work, you waive any legal power to forbid circumvention of
172 | technological measures to the extent such circumvention is effected by exercising
173 | rights under this License with respect to the covered work, and you disclaim any
174 | intention to limit operation or modification of the work as a means of enforcing,
175 | against the work's users, your or third parties' legal rights to forbid circumvention
176 | of technological measures.
177 |
178 | ### 4. Conveying Verbatim Copies
179 |
180 | You may convey verbatim copies of the Program's source code as you receive it, in any
181 | medium, provided that you conspicuously and appropriately publish on each copy an
182 | appropriate copyright notice; keep intact all notices stating that this License and
183 | any non-permissive terms added in accord with section 7 apply to the code; keep
184 | intact all notices of the absence of any warranty; and give all recipients a copy of
185 | this License along with the Program.
186 |
187 | You may charge any price or no price for each copy that you convey, and you may offer
188 | support or warranty protection for a fee.
189 |
190 | ### 5. Conveying Modified Source Versions
191 |
192 | You may convey a work based on the Program, or the modifications to produce it from
193 | the Program, in the form of source code under the terms of section 4, provided that
194 | you also meet all of these conditions:
195 |
196 | * **a)** The work must carry prominent notices stating that you modified it, and giving a
197 | relevant date.
198 | * **b)** The work must carry prominent notices stating that it is released under this
199 | License and any conditions added under section 7. This requirement modifies the
200 | requirement in section 4 to “keep intact all notices”.
201 | * **c)** You must license the entire work, as a whole, under this License to anyone who
202 | comes into possession of a copy. This License will therefore apply, along with any
203 | applicable section 7 additional terms, to the whole of the work, and all its parts,
204 | regardless of how they are packaged. This License gives no permission to license the
205 | work in any other way, but it does not invalidate such permission if you have
206 | separately received it.
207 | * **d)** If the work has interactive user interfaces, each must display Appropriate Legal
208 | Notices; however, if the Program has interactive interfaces that do not display
209 | Appropriate Legal Notices, your work need not make them do so.
210 |
211 | A compilation of a covered work with other separate and independent works, which are
212 | not by their nature extensions of the covered work, and which are not combined with
213 | it such as to form a larger program, in or on a volume of a storage or distribution
214 | medium, is called an “aggregate” if the compilation and its resulting
215 | copyright are not used to limit the access or legal rights of the compilation's users
216 | beyond what the individual works permit. Inclusion of a covered work in an aggregate
217 | does not cause this License to apply to the other parts of the aggregate.
218 |
219 | ### 6. Conveying Non-Source Forms
220 |
221 | You may convey a covered work in object code form under the terms of sections 4 and
222 | 5, provided that you also convey the machine-readable Corresponding Source under the
223 | terms of this License, in one of these ways:
224 |
225 | * **a)** Convey the object code in, or embodied in, a physical product (including a
226 | physical distribution medium), accompanied by the Corresponding Source fixed on a
227 | durable physical medium customarily used for software interchange.
228 | * **b)** Convey the object code in, or embodied in, a physical product (including a
229 | physical distribution medium), accompanied by a written offer, valid for at least
230 | three years and valid for as long as you offer spare parts or customer support for
231 | that product model, to give anyone who possesses the object code either **(1)** a copy of
232 | the Corresponding Source for all the software in the product that is covered by this
233 | License, on a durable physical medium customarily used for software interchange, for
234 | a price no more than your reasonable cost of physically performing this conveying of
235 | source, or **(2)** access to copy the Corresponding Source from a network server at no
236 | charge.
237 | * **c)** Convey individual copies of the object code with a copy of the written offer to
238 | provide the Corresponding Source. This alternative is allowed only occasionally and
239 | noncommercially, and only if you received the object code with such an offer, in
240 | accord with subsection 6b.
241 | * **d)** Convey the object code by offering access from a designated place (gratis or for
242 | a charge), and offer equivalent access to the Corresponding Source in the same way
243 | through the same place at no further charge. You need not require recipients to copy
244 | the Corresponding Source along with the object code. If the place to copy the object
245 | code is a network server, the Corresponding Source may be on a different server
246 | (operated by you or a third party) that supports equivalent copying facilities,
247 | provided you maintain clear directions next to the object code saying where to find
248 | the Corresponding Source. Regardless of what server hosts the Corresponding Source,
249 | you remain obligated to ensure that it is available for as long as needed to satisfy
250 | these requirements.
251 | * **e)** Convey the object code using peer-to-peer transmission, provided you inform
252 | other peers where the object code and Corresponding Source of the work are being
253 | offered to the general public at no charge under subsection 6d.
254 |
255 | A separable portion of the object code, whose source code is excluded from the
256 | Corresponding Source as a System Library, need not be included in conveying the
257 | object code work.
258 |
259 | A “User Product” is either **(1)** a “consumer product”, which
260 | means any tangible personal property which is normally used for personal, family, or
261 | household purposes, or **(2)** anything designed or sold for incorporation into a
262 | dwelling. In determining whether a product is a consumer product, doubtful cases
263 | shall be resolved in favor of coverage. For a particular product received by a
264 | particular user, “normally used” refers to a typical or common use of
265 | that class of product, regardless of the status of the particular user or of the way
266 | in which the particular user actually uses, or expects or is expected to use, the
267 | product. A product is a consumer product regardless of whether the product has
268 | substantial commercial, industrial or non-consumer uses, unless such uses represent
269 | the only significant mode of use of the product.
270 |
271 | “Installation Information” for a User Product means any methods,
272 | procedures, authorization keys, or other information required to install and execute
273 | modified versions of a covered work in that User Product from a modified version of
274 | its Corresponding Source. The information must suffice to ensure that the continued
275 | functioning of the modified object code is in no case prevented or interfered with
276 | solely because modification has been made.
277 |
278 | If you convey an object code work under this section in, or with, or specifically for
279 | use in, a User Product, and the conveying occurs as part of a transaction in which
280 | the right of possession and use of the User Product is transferred to the recipient
281 | in perpetuity or for a fixed term (regardless of how the transaction is
282 | characterized), the Corresponding Source conveyed under this section must be
283 | accompanied by the Installation Information. But this requirement does not apply if
284 | neither you nor any third party retains the ability to install modified object code
285 | on the User Product (for example, the work has been installed in ROM).
286 |
287 | The requirement to provide Installation Information does not include a requirement to
288 | continue to provide support service, warranty, or updates for a work that has been
289 | modified or installed by the recipient, or for the User Product in which it has been
290 | modified or installed. Access to a network may be denied when the modification itself
291 | materially and adversely affects the operation of the network or violates the rules
292 | and protocols for communication across the network.
293 |
294 | Corresponding Source conveyed, and Installation Information provided, in accord with
295 | this section must be in a format that is publicly documented (and with an
296 | implementation available to the public in source code form), and must require no
297 | special password or key for unpacking, reading or copying.
298 |
299 | ### 7. Additional Terms
300 |
301 | “Additional permissions” are terms that supplement the terms of this
302 | License by making exceptions from one or more of its conditions. Additional
303 | permissions that are applicable to the entire Program shall be treated as though they
304 | were included in this License, to the extent that they are valid under applicable
305 | law. If additional permissions apply only to part of the Program, that part may be
306 | used separately under those permissions, but the entire Program remains governed by
307 | this License without regard to the additional permissions.
308 |
309 | When you convey a copy of a covered work, you may at your option remove any
310 | additional permissions from that copy, or from any part of it. (Additional
311 | permissions may be written to require their own removal in certain cases when you
312 | modify the work.) You may place additional permissions on material, added by you to a
313 | covered work, for which you have or can give appropriate copyright permission.
314 |
315 | Notwithstanding any other provision of this License, for material you add to a
316 | covered work, you may (if authorized by the copyright holders of that material)
317 | supplement the terms of this License with terms:
318 |
319 | * **a)** Disclaiming warranty or limiting liability differently from the terms of
320 | sections 15 and 16 of this License; or
321 | * **b)** Requiring preservation of specified reasonable legal notices or author
322 | attributions in that material or in the Appropriate Legal Notices displayed by works
323 | containing it; or
324 | * **c)** Prohibiting misrepresentation of the origin of that material, or requiring that
325 | modified versions of such material be marked in reasonable ways as different from the
326 | original version; or
327 | * **d)** Limiting the use for publicity purposes of names of licensors or authors of the
328 | material; or
329 | * **e)** Declining to grant rights under trademark law for use of some trade names,
330 | trademarks, or service marks; or
331 | * **f)** Requiring indemnification of licensors and authors of that material by anyone
332 | who conveys the material (or modified versions of it) with contractual assumptions of
333 | liability to the recipient, for any liability that these contractual assumptions
334 | directly impose on those licensors and authors.
335 |
336 | All other non-permissive additional terms are considered “further
337 | restrictions” within the meaning of section 10. If the Program as you received
338 | it, or any part of it, contains a notice stating that it is governed by this License
339 | along with a term that is a further restriction, you may remove that term. If a
340 | license document contains a further restriction but permits relicensing or conveying
341 | under this License, you may add to a covered work material governed by the terms of
342 | that license document, provided that the further restriction does not survive such
343 | relicensing or conveying.
344 |
345 | If you add terms to a covered work in accord with this section, you must place, in
346 | the relevant source files, a statement of the additional terms that apply to those
347 | files, or a notice indicating where to find the applicable terms.
348 |
349 | Additional terms, permissive or non-permissive, may be stated in the form of a
350 | separately written license, or stated as exceptions; the above requirements apply
351 | either way.
352 |
353 | ### 8. Termination
354 |
355 | You may not propagate or modify a covered work except as expressly provided under
356 | this License. Any attempt otherwise to propagate or modify it is void, and will
357 | automatically terminate your rights under this License (including any patent licenses
358 | granted under the third paragraph of section 11).
359 |
360 | However, if you cease all violation of this License, then your license from a
361 | particular copyright holder is reinstated **(a)** provisionally, unless and until the
362 | copyright holder explicitly and finally terminates your license, and **(b)** permanently,
363 | if the copyright holder fails to notify you of the violation by some reasonable means
364 | prior to 60 days after the cessation.
365 |
366 | Moreover, your license from a particular copyright holder is reinstated permanently
367 | if the copyright holder notifies you of the violation by some reasonable means, this
368 | is the first time you have received notice of violation of this License (for any
369 | work) from that copyright holder, and you cure the violation prior to 30 days after
370 | your receipt of the notice.
371 |
372 | Termination of your rights under this section does not terminate the licenses of
373 | parties who have received copies or rights from you under this License. If your
374 | rights have been terminated and not permanently reinstated, you do not qualify to
375 | receive new licenses for the same material under section 10.
376 |
377 | ### 9. Acceptance Not Required for Having Copies
378 |
379 | You are not required to accept this License in order to receive or run a copy of the
380 | Program. Ancillary propagation of a covered work occurring solely as a consequence of
381 | using peer-to-peer transmission to receive a copy likewise does not require
382 | acceptance. However, nothing other than this License grants you permission to
383 | propagate or modify any covered work. These actions infringe copyright if you do not
384 | accept this License. Therefore, by modifying or propagating a covered work, you
385 | indicate your acceptance of this License to do so.
386 |
387 | ### 10. Automatic Licensing of Downstream Recipients
388 |
389 | Each time you convey a covered work, the recipient automatically receives a license
390 | from the original licensors, to run, modify and propagate that work, subject to this
391 | License. You are not responsible for enforcing compliance by third parties with this
392 | License.
393 |
394 | An “entity transaction” is a transaction transferring control of an
395 | organization, or substantially all assets of one, or subdividing an organization, or
396 | merging organizations. If propagation of a covered work results from an entity
397 | transaction, each party to that transaction who receives a copy of the work also
398 | receives whatever licenses to the work the party's predecessor in interest had or
399 | could give under the previous paragraph, plus a right to possession of the
400 | Corresponding Source of the work from the predecessor in interest, if the predecessor
401 | has it or can get it with reasonable efforts.
402 |
403 | You may not impose any further restrictions on the exercise of the rights granted or
404 | affirmed under this License. For example, you may not impose a license fee, royalty,
405 | or other charge for exercise of rights granted under this License, and you may not
406 | initiate litigation (including a cross-claim or counterclaim in a lawsuit) alleging
407 | that any patent claim is infringed by making, using, selling, offering for sale, or
408 | importing the Program or any portion of it.
409 |
410 | ### 11. Patents
411 |
412 | A “contributor” is a copyright holder who authorizes use under this
413 | License of the Program or a work on which the Program is based. The work thus
414 | licensed is called the contributor's “contributor version”.
415 |
416 | A contributor's “essential patent claims” are all patent claims owned or
417 | controlled by the contributor, whether already acquired or hereafter acquired, that
418 | would be infringed by some manner, permitted by this License, of making, using, or
419 | selling its contributor version, but do not include claims that would be infringed
420 | only as a consequence of further modification of the contributor version. For
421 | purposes of this definition, “control” includes the right to grant patent
422 | sublicenses in a manner consistent with the requirements of this License.
423 |
424 | Each contributor grants you a non-exclusive, worldwide, royalty-free patent license
425 | under the contributor's essential patent claims, to make, use, sell, offer for sale,
426 | import and otherwise run, modify and propagate the contents of its contributor
427 | version.
428 |
429 | In the following three paragraphs, a “patent license” is any express
430 | agreement or commitment, however denominated, not to enforce a patent (such as an
431 | express permission to practice a patent or covenant not to sue for patent
432 | infringement). To “grant” such a patent license to a party means to make
433 | such an agreement or commitment not to enforce a patent against the party.
434 |
435 | If you convey a covered work, knowingly relying on a patent license, and the
436 | Corresponding Source of the work is not available for anyone to copy, free of charge
437 | and under the terms of this License, through a publicly available network server or
438 | other readily accessible means, then you must either **(1)** cause the Corresponding
439 | Source to be so available, or **(2)** arrange to deprive yourself of the benefit of the
440 | patent license for this particular work, or **(3)** arrange, in a manner consistent with
441 | the requirements of this License, to extend the patent license to downstream
442 | recipients. “Knowingly relying” means you have actual knowledge that, but
443 | for the patent license, your conveying the covered work in a country, or your
444 | recipient's use of the covered work in a country, would infringe one or more
445 | identifiable patents in that country that you have reason to believe are valid.
446 |
447 | If, pursuant to or in connection with a single transaction or arrangement, you
448 | convey, or propagate by procuring conveyance of, a covered work, and grant a patent
449 | license to some of the parties receiving the covered work authorizing them to use,
450 | propagate, modify or convey a specific copy of the covered work, then the patent
451 | license you grant is automatically extended to all recipients of the covered work and
452 | works based on it.
453 |
454 | A patent license is “discriminatory” if it does not include within the
455 | scope of its coverage, prohibits the exercise of, or is conditioned on the
456 | non-exercise of one or more of the rights that are specifically granted under this
457 | License. You may not convey a covered work if you are a party to an arrangement with
458 | a third party that is in the business of distributing software, under which you make
459 | payment to the third party based on the extent of your activity of conveying the
460 | work, and under which the third party grants, to any of the parties who would receive
461 | the covered work from you, a discriminatory patent license **(a)** in connection with
462 | copies of the covered work conveyed by you (or copies made from those copies), or **(b)**
463 | primarily for and in connection with specific products or compilations that contain
464 | the covered work, unless you entered into that arrangement, or that patent license
465 | was granted, prior to 28 March 2007.
466 |
467 | Nothing in this License shall be construed as excluding or limiting any implied
468 | license or other defenses to infringement that may otherwise be available to you
469 | under applicable patent law.
470 |
471 | ### 12. No Surrender of Others' Freedom
472 |
473 | If conditions are imposed on you (whether by court order, agreement or otherwise)
474 | that contradict the conditions of this License, they do not excuse you from the
475 | conditions of this License. If you cannot convey a covered work so as to satisfy
476 | simultaneously your obligations under this License and any other pertinent
477 | obligations, then as a consequence you may not convey it at all. For example, if you
478 | agree to terms that obligate you to collect a royalty for further conveying from
479 | those to whom you convey the Program, the only way you could satisfy both those terms
480 | and this License would be to refrain entirely from conveying the Program.
481 |
482 | ### 13. Use with the GNU Affero General Public License
483 |
484 | Notwithstanding any other provision of this License, you have permission to link or
485 | combine any covered work with a work licensed under version 3 of the GNU Affero
486 | General Public License into a single combined work, and to convey the resulting work.
487 | The terms of this License will continue to apply to the part which is the covered
488 | work, but the special requirements of the GNU Affero General Public License, section
489 | 13, concerning interaction through a network will apply to the combination as such.
490 |
491 | ### 14. Revised Versions of this License
492 |
493 | The Free Software Foundation may publish revised and/or new versions of the GNU
494 | General Public License from time to time. Such new versions will be similar in spirit
495 | to the present version, but may differ in detail to address new problems or concerns.
496 |
497 | Each version is given a distinguishing version number. If the Program specifies that
498 | a certain numbered version of the GNU General Public License “or any later
499 | version” applies to it, you have the option of following the terms and
500 | conditions either of that numbered version or of any later version published by the
501 | Free Software Foundation. If the Program does not specify a version number of the GNU
502 | General Public License, you may choose any version ever published by the Free
503 | Software Foundation.
504 |
505 | If the Program specifies that a proxy can decide which future versions of the GNU
506 | General Public License can be used, that proxy's public statement of acceptance of a
507 | version permanently authorizes you to choose that version for the Program.
508 |
509 | Later license versions may give you additional or different permissions. However, no
510 | additional obligations are imposed on any author or copyright holder as a result of
511 | your choosing to follow a later version.
512 |
513 | ### 15. Disclaimer of Warranty
514 |
515 | THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW.
516 | EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES
517 | PROVIDE THE PROGRAM “AS IS” WITHOUT WARRANTY OF ANY KIND, EITHER
518 | EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
519 | MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS TO THE
520 | QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE PROGRAM PROVE
521 | DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
522 |
523 | ### 16. Limitation of Liability
524 |
525 | IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING WILL ANY
526 | COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS THE PROGRAM AS
527 | PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL,
528 | INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE
529 | PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE
530 | OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE
531 | WITH ANY OTHER PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE
532 | POSSIBILITY OF SUCH DAMAGES.
533 |
534 | ### 17. Interpretation of Sections 15 and 16
535 |
536 | If the disclaimer of warranty and limitation of liability provided above cannot be
537 | given local legal effect according to their terms, reviewing courts shall apply local
538 | law that most closely approximates an absolute waiver of all civil liability in
539 | connection with the Program, unless a warranty or assumption of liability accompanies
540 | a copy of the Program in return for a fee.
541 |
542 | _END OF TERMS AND CONDITIONS_
543 |
544 | ## How to Apply These Terms to Your New Programs
545 |
546 | If you develop a new program, and you want it to be of the greatest possible use to
547 | the public, the best way to achieve this is to make it free software which everyone
548 | can redistribute and change under these terms.
549 |
550 | To do so, attach the following notices to the program. It is safest to attach them
551 | to the start of each source file to most effectively state the exclusion of warranty;
552 | and each file should have at least the “copyright” line and a pointer to
553 | where the full notice is found.
554 |
555 |
556 | Copyright (C)
557 |
558 | This program is free software: you can redistribute it and/or modify
559 | it under the terms of the GNU General Public License as published by
560 | the Free Software Foundation, either version 3 of the License, or
561 | (at your option) any later version.
562 |
563 | This program is distributed in the hope that it will be useful,
564 | but WITHOUT ANY WARRANTY; without even the implied warranty of
565 | MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
566 | GNU General Public License for more details.
567 |
568 | You should have received a copy of the GNU General Public License
569 | along with this program. If not, see .
570 |
571 | Also add information on how to contact you by electronic and paper mail.
572 |
573 | If the program does terminal interaction, make it output a short notice like this
574 | when it starts in an interactive mode:
575 |
576 | Copyright (C)
577 | This program comes with ABSOLUTELY NO WARRANTY; for details type 'show w'.
578 | This is free software, and you are welcome to redistribute it
579 | under certain conditions; type 'show c' for details.
580 |
581 | The hypothetical commands `show w` and `show c` should show the appropriate parts of
582 | the General Public License. Of course, your program's commands might be different;
583 | for a GUI interface, you would use an “about box”.
584 |
585 | You should also get your employer (if you work as a programmer) or school, if any, to
586 | sign a “copyright disclaimer” for the program, if necessary. For more
587 | information on this, and how to apply and follow the GNU GPL, see
588 | <>.
589 |
590 | The GNU General Public License does not permit incorporating your program into
591 | proprietary programs. If your program is a subroutine library, you may consider it
592 | more useful to permit linking proprietary applications with the library. If this is
593 | what you want to do, use the GNU Lesser General Public License instead of this
594 | License. But first, please read
595 | <>.
596 |
--------------------------------------------------------------------------------
/Makefile:
--------------------------------------------------------------------------------
1 | CC = gcc
2 | STUFF = $(shell pkg-config --cflags gdk-3.0 gtk+-3.0 "gstreamer-webrtc-1.0 >= 1.16" "gstreamer-sdp-1.0 >= 1.16" gstreamer-video-1.0 libwebsockets json-glib-1.0) -D_GNU_SOURCE
3 | STUFF_LIBS = $(shell pkg-config --libs gdk-3.0 gtk+-3.0 "gstreamer-webrtc-1.0 >= 1.16" "gstreamer-sdp-1.0 >= 1.16" gstreamer-video-1.0 libwebsockets json-glib-1.0)
4 | OPTS = -Wall -Wstrict-prototypes -Wmissing-prototypes -Wmissing-declarations -Wunused #-Werror #-O2
5 | GDB = -g -ggdb
6 | OBJS = src/jamrtc.o src/webrtc.o
7 |
8 | all: jamrtc
9 |
10 | %.o: %.c
11 | $(CC) $(ASAN) $(STUFF) -fPIC $(GDB) -c $< -o $@ $(OPTS)
12 |
13 | jamrtc: $(OBJS)
14 | $(CC) $(GDB) -o JamRTC $(OBJS) $(ASAN_LIBS) $(STUFF_LIBS)
15 |
16 | clean:
17 | rm -f JamRTC src/*.o
18 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | (pre-alpha) JamRTC -- Jam sessions with Janus!
2 | ==============================================
3 |
4 | This is an attempt to create a simple prototype for doing jam sessions using WebRTC as a technology, and in particular taking advantage of [Janus](https://github.com/meetecho/janus-gateway/) as an open source WebRTC server for the purpose. The main reason why this effort uses Janus and not P2P or another server is simply that I'm the main author of Janus itself, and so that's what I'm most familiar with: you're of course free to try and come up with different attempts that use different backends, or no backend at all (the more we use WebRTC to help music, the better!).
5 |
6 | It's at a very early stage, and mostly meant as a playground for interested developers, musicians and, more in general, tinkerers, to experiment with whether this is a viable road or not.
7 |
8 | # Motivation
9 |
10 | In 2021, I made a presentation at [FOSDEM](https://fosdem.org/2021/) called [Can WebRTC help musicians?](https://fosdem.org/2021/schedule/event/webrtc_musicians/), which tried to go through some common music-based scenarios, and how WebRTC could possibly help implement them on the Internet when real-time communication is required, especially in light of the impact the COVID pandemic had on the music world. The presentation was mostly an overview, and probably asked more questions than it answered, but it did contain a section on jam sessions as a key scenario WebRTC might help with, considering the real-time nature of the technology.
11 |
12 | It started from a few assumptions, the most important one being that browsers (usually the main target for WebRTC applications) are not a good fit for participating in a jam session: in fact, the audio pipeline they implement presents some obstacles, like too much tinkering with the audio they capture (AGC, AEC), too much latency (integrated buffers and synchronization mechanisms), no way to easily capture generic devices (they're made for generic microphones, mostly), and no way to interact with lower-latency audio systems on the operating system (e.g., the [JACK Audio Connection Kit](https://github.com/jackaudio)). As such, the idea was to try and come up with a native application instead, in order to attempt to overcome the obstacles above. This is what this repo is trying to provide a foundation for.
13 |
14 | ### Why not Jamulus, Jacktrip, ``?
15 |
16 | This effort is not trying to replace anything, let alone well known applications: for instance, while I never used it, I'm well aware that Jamulus is very widespread among musicians, who use it successfully every day. Heck, I even thought of calling this project "JAMus" (as in Jam over Janus) and then changed my mind because it sounded too much like Jamulus itself! :grin: The only purpose of this project is to experiment with whether or not WebRTC (which the other tools are not built upon) can also help build something comparable in terms of requirements, especially considering there are advantages in having WebRTC as part of the stack in a scenario like this: if it grows into something more than that, then great, but that's not the point.
17 |
18 | If you're looking for something you can use RIGHT AWAY and is known to work great, then Jamulus or a similar application is indeed what you should use! If you're curious about WebRTC and music, then this is a good place to experiment with that, knowing it's not much more than a toy right now.
19 |
20 | ### Why WebRTC?
21 |
22 | Why not? :grin: Jokes aside, the main reason is that, as a technology, WebRTC should be suited for the job, since its main purpose is exchanging live audio/video/data streams in real-time. While it's mostly used for traditional use cases like conferencing, streaming, and the like, the tools are there to do something in the "creative" department as well.
23 |
24 | Apart from that, there are some other advantages in using WebRTC for the job. If the music streams you're exchanging are WebRTC-based, e.g., via an SFU like the Janus VideoRoom or others, then it's very easy to distribute the same streams to "traditional" endpoints as well, like browsers, mobile endpoints, or others. This means you could build a simple web app to serve a small or larger audience, for instance, whose requirements for super-low latency would not be the same as those of the players involved. At the same time, WebRTC streams are easy to integrate in other frameworks for further processing, e.g., for mixing, processing, transcoding, or broadcasting purposes, which makes them a useful tool for additional requirements and use cases: you can find some examples in a [Janus Workshop](https://www.youtube.com/watch?v=uW8ztFCkxVk&pp=qAMBugMGCgJpdBAB) I did at ClueCon TGI2021 ([slides](https://www2.slideshare.net/LorenzoMiniero/janus-workshop-pt2-cluecon-2021)).
25 |
26 | # Architecture
27 |
28 | As anticipated, the main idea behind this effort was to try and write a native WebRTC application to optimize the process as much as possible, especially where latency is involved. I decided to base this on GStreamer's [webrtcbin](https://gstreamer.freedesktop.org/documentation/webrtc/?gi-language=c), for a few different reasons:
29 |
30 | * I already had a bit of experience with it, having used it briefly in a couple of other projects;
31 | * GStreamer comes with a great set of plugins for decoding, encoding and rendering media;
32 | * GStreamer has dedicated plugins to interact with JACK as well;
33 | * it's relatively easy to build a GUI on top of GStreamer's building blocks (I used GTK+ 3.0, here).
34 |
35 | At the same time, I wanted to use Janus as a reference server for all endpoints engaged in a session. This led to the diagram you can see below:
36 |
37 | ![JamRTC architecture](images/architecture.jpg)
38 |
39 | In a nutshell, the idea was to have this GStreamer-based native application join a Janus [VideoRoom](https://janus.conf.meetecho.com/docs/videoroom), and then:
40 |
41 | 1. publish media to the server (e.g., your guitar and your webcam/mic);
42 | 2. subscribe to media from other participants (e.g., drum and bass player) via the same server.
43 |
44 | I conceived the application to not only allow you to share musical instruments, but also to have a parallel channel to interact with the other participants. As such, by default when launching JamRTC it will create two WebRTC PeerConnections: one to send your webcam + microphone, and another to send your instrument as an audio-only stream; of course you can choose whether you want to publish everything, just something, or nothing at all. When other participants join the session, the tool is conceived to automatically subscribe to whatever they're sharing: all the feeds are then rendered in a very simple (and super-ugly) UI, where audio streams are visualized via a wavescope.
45 |
46 | It's important to point out that, when using JACK as the audio backend, JamRTC is conceived not to automatically pick the devices to capture: this is up to you to do, by creating the connections you want manually. I'll get back to this later on, when explaining how to test what's currently there.
47 |
48 | # Building JamRTC
49 |
50 | The main dependencies of JamRTC are:
51 |
52 | * [GLib](http://library.gnome.org/devel/glib/)
53 | * [pkg-config](http://www.freedesktop.org/wiki/Software/pkg-config/)
54 | * [GStreamer](https://gstreamer.freedesktop.org/) (>= 1.16)
55 | * [GTK+ 3](https://www.gtk.org/)
56 | * [Glade](https://glade.gnome.org/)
57 | * [libwebsockets](https://libwebsockets.org/)
58 |
59 | Make sure the development versions of these libraries are installed before attempting to build JamRTC: to keep things simple, the `Makefile` is actually very raw and naive. It makes use of `pkg-config` to detect where the libraries are installed, but if some are not available it will still try to proceed (and will fail with possibly misleading error messages). All of the libraries should be available in most repos (they definitely are on Fedora, which is what I use every day, and to my knowledge on Ubuntu as well).
60 |
61 | Once the dependencies are installed, all you need to do to build JamRTC is to type:
62 |
63 | make
64 |
65 | This will create a `JamRTC` executable. Trying to launch that without arguments should display a help section:
66 |
67 | ```
68 | [lminiero@lminiero JamRTC]$ ./JamRTC
69 | Usage:
70 | JamRTC [OPTION?] -- Jam sessions with Janus!
71 |
72 | Help Options:
73 | -h, --help Show help options
74 |
75 | Application Options:
76 | -w, --ws Address of the Janus WebSockets backend (e.g., ws://localhost:8188; required)
77 | -r, --room Room to join (e.g., 1234; required)
78 | -d, --display Display name to use in the room (e.g., Lorenzo; required)
79 | -M, --no-mic Don't add an audio source for the local microphone (default: enable audio chat)
80 | -W, --no-webcam Don't add a video source for the local webcam (default: enable video chat)
81 | -v, --video-device Video device to use for the video chat (default: /dev/video0)
82 | -i, --instrument Description of the instrument (e.g., Guitar; default: unknown)
83 | -s, --stereo Whether the instrument will be stereo or mono (default: mono)
84 | -I, --no-instrument Don't add a source for the local instrument (default: enable instrument)
85 | -b, --jitter-buffer Jitter buffer to use in RTP, in milliseconds (default: 0, no buffering)
86 | -c, --src-opts Custom properties to add to jackaudiosrc (local instrument only)
87 | -S, --stun-server STUN server to use, if any (hostname:port)
88 | -T, --turn-server TURN server to use, if any (username:password@host:port)
89 | -l, --log-level Logging level (0=disable logging, 7=maximum log level; default: 4)
90 | -J, --no-jack For testing purposes, use autoaudiosrc/autoaudiosink instead (default: use JACK)
91 | ```
92 |
93 | # Running JamRTC
94 |
95 | At the moment, you can only customize the behaviour of JamRTC at startup, via command-line arguments, as there are no visual settings in the GUI or parameters you can tweak dynamically. This means that there are some required arguments to start the tool, namely:
96 |
97 | * The address of the WebSocket backend for the Janus instance to use;
98 | * The unique ID of the VideoRoom to use for the session;
99 | * The display name to use in the session.
100 |
101 | All the other arguments are optional, and allow you to customize how you want the tool to behave. As anticipated, for instance, by default the tool will try to capture webcam and microphone (for a visual interaction with the other participants) and a local instrument: you can use the available arguments to disable any of those, and only publish what you want to send. The other arguments should be pretty self-explanatory as well.
102 |
103 | > Note well! One important option is `-J` (or `--no-jack`). By default, JamRTC will assume JACK is running, and so will attempt to use it to both capture and render the audio streams: if you just want to test this without JACK, passing this flag will make JamRTC use `autoaudiosrc` and `autoaudiosink` instead where audio is involved, thus in theory choosing whatever you're using in your system. Take into account that this will limit the control you have over how devices are captured (`autoaudiosrc` will pick what it wants and likely not your guitar :grin: ), and that latency will probably be much higher as well.
104 |
105 | Here are a few examples of how you can launch it.
106 |
107 | ### Connect to a local Janus instance in room 1234 as "Lorenzo" and publish everything via JACK, calling the instrument "Guitar"
108 |
109 | ./JamRTC -w ws://localhost:8188 -r 1234 -d Lorenzo -i Guitar
110 |
111 | ### Connect to the online Janus demo to do the same
112 |
113 | ./JamRTC -w wss://janus-legacy.conf.meetecho.com/ws -r 1234 -d Lorenzo -i Guitar
114 |
115 | > Note that room `1234` is the default room on our online demos, and so you'll find random people in it that are just joining via their browser. As such, it's a bad idea to use that room for testing, unless you know there's no one else except you and who you want to test with. You may want to deploy your own Janus instance for testing.
116 |
117 | ### Connect to a local Janus instance in room 1234 as "Lorenzo" and publish everything, but use a different video device
118 |
119 | ./JamRTC -w ws://localhost:8188 -r 1234 -d Lorenzo -i Guitar -v /dev/video2
120 |
121 | ### Connect to a local Janus instance in room 1234 as "Lorenzo" and only publish webcam and instrument (no microphone)
122 |
123 | ./JamRTC -w ws://localhost:8188 -r 1234 -d Lorenzo -M -i Guitar
124 |
125 | ### Connect to a local Janus instance in room 1234 as "Lorenzo" and only publish your instrument (no microphone/webcam) as stereo
126 |
127 | ./JamRTC -w ws://localhost:8188 -r 1234 -d Lorenzo -M -W -i Guitar -s
128 |
129 | ### Connect as a passive attendee
130 |
131 | ./JamRTC -w ws://localhost:8188 -r 1234 -d John -W -M -I
132 |
133 | ### Connect as a passive attendee from a browser
134 |
135 | Tinker with the default `videoroomtest.html` demo page in the Janus repo, or write your own web application for the purpose.
136 |
137 | > Note: In case you plan to write your own web-based frontend, it's worth pointing out that JamRTC "abuses" the display field of the VideoRoom API to correlate the multiple feeds. In fact, as anticipated, by default JamRTC publishes two separate PeerConnections, one for webcam+mic, and another for the instrument: in order to let other participants know that they both come from you, JamRTC actually uses a JSON object serialized to a string as the display name of each publisher stream. This should be taken into account when designing a web-based application, as otherwise different PeerConnections coming from the same participant would be rendered as separate participants (which is what the default VideoRoom demo will indeed do).
138 |
139 | ### The ugliest GUI you'll see today
140 |
141 | When you start JamRTC, you should see a window that looks like this:
142 |
143 | ![JamRTC GUI at startup](images/gui-1.png)
144 |
145 | Quite frankly, and in all self-deprecating honesty, nothing to write home about, but as a start (and for someone who knows very little about GUI development) it's not that awful... :hand_over_mouth:
146 |
147 | Assuming you're sharing everything, you'll see your video on top, then a visual representation of your microphone, and finally a visual representation of your instrument. In case any of those is not being shared, they just won't be there, and the related labels will be updated accordingly.
148 |
149 | When someone else joins the session, a new column will appear with the media they're sharing instead:
150 |
151 | ![JamRTC GUI with a remote participant](images/gui-2.png)
152 |
153 | I sketched the current layout with Glade, and at the moment it only accommodates four participants, whether they're active or just attendees: the moment more participants join, their media is rendered independently by GStreamer in separate floating windows. Hopefully later on this can be made more dynamic and adaptive, but that's not what I really care about right now.
154 |
155 | As anticipated, the GUI currently is passive, meaning there's no interactive component: there are no menus, no buttons, nothing you can tweak, what you see is what you get. Since I'm very new to GUI development, and GTK in particular, this is the best I could come up with: besides, the code itself is probably not very "separated" in terms of logic vs. rendering, so refactoring the UI may not be that easy. Anyway, feedback from those who are smarter in this department will definitely help make this more usable in the future, when maybe the media itself works better than it does today!
156 |
157 | > Note: sometimes, when people join, some of their media are not rendered right away, and you have to minimize and then restore the application: this is probably related to my poor UI coding skills, as it feels like there's a missing message somewhere that should wake something up.
158 |
159 | # Using JACK with JamRTC
160 |
161 | As anticipated, when using JACK to handle audio, JamRTC will connect subscriptions to the speakers automatically, but will not automatically connect inputs as well: that's up to you to do, as you may want to actually share something specific to your setup (e.g., the raw input from the guitar vs. what Guitarix is processing).
162 |
163 | Assuming you're asking JamRTC to share both microphone (for the audio/video chat) and an instrument, this is what you should see in JACK:
164 |
165 | ![JACK graph with the JamRTC input nodes](images/jack-1.png)
166 |
167 | As you can see, there are two JACK input nodes, neither of which is currently receiving input from any source. The first thing you may want to do is feed the instrument node (called "JamRTC Guitar" in this picture): let's connect Guitarix in this example, which is getting the raw guitar feed from my Focusrite soundcard; in order to hear what I'm playing, I'm connecting Guitarix to the speakers too. The important part is that, the moment we start feeding the instrument node, it will be sent via WebRTC to the server, and so to the other participants:
168 |
169 | ![Guitarix feeding the JamRTC instrument node](images/jack-2.png)
170 |
171 | Since we want to share our microphone to chat with the other participants as well, we may need an app to help with that: in this instance, in fact, the capture device is my Focusrite soundcard, which is not where my microphone is. Specifically, I want to use the microphone of my laptop, so I can use tools like `zita-a2j` to add an ALSA device as an additional capture device, e.g.:
172 |
173 | zita-a2j -d hw:0
174 |
175 | At this point, I can connect the output of that to the JamRTC chat node (called "JamRTC mic"), which will as a result start sending our voice via WebRTC as well:
176 |
177 | ![Laptop microphone connected to the JamRTC mic node](images/jack-3.png)
178 |
179 | When other participants join, JamRTC automatically plays their contributions by connecting the related JACK nodes to the speakers, as we assume that's the main purpose:
180 |
181 | ![Remote participants connected to the speakers](images/jack-4.png)
182 |
183 | Considering how JACK works, you're of course free to (also) connect the output to something else, e.g., a DAW.
184 |
185 | # What's missing?
186 |
187 | This should be very much considered a pre-alpha, as while it "works", I'm still not satisfied with it. The main issue is, obviously, the latency, which is still too high even in a local environment despite my attempts to reduce it. I'm still trying to figure out where the issue might be, as it may be either something in the GStreamer pipelines (WebRTC stack? buffers? queues? JACK integration?), in Janus (something adding latency there?) or WebRTC itself. I don't know enough about the GStreamer internals to know if anything can be improved there (I already disabled the webrtcbin jitter buffer, but I'm not sure it helped much), so the hope is that, by sharing this effort and letting more people play with it, especially those knowledgeable about any of the different technologies involved, we can come up with something that can really be used out there.
188 |
189 | Notice that one other known limitation is that, out of the box, this currently only works on Linux: this is because we explicitly use some GStreamer elements that may only be available there (e.g., `jackaudiosrc`, `jackaudiosink`, `xvimagesink`). This is very simply because Linux is what I use every day and what I'm familiar with: besides, while I know JACK allows for low latency on Linux, I have no idea what is typically used on other operating systems for the same purpose. Should you want to try and build this on other systems as well, you'll probably have to tweak the code a bit: of course, if you get it running on Windows and/or macOS, I'd be glad to have the changes contributed back to the repo!
190 |
191 | Apart from these more fundamental aspects, there are other things I'd like to add to the tool sooner or later, in particular related to usability, e.g.:
192 |
193 | * being able to mute/unmute your streams (more in general, dynamic manipulation of pipeline elements);
194 | * being able to dynamically add/remove instruments after the start;
195 | * a UI that doesn't suck!
196 |
197 | # Help us improve this
198 |
199 | That's all: feedback and contributions more than welcome!
200 |
201 | There's no group, chat or mailing list for the project, right now: I don't anticipate the need for one in the short term, but if enough people show interest, we can figure something out (maybe a room on Matrix, which worked great for FOSDEM). In the meanwhile, if you have ideas or feedback, feel free to open issues where they can be discussed, and please do submit pull requests if you fix some outstanding bug (of which there are plenty, I'm sure). There's a [thread on LinuxMusicians](https://linuxmusicians.com/viewtopic.php?p=130383) that discusses playing music online as well, which you may be interested in: LinuxMusicians itself may be another venue to discuss the tool in general, maybe in a dedicated thread.
202 |
--------------------------------------------------------------------------------
/images/architecture.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/lminiero/jamrtc/0e1a4144a58a6495ff2a6447b4ad4e2fd1a5dc4e/images/architecture.jpg
--------------------------------------------------------------------------------
/images/gui-1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/lminiero/jamrtc/0e1a4144a58a6495ff2a6447b4ad4e2fd1a5dc4e/images/gui-1.png
--------------------------------------------------------------------------------
/images/gui-2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/lminiero/jamrtc/0e1a4144a58a6495ff2a6447b4ad4e2fd1a5dc4e/images/gui-2.png
--------------------------------------------------------------------------------
/images/jack-1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/lminiero/jamrtc/0e1a4144a58a6495ff2a6447b4ad4e2fd1a5dc4e/images/jack-1.png
--------------------------------------------------------------------------------
/images/jack-2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/lminiero/jamrtc/0e1a4144a58a6495ff2a6447b4ad4e2fd1a5dc4e/images/jack-2.png
--------------------------------------------------------------------------------
/images/jack-3.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/lminiero/jamrtc/0e1a4144a58a6495ff2a6447b4ad4e2fd1a5dc4e/images/jack-3.png
--------------------------------------------------------------------------------
/images/jack-4.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/lminiero/jamrtc/0e1a4144a58a6495ff2a6447b4ad4e2fd1a5dc4e/images/jack-4.png
--------------------------------------------------------------------------------
/src/debug.h:
--------------------------------------------------------------------------------
1 | /*
2 | * JamRTC -- Jam sessions on Janus!
3 | *
4 | * Ugly prototype, just to use as a proof of concept
5 | *
6 | * Developed by Lorenzo Miniero: lorenzo@meetecho.com
7 | * License: GPLv3
8 | *
9 | */
10 |
11 | #ifndef JAMRTC_DEBUG_H
12 | #define JAMRTC_DEBUG_H
13 |
14 | #include <glib.h>
15 |
16 | #include <stdio.h>
17 | #include <time.h>
18 |
19 | extern int jamrtc_log_level;
20 | extern gboolean jamrtc_log_timestamps;
21 | extern gboolean jamrtc_log_colors;
22 |
23 | /* Log colors */
24 | #define ANSI_COLOR_RED "\x1b[31m"
25 | #define ANSI_COLOR_GREEN "\x1b[32m"
26 | #define ANSI_COLOR_YELLOW "\x1b[33m"
27 | #define ANSI_COLOR_BLUE "\x1b[34m"
28 | #define ANSI_COLOR_MAGENTA "\x1b[35m"
29 | #define ANSI_COLOR_CYAN "\x1b[36m"
30 | #define ANSI_COLOR_RESET "\x1b[0m"
31 |
32 | /* Log levels */
33 | #define LOG_NONE (0)
34 | #define LOG_FATAL (1)
35 | #define LOG_ERR (2)
36 | #define LOG_WARN (3)
37 | #define LOG_INFO (4)
38 | #define LOG_VERB (5)
39 | #define LOG_HUGE (6)
40 | #define LOG_DBG (7)
41 | #define LOG_MAX LOG_DBG
42 |
43 | /* Coloured prefixes for errors and warnings logging. */
44 | static const char *jamrtc_log_prefix[] = {
45 | /* no colors */
46 | "",
47 | "[FATAL] ",
48 | "[ERR] ",
49 | "[WARN] ",
50 | "",
51 | "",
52 | "",
53 | "",
54 | /* with colors */
55 | "",
56 | ANSI_COLOR_MAGENTA"[FATAL]"ANSI_COLOR_RESET" ",
57 | ANSI_COLOR_RED"[ERR]"ANSI_COLOR_RESET" ",
58 | ANSI_COLOR_YELLOW"[WARN]"ANSI_COLOR_RESET" ",
59 | "",
60 | "",
61 | "",
62 | ""
63 | };
64 |
65 | /* Simple wrapper to g_print/printf */
66 | #define JAMRTC_PRINT g_print
67 | /* Logger based on different levels, which can either be displayed
68 | * or not according to the configured logging level.
69 | * The format must be a string literal. */
70 | #define JAMRTC_LOG(level, format, ...) \
71 | do { \
72 | if (level > LOG_NONE && level <= LOG_MAX && level <= jamrtc_log_level) { \
73 | char jamrtc_log_ts[64] = ""; \
74 | char jamrtc_log_src[128] = ""; \
75 | if (jamrtc_log_timestamps) { \
76 | struct tm jamrtctmresult; \
77 | time_t jamrtcltime = time(NULL); \
78 | localtime_r(&jamrtcltime, &jamrtctmresult); \
79 | strftime(jamrtc_log_ts, sizeof(jamrtc_log_ts), \
80 | "[%a %b %e %T %Y] ", &jamrtctmresult); \
81 | } \
82 | if (level == LOG_FATAL || level == LOG_ERR || level == LOG_DBG) { \
83 | snprintf(jamrtc_log_src, sizeof(jamrtc_log_src), \
84 | "[%s:%s:%d] ", __FILE__, __FUNCTION__, __LINE__); \
85 | } \
86 | g_print("%s%s%s" format, \
87 | jamrtc_log_ts, \
88 | jamrtc_log_prefix[level | ((int)jamrtc_log_colors << 3)], \
89 | jamrtc_log_src, \
90 | ##__VA_ARGS__); \
91 | } \
92 | } while (0)
93 |
94 | #endif
95 |
--------------------------------------------------------------------------------
/src/jamrtc.c:
--------------------------------------------------------------------------------
1 | /*
2 | * JamRTC -- Jam sessions on Janus!
3 | *
4 | * Ugly prototype, just to use as a proof of concept
5 | *
6 | * Developed by Lorenzo Miniero: lorenzo@meetecho.com
7 | * License: GPLv3
8 | *
9 | */
10 |
11 | /* Generic includes */
12 | #include <inttypes.h>
13 | #include <signal.h>
14 | #include <stdio.h>
15 | #include <stdlib.h>
16 | #include <string.h>
17 | #include <unistd.h>
18 |
19 | /* GTK includes */
20 | #include <gtk/gtk.h>
21 |
22 | /* GStreamer includes */
23 | #include <gst/gst.h>
24 |
25 | /* Local includes */
26 | #include "webrtc.h"
27 | #include "debug.h"
28 |
29 |
30 | /* Logging */
31 | int jamrtc_log_level = LOG_INFO;
32 | gboolean jamrtc_log_timestamps = FALSE;
33 | gboolean jamrtc_log_colors = TRUE;
34 | int lock_debug = 0;
35 |
36 | /* Reference counters */
37 |
38 | #ifdef REFCOUNT_DEBUG
39 | int refcount_debug = 1;
40 | GHashTable *counters = NULL;
41 | janus_mutex counters_mutex;
42 | #else
43 | int refcount_debug = 0;
44 | #endif
45 |
46 | /* Signal handler */
47 | static volatile gint stop = 0;
48 | static void jamrtc_handle_signal(int signum) {
49 | JAMRTC_LOG(LOG_INFO, "Stopping JamRTC...\n");
50 | if(g_atomic_int_compare_and_exchange(&stop, 0, 1)) {
51 | jamrtc_webrtc_cleanup();
52 | } else {
53 | g_atomic_int_inc(&stop);
54 | if(g_atomic_int_get(&stop) > 2)
55 | exit(1);
56 | }
57 | }
58 |
59 | /* Signalling/WebRTC callbacks */
60 | static void jamrtc_server_connected(void);
61 | static void jamrtc_server_disconnected(void);
62 | static void jamrtc_joined_room(void);
63 | static void jamrtc_participant_joined(const char *uuid, const char *display);
64 | static void jamrtc_stream_started(const char *uuid, const char *display, const char *instrument,
65 | gboolean has_audio, gboolean has_video);
66 | static void jamrtc_stream_stopped(const char *uuid, const char *display, const char *instrument);
67 | static void jamrtc_participant_left(const char *uuid, const char *display);
68 | static jamrtc_callbacks callbacks =
69 | {
70 | .server_connected = jamrtc_server_connected,
71 | .server_disconnected = jamrtc_server_disconnected,
72 | .joined_room = jamrtc_joined_room,
73 | .participant_joined = jamrtc_participant_joined,
74 | .stream_started = jamrtc_stream_started,
75 | .stream_stopped = jamrtc_stream_stopped,
76 | .participant_left = jamrtc_participant_left,
77 | };
78 |
79 | /* Command line arguments */
80 | static const char *server_url = NULL;
81 | static guint64 room_id = 0;
82 | static const char *display = NULL, *instrument = NULL;
83 | static gboolean no_mic = FALSE, no_webcam = FALSE, no_instrument = FALSE,
84 | stereo = FALSE, no_jack = FALSE;
85 | static const char *video_device = NULL, *src_opts = NULL;
86 | static guint latency = 0;
87 | static const char *stun_server = NULL, *turn_server = NULL;
88 |
89 | static GOptionEntry opt_entries[] = {
90 | { "ws", 'w', 0, G_OPTION_ARG_STRING, &server_url, "Address of the Janus WebSockets backend (e.g., ws://localhost:8188; required)", NULL },
91 | { "room", 'r', 0, G_OPTION_ARG_INT64, &room_id, "Room to join (e.g., 1234; required)", NULL },
92 | { "display", 'd', 0, G_OPTION_ARG_STRING, &display, "Display name to use in the room (e.g., Lorenzo; required)", NULL },
93 | { "no-mic", 'M', 0, G_OPTION_ARG_NONE, &no_mic, "Don't add an audio source for the local microphone (default: enable audio chat)", NULL },
94 | { "no-webcam", 'W', 0, G_OPTION_ARG_NONE, &no_webcam, "Don't add a video source for the local webcam (default: enable video chat)", NULL },
95 | { "video-device", 'v', 0, G_OPTION_ARG_STRING, &video_device, "Video device to use for the video chat (default: /dev/video0)", NULL },
96 | { "instrument", 'i', 0, G_OPTION_ARG_STRING, &instrument, "Description of the instrument (e.g., Guitar; default: unknown)", NULL },
97 | { "stereo", 's', 0, G_OPTION_ARG_NONE, &stereo, "Whether the instrument will be stereo or mono (default: mono)", NULL },
98 | { "no-instrument", 'I', 0, G_OPTION_ARG_NONE, &no_instrument, "Don't add a source for the local instrument (default: enable instrument)", NULL },
99 | { "jitter-buffer", 'b', 0, G_OPTION_ARG_INT, &latency, "Jitter buffer to use in RTP, in milliseconds (default: 0, no buffering)", NULL },
100 | { "src-opts", 'c', 0, G_OPTION_ARG_STRING, &src_opts, "Custom properties to add to jackaudiosrc (local instrument only)", NULL },
101 | { "stun-server", 'S', 0, G_OPTION_ARG_STRING, &stun_server, "STUN server to use, if any (hostname:port)", NULL },
102 | { "turn-server", 'T', 0, G_OPTION_ARG_STRING, &turn_server, "TURN server to use, if any (username:password@host:port)", NULL },
103 | { "log-level", 'l', 0, G_OPTION_ARG_INT, &jamrtc_log_level, "Logging level (0=disable logging, 7=maximum log level; default: 4)", NULL },
104 | { "no-jack", 'J', 0, G_OPTION_ARG_NONE, &no_jack, "For testing purposes, use autoaudiosrc/autoaudiosink instead (default: use JACK)", NULL },
105 | { NULL },
106 | };
107 |
108 |
109 | /* Helper method to ensure GStreamer has the modules we need */
110 | static gboolean jamrtc_check_gstreamer_plugins(void) {
111 | /* Note: maybe some of these should be optional? And OS-aware... */
112 | const char *needed[] = {
113 | "jack",
114 | "opus",
115 | "vpx",
116 | "nice",
117 | "webrtc",
118 | "dtls",
119 | "srtp",
120 | "rtpmanager",
121 | "video4linux2",
122 | NULL
123 | };
124 | GstRegistry *registry = gst_registry_get();
125 | if(registry == NULL) {
126 | JAMRTC_LOG(LOG_FATAL, "No plugins registered in gstreamer\n");
127 | return FALSE;
128 | }
129 | gboolean ret = TRUE;
130 | int i = 0;
131 | GstPlugin *plugin = NULL;
132 | for(i = 0; needed[i] != NULL; i++) {
133 | plugin = gst_registry_find_plugin(registry, needed[i]);
134 | if(plugin == NULL) {
135 | JAMRTC_LOG(LOG_FATAL, "Required gstreamer plugin '%s' not found\n", needed[i]);
136 | ret = FALSE;
137 | continue;
138 | }
139 | gst_object_unref(plugin);
140 | }
141 | return ret;
142 | }
143 |
144 | /* Thread responsible for the main loop */
145 | static gpointer jamrtc_loop_thread(gpointer user_data) {
146 | GtkBuilder *builder = (GtkBuilder *)user_data;
147 | /* Start the main Glib loop */
148 | GMainLoop *loop = g_main_loop_new(NULL, FALSE);
149 | /* Initialize the Janus stack: we'll continue in the 'server_connected' callback */
150 | if(jamrtc_webrtc_init(&callbacks, builder, loop, server_url, stun_server, turn_server, src_opts, latency, no_jack) < 0) {
151 | g_main_loop_unref(loop);
152 | exit(1);
153 | }
154 | /* Loop forever */
155 | g_main_loop_run(loop);
156 |
157 | /* When we leave the loop, we're done */
158 | g_main_loop_unref(loop);
159 | gtk_main_quit();
160 | return NULL;
161 | }
162 |
163 | /* This function is called when the main window is closed */
164 | static void jamrtc_window_closed(GtkWidget *widget, GdkEvent *event, gpointer user_data) {
165 | /* Simulate a SIGINT */
166 | jamrtc_handle_signal(SIGINT);
167 | }
168 |
169 | /* Main application */
170 | int main(int argc, char *argv[]) {
171 |
172 | /* Parse the command-line arguments */
173 | GError *error = NULL;
174 | GOptionContext *opts = g_option_context_new("-- Jam sessions with Janus!");
175 | g_option_context_set_help_enabled(opts, TRUE);
176 | g_option_context_add_main_entries(opts, opt_entries, NULL);
177 | if(!g_option_context_parse(opts, &argc, &argv, &error)) {
178 | g_error_free(error);
179 | exit(1);
180 | }
181 | /* If some arguments are missing, fail */
182 | if(server_url == NULL || room_id == 0 || display == NULL) {
183 | char *help = g_option_context_get_help(opts, TRUE, NULL);
184 | g_print("%s", help);
185 | g_free(help);
186 | g_option_context_free(opts);
187 | exit(1);
188 | }
189 | /* Assign some defaults */
190 | if(video_device == NULL)
191 | video_device = "/dev/video0";
192 | if(instrument == NULL)
193 | instrument = "unknown";
194 | if(src_opts == NULL)
195 | src_opts = "";
196 | if(latency > 1000)
197 | JAMRTC_LOG(LOG_WARN, "Very high jitter-buffer latency configured (%u)\n", latency);
198 |
199 | /* Logging level: default is info and no timestamps */
200 | if(jamrtc_log_level == 0)
201 | jamrtc_log_level = LOG_INFO;
202 | if(jamrtc_log_level < LOG_NONE)
203 | jamrtc_log_level = 0;
204 | else if(jamrtc_log_level > LOG_MAX)
205 | jamrtc_log_level = LOG_MAX;
206 |
207 | /* Handle SIGINT (CTRL-C), SIGTERM (from service managers) */
208 | signal(SIGINT, jamrtc_handle_signal);
209 | signal(SIGTERM, jamrtc_handle_signal);
210 |
211 | /* Start JamRTC */
212 | JAMRTC_LOG(LOG_INFO, "\n---------------------------------\n");
213 | JAMRTC_LOG(LOG_INFO, "JamRTC -- Jam sessions with Janus!\n");
214 | JAMRTC_LOG(LOG_INFO, "---------------------------------\n\n");
215 | JAMRTC_LOG(LOG_WARN, "Very alpha version!\n\n");
216 | JAMRTC_LOG(LOG_INFO, "Janus backend: %s\n", server_url);
217 | JAMRTC_LOG(LOG_INFO, "VideoRoom ID: %"SCNu64"\n", room_id);
218 | JAMRTC_LOG(LOG_INFO, "Display name: %s\n", display);
219 | JAMRTC_LOG(LOG_INFO, "Videochat: mic %s, webcam %s\n", no_mic ? "disabled" : "enabled", no_webcam ? "disabled" : "enabled");
220 | if(!no_webcam)
221 | JAMRTC_LOG(LOG_INFO, "Video device: %s\n", video_device);
222 | if(no_instrument)
223 | JAMRTC_LOG(LOG_INFO, "Instrument: disabled\n");
224 | else
225 | JAMRTC_LOG(LOG_INFO, "Instrument: %s (%s JACK input)\n", instrument, stereo ? "stereo" : "mono");
226 | if(strlen(src_opts) > 0)
227 | JAMRTC_LOG(LOG_INFO, "JACK capture: %s\n", src_opts);
228 | JAMRTC_LOG(LOG_INFO, "STUN server: %s\n", stun_server ? stun_server : "(none)");
229 | JAMRTC_LOG(LOG_INFO, "TURN server: %s\n\n", turn_server ? turn_server : "(none)");
230 | if(no_jack)
231 | JAMRTC_LOG(LOG_WARN, "For testing purposes, we'll use autoaudiosrc/autoaudiosink, instead of jackaudiosrc/jackaudiosink\n\n");
232 |
233 | /* Initialize GStreamer */
234 | gst_init(NULL, NULL);
235 | /* Make sure our gstreamer dependency has all we need */
236 | if(!jamrtc_check_gstreamer_plugins()) {
237 | g_option_context_free(opts);
238 | exit(1);
239 | }
240 |
241 | /* Initialize GTK */
242 | gtk_init(NULL, NULL);
243 | GtkBuilder *builder = gtk_builder_new();
244 | gtk_builder_add_from_file(builder, "JamRTC.glade", NULL);
245 | gtk_builder_connect_signals(builder, NULL);
246 | GtkWidget *window = GTK_WIDGET(gtk_builder_get_object(builder, "main_window"));
247 | g_signal_connect(G_OBJECT(window), "delete-event", G_CALLBACK(jamrtc_window_closed), NULL);
248 | gtk_widget_show(window);
249 |
250 | /* Spawn a thread to initialize the WebRTC code */
251 | (void)g_thread_try_new("jamrtc loop", jamrtc_loop_thread, builder, &error);
252 | if(error != NULL) {
253 | JAMRTC_LOG(LOG_FATAL, "Got error %d (%s) trying to launch the Jamus loop thread...\n",
254 | error->code, error->message ? error->message : "??");
255 | g_option_context_free(opts);
256 | gst_deinit();
257 | exit(1);
258 | }
259 |
260 | /* Show the application */
261 | gtk_main();
262 |
263 | #ifdef REFCOUNT_DEBUG
264 | /* Any reference counters that are still up while we're leaving? (debug-mode only) */
265 | jamrtc_mutex_lock(&counters_mutex);
266 | if(counters && g_hash_table_size(counters) > 0) {
267 | JAMRTC_PRINT("Debugging reference counters: %d still allocated\n", g_hash_table_size(counters));
268 | GHashTableIter iter;
269 | gpointer value;
270 | g_hash_table_iter_init(&iter, counters);
271 | while(g_hash_table_iter_next(&iter, NULL, &value)) {
272 | JAMRTC_PRINT(" -- %p\n", value);
273 | }
274 | } else {
275 | JAMRTC_PRINT("Debugging reference counters: 0 still allocated\n");
276 | }
277 | jamrtc_mutex_unlock(&counters_mutex);
278 | #endif
279 |
280 | g_option_context_free(opts);
281 | gst_deinit();
282 | JAMRTC_LOG(LOG_INFO, "\nBye!\n");
283 | exit(0);
284 | }
285 |
286 | /* We connected to the Janus instance */
287 | static void jamrtc_server_connected(void) {
288 | /* Now that we're connected, we can join the room */
289 | jamrtc_join_room(room_id, display);
290 | }
291 |
292 | /* We lost the connection to the Janus instance */
293 | static void jamrtc_server_disconnected(void) {
294 | /* Simulate a SIGINT */
295 | jamrtc_handle_signal(SIGINT);
296 | }
297 |
298 | /* We successfully joined the room */
299 | static void jamrtc_joined_room(void) {
300 | /* Check if we need to publish our mic/webcam */
301 | if(!no_mic || !no_webcam)
302 | jamrtc_webrtc_publish_micwebcam(no_mic, no_webcam, video_device);
303 | /* Check if we need to publish our local instrument now */
304 | if(!no_instrument)
305 | jamrtc_webrtc_publish_instrument(instrument, stereo);
306 | }
307 |
308 | /* A new participant just joined the session */
309 | static void jamrtc_participant_joined(const char *uuid, const char *display) {
310 | JAMRTC_LOG(LOG_INFO, "Participant joined (%s, %s)\n", uuid, display);
311 | }
312 |
313 | /* A new stream for a remote participant just became available */
314 | static void jamrtc_stream_started(const char *uuid, const char *display, const char *instrument,
315 | gboolean has_audio, gboolean has_video) {
316 | /* TODO Subscribe */
317 | JAMRTC_LOG(LOG_INFO, "Stream started (%s, %s, %s)\n", uuid, display, instrument);
318 | jamrtc_webrtc_subscribe(uuid, instrument != NULL);
319 | }
320 |
321 | /* An existing stream for a remote participant just went away */
322 | static void jamrtc_stream_stopped(const char *uuid, const char *display, const char *instrument) {
323 | /* TODO Unsubscribe */
324 | JAMRTC_LOG(LOG_INFO, "Stream stopped (%s, %s, %s)\n", uuid, display, instrument);
325 | }
326 |
327 | /* An existing participant just left the session */
328 | static void jamrtc_participant_left(const char *uuid, const char *display) {
329 | JAMRTC_LOG(LOG_INFO, "Participant left (%s, %s)\n", uuid, display);
330 | }
331 |
--------------------------------------------------------------------------------
/src/mutex.h:
--------------------------------------------------------------------------------
1 | /*
2 | * JamRTC -- Jam sessions on Janus!
3 | *
4 | * Ugly prototype, just to use as a proof of concept
5 | *
6 | * Developed by Lorenzo Miniero: lorenzo@meetecho.com
7 | * License: GPLv3
8 | *
9 | */
10 |
11 | #ifndef JAMRTC_MUTEX_H
12 | #define JAMRTC_MUTEX_H
13 |
14 | #include <pthread.h>
15 | #include <glib.h>
16 |
17 | #include "debug.h"
18 |
19 | extern int lock_debug;
20 |
21 | #ifdef USE_PTHREAD_MUTEX
22 |
23 | /*! Jamus mutex implementation */
24 | typedef pthread_mutex_t jamrtc_mutex;
25 | /*! Jamus mutex initialization */
26 | #define jamrtc_mutex_init(a) pthread_mutex_init(a,NULL)
27 | /*! Jamus static mutex initializer */
28 | #define JAMRTC_MUTEX_INITIALIZER PTHREAD_MUTEX_INITIALIZER
29 | /*! Jamus mutex destruction */
30 | #define jamrtc_mutex_destroy(a) pthread_mutex_destroy(a)
31 | /*! Jamus mutex lock without debug */
32 | #define jamrtc_mutex_lock_nodebug(a) pthread_mutex_lock(a);
33 | /*! Jamus mutex lock with debug (prints the line that locked a mutex) */
34 | #define jamrtc_mutex_lock_debug(a) { JAMRTC_PRINT("[%s:%s:%d:lock] %p\n", __FILE__, __FUNCTION__, __LINE__, a); pthread_mutex_lock(a); };
35 | /*! Jamus mutex lock wrapper (selective locking debug) */
36 | #define jamrtc_mutex_lock(a) { if(!lock_debug) { jamrtc_mutex_lock_nodebug(a); } else { jamrtc_mutex_lock_debug(a); } };
37 | /*! Jamus mutex unlock without debug */
38 | #define jamrtc_mutex_unlock_nodebug(a) pthread_mutex_unlock(a);
39 | /*! Jamus mutex unlock with debug (prints the line that unlocked a mutex) */
40 | #define jamrtc_mutex_unlock_debug(a) { JAMRTC_PRINT("[%s:%s:%d:unlock] %p\n", __FILE__, __FUNCTION__, __LINE__, a); pthread_mutex_unlock(a); };
41 | /*! Jamus mutex unlock wrapper (selective locking debug) */
42 | #define jamrtc_mutex_unlock(a) { if(!lock_debug) { jamrtc_mutex_unlock_nodebug(a); } else { jamrtc_mutex_unlock_debug(a); } };
43 |
44 | /*! Jamus condition implementation */
45 | typedef pthread_cond_t jamrtc_condition;
46 | /*! Jamus condition initialization */
47 | #define jamrtc_condition_init(a) pthread_cond_init(a,NULL)
48 | /*! Jamus condition destruction */
49 | #define jamrtc_condition_destroy(a) pthread_cond_destroy(a)
50 | /*! Jamus condition wait */
51 | #define jamrtc_condition_wait(a, b) pthread_cond_wait(a, b);
52 | /*! Jamus condition timed wait */
53 | #define jamrtc_condition_timedwait(a, b, c) pthread_cond_timedwait(a, b, c);
54 | /*! Jamus condition signal */
55 | #define jamrtc_condition_signal(a) pthread_cond_signal(a);
56 | /*! Jamus condition broadcast */
57 | #define jamrtc_condition_broadcast(a) pthread_cond_broadcast(a);
58 |
59 | #else
60 |
61 | /*! Jamus mutex implementation */
62 | typedef GMutex jamrtc_mutex;
63 | /*! Jamus mutex initialization */
64 | #define jamrtc_mutex_init(a) g_mutex_init(a)
65 | /*! Jamus static mutex initializer */
66 | #define JAMRTC_MUTEX_INITIALIZER {0}
67 | /*! Jamus mutex destruction */
68 | #define jamrtc_mutex_destroy(a) g_mutex_clear(a)
69 | /*! Jamus mutex lock without debug */
70 | #define jamrtc_mutex_lock_nodebug(a) g_mutex_lock(a);
71 | /*! Jamus mutex lock with debug (prints the line that locked a mutex) */
72 | #define jamrtc_mutex_lock_debug(a) { JAMRTC_PRINT("[%s:%s:%d:lock] %p\n", __FILE__, __FUNCTION__, __LINE__, a); g_mutex_lock(a); };
73 | /*! Jamus mutex lock wrapper (selective locking debug) */
74 | #define jamrtc_mutex_lock(a) { if(!lock_debug) { jamrtc_mutex_lock_nodebug(a); } else { jamrtc_mutex_lock_debug(a); } };
75 | /*! Jamus mutex unlock without debug */
76 | #define jamrtc_mutex_unlock_nodebug(a) g_mutex_unlock(a);
77 | /*! Jamus mutex unlock with debug (prints the line that unlocked a mutex) */
78 | #define jamrtc_mutex_unlock_debug(a) { JAMRTC_PRINT("[%s:%s:%d:unlock] %p\n", __FILE__, __FUNCTION__, __LINE__, a); g_mutex_unlock(a); };
79 | /*! Jamus mutex unlock wrapper (selective locking debug) */
80 | #define jamrtc_mutex_unlock(a) { if(!lock_debug) { jamrtc_mutex_unlock_nodebug(a); } else { jamrtc_mutex_unlock_debug(a); } };
81 |
82 | /*! Jamus condition implementation */
83 | typedef GCond jamrtc_condition;
84 | /*! Jamus condition initialization */
85 | #define jamrtc_condition_init(a) g_cond_init(a)
86 | /*! Jamus condition destruction */
87 | #define jamrtc_condition_destroy(a) g_cond_clear(a)
88 | /*! Jamus condition wait */
89 | #define jamrtc_condition_wait(a, b) g_cond_wait(a, b);
90 | /*! Jamus condition wait until */
91 | #define jamrtc_condition_wait_until(a, b, c) g_cond_wait_until(a, b, c);
92 | /*! Jamus condition signal */
93 | #define jamrtc_condition_signal(a) g_cond_signal(a);
94 | /*! Jamus condition broadcast */
95 | #define jamrtc_condition_broadcast(a) g_cond_broadcast(a);
96 |
97 | #endif
98 |
99 | #endif
100 |
--------------------------------------------------------------------------------
/src/refcount.h:
--------------------------------------------------------------------------------
1 | /*
2 | * JamRTC -- Jam sessions on Janus!
3 | *
4 | * Ugly prototype, just to use as a proof of concept
5 | *
6 | * Developed by Lorenzo Miniero: lorenzo@meetecho.com
7 | * License: GPLv3
8 | *
9 | */
10 |
11 | #ifndef JAMRTC_REFCOUNT_H
12 | #define JAMRTC_REFCOUNT_H
13 |
14 | #include <glib.h>
15 | #include "mutex.h"
16 |
17 | //~ #define REFCOUNT_DEBUG
18 |
19 | extern int refcount_debug;
20 |
21 | #define jamrtc_refcount_containerof(refptr, type, member) \
22 | ((type *)((char *)(refptr) - offsetof(type, member)))
23 |
24 |
25 | /*! Jamus reference counter structure */
26 | typedef struct jamrtc_refcount jamrtc_refcount;
27 | struct jamrtc_refcount {
28 | /*! The reference counter itself */
29 | gint count;
30 | /*! Pointer to the function that will be used to free the object */
31 | void (*free)(const jamrtc_refcount *);
32 | };
33 |
34 |
35 | #ifdef REFCOUNT_DEBUG
36 | /* Reference counters debugging */
37 | extern GHashTable *counters;
38 | extern jamrtc_mutex counters_mutex;
39 | #define jamrtc_refcount_track(refp) { \
40 | jamrtc_mutex_lock(&counters_mutex); \
41 | if(counters == NULL) \
42 | counters = g_hash_table_new(NULL, NULL); \
43 | g_hash_table_insert(counters, refp, refp); \
44 | jamrtc_mutex_unlock(&counters_mutex); \
45 | }
46 | #define jamrtc_refcount_untrack(refp) { \
47 | jamrtc_mutex_lock(&counters_mutex); \
48 | g_hash_table_remove(counters, refp); \
49 | jamrtc_mutex_unlock(&counters_mutex); \
50 | }
51 | #endif
52 |
53 |
54 | /*! Jamus reference counter initialization (debug according to settings) */
55 | #define jamrtc_refcount_init(refp, free_fn) { \
56 | if(!refcount_debug) { \
57 | jamrtc_refcount_init_nodebug(refp, free_fn); \
58 | } else { \
59 | jamrtc_refcount_init_debug(refp, free_fn); \
60 | } \
61 | }
62 | /*! Jamus reference counter initialization (no debug) */
63 | #ifdef REFCOUNT_DEBUG
64 | #define jamrtc_refcount_init_nodebug(refp, free_fn) { \
65 | (refp)->count = 1; \
66 | (refp)->free = free_fn; \
67 | jamrtc_refcount_track((refp)); \
68 | }
69 | #else
70 | #define jamrtc_refcount_init_nodebug(refp, free_fn) { \
71 | (refp)->count = 1; \
72 | (refp)->free = free_fn; \
73 | }
74 | #endif
75 | /*! Jamus reference counter initialization (debug) */
76 | #ifdef REFCOUNT_DEBUG
77 | #define jamrtc_refcount_init_debug(refp, free_fn) { \
78 | (refp)->count = 1; \
79 | JAMRTC_PRINT("[%s:%s:%d:init] %p (%d)\n", __FILE__, __FUNCTION__, __LINE__, refp, (refp)->count); \
80 | (refp)->free = free_fn; \
81 | jamrtc_refcount_track((refp)); \
82 | }
83 | #else
84 | #define jamrtc_refcount_init_debug(refp, free_fn) { \
85 | (refp)->count = 1; \
86 | JAMRTC_PRINT("[%s:%s:%d:init] %p (%d)\n", __FILE__, __FUNCTION__, __LINE__, refp, (refp)->count); \
87 | (refp)->free = free_fn; \
88 | }
89 | #endif
90 |
91 | /*! Increase the Jamus reference counter (debug according to settings) */
92 | #define jamrtc_refcount_increase(refp) { \
93 | if(!refcount_debug) { \
94 | jamrtc_refcount_increase_nodebug(refp); \
95 | } else { \
96 | jamrtc_refcount_increase_debug(refp); \
97 | } \
98 | }
99 | /*! Increase the Jamus reference counter (no debug) */
100 | #define jamrtc_refcount_increase_nodebug(refp) { \
101 | g_atomic_int_inc((gint *)&(refp)->count); \
102 | }
103 | /*! Increase the Jamus reference counter (debug) */
104 | #define jamrtc_refcount_increase_debug(refp) { \
105 | JAMRTC_PRINT("[%s:%s:%d:increase] %p (%d)\n", __FILE__, __FUNCTION__, __LINE__, refp, (refp)->count+1); \
106 | g_atomic_int_inc((gint *)&(refp)->count); \
107 | }
108 |
109 | /*! Decrease the Jamus reference counter (debug according to settings) */
110 | #define jamrtc_refcount_decrease(refp) { \
111 | if(!refcount_debug) { \
112 | jamrtc_refcount_decrease_nodebug(refp); \
113 | } else { \
114 | jamrtc_refcount_decrease_debug(refp); \
115 | } \
116 | }
117 | /*! Decrease the Jamus reference counter (debug) */
118 | #ifdef REFCOUNT_DEBUG
119 | #define jamrtc_refcount_decrease_debug(refp) { \
120 | JAMRTC_PRINT("[%s:%s:%d:decrease] %p (%d)\n", __FILE__, __FUNCTION__, __LINE__, refp, (refp)->count-1); \
121 | if(g_atomic_int_dec_and_test((gint *)&(refp)->count)) { \
122 | (refp)->free(refp); \
123 | jamrtc_refcount_untrack((refp)); \
124 | } \
125 | }
126 | #else
127 | #define jamrtc_refcount_decrease_debug(refp) { \
128 | JAMRTC_PRINT("[%s:%s:%d:decrease] %p (%d)\n", __FILE__, __FUNCTION__, __LINE__, refp, (refp)->count-1); \
129 | if(g_atomic_int_dec_and_test((gint *)&(refp)->count)) { \
130 | (refp)->free(refp); \
131 | } \
132 | }
133 | #endif
134 | /*! Decrease the Jamus reference counter (no debug) */
135 | #ifdef REFCOUNT_DEBUG
136 | #define jamrtc_refcount_decrease_nodebug(refp) { \
137 | if(g_atomic_int_dec_and_test((gint *)&(refp)->count)) { \
138 | (refp)->free(refp); \
139 | jamrtc_refcount_untrack((refp)); \
140 | } \
141 | }
142 | #else
143 | #define jamrtc_refcount_decrease_nodebug(refp) { \
144 | if(g_atomic_int_dec_and_test((gint *)&(refp)->count)) { \
145 | (refp)->free(refp); \
146 | } \
147 | }
148 | #endif
149 |
150 | #endif
151 |
--------------------------------------------------------------------------------
/src/webrtc.c:
--------------------------------------------------------------------------------
1 | /*
2 | * JamRTC -- Jam sessions on Janus!
3 | *
4 | * Ugly prototype, just to use as a proof of concept
5 | *
6 | * Developed by Lorenzo Miniero: lorenzo@meetecho.com
7 | * License: GPLv3
8 | *
9 | */
10 |
11 | /* GTK/GDK includes */
12 | #include <gtk/gtk.h>
13 | #include <gdk/gdkx.h>
14 |
15 | /* GStreamer includes */
16 | #include <gst/gst.h>
17 | #include <gst/sdp/sdp.h>
18 | #define GST_USE_UNSTABLE_API
19 | #include <gst/webrtc/webrtc.h>
20 | #include <gst/video/videooverlay.h>
21 |
22 | /* WebSockets/JSON stack (Janus API) */
23 | #include <libwebsockets.h>
24 | #include <jansson.h>
25 |
26 | /* Local includes */
27 | #include "webrtc.h"
28 | #include "mutex.h"
29 | #include "refcount.h"
30 | #include "debug.h"
31 |
32 |
33 | /* Janus signalling state management */
34 | typedef enum jamrtc_state {
35 | JAMRTC_JANUS_DISCONNECTED = 0,
36 | JAMRTC_JANUS_CONNECTING = 1,
37 | JAMRTC_JANUS_CONNECTION_ERROR,
38 | JAMRTC_JANUS_CONNECTED,
39 | JAMRTC_JANUS_CREATING_SESSION,
40 | JAMRTC_JANUS_SESSION_CREATED,
41 | JAMRTC_JANUS_ATTACHING_PLUGIN,
42 | JAMRTC_JANUS_HANDLE_ATTACHED,
43 | JAMRTC_JANUS_SDP_PREPARED,
44 | JAMRTC_JANUS_STARTED,
45 | JAMRTC_JANUS_API_ERROR,
46 | JAMRTC_JANUS_STATE_ERROR
47 | } jamrtc_state;
48 |
49 | /* Callbacks handler (API) */
50 | static const jamrtc_callbacks *cb = NULL;
51 |
52 | /* Global properties */
53 | static GtkBuilder *builder = NULL;
54 | static GMainLoop *loop = NULL;
55 | static const char *stun_server = NULL, *turn_server = NULL,
56 | *src_opts = NULL, *video_device = NULL;
57 | static gboolean no_mic = FALSE, no_webcam = FALSE, stereo = FALSE, no_jack = FALSE;
58 | static guint latency = 0;
59 |
60 | /* WebSocket properties */
61 | static const char *server_url = NULL;
62 | static const char *protocol = NULL, *address = NULL, *path = NULL;
63 | static int port = 0;
64 | static jamrtc_state state = JAMRTC_JANUS_DISCONNECTED;
65 | static GSource *keep_alives = NULL;
66 | static struct lws_context *context = NULL;
67 | static struct lws *wsi = NULL;
68 | static volatile gint stopping = 0;
69 |
70 | typedef struct jamrtc_ws_client {
71 | struct lws *wsi; /* The libwebsockets client instance */
72 | char *incoming; /* Buffer containing incoming data until it's complete */
73 | unsigned char *buffer; /* Buffer containing the message to send */
74 | int buflen; /* Length of the buffer (may be resized after re-allocations) */
75 | int bufpending; /* Data an interrupted previous write couldn't send */
76 | int bufoffset; /* Offset from where the interrupted previous write should resume */
77 | jamrtc_mutex mutex; /* Mutex to lock/unlock this instance */
78 | } jamrtc_ws_client;
79 | static jamrtc_ws_client *ws_client = NULL;
80 | static GAsyncQueue *messages = NULL; /* Queue of outgoing messages to push */
81 | static jamrtc_mutex writable_mutex;
82 | static GThread *ws_thread = NULL;
83 |
84 | static int jamrtc_ws_callback(struct lws *wsi, enum lws_callback_reasons reason, void *user, void *in, size_t len);
85 | static struct lws_protocols protocols[] = {
86 | { "janus-protocol", jamrtc_ws_callback, sizeof(jamrtc_ws_client), 0 },
87 | { NULL, NULL, 0, 0 }
88 | };
89 | static const struct lws_extension exts[] = {
90 | #ifndef LWS_WITHOUT_EXTENSIONS
91 | { "permessage-deflate", lws_extension_callback_pm_deflate, "permessage-deflate; client_max_window_bits" },
92 | { "deflate-frame", lws_extension_callback_pm_deflate, "deflate_frame" },
93 | #endif
94 | { NULL, NULL, NULL }
95 | };
96 |
97 | /* Janus API properties */
98 | static guint64 session_id = 0, room_id = 0, private_id = 0;
99 | static char *local_uuid = NULL;
100 | static const char *display_name = NULL;
101 | /* Janus API handles management */
102 | static guint64 *jamrtc_uint64_dup(guint64 num) {
103 | guint64 *numdup = g_malloc(sizeof(guint64));
104 | *numdup = num;
105 | return numdup;
106 | }
107 |
108 |
109 | /* JamRTC PeerConnection and related properties */
110 | typedef struct jamrtc_webrtc_pc {
111 | /* Whether it's a remote stream, or a local feed */
112 | gboolean remote;
113 | /* Slot assigned to this stream */
114 | guint slot;
115 | /* Janus handle ID */
116 | guint64 handle_id;
117 | /* Janus VideoRoom user ID */
118 | guint64 user_id;
119 | /* UUID of who's sending this stream */
120 | char *uuid;
121 | /* Display name of who's sending this stream */
122 | char *display;
123 | /* Instrument played in this stream, if any (it may be for the audio/video chat) */
124 | char *instrument;
125 | /* Signalling state */
126 | jamrtc_state state;
127 | /* GStreamer pipeline in use */
128 | GstElement *pipeline;
129 | /* Pointer to the peerConnection object in the pipeline */
130 | GstElement *peerconnection;
131 | /* Whether there's audio and/or video */
132 | gboolean audio, video;
133 | /*! Atomic flag to check if this instance has been destroyed */
134 | volatile gint destroyed;
135 | /* Reference count */
136 | jamrtc_refcount ref;
137 | } jamrtc_webrtc_pc;
138 | static void jamrtc_webrtc_pc_free(const jamrtc_refcount *pc_ref) {
139 | jamrtc_webrtc_pc *pc = jamrtc_refcount_containerof(pc_ref, jamrtc_webrtc_pc, ref);
140 | /* This instance can be destroyed, free all the resources */
141 | g_free(pc->uuid);
142 | g_free(pc->display);
143 | g_free(pc->instrument);
144 | if(pc->pipeline)
145 | gst_object_unref(pc->pipeline);
146 | g_free(pc);
147 | }
148 | static void jamrtc_webrtc_pc_destroy(jamrtc_webrtc_pc *pc) {
149 | if(pc == NULL)
150 | return;
151 | if(!g_atomic_int_compare_and_exchange(&pc->destroyed, 0, 1))
152 | return;
153 | /* Send a detach to Janus */
154 | /* TODO */
155 | /* Quit the PeerConnection loop */
156 | if(pc->pipeline)
157 | gst_element_set_state(GST_ELEMENT(pc->pipeline), GST_STATE_NULL);
158 | /* The PeerConnection will actually be destroyed when the counter gets to 0 */
159 | jamrtc_refcount_decrease(&pc->ref);
160 | }
161 | static void jamrtc_webrtc_pc_unref(jamrtc_webrtc_pc *pc) {
162 | if(pc == NULL)
163 | return;
164 | jamrtc_refcount_decrease(&pc->ref);
165 | }
166 | static jamrtc_webrtc_pc *jamrtc_webrtc_pc_new(const char *uuid, const char *display, gboolean remote, const char *instrument) {
167 | if(uuid == NULL || display == NULL)
168 | return NULL;
169 | jamrtc_webrtc_pc *pc = g_malloc0(sizeof(jamrtc_webrtc_pc));
170 | pc->remote = remote;
171 | pc->uuid = g_strdup(uuid);
172 | pc->display = g_strdup(display);
173 | if(instrument != NULL)
174 | pc->instrument = g_strdup(instrument);
175 | pc->state = JAMRTC_JANUS_SESSION_CREATED;
176 | jamrtc_refcount_init(&pc->ref, jamrtc_webrtc_pc_free);
177 | /* Done */
178 | return pc;
179 | }
180 | /* Media we may be publishing ourselves */
181 | static jamrtc_webrtc_pc *local_micwebcam = NULL;
182 | static jamrtc_webrtc_pc *local_instrument = NULL;
183 |
184 | /* JamRTC participants */
185 | typedef struct jamrtc_webrtc_participant {
186 | /* Janus VideoRoom user IDs */
187 | guint64 user_id, instrument_user_id;
188 | /* Unique UUID as advertised via signalling */
189 | char *uuid;
190 | /* Display name of the participant */
191 | char *display;
192 | /* Slot assigned to this participant */
193 | guint slot;
194 | /* Mic/webcam PeerConnection of this participant, if any */
195 | jamrtc_webrtc_pc *micwebcam;
196 | /* Instrument PeerConnection of this participant, if any */
197 | jamrtc_webrtc_pc *instrument;
198 | /* Atomic flag to check whether this instance has been destroyed */
199 | volatile gint destroyed;
200 | /* Reference count */
201 | jamrtc_refcount ref;
202 | } jamrtc_webrtc_participant;
203 | static void jamrtc_webrtc_participant_free(const jamrtc_refcount *participant_ref) {
204 | jamrtc_webrtc_participant *participant = jamrtc_refcount_containerof(participant_ref, jamrtc_webrtc_participant, ref);
205 | /* This instance can be destroyed, free all the resources */
206 | /* (containerof on a valid refcount pointer can never yield NULL) */
207 | g_free(participant->uuid);
208 | g_free(participant->display);
209 | g_free(participant);
210 | }
211 | static void jamrtc_webrtc_participant_destroy(jamrtc_webrtc_participant *participant) {
212 | if(participant == NULL)
213 | return;
214 | if(!g_atomic_int_compare_and_exchange(&participant->destroyed, 0, 1))
215 | return;
216 | /* The PeerConnection will actually be destroyed when the counter gets to 0 */
217 | jamrtc_refcount_decrease(&participant->ref);
218 | }
219 | static void jamrtc_webrtc_participant_unref(jamrtc_webrtc_participant *participant) {
220 | if(participant == NULL)
221 | return;
222 | jamrtc_refcount_decrease(&participant->ref);
223 | }
224 | /* Remote participants and PeerConnections */
225 | static GHashTable *participants = NULL;
226 | static GHashTable *participants_byid = NULL;
227 | static GHashTable *participants_byslot = NULL;
228 | static GHashTable *peerconnections = NULL;
229 | static jamrtc_mutex participants_mutex;
230 |
231 | /* Signalling methods and callbacks */
232 | static void jamrtc_connect_websockets(void);
233 | static void jamrtc_send_message(char *text);
234 | static gboolean jamrtc_create_session(void);
235 | static gboolean jamrtc_attach_handle(jamrtc_webrtc_pc *pc);
236 | static gboolean jamrtc_prepare_pipeline(jamrtc_webrtc_pc *pc, gboolean subscription, gboolean do_audio, gboolean do_video);
237 | static void jamrtc_negotiation_needed(GstElement *element, gpointer user_data);
238 | static void jamrtc_sdp_available(GstPromise *promise, gpointer user_data);
239 | static void jamrtc_trickle_candidate(GstElement *webrtc,
240 | guint mlineindex, char *candidate, gpointer user_data);
241 | static void jamrtc_incoming_stream(GstElement *webrtc, GstPad *pad, gpointer user_data);
242 | static void jamrtc_server_message(char *text);
243 | /* Transactions management */
244 | static GHashTable *transactions = NULL;
245 | static jamrtc_mutex transactions_mutex;
246 |
247 |
248 | /* Video rendering callbacks */
249 | typedef enum jamrtc_video_message_action {
250 | JAMRTC_ACTION_NONE = 0,
251 | JAMRTC_ACTION_ADD_PARTICIPANT,
252 | JAMRTC_ACTION_ADD_STREAM,
253 | JAMRTC_ACTION_REMOVE_STREAM,
254 | JAMRTC_ACTION_REMOVE_PARTICIPANT
255 | } jamrtc_video_message_action;
256 | typedef struct jamrtc_video_message {
257 | jamrtc_video_message_action action;
258 | gpointer resource;
259 | gboolean video;
260 | char *sink;
261 | } jamrtc_video_message;
262 | static jamrtc_video_message *jamrtc_video_message_create(jamrtc_video_message_action action,
263 | gpointer resource, gboolean video, const char *sink) {
264 | jamrtc_video_message *msg = g_malloc(sizeof(jamrtc_video_message));
265 | msg->action = action;
266 | msg->resource = resource;
267 | switch(action) {
268 | case JAMRTC_ACTION_ADD_PARTICIPANT:
269 | case JAMRTC_ACTION_REMOVE_PARTICIPANT: {
270 | jamrtc_webrtc_participant *participant = (jamrtc_webrtc_participant *)resource;
271 | if(participant != NULL)
272 | jamrtc_refcount_increase(&participant->ref);
273 | break;
274 | }
275 | case JAMRTC_ACTION_ADD_STREAM:
276 | case JAMRTC_ACTION_REMOVE_STREAM: {
277 | jamrtc_webrtc_pc *pc = (jamrtc_webrtc_pc *)resource;
278 | if(pc != NULL)
279 | jamrtc_refcount_increase(&pc->ref);
280 | break;
281 | }
282 | default: {
283 | g_free(msg);
284 | return NULL;
285 | }
286 | }
287 | msg->video = video;
288 | msg->sink = g_strdup(sink);
289 | return msg;
290 | }
291 | static void jamrtc_video_message_free(jamrtc_video_message *msg) {
292 | if(msg == NULL)
293 | return;
294 | g_free(msg->sink);
295 | switch(msg->action) {
296 | case JAMRTC_ACTION_ADD_PARTICIPANT:
297 | case JAMRTC_ACTION_REMOVE_PARTICIPANT: {
298 | jamrtc_webrtc_participant *participant = (jamrtc_webrtc_participant *)msg->resource;
299 | if(participant != NULL)
300 | jamrtc_refcount_decrease(&participant->ref);
301 | break;
302 | }
303 | case JAMRTC_ACTION_ADD_STREAM:
304 | case JAMRTC_ACTION_REMOVE_STREAM: {
305 | jamrtc_webrtc_pc *pc = (jamrtc_webrtc_pc *)msg->resource;
306 | if(pc != NULL)
307 | jamrtc_refcount_decrease(&pc->ref);
308 | break;
309 | }
310 | default:
311 | break;
312 | }
313 | g_free(msg);
314 | }
315 | static gboolean jamrtc_video_message_handle(gpointer user_data) {
316 | jamrtc_video_message *msg = (jamrtc_video_message *)user_data;
317 | if(msg == NULL)
318 | return G_SOURCE_REMOVE;
319 | if(msg->action == JAMRTC_ACTION_ADD_PARTICIPANT) {
320 | /* Initialize the labels in this slot */
321 | jamrtc_webrtc_participant *participant = (jamrtc_webrtc_participant *)msg->resource;
322 | guint slot = participant ? participant->slot : 1;
323 | char display_label[100];
324 | g_snprintf(display_label, sizeof(display_label), "user%u_name", slot);
325 | GtkLabel *display = GTK_LABEL(gtk_builder_get_object(builder, display_label));
326 | gtk_label_set_text(display, participant ? participant->display : local_micwebcam->display);
327 | g_snprintf(display_label, sizeof(display_label), "user%u_mic", slot);
328 | display = GTK_LABEL(gtk_builder_get_object(builder, display_label));
329 | gtk_label_set_text(display, "No microphone (chat)");
330 | g_snprintf(display_label, sizeof(display_label), "user%u_instrument", slot);
331 | display = GTK_LABEL(gtk_builder_get_object(builder, display_label));
332 | gtk_label_set_text(display, "No instrument");
333 | } else if(msg->action == JAMRTC_ACTION_ADD_STREAM) {
334 | /* We have a new stream to render, update the related label too */
335 | jamrtc_webrtc_pc *pc = (jamrtc_webrtc_pc *)msg->resource;
336 | if(!msg->video) {
337 | /* Update the mic/instrument label */
338 | char audio_label[100];
339 | g_snprintf(audio_label, sizeof(audio_label), "user%u_%s", pc->slot,
340 | pc->instrument ? "instrument" : "mic");
341 | GtkLabel *label = GTK_LABEL(gtk_builder_get_object(builder, audio_label));
342 | gtk_label_set_text(label, pc->instrument ? pc->instrument : "Microphone (chat)");
343 | /* Render the wavescope associated with the audio stream */
344 | char draw[100];
345 | g_snprintf(draw, sizeof(draw), "user%u_%sdraw", pc->slot,
346 | pc->instrument ? "instrument" : "mic");
347 | GtkWidget *widget = GTK_WIDGET(gtk_builder_get_object(builder, draw));
348 | gtk_widget_set_size_request(widget, 320, 100);
349 | GdkWindow *window = gtk_widget_get_window(widget);
350 | gulong xid = GDK_WINDOW_XID(window);
351 | GstElement *sink = gst_bin_get_by_name(GST_BIN(pc->pipeline), msg->sink);
352 | gst_video_overlay_set_window_handle(GST_VIDEO_OVERLAY(sink), xid);
353 | } else {
354 | /* Render this video stream */
355 | char draw[100];
356 | g_snprintf(draw, sizeof(draw), "user%u_videodraw", pc->slot);
357 | GtkWidget *widget = GTK_WIDGET(gtk_builder_get_object(builder, draw));
358 | gtk_widget_set_size_request(widget, 320, 180);
359 | GdkWindow *window = gtk_widget_get_window(widget);
360 | gulong xid = GDK_WINDOW_XID(window);
361 | GstElement *sink = gst_bin_get_by_name(GST_BIN(pc->pipeline), msg->sink);
362 | gst_video_overlay_set_window_handle(GST_VIDEO_OVERLAY(sink), xid);
363 | }
364 | } else if(msg->action == JAMRTC_ACTION_REMOVE_STREAM) {
365 | /* A stream we were rendering has gone away, update the related label too */
366 | jamrtc_webrtc_pc *pc = (jamrtc_webrtc_pc *)msg->resource;
367 | if(pc->slot < 1 || pc->slot > 4) {
368 | JAMRTC_LOG(LOG_WARN, "Invalid stream slot %u, ignoring\n", pc->slot);
369 | } else {
370 | if(pc->audio) {
371 | /* Update the mic/instrument label */
372 | char audio_label[100];
373 | g_snprintf(audio_label, sizeof(audio_label), "user%u_%s", pc->slot,
374 | pc->instrument ? "instrument" : "mic");
375 | GtkLabel *label = GTK_LABEL(gtk_builder_get_object(builder, audio_label));
376 | gtk_label_set_text(label, pc->instrument ? "No instrument" : "No microphone (chat)");
377 | /* Empty the draw element */
378 | char draw[100];
379 | g_snprintf(draw, sizeof(draw), "user%u_%sdraw", pc->slot,
380 | pc->instrument ? "instrument" : "mic");
381 | GtkWidget *widget = GTK_WIDGET(gtk_builder_get_object(builder, draw));
382 | gtk_widget_set_size_request(widget, 0, 0);
383 | }
384 | if(pc->video) {
385 | /* Empty the video draw element */
386 | char draw[100];
387 | g_snprintf(draw, sizeof(draw), "user%u_videodraw", pc->slot);
388 | GtkWidget *widget = GTK_WIDGET(gtk_builder_get_object(builder, draw));
389 | gtk_widget_set_size_request(widget, 0, 0);
390 | }
391 | }
392 | } else if(msg->action == JAMRTC_ACTION_REMOVE_PARTICIPANT) {
393 | /* Reset the labels in this slot */
394 | jamrtc_webrtc_participant *participant = (jamrtc_webrtc_participant *)msg->resource;
395 | if(participant && (participant->slot < 1 || participant->slot > 4)) {
396 | JAMRTC_LOG(LOG_WARN, "Invalid participant slot %u for %s, ignoring\n", participant->slot, participant->display);
397 | } else {
398 | guint slot = participant ? participant->slot : 1;
399 | char display_label[100];
400 | g_snprintf(display_label, sizeof(display_label), "user%u_name", slot);
401 | GtkLabel *display = GTK_LABEL(gtk_builder_get_object(builder, display_label));
402 | gtk_label_set_text(display, "");
403 | g_snprintf(display_label, sizeof(display_label), "user%u_mic", slot);
404 | display = GTK_LABEL(gtk_builder_get_object(builder, display_label));
405 | gtk_label_set_text(display, "");
406 | g_snprintf(display_label, sizeof(display_label), "user%u_instrument", slot);
407 | display = GTK_LABEL(gtk_builder_get_object(builder, display_label));
408 | gtk_label_set_text(display, "");
409 | /* Get rid of the draw elements too, if any */
410 | char draw[100];
411 | g_snprintf(draw, sizeof(draw), "user%u_videodraw", slot);
412 | GtkWidget *widget = GTK_WIDGET(gtk_builder_get_object(builder, draw));
413 | gtk_widget_set_size_request(widget, 0, 0);
414 | g_snprintf(draw, sizeof(draw), "user%u_micdraw", slot);
415 | widget = GTK_WIDGET(gtk_builder_get_object(builder, draw));
416 | gtk_widget_set_size_request(widget, 0, 0);
417 | g_snprintf(draw, sizeof(draw), "user%u_instrumentdraw", slot);
418 | widget = GTK_WIDGET(gtk_builder_get_object(builder, draw));
419 | gtk_widget_set_size_request(widget, 0, 0);
420 | }
421 | }
422 | jamrtc_video_message_free(msg);
423 | return G_SOURCE_REMOVE;
424 | }
425 |
426 | /* Janus stack initialization */
427 | int jamrtc_webrtc_init(const jamrtc_callbacks* callbacks, GtkBuilder *gtkbuilder, GMainLoop *mainloop,
428 | const char *ws, const char *stun, const char *turn, const char *src, guint jitter, gboolean disable_jack) {
429 | /* Validate the input */
430 | if(lws_parse_uri((char *)ws, &protocol, &address, &port, &path)) {
431 | JAMRTC_LOG(LOG_FATAL, "Invalid Janus WebSocket address\n");
432 | return -1;
433 | }
434 | if((strcasecmp(protocol, "ws") && strcasecmp(protocol, "wss")) || !strlen(address)) {
435 | JAMRTC_LOG(LOG_FATAL, "Invalid Janus WebSocket address (only ws:// and wss:// addresses are supported)\n");
436 | JAMRTC_LOG(LOG_FATAL, " -- Protocol: %s\n", protocol);
437 | JAMRTC_LOG(LOG_FATAL, " -- Address: %s\n", address);
438 | JAMRTC_LOG(LOG_FATAL, " -- Path: %s\n", path);
439 | return -1;
440 | }
441 |
442 | /* Take note of the settings */
443 | cb = callbacks;
444 | server_url = g_strdup(ws);
445 | builder = gtkbuilder;
446 | loop = mainloop;
447 | stun_server = stun;
448 | turn_server = turn;
449 | src_opts = src;
450 | latency = jitter;
451 | no_jack = disable_jack;
452 |
453 | /* Initialize hashtables and mutexes */
454 | participants = g_hash_table_new_full(g_str_hash, g_str_equal,
455 | (GDestroyNotify)g_free, (GDestroyNotify)jamrtc_webrtc_participant_destroy);
456 | participants_byid = g_hash_table_new_full(g_int64_hash, g_int64_equal,
457 | (GDestroyNotify)g_free, (GDestroyNotify)jamrtc_webrtc_participant_unref);
458 | participants_byslot = g_hash_table_new_full(NULL, NULL, NULL,
459 | (GDestroyNotify)jamrtc_webrtc_participant_unref);
460 | peerconnections = g_hash_table_new_full(g_int64_hash, g_int64_equal,
461 | (GDestroyNotify)g_free, (GDestroyNotify)jamrtc_webrtc_pc_unref);
462 | jamrtc_mutex_init(&participants_mutex);
463 | transactions = g_hash_table_new_full(g_str_hash, g_str_equal,
464 | (GDestroyNotify)g_free, NULL);
465 | jamrtc_mutex_init(&transactions_mutex);
466 |
467 | /* Connect to Janus */
468 | jamrtc_connect_websockets();
469 | return 0;
470 | }
471 |
472 | /* Janus stack cleanup */
473 | static gboolean jamrtc_webrtc_cleanup_internal(gpointer user_data) {
474 | /* Stop the client */
475 | g_atomic_int_set(&stopping, 1);
476 | if(ws_thread != NULL) {
477 | g_thread_join(ws_thread);
478 | ws_thread = NULL;
479 | }
480 | char *message = NULL;
481 | while((message = g_async_queue_try_pop(messages)) != NULL) {
482 | g_free(message);
483 | }
484 | g_async_queue_unref(messages);
485 |
486 | /* We're done */
487 | jamrtc_mutex_lock(&transactions_mutex);
488 | g_hash_table_destroy(transactions);
489 | transactions = NULL;
490 | jamrtc_mutex_unlock(&transactions_mutex);
491 | jamrtc_mutex_lock(&participants_mutex);
492 | g_hash_table_destroy(peerconnections);
493 | peerconnections = NULL;
494 | g_hash_table_destroy(participants);
495 | participants = NULL;
496 | g_hash_table_destroy(participants_byid);
497 | participants_byid = NULL;
498 | g_hash_table_destroy(participants_byslot);
499 | participants_byslot = NULL;
500 | jamrtc_webrtc_pc_destroy(local_micwebcam);
501 | local_micwebcam = NULL;
502 | jamrtc_webrtc_pc_destroy(local_instrument);
503 | local_instrument = NULL;
504 | jamrtc_mutex_unlock(&participants_mutex);
505 |
506 | /* Quit the main loop: this will eventually exit the application, when done */
507 | if(loop) {
508 | g_main_loop_quit(loop);
509 | loop = NULL;
510 | }
511 | return G_SOURCE_REMOVE;
512 | }
513 | void jamrtc_webrtc_cleanup(void) {
514 | /* Run this on the loop */
515 | GSource *timeout_source;
516 | timeout_source = g_timeout_source_new(0);
517 | g_source_set_callback(timeout_source, jamrtc_webrtc_cleanup_internal, NULL, NULL);
518 | g_source_attach(timeout_source, NULL);
519 | g_source_unref(timeout_source);
520 | }
521 | /* Join the room as a participant (but don't publish anything yet) */
522 | void jamrtc_join_room(guint64 id, const char *display) {
523 | /* Take note of the properties */
524 | room_id = id;
525 | display_name = display;
526 |
527 | /* Generate a random UUID */
528 | local_uuid = g_uuid_string_random();
529 | JAMRTC_LOG(LOG_INFO, "Generated UUID: %s\n", local_uuid);
530 |
531 | /* Create an instance for our participant */
532 | local_micwebcam = jamrtc_webrtc_pc_new(local_uuid, display, FALSE, NULL);
533 | local_micwebcam->slot = 1;
534 | /* Attach to the VideoRoom plugin: we'll join once we do that */
535 | jamrtc_attach_handle(local_micwebcam);
536 | }
537 |
538 | /* Publish mic/webcam for the chat part */
539 | static gboolean jamrtc_webrtc_publish_micwebcam_internal(gpointer user_data) {
540 | /* Create a GStreamer pipeline for the sendonly PeerConnection */
541 | jamrtc_prepare_pipeline(local_micwebcam, FALSE, !no_mic, !no_webcam);
542 | return G_SOURCE_REMOVE;
543 | }
544 | void jamrtc_webrtc_publish_micwebcam(gboolean ignore_mic, gboolean ignore_webcam, const char *device) {
545 | /* Take note of the properties */
546 | no_mic = ignore_mic;
547 | no_webcam = ignore_webcam;
548 | video_device = device;
549 |
550 | /* Run this on the loop */
551 | GSource *timeout_source;
552 | timeout_source = g_timeout_source_new(0);
553 | g_source_set_callback(timeout_source, jamrtc_webrtc_publish_micwebcam_internal, NULL, NULL);
554 | g_source_attach(timeout_source, NULL);
555 | g_source_unref(timeout_source);
556 | }
557 |
558 | /* Publish the instrument */
559 | static gboolean jamrtc_webrtc_publish_instrument_internal(gpointer user_data) {
560 | /* Attach to the VideoRoom plugin: we'll join once we do that */
561 | jamrtc_attach_handle(local_instrument);
562 | return G_SOURCE_REMOVE;
563 | }
564 | void jamrtc_webrtc_publish_instrument(const char *instrument, gboolean capture_stereo) {
565 | /* Take note of the properties */
566 | stereo = capture_stereo;
567 |
568 | /* Create an instance for our instrument */
569 | local_instrument = jamrtc_webrtc_pc_new(local_uuid, display_name, FALSE, instrument);
570 | local_instrument->slot = 1;
571 | /* Run this on the loop (maybe wait if we're also publishing mic/webcam) */
572 | GSource *timeout_source;
573 | //~ timeout_source = g_timeout_source_new_seconds((no_mic && no_webcam) ? 0 : 1);
574 | timeout_source = g_timeout_source_new(0);
575 | g_source_set_callback(timeout_source, jamrtc_webrtc_publish_instrument_internal, NULL, NULL);
576 | g_source_attach(timeout_source, NULL);
577 | g_source_unref(timeout_source);
578 | }
579 |
580 | /* Subscribe to a remote stream */
581 | static gboolean jamrtc_webrtc_subscribe_internal(gpointer user_data) {
582 | jamrtc_webrtc_pc *pc = (jamrtc_webrtc_pc *)user_data;
583 | if(pc == NULL)
584 | return G_SOURCE_REMOVE;
585 |
586 | /* Attach to the VideoRoom plugin: we'll subscribe once we do that */
587 | jamrtc_attach_handle(pc);
588 | /* Done, let's wait for the event */
589 | jamrtc_refcount_decrease(&pc->ref);
590 |
591 | return G_SOURCE_REMOVE;
592 | }
593 | int jamrtc_webrtc_subscribe(const char *uuid, gboolean instrument) {
594 | if(uuid == NULL)
595 | return -1;
596 | jamrtc_mutex_lock(&participants_mutex);
597 | /* Find the remote participant */
598 | jamrtc_webrtc_participant *participant = g_hash_table_lookup(participants, uuid);
599 | if(participant == NULL) {
600 | jamrtc_mutex_unlock(&participants_mutex);
601 | JAMRTC_LOG(LOG_ERR, "No such participant %s\n", uuid);
602 | return -2;
603 | }
604 | /* Access the stream we need */
605 | jamrtc_webrtc_pc *pc = (instrument ? participant->instrument : participant->micwebcam);
606 | if(pc == NULL) {
607 | jamrtc_mutex_unlock(&participants_mutex);
608 | JAMRTC_LOG(LOG_ERR, "No such stream from participant %s\n", uuid);
609 | return -3;
610 | }
611 | if(pc->pipeline != NULL) {
612 | jamrtc_mutex_unlock(&participants_mutex);
613 | JAMRTC_LOG(LOG_ERR, "[%s][%s] PeerConnection already available\n",
614 | pc->display, pc->instrument ? pc->instrument : "chat");
615 | return -4;
616 | }
617 | jamrtc_refcount_increase(&pc->ref);
618 | jamrtc_mutex_unlock(&participants_mutex);
619 |
620 | /* Run this on the loop (maybe wait if we're also publishing mic/webcam) */
621 | GSource *timeout_source;
622 | //~ timeout_source = g_timeout_source_new(pc->instrument ? 1000 : 500);
623 | timeout_source = g_timeout_source_new(0);
624 | g_source_set_callback(timeout_source, jamrtc_webrtc_subscribe_internal, pc, NULL);
625 | g_source_attach(timeout_source, NULL);
626 | g_source_unref(timeout_source);
627 |
628 | return 0;
629 | }
630 |
631 |
632 | /* Helper method to wrap up and close everything */
633 | static gboolean jamrtc_cleanup(const char *msg, enum jamrtc_state state) {
634 | if(msg != NULL)
635 | JAMRTC_LOG(LOG_ERR, "%s\n", msg);
636 | (void)state;	/* not used yet */
637 |
638 | /* Tear down the WebSocket connection: this will destroy the Janus session */
639 | /* TODO */
640 |
641 | /* Notify the application */
642 | cb->server_disconnected();
643 |
644 | /* To allow usage as a GSourceFunc */
645 | return G_SOURCE_REMOVE;
646 | }
647 |
648 | /* Helper method to generate a random transaction string */
649 | static char *jamrtc_random_transaction(char *transaction, size_t trlen) {
650 | g_snprintf(transaction, trlen, "%"PRIu32, g_random_int());
651 | return transaction;
652 | }
653 |
654 | /* Helper method to serialize a JsonObject to a string */
655 | static char *jamrtc_json_to_string(JsonObject *object) {
656 | /* Make it the root node */
657 | JsonNode *root = json_node_init_object(json_node_alloc(), object);
658 | JsonGenerator *generator = json_generator_new();
659 | json_generator_set_root(generator, root);
660 | char *text = json_generator_to_data(generator, NULL);
661 |
662 | /* Release everything */
663 | g_object_unref(generator);
664 | json_node_free(root);
665 | return text;
666 | }
667 |
668 | /* Helper method to attach to the VideoRoom plugin */
669 | static gboolean jamrtc_attach_handle(jamrtc_webrtc_pc *pc) {
670 | if(pc == NULL)
671 | return FALSE;
672 | /* Make sure we have a valid Janus session to use as well */
673 | if(session_id == 0)
674 | return FALSE;
675 |
676 | JAMRTC_LOG(LOG_INFO, "[%s][%s] Attaching to the VideoRoom plugin\n",
677 | pc->display, pc->instrument ? pc->instrument : "chat");
678 | pc->state = JAMRTC_JANUS_ATTACHING_PLUGIN;
679 |
680 | /* Prepare the Janus API request */
681 | JsonObject *attach = json_object_new();
682 | json_object_set_string_member(attach, "janus", "attach");
683 | json_object_set_int_member(attach, "session_id", session_id);
684 | json_object_set_string_member(attach, "plugin", "janus.plugin.videoroom");
685 | char transaction[12];
686 | json_object_set_string_member(attach, "transaction", jamrtc_random_transaction(transaction, sizeof(transaction)));
687 | char *text = jamrtc_json_to_string(attach);
688 | json_object_unref(attach);
689 |
690 | /* Track the task */
691 | jamrtc_mutex_lock(&transactions_mutex);
692 | g_hash_table_insert(transactions, g_strdup(transaction), pc);
693 | jamrtc_mutex_unlock(&transactions_mutex);
694 |
695 | /* Send the request: we'll get a response asynchronously */
696 | JAMRTC_LOG(LOG_VERB, "[%s][%s] Sending message: %s\n",
697 | pc->display, pc->instrument ? pc->instrument : "chat", text);
698 | jamrtc_send_message(text);
699 | return TRUE;
700 | }
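For reference, the attach request built above serializes to JSON of this shape (the `session_id` and `transaction` values are per-run examples):

```json
{
  "janus": "attach",
  "session_id": 1234567890,
  "plugin": "janus.plugin.videoroom",
  "transaction": "1804289383"
}
```

The transaction string is stored in the `transactions` hashtable so the asynchronous Janus response can be routed back to the right `jamrtc_webrtc_pc`.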
701 |
702 | /* Callback to be notified about state changes in the pipeline */
703 | static void jamrtc_pipeline_state_changed(GstBus *bus, GstMessage *msg, gpointer user_data) {
704 | jamrtc_webrtc_pc *pc = (jamrtc_webrtc_pc *)user_data;
705 | GstState old_state, new_state, pending_state;
706 | gst_message_parse_state_changed(msg, &old_state, &new_state, &pending_state);
707 | JAMRTC_LOG(LOG_WARN, "[%s][%s] Pipeline state changed: %s\n",
708 | pc->display, pc->instrument ? pc->instrument : "chat",
709 | gst_element_state_get_name(new_state));
710 | /* TODO Should we use this to refresh the UI? */
711 | }
712 |
713 | /* Helper method to setup the webrtcbin pipeline, and trigger the negotiation process */
714 | static volatile gint pc_index = 0;
715 | static gboolean jamrtc_prepare_pipeline(jamrtc_webrtc_pc *pc, gboolean subscription, gboolean do_audio, gboolean do_video) {
716 | if(pc == NULL)
717 | return FALSE;
718 | g_atomic_int_inc(&pc_index);
719 | char pc_name[16];	/* "pc" + up to 10 digits + NUL */
720 | g_snprintf(pc_name, sizeof(pc_name), "pc%d", g_atomic_int_get(&pc_index));
721 | /* Prepare the pipeline, using the info we got from the command line */
722 | char stun[255], turn[255], audio[1024], video[1024], gst_pipeline[2048];
723 | stun[0] = '\0';
724 | turn[0] = '\0';
725 | audio[0] = '\0';
726 | video[0] = '\0';
727 | if(stun_server != NULL)
728 | g_snprintf(stun, sizeof(stun), "stun-server=stun://%s", stun_server);
729 | if(turn_server != NULL)
730 | g_snprintf(turn, sizeof(turn), "turn-server=turn://%s", turn_server);
731 | /* The pipeline works differently depending on what the PeerConnection is for */
732 | if(!subscription) {
733 | /* We're publishing something: set up the capture nodes */
734 | if(pc == local_micwebcam) {
735 | /* We're trying to capture mic and/or webcam */
736 | if(do_audio) {
737 | guint32 audio_ssrc = g_random_int();
738 | if(!no_jack) {
739 | /* Use jackaudiosrc and name it */
740 | g_snprintf(audio, sizeof(audio), "jackaudiosrc %s connect=0 client-name=\"JamRTC mic\" ! audio/x-raw,channels=1 ! "
741 | "audioconvert ! audioresample ! audio/x-raw,channels=1,rate=48000 ! tee name=at ! "
742 | "queue ! audioconvert ! wavescope style=1 ! videoconvert ! xvimagesink name=\"ampreview\" "
743 | "at. ! queue ! opusenc bitrate=20000 ! "
744 | "rtpopuspay pt=111 ssrc=%"PRIu32" ! queue ! application/x-rtp,media=audio,encoding-name=OPUS,payload=111 ! %s.",
745 | src_opts, audio_ssrc, pc_name);
746 | } else {
747 | /* Use autoaudiosrc */
748 | g_snprintf(audio, sizeof(audio), "autoaudiosrc %s ! audio/x-raw,channels=1 ! "
749 | "audioconvert ! audioresample ! audio/x-raw,channels=1,rate=48000 ! tee name=at ! "
750 | "queue ! audioconvert ! wavescope style=1 ! videoconvert ! xvimagesink name=\"ampreview\" "
751 | "at. ! queue ! opusenc bitrate=20000 ! "
752 | "rtpopuspay pt=111 ssrc=%"PRIu32" ! queue ! application/x-rtp,media=audio,encoding-name=OPUS,payload=111 ! %s.",
753 | src_opts, audio_ssrc, pc_name);
754 | }
755 | }
756 | if(do_video) {
757 | guint32 video_ssrc = g_random_int();
758 | g_snprintf(video, sizeof(video), "v4l2src device=%s ! videoconvert ! videoscale ! "
759 | "video/x-raw,width=320,height=180 ! tee name=vt ! queue ! xvimagesink name=\"vpreview\" "
760 | "vt. ! queue ! vp8enc deadline=1 cpu-used=10 target-bitrate=128000 ! "
761 | "rtpvp8pay pt=96 ssrc=%"PRIu32" ! queue ! application/x-rtp,media=video,encoding-name=VP8,payload=96 ! %s.",
762 | video_device, video_ssrc, pc_name);
763 | }
764 | } else if(pc == local_instrument) {
765 | /* We're trying to capture an instrument */
766 | if(do_audio) {
767 | guint32 audio_ssrc = g_random_int();
768 | if(!no_jack) {
769 | /* Use jackaudiosrc and name it */
770 | g_snprintf(audio, sizeof(audio), "jackaudiosrc %s connect=0 client-name=\"JamRTC %s\" ! audio/x-raw,channels=%d ! "
771 | "audioconvert ! audioresample ! audio/x-raw,channels=%d,rate=48000 ! tee name=at ! "
772 | "queue ! audioconvert ! wavescope style=3 ! videoconvert ! xvimagesink name=\"aipreview\" "
773 | "at. ! queue ! opusenc bitrate=20000 ! "
774 | "rtpopuspay pt=111 ssrc=%"PRIu32" ! queue ! application/x-rtp,media=audio,encoding-name=OPUS,payload=111 ! %s.",
775 | src_opts, pc->instrument, stereo ? 2 : 1, stereo ? 2 : 1, audio_ssrc, pc_name);
776 | } else {
777 | /* Use autoaudiosrc */
778 | g_snprintf(audio, sizeof(audio), "autoaudiosrc %s ! audio/x-raw,channels=%d ! "
779 | "audioconvert ! audioresample ! audio/x-raw,channels=%d,rate=48000 ! tee name=at ! "
780 | "queue ! audioconvert ! wavescope style=3 ! videoconvert ! xvimagesink name=\"aipreview\" "
781 | "at. ! queue ! opusenc bitrate=20000 ! "
782 | "rtpopuspay pt=111 ssrc=%"PRIu32" ! queue ! application/x-rtp,media=audio,encoding-name=OPUS,payload=111 ! %s.",
783 | src_opts, stereo ? 2 : 1, stereo ? 2 : 1, audio_ssrc, pc_name);
784 | }
785 | }
786 | }
787 | /* Let's build the pipeline out of the elements we crafted above */
788 | g_snprintf(gst_pipeline, sizeof(gst_pipeline), "webrtcbin name=%s bundle-policy=%d %s %s %s %s",
789 | pc_name, (do_audio && do_video ? 3 : 0), stun, turn, video, audio);	/* 3 = max-bundle, 0 = none */
790 | JAMRTC_LOG(LOG_INFO, "[%s][%s] Initializing the GStreamer pipeline:\n -- %s\n",
791 | pc->display, pc->instrument ? pc->instrument : "chat", gst_pipeline);
792 | GError *error = NULL;
793 | pc->pipeline = gst_parse_launch(gst_pipeline, &error);
794 | if(error) {
795 | JAMRTC_LOG(LOG_ERR, "[%s][%s] Failed to parse/launch the pipeline: %s\n",
796 | pc->display, pc->instrument ? pc->instrument : "chat", error->message);
797 | g_error_free(error);
798 | goto err;
799 | }
800 | /* Get a pointer to the PeerConnection object */
801 | pc->peerconnection = gst_bin_get_by_name(GST_BIN(pc->pipeline), pc_name);
802 | /* Let's configure the function to be invoked when an SDP offer can be prepared */
803 | g_signal_connect(pc->peerconnection, "on-negotiation-needed", G_CALLBACK(jamrtc_negotiation_needed), pc);
804 | } else {
805 | /* Since this is a subscription, we just create the webrtcbin element for the moment */
806 | char pipe_name[100];
807 | g_snprintf(pipe_name, sizeof(pipe_name), "pipe-%s", pc_name);
808 | pc->pipeline = gst_pipeline_new(pipe_name);
809 | pc->peerconnection = gst_element_factory_make("webrtcbin", pc_name);
810 | g_object_set(pc->peerconnection, "bundle-policy", (do_audio && do_video ? 3 : 0), NULL);
811 | if(stun_server != NULL)
812 | g_object_set(pc->peerconnection, "stun-server", stun_server, NULL);
813 | if(turn_server != NULL)
814 | g_object_set(pc->peerconnection, "turn-server", turn_server, NULL);
815 | gst_bin_add_many(GST_BIN(pc->pipeline), pc->peerconnection, NULL);
816 | gst_element_sync_state_with_parent(pc->peerconnection);
817 | /* We'll handle incoming streams, and how to render them, dynamically */
818 | g_signal_connect(pc->peerconnection, "pad-added", G_CALLBACK(jamrtc_incoming_stream), pc);
819 | }
820 | pc->audio = do_audio;
821 | pc->video = do_video;
822 | /* We need a different callback to be notified about candidates to trickle to Janus */
823 | g_signal_connect(pc->peerconnection, "on-ice-candidate", G_CALLBACK(jamrtc_trickle_candidate), pc);
824 |
825 | /* For instruments, replace the jitter buffer size in rtpbin (it's 200ms by default, we definitely want less) */
826 | if(pc->instrument != NULL) {
827 | GstElement *rtpbin = gst_bin_get_by_name(GST_BIN(pc->peerconnection), "rtpbin");
828 | g_object_set(rtpbin,
829 | "latency", latency,
830 | "buffer-mode", 0,
831 | NULL);
832 | guint rtp_latency = 0;
833 | g_object_get(rtpbin, "latency", &rtp_latency, NULL);
834 | JAMRTC_LOG(LOG_INFO, "[%s][%s] Configured jitter-buffer size (latency) for PeerConnection to %ums\n",
835 | pc->display, pc->instrument ? pc->instrument : "chat", rtp_latency);
836 | }
837 |
838 | /* Embed the video elements in the UI */
839 | if(!subscription) {
840 | if(do_audio) {
841 | const char *sinkname = (pc == local_micwebcam ? "ampreview" : "aipreview");
842 | jamrtc_video_message *msg = jamrtc_video_message_create(JAMRTC_ACTION_ADD_STREAM,
843 | pc, FALSE, sinkname);
844 | g_main_context_invoke(NULL, jamrtc_video_message_handle, msg);
845 | }
846 | if(do_video) {
847 | const char *sinkname = "vpreview";
848 | jamrtc_video_message *msg = jamrtc_video_message_create(JAMRTC_ACTION_ADD_STREAM,
849 | pc, TRUE, sinkname);
850 | g_main_context_invoke(NULL, jamrtc_video_message_handle, msg);
851 | }
852 | /* Save updated pipeline to a dot file, in case we're debugging */
853 | char dot_name[100];
854 | g_snprintf(dot_name, sizeof(dot_name), "%s_%s",
855 | pc->display, pc->instrument ? pc->instrument : "mic");
856 | GST_DEBUG_BIN_TO_DOT_FILE(GST_BIN(pc->pipeline), GST_DEBUG_GRAPH_SHOW_ALL, dot_name);
857 | }
858 | //~ /* Monitor changes on the pipeline state */
859 | //~ GstBus *bus = gst_element_get_bus(pc->pipeline);
860 | //~ gst_bus_add_signal_watch(bus);
861 | //~ g_signal_connect(G_OBJECT(bus), "message::state-changed", (GCallback)jamrtc_pipeline_state_changed, pc);
862 | //~ gst_object_unref(bus);
863 | /* Start the pipeline */
864 | gst_element_set_state(pc->pipeline, GST_STATE_READY);
865 | gst_object_unref(pc->peerconnection);
866 | JAMRTC_LOG(LOG_INFO, "[%s][%s] Starting GStreamer pipeline\n",
867 | pc->display, pc->instrument ? pc->instrument : "chat");
868 | GstStateChangeReturn ret = gst_element_set_state(GST_ELEMENT(pc->pipeline), GST_STATE_PLAYING);
869 | if(ret == GST_STATE_CHANGE_FAILURE) {
870 | JAMRTC_LOG(LOG_ERR, "[%s][%s] Failed to start the pipeline%s\n",
871 | pc->display, pc->instrument ? pc->instrument : "chat",
872 | no_jack ? "" : " (is JACK running?)");
873 | goto err;
874 | }
875 |
876 | /* Done */
877 | return TRUE;
878 |
879 | err:
880 | /* If we got here, something went wrong */
881 | if(pc->pipeline)
882 | g_clear_object(&pc->pipeline);
883 | if(pc->peerconnection)
884 | pc->peerconnection = NULL;
885 | return FALSE;
886 | }
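Putting the format strings above together: for an audio-only mic/webcam publisher with JACK enabled (`do_audio` only, so `bundle-policy` stays 0), the launch description handed to `gst_parse_launch()` comes out roughly like this, shown with an example STUN server and SSRC, and wrapped here for readability (it is a single string in practice):

```
webrtcbin name=pc1 bundle-policy=0 stun-server=stun://stun.example.org:3478
  jackaudiosrc connect=0 client-name="JamRTC mic" ! audio/x-raw,channels=1 !
  audioconvert ! audioresample ! audio/x-raw,channels=1,rate=48000 ! tee name=at !
  queue ! audioconvert ! wavescope style=1 ! videoconvert ! xvimagesink name="ampreview"
  at. ! queue ! opusenc bitrate=20000 ! rtpopuspay pt=111 ssrc=1804289383 ! queue !
  application/x-rtp,media=audio,encoding-name=OPUS,payload=111 ! pc1.
```

The `tee` splits the captured audio in two: one branch feeds the local wavescope preview, the other is Opus-encoded and RTP-payloaded into the `webrtcbin` for the actual PeerConnection.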
887 |
888 | /* Callback invoked when we need to prepare an SDP offer */
889 | static void jamrtc_negotiation_needed(GstElement *element, gpointer user_data) {
890 | jamrtc_webrtc_pc *pc = (jamrtc_webrtc_pc *)user_data;
891 | if(pc == NULL) {
892 | JAMRTC_LOG(LOG_ERR, "Invalid PeerConnection object\n");
893 | return;
894 | }
895 | pc->state = JAMRTC_JANUS_SDP_PREPARED;
896 | /* Create the offer */
897 | GstPromise *promise = gst_promise_new_with_change_func(jamrtc_sdp_available, pc, NULL);
898 | g_signal_emit_by_name(pc->peerconnection, "create-offer", NULL, promise);
899 | }
900 |
901 | /* Callback invoked when we have an SDP offer or answer ready to be sent */
902 | static void jamrtc_sdp_available(GstPromise *promise, gpointer user_data) {
903 | jamrtc_webrtc_pc *pc = (jamrtc_webrtc_pc *)user_data;
904 | if(pc == NULL) {
905 | JAMRTC_LOG(LOG_ERR, "Invalid PeerConnection object\n");
906 | return;
907 | }
908 | /* Make sure we're in the right state */
909 | if(pc->state < JAMRTC_JANUS_SDP_PREPARED) {
910 | JAMRTC_LOG(LOG_WARN, "[%s][%s] Can't send offer/answer, not in a PeerConnection\n",
911 | pc->display, pc->instrument ? pc->instrument : "chat");
912 | return;
913 | }
914 | if(gst_promise_wait(promise) != GST_PROMISE_RESULT_REPLIED)
915 | return;
916 | const char *type = (pc == local_micwebcam || pc == local_instrument) ? "offer" : "answer";
917 | const GstStructure *reply = gst_promise_get_reply(promise);
918 | GstWebRTCSessionDescription *offeranswer = NULL;
919 | gst_structure_get(reply, type, GST_TYPE_WEBRTC_SESSION_DESCRIPTION, &offeranswer, NULL);
920 | gst_promise_unref(promise);
921 |
922 | /* Set the local description locally */
923 | promise = gst_promise_new();
924 | g_signal_emit_by_name(pc->peerconnection, "set-local-description", offeranswer, promise);
925 | gst_promise_interrupt(promise);
926 | gst_promise_unref(promise);
927 |
928 | /* Convert the SDP object to a string */
929 | char *text = gst_sdp_message_as_text(offeranswer->sdp);
930 | /* Fix the SDP: with max-bundle, GStreamer sets the port to 0 in bundled m-lines */
931 | const char *old_string = "m=audio 0";
932 | const char *new_string = "m=audio 9";
933 | char *pos = strstr(text, old_string), *tmp = NULL;
934 | int i = 0;
935 | while(pos) {
936 | i++;
937 | memcpy(pos, new_string, strlen(new_string));
938 | pos += strlen(old_string);
939 | tmp = strstr(pos, old_string);
940 | pos = tmp;
941 | }
942 | JAMRTC_LOG(LOG_INFO, "[%s][%s] Sending SDP %s\n",
943 | pc->display, pc->instrument ? pc->instrument : "chat",
944 | pc->remote ? "answer" : "offer");
945 | JAMRTC_LOG(LOG_VERB, "%s\n", text);
946 | /* Prepare the JSEP payload (offer or answer) */
947 | JsonObject *sdp = json_object_new();
948 | json_object_set_string_member(sdp, "type", pc->remote ? "answer" : "offer");
949 | json_object_set_string_member(sdp, "sdp", text);
950 | g_free(text);
951 |
952 | /* Send the SDP to Janus */
953 | if(pc == local_micwebcam || pc == local_instrument) {
954 | /* Prepare the request to the VideoRoom plugin: it's a "configure"
955 | * for mic/webcam, and "joinandconfigure" for instruments instead */
956 | JsonObject *req = json_object_new();
957 | json_object_set_string_member(req, "request", pc == local_micwebcam ? "configure" : "joinandconfigure");
958 | if(pc == local_instrument) {
959 | /* For instruments, we also use this request to join the room */
960 | JsonObject *info = json_object_new();
961 | json_object_set_string_member(info, "uuid", local_uuid);
962 | json_object_set_string_member(info, "display", pc->display);
963 | json_object_set_string_member(info, "instrument", pc->instrument);
964 | char *participant = jamrtc_json_to_string(info);
965 | json_object_unref(info);
966 | /* Join the room as a participant */
967 | json_object_set_string_member(req, "ptype", "publisher");
968 | json_object_set_int_member(req, "room", room_id);
969 | json_object_set_string_member(req, "display", participant);
970 | json_object_set_boolean_member(req, "audio", TRUE);
971 | json_object_set_boolean_member(req, "video", FALSE);
972 | g_free(participant);
973 | }
974 | /* Prepare the Janus API request to send the message to the plugin */
975 | JsonObject *msg = json_object_new();
976 | json_object_set_string_member(msg, "janus", "message");
977 | char transaction[12];
978 | json_object_set_string_member(msg, "transaction", jamrtc_random_transaction(transaction, sizeof(transaction)));
979 | json_object_set_int_member(msg, "session_id", session_id);
980 | json_object_set_int_member(msg, "handle_id", pc->handle_id);
981 | json_object_set_object_member(msg, "body", req);
982 | json_object_set_object_member(msg, "jsep", sdp);
983 | text = jamrtc_json_to_string(msg);
984 | json_object_unref(msg);
985 | /* Send the request via WebSockets */
986 | JAMRTC_LOG(LOG_VERB, "Sending message: %s\n", text);
987 | jamrtc_send_message(text);
988 | } else {
989 | /* Prepare the request to the VideoRoom plugin: it's just "start" */
990 | JsonObject *req = json_object_new();
991 | json_object_set_string_member(req, "request", "start");
992 | /* Prepare the Janus API request to send the message to the plugin */
993 | JsonObject *msg = json_object_new();
994 | json_object_set_string_member(msg, "janus", "message");
995 | char transaction[12];
996 | json_object_set_string_member(msg, "transaction", jamrtc_random_transaction(transaction, sizeof(transaction)));
997 | json_object_set_int_member(msg, "session_id", session_id);
998 | json_object_set_int_member(msg, "handle_id", pc->handle_id);
999 | json_object_set_object_member(msg, "body", req);
1000 | json_object_set_object_member(msg, "jsep", sdp);
1001 | text = jamrtc_json_to_string(msg);
1002 | json_object_unref(msg);
1003 | /* Send the request via WebSockets */
1004 | JAMRTC_LOG(LOG_VERB, "Sending message: %s\n", text);
1005 | jamrtc_send_message(text);
1006 | }
1007 | gst_webrtc_session_description_free(offeranswer);
1008 | }
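The in-place rewrite above (turning `m=audio 0` into `m=audio 9`) only works because both strings have the same length, so nothing in the buffer needs to shift. A standalone sketch of that technique (hypothetical helper, not part of this file):

```c
#include <string.h>

/* Hypothetical standalone version of the in-place rewrite used in
 * jamrtc_sdp_available(): replace every occurrence of "from" with "to",
 * which is only safe when both strings have the same length (otherwise
 * the rest of the buffer would need to be shifted). Returns the number
 * of replacements performed, or -1 on a length mismatch. */
static int sdp_replace_inplace(char *sdp, const char *from, const char *to) {
	size_t len = strlen(from);
	int count = 0;
	if(sdp == NULL || len == 0 || len != strlen(to))
		return -1;
	char *pos = strstr(sdp, from);
	while(pos != NULL) {
		memcpy(pos, to, len);
		count++;
		pos = strstr(pos + len, from);
	}
	return count;
}
```

A guard like the length check above is what makes skipping a reallocation acceptable here.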
1009 |
1010 | /* Callback invoked when a candidate to trickle becomes available */
1011 | static void jamrtc_trickle_candidate(GstElement *webrtc,
1012 | guint mlineindex, char *candidate, gpointer user_data) {
1013 | jamrtc_webrtc_pc *pc = (jamrtc_webrtc_pc *)user_data;
1014 | if(pc == NULL) {
1015 | JAMRTC_LOG(LOG_ERR, "Invalid PeerConnection object\n");
1016 | return;
1017 | }
1018 | if(mlineindex != 0)
1019 | return;
1020 | /* Make sure we're in the right state */
1021 | if(pc->state < JAMRTC_JANUS_SDP_PREPARED) {
1022 | JAMRTC_LOG(LOG_WARN, "[%s][%s] Can't trickle, not in a PeerConnection\n",
1023 | pc->display, pc->instrument ? pc->instrument : "chat");
1024 | return;
1025 | }
1026 |
1027 | /* Prepare the Janus API request */
1028 | JsonObject *trickle = json_object_new();
1029 | json_object_set_string_member(trickle, "janus", "trickle");
1030 | json_object_set_int_member(trickle, "session_id", session_id);
1031 | json_object_set_int_member(trickle, "handle_id", pc->handle_id);
1032 | char transaction[12];
1033 | json_object_set_string_member(trickle, "transaction", jamrtc_random_transaction(transaction, sizeof(transaction)));
1034 | JsonObject *ice = json_object_new();
1035 | json_object_set_string_member(ice, "candidate", candidate);
1036 | json_object_set_int_member(ice, "sdpMLineIndex", mlineindex);
1037 | json_object_set_object_member(trickle, "candidate", ice);
1038 | char *text = jamrtc_json_to_string(trickle);
1039 | json_object_unref(trickle);
1040 |
1041 | /* Send the request via WebSockets */
1042 | JAMRTC_LOG(LOG_VERB, "[%s][%s] Sending message: %s\n",
1043 | pc->display, pc->instrument ? pc->instrument : "chat", text);
1044 | jamrtc_send_message(text);
1045 | }
1046 |
1047 | /* Callbacks invoked when we have a stream from an existing subscription */
1048 | static void jamrtc_handle_media_stream(jamrtc_webrtc_pc *pc, GstPad *pad, gboolean video) {
1049 | GstElement *entry = gst_element_factory_make(video ? "queue" : "audioconvert", NULL);
1050 | GstElement *conv = gst_element_factory_make(video ? "videoconvert" : "audioconvert", NULL);
1051 | GstElement *sink = gst_element_factory_make(video ? "xvimagesink" :
1052 | (no_jack ? "autoaudiosink" : "jackaudiosink"), NULL);
1053 | if(!video) {
1054 | /* Create a queue to add after the first audioconvert */
1055 | GstElement *q = gst_element_factory_make("queue", NULL);
1056 | /* Name the jackaudiosink element */
1057 | if(!no_jack) {
1058 | /* Assign a name to the Jack node */
1059 | char name[100];
1060 | g_snprintf(name, sizeof(name), "%s's %s",
1061 | pc->display, pc->instrument ? pc->instrument : "mic");
1062 | g_object_set(sink, "client-name", name, NULL);
1063 | //~ g_object_set(sink, "connect", 0, NULL);
1064 | }
1065 | /* This is audio, add a resampler too and a wavescope visualizer */
1066 | GstElement *resample = gst_element_factory_make("audioresample", NULL);
1067 | GstElement *tee = gst_element_factory_make("tee", NULL);
1068 | GstElement *qa = gst_element_factory_make("queue", NULL);
1069 | GstElement *qv = gst_element_factory_make("queue", NULL);
1070 | GstElement *wav = gst_element_factory_make("wavescope", NULL);
1071 | g_object_set(wav, "style", pc->instrument ? 3 : 1, NULL);
1072 | GstElement *vconv = gst_element_factory_make("videoconvert", NULL);
1073 | GstElement *vsink = gst_element_factory_make("xvimagesink", NULL);
1074 | g_object_set(vsink, "name", pc->instrument ? "aiwave" : "amwave", NULL);
1075 | gst_bin_add_many(GST_BIN(pc->pipeline), entry, q, conv, resample, tee, qa, sink, qv, wav, vconv, vsink, NULL);
1076 | gst_element_sync_state_with_parent(entry);
1077 | gst_element_sync_state_with_parent(q);
1078 | gst_element_sync_state_with_parent(conv);
1079 | gst_element_sync_state_with_parent(resample);
1080 | gst_element_sync_state_with_parent(tee);
1081 | gst_element_sync_state_with_parent(qa);
1082 | gst_element_sync_state_with_parent(sink);
1083 | gst_element_sync_state_with_parent(qv);
1084 | gst_element_sync_state_with_parent(wav);
1085 | gst_element_sync_state_with_parent(vconv);
1086 | gst_element_sync_state_with_parent(vsink);
1087 | if(!gst_element_link_many(entry, q, tee, NULL)) {
1088 | JAMRTC_LOG(LOG_ERR, "[%s][%s] Error linking audio pad to tee...\n",
1089 | pc->display, pc->instrument ? pc->instrument : "chat");
1090 | }
1091 | if(!gst_element_link_many(qa, conv, resample, sink, NULL)) {
1092 | JAMRTC_LOG(LOG_ERR, "[%s][%s] Error linking audio to sink...\n",
1093 | pc->display, pc->instrument ? pc->instrument : "chat");
1094 | }
1095 | if(!gst_element_link_many(qv, wav, vconv, vsink, NULL)) {
1096 | JAMRTC_LOG(LOG_ERR, "[%s][%s] Error linking audio to visualizer...\n",
1097 | pc->display, pc->instrument ? pc->instrument : "chat");
1098 | }
1099 | g_object_set(sink, "sync", FALSE, NULL);
1100 | g_object_set(vsink, "sync", FALSE, NULL);
1101 | if(pc->slot != 0) {
1102 | /* Render the video */
1103 | jamrtc_video_message *msg = jamrtc_video_message_create(JAMRTC_ACTION_ADD_STREAM,
1104 | pc, FALSE, pc->instrument ? "aiwave" : "amwave");
1105 | g_main_context_invoke(NULL, jamrtc_video_message_handle, msg);
1106 | }
1107 | /* Manually connect the tee pads */
1108 | GstPad *tee_audio_pad = gst_element_get_request_pad(tee, "src_%u");
1109 | GstPad *queue_audio_pad = gst_element_get_static_pad(qa, "sink");
1110 | if(gst_pad_link(tee_audio_pad, queue_audio_pad) != GST_PAD_LINK_OK) {
1111 | JAMRTC_LOG(LOG_ERR, "[%s][%s] Error linking audio pads\n",
1112 | pc->display, pc->instrument ? pc->instrument : "chat");
1113 | }
1114 | GstPad *tee_video_pad = gst_element_get_request_pad(tee, "src_%u");
1115 | GstPad *queue_video_pad = gst_element_get_static_pad(qv, "sink");
1116 | if(gst_pad_link(tee_video_pad, queue_video_pad) != GST_PAD_LINK_OK) {
1117 | JAMRTC_LOG(LOG_ERR, "[%s][%s] Error linking visualizer pads\n",
1118 | pc->display, pc->instrument ? pc->instrument : "chat");
1119 | }
1120 | /* Save updated pipeline to a dot file, in case we're debugging */
1121 | char dot_name[100];
1122 | g_snprintf(dot_name, sizeof(dot_name), "%s_%s",
1123 | pc->display, pc->instrument ? pc->instrument : "mic");
1124 | GST_DEBUG_BIN_TO_DOT_FILE(GST_BIN(pc->pipeline), GST_DEBUG_GRAPH_SHOW_ALL, dot_name);
1125 | } else {
1126 | g_object_set(sink, "name", "video", NULL);
1127 | gst_bin_add_many(GST_BIN(pc->pipeline), entry, conv, sink, NULL);
1128 | gst_element_sync_state_with_parent(entry);
1129 | gst_element_sync_state_with_parent(conv);
1130 | gst_element_sync_state_with_parent(sink);
1131 | gst_element_link_many(entry, conv, sink, NULL);
1132 | g_object_set(sink, "sync", FALSE, NULL);
1133 | if(pc->slot != 0) {
1134 | /* Render the video */
1135 | jamrtc_video_message *msg = jamrtc_video_message_create(JAMRTC_ACTION_ADD_STREAM,
1136 | pc, TRUE, "video");
1137 | g_main_context_invoke(NULL, jamrtc_video_message_handle, msg);
1138 | }
1139 | }
1140 | /* Finally, let's connect the webrtcbin pad to our entry queue */
1141 | GstPad *entry_pad = gst_element_get_static_pad(entry, "sink");
1142 | GstPadLinkReturn ret = gst_pad_link(pad, entry_pad);
1143 | if(ret != GST_PAD_LINK_OK) {
1144 | JAMRTC_LOG(LOG_ERR, "[%s][%s] Error feeding %s (%d)...\n",
1145 | pc->display, pc->instrument ? pc->instrument : "chat",
1146 | GST_OBJECT_NAME(gst_element_get_factory(sink)), ret);
1147 | }
1148 | if(!video)
1149 | GST_DEBUG_BIN_TO_DOT_FILE(GST_BIN(pc->pipeline), GST_DEBUG_GRAPH_SHOW_ALL, "sink-3");
1150 | }
1151 | static void jamrtc_incoming_decodebin_stream(GstElement *decodebin, GstPad *pad, gpointer user_data) {
1152 | jamrtc_webrtc_pc *pc = (jamrtc_webrtc_pc *)user_data;
1153 | if(pc == NULL) {
1154 | JAMRTC_LOG(LOG_ERR, "Invalid PeerConnection object\n");
1155 | return;
1156 | }
1157 | /* The decodebin element has a new stream, render it */
1158 | if(!gst_pad_has_current_caps(pad)) {
1159 | JAMRTC_LOG(LOG_ERR, "[%s][%s] Pad '%s' has no caps, ignoring\n",
1160 | pc->display, pc->instrument ? pc->instrument : "chat", GST_PAD_NAME(pad));
1161 | return;
1162 | }
1163 | GstCaps *caps = gst_pad_get_current_caps(pad);
1164 | const char *name = gst_structure_get_name(gst_caps_get_structure(caps, 0));
1165 | if(g_str_has_prefix(name, "video")) {
1166 | jamrtc_handle_media_stream(pc, pad, TRUE);
1167 | } else if(g_str_has_prefix(name, "audio")) {
1168 | jamrtc_handle_media_stream(pc, pad, FALSE);
1169 | } else {
1170 | JAMRTC_LOG(LOG_ERR, "[%s][%s] Unknown pad %s, ignoring\n",
1171 | pc->display, pc->instrument ? pc->instrument : "chat", GST_PAD_NAME(pad));
1172 | }
1173 | }
1174 | static void jamrtc_incoming_stream(GstElement *webrtc, GstPad *pad, gpointer user_data) {
1175 | jamrtc_webrtc_pc *pc = (jamrtc_webrtc_pc *)user_data;
1176 | if(pc == NULL) {
1177 | JAMRTC_LOG(LOG_ERR, "Invalid PeerConnection object\n");
1178 | return;
1179 | }
1180 | /* Create an element to decode the stream */
1181 | JAMRTC_LOG(LOG_INFO, "[%s][%s] Creating decodebin element\n",
1182 | pc->display, pc->instrument ? pc->instrument : "chat");
1183 | GstElement *decodebin = gst_element_factory_make("decodebin", NULL);
1184 | g_signal_connect(decodebin, "pad-added", G_CALLBACK(jamrtc_incoming_decodebin_stream), pc);
1185 | gst_bin_add(GST_BIN(pc->pipeline), decodebin);
1186 | gst_element_sync_state_with_parent(decodebin);
1187 | GstPad *sinkpad = gst_element_get_static_pad(decodebin, "sink");
1188 | gst_pad_link(pad, sinkpad);
1189 | gst_object_unref(sinkpad);
1190 | }
1191 |
1192 | /* Helper method to send a keep-alive */
1193 | static gboolean jamrtc_send_keepalive(gpointer user_data) {
1194 | if(session_id == 0)
1195 | return FALSE;
1196 |
1197 | /* Prepare the Janus API request */
1198 | JsonObject *keepalive = json_object_new();
1199 | json_object_set_string_member(keepalive, "janus", "keepalive");
1200 | json_object_set_int_member(keepalive, "session_id", session_id);
1201 | char transaction[12];
1202 | json_object_set_string_member(keepalive, "transaction", jamrtc_random_transaction(transaction, sizeof(transaction)));
1203 | char *text = jamrtc_json_to_string(keepalive);
1204 | json_object_unref(keepalive);
1205 |
1206 | /* Send the request via WebSockets */
1207 | JAMRTC_LOG(LOG_VERB, "Sending message: %s\n", text);
1208 | jamrtc_send_message(text);
1209 |
1210 | return TRUE;
1211 | }
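Every Janus API request above tags itself with a short random "transaction" string generated into a 12-byte buffer. A hedged sketch of what such a helper plausibly looks like (a hypothetical stand-in; the real `jamrtc_random_transaction` lives elsewhere in this file):

```c
#include <stdlib.h>

/* Hypothetical sketch of a jamrtc_random_transaction-style helper: fill
 * the buffer with a random alphanumeric string, NUL-terminated, and
 * return the buffer so it can be used inline as a Janus "transaction"
 * identifier when matching asynchronous responses to requests. */
static char *random_transaction(char *buf, size_t buflen) {
	static const char charset[] =
		"abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789";
	size_t i = 0;
	if(buf == NULL || buflen == 0)
		return NULL;
	for(i = 0; i < buflen - 1; i++)
		buf[i] = charset[rand() % (int)(sizeof(charset) - 1)];
	buf[buflen - 1] = '\0';
	return buf;
}
```

With a 12-byte buffer, as in the calls above, this yields an 11-character identifier plus the terminator.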
1212 |
1213 | /* Thread to implement the WebSockets loop */
1214 | static gpointer jamrtc_ws_thread(gpointer data) {
1215 | JAMRTC_LOG(LOG_VERB, "Joining Janus WebSocket client thread\n");
1216 | while(!g_atomic_int_get(&stopping)) {
1217 | /* Loop until we have to stop */
1218 | lws_service(context, 50);
1219 | }
1220 | JAMRTC_LOG(LOG_VERB, "Leaving Janus WebSocket client thread\n");
1221 | return NULL;
1222 | }
1223 | /* Helper method to connect to the remote Janus backend via WebSockets */
1224 | static void jamrtc_connect_websockets(void) {
1225 | /* Connect */
1226 | JAMRTC_LOG(LOG_INFO, "Connecting to Janus: %s\n", server_url);
1227 | gboolean secure = !strcasecmp(protocol, "wss");
1228 | state = JAMRTC_JANUS_CONNECTING;
1229 |
1230 | struct lws_context_creation_info info = { 0 };
1231 | info.port = CONTEXT_PORT_NO_LISTEN;
1232 | info.protocols = protocols;
1233 | info.gid = -1;
1234 | info.uid = -1;
1235 | if(secure)
1236 | info.options |= LWS_SERVER_OPTION_DO_SSL_GLOBAL_INIT;
1237 | context = lws_create_context(&info);
1238 | if(context == NULL) {
1239 | jamrtc_cleanup("Creating libwebsocket context failed", JAMRTC_JANUS_CONNECTION_ERROR);
1240 | return;
1241 | }
1242 | struct lws_client_connect_info i = { 0 };
1243 | i.host = address;
1244 | i.origin = address;
1245 | i.address = address;
1246 | i.port = port;
1247 | char wspath[256];
1248 | g_snprintf(wspath, sizeof(wspath), "/%s", path);
1249 | i.path = wspath;
1250 | i.context = context;
1251 | if(secure)
1252 | i.ssl_connection = 1;
1253 | i.ietf_version_or_minus_one = -1;
1254 | i.client_exts = exts;
1255 | i.protocol = protocols[0].name;
1256 | wsi = lws_client_connect_via_info(&i);
1257 | if(wsi == NULL) {
1258 | jamrtc_cleanup("Error initializing WebSocket connection", JAMRTC_JANUS_CONNECTION_ERROR);
1259 | return;
1260 | }
1261 | jamrtc_mutex_init(&writable_mutex);
1262 |
1263 | /* Initialize the message queue */
1264 | messages = g_async_queue_new();
1265 |
1266 | /* Start a thread to handle the WebSockets event loop */
1267 | GError *error = NULL;
1268 | ws_thread = g_thread_try_new("jamrtc ws", jamrtc_ws_thread, NULL, &error);
1269 | if(error != NULL) {
1270 | JAMRTC_LOG(LOG_FATAL, "Got error %d (%s) trying to launch the Janus WebSocket client thread...\n",
1271 | error->code, error->message ? error->message : "??");
1272 | jamrtc_cleanup("Thread error", JAMRTC_JANUS_CONNECTION_ERROR);
1273 | g_error_free(error);
1274 | return;
1275 | }
1276 | }
1277 |
1278 | /* Helper to send a message via WebSockets */
1279 | void jamrtc_send_message(char *text) {
1280 | g_async_queue_push(messages, text);
1281 | #if (LWS_LIBRARY_VERSION_MAJOR >= 3)
1282 | if(context != NULL)
1283 | lws_cancel_service(context);
1284 | #else
1285 | /* On libwebsockets < 3.x we use lws_callback_on_writable */
1286 | jamrtc_mutex_lock(&writable_mutex);
1287 | if(wsi != NULL)
1288 | lws_callback_on_writable(wsi);
1289 | jamrtc_mutex_unlock(&writable_mutex);
1290 | #endif
1291 | }
1292 |
1293 | /* Helper method to create a new Janus session */
1294 | static gboolean jamrtc_create_session(void) {
1295 | JAMRTC_LOG(LOG_INFO, "Creating a new Janus session\n");
1296 | state = JAMRTC_JANUS_CREATING_SESSION;
1297 |
1298 | /* Prepare the Janus API request */
1299 | JsonObject *create = json_object_new();
1300 | json_object_set_string_member(create, "janus", "create");
1301 | char transaction[12];
1302 | json_object_set_string_member(create, "transaction", jamrtc_random_transaction(transaction, sizeof(transaction)));
1303 | char *text = jamrtc_json_to_string(create);
1304 | json_object_unref(create);
1305 |
1306 | /* Send the request, we'll get a response asynchronously */
1307 | JAMRTC_LOG(LOG_VERB, "Sending message: %s\n", text);
1308 | jamrtc_send_message(text);
1309 |
1310 | return TRUE;
1311 | }
1312 |
1313 | /* Helper method to parse an attendee/publisher, in order to
1314 | * figure out if there's a new participant or stream available */
1315 | static void jamrtc_parse_participant(JsonObject *p, gboolean publisher) {
1316 | if(p == NULL)
1317 | return;
1318 | guint64 user_id = json_object_get_int_member(p, "id");
1319 | if((local_micwebcam && local_micwebcam->user_id == user_id) ||
1320 | (local_instrument && local_instrument->user_id == user_id)) {
1321 | /* This is us, ignore */
1322 | return;
1323 | }
1324 | /* The display property is actually supposed to be stringified JSON, so parse it */
1325 | const char *display_json = json_object_get_string_member(p, "display");
1326 | const char *uuid = NULL, *display = display_json, *instrument = NULL;
1327 | JsonParser *display_parser = json_parser_new();
1328 | if(json_parser_load_from_data(display_parser, display_json, -1, NULL)) {
1329 | JsonNode *display_root = json_parser_get_root(display_parser);
1330 | if(JSON_NODE_HOLDS_OBJECT(display_root)) {
1331 | JsonObject *display_object = json_node_get_object(display_root);
1332 | if(json_object_has_member(display_object, "uuid"))
1333 | uuid = json_object_get_string_member(display_object, "uuid");
1334 | if(json_object_has_member(display_object, "display"))
1335 | display = json_object_get_string_member(display_object, "display");
1336 | if(json_object_has_member(display_object, "instrument"))
1337 | instrument = json_object_get_string_member(display_object, "instrument");
1338 | }
1339 | }
1340 | if(uuid != NULL && !strcasecmp(uuid, local_uuid)) {
1341 | /* This is us, ignore */
1342 | if(display_parser != NULL)
1343 | g_object_unref(display_parser);
1344 | return;
1345 | }
1346 | gboolean has_audio = json_object_has_member(p, "audio_codec");
1347 | gboolean has_video = json_object_has_member(p, "video_codec");
1348 | gboolean new_participant = FALSE;
1349 | jamrtc_mutex_lock(&participants_mutex);
1350 | jamrtc_webrtc_participant *participant = uuid ? g_hash_table_lookup(participants, uuid) : NULL;
1351 | if(participant == NULL)
1352 | participant = g_hash_table_lookup(participants_byid, &user_id);
1353 | if(participant == NULL) {
1354 | /* Create a new participant instance */
1355 | new_participant = TRUE;
1356 | participant = g_malloc0(sizeof(jamrtc_webrtc_participant));
1357 | participant->uuid = uuid ? g_strdup(uuid) : g_uuid_string_random();
1358 | participant->display = g_strdup(display);
1359 | jamrtc_refcount_init(&participant->ref, jamrtc_webrtc_participant_free);
1360 | /* Find a slot for this participant */
1361 | guint slot = 2;
1362 | for(slot=2; slot <=4; slot++) {
1363 | if(g_hash_table_lookup(participants_byslot, GUINT_TO_POINTER(slot)) == NULL) {
1364 | /* Found */
1365 | participant->slot = slot;
1366 | g_hash_table_insert(participants_byslot, GUINT_TO_POINTER(slot), participant);
1367 | jamrtc_refcount_increase(&participant->ref);
1368 | break;
1369 | }
1370 | }
1371 | if(participant->slot == 0) {
1372 | JAMRTC_LOG(LOG_WARN, "No slot available for this participant, they won't be rendered in the UI\n");
1373 | } else {
1374 | /* Update the UI */
1375 | jamrtc_video_message *msg = jamrtc_video_message_create(JAMRTC_ACTION_ADD_PARTICIPANT,
1376 | participant, FALSE, NULL);
1377 | g_main_context_invoke(NULL, jamrtc_video_message_handle, msg);
1378 | }
1379 | /* Insert into the hashtables */
1380 | g_hash_table_insert(participants, g_strdup(participant->uuid), participant);
1381 | jamrtc_refcount_increase(&participant->ref);
1382 | }
1383 | if(instrument != NULL) {
1384 | participant->instrument_user_id = user_id;
1385 | if(participant->instrument == NULL)
1386 | participant->instrument = jamrtc_webrtc_pc_new(participant->uuid, display, TRUE, instrument);
1387 | participant->instrument->user_id = participant->instrument_user_id;
1388 | participant->instrument->slot = participant->slot;
1389 | if(publisher) {
1390 | participant->instrument->audio = has_audio;
1391 | participant->instrument->video = has_video;
1392 | }
1393 | } else {
1394 | participant->user_id = user_id;
1395 | if(participant->micwebcam == NULL)
1396 | participant->micwebcam = jamrtc_webrtc_pc_new(participant->uuid, display, TRUE, NULL);
1397 | participant->micwebcam->user_id = participant->user_id;
1398 | participant->micwebcam->slot = participant->slot;
1399 | if(publisher) {
1400 | participant->micwebcam->audio = has_audio;
1401 | participant->micwebcam->video = has_video;
1402 | }
1403 | }
1404 | if(g_hash_table_lookup(participants_byid, &user_id) == NULL) {
1405 | jamrtc_refcount_increase(&participant->ref);
1406 | g_hash_table_insert(participants_byid, jamrtc_uint64_dup(user_id), participant);
1407 | }
1408 | jamrtc_mutex_unlock(&participants_mutex);
1409 | /* Notify the application, if needed */
1410 | if(new_participant)
1411 | cb->participant_joined(participant->uuid, display);
1412 | if(publisher)
1413 | cb->stream_started(participant->uuid, display, instrument, has_audio, has_video);
1414 | if(display_parser != NULL)
1415 | g_object_unref(display_parser);
1416 | }
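The slot search above walks slots 2..4 (slot 1 being the local user) and leaves `participant->slot` at 0 when every position is taken, in which case the participant isn't rendered. The same logic as a standalone sketch (hypothetical, array-based instead of the hash-table lookup used above):

```c
/* Hypothetical array-based version of the slot lookup in
 * jamrtc_parse_participant(): slots 2..4 are for remote participants
 * (slot 1 is the local user); return the first free one, or 0 if all
 * are taken, meaning the participant won't be rendered in the UI. */
static unsigned int find_free_slot(const int occupied[5]) {
	unsigned int slot = 0;
	for(slot = 2; slot <= 4; slot++) {
		if(!occupied[slot])
			return slot;
	}
	return 0;
}
```

Returning 0 as the "no slot" sentinel matches the zero-initialized `participant->slot` from `g_malloc0()` above.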
1417 | /* Helper method to parse a list of attendees/publishers, in order to
1418 | * figure out if there's a new participant or stream available */
1419 | static void jamrtc_parse_participants(JsonArray *list, gboolean publishers) {
1420 | guint len = json_array_get_length(list), i = 0;
1421 | JsonNode *node = NULL;
1422 | JsonObject *p = NULL;
1423 | for(i=0; i<len; i++) {
1496 | cb->server_connected();
1497 | goto done;
1498 | }
1499 | /* Did we receive a response to something we needed? */
1500 | if(pc == NULL)
1501 | goto done;
1502 | /* This is related to a (new?) PeerConnection */
1503 | if(pc->handle_id == 0) {
1504 | /* We don't have a handle ID yet, so this must be a response to our "attach" */
1505 | JsonObject *child = json_object_get_object_member(object, "data");
1506 | if(!json_object_has_member(child, "id")) {
1507 | JAMRTC_LOG(LOG_ERR, "[%s][%s] No handle ID in response to our attach\n",
1508 | pc->display, pc->instrument ? pc->instrument : "chat");
1509 | pc->state = JAMRTC_JANUS_API_ERROR;
1510 | goto done;
1511 | }
1512 | pc->handle_id = json_object_get_int_member(child, "id");
1513 | pc->state = JAMRTC_JANUS_HANDLE_ATTACHED;
1514 | JAMRTC_LOG(LOG_INFO, "[%s][%s] -- Handle attached: %"SCNu64"\n",
1515 | pc->display, pc->instrument ? pc->instrument : "chat", pc->handle_id);
1516 | jamrtc_mutex_lock(&participants_mutex);
1517 | jamrtc_refcount_increase(&pc->ref);
1518 | g_hash_table_insert(peerconnections, jamrtc_uint64_dup(pc->handle_id), pc);
1519 | jamrtc_mutex_unlock(&participants_mutex);
1520 | /* Check if we should automatically do something */
1521 | if(pc == local_micwebcam) {
1522 | /* We use a stringified JSON object as our display, to carry more info */
1523 | JsonObject *info = json_object_new();
1524 | json_object_set_string_member(info, "uuid", local_uuid);
1525 | json_object_set_string_member(info, "display", pc->display);
1526 | char *participant = jamrtc_json_to_string(info);
1527 | json_object_unref(info);
1528 | /* Join the room as a participant */
1529 | JsonObject *req = json_object_new();
1530 | json_object_set_string_member(req, "request", "join");
1531 | json_object_set_string_member(req, "ptype", "publisher");
1532 | json_object_set_int_member(req, "room", room_id);
1533 | json_object_set_string_member(req, "display", participant);
1534 | json_object_set_boolean_member(req, "audio", !no_mic);
1535 | json_object_set_boolean_member(req, "video", !no_webcam);
1536 | /* Prepare the Janus API request to send the message to the plugin */
1537 | JsonObject *msg = json_object_new();
1538 | json_object_set_string_member(msg, "janus", "message");
1539 | char tr[12];
1540 | json_object_set_string_member(msg, "transaction", jamrtc_random_transaction(tr, sizeof(tr)));
1541 | json_object_set_int_member(msg, "session_id", session_id);
1542 | json_object_set_int_member(msg, "handle_id", pc->handle_id);
1543 | json_object_set_object_member(msg, "body", req);
1544 | char *text = jamrtc_json_to_string(msg);
1545 | json_object_unref(msg);
1546 | g_free(participant);
1547 | /* Track the task */
1548 | jamrtc_mutex_lock(&transactions_mutex);
1549 | g_hash_table_insert(transactions, g_strdup(tr), pc);
1550 | jamrtc_mutex_unlock(&transactions_mutex);
1551 | /* Send the request via WebSockets */
1552 | JAMRTC_LOG(LOG_VERB, "Sending message: %s\n", text);
1553 | jamrtc_send_message(text);
1554 | } else if(pc == local_instrument) {
1555 | /* Create a GStreamer pipeline for the sendonly PeerConnection */
1556 | jamrtc_prepare_pipeline(pc, FALSE, TRUE, FALSE);
1557 | } else {
1558 | /* Create a GStreamer pipeline for the recvonly subscription */
1559 | if(jamrtc_prepare_pipeline(pc, TRUE, pc->audio, pc->video)) {
1560 | /* Now send a "join" request to subscribe to this stream */
1561 | JsonObject *req = json_object_new();
1562 | json_object_set_string_member(req, "request", "join");
1563 | json_object_set_string_member(req, "ptype", "subscriber");
1564 | json_object_set_int_member(req, "room", room_id);
1565 | json_object_set_int_member(req, "feed", pc->user_id);
1566 | json_object_set_int_member(req, "private_id", private_id);
1567 | /* Prepare the Janus API request to send the message to the plugin */
1568 | JsonObject *msg = json_object_new();
1569 | json_object_set_string_member(msg, "janus", "message");
1570 | char tr[12];
1571 | json_object_set_string_member(msg, "transaction", jamrtc_random_transaction(tr, sizeof(tr)));
1572 | json_object_set_int_member(msg, "session_id", session_id);
1573 | json_object_set_int_member(msg, "handle_id", pc->handle_id);
1574 | json_object_set_object_member(msg, "body", req);
1575 | char *text = jamrtc_json_to_string(msg);
1576 | json_object_unref(msg);
1577 | /* Send the request via WebSockets */
1578 | JAMRTC_LOG(LOG_VERB, "Sending message: %s\n", text);
1579 | jamrtc_send_message(text);
1580 | }
1581 | }
1582 | } else if(json_object_has_member(object, "jsep")) {
1583 | /* This message contains a JSEP SDP, which means it must be an offer or answer from Janus */
1584 | JsonObject *child = json_object_get_object_member(object, "jsep");
1585 | if(!json_object_has_member(child, "type")) {
1586 | JAMRTC_LOG(LOG_ERR, "[%s][%s] Received SDP without 'type'\n",
1587 | pc->display, pc->instrument ? pc->instrument : "chat");
1588 | pc->state = JAMRTC_JANUS_API_ERROR;
1589 | goto done;
1590 | }
1591 | gboolean offer = FALSE;
1592 | const char *sdptype = json_object_get_string_member(child, "type");
1593 | offer = !strcasecmp(sdptype, "offer");
1594 | const char *text = json_object_get_string_member(child, "sdp");
1595 | JAMRTC_LOG(LOG_INFO, "[%s][%s] -- Received SDP %s\n",
1596 | pc->display, pc->instrument ? pc->instrument : "chat", sdptype);
1597 | JAMRTC_LOG(LOG_VERB, "%s\n", text);
1598 |
1599 | /* Check if there are any candidates in the SDP: if so, we'll need to fake trickles */
1600 | if(strstr(text, "candidate") != NULL) {
1601 | int mlines = 0, i = 0;
1602 | gchar **lines = g_strsplit(text, "\r\n", -1);
1603 | gchar *line = NULL;
1604 | while(lines[i] != NULL) {
1605 | line = lines[i];
1606 | if(strstr(line, "m=") == line) {
1607 | /* New m-line */
1608 | mlines++;
1609 | if(mlines > 1) /* We only need candidates from the first one */
1610 | break;
1611 | } else if(mlines == 1 && strstr(line, "a=candidate") != NULL) {
1612 | /* Found a candidate, fake a trickle */
1613 | line += 2;
1614 | JAMRTC_LOG(LOG_VERB, "[%s][%s] -- Found candidate: %s\n",
1615 | pc->display, pc->instrument ? pc->instrument : "chat", line);
1616 | g_signal_emit_by_name(pc->peerconnection, "add-ice-candidate", 0, line);
1617 | }
1618 | i++;
1619 | }
1620 | g_clear_pointer(&lines, g_strfreev);
1621 | }
1622 |
1623 | /* Convert the SDP to something webrtcbin can digest */
1624 | GstSDPMessage *sdp = NULL;
1625 | int ret = gst_sdp_message_new(&sdp);
1626 | if(ret != GST_SDP_OK) {
1627 | JAMRTC_LOG(LOG_ERR, "[%s][%s] Error initializing SDP object (%d)\n",
1628 | pc->display, pc->instrument ? pc->instrument : "chat", ret);
1629 | goto done;
1630 | }
1631 | ret = gst_sdp_message_parse_buffer((guint8 *)text, strlen(text), sdp);
1632 | if(ret != GST_SDP_OK) {
1633 | JAMRTC_LOG(LOG_ERR, "[%s][%s] Error parsing SDP buffer (%d)\n", pc->display, pc->instrument ? pc->instrument : "chat", ret);
1634 | gst_sdp_message_free(sdp);
1635 | goto done;
1636 | }
1637 | GstWebRTCSessionDescription *gst_sdp = gst_webrtc_session_description_new(
1638 | offer ? GST_WEBRTC_SDP_TYPE_OFFER : GST_WEBRTC_SDP_TYPE_ANSWER, sdp);
1639 |
1640 | /* Set remote description on our pipeline */
1641 | GstPromise *promise = gst_promise_new();
1642 | g_signal_emit_by_name(pc->peerconnection, "set-remote-description", gst_sdp, promise);
1643 | gst_promise_interrupt(promise);
1644 | gst_promise_unref(promise);
1645 |
1646 | pc->state = offer ? JAMRTC_JANUS_SDP_PREPARED : JAMRTC_JANUS_STARTED;
1647 | if(offer && pc->remote) {
1648 | /* We need to prepare an SDP answer */
1649 | promise = gst_promise_new_with_change_func(jamrtc_sdp_available, pc, NULL);
1650 | g_signal_emit_by_name(pc->peerconnection, "create-answer", NULL, promise);
1651 | //~ gst_promise_interrupt(promise);
1652 | //~ gst_promise_unref(promise);
1653 | }
1654 | } else if(json_object_has_member(object, "candidate")) {
1655 | /* This is a trickle candidate */
1656 | const char *candidate = NULL;
1657 | gint sdpmlineindex = 0;
1658 | /* Parse the candidate info */
1659 | JsonObject *child = json_object_get_object_member(object, "candidate");
1660 | if(child != NULL && json_object_has_member(child, "candidate")) {
1661 | candidate = json_object_get_string_member(child, "candidate");
1662 | sdpmlineindex = json_object_get_int_member(child, "sdpMLineIndex");
1663 | JAMRTC_LOG(LOG_INFO, "[%s][%s] Received trickle candidate\n",
1664 | pc->display, pc->instrument ? pc->instrument : "chat");
1665 | JAMRTC_LOG(LOG_VERB, "%s(%d)\n", candidate, sdpmlineindex);
1666 | /* Add ice candidate sent by remote peer */
1667 | g_signal_emit_by_name(pc->peerconnection, "add-ice-candidate", sdpmlineindex, candidate);
1668 | }
1669 | } else {
1670 | /* Other event? Check if it's an error */
1671 | if(json_object_has_member(object, "error")) {
1672 | /* Janus API error */
1673 | JsonObject *error = json_object_get_object_member(object, "error");
1674 | JAMRTC_LOG(LOG_WARN, "[%s][%s] Got a Janus API error: %"G_GINT64_FORMAT" (%s)\n",
1675 | pc->display, pc->instrument ? pc->instrument : "chat",
1676 | json_object_get_int_member(error, "code"),
1677 | json_object_get_string_member(error, "reason"));
1678 | } else if(json_object_has_member(object, "plugindata")) {
1679 | /* Response or event from the plugin */
1680 | JsonObject *plugindata = json_object_get_object_member(object, "plugindata");
1681 | JsonObject *data = json_object_get_object_member(plugindata, "data");
1682 | if(json_object_has_member(data, "error")) {
1683 | /* VideoRoom error */
1684 | JAMRTC_LOG(LOG_WARN, "[%s][%s] Got a VideoRoom error: %"G_GINT64_FORMAT" (%s)\n",
1685 | pc->display, pc->instrument ? pc->instrument : "chat",
1686 | json_object_get_int_member(data, "error_code"),
1687 | json_object_get_string_member(data, "error"));
1688 | jamrtc_cleanup("ERROR: VideoRoom error", JAMRTC_JANUS_API_ERROR);
1689 | goto done;
1690 | }
1691 | /* Check if it's an event we should care about */
1692 | const char *event = json_object_get_string_member(data, "videoroom");
1693 | if(event != NULL && !strcasecmp(event, "joined")) {
1694 | /* This publisher handle just successfully joined the VideoRoom */
1695 | guint64 user_id = json_object_get_int_member(data, "id");
1696 | if(pc == local_micwebcam) {
1697 | local_micwebcam->user_id = user_id;
1698 | private_id = json_object_get_int_member(data, "private_id");
1699 | /* Update the UI */
1700 | jamrtc_video_message *msg = jamrtc_video_message_create(JAMRTC_ACTION_ADD_PARTICIPANT,
1701 | NULL, FALSE, NULL);
1702 | g_main_context_invoke(NULL, jamrtc_video_message_handle, msg);
1703 | /* Notify the application layer we're in */
1704 | cb->joined_room();
1705 | } else if(pc == local_instrument) {
1706 | local_instrument->user_id = user_id;
1707 | }
1708 | }
1709 | /* Check if there's news on attendees and/or publishers */
1710 | //~ JAMRTC_LOG(LOG_WARN, " -- TBD: %s\n", text);
1711 | if(pc == local_micwebcam) {
1712 | if(json_object_has_member(data, "joining")) {
1713 | /* Parse the new VideoRoom participant */
1714 | JsonObject *joining = json_object_get_object_member(data, "joining");
1715 | jamrtc_parse_participant(joining, FALSE);
1716 | }
1717 | if(json_object_has_member(data, "attendees")) {
1718 | /* Parse the attendees list */
1719 | JsonArray *attendees = json_object_get_array_member(data, "attendees");
1720 | jamrtc_parse_participants(attendees, FALSE);
1721 | }
1722 | if(json_object_has_member(data, "publishers")) {
1723 | /* Parse the publishers list */
1724 | JsonArray *publishers = json_object_get_array_member(data, "publishers");
1725 | jamrtc_parse_participants(publishers, TRUE);
1726 | }
1727 | if(json_object_has_member(data, "leaving")) {
1728 | /* A VideoRoom participant left, get rid of the PeerConnection instance */
1729 | guint64 user_id = json_object_get_int_member(data, "leaving");
1730 | jamrtc_mutex_lock(&participants_mutex);
1731 | jamrtc_webrtc_participant *participant = g_hash_table_lookup(participants_byid, &user_id);
1732 | if(participant != NULL) {
1733 | if(participant->user_id == user_id) {
1734 | participant->user_id = 0;
1735 | jamrtc_webrtc_pc *oldpc = participant->micwebcam;
1736 | if(oldpc != NULL) {
1737 | /* Update the UI */
1738 | jamrtc_video_message *msg = jamrtc_video_message_create(JAMRTC_ACTION_REMOVE_STREAM,
1739 | participant->micwebcam, FALSE, NULL);
1740 | g_main_context_invoke(NULL, jamrtc_video_message_handle, msg);
1741 | /* Remove the stream */
1742 | participant->micwebcam = NULL;
1743 | if(oldpc->handle_id != 0)
1744 | g_hash_table_remove(peerconnections, &oldpc->handle_id);
1745 | /* Notify the application */
1746 | cb->stream_stopped(oldpc->uuid, oldpc->display, NULL);
1747 | jamrtc_webrtc_pc_destroy(oldpc);
1748 | }
1749 | } else if(participant->instrument_user_id == user_id) {
1750 | participant->instrument_user_id = 0;
1751 | jamrtc_webrtc_pc *oldpc = participant->instrument;
1752 | if(oldpc != NULL) {
1753 | /* Update the UI */
1754 | jamrtc_video_message *msg = jamrtc_video_message_create(JAMRTC_ACTION_REMOVE_STREAM,
1755 | participant->instrument, FALSE, NULL);
1756 | g_main_context_invoke(NULL, jamrtc_video_message_handle, msg);
1757 | /* Remove the stream */
1758 | participant->instrument = NULL;
1759 | if(oldpc->handle_id != 0)
1760 | g_hash_table_remove(peerconnections, &oldpc->handle_id);
1761 | /* Notify the application */
1762 | cb->stream_stopped(oldpc->uuid, oldpc->display, oldpc->instrument);
1763 | jamrtc_webrtc_pc_destroy(oldpc);
1764 | }
1765 | }
1766 | g_hash_table_remove(participants_byid, &user_id);
1767 | }
1768 | if(participant && participant->user_id == 0 && participant->instrument_user_id == 0) {
1769 | /* No streams left for this participant */
1770 | cb->participant_left(participant->uuid, participant->display);
1771 | /* Update the UI */
1772 | jamrtc_video_message *msg = jamrtc_video_message_create(JAMRTC_ACTION_REMOVE_PARTICIPANT,
1773 | participant, FALSE, NULL);
1774 | g_main_context_invoke(NULL, jamrtc_video_message_handle, msg);
1775 | /* Remove the participant */
1776 | g_hash_table_remove(participants, participant->uuid);
1777 | g_hash_table_remove(participants_byslot, GUINT_TO_POINTER(participant->slot));
1778 | }
1779 | jamrtc_mutex_unlock(&participants_mutex);
1780 | }
1781 | }
1782 | } else if(!strcasecmp(response, "webrtcup")) {
1783 | /* PeerConnection is up */
1784 | JAMRTC_LOG(LOG_INFO, "[%s][%s] PeerConnection with Janus established\n",
1785 | pc->display, pc->instrument ? pc->instrument : "chat");
1786 | } else if(!strcasecmp(response, "media")) {
1787 | /* Notification about media reception */
1788 | const char *type = json_object_get_string_member(object, "type");
1789 | gboolean receiving = json_object_get_boolean_member(object, "receiving");
1790 | if(receiving) {
1791 | JAMRTC_LOG(LOG_INFO, "[%s][%s] Janus is receiving %s from us\n",
1792 | pc->display, pc->instrument ? pc->instrument : "chat", type);
1793 | } else {
1794 | JAMRTC_LOG(LOG_WARN, "[%s][%s] Janus hasn't received %s from us for a while...\n",
1795 | pc->display, pc->instrument ? pc->instrument : "chat", type);
1796 | }
1797 | } else if(!strcasecmp(response, "hangup")) {
1798 | /* PeerConnection is down, wrap up */
1799 | JAMRTC_LOG(LOG_INFO, "[%s][%s] PeerConnection with Janus is down (%s)\n",
1800 | pc->display, pc->instrument ? pc->instrument : "chat",
1801 | json_object_get_string_member(object, "reason"));
1802 | }
1803 | }
1804 |
1805 | done:
1806 | g_object_unref(parser);
1807 | if(pc != NULL && remove_transaction) {
1808 | jamrtc_mutex_lock(&transactions_mutex);
1809 | g_hash_table_remove(transactions, transaction);
1810 | jamrtc_mutex_unlock(&transactions_mutex);
1811 | }
1812 | }
1813 |
1814 | /* Handler for all libwebsockets events */
1815 | static int jamrtc_ws_callback(struct lws *wsi, enum lws_callback_reasons reason, void *user, void *in, size_t len) {
1816 | switch(reason) {
1817 | case LWS_CALLBACK_CLIENT_ESTABLISHED: {
1818 | /* Prepare the session */
1819 | if(ws_client == NULL)
1820 | ws_client = (jamrtc_ws_client *)user;
1821 | ws_client->wsi = wsi;
1822 | ws_client->buffer = NULL;
1823 | ws_client->buflen = 0;
1824 | ws_client->bufpending = 0;
1825 | ws_client->bufoffset = 0;
1826 | jamrtc_mutex_init(&ws_client->mutex);
1827 |
1828 | state = JAMRTC_JANUS_CONNECTED;
1829 | JAMRTC_LOG(LOG_INFO, " -- Connected to Janus\n");
1830 | /* Let's create a Janus session now */
1831 | jamrtc_create_session();
1832 | return 0;
1833 | }
1834 | case LWS_CALLBACK_CLIENT_CONNECTION_ERROR: {
1835 | state = JAMRTC_JANUS_DISCONNECTED;
1836 | jamrtc_cleanup("Error connecting to backend", 0);
1837 | cb->server_disconnected();
1838 | return 1;
1839 | }
1840 | case LWS_CALLBACK_CLIENT_RECEIVE: {
1841 | /* Incoming data */
1842 | if(ws_client == NULL) {
1843 | JAMRTC_LOG(LOG_ERR, "Invalid WebSocket client instance...\n");
1844 | return 1;
1845 | }
1846 | /* Is this a new message, or part of a fragmented one? */
1847 | const size_t remaining = lws_remaining_packet_payload(wsi);
1848 | if(ws_client->incoming == NULL) {
1849 | JAMRTC_LOG(LOG_HUGE, "First fragment: %zu bytes, %zu remaining\n",
1850 | len, remaining);
1851 | ws_client->incoming = g_malloc(len+1);
1852 | memcpy(ws_client->incoming, in, len);
1853 | ws_client->incoming[len] = '\0';
1854 | JAMRTC_LOG(LOG_HUGE, "%s\n", ws_client->incoming);
1855 | } else {
1856 | size_t offset = strlen(ws_client->incoming);
1857 | JAMRTC_LOG(LOG_HUGE, "Appending fragment: offset %zu, %zu bytes, %zu remaining\n",
1858 | offset, len, remaining);
1859 | ws_client->incoming = g_realloc(ws_client->incoming, offset+len+1);
1860 | memcpy(ws_client->incoming+offset, in, len);
1861 | ws_client->incoming[offset+len] = '\0';
1862 | JAMRTC_LOG(LOG_HUGE, "%s\n", ws_client->incoming+offset);
1863 | }
1864 | if(remaining > 0 || !lws_is_final_fragment(wsi)) {
1865 | /* Still waiting for some more fragments */
1866 | JAMRTC_LOG(LOG_HUGE, "Waiting for more fragments\n");
1867 | return 0;
1868 | }
1869 | JAMRTC_LOG(LOG_HUGE, "Done, parsing message: %zu bytes\n", strlen(ws_client->incoming));
1870 | /* If we got here, the message is complete: process the message */
1871 | jamrtc_server_message(ws_client->incoming);
1872 | g_free(ws_client->incoming);
1873 | ws_client->incoming = NULL;
1874 | return 0;
1875 | }
1876 | #if (LWS_LIBRARY_VERSION_MAJOR >= 3)
1877 | /* On libwebsockets >= 3.x, we use this event to mark connections as writable in the event loop */
1878 | case LWS_CALLBACK_EVENT_WAIT_CANCELLED: {
1879 | if(ws_client != NULL && ws_client->wsi != NULL)
1880 | lws_callback_on_writable(ws_client->wsi);
1881 | return 0;
1882 | }
1883 | #endif
1884 | case LWS_CALLBACK_CLIENT_WRITEABLE: {
1885 | if(ws_client == NULL || ws_client->wsi == NULL) {
1886 | JAMRTC_LOG(LOG_ERR, "Invalid WebSocket client instance...\n");
1887 | return -1;
1888 | }
1889 | if(!g_atomic_int_get(&stopping)) {
1890 | jamrtc_mutex_lock(&ws_client->mutex);
1891 | /* Check if we have a pending/partial write to complete first */
1892 | if(ws_client->buffer && ws_client->bufpending > 0 && ws_client->bufoffset > 0
1893 | && !g_atomic_int_get(&stopping)) {
1894 | JAMRTC_LOG(LOG_VERB, "Completing pending WebSocket write (still need to write last %d bytes)...\n",
1895 | ws_client->bufpending);
1896 | int sent = lws_write(wsi, ws_client->buffer + ws_client->bufoffset, ws_client->bufpending, LWS_WRITE_TEXT);
1897 | JAMRTC_LOG(LOG_VERB, " -- Sent %d/%d bytes\n", sent, ws_client->bufpending);
1898 | if(sent > -1 && sent < ws_client->bufpending) {
1899 | /* We still couldn't send everything that was left, we'll try and complete this in the next round */
1900 | ws_client->bufpending -= sent;
1901 | ws_client->bufoffset += sent;
1902 | } else {
1903 | /* Clear the pending/partial write queue */
1904 | ws_client->bufpending = 0;
1905 | ws_client->bufoffset = 0;
1906 | }
1907 | /* Done for this round, check the next response/notification later */
1908 | lws_callback_on_writable(wsi);
1909 | jamrtc_mutex_unlock(&ws_client->mutex);
1910 | return 0;
1911 | }
1912 | /* Shoot all the pending messages */
1913 | char *event = g_async_queue_try_pop(messages);
1914 | if(event && !g_atomic_int_get(&stopping)) {
1915 | /* Gotcha! */
1916 | int buflen = LWS_PRE + strlen(event);
1917 | if(ws_client->buffer == NULL) {
1918 | /* Let's allocate a shared buffer */
1919 | JAMRTC_LOG(LOG_VERB, "Allocating %d bytes (event is %zu bytes)\n", buflen, strlen(event));
1920 | ws_client->buflen = buflen;
1921 | ws_client->buffer = g_malloc0(buflen);
1922 | } else if(buflen > ws_client->buflen) {
1923 | /* We need a larger shared buffer */
1924 | JAMRTC_LOG(LOG_VERB, "Re-allocating to %d bytes (was %d, event is %zu bytes)\n",
1925 | buflen, ws_client->buflen, strlen(event));
1926 | ws_client->buflen = buflen;
1927 | ws_client->buffer = g_realloc(ws_client->buffer, buflen);
1928 | }
1929 | memcpy(ws_client->buffer + LWS_PRE, event, strlen(event));
1930 | JAMRTC_LOG(LOG_VERB, "Sending WebSocket message (%zu bytes)...\n", strlen(event));
1931 | int sent = lws_write(wsi, ws_client->buffer + LWS_PRE, strlen(event), LWS_WRITE_TEXT);
1932 | JAMRTC_LOG(LOG_VERB, " -- Sent %d/%zu bytes\n", sent, strlen(event));
1933 | if(sent > -1 && sent < (int)strlen(event)) {
1934 | /* We couldn't send everything in a single write, we'll complete this in the next round */
1935 | ws_client->bufpending = strlen(event) - sent;
1936 | ws_client->bufoffset = LWS_PRE + sent;
1937 | JAMRTC_LOG(LOG_VERB, " -- Couldn't write all bytes (%d missing), setting offset %d\n",
1938 | ws_client->bufpending, ws_client->bufoffset);
1939 | }
1940 | /* We can get rid of the message */
1941 | g_free(event);
1942 | /* Done for this round, check the next response/notification later */
1943 | lws_callback_on_writable(wsi);
1944 | jamrtc_mutex_unlock(&ws_client->mutex);
1945 | return 0;
1946 | }
1947 | jamrtc_mutex_unlock(&ws_client->mutex);
1948 | }
1949 | return 0;
1950 | }
1951 | #if (LWS_LIBRARY_VERSION_MAJOR >= 3)
1952 | case LWS_CALLBACK_CLIENT_CLOSED: {
1953 | #else
1954 | case LWS_CALLBACK_CLOSED: {
1955 | #endif
1956 | JAMRTC_LOG(LOG_INFO, "Janus connection closed\n");
1957 | if(ws_client != NULL) {
1958 | /* Cleanup */
1959 | jamrtc_mutex_lock(&ws_client->mutex);
1960 | JAMRTC_LOG(LOG_INFO, "Destroying Janus client\n");
1961 | ws_client->wsi = NULL;
1962 | /* Free the shared buffers */
1963 | g_free(ws_client->buffer);
1964 | ws_client->buffer = NULL;
1965 | ws_client->buflen = 0;
1966 | ws_client->bufpending = 0;
1967 | ws_client->bufoffset = 0;
1968 | jamrtc_mutex_unlock(&ws_client->mutex);
1969 | }
1970 | ws_client = NULL;
1971 | wsi = NULL;
1972 | state = JAMRTC_JANUS_DISCONNECTED;
1973 | jamrtc_cleanup("Janus connection closed", 0);
1974 | cb->server_disconnected();
1975 | return 0;
1976 | }
1977 | default:
1978 | //~ if(wsi)
1979 | //~ JAMRTC_LOG(LOG_HUGE, "%d (%s)\n", reason, jamrtc_ws_reason_string(reason));
1980 | break;
1981 | }
1982 | return 0;
1983 | }
1984 |
--------------------------------------------------------------------------------
/src/webrtc.h:
--------------------------------------------------------------------------------
1 | /*
2 | * JamRTC -- Jam sessions on Janus!
3 | *
4 | * Ugly prototype, just to use as a proof of concept
5 | *
6 | * Developed by Lorenzo Miniero: lorenzo@meetecho.com
7 | * License: GPLv3
8 | *
9 | */
10 |
11 | #ifndef JAMRTC_WEBRTC_H
12 | #define JAMRTC_WEBRTC_H
13 |
14 | /* GTK */
15 | #include <gtk/gtk.h>
16 |
17 | /* GLib */
18 | #include <glib.h>
19 |
20 |
21 | /* Callbacks */
22 | typedef struct jamrtc_callbacks {
23 | void (* const server_connected)(void);
24 | void (* const server_disconnected)(void);
25 | void (* const joined_room)(void);
26 | void (* const participant_joined)(const char *uuid, const char *display);
27 | void (* const stream_started)(const char *uuid, const char *display, const char *instrument,
28 | gboolean has_audio, gboolean has_video);
29 | void (* const stream_stopped)(const char *uuid, const char *display, const char *instrument);
30 | void (* const participant_left)(const char *uuid, const char *display);
31 | } jamrtc_callbacks;
32 |
33 |
34 | /* Janus stack initialization */
35 | int jamrtc_webrtc_init(const jamrtc_callbacks *callbacks, GtkBuilder *builder, GMainLoop *mainloop,
36 | const char *ws, const char *stun, const char *turn, const char *src_opts, guint latency, gboolean no_jack);
37 | /* Janus stack cleanup */
38 | void jamrtc_webrtc_cleanup(void);
39 |
40 | /* Join the room as a participant */
41 | void jamrtc_join_room(guint64 room_id, const char *display);
42 | /* Publish mic/webcam for the chat part */
43 | void jamrtc_webrtc_publish_micwebcam(gboolean no_mic, gboolean no_webcam, const char *video_device);
44 | /* Publish the instrument */
45 | void jamrtc_webrtc_publish_instrument(const char *instrument, gboolean stereo);
46 | /* Subscribe to a remote stream */
47 | int jamrtc_webrtc_subscribe(const char *uuid, gboolean instrument);
48 |
49 |
50 | #endif
51 |
--------------------------------------------------------------------------------