├── .github
│   ├── CODE_OF_CONDUCT.md
│   ├── ISSUE_TEMPLATE.md
│   ├── PULL_REQUEST_TEMPLATE.md
│   └── workflows
│       └── codeql-analysis.yml
├── .gitignore
├── .vscode
│   └── settings.json
├── CHANGELOG.md
├── CONTRIBUTING.md
├── LICENSE.md
├── Project
│   ├── .babelrc
│   ├── .deployment
│   ├── .env
│   ├── .gitignore
│   ├── clientConfig.json
│   ├── oAuthConfig.js
│   ├── package-lock.json
│   ├── package.json
│   ├── public
│   │   ├── OneSignalSDKUpdaterWorker.js
│   │   ├── OneSignalSDKWorker.js
│   │   ├── assets
│   │   │   └── images
│   │   │       ├── ACSBackdrop.png
│   │   │       ├── MicrosoftLearnBackdrop.png
│   │   │       └── acsIcon.png
│   │   └── index.html
│   ├── serverConfig.json
│   ├── src
│   │   ├── App.css
│   │   ├── App.js
│   │   ├── Constants.js
│   │   ├── MakeCall
│   │   │   ├── AddParticipantPopover.js
│   │   │   ├── AudioEffects
│   │   │   │   └── AudioEffectsContainer.js
│   │   │   ├── CallCaption.js
│   │   │   ├── CallCard.js
│   │   │   ├── CallSurvey.js
│   │   │   ├── CurrentCallInformation.js
│   │   │   ├── DataChannelCard.js
│   │   │   ├── IncomingCallCard.js
│   │   │   ├── Lobby.js
│   │   │   ├── LocalVideoPreviewCard.js
│   │   │   ├── Login.js
│   │   │   ├── MakeCall.js
│   │   │   ├── MediaConstraint.js
│   │   │   ├── NetworkConfiguration
│   │   │   │   ├── ProxyConfiguration.js
│   │   │   │   └── TurnConfiguration.js
│   │   │   ├── ParticipantMenuOptions.js
│   │   │   ├── RawVideoAccess
│   │   │   │   └── CustomVideoEffects.js
│   │   │   ├── RealTimeTextCard.js
│   │   │   ├── RemoteParticipantCard.js
│   │   │   ├── Section.js
│   │   │   ├── StarRating.js
│   │   │   ├── StreamRenderer.js
│   │   │   ├── VideoEffects
│   │   │   │   ├── VideoEffectsContainer.js
│   │   │   │   └── VideoEffectsImagePicker.js
│   │   │   ├── VideoReceiveStats.js
│   │   │   ├── VideoSendStats.js
│   │   │   └── VolumeVisualizer.js
│   │   ├── Utils
│   │   │   └── Utils.js
│   │   ├── index.js
│   │   └── serviceWorker.js
│   └── webpack.config.js
├── README.md
├── SECURITY.md
└── package-lock.json
/.github/CODE_OF_CONDUCT.md:
--------------------------------------------------------------------------------
1 | # Microsoft Open Source Code of Conduct
2 |
3 | This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/).
4 |
5 | Resources:
6 |
7 | - [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/)
8 | - [Microsoft Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/)
9 | - Contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with questions or concerns
10 |
--------------------------------------------------------------------------------
/.github/ISSUE_TEMPLATE.md:
--------------------------------------------------------------------------------
1 |
4 | > Please provide us with the following information:
5 | > ---------------------------------------------------------------
6 |
7 | ### This issue is for a: (mark with an `x`)
8 | ```
9 | - [ ] bug report -> please search issues before submitting
10 | - [ ] feature request
11 | - [ ] documentation issue or request
12 | - [ ] regression (a behavior that used to work and stopped in a new release)
13 | ```
14 |
15 | ### Minimal steps to reproduce
16 | >
17 |
18 | ### Any log messages given by the failure
19 | >
20 |
21 | ### Expected/desired behavior
22 | >
23 |
24 | ### OS and Version?
25 | > Windows 7, 8 or 10. Linux (which distribution). macOS (Yosemite? El Capitan? Sierra?)
26 |
27 | ### Versions
28 | >
29 |
30 | ### Mention any other details that might be useful
31 |
32 | > ---------------------------------------------------------------
33 | > Thanks! We'll be in touch soon.
34 |
--------------------------------------------------------------------------------
/.github/PULL_REQUEST_TEMPLATE.md:
--------------------------------------------------------------------------------
1 | ## Purpose
2 |
3 | * ...
4 |
5 | ## Does this introduce a breaking change?
6 |
7 | ```
8 | [ ] Yes
9 | [ ] No
10 | ```
11 |
12 | ## Pull Request Type
13 | What kind of change does this Pull Request introduce?
14 |
15 |
16 | ```
17 | [ ] Bugfix
18 | [ ] Feature
19 | [ ] Code style update (formatting, local variables)
20 | [ ] Refactoring (no functional changes, no api changes)
21 | [ ] Documentation content changes
22 | [ ] Other... Please describe:
23 | ```
24 |
25 | ## How to Test
26 | * Get the code
27 |
28 | ```
29 | git clone [repo-address]
30 | cd [repo-name]
31 | git checkout [branch-name]
32 | npm install
33 | ```
34 |
35 | * Test the code
36 |
37 | ```
38 | ```
39 |
40 | ## What to Check
41 | Verify that the following are valid
42 | * ...
43 |
44 | ## Other Information
45 |
--------------------------------------------------------------------------------
/.github/workflows/codeql-analysis.yml:
--------------------------------------------------------------------------------
1 | # For most projects, this workflow file will not need changing; you simply need
2 | # to commit it to your repository.
3 | #
4 | # You may wish to alter this file to override the set of languages analyzed,
5 | # or to provide custom queries or build logic.
6 | name: "CodeQL"
7 |
8 | on:
9 | push:
10 | branches: [main]
11 | pull_request:
12 | # The branches below must be a subset of the branches above
13 | branches: [main]
14 | schedule:
15 | - cron: '0 1 * * 5'
16 |
17 | jobs:
18 | analyze:
19 | name: Analyze
20 | runs-on: ubuntu-latest
21 |
22 | strategy:
23 | fail-fast: false
24 | matrix:
25 | # Override automatic language detection by changing the below list
26 | # Supported options are ['csharp', 'cpp', 'go', 'java', 'javascript', 'python']
27 | language: ['javascript']
28 | # Learn more...
29 | # https://docs.github.com/en/github/finding-security-vulnerabilities-and-errors-in-your-code/configuring-code-scanning#overriding-automatic-language-detection
30 |
31 | steps:
32 | - name: Checkout repository
33 | uses: actions/checkout@v2
34 | with:
35 | # We must fetch at least the immediate parents so that if this is
36 | # a pull request then we can checkout the head.
37 | fetch-depth: 2
38 |
39 | # If this run was triggered by a pull request event, then checkout
40 | # the head of the pull request instead of the merge commit.
41 | - run: git checkout HEAD^2
42 | if: ${{ github.event_name == 'pull_request' }}
43 |
44 | # Initializes the CodeQL tools for scanning.
45 | - name: Initialize CodeQL
46 | uses: github/codeql-action/init@v2
47 | with:
48 | languages: ${{ matrix.language }}
49 | # If you wish to specify custom queries, you can do so here or in a config file.
50 | # By default, queries listed here will override any specified in a config file.
51 | # Prefix the list here with "+" to use these queries and those in the config file.
52 | # queries: ./path/to/local/query, your-org/your-repo/queries@main
53 |
54 | # Autobuild attempts to build any compiled languages (C/C++, C#, or Java).
55 | # If this step fails, then you should remove it and run the build manually (see below)
56 | - name: Autobuild
57 | uses: github/codeql-action/autobuild@v2
58 |
59 | # ℹ️ Command-line programs to run using the OS shell.
60 | # 📚 https://git.io/JvXDl
61 |
62 | # ✏️ If the Autobuild fails above, remove it and uncomment the following three lines
63 | # and modify them (or add more) to build your code if your project
64 | # uses a compiled language
65 |
66 | #- run: |
67 | # make bootstrap
68 | # make release
69 |
70 | - name: Perform CodeQL Analysis
71 | uses: github/codeql-action/analyze@v2
72 |
--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------
1 | ## Ignore Visual Studio temporary files, build results, and
2 | ## files generated by popular Visual Studio add-ons.
3 | ##
4 | ## Get latest from https://github.com/github/gitignore/blob/master/VisualStudio.gitignore
5 |
6 | # User-specific files
7 | *.rsuser
8 | *.suo
9 | *.user
10 | *.userosscache
11 | *.sln.docstates
12 |
13 | # User-specific files (MonoDevelop/Xamarin Studio)
14 | *.userprefs
15 |
16 | # Mono auto generated files
17 | mono_crash.*
18 |
19 | # Build results
20 | [Dd]ebug/
21 | [Dd]ebugPublic/
22 | [Rr]elease/
23 | [Rr]eleases/
24 | x64/
25 | x86/
26 | [Aa][Rr][Mm]/
27 | [Aa][Rr][Mm]64/
28 | bld/
29 | [Bb]in/
30 | [Oo]bj/
31 | [Ll]og/
32 | [Ll]ogs/
33 |
34 | # Visual Studio 2015/2017 cache/options directory
35 | .vs/
36 | # Uncomment if you have tasks that create the project's static files in wwwroot
37 | #wwwroot/
38 |
39 | # Visual Studio 2017 auto generated files
40 | Generated\ Files/
41 |
42 | # MSTest test Results
43 | [Tt]est[Rr]esult*/
44 | [Bb]uild[Ll]og.*
45 |
46 | # NUnit
47 | *.VisualState.xml
48 | TestResult.xml
49 | nunit-*.xml
50 |
51 | # Build Results of an ATL Project
52 | [Dd]ebugPS/
53 | [Rr]eleasePS/
54 | dlldata.c
55 |
56 | # Benchmark Results
57 | BenchmarkDotNet.Artifacts/
58 |
59 | # .NET Core
60 | project.lock.json
61 | project.fragment.lock.json
62 | artifacts/
63 |
64 | # StyleCop
65 | StyleCopReport.xml
66 |
67 | # Files built by Visual Studio
68 | *_i.c
69 | *_p.c
70 | *_h.h
71 | *.ilk
72 | *.meta
73 | *.obj
74 | *.iobj
75 | *.pch
76 | *.pdb
77 | *.ipdb
78 | *.pgc
79 | *.pgd
80 | *.rsp
81 | *.sbr
82 | *.tlb
83 | *.tli
84 | *.tlh
85 | *.tmp
86 | *.tmp_proj
87 | *_wpftmp.csproj
88 | *.log
89 | *.vspscc
90 | *.vssscc
91 | .builds
92 | *.pidb
93 | *.svclog
94 | *.scc
95 |
96 | # Chutzpah Test files
97 | _Chutzpah*
98 |
99 | # Visual C++ cache files
100 | ipch/
101 | *.aps
102 | *.ncb
103 | *.opendb
104 | *.opensdf
105 | *.sdf
106 | *.cachefile
107 | *.VC.db
108 | *.VC.VC.opendb
109 |
110 | # Visual Studio profiler
111 | *.psess
112 | *.vsp
113 | *.vspx
114 | *.sap
115 |
116 | # Visual Studio Trace Files
117 | *.e2e
118 |
119 | # TFS 2012 Local Workspace
120 | $tf/
121 |
122 | # Guidance Automation Toolkit
123 | *.gpState
124 |
125 | # ReSharper is a .NET coding add-in
126 | _ReSharper*/
127 | *.[Rr]e[Ss]harper
128 | *.DotSettings.user
129 |
130 | # TeamCity is a build add-in
131 | _TeamCity*
132 |
133 | # DotCover is a Code Coverage Tool
134 | *.dotCover
135 |
136 | # AxoCover is a Code Coverage Tool
137 | .axoCover/*
138 | !.axoCover/settings.json
139 |
140 | # Visual Studio code coverage results
141 | *.coverage
142 | *.coveragexml
143 |
144 | # NCrunch
145 | _NCrunch_*
146 | .*crunch*.local.xml
147 | nCrunchTemp_*
148 |
149 | # MightyMoose
150 | *.mm.*
151 | AutoTest.Net/
152 |
153 | # Web workbench (sass)
154 | .sass-cache/
155 |
156 | # Installshield output folder
157 | [Ee]xpress/
158 |
159 | # DocProject is a documentation generator add-in
160 | DocProject/buildhelp/
161 | DocProject/Help/*.HxT
162 | DocProject/Help/*.HxC
163 | DocProject/Help/*.hhc
164 | DocProject/Help/*.hhk
165 | DocProject/Help/*.hhp
166 | DocProject/Help/Html2
167 | DocProject/Help/html
168 |
169 | # Click-Once directory
170 | publish/
171 |
172 | # Publish Web Output
173 | *.[Pp]ublish.xml
174 | *.azurePubxml
175 | # Note: Comment the next line if you want to checkin your web deploy settings,
176 | # but database connection strings (with potential passwords) will be unencrypted
177 | *.pubxml
178 | *.publishproj
179 |
180 | # Microsoft Azure Web App publish settings. Comment the next line if you want to
181 | # checkin your Azure Web App publish settings, but sensitive information contained
182 | # in these scripts will be unencrypted
183 | PublishScripts/
184 |
185 | # NuGet Packages
186 | *.nupkg
187 | # NuGet Symbol Packages
188 | *.snupkg
189 | # The packages folder can be ignored because of Package Restore
190 | **/[Pp]ackages/*
191 | # except build/, which is used as an MSBuild target.
192 | !**/[Pp]ackages/build/
193 | # Uncomment if necessary however generally it will be regenerated when needed
194 | #!**/[Pp]ackages/repositories.config
195 | # NuGet v3's project.json files produces more ignorable files
196 | *.nuget.props
197 | *.nuget.targets
198 |
199 | # Microsoft Azure Build Output
200 | csx/
201 | *.build.csdef
202 |
203 | # Microsoft Azure Emulator
204 | ecf/
205 | rcf/
206 |
207 | # Windows Store app package directories and files
208 | AppPackages/
209 | BundleArtifacts/
210 | Package.StoreAssociation.xml
211 | _pkginfo.txt
212 | *.appx
213 | *.appxbundle
214 | *.appxupload
215 |
216 | # Visual Studio cache files
217 | # files ending in .cache can be ignored
218 | *.[Cc]ache
219 | # but keep track of directories ending in .cache
220 | !?*.[Cc]ache/
221 |
222 | # Others
223 | ClientBin/
224 | ~$*
225 | *~
226 | *.dbmdl
227 | *.dbproj.schemaview
228 | *.jfm
229 | *.pfx
230 | *.publishsettings
231 | orleans.codegen.cs
232 |
233 | # Including strong name files can present a security risk
234 | # (https://github.com/github/gitignore/pull/2483#issue-259490424)
235 | #*.snk
236 |
237 | # Since there are multiple workflows, uncomment next line to ignore bower_components
238 | # (https://github.com/github/gitignore/pull/1529#issuecomment-104372622)
239 | #bower_components/
240 |
241 | # RIA/Silverlight projects
242 | Generated_Code/
243 |
244 | # Backup & report files from converting an old project file
245 | # to a newer Visual Studio version. Backup files are not needed,
246 | # because we have git ;-)
247 | _UpgradeReport_Files/
248 | Backup*/
249 | UpgradeLog*.XML
250 | UpgradeLog*.htm
251 | ServiceFabricBackup/
252 | *.rptproj.bak
253 |
254 | # SQL Server files
255 | *.mdf
256 | *.ldf
257 | *.ndf
258 |
259 | # Business Intelligence projects
260 | *.rdl.data
261 | *.bim.layout
262 | *.bim_*.settings
263 | *.rptproj.rsuser
264 | *- [Bb]ackup.rdl
265 | *- [Bb]ackup ([0-9]).rdl
266 | *- [Bb]ackup ([0-9][0-9]).rdl
267 |
268 | # Microsoft Fakes
269 | FakesAssemblies/
270 |
271 | # GhostDoc plugin setting file
272 | *.GhostDoc.xml
273 |
274 | # Node.js Tools for Visual Studio
275 | .ntvs_analysis.dat
276 | node_modules/
277 |
278 | # Visual Studio 6 build log
279 | *.plg
280 |
281 | # Visual Studio 6 workspace options file
282 | *.opt
283 |
284 | # Visual Studio 6 auto-generated workspace file (contains which files were open etc.)
285 | *.vbw
286 |
287 | # Visual Studio LightSwitch build output
288 | **/*.HTMLClient/GeneratedArtifacts
289 | **/*.DesktopClient/GeneratedArtifacts
290 | **/*.DesktopClient/ModelManifest.xml
291 | **/*.Server/GeneratedArtifacts
292 | **/*.Server/ModelManifest.xml
293 | _Pvt_Extensions
294 |
295 | # Paket dependency manager
296 | .paket/paket.exe
297 | paket-files/
298 |
299 | # FAKE - F# Make
300 | .fake/
301 |
302 | # CodeRush personal settings
303 | .cr/personal
304 |
305 | # Python Tools for Visual Studio (PTVS)
306 | __pycache__/
307 | *.pyc
308 |
309 | # Cake - Uncomment if you are using it
310 | # tools/**
311 | # !tools/packages.config
312 |
313 | # Tabs Studio
314 | *.tss
315 |
316 | # Telerik's JustMock configuration file
317 | *.jmconfig
318 |
319 | # BizTalk build output
320 | *.btp.cs
321 | *.btm.cs
322 | *.odx.cs
323 | *.xsd.cs
324 |
325 | # OpenCover UI analysis results
326 | OpenCover/
327 |
328 | # Azure Stream Analytics local run output
329 | ASALocalRun/
330 |
331 | # MSBuild Binary and Structured Log
332 | *.binlog
333 |
334 | # NVidia Nsight GPU debugger configuration file
335 | *.nvuser
336 |
337 | # MFractors (Xamarin productivity tool) working folder
338 | .mfractor/
339 |
340 | # Local History for Visual Studio
341 | .localhistory/
342 |
343 | # BeatPulse healthcheck temp database
344 | healthchecksdb
345 |
346 | # Backup folder for Package Reference Convert tool in Visual Studio 2017
347 | MigrationBackup/
348 |
349 | # Ionide (cross platform F# VS Code tools) working folder
350 | .ionide/
351 |
352 | .DS_Store
353 |
--------------------------------------------------------------------------------
/.vscode/settings.json:
--------------------------------------------------------------------------------
1 | {
2 | "appService.defaultWebAppToDeploy": "None"
3 | }
--------------------------------------------------------------------------------
/CHANGELOG.md:
--------------------------------------------------------------------------------
1 | ## [project-title] Changelog
2 |
3 |
4 | # x.y.z (yyyy-mm-dd)
5 |
6 | *Features*
7 | * ...
8 |
9 | *Bug Fixes*
10 | * ...
11 |
12 | *Breaking Changes*
13 | * ...
14 |
--------------------------------------------------------------------------------
/CONTRIBUTING.md:
--------------------------------------------------------------------------------
1 | # Contributing to [project-title]
2 |
3 | This project welcomes contributions and suggestions. Most contributions require you to agree to a
4 | Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us
5 | the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.
6 |
7 | When you submit a pull request, a CLA bot will automatically determine whether you need to provide
8 | a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions
9 | provided by the bot. You will only need to do this once across all repos using our CLA.
10 |
11 | This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/).
12 | For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or
13 | contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments.
14 |
15 | - [Code of Conduct](#coc)
16 | - [Issues and Bugs](#issue)
17 | - [Feature Requests](#feature)
18 | - [Submission Guidelines](#submit)
19 |
20 | ## Code of Conduct
21 | Help us keep this project open and inclusive. Please read and follow our [Code of Conduct](https://opensource.microsoft.com/codeofconduct/).
22 |
23 | ## Found an Issue?
24 | If you find a bug in the source code or a mistake in the documentation, you can help us by
25 | [submitting an issue](#submit-issue) to the GitHub Repository. Even better, you can
26 | [submit a Pull Request](#submit-pr) with a fix.
27 |
28 | ## Want a Feature?
29 | You can *request* a new feature by [submitting an issue](#submit-issue) to the GitHub
30 | Repository. If you would like to *implement* a new feature, please submit an issue with
31 | a proposal for your work first, to be sure that we can use it.
32 |
33 | * **Small Features** can be crafted and directly [submitted as a Pull Request](#submit-pr).
34 |
35 | ## Submission Guidelines
36 |
37 | ### Submitting an Issue
38 | Before you submit an issue, search the archive; your question may already have been answered.
39 |
40 | If your issue appears to be a bug, and hasn't been reported, open a new issue.
41 | Help us to maximize the effort we can spend fixing issues and adding new
42 | features, by not reporting duplicate issues. Providing the following information will increase the
43 | chances of your issue being dealt with quickly:
44 |
45 | * **Overview of the Issue** - if an error is being thrown, a non-minified stack trace helps
46 | * **Version** - what version is affected (e.g. 0.1.2)
47 | * **Motivation for or Use Case** - explain what you are trying to do and why the current behavior is a bug for you
48 | * **Browsers and Operating System** - is this a problem with all browsers?
49 | * **Reproduce the Error** - provide a live example or an unambiguous set of steps
50 | * **Related Issues** - has a similar issue been reported before?
51 | * **Suggest a Fix** - if you can't fix the bug yourself, perhaps you can point to what might be
52 | causing the problem (line of code or commit)
53 |
54 | You can file new issues by providing the above information at the corresponding repository's issues link: https://github.com/[organization-name]/[repository-name]/issues/new.
55 |
56 | ### Submitting a Pull Request (PR)
57 | Before you submit your Pull Request (PR) consider the following guidelines:
58 |
59 | * Search the repository (https://github.com/[organization-name]/[repository-name]/pulls) for an open or closed PR
60 | that relates to your submission. You don't want to duplicate effort.
61 |
62 | * Make your changes in a new git fork:
63 |
64 | * Commit your changes using a descriptive commit message
65 | * Push your fork to GitHub:
66 | * In GitHub, create a pull request
67 | * If we suggest changes then:
68 | * Make the required updates.
69 | * Rebase your fork and force push to your GitHub repository (this will update your Pull Request):
70 |
71 | ```shell
72 | git rebase master -i
73 | git push -f
74 | ```
75 |
76 | That's it! Thank you for your contribution!
77 |
--------------------------------------------------------------------------------
/LICENSE.md:
--------------------------------------------------------------------------------
1 | MIT License
2 |
3 | Copyright (c) Microsoft Corporation.
4 |
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 |
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 |
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
--------------------------------------------------------------------------------
/Project/.babelrc:
--------------------------------------------------------------------------------
1 | {
2 | "presets": [
3 | "@babel/preset-env",
4 | "@babel/preset-react"
5 | ],
6 | "plugins": [
7 | "@babel/plugin-transform-class-properties",
8 | ["@babel/plugin-transform-runtime",
9 | {
10 | "regenerator": true
11 | }]
12 | ]
13 | }
--------------------------------------------------------------------------------
/Project/.deployment:
--------------------------------------------------------------------------------
1 | [config]
2 | SCM_DO_BUILD_DURING_DEPLOYMENT=true
--------------------------------------------------------------------------------
/Project/.env:
--------------------------------------------------------------------------------
1 | SKIP_PREFLIGHT_CHECK=true
--------------------------------------------------------------------------------
/Project/.gitignore:
--------------------------------------------------------------------------------
1 | # See https://help.github.com/articles/ignoring-files/ for more about ignoring files.
2 |
3 | # dependencies
4 | /node_modules
5 | /.pnp
6 | .pnp.js
7 |
8 | # testing
9 | /coverage
10 |
11 | # production
12 | /build
13 | /dist
14 |
15 | # misc
16 | .DS_Store
17 | .env.local
18 | .env.development.local
19 | .env.test.local
20 | .env.production.local
21 |
22 | npm-debug.log*
23 | yarn-debug.log*
24 | yarn-error.log*
25 |
26 | # build dropped
27 | sdk.bundle.js
28 | ./config.js
--------------------------------------------------------------------------------
/Project/clientConfig.json:
--------------------------------------------------------------------------------
1 | {
2 | "oneSignalAppId": "",
3 | "oneSignalSafariWebId": "",
4 | "appInsightsConnectionString": ""
5 | }
--------------------------------------------------------------------------------
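Note: `clientConfig.json` carries the OneSignal identifiers (consumed via `react-onesignal`) and the Application Insights connection string used by the front end. Below is a minimal sketch of wiring the connection string into `@microsoft/applicationinsights-web` (a dependency in `Project/package.json`); only the field name comes from the file above — the import path and guard are illustrative, not the sample's own code.

```javascript
// Sketch (assumption): initialize Application Insights from clientConfig.json.
import { ApplicationInsights } from '@microsoft/applicationinsights-web';
import clientConfig from '../clientConfig.json';

let appInsights;
if (clientConfig.appInsightsConnectionString) {
    // Only start telemetry when a connection string has been provided.
    appInsights = new ApplicationInsights({
        config: { connectionString: clientConfig.appInsightsConnectionString }
    });
    appInsights.loadAppInsights();
    appInsights.trackPageView();
}

export { appInsights };
```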
/Project/oAuthConfig.js:
--------------------------------------------------------------------------------
1 | const authConfig = {
2 | auth: {
3 | clientId: 'ENTER_CLIENT_ID',
4 | authority: 'https://login.microsoftonline.com/ENTER_TENANT_ID'
5 | }
6 | };
7 | // Add scopes here for the ID token to be used at Microsoft identity platform endpoints.
8 | const authScopes = {
9 | popUpLogin: [],
10 | m365Login: []
11 | };
12 |
13 | module.exports = { authConfig, authScopes };
--------------------------------------------------------------------------------
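Note: `oAuthConfig.js` exports the MSAL client/tenant settings and the scope lists used for sign-in. A minimal sketch of how such a config could be consumed with `@azure/msal-browser` (listed in `Project/package.json`); the helper name `loginAndGetToken` is illustrative and not part of the sample.

```javascript
// Sketch (assumption): popup sign-in using the exported authConfig/authScopes.
import { PublicClientApplication } from '@azure/msal-browser';
import { authConfig, authScopes } from '../oAuthConfig';

const msalInstance = new PublicClientApplication(authConfig);

async function loginAndGetToken() {
    // Interactive popup sign-in with the scopes configured for popup login.
    const loginResponse = await msalInstance.loginPopup({ scopes: authScopes.popUpLogin });
    // Silently acquire an access token for the signed-in account.
    const tokenResponse = await msalInstance.acquireTokenSilent({
        scopes: authScopes.popUpLogin,
        account: loginResponse.account
    });
    return tokenResponse.accessToken;
}
```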
/Project/package.json:
--------------------------------------------------------------------------------
1 | {
2 | "name": "ACSCallingSample",
3 | "version": "1.0.0",
4 | "private": true,
5 | "dependencies": {
6 | "@azure/communication-calling": "1.35.1-beta.1",
7 | "@azure/communication-calling-effects": "1.1.1-beta.1",
8 | "@azure/communication-common": "^2.3.0",
9 | "@azure/communication-identity": "^1.3.0",
10 | "@azure/communication-network-traversal": "^1.1.0-beta.1",
11 | "@azure/communication-rooms": "1.2.0",
12 | "@azure/logger": "^1.0.3",
13 | "@azure/msal-browser": "^2.33.0",
14 | "@azure/msal-node": "^1.17.1",
15 | "@fluentui/react": "^8.122.9",
16 | "@fluentui/react-icons-mdl2": "^1.3.82",
17 | "@microsoft/applicationinsights-web": "^3.0.2",
18 | "pako": "^2.1.0",
19 | "react": "^16.14.0",
20 | "react-dom": "^16.14.0",
21 | "react-onesignal": "^2.0.4",
22 | "react-toastify": "9.0.1"
23 | },
24 | "scripts": {
25 | "start-local": "webpack-dev-server --port 5000 --mode development",
26 | "build-local": "webpack --mode development",
27 | "start": "webpack-dev-server --host 0.0.0.0",
28 | "build": "webpack"
29 | },
30 | "devDependencies": {
31 | "@babel/core": "^7.8.7",
32 | "@babel/plugin-transform-class-properties": "^7.22.5",
33 | "@babel/plugin-transform-runtime": "^7.8.3",
34 | "@babel/preset-env": "^7.8.7",
35 | "@babel/preset-react": "^7.8.3",
36 | "@babel/runtime": "^7.8.7",
37 | "axios": "^1.3.4",
38 | "babel-loader": "^8.0.6",
39 | "css-loader": "^5.2.6",
40 | "html-loader": "^4.2.0",
41 | "html-webpack-plugin": "^5.5.3",
42 | "style-loader": "^1.1.3",
43 | "webpack": "^5.88.2",
44 | "webpack-cli": "^5.1.4",
45 | "webpack-dev-server": "^4.15.1"
46 | }
47 | }
--------------------------------------------------------------------------------
/Project/public/OneSignalSDKUpdaterWorker.js:
--------------------------------------------------------------------------------
1 | importScripts('https://cdn.onesignal.com/sdks/OneSignalSDKWorker.js');
--------------------------------------------------------------------------------
/Project/public/OneSignalSDKWorker.js:
--------------------------------------------------------------------------------
1 | importScripts('https://cdn.onesignal.com/sdks/OneSignalSDKWorker.js');
--------------------------------------------------------------------------------
/Project/public/assets/images/ACSBackdrop.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Azure-Samples/communication-services-web-calling-tutorial/8d280b817bbf0fdc1178aa0bfce8ff9d1cea8702/Project/public/assets/images/ACSBackdrop.png
--------------------------------------------------------------------------------
/Project/public/assets/images/MicrosoftLearnBackdrop.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Azure-Samples/communication-services-web-calling-tutorial/8d280b817bbf0fdc1178aa0bfce8ff9d1cea8702/Project/public/assets/images/MicrosoftLearnBackdrop.png
--------------------------------------------------------------------------------
/Project/public/assets/images/acsIcon.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Azure-Samples/communication-services-web-calling-tutorial/8d280b817bbf0fdc1178aa0bfce8ff9d1cea8702/Project/public/assets/images/acsIcon.png
--------------------------------------------------------------------------------
/Project/public/index.html:
--------------------------------------------------------------------------------
1 |
2 |
3 |
4 |
5 |
6 |
7 |
8 |
9 |
10 | ACS Calling Sample
11 |
12 |
13 | You need to enable JavaScript to run this app.
14 |
15 |
19 |
20 |
--------------------------------------------------------------------------------
/Project/serverConfig.json:
--------------------------------------------------------------------------------
1 | {
2 | "connectionString": "REPLACE_WITH_CONNECTION_STRING",
3 | "functionAppOneSignalTokenRegistrationUrl": ""
4 | }
5 |
--------------------------------------------------------------------------------
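Note: `serverConfig.json` holds the Azure Communication Services connection string used on the server side. A minimal sketch of how a token-issuing endpoint might use it with `@azure/communication-identity` (a project dependency); the function name and file layout are illustrative.

```javascript
// Sketch (assumption): mint an ACS identity and a VoIP-scoped token from the connection string.
const { CommunicationIdentityClient } = require('@azure/communication-identity');
const serverConfig = require('./serverConfig.json');

const identityClient = new CommunicationIdentityClient(serverConfig.connectionString);

async function createUserWithVoipToken() {
    // Creates a new ACS user and returns a token scoped to calling.
    const { user, token, expiresOn } = await identityClient.createUserAndToken(['voip']);
    return { communicationUserId: user.communicationUserId, token, expiresOn };
}

module.exports = { createUserWithVoipToken };
```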
/Project/src/App.css:
--------------------------------------------------------------------------------
1 | /*********************************
2 | * Small screen *
3 | *********************************/
4 | @media (max-width: 575.98px) {
5 | .sdk-docs-header {
6 | text-align: left;
7 | }
8 |
9 | .in-call-button {
10 | font-size: x-large;
11 | }
12 |
13 | .remote-video-loading-spinner {
14 | border: 8px solid #f3f3f3;
15 | border-radius: 50%;
16 | border-top: 8px solid #75b6e7;
17 | width: 60px;
18 | height: 60px;
19 | -webkit-animation: spin 2s linear infinite;
20 | /* Safari */
21 | animation: spin 2s linear infinite;
22 | position: absolute;
23 | margin: auto;
24 | top: 0;
25 | bottom: 0;
26 | left: 0;
27 | right: 0;
28 | transform: translate(-50%, -50%);
29 | }
30 |
31 | .card {
32 | border-top: 1px solid #605e5c;
33 | padding-top: 1.5em;
34 | padding-bottom: 1.5em;
35 | padding-right: 0em;
36 | padding-left: 0em;
37 | }
38 |
39 | .login-pannel.teams{
40 | margin-top:10px;
41 | }
42 | }
43 |
44 | body {
45 | margin: 0;
46 | font-size: 14px !important;
47 | font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', 'Roboto', 'Oxygen',
48 | 'Ubuntu', 'Cantarell', 'Fira Sans', 'Droid Sans', 'Helvetica Neue',
49 | sans-serif !important;
50 | -webkit-font-smoothing: antialiased;
51 | -moz-osx-font-smoothing: grayscale;
52 | background-color: #0D1114;
53 | color: #c5c5c5 !important;
54 | font-weight: 500;
55 | }
56 |
57 | /*********************************
58 | * Code Examples *
59 | *********************************/
60 | pre {
61 | overflow-x: auto;
62 | }
63 |
64 | code {
65 | font-family: source-code-pro, Menlo, Monaco, Consolas, 'Courier New',
66 | monospace;
67 | }
68 |
69 | /*********************************
70 | * Headers *
71 | *********************************/
72 | h1,
73 | h2,
74 | h3,
75 | h4,
76 | h5,
77 | h6 {
78 | color: #ffffff;
79 | margin-top: 0;
80 | margin-bottom: .5rem;
81 | font-weight: unset;
82 | }
83 |
84 | .header {
85 | padding-top: 1em;
86 | padding-bottom: 0.5em;
87 | }
88 |
89 | .nav-bar-icon {
90 | height: 5em;
91 | }
92 |
93 | .sdk-docs-header {
94 | text-align: right;
95 | font-weight: 300;
96 | }
97 |
98 | .sdk-docs-link {
99 | color: #edebe9;
100 | text-decoration: underline;
101 | }
102 |
103 | .login-pannel {
104 | height:300px;
105 | }
106 |
107 | input:disabled {
108 | color: #8f8f8f !important;
109 | }
110 |
111 | .loader {
112 | border: 2px solid #edebe9;
113 | border-top: 2px solid #75b6e7;
114 | border-radius: 50%;
115 | width: 14px;
116 | height: 14px;
117 | animation: spin 0.75s linear infinite;
118 | }
119 |
120 | .ringing-loader {
121 | border: 5px solid #edebe9;
122 | border-top: 5px solid #75b6e7;
123 | border-radius: 50%;
124 | width: 30px;
125 | height: 30px;
126 | animation: spin 1.5s linear infinite;
127 | }
128 |
129 | @keyframes spin {
130 | 0% {
131 | transform: rotate(0deg);
132 | }
133 |
134 | 100% {
135 | transform: rotate(360deg);
136 | }
137 | }
138 |
139 | /* Safari */
140 | @-webkit-keyframes spin {
141 | 0% {
142 | -webkit-transform: rotate(0deg);
143 | }
144 |
145 | 100% {
146 | -webkit-transform: rotate(360deg);
147 | }
148 | }
149 |
150 | .identity {
151 | color: #75b6e7
152 | }
153 |
154 | .ListGroup {
155 | max-height: 15rem;
156 | overflow: auto;
157 | }
158 |
159 | .ListGroup li {
160 | padding-top: 0.5rem;
161 | padding-bottom: 0.5rem;
162 | }
163 |
164 | button:focus {
165 | outline: 0;
166 | }
167 |
168 | .card {
169 | border-top: 1px solid #605e5c;
170 | padding-top: 3em;
171 | padding-bottom: 3em;
172 | padding-right: 1em;
173 | padding-left: 1em;
174 | }
175 |
176 | ul {
177 | list-style-type: none;
178 | }
179 |
180 | .pre-call-grid-container {
181 | display: flex;
182 | flex-wrap: wrap;
183 | }
184 |
185 | .pre-call-grid {
186 | display: flex;
187 | flex-direction: column;
188 | padding-right: 2em;
189 | }
190 |
191 | .pre-call-grid span {
192 | margin: 20px 20px 10px 0;
193 | }
194 |
195 | .pre-call-grid-panel {
196 | padding: 0.3em;
197 | border: 1px solid #605e5c;
198 | height: 100%;
199 | }
200 |
201 | .ms-u-sm2 {
202 | width: 8em
203 | }
204 |
205 | .participants-panel {
206 | max-height: 60em;
207 | overflow-y: auto;
208 | padding: 1em;
209 | width: fit-content;
210 | }
211 |
212 | .participants-panel-list {
213 | padding-left: 0;
214 | margin-top: 0;
215 | }
216 |
217 | .participant-item {
218 | display: flex;
219 | flex-wrap: wrap;
220 | padding-top: 2em;
221 | padding-bottom: 2em;
222 | padding-left: 1em;
223 | white-space: nowrap;
224 | }
225 |
226 | .participant-item:hover {
227 | background-color: #201f1e;
228 | }
229 |
230 | .ms-Persona-primaryText, .ms-Persona-primaryText:hover {
231 | color: #edebe9;
232 | text-wrap: wrap;
233 | }
234 |
235 | .participant-remove,
236 | .participant-remove:hover {
237 | color: #a4262c;
238 | text-decoration: none;
239 | }
240 |
241 | .add-participant-panel {
242 | padding: 0.5em;
243 | border: 1px solid #605e5c;
244 | }
245 |
246 | .add-participant-button {
247 | padding-bottom: 0.4em;
248 | padding-right: 0.4em;
249 | padding-left: 0.4em;
250 | font-size: x-large;
251 | color: #edebe9;
252 | }
253 |
254 | .separator {
255 | margin-top: 10px;
256 | margin-bottom: 10px;
257 | border-bottom: 1px solid #605e5c;
258 | }
259 |
260 | .ms-TextField {
261 | margin-top: 25px;
262 | margin-bottom: 25px;
263 | }
264 |
265 | .ms-TextField-fieldGroup {
266 | border: 0px;
267 | box-sizing: unset;
268 | background-color: #0D1114;
269 | }
270 |
271 | .ms-TextField-field {
272 | border: 1px solid #3b3b3b !important;
273 | background-color: #0D1114;
274 | color: #edebe9;
275 | border-radius: 20px;
276 | height: 35px;
277 | padding-left: 1em;
278 | }
279 |
280 | .ms-TextField-field::placeholder {
281 | font-size: 14px;
282 | }
283 |
284 | .ms-TextField-field:hover {
285 | background-color: #0D1114;
286 | color: #edebe9;
287 | }
288 |
289 | .ms-TextField-wrapper>label {
290 | color: #c5c5c5;
291 | font-weight: 400;
292 | font-size: 12px;
293 | }
294 |
295 | div.push-notification-options[disabled],
296 | div.ms-Checkbox.is-disabled>input[type="checkbox"]+label.ms-Checkbox-label>span.ms-Checkbox-text,
297 | .ms-TextField-field:disabled,
298 | .ms-TextField-field::placeholder,
299 | .ms-TextField.is-disabled>.ms-TextField-wrapper>label,
300 | div.ms-Checkbox.is-disabled>input[type="checkbox"]+label.ms-Checkbox-label>div.ms-Checkbox-checkbox>i.ms-Checkbox-checkmark {
301 | color: #484644 !important;
302 | }
303 |
304 | div.ms-Checkbox>input[type="checkbox"]+label.ms-Checkbox-label>div.ms-Checkbox-checkbox {
305 | background-color: #201f1e;
306 | border: 1px solid #605e5c;
307 | }
308 |
309 | div.ms-Checkbox.is-disabled>input[type="checkbox"]+label.ms-Checkbox-label>div.ms-Checkbox-checkbox {
310 | border: 1px solid #484644;
311 | }
312 |
313 | div.ms-Checkbox>input[type="checkbox"]+label.ms-Checkbox-label>span.ms-Checkbox-text {
314 | color: white;
315 | font-size: 0.8rem;
316 | }
317 |
318 | .primary-button {
319 | font-size: 14px;
320 | height: 40px;
321 | color: #000000;
322 | background-color: #e1e1e1;
323 | border: 1px solid #e1e1e1;
324 | outline: none;
325 | margin-right: 1em;
326 | margin-bottom: 1em;
327 | border-radius: 20px;
328 | padding-right: 25px;
329 | padding-left: 25px;
330 | }
331 |
332 | .primary-button:hover {
333 | color: #278cda;
334 | background-color: #201f1e;
335 | border: 1px solid #278cda;
336 | }
337 |
338 | .primary-button:disabled {
339 | background-color: #2f2f2f;
340 | border-color: #4b4b4b;
341 | outline: none;
342 | color: #000000;
343 | }
344 |
345 | .secondary-button {
346 | font-size: 14px;
347 | height: 40px;
348 | color: #000000;
349 | background-color: #e1e1e1;
350 | border: 1px solid #e1e1e1;
351 | outline: none;
352 | margin-right: 1em;
353 | margin-bottom: 1em;
354 | border-radius: 20px;
355 | padding-right: 25px;
356 | padding-left: 25px;
357 | }
358 |
359 | .secondary-button:hover {
360 | color: #278cda;
361 | background-color: #201f1e;
362 | border: 1px solid #278cda;
363 | }
364 |
365 | .call-input-panel {
366 | padding: 0 0.6em;
367 | }
368 |
369 | .call-input-panel-input-label-disabled {
370 | color: #484644 !important;
371 | }
372 |
373 | .in-call-button,
374 | .incoming-call-button {
375 | padding-right: 0.4em;
376 | padding-left: 0.4em;
377 | font-size: 24px;
378 | }
379 |
380 | .incoming-call-button,
381 | .incoming-call-label {
382 | display: inline;
383 | }
384 |
385 | .in-call-button:hover,
386 | .incoming-call-button:hover {
387 | cursor: pointer;
388 | }
389 |
390 | .in-call-button:first-child {
391 | border-radius: 0px;
392 | }
393 |
394 | .video-grid-row {
395 | display: flex;
396 | padding-bottom: 25px;
397 | flex-flow: row wrap;
398 | justify-content: center;
399 | }
400 |
401 | @media screen and (max-width: 1024px){
402 | .video-grid-row {
403 | display: block;
404 | }
405 | }
406 |
407 | .stream-container {
408 | order: 1;
409 | display: none;
410 | position: relative;
411 | flex: 0 20%;
412 | }
413 |
414 | .stream-container.pinned {
415 | order: 0;
416 | flex: 0 100% !important;
417 | text-align: -webkit-center;
418 | }
419 |
420 | .stream-container.pinning-is-active {
421 | flex: 0 25% !important;
422 | }
423 |
424 | .stream-container.stream-count-1,
425 | .stream-container.stream-count-2,
426 | .stream-container.stream-count-3,
427 | .stream-container.stream-count-4 {
428 | flex: 0 49%;
429 | }
430 |
431 | .stream-container.stream-count-5,
432 | .stream-container.stream-count-6,
433 | .stream-container.stream-count-7,
434 | .stream-container.stream-count-8,
435 | .stream-container.stream-count-9 {
436 | flex: 0 33%;
437 | }
438 |
439 | .video-title {
440 | position: absolute;
441 | bottom: 8%;
442 | left: 4%;
443 | margin-bottom: 0px;
444 | width: 50%;
445 | background-color: #0000006e;
446 | margin: 0.4em;
447 | margin-left: 0.5em;
448 | margin-right: 0.5em;
449 | padding: 1em;
450 | padding-top: 0.5em;
451 | padding-bottom: 0.5em;
452 | }
453 |
454 | .video-stats {
455 | position: absolute;
456 | top: 10%;
457 | left: 4%;
458 | margin-bottom: 0px;
459 | background-color: #0000006e;
460 | margin: 0.4em;
461 | margin-left: 0.5em;
462 | margin-right: 0.5em;
463 | padding: 1em;
464 | padding-top: 0.5em;
465 | padding-bottom: 0.5em;
466 | }
467 |
468 | .speaking-border-for-initials>.ms-Persona-coin>.ms-Persona-imageArea>.ms-Persona-initials {
469 | box-shadow: 0px 0px 0px 3px #75b6e7, 0px 0px 20px 5px #75b6e7;
470 | }
471 |
472 | .speaking-border-for-video {
473 | box-shadow: 0px 0px 0px 3px #75b6e7;
474 | }
475 |
476 | .remote-video-container {
477 | position: relative;
478 | }
479 |
480 | .remote-video-container.pinning-is-active {
481 | height: 234px !important;
482 | width: 416px !important;
483 | justify-self: center;
484 | }
485 |
486 | .remote-video-container.pinned {
487 | height: 720px !important;
488 | width: 1280px !important;
489 | justify-self: center;
490 | }
491 |
492 | .remote-video-container.pinned.portrait {
493 | height: 1280px !important;
494 | width: 720px !important;
495 | justify-self: center;
496 | }
497 |
498 | @media screen and (max-width: 1024px){
499 | .remote-video-container.pinned, .remote-video-container.pinned.portrait {
500 | height: 100% !important;
501 | width: 100% !important;
502 | justify-self: unset;
503 | }
504 | }
505 |
506 | .remote-video-container video {
507 | object-fit: contain !important;
508 | object-position: center center;
509 | }
510 |
511 | .remote-video-loading-spinner {
512 | border: 12px solid #f3f3f3;
513 | border-radius: 50%;
514 | border-top: 12px solid #75b6e7;
515 | width: 100px;
516 | height: 100px;
517 | -webkit-animation: spin 2s linear infinite;
518 | /* Safari */
519 | animation: spin 2s linear infinite;
520 | position: absolute;
521 | margin: auto;
522 | top: 0;
523 | bottom: 0;
524 | left: 0;
525 | right: 0;
526 | transform: translate(-50%, -50%);
527 | }
528 |
529 | .pptLive {
530 | display: block;
531 | padding: 2vh 2vw;
532 | width: 100%;
533 | height: 50vh;
534 | }
535 |
536 | .icon-text-large {
537 | vertical-align: middle;
538 | font-size: large;
539 | }
540 |
541 | .icon-text-xlarge {
542 | vertical-align: middle;
543 | font-size: x-large;
544 | }
545 |
546 | .popover {
547 | border: 1px solid #605e5c;
548 | border-radius: 0;
549 | }
550 |
551 | .popover-header {
552 | background-color: #292827;
553 | color: #edebe9;
554 | border-bottom: 1px solid #292827;
555 | border-radius: 0;
556 | }
557 |
558 | .popover-body {
559 | background-color: #292827;
560 | color: #edebe9;
561 | }
562 |
563 | /*********************************
564 | * Width and Height *
565 | *********************************/
566 | .w-25 {
567 | width: 25% !important;
568 | }
569 |
570 | .w-50 {
571 | width: 50% !important;
572 | }
573 |
574 | .w-75 {
575 | width: 75% !important;
576 | }
577 |
578 | .w-100 {
579 | width: 100% !important;
580 | }
581 |
582 | .h-25 {
583 | height: 25% !important;
584 | }
585 |
586 | .h-50 {
587 | height: 50% !important;
588 | }
589 |
590 | .h-75 {
591 | height: 75% !important;
592 | }
593 |
594 | .h-100 {
595 | height: 100% !important;
596 | }
597 |
598 | /*********************************
599 | * Font weights *
600 | *********************************/
601 | .fontweight-100 {
602 | font-weight: 100;
603 | }
604 |
605 | .fontweight-200 {
606 | font-weight: 200;
607 | }
608 |
609 | .fontweight-300 {
610 | font-weight: 300;
611 | }
612 |
613 | .fontweight-400 {
614 | font-weight: 400;
615 | }
616 |
617 | .fontweight-500 {
618 | font-weight: 500;
619 | }
620 |
621 | .fontweight-600 {
622 | font-weight: 600;
623 | }
624 |
625 | .fontweight-700 {
626 | font-weight: 700;
627 | }
628 |
629 | /*********************************
630 | * Alignment *
631 | **********************************/
632 | .align-items-center {
633 | align-items: center;
634 | }
635 |
636 | .justify-content-left {
637 | justify-content: left;
638 | }
639 |
640 | .justify-content-right {
641 | justify-content: right;
642 | }
643 |
644 | .justify-content-center {
645 | justify-content: center;
646 | }
647 |
648 | .text-truncate {
649 | overflow: hidden;
650 | text-overflow: ellipsis;
651 | white-space: nowrap;
652 | }
653 |
654 | .text-left {
655 | text-align: left;
656 | }
657 |
658 | .text-right {
659 | text-align: right;
660 | }
661 |
662 | .text-center {
663 | text-align: center;
664 | }
665 |
666 | .inline-block {
667 | display: inline-block;
668 | vertical-align: middle;
669 | }
670 |
671 | .inline-flex {
672 | display: inline-flex;
673 | }
674 |
675 | /*********************************
676 | * Margin and Padding *
677 | **********************************/
678 | .m-0 {
679 | margin: 0 !important;
680 | }
681 |
682 | .mt-0,
683 | .my-0 {
684 | margin-top: 0 !important;
685 | }
686 |
687 | .mr-0,
688 | .mx-0 {
689 | margin-right: 0 !important;
690 | }
691 |
692 | .mb-0,
693 | .my-0 {
694 | margin-bottom: 0 !important;
695 | }
696 |
697 | .ml-0,
698 | .mx-0 {
699 | margin-left: 0 !important;
700 | }
701 |
702 | .m-1 {
703 | margin: 0.25rem !important;
704 | }
705 |
706 | .mt-1,
707 | .my-1 {
708 | margin-top: 0.25rem !important;
709 | }
710 |
711 | .mr-1,
712 | .mx-1 {
713 | margin-right: 0.25rem !important;
714 | }
715 |
716 | .mb-1,
717 | .my-1 {
718 | margin-bottom: 0.25rem !important;
719 | }
720 |
721 | .ml-1,
722 | .mx-1 {
723 | margin-left: 0.25rem !important;
724 | }
725 |
726 | .m-2 {
727 | margin: 0.5rem !important;
728 | }
729 |
730 | .mt-2,
731 | .my-2 {
732 | margin-top: 0.5rem !important;
733 | }
734 |
735 | .mr-2,
736 | .mx-2 {
737 | margin-right: 0.5rem !important;
738 | }
739 |
740 | .mb-2,
741 | .my-2 {
742 | margin-bottom: 0.5rem !important;
743 | }
744 |
745 | .ml-2,
746 | .mx-2 {
747 | margin-left: 0.5rem !important;
748 | }
749 |
750 | .m-3 {
751 | margin: 1rem !important;
752 | }
753 |
754 | .mt-3,
755 | .my-3 {
756 | margin-top: 1rem !important;
757 | }
758 |
759 | .mr-3,
760 | .mx-3 {
761 | margin-right: 1rem !important;
762 | }
763 |
764 | .mb-3,
765 | .my-3 {
766 | margin-bottom: 1rem !important;
767 | }
768 |
769 | .ml-3,
770 | .mx-3 {
771 | margin-left: 1rem !important;
772 | }
773 |
774 | .m-4 {
775 | margin: 1.5rem !important;
776 | }
777 |
778 | .mt-4,
779 | .my-4 {
780 | margin-top: 1.5rem !important;
781 | }
782 |
783 | .mr-4,
784 | .mx-4 {
785 | margin-right: 1.5rem !important;
786 | }
787 |
788 | .mb-4,
789 | .my-4 {
790 | margin-bottom: 1.5rem !important;
791 | }
792 |
793 | .ml-4,
794 | .mx-4 {
795 | margin-left: 1.5rem !important;
796 | }
797 |
798 | .m-5 {
799 | margin: 3rem !important;
800 | }
801 |
802 | .mt-5,
803 | .my-5 {
804 | margin-top: 3rem !important;
805 | }
806 |
807 | .mr-5,
808 | .mx-5 {
809 | margin-right: 3rem !important;
810 | }
811 |
812 | .mb-5,
813 | .my-5 {
814 | margin-bottom: 3rem !important;
815 | }
816 |
817 | .ml-5,
818 | .mx-5 {
819 | margin-left: 3rem !important;
820 | }
821 |
822 | .p-0 {
823 | padding: 0 !important;
824 | }
825 |
826 | .pt-0,
827 | .py-0 {
828 | padding-top: 0 !important;
829 | }
830 |
831 | .pr-0,
832 | .px-0 {
833 | padding-right: 0 !important;
834 | }
835 |
836 | .pb-0,
837 | .py-0 {
838 | padding-bottom: 0 !important;
839 | }
840 |
841 | .pl-0,
842 | .px-0 {
843 | padding-left: 0 !important;
844 | }
845 |
846 | .p-1 {
847 | padding: 0.25rem !important;
848 | }
849 |
850 | .pt-1,
851 | .py-1 {
852 | padding-top: 0.25rem !important;
853 | }
854 |
855 | .pr-1,
856 | .px-1 {
857 | padding-right: 0.25rem !important;
858 | }
859 |
860 | .pb-1,
861 | .py-1 {
862 | padding-bottom: 0.25rem !important;
863 | }
864 |
865 | .pl-1,
866 | .px-1 {
867 | padding-left: 0.25rem !important;
868 | }
869 |
870 | .p-2 {
871 | padding: 0.5rem !important;
872 | }
873 |
874 | .pt-2,
875 | .py-2 {
876 | padding-top: 0.5rem !important;
877 | }
878 |
879 | .pr-2,
880 | .px-2 {
881 | padding-right: 0.5rem !important;
882 | }
883 |
884 | .pb-2,
885 | .py-2 {
886 | padding-bottom: 0.5rem !important;
887 | }
888 |
889 | .pl-2,
890 | .px-2 {
891 | padding-left: 0.5rem !important;
892 | }
893 |
894 | .p-3 {
895 | padding: 1rem !important;
896 | }
897 |
898 | .pt-3,
899 | .py-3 {
900 | padding-top: 1rem !important;
901 | }
902 |
903 | .pr-3,
904 | .px-3 {
905 | padding-right: 1rem !important;
906 | }
907 |
908 | .pb-3,
909 | .py-3 {
910 | padding-bottom: 1rem !important;
911 | }
912 |
913 | .pl-3,
914 | .px-3 {
915 | padding-left: 1rem !important;
916 | }
917 |
918 | .p-4 {
919 | padding: 1.5rem !important;
920 | }
921 |
922 | .pt-4,
923 | .py-4 {
924 | padding-top: 1.5rem !important;
925 | }
926 |
927 | .pr-4,
928 | .px-4 {
929 | padding-right: 1.5rem !important;
930 | }
931 |
932 | .pb-4,
933 | .py-4 {
934 | padding-bottom: 1.5rem !important;
935 | }
936 |
937 | .pl-4,
938 | .px-4 {
939 | padding-left: 1.5rem !important;
940 | }
941 |
942 | .p-5 {
943 | padding: 3rem !important;
944 | }
945 |
946 | .pt-5,
947 | .py-5 {
948 | padding-top: 3rem !important;
949 | }
950 |
951 | .pr-5,
952 | .px-5 {
953 | padding-right: 3rem !important;
954 | }
955 |
956 | .pb-5,
957 | .py-5 {
958 | padding-bottom: 3rem !important;
959 | }
960 |
961 | .pl-5,
962 | .px-5 {
963 | padding-left: 3rem !important;
964 | }
965 |
966 | .m-n1 {
967 | margin: -0.25rem !important;
968 | }
969 |
970 | .mt-n1,
971 | .my-n1 {
972 | margin-top: -0.25rem !important;
973 | }
974 |
975 | .mr-n1,
976 | .mx-n1 {
977 | margin-right: -0.25rem !important;
978 | }
979 |
980 | .mb-n1,
981 | .my-n1 {
982 | margin-bottom: -0.25rem !important;
983 | }
984 |
985 | .ml-n1,
986 | .mx-n1 {
987 | margin-left: -0.25rem !important;
988 | }
989 |
990 | .m-n2 {
991 | margin: -0.5rem !important;
992 | }
993 |
994 | .mt-n2,
995 | .my-n2 {
996 | margin-top: -0.5rem !important;
997 | }
998 |
999 | .mr-n2,
1000 | .mx-n2 {
1001 | margin-right: -0.5rem !important;
1002 | }
1003 |
1004 | .mb-n2,
1005 | .my-n2 {
1006 | margin-bottom: -0.5rem !important;
1007 | }
1008 |
1009 | .ml-n2,
1010 | .mx-n2 {
1011 | margin-left: -0.5rem !important;
1012 | }
1013 |
1014 | .m-n3 {
1015 | margin: -1rem !important;
1016 | }
1017 |
1018 | .mt-n3,
1019 | .my-n3 {
1020 | margin-top: -1rem !important;
1021 | }
1022 |
1023 | .mr-n3,
1024 | .mx-n3 {
1025 | margin-right: -1rem !important;
1026 | }
1027 |
1028 | .mb-n3,
1029 | .my-n3 {
1030 | margin-bottom: -1rem !important;
1031 | }
1032 |
1033 | .ml-n3,
1034 | .mx-n3 {
1035 | margin-left: -1rem !important;
1036 | }
1037 |
1038 | .m-n4 {
1039 | margin: -1.5rem !important;
1040 | }
1041 |
1042 | .mt-n4,
1043 | .my-n4 {
1044 | margin-top: -1.5rem !important;
1045 | }
1046 |
1047 | .mr-n4,
1048 | .mx-n4 {
1049 | margin-right: -1.5rem !important;
1050 | }
1051 |
1052 | .mb-n4,
1053 | .my-n4 {
1054 | margin-bottom: -1.5rem !important;
1055 | }
1056 |
1057 | .ml-n4,
1058 | .mx-n4 {
1059 | margin-left: -1.5rem !important;
1060 | }
1061 |
1062 | .m-n5 {
1063 | margin: -3rem !important;
1064 | }
1065 |
1066 | .mt-n5,
1067 | .my-n5 {
1068 | margin-top: -3rem !important;
1069 | }
1070 |
1071 | .mr-n5,
1072 | .mx-n5 {
1073 | margin-right: -3rem !important;
1074 | }
1075 |
1076 | .mb-n5,
1077 | .my-n5 {
1078 | margin-bottom: -3rem !important;
1079 | }
1080 |
1081 | .ml-n5,
1082 | .mx-n5 {
1083 | margin-left: -3rem !important;
1084 | }
1085 |
1086 | div.callDebugInfoJSONStringDiv {
1087 | background-color: #292827;
1088 | /* height: 500px;
1089 | width: 1350px; */
1090 | height: auto;
1091 | width: auto;
1092 | max-width: 1350px;
1093 | max-height: 500px;
1094 | overflow-y: auto;
1095 | overflow-x: auto;
1096 | }
1097 |
1098 | .video-effects-loading-spinner {
1099 | border: 8px solid #f3f3f3;
1100 | border-radius: 50%;
1101 | border-top: 8px solid #75b6e7;
1102 | width: 50px;
1103 | height: 50px;
1104 | -webkit-animation: spin 2s linear infinite;
1105 | /* Safari */
1106 | animation: spin 2s linear infinite;
1107 | position: absolute;
1108 | margin: auto;
1109 | top: 0;
1110 | bottom: 0;
1111 | left: 0;
1112 | right: 0;
1113 | transform: translate(-50%, -50%);
1114 | }
1115 |
1116 | .video-effects-image-picker.disabled {
1117 | cursor: not-allowed !important;
1118 | pointer-events: none !important;
1119 | }
1120 |
1121 | .video-effects-image-picker.disabled img {
1122 | filter: brightness(20%);
1123 | }
1124 |
1125 | .video-effects-image-picker img {
1126 | width: 100%;
1127 | height: 100%;
1128 | }
1129 |
1130 | .video-effects-image-picker .selected {
1131 | border: 1px solid #f3f3f3;
1132 | }
1133 |
1134 | .video-effects-image-picker i {
1135 | position: absolute;
1136 | bottom: 10px;
1137 | right: 10px;
1138 | }
1139 |
1140 | .video-effects-image-picker .image-overlay-icon.show {
1141 | display: block;
1142 | }
1143 |
1144 | .video-effects-image-picker .image-overlay-icon.hide {
1145 | display: none;
1146 | }
1147 |
1148 | div.volumeVisualizer {
1149 | --volume: 0%;
1150 | position: relative;
1151 | width: 200px;
1152 | height: 20px;
1153 | margin: 50px;
1154 | background-color: #DDD;
1155 | }
1156 |
1157 | div.volumeVisualizer::before {
1158 | content: '';
1159 | position: absolute;
1160 | top: 0;
1161 | bottom: 0;
1162 | left: 0;
1163 | width: var(--volume);
1164 | background-color: #75b6e7;
1165 | transition: width 100ms linear;
1166 | }
1167 |
1168 | .volume-indicatordiv {
1169 | width: 300px;
1170 | border: 1px solid #605e5c;
1171 | padding: 8px;
1172 | }
1173 |
1174 | .scrollable-captions-container {
1175 | overflow: auto;
1176 | max-height: 300px;
1177 | display: flex;
1178 | flex-direction: column-reverse;
1179 | }
1180 |
1181 | .scrollable-captions-container .captions-area .caption-item {
1182 | transform: translateZ(0);
1183 | }
1184 |
1185 | .scrollable-rtt-container {
1186 | overflow: auto;
1187 | max-height: 300px;
1188 | display: flex;
1189 | flex-direction: column-reverse;
1190 | }
1191 |
1192 | .custom-video-effects-buttons:not(.outgoing) {
1193 | display: flex;
1194 | position: absolute;
1195 | top: 4%;
1196 | left: 5%;
1197 | margin-bottom: -0.5em;
1198 | }
1199 |
1200 | .custom-video-effects-buttons:not(.outgoing) button {
1201 | z-index: 999;
1202 | background-color: #0000006e;
1203 | border: none;
1204 | }
1205 |
1206 | .video-feature-sample .primary-button {
1207 | background: transparent;
1208 | border: 1px solid #fff;
1209 | }
1210 |
1211 | .video-feature-sample .primary-button.is-disabled {
1212 | border: 1px solid grey;
1213 | color: grey;
1214 | }
1215 |
1216 | .spotlightEnabled {
1217 | border: 1px solid #75b6e7;
1218 | }
1219 |
1220 | .lobby-action{
1221 | margin-top: 2px;
1222 | }
1223 |
1224 | .red-link {
1225 | color: red;
1226 | }
1227 |
1228 | .green-link {
1229 | color: green;
1230 | }
1231 |
1232 | .participantMenu, .participantMenu:hover{
1233 | background-color: inherit;
1234 | width: fit-content;
1235 | color: white;
1236 | border: none;
1237 | margin-right: 5px;
1238 | }
1239 | .participantMenu .ms-Icon {
1240 | font-size: 30px;
1241 | border: none;
1242 | }
1243 |
1244 | .callFeatureEnabled, .callFeatureEnabled:hover {
1245 | color: #75b6e7;
1246 | }
1247 |
1248 | .dominant-speakers-list {
1249 | max-height: 12em;
1250 | overflow-y: auto;
1251 | }
--------------------------------------------------------------------------------
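Note: `.volumeVisualizer` in `App.css` renders its level through the `--volume` custom property (the `::before` bar's width is `var(--volume)`). A minimal sketch of driving it from script; the element lookup is illustrative — the real logic lives in the sample's `VolumeVisualizer` component.

```javascript
// Sketch (assumption): update the CSS custom property that sets the volume bar width.
const visualizer = document.querySelector('div.volumeVisualizer');

function renderVolume(levelPercent) {
    // Clamp to 0-100 and write into --volume, which App.css maps to the bar width.
    const clamped = Math.min(100, Math.max(0, levelPercent));
    visualizer.style.setProperty('--volume', `${clamped}%`);
}

renderVolume(42);
```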
/Project/src/App.js:
--------------------------------------------------------------------------------
1 | import React, { useState } from 'react';
2 | import './App.css';
3 | import MakeCall from './MakeCall/MakeCall'
4 | import { initializeIcons } from '@fluentui/font-icons-mdl2';
5 | import { registerIcons } from '@fluentui/react/lib/Styling';
6 | import { VideoOff2Icon } from '@fluentui/react-icons-mdl2';
7 | import { ToastContainer } from 'react-toastify';
8 |
9 | initializeIcons();
10 | // VideoOff2 is not available in fluentui/react-icons-mdl2. So we need to use fluentui/font-icons-mdl2 to register the icon and use it.
11 | registerIcons({
12 | icons: {
13 | VideoOff2:
14 | }
15 | });
16 |
17 | function App() {
18 | let [users, setUsers] = useState([ ]);
19 |
20 | function VWebSdkVersion() {
21 | return require('../package.json').dependencies['@azure/communication-calling'];
22 | }
23 |
24 | return (
25 |
26 |
27 |
28 |
29 |
30 |
31 |
{setUsers([...users,
]) }} className="nav-bar-icon" src="./assets/images/acsIcon.png">
32 |
33 |
34 | Azure Communication Services - Calling SDK for Javascript - { VWebSdkVersion() }
35 |
36 |
37 |
45 |
46 |
47 |
48 | {users}
49 |
50 |
51 | );
52 | }
53 |
54 | export default App;
55 |
--------------------------------------------------------------------------------
/Project/src/Constants.js:
--------------------------------------------------------------------------------
1 | export const URL_PARAM = {
2 | DISPLAY_NAME: 'dn',
3 | MEETING_LINK: 'ml',
4 | VIDEO: 'video',
5 | MIC: 'mic',
6 | ON: 'on'
7 | }
--------------------------------------------------------------------------------
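Note: `URL_PARAM` defines the query-string keys the sample recognizes. A minimal sketch of how they could be read with the standard `URLSearchParams` API; this is illustrative, not code from the sample.

```javascript
// Sketch (assumption): reading the query-string keys defined in Constants.js.
import { URL_PARAM } from './Constants';

const params = new URLSearchParams(window.location.search);
const displayName = params.get(URL_PARAM.DISPLAY_NAME);              // e.g. ?dn=Alice
const meetingLink = params.get(URL_PARAM.MEETING_LINK);              // e.g. ?ml=<meeting-url>
const startWithVideo = params.get(URL_PARAM.VIDEO) === URL_PARAM.ON; // ?video=on
const startWithMic = params.get(URL_PARAM.MIC) === URL_PARAM.ON;     // ?mic=on
```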
/Project/src/MakeCall/AddParticipantPopover.js:
--------------------------------------------------------------------------------
1 | import React, { useState } from "react";
2 | import { PrimaryButton } from '@fluentui/react/lib/Button';
3 | import { TextField } from '@fluentui/react/lib/TextField';
4 | import { CallKind } from "@azure/communication-calling";
5 | import { createIdentifierFromRawId } from '@azure/communication-common';
6 |
7 | export default function AddParticipantPopover({call}) {
8 | const [userId, setUserId] = useState('');
9 | const [threadId, setThreadId] = useState('');
10 | const [alternateCallerId, setAlternateCallerId] = useState('');
11 |
12 | function handleAddParticipant() {
13 | console.log('handleAddParticipant', userId);
14 | try {
15 | let participantId = createIdentifierFromRawId(userId);
16 | call._kind === CallKind.TeamsCall ?
17 | call.addParticipant(participantId, {threadId}) :
18 | call.addParticipant(participantId);
19 | } catch (e) {
20 | console.error(e);
21 | }
22 | }
23 |
24 | function handleAddPhoneNumber() {
25 | console.log('handleAddPhoneNumber', userId);
26 | try {
27 | call.addParticipant({ phoneNumber: userId }, { alternateCallerId: { phoneNumber: alternateCallerId }});
28 | } catch (e) {
29 | console.error(e);
30 | }
31 | }
32 |
33 | return (
34 |
35 |
36 |
37 |
38 |
Add a participant
39 |
40 |
setUserId(e.target.value)} />
41 | {
42 | call._kind === CallKind.TeamsCall &&
43 | setThreadId(e.target.value)} />
44 | }
45 | {
46 | call._kind === CallKind.Call &&
47 | setAlternateCallerId(e.target.value)} />
48 | }
49 | Add Participant
50 | {
51 | call._kind === CallKind.Call &&
52 | < PrimaryButton className="secondary-button mt-1" onClick={handleAddPhoneNumber}>Add Phone Number
53 | }
54 |
55 |
56 |
57 |
58 |
59 | );
60 | }
--------------------------------------------------------------------------------
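Note: `AddParticipantPopover` turns the typed value into an identifier with `createIdentifierFromRawId`, passing a `threadId` for Teams calls and an `alternateCallerId` for PSTN adds. A minimal sketch of the raw-ID formats that helper accepts; the concrete values are placeholders.

```javascript
// Sketch (assumption): raw ID strings accepted by createIdentifierFromRawId.
import { createIdentifierFromRawId } from '@azure/communication-common';

// An ACS user raw ID ("8:acs:..." prefix) yields a communicationUser identifier.
const acsUser = createIdentifierFromRawId('8:acs:<resource-id>_<user-guid>');

// A phone-number raw ID ("4:" prefix) yields a phoneNumber identifier.
const pstnUser = createIdentifierFromRawId('4:+14255550123');

console.log(acsUser.kind, pstnUser.kind); // "communicationUser", "phoneNumber"
```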
/Project/src/MakeCall/AudioEffects/AudioEffectsContainer.js:
--------------------------------------------------------------------------------
1 | import React from 'react';
2 | import { Features, LocalAudioStream } from '@azure/communication-calling';
3 | import {
4 | EchoCancellationEffect,
5 | DeepNoiseSuppressionEffect
6 | } from '@azure/communication-calling-effects';
7 | import { Dropdown, PrimaryButton } from '@fluentui/react';
8 |
9 | export const LoadingSpinner = () => {
10 | return (
11 |
12 | );
13 | };
14 |
15 | export default class AudioEffectsContainer extends React.Component {
16 | constructor(props) {
17 | super(props);
18 | this.call = props.call;
19 | this.deviceManager = props.deviceManager;
20 | this.localAudioStreamFeatureApi = null;
21 | this.localAudioStream = null;
22 |
23 | this.state = {
24 | supportedAudioEffects: [],
25 | supportedAudioEffectsPopulated: false,
26 | autoGainControl: {
27 | startLoading: false,
28 | stopLoading: false,
29 | autoGainControlList: [],
30 | currentSelected: undefined
31 | },
32 | echoCancellation: {
33 | startLoading: false,
34 | stopLoading: false,
35 | echoCancellationList: [],
36 | currentSelected: undefined
37 | },
38 | noiseSuppression: {
39 | startLoading: false,
40 | stopLoading: false,
41 | noiseSuppressionList: [],
42 | currentSelected: undefined
43 | },
44 | activeEffects: {
45 | autoGainControl: [],
46 | echoCancellation: [],
47 | noiseSuppression: []
48 | }
49 | };
50 |
51 | this.initLocalAudioStreamFeatureApi();
52 | }
53 |
54 | componentDidCatch(e) {
55 | this.logError(JSON.stringify(e));
56 | }
57 |
58 | componentDidMount() {
59 | this.populateAudioEffects();
60 | }
61 |
62 | logError(error) {
63 | console.error(error);
64 | }
65 |
66 | logWarn(error) {
67 | console.warn(error);
68 | }
69 |
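// Finds the call's local audio stream (or creates one from the selected microphone),
// grabs its AudioEffects feature API, and subscribes to effects started/stopped/error events.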
70 | initLocalAudioStreamFeatureApi() {
71 | const localAudioStream = this.call.localAudioStreams.find(a => {
72 | return a.mediaStreamType === 'Audio';
73 | });
74 |
75 | if (!localAudioStream) {
76 | this.logWarn('No local audio streams found, creating a new one..');
77 | const selectedMicrophone = this.deviceManager.selectedMicrophone;
78 | if (selectedMicrophone) {
79 | this.localAudioStream = new LocalAudioStream(selectedMicrophone);
80 | } else {
81 | this.logWarn('No selected microphone found, cannot create LocalAudioStream');
82 | return;
83 | }
84 | } else {
85 | this.localAudioStream = localAudioStream;
86 | }
87 |
88 | const lasFeatureApi = this.localAudioStream.feature && this.localAudioStream.feature(Features?.AudioEffects);
89 | if (!lasFeatureApi) {
90 | this.logError('Could not get local audio stream feature API.');
91 | return;
92 | }
93 | this.localAudioStreamFeatureApi = lasFeatureApi;
94 |
95 | this.localAudioStreamFeatureApi.on('effectsError', (error) => {
96 | this.logError(JSON.stringify(error));
97 | });
98 |
99 | this.localAudioStreamFeatureApi.on('effectsStarted', (effect) => {
100 | this.updateActiveEffects();
101 | console.log(`Audio effects started: ${JSON.stringify(effect?.name ?? effect)}`);
102 | });
103 |
104 | this.localAudioStreamFeatureApi.on('effectsStopped', (effect) => {
105 | this.updateActiveEffects();
106 | console.log(`Audio effects stopped: ${JSON.stringify(effect?.name ?? effect)}`);
107 | });
108 | }
109 |
110 | updateActiveEffects() {
111 | this.setState({
112 | activeEffects: {
113 | autoGainControl: this.localAudioStreamFeatureApi?.activeEffects?.autoGainControl,
114 | echoCancellation: this.localAudioStreamFeatureApi?.activeEffects?.echoCancellation,
115 | noiseSuppression: this.localAudioStreamFeatureApi?.activeEffects?.noiseSuppression
116 | }
117 | });
118 | }
119 |
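// Queries which effects are supported in this environment (browser AGC/EC/NS plus the
// calling-effects EchoCancellation and DeepNoiseSuppression effects) and fills the dropdown lists.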
120 | async populateAudioEffects() {
121 | const supported = [];
122 |
123 | const autoGainControlList = [];
124 | const echoCancellationList = [];
125 | const noiseSuppressionList = [];
126 |
127 | if (this.localAudioStreamFeatureApi) {
128 | if (await this.localAudioStreamFeatureApi.isSupported('BrowserAutoGainControl')) {
129 | supported.push('BrowserAutoGainControl');
130 | autoGainControlList.push({
131 | key: 'BrowserAutoGainControl',
132 | text: 'Browser Auto Gain Control'
133 | });
134 | }
135 |
136 | if (await this.localAudioStreamFeatureApi.isSupported('BrowserEchoCancellation')) {
137 | supported.push('BrowserEchoCancellation');
138 | echoCancellationList.push({
139 | key: 'BrowserEchoCancellation',
140 | text: 'Browser Echo Cancellation'
141 | });
142 | }
143 |
144 | if (await this.localAudioStreamFeatureApi.isSupported('BrowserNoiseSuppression')) {
145 | supported.push('BrowserNoiseSuppression');
146 | noiseSuppressionList.push({
147 | key: 'BrowserNoiseSuppression',
148 | text: 'Browser Noise Suppression'
149 | });
150 | }
151 |
152 | const echoCancellation = new EchoCancellationEffect();
153 | if (await this.localAudioStreamFeatureApi.isSupported(echoCancellation)) {
154 | supported.push(echoCancellation);
155 | echoCancellationList.push({
156 | key: echoCancellation.name,
157 | text: 'Echo Cancellation'
158 | });
159 | }
160 |
161 | const deepNoiseSuppression = new DeepNoiseSuppressionEffect();
162 | if (await this.localAudioStreamFeatureApi.isSupported(deepNoiseSuppression)) {
163 | supported.push(deepNoiseSuppression);
164 | noiseSuppressionList.push({
165 | key: deepNoiseSuppression.name,
166 | text: 'Deep Noise Suppression'
167 | });
168 | }
169 |
170 | this.setState({
171 | supportedAudioEffects: [ ...supported ],
172 | supportedAudioEffectsPopulated: true,
173 | autoGainControl: {
174 | ...this.state.autoGainControl,
175 | autoGainControlList
176 | },
177 | echoCancellation: {
178 | ...this.state.echoCancellation,
179 | echoCancellationList
180 | },
181 | noiseSuppression: {
182 | ...this.state.noiseSuppression,
183 | noiseSuppressionList
184 | },
185 | activeEffects: {
186 | autoGainControl: this.localAudioStreamFeatureApi?.activeEffects?.autoGainControl,
187 | echoCancellation: this.localAudioStreamFeatureApi?.activeEffects?.echoCancellation,
188 | noiseSuppression: this.localAudioStreamFeatureApi?.activeEffects?.noiseSuppression
189 | }
190 | });
191 | }
192 | }
193 |
194 | findEffectFromSupportedList(name) {
195 | const effect = this.state.supportedAudioEffects.find((supportedEffect) => {
196 | if (typeof supportedEffect === 'string' && supportedEffect === name) {
197 | return true;
198 | } else if (typeof supportedEffect === 'object' && supportedEffect.name && supportedEffect.name === name) {
199 | return true;
200 | }
201 | });
202 |
203 | return effect;
204 | }
205 |
206 | /* ------------ AGC control functions - start ---------------- */
207 | agcSelectionChanged(e, item) {
208 | const effect = this.findEffectFromSupportedList(item.key);
209 | if (effect) {
210 | this.setState({
211 | autoGainControl: {
212 | ...this.state.autoGainControl,
213 | currentSelected: effect
214 | }
215 | });
216 | }
217 | }
218 |
219 | async startAgc() {
220 | this.setState({
221 | autoGainControl: {
222 | ...this.state.autoGainControl,
223 | startLoading: true
224 | }
225 | });
226 |
227 | if (this.localAudioStreamFeatureApi) {
228 | await this.localAudioStreamFeatureApi.startEffects({
229 | autoGainControl: this.state.autoGainControl.currentSelected
230 | });
231 | }
232 |
233 | this.setState({
234 | autoGainControl: {
235 | ...this.state.autoGainControl,
236 | startLoading: false
237 | }
238 | });
239 | }
240 |
241 | async stopAgc() {
242 | this.setState({
243 | autoGainControl: {
244 | ...this.state.autoGainControl,
245 | stopLoading: true
246 | }
247 | });
248 |
249 | if (this.localAudioStreamFeatureApi) {
250 | await this.localAudioStreamFeatureApi.stopEffects({
251 | autoGainControl: true
252 | });
253 | }
254 |
255 | this.setState({
256 | autoGainControl: {
257 | ...this.state.autoGainControl,
258 | stopLoading: false
259 | }
260 | });
261 | }
262 | /* ------------ AGC control functions - end ---------------- */
263 |
264 | /* ------------ EC control functions - start ---------------- */
265 | ecSelectionChanged(e, item) {
266 | const effect = this.findEffectFromSupportedList(item.key);
267 | if (effect) {
268 | this.setState({
269 | echoCancellation: {
270 | ...this.state.echoCancellation,
271 | currentSelected: effect
272 | }
273 | });
274 | }
275 | }
276 |
277 | async startEc() {
278 | this.setState({
279 | echoCancellation: {
280 | ...this.state.echoCancellation,
281 | startLoading: true
282 | }
283 | });
284 |
285 | if (this.localAudioStreamFeatureApi) {
286 | await this.localAudioStreamFeatureApi.startEffects({
287 | echoCancellation: this.state.echoCancellation.currentSelected
288 | });
289 | }
290 |
291 | this.setState({
292 | echoCancellation: {
293 | ...this.state.echoCancellation,
294 | startLoading: false
295 | }
296 | });
297 | }
298 |
299 | async stopEc() {
300 | this.setState({
301 | echoCancellation: {
302 | ...this.state.echoCancellation,
303 | stopLoading: true
304 | }
305 | });
306 |
307 | if (this.localAudioStreamFeatureApi) {
308 | await this.localAudioStreamFeatureApi.stopEffects({
309 | echoCancellation: true
310 | });
311 | }
312 |
313 | this.setState({
314 | echoCancellation: {
315 | ...this.state.echoCancellation,
316 | stopLoading: false
317 | }
318 | });
319 | }
320 | /* ------------ EC control functions - end ---------------- */
321 |
322 | /* ------------ NS control functions - start ---------------- */
323 | nsSelectionChanged(e, item) {
324 | const effect = this.findEffectFromSupportedList(item.key);
325 | if (effect) {
326 | this.setState({
327 | noiseSuppression: {
328 | ...this.state.noiseSuppression,
329 | currentSelected: effect
330 | }
331 | });
332 | }
333 | }
334 |
335 | async startNs() {
336 | this.setState({
337 | noiseSuppression: {
338 | ...this.state.noiseSuppression,
339 | startLoading: true
340 | }
341 | });
342 |
343 | if (this.localAudioStreamFeatureApi) {
344 | await this.localAudioStreamFeatureApi.startEffects({
345 | noiseSuppression: this.state.noiseSuppression.currentSelected
346 | });
347 | }
348 |
349 | this.setState({
350 | noiseSuppression: {
351 | ...this.state.noiseSuppression,
352 | startLoading: false
353 | }
354 | });
355 | }
356 |
357 | async stopNs() {
358 | this.setState({
359 | noiseSuppression: {
360 | ...this.state.noiseSuppression,
361 | stopLoading: true
362 | }
363 | });
364 |
365 | if (this.localAudioStreamFeatureApi) {
366 | await this.localAudioStreamFeatureApi.stopEffects({
367 | noiseSuppression: true
368 | });
369 | }
370 |
371 | this.setState({
372 | noiseSuppression: {
373 | ...this.state.noiseSuppression,
374 | stopLoading: false
375 | }
376 | });
377 | }
378 | /* ------------ NS control functions - end ---------------- */
379 |
380 | render() {
381 | return (
382 | <>
383 | {this.state.supportedAudioEffects.length > 0 ?
384 |
385 |
386 |
387 |
Current active:
388 |
389 |
390 |
391 | {this.state.activeEffects.autoGainControl?.length > 0 &&
392 |
393 | {this.state.activeEffects.autoGainControl[0]}
394 |
395 | }
396 | {this.state.activeEffects.echoCancellation?.length > 0 &&
397 |
398 | {this.state.activeEffects.echoCancellation[0]}
399 |
400 | }
401 | {this.state.activeEffects.noiseSuppression?.length > 0 &&
402 |
403 | {this.state.activeEffects.noiseSuppression[0]}
404 |
405 | }
406 |
407 |
408 |
409 | this.agcSelectionChanged(e, item)}
412 | options={this.state.autoGainControl.autoGainControlList}
413 | placeholder={'Select an option'}
414 | styles={{ dropdown: { width: 300, color: 'black' }, label: { color: 'white' } }}
415 | />
416 |
417 |
418 |
this.startAgc()}
421 | >
422 | {this.state.autoGainControl.startLoading ? : 'Start AGC'}
423 |
424 |
425 |
this.stopAgc()}
428 | >
429 | {this.state.autoGainControl.stopLoading ? : 'Stop AGC'}
430 |
431 |
432 |
433 |
434 |
435 |
436 | this.ecSelectionChanged(e, item)}
439 | options={this.state.echoCancellation.echoCancellationList}
440 | placeholder={'Select an option'}
441 | styles={{ dropdown: { width: 300, color: 'black' }, label: { color: 'white' } }}
442 | />
443 |
444 |
445 |
this.startEc()}
448 | >
449 | {this.state.echoCancellation.startLoading ? : 'Start EC'}
450 |
451 |
452 |
this.stopEc()}
455 | >
456 | {this.state.echoCancellation.stopLoading ? : 'Stop EC'}
457 |
458 |
459 |
460 |
461 |
462 |
463 | this.nsSelectionChanged(e, item)}
466 | options={this.state.noiseSuppression.noiseSuppressionList}
467 | placeholder={'Select an option'}
468 | styles={{ dropdown: { width: 300, color: 'black' }, label: { color: 'white' } }}
469 | />
470 |
471 |
472 |
this.startNs()}
475 | >
476 | {this.state.noiseSuppression.startLoading ? : 'Start NS'}
477 |
478 |
479 |
this.stopNs()}
482 | >
483 | {this.state.noiseSuppression.stopLoading ? : 'Stop NS'}
484 |
485 |
486 |
487 |
488 | :
489 |
490 | Audio effects and enhancements are not supported in the current environment.
491 | They are currently only supported on Windows Chrome, Windows Edge, macOS Chrome, macOS Edge, and macOS Safari.
492 |
493 | }
494 | >
495 | )
496 | }
497 | }
498 |
--------------------------------------------------------------------------------
/Project/src/MakeCall/CallCaption.js:
--------------------------------------------------------------------------------
1 | import React, { useEffect, useState } from "react";
2 | import { Features } from '@azure/communication-calling';
3 | import { Dropdown } from '@fluentui/react/lib/Dropdown';
4 |
5 | // CallCaption react function component
6 | const CallCaption = ({ call }) => {
7 | const [captionsFeature, setCaptionsFeature] = useState(call.feature(Features.Captions));
8 | const [capabilitiesFeature, setCapabilitiesFeature] = useState(call.feature(Features.Capabilities));
9 | const [captions, setCaptions] = useState(captionsFeature.captions);
10 | const [currentSpokenLanguage, setCurrentSpokenLanguage] = useState(captions.activeSpokenLanguage);
11 | const [currentCaptionLanguage, setCurrentCaptionLanguage] = useState(null);
12 | let captionLanguageCurrent = null;
13 |
14 |
15 | useEffect(() => {
16 | try {
17 | startCaptions(captions);
18 | }
19 | catch(e) {
20 | console.log("Captions not configured for this release version")
21 | }
22 |
23 | return () => {
24 | // cleanup
25 | stopCaptions(captions);
26 | captions.off('CaptionsActiveChanged', captionsActiveHandler);
27 | captions.off('CaptionsReceived', captionsReceivedHandler);
28 | captions.off('SpokenLanguageChanged', activeSpokenLanguageHandler);
29 | if (captions.kind === 'TeamsCaptions' && capabilitiesFeature.capabilities.setCaptionLanguage?.isPresent) {
30 | captions.off('CaptionLanguageChanged', activeCaptionLanguageHandler);
31 | }
32 | };
33 | }, []);
34 |
35 | const startCaptions = async () => {
36 | try {
37 | if (!captions.isCaptionsFeatureActive) {
38 | await captions.startCaptions({ spokenLanguage: 'en-us' });
39 | }
40 | captions.on('CaptionsActiveChanged', captionsActiveHandler);
41 | captions.on('CaptionsReceived', captionsReceivedHandler);
42 | captions.on('SpokenLanguageChanged', activeSpokenLanguageHandler);
43 | capabilitiesFeature.on('CapabilitiesChanged', (value) => {
44 | if (value.newValue.setCaptionLanguage) {
45 | setCapabilitiesFeature(call.feature(Features.Capabilities));
46 | }
47 | });
48 | if (captions.kind === 'TeamsCaptions' && capabilitiesFeature.capabilities.setCaptionLanguage?.isPresent) {
49 | captions.on('CaptionLanguageChanged', activeCaptionLanguageHandler);
50 | }
51 | } catch (e) {
52 | console.error('startCaptions failed', e);
53 | }
54 | };
55 |
56 | const stopCaptions = async () => {
57 | try {
58 | await captions.stopCaptions();
59 | } catch (e) {
60 | console.error('stopCaptions failed', e);
61 | }
62 | };
63 |
64 | const captionsActiveHandler = () => {
65 | console.log('CaptionsActiveChanged: ', captions.isCaptionsFeatureActive);
66 | setCurrentSpokenLanguage(captions.activeSpokenLanguage);
67 | setCurrentCaptionLanguage(captions.activeCaptionLanguage);
68 | }
69 | const activeSpokenLanguageHandler = () => {
70 | setCurrentSpokenLanguage(captions.activeSpokenLanguage);
71 | }
72 | const activeCaptionLanguageHandler = () => {
73 | setCurrentCaptionLanguage(captions.activeCaptionLanguage);
74 | captionLanguageCurrent = captions.activeCaptionLanguage;
75 | }
76 |
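// Renders each caption into the #captionsArea container: partial results update the speaker's
// in-progress element in place, and a 'Final' result marks it complete so the next caption starts a new element.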
77 | const captionsReceivedHandler = (captionData) => {
78 | if (!captionLanguageCurrent || captionLanguageCurrent === captionData.captionLanguage) {
79 | let mri = '';
80 | switch (captionData.speaker.identifier.kind) {
81 | case 'communicationUser': { mri = captionData.speaker.identifier.communicationUserId; break; }
82 | case 'microsoftTeamsUser': { mri = captionData.speaker.identifier.microsoftTeamsUserId; break; }
83 | case 'phoneNumber': { mri = captionData.speaker.identifier.phoneNumber; break; }
84 | }
85 | let captionAreasContainer = document.getElementById('captionsArea');
86 | const newClassName = `prefix${mri.replace(/:/g, '').replace(/-/g, '').replace(/\+/g, '')}`;
87 | const captionText = `${captionData.timestamp.toUTCString()}
88 | ${captionData.speaker.displayName ?? mri}: ${captionData.captionText ?? captionData.spokenText}`;
89 |
90 | let foundCaptionContainer = captionAreasContainer.querySelector(`.${newClassName}[isNotFinal='true']`);
91 |
92 | if (!foundCaptionContainer) {
93 | let captionContainer = document.createElement('div');
94 | captionContainer.setAttribute('isNotFinal', 'true');
95 | captionContainer.style['borderBottom'] = '1px solid';
96 | captionContainer.style['whiteSpace'] = 'pre-line';
97 | captionContainer.textContent = captionText;
98 | captionContainer.classList.add(newClassName);
99 | captionContainer.classList.add('caption-item')
100 |
101 | captionAreasContainer.appendChild(captionContainer);
102 |
103 | } else {
104 | foundCaptionContainer.textContent = captionText;
105 |
106 | if (captionData.resultType === 'Final') {
107 | foundCaptionContainer.setAttribute('isNotFinal', 'false');
108 | }
109 | }
110 | }
111 | };
112 |
113 | const spokenLanguageSelectionChanged = async (event, item) => {
114 | const spokenLanguages = captions.supportedSpokenLanguages;
115 | const language = spokenLanguages.find(language => { return language === item.key });
116 | await captions.setSpokenLanguage(language);
117 | setCurrentSpokenLanguage(language);
118 | };
119 |
120 | const SpokenLanguageDropdown = () => {
121 | const keyedSupportedSpokenLanguages = captions.supportedSpokenLanguages.map(language => ({key: language, text: language}));
122 | return
129 | }
130 |
131 | const captionLanguageSelectionChanged = async (event, item) => {
132 | const captionLanguages = captions.supportedCaptionLanguages;
133 | const language = captionLanguages.find(language => { return language === item.key });
134 | await captions.setCaptionLanguage(language);
135 | setCurrentCaptionLanguage(language);
136 | };
137 |
138 | const CaptionLanguageDropdown = () => {
139 | const keyedSupportedCaptionLanguages = captions.supportedCaptionLanguages.map(language => ({key: language, text: language}));
140 | return
147 | }
148 |
149 | return (
150 | <>
151 | {captions && }
152 | {captions && captions.kind === 'TeamsCaptions' && capabilitiesFeature.capabilities.setCaptionLanguage?.isPresent && }
153 |
157 | >
158 | );
159 | };
160 |
161 | export default CallCaption;
162 |
--------------------------------------------------------------------------------
/Project/src/MakeCall/CallSurvey.js:
--------------------------------------------------------------------------------
1 | import React from "react";
2 | import {
3 | PrimaryButton
4 | } from '@fluentui/react/lib/Button'
5 | import StarRating from '../MakeCall/StarRating';
6 | import { Features } from '@azure/communication-calling';
7 | import config from '../../clientConfig.json';
8 | import { ApplicationInsights } from '@microsoft/applicationinsights-web';
9 | import { TextField } from '@fluentui/react/lib/TextField';
10 |
11 | export default class CallSurvey extends React.Component {
12 | constructor(props) {
13 | super(props);
14 | this.call = props.call;
15 | this.state = {
16 | overallIssue: '',
17 | overallRating: 0,
18 | audioIssue: '',
19 | audioRating: 0,
20 | videoIssue: '',
21 | videoRating: 0,
22 | screenShareIssue: '',
23 | screenShareRating: 0,
24 | surveyError: '',
25 | improvementSuggestion: ''
26 | };
27 |
28 | if (config.appInsightsConnectionString) {
29 | // Typical application will have the app insights already initialized, so that can be used here.
30 | this.appInsights = new ApplicationInsights({
31 | config: {
32 | // Use atob function to decode only if the connection string is base64. To encode: btoa("connection string")
33 | connectionString: atob(config.appInsightsConnectionString)
34 | }
35 | });
36 | this.appInsights.loadAppInsights();
37 | }
38 |
39 | }
40 |
41 | componentWillUnmount() {
42 |
43 | }
44 |
45 | componentDidMount() {
46 |
47 | }
48 |
49 | captureRating(category, score) {
50 | if (category == 'overall') {
51 | this.setState({ overallRating: score });
52 | } else if (category == 'audio') {
53 | this.setState({ audioRating: score });
54 | } else if (category == 'video') {
55 | this.setState({ videoRating: score });
56 | } else if (category == 'screenShare') {
57 | this.setState({ screenShareRating: score });
58 | }
59 | }
60 |
61 | captureIssue(category, issue) {
62 | if (category == 'overall') {
63 | this.setState({ overallIssue: issue });
64 | } else if (category == 'audio') {
65 | this.setState({ audioIssue: issue });
66 | } else if (category == 'video') {
67 | this.setState({ videoIssue: issue });
68 | } else if (category == 'screenShare') {
69 | this.setState({ screenShareIssue: issue });
70 | }
71 |
72 | }
73 |
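// Builds the survey payload (overall rating is always included; audio/video/screen share only when rated),
// submits it via the CallSurvey feature, and logs any free-form suggestion to App Insights keyed by the survey id.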
74 | submitRating() {
75 | const rating = {};
76 | rating.overallRating = { score: this.state.overallRating };
77 | if (this.state.overallIssue) rating.overallRating.issues = [this.state.overallIssue];
78 |
79 | if (this.state.audioRating !== 0) rating.audioRating = { score: this.state.audioRating };
80 | if (this.state.audioIssue) rating.audioRating.issues = [this.state.audioIssue];
81 |
82 | if (this.state.videoRating !== 0) rating.videoRating = { score: this.state.videoRating };
83 | if (this.state.videoIssue) rating.videoRating.issues = [this.state.videoIssue];
84 |
85 | if (this.state.screenShareRating !== 0) rating.screenshareRating = { score: this.state.screenShareRating };
86 | if (this.state.screenShareIssue) rating.screenshareRating.issues = [this.state.screenShareIssue];
87 |
88 | this.call.feature(Features.CallSurvey).submitSurvey(rating).then((res) => {
89 | if (this.appInsights && this.state.improvementSuggestion !== '') {
90 | this.appInsights.trackEvent({
91 | name: "CallSurvey", properties: {
92 | // Survey ID to correlate the survey
93 | id: res.id,
94 | // Other custom properties as key value pair
95 | improvementSuggestion: this.state.improvementSuggestion
96 | }
97 | });
98 | this.appInsights.flush();
99 | }
100 | this.props.onSubmitted();
101 | }).catch((e) => {
102 | console.error('Failed to submit survey', e);
103 | this.setState({ surveyError: 'Failed to submit survey: ' + e });
104 | });
105 | }
106 |
107 | render() {
108 | return (
109 |
110 |
111 |
112 |
Rate your recent call!
113 | {
114 | this.state.surveyError !== '' && {this.state.surveyError}
115 | }
116 |
117 |
118 |
this.submitRating()}>
123 |
124 |
125 |
126 |
127 |
128 |
Rate your overall call experience
129 |
130 | this.captureIssue(category, issue)}
134 | onRate={(category, score) => this.captureRating(category, score)}
135 | />
136 |
137 |
138 |
139 |
140 |
Rate your audio experience (optional)
141 |
142 | this.captureIssue(category, issue)}
146 | onRate={(category, score) => this.captureRating(category, score)}
147 | />
148 |
149 |
150 |
151 |
152 |
Rate your video experience (optional)
153 |
154 | this.captureIssue(category, issue)}
158 | onRate={(category, score) => this.captureRating(category, score)}
159 | />
160 |
161 |
162 |
163 |
164 |
165 |
Rate your screen share experience (optional)
166 |
167 | this.captureIssue(category, issue)}
171 | onRate={(category, score) => this.captureRating(category, score)}
172 | />
173 |
174 |
175 |
176 |
177 |
How can we improve? (optional)
178 |
179 | { this.state.improvementSuggestion = e.target.value }}/>
184 |
185 |
186 |
187 |
188 | );
189 | }
190 | }
191 |
--------------------------------------------------------------------------------
/Project/src/MakeCall/CurrentCallInformation.js:
--------------------------------------------------------------------------------
1 | import React, { useState, useEffect } from "react";
2 | import { Features } from "@azure/communication-calling";
3 | import { AzureLogger } from '@azure/logger';
4 |
5 | const CurrentCallInformation = ({ sentResolution, call }) => {
6 | const [ovcFeature, setOvcFeature] = useState();
7 | const [optimalVideoCount, setOptimalVideoCount] = useState(1);
8 |
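// The OptimalVideoCount feature reports how many incoming video streams the SDK recommends rendering;
// it is wrapped in try/catch in case the feature is unavailable in this SDK version.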
9 | useEffect(() => {
10 | try {
11 | setOvcFeature(call.feature(Features.OptimalVideoCount));
12 | } catch (error) {
13 | AzureLogger.log("Feature not implemented yet");
14 | }
15 |
16 | return () => {
17 | ovcFeature?.off('optimalVideoCountChanged', optimalVideoCountChanged);
18 | }
19 | }, []);
20 |
21 | useEffect(() => {
22 | ovcFeature?.on('optimalVideoCountChanged', optimalVideoCountChanged);
23 | }, [ovcFeature]);
24 |
25 | const optimalVideoCountChanged = () => {
26 | setOptimalVideoCount(ovcFeature.optimalVideoCount);
27 | };
28 |
29 | return (
30 |
31 |
Call Id: {call.id}
32 |
Local Participant Id: {call.info.participantId}
33 | {
34 | sentResolution &&
Sent Resolution: {sentResolution}
35 | }
36 | {
37 | ovcFeature &&
Optimal Video Count: {optimalVideoCount}
38 | }
39 |
40 | );
41 | }
42 |
43 | export default CurrentCallInformation;
44 |
--------------------------------------------------------------------------------
/Project/src/MakeCall/DataChannelCard.js:
--------------------------------------------------------------------------------
1 | import React from "react";
2 | import { ToastContainer, toast } from 'react-toastify';
3 | import { Features } from '@azure/communication-calling';
4 | import 'react-toastify/dist/ReactToastify.css';
5 | import { PrimaryButton } from '@fluentui/react/lib/Button';
6 | import { TextField } from '@fluentui/react/lib/TextField';
7 |
8 | import { utils } from '../Utils/Utils';
9 |
10 | const toastOptions = {
11 | position: "top-right",
12 | autoClose: 5000,
13 | hideProgressBar: true,
14 | closeOnClick: true,
15 | pauseOnHover: true,
16 | draggable: true,
17 | progress: undefined,
18 | theme: "colored",
19 | };
20 |
21 | export default class DataChannelCard extends React.Component {
22 | constructor(props) {
23 | super(props);
24 | this.state = {
25 | inputMessage: ''
26 | }
27 | const call = props.call;
28 | if (!Features.DataChannel) {
29 | return;
30 | }
31 | const dataChannel = call.feature(Features.DataChannel);
32 | const getDisplayName = (participantId) => {
33 | const remoteParticipant = props.remoteParticipants.find(rp => rp.identifier.communicationUserId === participantId);
34 | if (remoteParticipant && remoteParticipant.displayName) {
35 | return remoteParticipant.displayName;
36 | }
37 | return undefined;
38 | }
39 | const textDecoder = new TextDecoder();
40 | const messageHandler = (message, remoteParticipantId) => {
41 | const displayName = getDisplayName(remoteParticipantId);
42 | const from = displayName ? displayName : remoteParticipantId;
43 | const text = textDecoder.decode(message.data);
44 | toast.info(`${from}: ${text}`, {
45 | position: "top-left",
46 | autoClose: 5000,
47 | hideProgressBar: true,
48 | closeOnClick: true,
49 | pauseOnHover: true,
50 | draggable: true,
51 | progress: undefined,
52 | theme: "colored",
53 | });
54 | };
55 | dataChannel.on('dataChannelReceiverCreated', receiver => {
56 | const participantId = utils.getIdentifierText(receiver.senderParticipantIdentifier);
57 | const displayName = getDisplayName(participantId);
58 | const from = displayName ? `${participantId} (${displayName})` : participantId;
59 | toast.success(`data channel id = ${receiver.channelId} from ${from} is opened`, toastOptions);
60 |
61 | receiver.on('close', () => {
62 | toast.error(`data channel id = ${receiver.channelId} from ${from} is closed`, toastOptions);
63 | });
64 | if (receiver.channelId === 1000) {
65 | receiver.on('messageReady', () => {
66 | const message = receiver.readMessage();
67 | messageHandler(message, participantId);
68 | });
69 | }
70 | });
71 |
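// Create the sender on channel id 1000, the same id the receiver handler above filters on for chat messages.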
72 | try {
73 | this.messageSender = dataChannel.createDataChannelSender({
74 | channelId: 1000
75 | });
76 | } catch(e) {
77 | toast.error(`createDataChannelSender: ${e.message}`, toastOptions);
78 | }
79 | }
80 |
81 | setParticipants(participants) {
82 | try {
83 | this.messageSender.setParticipants(participants);
84 | } catch(e) {
85 | toast.error(`setParticipants: ${e.message}`, toastOptions);
86 | }
87 | }
88 |
89 | sendMessage() {
90 | if (this.state.inputMessage) {
91 | try {
92 | this.messageSender.sendMessage((new TextEncoder()).encode(this.state.inputMessage)).then(() => {
93 | this.setState({
94 | inputMessage: ''
95 | });
96 | }).catch(e => {
97 | toast.error(`sendMessage: ${e.message}`, toastOptions);
98 | });
99 | } catch(e) {
100 | toast.error(`sendMessage: ${e.message}`, toastOptions);
101 | }
102 | }
103 | }
104 |
105 | render() {
106 | return (
107 |
108 |
109 |
When no remote participant checkbox is selected, the message is broadcast to everyone in the channel
110 |
111 |
{
114 | if (ev.key === 'Enter') {
115 | this.sendMessage();
116 | ev.preventDefault();
117 | }
118 | }}
119 | onChange={ev => {
120 | this.setState({
121 | inputMessage: ev.target.value
122 | });
123 | }}
124 | value={this.state.inputMessage}
125 | />
126 | this.sendMessage()}>
131 |
132 |
133 |
134 |
135 | );
136 | }
137 | }
138 |
--------------------------------------------------------------------------------
/Project/src/MakeCall/IncomingCallCard.js:
--------------------------------------------------------------------------------
1 | import React from "react";
2 | import { Icon } from '@fluentui/react/lib/Icon';
3 |
4 | export default class IncomingCallCard extends React.Component {
5 | constructor(props) {
6 | super(props);
7 | this.incomingCall = props.incomingCall;
8 | this.acceptCallMicrophoneUnmutedVideoOff = props.acceptCallMicrophoneUnmutedVideoOff;
9 | this.acceptCallMicrophoneUnmutedVideoOn = props.acceptCallMicrophoneUnmutedVideoOn;
10 | this.acceptCallMicrophoneMutedVideoOn = props.acceptCallMicrophoneMutedVideoOn;
11 | this.acceptCallMicrophoneMutedVideoOff = props.acceptCallMicrophoneMutedVideoOff;
12 | }
13 |
14 | render() {
15 | return (
16 |
17 |
18 |
19 |
Incoming Call...
20 |
21 |
22 | {
23 | this.call &&
24 |
Call Id: {this.state.callId}
25 | }
26 |
27 |
28 |
31 | {this.incomingCall?.customContext &&
32 | <>
33 |
34 |
35 |
Custom context:
36 |
37 |
38 | {this.incomingCall.customContext.userToUser &&
39 |
40 |
41 | UUI: {this.incomingCall.customContext.userToUser}
42 |
43 |
44 | }
45 | {this.incomingCall.customContext.xHeaders && this.incomingCall.customContext.xHeaders
46 | .map(header =>
47 |
48 |
49 | {header.key}: {header.value}
50 |
51 |
)
52 | }
53 | >
54 | }
55 |
56 | this.incomingCall.accept(await this.acceptCallMicrophoneUnmutedVideoOff())}>
59 |
60 |
61 |
62 | this.incomingCall.accept(await this.acceptCallMicrophoneUnmutedVideoOn())}>
65 |
66 |
67 |
68 | this.incomingCall.accept(await this.acceptCallMicrophoneMutedVideoOn())}>
71 |
72 |
73 |
74 | this.incomingCall.accept(await this.acceptCallMicrophoneMutedVideoOff())}>
77 |
78 |
79 |
80 | { this.incomingCall.reject(); this.props.onReject(); }}>
83 |
84 |
85 |
86 |
87 | );
88 | }
89 | }
--------------------------------------------------------------------------------
/Project/src/MakeCall/Lobby.js:
--------------------------------------------------------------------------------
1 | import React from "react";
2 | import { PrimaryButton } from '@fluentui/react/lib/Button';
3 | import { Features } from '@azure/communication-calling';
4 |
5 | // Lobby react class component
6 | export default class Lobby extends React.Component {
7 | constructor(props) {
8 | super(props);
9 | this.call = props.call;
10 | this.lobby = this.call.lobby;
11 | this.capabilitiesFeature = props.capabilitiesFeature;
12 | this.capabilities = this.capabilitiesFeature.capabilities;
13 | this.state = {
14 | canManageLobby: this.capabilities.manageLobby?.isPresent || this.capabilities.manageLobby?.reason === 'FeatureNotSupported',
15 | lobbyParticipantsCount: props.lobbyParticipantsCount
16 | };
17 | }
18 |
19 | componentDidMount() {
20 | this.capabilitiesFeature.on('capabilitiesChanged', this.capabilitiesChangedHandler);
21 | }
22 |
23 | componentWillReceiveProps(nextProps) {
24 | if (nextProps.lobbyParticipantsCount !== this.state.lobbyParticipantsCount) {
25 | this.setState({ lobbyParticipantsCount: nextProps.lobbyParticipantsCount });
26 | }
27 | }
28 |
29 | capabilitiesChangedHandler = (capabilitiesChangeInfo) => {
30 | console.log('lobby:capabilitiesChanged');
31 | for (const [key, value] of Object.entries(capabilitiesChangeInfo.newValue)) {
32 | if(key === 'manageLobby' && value.reason != 'FeatureNotSupported') {
33 | (value.isPresent) ? this.setState({ canManageLobby: true }) : this.setState({ canManageLobby: false });
34 | const admitAllButton = document.getElementById('admitAllButton');
35 | if(this.state.canManageLobby === true){
36 | admitAllButton.style.display = '';
37 | } else {
38 | admitAllButton.style.display = 'none';
39 | }
40 | continue;
41 | }
42 | }
43 | };
44 |
45 | async admitAllParticipants() {
46 | console.log('admitAllParticipants');
47 | try {
48 | await this.lobby?.admitAll();
49 | } catch (e) {
50 | console.error(e);
51 | }
52 | }
53 |
54 | render() {
55 | return (
56 |
57 |
58 |
In-Lobby participants number: {this.state.lobbyParticipantsCount}
59 |
60 |
61 |
this.admitAllParticipants()}>
67 |
68 |
69 |
70 | );
71 | }
72 | }
73 |
--------------------------------------------------------------------------------
/Project/src/MakeCall/LocalVideoPreviewCard.js:
--------------------------------------------------------------------------------
1 | import React from "react";
2 | import { VideoStreamRenderer} from '@azure/communication-calling';
3 | import { utils } from '../Utils/Utils';
4 | import VideoSendStats from './VideoSendStats';
5 |
6 | export default class LocalVideoPreviewCard extends React.Component {
7 | constructor(props) {
8 | super(props);
9 | this.identifier = props.identifier;
10 | this.stream = props.stream;
11 | this.type = this.stream.mediaStreamType;
12 | this.view = undefined;
13 | this.componentId = `${utils.getIdentifierText(this.identifier)}-local${this.type}Renderer`;
14 | this.state = {
15 | videoStats: undefined
16 | };
17 | }
18 |
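// Renders the local stream preview: creates a VideoStreamRenderer view and attaches its target element
// to this card's container div (sized to the container for screen-share and raw media streams).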
19 | async componentDidMount() {
20 | try {
21 | this.renderer = new VideoStreamRenderer(this.stream);
22 | this.view = await this.renderer.createView();
23 | const targetContainer = document.getElementById(this.componentId);
24 | if (this.type === 'ScreenSharing' || this.type === 'RawMedia') {
25 | this.view.target.querySelector('video').style.width = targetContainer.style.width;
26 | }
27 | targetContainer.appendChild(this.view.target);
28 | } catch (error) {
29 | console.error('Failed to render preview', error);
30 | }
31 | }
32 |
33 | async componentWillUnmount() {
34 | this.view?.dispose();
35 | this.view = undefined;
36 | }
37 |
38 | render() {
39 | return (
40 |
41 | {
42 | this.state.videoStats &&
43 |
44 |
45 |
46 | }
47 |
48 | );
49 | }
50 |
51 | updateSendStats(videoStats) {
52 | if (this.state.videoStats !== videoStats) {
53 | this.setState({ videoStats });
54 | }
55 | }
56 | }
57 |
--------------------------------------------------------------------------------
/Project/src/MakeCall/MediaConstraint.js:
--------------------------------------------------------------------------------
1 | import React from "react";
2 | import { Dropdown } from '@fluentui/react/lib/Dropdown';
3 |
4 |
5 | export default class MediaConstraint extends React.Component {
6 | constructor(props) {
7 | super(props);
8 | this.videoSendHeightMax = [
9 | { key: 0, text: 'None' },
10 | { key: 180, text: '180' },
11 | { key: 240, text: '240' },
12 | { key: 360, text: '360' },
13 | { key: 540, text: '540' },
14 | { key: 720, text: '720' },
15 | { key: 1080, text: '1080' },
16 | ];
17 | this.videoSendBitRateConstraint = [
18 | { key: 0, text: 'None' },
19 | { key: 15000, text: '15000 (< 180p bitrate)' },
20 | { key: 175000, text: '175000 (~180p bitrate)' },
21 | { key: 400000, text: '400000 (~240p bitrate)' },
22 | { key: 450000, text: '450000 (<360p bitrate)' },
23 | { key: 575000, text: '575000 (~360p bitrate)' },
24 | { key: 1125000, text: '1125000 (~540p bitrate)' },
25 | { key: 2500000, text: '2500000 (~720p bitrate)' },
26 | { key: 100000000, text: '100000000 (max range for 1080p)' }
27 | ];
28 | this.videoSendFrameRateConstraint = [
29 | { key: 0, text: 'None' },
30 | { key: 5, text: '5' },
31 | { key: 10, text: '10' },
32 | { key: 15, text: '15' },
33 | { key: 20, text: '20' },
34 | { key: 25, text: '25' },
35 | { key: 30, text: '30' }
36 | ];
37 | this.state = {
38 | videoSendHeightMax: 0,
39 | videoSendBitRate: 0,
40 | videoSendFrameRate: 0
41 | }
42 | }
43 |
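// Builds a video send constraints object (max frameHeight/bitrate/frameRate) from the selected
// dropdown values and hands it to the parent via props.onChange.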
44 | handleChange = async(event, item) => {
45 | const videoConstraints = {
46 | video: {
47 | send: {
48 | frameHeight: {
49 | max: this.state.videoSendHeightMax
50 | },
51 | bitrate: {
52 | max: this.state.videoSendBitRate
53 | },
54 | frameRate: {
55 | max: this.state.videoSendFrameRate
56 | }
57 | }
58 | }
59 | };
60 |
61 | if(event.target.id === 'videoSendHeightMaxDropdown') {
62 | videoConstraints.video.send.frameHeight.max = item.key;
63 | this.setState({
64 | videoSendHeightMax: item.key
65 | });
66 | } else if(event.target.id === 'videoSendBitRateDropdown') {
67 | videoConstraints.video.send.bitrate.max = item.key;
68 | this.setState({
69 | videoSendBitRate: item.key
70 | });
71 | } else if(event.target.id === 'videoSendFrameRateDropdown') {
72 | videoConstraints.video.send.frameRate.max = item.key;
73 | this.setState({
74 | videoSendFrameRate: item.key
75 | });
76 | }
77 |
78 | if (this.props.onChange) {
79 | this.props.onChange(videoConstraints);
80 | }
81 | }
82 |
83 | render() {
84 | return (
85 |
86 |
95 |
104 |
113 |
114 | );
115 | }
116 | }
117 |
--------------------------------------------------------------------------------
/Project/src/MakeCall/NetworkConfiguration/ProxyConfiguration.js:
--------------------------------------------------------------------------------
1 | import React, { useState } from 'react';
2 | import { Checkbox } from '@fluentui/react';
3 | import { PrimaryButton } from '@fluentui/react/lib/Button';
4 | import { TextField } from '@fluentui/react/lib/TextField';
5 |
6 | export const ProxyConfiguration = (props) => {
7 | const [proxyUrl, setProxyUrl] = useState('');
8 |
9 | return (
10 |
11 | Proxy configuration
12 |
19 |
{props.proxy.url}
20 |
{
24 | setProxyUrl(e.target.value);
25 | }}
26 | value={proxyUrl}
27 | >
28 |
29 |
30 |
31 |
props.handleAddProxyUrl(proxyUrl)}
35 | />
36 |
37 |
38 |
{
41 | setProxyUrl('');
42 | props.handleProxyUrlReset();
43 | }}
44 | />
45 |
46 |
47 |
48 | );
49 | };
50 |
--------------------------------------------------------------------------------
/Project/src/MakeCall/NetworkConfiguration/TurnConfiguration.js:
--------------------------------------------------------------------------------
1 | import React, { useState } from 'react';
2 | import { Checkbox } from '@fluentui/react';
3 | import { PrimaryButton } from '@fluentui/react/lib/Button';
4 | import { TextField } from '@fluentui/react/lib/TextField';
5 |
6 | export const TurnConfiguration = (props) => {
7 | const [turnUrls, setTurnUrls] = useState('');
8 | const [turnUsername, setTurnUsername] = useState('');
9 | const [turnCredential, setTurnCredential] = useState('');
10 |
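// Packages the entered TURN settings as a WebRTC-style iceServer entry; multiple URLs can be
// supplied in one field separated by ';'.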
11 | const handleAddTurn = () => {
12 | if (turnUrls) {
13 | const iceServer = {
14 | urls: !!turnUrls ? turnUrls.split(';') : [],
15 | username: turnUsername,
16 | credential: turnCredential
17 | };
18 |
19 | props.handleAddTurnConfig(iceServer);
20 | }
21 | };
22 |
23 | return (
24 |
25 | Turn configuration
26 |
32 |
33 | {props.customTurn.turn &&
34 | props.customTurn.turn?.iceServers?.map((iceServer, key) => {
35 | if (iceServer.urls && iceServer.urls.length > 0) {
36 | return (
37 |
38 | {iceServer?.urls?.map((url, key) => {
39 | return (
40 |
41 | {url}
42 |
43 | )
44 | })}
45 |
46 | )
47 | }
48 |
49 | return (
50 |
51 | )
52 | })
53 | }
54 |
55 |
{
60 | setTurnUrls(e.target.value);
61 | }}
62 | >
63 |
64 |
{
69 | setTurnUsername(e.target.value);
70 | }}
71 | >
72 |
73 |
{
78 | setTurnCredential(e.target.value);
79 | }}
80 | >
81 |
82 |
98 |
99 | )
100 | };
101 |
--------------------------------------------------------------------------------
/Project/src/MakeCall/ParticipantMenuOptions.js:
--------------------------------------------------------------------------------
1 | import * as React from 'react';
2 | import { DefaultButton } from '@fluentui/react/lib/Button';
3 | export const ParticipantMenuOptions = ({id, appendMenuitems, menuOptionsHandler, menuOptionsState}) => {
4 |
5 | const emojiIcon = { iconName: 'More' };
6 | const isSpotlighted = menuOptionsState.isSpotlighted;
7 |
8 | const buttonStyles = {
9 | root: {
10 | minWidth: 0,
11 | padding: '10px 4px',
12 | alignSelf: 'stretch',
13 | fontSize: '30px',
14 | }
15 | }
16 |
17 | let commonMenuItems = [
18 | ]
19 |
20 |
21 | const menuProps = {
22 | shouldFocusOnMount: true,
23 | items: appendMenuitems ? [...commonMenuItems, ...appendMenuitems]: commonMenuItems
24 | };
25 | return ;
32 | };
33 |
--------------------------------------------------------------------------------
/Project/src/MakeCall/RawVideoAccess/CustomVideoEffects.js:
--------------------------------------------------------------------------------
1 | import React from "react";
2 | import { PrimaryButton } from '@fluentui/react/lib/Button'
3 |
4 | export default class CustomVideoEffects extends React.Component {
5 |
6 | constructor(props) {
7 | super(props);
8 | this.call = props.call;
9 | this.stream = props.stream;
10 | this.isLocal = props.isLocal;
11 | this.bwStream = undefined;
12 | this.bwVideoelem = undefined;
13 | this.bwTimeout = undefined;
14 | this.bwCtx = undefined;
15 | this.dummyTimeout = undefined;
16 | this.remoteParticipantId = props.remoteParticipantId;
17 |
18 | this.state = {
19 | buttons: props.buttons ? props.buttons : undefined,
20 | videoContainerId: props.videoContainerId
21 | };
22 | }
23 |
24 | componentWillUnmount() {
25 | if (this.dummyTimeout) {
26 | clearTimeout(this.dummyTimeout);
27 | }
28 |
29 | if (this.bwVideoElem) {
30 | this.bwCtx.clearRect(0, 0, this.bwVideoElem.width, this.bwVideoElem.height);
31 | clearTimeout(this.bwTimeout);
32 | this.bwVideoElem.srcObject.getVideoTracks().forEach((track) => { track.stop(); });
33 | this.bwVideoElem.srcObject = null;
34 | }
35 | }
36 |
37 | setRemoteVideoElementSourceObject(mediaStream) {
38 | const target = document.getElementById(this.state.videoContainerId)
39 | const video = target.querySelector("video");
40 | if(video) {
41 | try {
42 | video.srcObject = mediaStream;
43 | video.load();
44 | } catch(err) {
45 | console.error('There was an issue setting the source', err);
46 | }
47 | }
48 | }
49 |
50 | async addEffect(e) {
51 | switch (e.currentTarget.children[0].textContent) {
52 | case this.isLocal && this.state.buttons?.add?.label: {
53 | //add filters to outgoing video
54 | const localVideoStreamRawStream = await this.stream.getMediaStream();
55 | const { bwStream, bwVideoElem } = this.bwVideoStream(localVideoStreamRawStream);
56 | this.bwStream = bwStream;
57 | this.bwVideoElem = bwVideoElem;
58 | if(bwStream) {
59 | this.stream.setMediaStream(bwStream);
60 | }
61 | break;
62 | }
63 | case this.isLocal && this.state.buttons?.sendDummy?.label: {
64 | // send a dummy video
65 | const dummyStream = this.dummyStream();
66 | if(dummyStream) {
67 | this.stream.setMediaStream(dummyStream);
68 | }
69 | break;
70 | }
71 | case !this.isLocal && this.state.buttons?.add?.label: {
72 | const remoteVideoStreamRawStream = await this.stream.getMediaStream();
73 | const { bwStream, bwVideoElem } = this.bwVideoStream(remoteVideoStreamRawStream);
74 | if (bwStream) {
75 | this.setRemoteVideoElementSourceObject(bwStream);
76 | }
77 | break;
78 | }
79 | case !this.isLocal && this.state.buttons?.remove?.label:{
80 | const originalRemoteVideoStream = await this.stream.getMediaStream();
81 | if(originalRemoteVideoStream) {
82 | //render original unfiltered video
83 | this.setRemoteVideoElementSourceObject(originalRemoteVideoStream);
84 | }
85 | break;
86 | }
87 | }
88 |
89 | if (this.isLocal) {
90 | this.setState({ buttons: {
91 | add: {
92 | label: "Set B/W effect",
93 | disabled: true
94 | },
95 | sendDummy: {
96 | label: "Set dummy effect",
97 | disabled: true
98 | }
99 | }});
100 | }
101 | }
102 |
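// Builds a black-and-white version of the given MediaStream: plays it into an off-screen video
// element, repeatedly draws frames onto a canvas with a grayscale filter at ~30 FPS, and returns
// the canvas capture stream along with the video element for later cleanup.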
103 | bwVideoStream(stream) {
104 | let width = 1280, height = 720;
105 | const bwVideoElem = document.createElement("video");
106 | bwVideoElem.srcObject = stream;
107 | bwVideoElem.height = height;
108 | bwVideoElem.width = width;
109 | bwVideoElem.play();
110 | const canvas = document.createElement('canvas');
111 | this.bwCtx = canvas.getContext('2d', { willReadFrequently: true });
112 | canvas.width = width;
113 | canvas.height = height;
114 |
115 | const FPS = 30;
116 | const processVideo = function () {
117 | try {
118 | let begin = Date.now();
119 | // start processing.
120 | this.bwCtx.filter = "grayscale(1)";
121 | this.bwCtx.drawImage(bwVideoElem, 0, 0, width, height);
122 | // schedule the next one.
123 | let delay = Math.abs(1000/FPS - (Date.now() - begin));
124 | this.bwTimeout = setTimeout(processVideo, delay);
125 | } catch (err) {
126 | console.error(err);
127 | }
128 | }.bind(this);
129 |
130 | // schedule the first one.
131 | this.bwTimeout = setTimeout(processVideo, 0);
132 | const bwStream = canvas.captureStream(FPS);
133 | return { bwStream, bwVideoElem };
134 | }
135 |
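// Generates a synthetic outgoing video stream by drawing random colored squares onto a blue canvas at ~30 FPS.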
136 | dummyStream() {
137 | const canvas = document.createElement('canvas');
138 | const ctx = canvas.getContext('2d', {willReadFrequently: true});
139 | canvas.width = 1280;
140 | canvas.height = 720;
141 | ctx.fillStyle = 'blue';
142 | ctx.fillRect(0, 0, canvas.width, canvas.height);
143 |
144 | const colors = ['red', 'yellow', 'green'];
145 | const FPS = 30;
146 | const createShapes = () => {
147 | try {
148 | let begin = Date.now();
149 | // start processing.
150 | if (ctx) {
151 | ctx.fillStyle = colors[Math.floor(Math.random() * colors.length)];
152 | const x = Math.floor(Math.random() * canvas.width);
153 | const y = Math.floor(Math.random() * canvas.height);
154 | const size = 100;
155 | ctx.fillRect(x, y, size, size);
156 | }
157 | // schedule the next one.
158 | let delay = Math.abs(1000/FPS - (Date.now() - begin));
159 | this.dummyTimeout = setTimeout(createShapes, delay);
160 | } catch (err) {
161 | console.error(err);
162 | }
163 | };
164 |
165 | // schedule the first one.
166 | this.dummyTimeout = setTimeout(createShapes, 0);
167 | return canvas.captureStream(FPS);
168 | }
169 |
170 | render() {
171 | return(
172 |
173 | {
174 | this.state.buttons &&
175 | Object.keys(this.state.buttons).map((obj, idx) => {
176 | return
177 |
this.addEffect(e)}
181 | disabled={this.state.buttons[obj].disabled}>
182 | {this.state.buttons[obj].label}
183 |
184 |
185 | })
186 | }
187 |
188 | )
189 |
190 | }
191 |
192 | }
193 |
--------------------------------------------------------------------------------
/Project/src/MakeCall/RealTimeTextCard.js:
--------------------------------------------------------------------------------
1 | import React, { useEffect, useState } from "react";
2 | import { Features } from '@azure/communication-calling';
3 | import { PrimaryButton } from '@fluentui/react/lib/Button';
4 |
5 | // RealTimeText react function component
6 | const RealTimeTextCard = ({ call, state }) => {
7 | const [realTimeTextFeature, setRealTimeTextFeature] = useState(call.feature(Features.RealTimeText));
8 | const [rttInputLiveHandler, setRttInputLiveHandler] = useState(false);
9 |
10 | useEffect(() => {
11 | try {
12 | subscribeToSendRealTimeTextLive();
13 | }
14 | catch(error) {
15 | console.log("RealTimeText not configured for this release version")
16 | }
17 |
18 | return () => {
19 | // cleanup
20 | let rttTextField = document.getElementById('rttTextField');
21 | rttTextField.removeEventListener('input', subscribeToSendRealTimeTextHelper);
22 | };
23 | }, []);
24 |
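// Sends the current text field content as a completed real-time text message, then clears the field.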
25 | const sendRTT = async () => {
26 | try {
27 | let rttTextField = document.getElementById('rttTextField');
28 | if (!state.firstRealTimeTextReceivedorSent) {
29 | state.setFirstRealTimeTextReceivedorSent(true);
30 | }
31 | realTimeTextFeature.sendRealTimeText(rttTextField.value, true);
32 | rttTextField.value = null;
33 | } catch (error) {
34 | console.log('ERROR Send RTT failed', error);
35 | }
36 | }
37 |
38 | const sendRealTimeTextLiveHandler = () => {
39 | if (!rttInputLiveHandler) {
40 | try {
41 | let rttTextField = document.getElementById('rttTextField');
42 | rttTextField.removeEventListener('input', subscribeToSendRealTimeTextHelper);
43 | rttTextField.addEventListener('input', (event) => {
44 | if (!state.firstRealTimeTextReceivedorSent) {
45 | state.setFirstRealTimeTextReceivedorSent(true);
46 | }
47 | realTimeTextFeature.sendRealTimeText(rttTextField.value);
48 | })
49 | setRttInputLiveHandler(true);
50 | } catch (error) {
51 | console.log('ERROR Send live rtt handler subscription failed', error);
52 | }
53 | }
54 | }
55 |
56 | const subscribeToSendRealTimeTextHelper = () => {
57 | let rttTextField = document.getElementById('rttTextField');
58 | if (rttTextField.value !== '') {
59 | sendRealTimeTextLiveHandler();
60 | }
61 | setRttInputLiveHandler(true);
62 | }
63 |
64 | const subscribeToSendRealTimeTextLive = () => {
65 | if (!rttInputLiveHandler) {
66 | try {
67 | let rttTextField = document.getElementById('rttTextField');
68 | rttTextField.removeEventListener('input', subscribeToSendRealTimeTextHelper);
69 | rttTextField.addEventListener('input', subscribeToSendRealTimeTextHelper);
70 | } catch (error) {
71 | console.log('ERROR setting live rtt handler', error);
72 | }
73 |
74 | }
75 | }
76 |
77 | return (
78 | <>
79 |
89 |
93 | >
94 | );
95 | };
96 |
97 | export default RealTimeTextCard;
98 |
--------------------------------------------------------------------------------
/Project/src/MakeCall/RemoteParticipantCard.js:
--------------------------------------------------------------------------------
1 | import React, { useEffect, createRef } from "react";
2 | import { utils } from '../Utils/Utils';
3 | import { Persona, PersonaSize } from '@fluentui/react/lib/Persona';
4 | import { Icon } from '@fluentui/react/lib/Icon';
5 | import {
6 | isCommunicationUserIdentifier,
7 | isMicrosoftTeamsUserIdentifier,
8 | isUnknownIdentifier,
9 | isPhoneNumberIdentifier,
10 | } from '@azure/communication-common';
11 | import { Features } from '@azure/communication-calling';
12 | import { ParticipantMenuOptions } from './ParticipantMenuOptions';
13 |
14 | export default class RemoteParticipantCard extends React.Component {
15 | constructor(props) {
16 | super(props);
17 | this.call = props.call;
18 | this.remoteParticipant = props.remoteParticipant;
19 | this.identifier = this.remoteParticipant.identifier;
20 | this.id = utils.getIdentifierText(this.remoteParticipant.identifier);
21 | this.isCheckable = isCommunicationUserIdentifier(this.remoteParticipant.identifier) ||
22 | isMicrosoftTeamsUserIdentifier(this.remoteParticipant.identifier);
23 |
24 | this.spotlightFeature = this.call.feature(Features.Spotlight);
25 | this.raiseHandFeature = this.call.feature(Features.RaiseHand);
26 | this.capabilitiesFeature = props.capabilitiesFeature;
27 | this.capabilities = this.capabilitiesFeature.capabilities;
28 | this.menuOptionsHandler= props.menuOptionsHandler;
29 | this.state = {
30 | isSpeaking: this.remoteParticipant.isSpeaking,
31 | state: this.remoteParticipant.state,
32 | isMuted: this.remoteParticipant.isMuted,
33 | hasDisplayNameChanged: this.remoteParticipant.hasDisplayNameChanged,
34 | displayName: this.remoteParticipant.hasDisplayNameChanged ? (this.remoteParticipant.displayName ? `${this.remoteParticipant.displayName.trim()} (Edited)` : '') : this.remoteParticipant.displayName?.trim(),
35 | participantIds: this.remoteParticipant.endpointDetails.map((e) => { return e.participantId }),
36 | isHandRaised: utils.isParticipantHandRaised(this.remoteParticipant.identifier, this.raiseHandFeature.getRaisedHands()),
37 | isSpotlighted: utils.isParticipantSpotlighted(this.remoteParticipant.identifier, this.spotlightFeature.getSpotlightedParticipants()),
38 | canManageLobby: this.capabilities.manageLobby?.isPresent || this.capabilities.manageLobby?.reason === 'FeatureNotSupported',
39 | canMuteOthers: this.capabilities.muteOthers?.isPresent || this.capabilities.muteOthers?.reason === 'FeatureNotSupported',
40 | forbidOthersAudio: this.capabilities.forbidOthersAudio?.isPresent || this.capabilities.forbidOthersAudio?.reason === 'FeatureNotSupported',
41 | forbidOthersVideo: this.capabilities.forbidOthersVideo?.isPresent || this.capabilities.forbidOthersVideo?.reason === 'FeatureNotSupported',
42 | };
43 | }
44 |
45 | componentWillUnmount() {
46 | this.remoteParticipant.off('isMutedChanged', () => {});
47 | this.remoteParticipant.off('stateChanged', () => {});
48 | this.remoteParticipant.off('isSpeakingChanged', () => {});
49 | this.remoteParticipant.off('displayNameChanged', () => {});
50 | this.spotlightFeature.off('spotlightChanged', ()=>{});
51 | this.raiseHandFeature.off("loweredHandEvent", ()=>{});
52 | this.raiseHandFeature.off("raisedHandEvent", ()=>{});
53 | if (this.props.onSelectionChanged) {
54 | this.props.onSelectionChanged(this.remoteParticipant.identifier, false);
55 | }
56 | }
57 |
58 | componentDidMount() {
59 | this.remoteParticipant.on('isMutedChanged', () => {
60 | this.setState({ isMuted: this.remoteParticipant.isMuted });
61 | if (this.remoteParticipant.isMuted) {
62 | this.setState({ isSpeaking: false });
63 | }
64 | });
65 |
66 | this.remoteParticipant.on('stateChanged', () => {
67 | this.setState({ state: this.remoteParticipant.state });
68 | });
69 |
70 | this.remoteParticipant.on('isSpeakingChanged', () => {
71 | this.setState({ isSpeaking: this.remoteParticipant.isSpeaking });
72 | })
73 |
74 | this.remoteParticipant.on('displayNameChanged', ({newValue, oldValue, reason }) => {
75 | const hasDisplayNameChanged = this.remoteParticipant.hasDisplayNameChanged;
76 | const displayName = this.remoteParticipant.hasDisplayNameChanged ? (this.remoteParticipant.displayName ? `${this.remoteParticipant.displayName.trim()} (Edited)` : '') : this.remoteParticipant.displayName?.trim();
77 | if (reason === 'editedDisplayName') {
78 | console.log('Edited display Name: ', newValue, oldValue, reason);
79 | this.setState({ hasDisplayNameChanged, displayName});
80 | this.menuOptionsHandler.handleDisplayNameChanged(newValue, oldValue, reason);
81 | } else {
82 | this.setState({displayName})
83 | }
84 | });
85 |
86 | this.spotlightFeature.on("spotlightChanged", () => {
87 | this.setState({isSpotlighted: utils.isParticipantSpotlighted(
88 | this.remoteParticipant.identifier,
89 | this.spotlightFeature.getSpotlightedParticipants())});
90 | });
91 |
92 | const isRaiseHandChangedHandler = (event) => {
93 | this.setState({isHandRaised: utils.isParticipantHandRaised(
94 | this.remoteParticipant.identifier,
95 | this.raiseHandFeature.getRaisedHands())})
96 | }
97 | this.raiseHandFeature.on("loweredHandEvent", isRaiseHandChangedHandler);
98 | this.raiseHandFeature.on("raisedHandEvent", isRaiseHandChangedHandler);
99 | this.capabilitiesFeature.on('capabilitiesChanged', (capabilitiesChangeInfo) => {
100 | for (const [key, value] of Object.entries(capabilitiesChangeInfo.newValue)) {
101 | if(key === 'forbidOthersAudio' && value.reason != 'FeatureNotSupported') {
102 | (value.isPresent) ? this.setState({ forbidOthersAudio: true }) : this.setState({ forbidOthersAudio: false });
103 | } else if(key === 'forbidOthersVideo' && value.reason != 'FeatureNotSupported') {
104 | (value.isPresent) ? this.setState({ forbidOthersVideo: true }) : this.setState({ forbidOthersVideo: false });
105 | } else if(key === 'manageLobby' && value.reason != 'FeatureNotSupported') {
106 | (value.isPresent) ? this.setState({ canManageLobby: true }) : this.setState({ canManageLobby: false });
107 | let lobbyActions = document.getElementById('lobbyAction');
108 | if (lobbyActions) {
109 | if(this.state.canManageLobby === false){
110 | lobbyActions.hidden = true;
111 | }
112 | else{
113 | lobbyActions.hidden = false;
114 | }
115 | }
116 | continue;
117 | }
118 | }
119 | });
120 | }
121 |
122 | handleRemoveParticipant(e, identifier) {
123 | e.preventDefault();
124 | this.call.removeParticipant(identifier).catch((e) => console.error(e))
125 | }
126 |
127 | handleMuteParticipant(e, remoteParticipant) {
128 | e.preventDefault();
129 | if (this.state.canMuteOthers) {
130 | remoteParticipant.mute().catch((e) => console.error('Failed to mute specific participant.', e.message, e));
131 | } else {
132 | console.error('Soft mute of remote participants is not a supported capability for this participant.');
133 | }
134 | }
135 |
136 | handleCheckboxChange(e) {
137 | this.props.onSelectionChanged(this.remoteParticipant.identifier, e.target.checked);
138 | }
139 |
140 | getMenuItems() {
141 | let menuItems = []
142 | menuItems.push({
143 | key: 'spotlight',
144 | iconProps: { iconName: 'Focus', className: this.state.isSpotlighted ? "callFeatureEnabled" : ``},
145 | text: this.state.isSpotlighted ? 'Stop Spotlight' : 'Start Spotlight',
146 | onClick: (e) => this.state.isSpotlighted ?
147 | this.menuOptionsHandler.stopSpotlight(this.identifier, e):
148 | this.menuOptionsHandler.startSpotlight(this.identifier, e)
149 | });
150 | if (this.props.mediaAccess && this.state.forbidOthersAudio && this.remoteParticipant.role === 'Attendee') {
151 | menuItems.push({
152 | key: 'forbidAudio',
153 | iconProps: { iconName: this.props.mediaAccess?.isAudioPermitted ? 'MicOff':'Microphone', className: this.props.mediaAccess?.isAudioPermitted ? `` : "callFeatureEnabled"},
154 | text: this.props.mediaAccess?.isAudioPermitted ? 'Disable mic' : 'Allow mic',
155 | onClick: (e) => this.props.mediaAccess?.isAudioPermitted ?
156 | this.menuOptionsHandler.forbidAudio(this.identifier, e):
157 | this.menuOptionsHandler.permitAudio(this.identifier, e)
158 | });
159 | }
160 | if (this.props.mediaAccess && this.state.forbidOthersVideo && this.remoteParticipant.role === 'Attendee') {
161 | menuItems.push({
162 | key: 'forbidVideo',
163 | iconProps: { iconName: this.props.mediaAccess?.isVideoPermitted ? 'VideoOff':'Video', className: this.props.mediaAccess?.isVideoPermitted ? `` : "callFeatureEnabled"},
164 | text: this.props.mediaAccess?.isVideoPermitted ? 'Disable camera' : 'Allow camera',
165 | onClick: (e) => this.props.mediaAccess?.isVideoPermitted ?
166 | this.menuOptionsHandler.forbidVideo(this.identifier, e):
167 | this.menuOptionsHandler.permitVideo(this.identifier, e)
168 | });
169 | }
170 | return menuItems.filter(item => item != 0)
171 | }
172 |
173 | getSecondaryText() {
174 | return `${this.state.state} | ${this.state.participantIds.toString()}`;
175 | }
176 |
177 | async handleRemoteRaiseHand() {
178 | try {
179 | if (this.state.isHandRaised) {
180 | await this.raiseHandFeature.lowerHands([this.remoteParticipant.identifier]);
181 | this.setState({isHandRaised: utils.isParticipantHandRaised(this.remoteParticipant.identifier, this.raiseHandFeature.getRaisedHands())})
182 | }
183 | } catch(error) {
184 | console.error(error)
185 | }
186 | }
187 |
188 | async admitParticipant() {
189 | console.log('admit');
190 | try {
191 | await this.call.lobby.admit(this.identifier);
192 | } catch (e) {
193 | console.error(e);
194 | }
195 | }
196 |
197 | async rejectParticipant() {
198 | console.log('reject');
199 | try {
200 | await this.call.lobby.reject(this.identifier);
201 | } catch (e) {
202 | console.error(e);
203 | }
204 | }
205 |
206 | render() {
207 | return (
208 |
209 |
210 | {
211 | this.isCheckable &&
212 |
213 | this.handleCheckboxChange(e)} />
214 |
215 | }
216 |
226 |
227 |
228 | {
229 | this.props.mediaAccess?.isVideoPermitted === false ?
232 |
233 |
: undefined
234 | }
235 | {
236 | this.props.mediaAccess?.isAudioPermitted === false ?
239 |
240 |
:
this.handleMuteParticipant(e, this.remoteParticipant)}>
243 |
244 |
245 | }
246 | {
247 | !(isPhoneNumberIdentifier(this.remoteParticipant.identifier) || isUnknownIdentifier(this.remoteParticipant.identifier)) &&
248 |
this.handleRemoteRaiseHand()}>
252 |
253 |
254 | }
255 |
262 |
263 | {
264 | this.state.state === "InLobby" ?
265 |
:
269 |
this.handleRemoveParticipant(e, this.remoteParticipant.identifier)}>
273 |
274 |
275 | }
276 |
277 |
278 |
279 | )
280 | }
281 | }
282 |
283 |
284 |
285 |
--------------------------------------------------------------------------------
/Project/src/MakeCall/Section.js:
--------------------------------------------------------------------------------
1 | import React from "react";
2 | import { Depths } from '@uifabric/fluent-theme/lib/fluent/FluentDepths';
3 | import { PrimaryButton } from '@fluentui/react/lib/Button'
4 |
5 | export default class Section extends React.Component {
6 | constructor(props) {
7 | super(props);
8 | this.title = props.title;
9 | this.subtitle = props.subtitle;
10 | this.code = props.code;
11 |
12 | this.state = {
13 | sectionBody: props.sectionBody,
14 | call: props.call,
15 | callClient: props.callClient,
16 | isIncoming: props.isIncoming,
17 | showCode: false
18 | }
19 | }
20 |
21 | toggleCode() {
22 | this.setState({showCode: !this.state.showCode});
23 | }
24 |
25 | render() {
26 | return (
27 |
28 |
{this.title}
29 |
30 |
{this.subtitle}
31 |
this.toggleCode()}/>
32 |
33 |
34 | {
35 | this.state.showCode &&
36 |
37 |
38 | {this.code}
39 |
40 |
41 | }
42 |
43 |
On this instantiation you are logged in as {this.userId}
44 |
Try it out. Open a
new window , and place a call to
{this.userId}
45 |
Come back to this page to accept the incoming call.
46 | {
47 | this.state.isIncoming && (
)
48 | }
49 | {
50 | this.state.incomingCallEndReason &&
51 | ( )
52 | }
53 |
54 |
55 | );
56 | }
57 | }
--------------------------------------------------------------------------------
/Project/src/MakeCall/StarRating.js:
--------------------------------------------------------------------------------
1 | import React from "react";
2 | import { Icon } from '@fluentui/react/lib/Icon';
3 | import { Dropdown } from '@fluentui/react/lib/Dropdown';
4 |
5 | export default class StarRating extends React.Component {
6 | constructor(props) {
7 | super(props);
8 | this.category = props.category;
9 | this.onRate = props.onRate;
10 | this.onIssueSelected = props.onIssueSelected;
11 | const issues = props.issues.map(issue => {
12 | const text = issue.replace(/([A-Z])/g, ' $1').trim();
13 | return { key: issue, text: text };
14 | });
15 | this.state = {
16 | currentScore: 0,
17 | selectedIssue: '',
18 | issues: issues
19 | };
20 | }
21 |
22 | componentWillUnmount() {
23 |
24 | }
25 |
26 | componentDidMount() {
27 |
28 | }
29 |
30 | rate(rating) {
31 | this.props.onRate(this.props.category, rating);
32 | this.setState({ currentScore: rating });
33 | }
34 |
35 | issueSelected = async (event, item) => {
36 | this.props.onIssueSelected(this.props.category, item.key);
37 | this.setState({ selectedIssue: item.key });
38 | };
39 |
40 | render() {
41 | return (
42 |
43 |
44 |
45 |
46 | {[1, 2, 3, 4, 5].map(v =>
47 | this.rate(v)}>
52 | = v ? 'FavoriteStarFill' : 'FavoriteStar'}`} />
53 |
54 | )}
55 |
56 |
57 |
58 |
59 |
67 |
68 |
69 |
70 |
71 | );
72 | }
73 | }
74 |
--------------------------------------------------------------------------------
/Project/src/MakeCall/StreamRenderer.js:
--------------------------------------------------------------------------------
1 | import React, { useEffect, useState, useRef, useImperativeHandle, forwardRef } from "react";
2 | import { utils } from '../Utils/Utils';
3 | import { VideoStreamRenderer } from "@azure/communication-calling";
4 | import CustomVideoEffects from "./RawVideoAccess/CustomVideoEffects";
5 | import VideoReceiveStats from './VideoReceiveStats';
6 |
7 | export const StreamRenderer = forwardRef(({
8 | remoteParticipant,
9 | stream,
10 | isPinningActive,
11 | isPinned,
12 | dominantRemoteParticipant,
13 | dominantSpeakerMode,
14 | call,
15 | streamCount,
16 | showMediaStats
17 | }, ref) => {
18 | const componentId = `${utils.getIdentifierText(remoteParticipant.identifier)}-${stream.mediaStreamType}-${stream.id}`;
19 | const videoContainerId = componentId + '-videoContainer';
20 | const componentContainer = useRef(null);
21 | const videoContainer = useRef(null);
22 | const renderer = useRef(null);
23 | const view = useRef(null);
24 | const [isLoading, setIsLoading] = useState(false);
25 | const [isSpeaking, setIsSpeaking] = useState(!!remoteParticipant?.isSpeaking);
26 | const [isMuted, setIsMuted] = useState(!!remoteParticipant?.isMuted);
27 | const [displayName, setDisplayName] = useState(remoteParticipant?.displayName?.trim() ?? '');
28 | const [videoStats, setVideoStats] = useState();
29 | const [transportStats, setTransportStats] = useState();
30 | const [height, setHeight] = useState(stream.size.height);
31 | const [width, setWidth] = useState(stream.size.width);
32 |
33 | useEffect(() => {
34 | initializeComponent();
35 | return () => {
36 | stream.off('isReceivingChanged', isReceivingChanged);
37 | stream.off('sizeChanged', sizeChanged);
38 | remoteParticipant.off('isSpeakingChanged', isSpeakingChanged);
39 | remoteParticipant.off('isMutedChanged', isMutedChanged);
40 | remoteParticipant.off('displayNameChanged', isDisplayNameChanged);
41 | disposeRenderer();
42 | }
43 | }, []);
44 |
45 | const getRenderer = () => {
46 | return view.current;
47 | }
48 |
49 | const createRenderer = async () => {
50 | if (!renderer.current) {
51 | renderer.current = new VideoStreamRenderer(stream);
52 | view.current = await renderer.current.createView();
53 | return view.current;
54 | } else {
55 | throw new Error(`[App][StreamMedia][id=${stream.id}][createRenderer] stream already has a renderer`);
56 | }
57 | }
58 |
59 | const attachRenderer = (v) => {
60 | try {
61 | if (v) {
62 | view.current = v;
63 | }
64 |
65 | if (!view.current.target) {
66 | throw new Error(`[App][StreamMedia][id=${stream.id}][attachRenderer] target is undefined. Must create renderer first`);
67 | } else {
68 | componentContainer.current.style.display = 'block';
69 | videoContainer.current.appendChild(view.current?.target);
70 | }
71 | } catch (e) {
72 | console.error(e);
73 | }
74 | }
75 |
76 | const disposeRenderer = () => {
77 | if (videoContainer.current && componentContainer.current) {
78 | videoContainer.current.innerHTML = '';
79 | componentContainer.current.style.display = 'none';
80 | }
81 | if (renderer.current) {
82 | renderer.current.dispose();
83 | } else {
84 | console.warn(`[App][StreamMedia][id=${stream.id}][disposeRender] no renderer to dispose`);
85 | }
86 | }
87 | const isReceivingChanged = () => {
88 | try {
89 | if (stream?.isAvailable) {
90 | setIsLoading(!stream.isReceiving);
91 | } else {
92 | setIsLoading(false);
93 | }
94 |
95 | } catch (e) {
96 | console.error(e);
97 | }
98 | };
99 |
100 | const sizeChanged = () => {
101 | setHeight(stream.size.height);
102 | setWidth(stream.size.width);
103 | }
104 |
105 | const isMutedChanged = () => {
106 | setIsMuted(remoteParticipant && remoteParticipant?.isMuted);
107 | };
108 |
109 | const isSpeakingChanged = () => {
110 | setIsSpeaking(remoteParticipant && remoteParticipant.isSpeaking);
111 | }
112 |
113 | const isDisplayNameChanged = () => {
114 |         setDisplayName(remoteParticipant.displayName?.trim() ?? '');
115 | }
116 | /**
117 | * Start stream after DOM has rendered
118 | */
119 |     const initializeComponent = async () => {
120 |         stream.on('isReceivingChanged', isReceivingChanged);
121 |         stream.on('sizeChanged', sizeChanged);
122 |         remoteParticipant.on('isMutedChanged', isMutedChanged);
123 |         remoteParticipant.on('isSpeakingChanged', isSpeakingChanged);
124 |         remoteParticipant.on('displayNameChanged', isDisplayNameChanged);
125 |         // In dominant speaker mode, only render the dominant remote participant's stream.
126 |         if (dominantSpeakerMode && dominantRemoteParticipant !== remoteParticipant) { return; }
127 |
128 | try {
129 | if (stream.isAvailable && !renderer.current) {
130 | await createRenderer();
131 | attachRenderer();
132 | }
133 | } catch (e) {
134 | console.error(e);
135 | }
136 | }
137 |
138 | useImperativeHandle(ref, () => ({
139 | updateReceiveStats(videoStatsReceived, transportStatsReceived) {
140 | if (videoStatsReceived) {
141 | if (videoStatsReceived !== videoStats && stream.isAvailable) {
142 | setVideoStats(videoStatsReceived);
143 | setTransportStats(transportStatsReceived);
144 | }
145 | }
146 | },
147 | getRenderer,
148 | createRenderer,
149 | attachRenderer,
150 | disposeRenderer
151 | }));
152 |
153 | if (stream.isAvailable) {
154 | return (
155 |
161 |
width ? 'portrait' : ''}`}>
166 |
167 | {displayName ? displayName : remoteParticipant.displayName ? remoteParticipant.displayName : utils.getIdentifierText(remoteParticipant.identifier)}
168 |
169 |
183 | {
184 | isLoading &&
185 | }
186 | {
187 | videoStats && showMediaStats &&
188 |
189 |
190 |
191 | }
192 |
193 |
194 | );
195 | }
196 |     return <></>;
197 | });
198 |
--------------------------------------------------------------------------------
/Project/src/MakeCall/VideoEffects/VideoEffectsContainer.js:
--------------------------------------------------------------------------------
1 | import React from 'react';
2 | import { PrimaryButton } from '@fluentui/react/lib/Button';
3 | import { Dropdown } from '@fluentui/react/lib/Dropdown';
4 | import { Features } from '@azure/communication-calling';
5 | import { BackgroundBlurEffect, BackgroundReplacementEffect } from '@azure/communication-calling-effects';
6 | import VideoEffectsImagePicker from './VideoEffectsImagePicker';
7 |
8 | export const LoadingSpinner = () => {
9 | return (
10 |
11 | )
12 | };
13 |
14 | export default class VideoEffectsContainer extends React.Component {
15 | constructor(props) {
16 | super(props);
17 | this.call = props.call;
18 | this.localVideoStreamFeatureApi = null;
19 |
20 | this.state = {
21 | selectedVideoEffect: null,
22 | supportedVideoEffects: [],
23 | supportedVideoEffectsPopulated: false,
24 | videoEffectsList: [],
25 | startEffectsLoading: false,
26 | stopEffectsLoading: false
27 | };
28 |
29 | this.initLocalVideoStreamFeatureApi();
30 | }
31 |
32 | componentDidCatch(e) {
33 | this.logError(JSON.stringify(e));
34 | }
35 |
36 | componentDidMount() {
37 | this.populateVideoEffects();
38 | }
39 |
40 | logError(error) {
41 | this.setState({
42 | ...this.state,
43 | });
44 |
45 | console.error(error);
46 | }
47 |
48 | initLocalVideoStreamFeatureApi() {
49 | const localVideoStream = this.call.localVideoStreams.find(v => { return v.mediaStreamType === 'Video'});
50 |
51 | if (!localVideoStream) {
52 | this.logError('No local video streams found.');
53 | return;
54 | }
55 | const lvsFeatureApi = localVideoStream.feature && localVideoStream.feature(Features?.VideoEffects);
56 | if (!lvsFeatureApi) {
57 | this.logError('Could not get local video stream feature API.');
58 | return;
59 | }
60 | this.localVideoStreamFeatureApi = lvsFeatureApi;
61 |
62 | this.localVideoStreamFeatureApi.on('effectsError', (error) => {
63 | this.logError(JSON.stringify(error));
64 | this.setState({
65 | ...this.state,
66 | startEffectsLoading: false,
67 | stopEffectsLoading: false
68 | });
69 | });
70 |
71 | this.localVideoStreamFeatureApi.on('effectsStarted', () => {
72 | this.setState({
73 | ...this.state,
74 | startEffectsLoading: false
75 | });
76 | });
77 |
78 | this.localVideoStreamFeatureApi.on('effectsStopped', () => {
79 | this.setState({
80 | ...this.state,
81 | stopEffectsLoading: false
82 | });
83 | });
84 | }
85 |
86 | async populateVideoEffects() {
87 | const supported = [];
88 | if (this.localVideoStreamFeatureApi) {
89 | const backgroundBlur = new BackgroundBlurEffect();
90 | const isBackgroundBlurSupported = await this.localVideoStreamFeatureApi.isSupported(backgroundBlur);
91 | if (isBackgroundBlurSupported) {
92 | supported.push(backgroundBlur);
93 | }
94 |
95 | const backgroundReplacement = new BackgroundReplacementEffect({
96 | backgroundImageUrl: '../assets/images/ACSBackdrop.png'
97 | });
98 | const isBackgroundReplacementSupported = await this.localVideoStreamFeatureApi.isSupported(backgroundReplacement);
99 | if (isBackgroundReplacementSupported) {
100 | supported.push(backgroundReplacement);
101 | }
102 |
103 | const videoEffectsList = supported.map(effect => {
104 | return {
105 | key: effect.name,
106 | text: effect.name.replace('Background', 'Background ')
107 | }
108 | });
109 |
110 | this.setState({
111 | ...this.state,
112 | supportedVideoEffects: [ ...supported ],
113 | supportedVideoEffectsPopulated: true,
114 |                 selectedVideoEffect: supported[0]?.name,
115 | videoEffectsList
116 | });
117 | }
118 | }
119 |
120 | effectSelectionChanged(event, item) {
121 | const newSelection = this.state.supportedVideoEffects.find(effect => effect.name === item.key);
122 | if (newSelection) {
123 | this.setState({
124 | ...this.state,
125 | selectedVideoEffect: newSelection
126 | });
127 | }
128 | }
129 |
130 | async startEffects() {
131 | if (!this.localVideoStreamFeatureApi) {
132 | this.logError('Feature api not found.');
133 | return;
134 | }
135 |
136 | this.setState({
137 | ...this.state,
138 | startEffectsLoading: true
139 | });
140 |
141 | await this.localVideoStreamFeatureApi.startEffects(this.state.selectedVideoEffect);
142 | }
143 |
144 | async stopEffects() {
145 | if (!this.localVideoStreamFeatureApi) {
146 | this.logError('Feature api not found.');
147 | return;
148 | }
149 |
150 | this.setState({
151 | ...this.state,
152 | stopEffectsLoading: true
153 | });
154 |
155 | await this.localVideoStreamFeatureApi.stopEffects();
156 | }
157 |
158 | async handleImageClick(imageLocation) {
159 | if (this.state.selectedVideoEffect.name !== 'BackgroundReplacement') {
160 | this.logError('Wrong effect selected.');
161 | return;
162 | }
163 |
164 | await this.state.selectedVideoEffect.configure({
165 | backgroundImageUrl: imageLocation
166 | });
167 | }
168 |
169 | render() {
170 | if (!this.localVideoStreamFeatureApi || this.state.videoEffectsList.length === 0) {
171 |             return <></>
172 | }
173 | return (
174 |
175 |
Video effects
176 | {this.state.supportedVideoEffects.length > 0 ?
177 |
178 |
this.effectSelectionChanged(e, item)}
180 | options={this.state.videoEffectsList}
181 | placeholder={'Select an option'}
182 | styles={{ dropdown: { width: 300, color: 'black' } }}
183 | />
184 | this.startEffects()}
187 | >
188 | {this.state.startEffectsLoading ? : 'Start Effects'}
189 |
190 |
191 | this.stopEffects()}
194 | >
195 | {this.state.stopEffectsLoading ? : 'Stop Effects'}
196 |
197 | this.handleImageClick(imageLocation)}
200 | />
201 |
202 | :
203 |
Background Blur/Replacement are only supported on Windows Chrome, Windows Edge, MacOS Chrome, MacOS Edge, and MacOS Safari
204 | }
205 |
206 | );
207 | }
208 | }
209 |
--------------------------------------------------------------------------------
/Project/src/MakeCall/VideoEffects/VideoEffectsImagePicker.js:
--------------------------------------------------------------------------------
1 | import React from 'react';
2 | import { Icon } from '@fluentui/react/lib/Icon';
3 |
4 | export default class VideoEffectsImagePicker extends React.Component {
5 | constructor(props) {
6 | super(props);
7 |
8 | this.state = {
9 | images: [
10 | {
11 | location: '../assets/images/ACSBackdrop.png',
12 | name: 'ACSBackdrop'
13 | },
14 | {
15 | location: '../assets/images/MicrosoftLearnBackdrop.png',
16 | name: 'MicrosoftLearnBackdrop'
17 | }
18 | ],
19 | selectedImage: {
20 | location: '',
21 | name: ''
22 | }
23 | }
24 | }
25 |
26 | handleImageClick(image) {
27 | const selectedImage = this.state.images.find(item => item.name === image.name );
28 |
29 | if (selectedImage) {
30 | this.setState({
31 | ...this.state,
32 | selectedImage
33 | });
34 |
35 | if (this.props.handleImageClick) {
36 | this.props.handleImageClick(selectedImage.location);
37 | }
38 | }
39 | }
40 |
41 | render() {
42 | return (
43 |
44 | {this.state.images.map((image, key) => (
45 |
46 |
this.handleImageClick(image)}
51 | />
52 |
53 |
54 | ))
55 | }
56 |
57 | );
58 | }
59 | }
60 |
--------------------------------------------------------------------------------
/Project/src/MakeCall/VideoReceiveStats.js:
--------------------------------------------------------------------------------
1 | import React from "react";
2 |
3 | export default class VideoReceiveStats extends React.Component {
4 | constructor(props) {
5 | super(props);
6 | }
7 |
8 | render() {
9 | if (!this.props.videoStats) {
10 | return null;
11 | }
12 | return (
13 |
14 |
15 |
16 | codec:
17 | {this.props.videoStats.codecName}
18 |
19 |
20 | bitrate:
21 | {(this.props.videoStats.bitrate/1000).toFixed(1)} kbps
22 |
23 |
24 | jitter:
25 | {this.props.videoStats.jitterInMs} ms
26 |
27 |
28 | rtt:
29 | {this.props.transportStats?.rttInMs ?? ''} ms
30 |
31 |
32 | packetsPerSecond:
33 | {this.props.videoStats.packetsPerSecond}
34 |
35 |
36 | frameWidthReceived:
37 | {this.props.videoStats.frameWidthReceived} px
38 |
39 |
40 | frameHeightReceived:
41 | {this.props.videoStats.frameHeightReceived} px
42 |
43 |
44 | frameRateDecoded:
45 | {this.props.videoStats.frameRateDecoded} fps
46 |
47 |
48 | frameRateReceived:
49 | {this.props.videoStats.frameRateReceived} fps
50 |
51 |
52 | framesReceived:
53 | {this.props.videoStats.framesReceived}
54 |
55 |
56 | framesDropped:
57 | {this.props.videoStats.framesDropped}
58 |
59 |
60 | framesDecoded:
61 | {this.props.videoStats.framesDecoded}
62 |
63 |
64 |
65 | );
66 | }
67 | }
68 |
--------------------------------------------------------------------------------
/Project/src/MakeCall/VideoSendStats.js:
--------------------------------------------------------------------------------
1 | import React from "react";
2 |
3 | export default class VideoSendStats extends React.Component {
4 | constructor(props) {
5 | super(props);
6 | }
7 |
8 | render() {
9 | return (
10 |
11 | {
12 | this.props.videoStats &&
13 |
14 |
15 | codec:
16 | {this.props.videoStats.codecName}
17 |
18 |
19 | bitrate:
20 | {(this.props.videoStats.bitrate/1000).toFixed(1)} kbps
21 |
22 |
23 | jitter:
24 | {this.props.videoStats.jitterInMs} ms
25 |
26 |
27 | rtt:
28 | {this.props.videoStats.rttInMs} ms
29 |
30 |
31 | packetsPerSecond:
32 | {this.props.videoStats.packetsPerSecond}
33 |
34 |
35 | frameWidthSent:
36 | {this.props.videoStats.frameWidthSent} px
37 |
38 |
39 | frameHeightSent:
40 | {this.props.videoStats.frameHeightSent} px
41 |
42 |
43 | frameRateInput:
44 | {this.props.videoStats.frameRateInput} fps
45 |
46 |
47 | frameRateEncoded:
48 | {this.props.videoStats.frameRateEncoded} fps
49 |
50 | {
51 | this.props.videoStats.altLayouts?.length > 0 &&
52 |
53 | simulcast:
54 | {this.props.videoStats.altLayouts[0].frameWidthSent}x{this.props.videoStats.altLayouts[0].frameHeightSent}
55 |
56 | }
57 |
58 | }
59 |
60 | );
61 | }
62 | }
63 |
--------------------------------------------------------------------------------
/Project/src/MakeCall/VolumeVisualizer.js:
--------------------------------------------------------------------------------
1 | import React, { useEffect, useState } from "react";
2 | import { LocalAudioStream } from '@azure/communication-calling';
3 |
4 | const VolumeVisualizer = ({ deviceManager, call }) => {
5 | const [localStream, setLocalStream] = useState(new LocalAudioStream(deviceManager.selectedMicrophone));
6 | const [remoteStream, setRemoteStream] = useState(call?.remoteAudioStreams[0]);
7 |
8 |     useEffect(() => {
9 |         const selectedSpeakerChangedHandler = () => setRemoteStream(call?.remoteAudioStreams[0]);
10 |         const selectedMicrophoneChangedHandler = () => {
11 |             setLocalStream(new LocalAudioStream(deviceManager.selectedMicrophone));
12 |         };
13 |         deviceManager.on('selectedSpeakerChanged', selectedSpeakerChangedHandler);
14 |         deviceManager.on('selectedMicrophoneChanged', selectedMicrophoneChangedHandler);
15 | 
16 |         // Unsubscribe with the same handler references so the listeners are actually removed.
17 |         return () => {
18 |             deviceManager.off('selectedSpeakerChanged', selectedSpeakerChangedHandler);
19 |             deviceManager.off('selectedMicrophoneChanged', selectedMicrophoneChangedHandler);
20 |         }
21 |     }, []);
22 |
23 | return (
24 |
25 |
26 |
27 |
28 | );
29 | };
30 |
31 | const VolumeIndicator = ({ title, audioStream }) => {
32 | const [volumeLevel, setVolumeLevel] = useState(0);
33 | const [volume, setVolume] = useState();
34 |
35 | useEffect(() => {
36 | const setUpVolume = async() => {
37 | const volume = await audioStream?.getVolume();
38 | setVolume(volume);
39 | volume?.on('levelChanged', () => {
40 | setVolumeLevel(volume.level);
41 | });
42 |             setVolumeLevel(volume?.level ?? 0);
43 | };
44 | setUpVolume();
45 | }, [audioStream]);
46 |
47 | useEffect(() => {
48 | return () => {
49 | volume?.off('levelChanged', () => {});
50 | };
51 | }, [volume])
52 |
53 | return (
54 |
58 | );
59 | };
60 |
61 | export default VolumeVisualizer;
62 |
--------------------------------------------------------------------------------
/Project/src/Utils/Utils.js:
--------------------------------------------------------------------------------
1 | import {
2 | isCommunicationUserIdentifier,
3 | isPhoneNumberIdentifier,
4 | isMicrosoftTeamsUserIdentifier,
5 | isUnknownIdentifier,
6 | createIdentifierFromRawId
7 | } from '@azure/communication-common';
8 | import { PublicClientApplication } from "@azure/msal-browser";
9 | import { authConfig, authScopes } from "../../oAuthConfig"
10 | import axios from 'axios';
11 |
12 | export const utils = {
13 | getAppServiceUrl: () => {
14 | return window.location.origin;
15 | },
16 | getCommunicationUserToken: async (communicationUserId, isJoinOnlyToken) => {
17 | let data = {};
18 | if (communicationUserId) {
19 | data.communicationUserId = communicationUserId;
20 | }
21 | if (isJoinOnlyToken) {
22 | data.isJoinOnlyToken = isJoinOnlyToken;
23 | }
24 | let response = await axios({
25 | url: 'getCommunicationUserToken',
26 | method: 'POST',
27 | headers: {
28 | 'Content-Type': 'application/json'
29 | },
30 | data: JSON.stringify(data)
31 | })
32 | if (response.status === 200) {
33 | return response.data;
34 | }
35 | throw new Error('Failed to get ACS User Access token');
36 | },
37 | getCommunicationUserTokenForOneSignalRegistrationToken: async (oneSignalRegistrationToken) => {
38 | let response = await axios({
39 | url: 'getCommunicationUserTokenForOneSignalRegistrationToken',
40 | method: 'POST',
41 | headers: {
42 | 'Content-Type': 'application/json'
43 | },
44 | data: JSON.stringify({oneSignalRegistrationToken})
45 | });
46 | if (response.status === 200) {
47 | return response.data;
48 | }
49 |         throw new Error('Failed to get ACS User Access token for the given OneSignal Registration Token');
50 | },
51 | getOneSignalRegistrationTokenForCommunicationUserToken: async (token, communicationUserId) => {
52 | let response = await axios({
53 | url: 'getOneSignalRegistrationTokenForCommunicationUserToken',
54 | method: 'POST',
55 | headers: {
56 | 'Content-Type': 'application/json'
57 | },
58 | data: JSON.stringify({token, communicationUserId})
59 | });
60 | if (response.status === 200) {
61 | return response.data;
62 | }
63 |         throw new Error('Failed to get ACS User Access token for the given OneSignal Registration Token');
64 | },
65 | teamsPopupLogin: async () => {
66 | const oAuthObj = new PublicClientApplication(authConfig);
67 |         const popupLoginResponse = await oAuthObj.loginPopup({scopes: authScopes.popUpLogin});
68 | const response = await axios({
69 | url: 'teamsPopupLogin',
70 | method: 'POST',
71 | headers: {
72 | 'Accept': 'application/json, text/plain, */*',
73 | 'Content-type': 'application/json'
74 | },
75 | data: JSON.stringify({
76 |                 aadToken: popupLoginResponse.accessToken,
77 |                 userObjectId: popupLoginResponse.uniqueId
78 | })
79 | });
80 | if (response.status === 200) {
81 | return response.data;
82 | }
83 |         throw new Error('Failed to get Teams User Access token');
84 | },
85 | teamsM365Login: async (email, password) => {
86 | const response = await axios({
87 | url: 'teamsM365Login',
88 | method: 'POST',
89 | headers: {
90 | 'Accept': 'application/json, text/plain, */*',
91 | 'Content-type': 'application/json'
92 | },
93 | data: JSON.stringify({email, password })
94 | })
95 | if (response.status === 200) {
96 | return response.data;
97 | }
98 |         throw new Error('Failed to get Teams User Access token');
99 | },
100 | createRoom: async (pstnDialOutEnabled, presenterUserIds, collaboratorUserIds, attendeeUserIds, consumerUserIds) => {
101 | try {
102 | const data = {};
103 | data.pstnDialOutEnabled = pstnDialOutEnabled;
104 | if (presenterUserIds) {
105 | data.presenterUserIds = presenterUserIds.split(',').map(id => id.trim());
106 | }
107 | if (collaboratorUserIds) {
108 | data.collaboratorUserIds = collaboratorUserIds.split(',').map(id => id.trim());
109 | }
110 | if (attendeeUserIds) {
111 | data.attendeeUserIds = attendeeUserIds.split(',').map(id => id.trim());
112 | }
113 | if (consumerUserIds) {
114 | data.consumerUserIds = consumerUserIds.split(',').map(id => id.trim());
115 | }
116 |
117 | const response = await axios({
118 | url: 'createRoom',
119 | method: 'POST',
120 | headers: {
121 | 'Accept': 'application/json, text/plain, */*',
122 | 'Content-type': 'application/json'
123 | },
124 | data: JSON.stringify(data)
125 | });
126 | console.log('Room created successfully:', response.data);
127 | return response.data.roomId;
128 |
129 | } catch (error) {
130 | console.error('Error creating room:', error);
131 | throw error.response.data.message;
132 | }
133 | },
134 | updateParticipant: async (patchRoomId, patchParticipantId, patchParticipantRole) => {
135 | try {
136 | if (!patchRoomId.trim() || !patchParticipantId.trim() || !patchParticipantRole.trim()) {
137 | throw new Error('All parameters (patchRoomId, patchParticipantId, patchParticipantRole) must be non-empty strings without trailing whitespace.');
138 | }
139 |
140 | const response = await axios({
141 | url: 'updateParticipant',
142 | method: 'PATCH',
143 | headers: {
144 | 'Accept': 'application/json, text/plain, */*',
145 | 'Content-type': 'application/json'
146 | },
147 | data: JSON.stringify({
148 | patchRoomId: patchRoomId.trim(),
149 | patchParticipantId: patchParticipantId.trim(),
150 | patchParticipantRole: patchParticipantRole.trim()
151 | })
152 | });
153 | console.log('Participant updated successfully:', response.data);
154 | } catch (error) {
155 | console.error('Error updating participant:', error);
156 | throw error.response?.data?.message || error.message;
157 | }
158 | },
159 | getIdentifierText: (identifier) => {
160 | if (isCommunicationUserIdentifier(identifier)) {
161 | return identifier.communicationUserId;
162 | } else if (isPhoneNumberIdentifier(identifier)) {
163 | return identifier.phoneNumber;
164 | } else if (isMicrosoftTeamsUserIdentifier(identifier)) {
165 | return identifier.microsoftTeamsUserId;
166 | } else if (isUnknownIdentifier(identifier) && identifier.id === '8:echo123'){
167 | return 'Echo Bot';
168 | } else {
169 | return 'Unknown Identifier';
170 | }
171 | },
172 | getSizeInBytes(str) {
173 | return new Blob([str]).size;
174 | },
175 | getRemoteParticipantObjFromIdentifier(call, identifier) {
176 | switch(identifier.kind) {
177 | case 'communicationUser': {
178 | return call.remoteParticipants.find(rm => {
179 | return rm.identifier.communicationUserId === identifier.communicationUserId
180 | });
181 | }
182 | case 'microsoftTeamsUser': {
183 | return call.remoteParticipants.find(rm => {
184 | return rm.identifier.microsoftTeamsUserId === identifier.microsoftTeamsUserId
185 | });
186 | }
187 | case 'phoneNumber': {
188 | return call.remoteParticipants.find(rm => {
189 | return rm.identifier.phoneNumber === identifier.phoneNumber
190 | });
191 | }
192 | case 'unknown': {
193 | return call.remoteParticipants.find(rm => {
194 | return rm.identifier.id === identifier.id
195 | });
196 | }
197 | }
198 | },
199 | isParticipantSpotlighted(participantId, spotlightState) {
200 | if (!participantId || !spotlightState) { return false }
201 | let rtn = spotlightState.find(element => this.getIdentifierText(element.identifier) === this.getIdentifierText(participantId));
202 | return !!rtn
203 |
204 | },
205 | isParticipantHandRaised(participantId, raisedHandState) {
206 | if (!participantId || !raisedHandState) { return false }
207 | let rtn = raisedHandState.find(element => this.getIdentifierText(element.identifier) === this.getIdentifierText(participantId));
208 | return !!rtn
209 | },
210 | getParticipantPublishStates(participantId, publishedStates) {
211 | let states = {isSpotlighted: false, isHandRaised: false}
212 | states.isSpotlighted = this.isParticipantSpotlighted(participantId, publishedStates.spotlight)
213 | states.isHandRaised = this.isParticipantHandRaised(participantId, publishedStates.raiseHand)
214 | return states
215 | }
216 | }
217 |
--------------------------------------------------------------------------------
/Project/src/index.js:
--------------------------------------------------------------------------------
1 | import React from 'react';
2 | import ReactDOM from 'react-dom';
3 | import App from './App';
4 | import * as serviceWorker from './serviceWorker';
5 |
6 | ReactDOM.render( , document.getElementById('root'));
7 |
8 | // If you want your app to work offline and load faster, you can change
9 | // unregister() to register() below. Note this comes with some pitfalls.
10 | // Learn more about service workers: https://bit.ly/CRA-PWA
11 | // serviceWorker.unregister();
12 |
--------------------------------------------------------------------------------
/Project/src/serviceWorker.js:
--------------------------------------------------------------------------------
1 | // This optional code is used to register a service worker.
2 | // register() is not called by default.
3 |
4 | // This lets the app load faster on subsequent visits in production, and gives
5 | // it offline capabilities. However, it also means that developers (and users)
6 | // will only see deployed updates on subsequent visits to a page, after all the
7 | // existing tabs open on the page have been closed, since previously cached
8 | // resources are updated in the background.
9 |
10 | // To learn more about the benefits of this model and instructions on how to
11 | // opt-in, read https://bit.ly/CRA-PWA
12 |
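// A minimal usage sketch (not enabled by default): register() below accepts optional
// onUpdate/onSuccess callbacks, so an opt-in call from src/index.js could look like:
//
//   serviceWorker.register({
//     onUpdate: registration => console.log('New content will be used once all tabs are closed.'),
//     onSuccess: registration => console.log('Content is cached for offline use.')
//   });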
13 | const isLocalhost = Boolean(
14 | window.location.hostname === 'localhost' ||
15 | // [::1] is the IPv6 localhost address.
16 | window.location.hostname === '[::1]' ||
17 | // 127.0.0.0/8 are considered localhost for IPv4.
18 | window.location.hostname.match(
19 | /^127(?:\.(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)){3}$/
20 | )
21 | );
22 |
23 | export function register(config) {
24 | if (process.env.NODE_ENV === 'production' && 'serviceWorker' in navigator) {
25 | // The URL constructor is available in all browsers that support SW.
26 | const publicUrl = new URL(process.env.PUBLIC_URL, window.location.href);
27 | if (publicUrl.origin !== window.location.origin) {
28 | // Our service worker won't work if PUBLIC_URL is on a different origin
29 | // from what our page is served on. This might happen if a CDN is used to
30 | // serve assets; see https://github.com/facebook/create-react-app/issues/2374
31 | return;
32 | }
33 |
34 | window.addEventListener('load', () => {
35 | const swUrl = `${process.env.PUBLIC_URL}/service-worker.js`;
36 |
37 | if (isLocalhost) {
38 | // This is running on localhost. Let's check if a service worker still exists or not.
39 | checkValidServiceWorker(swUrl, config);
40 |
41 | // Add some additional logging to localhost, pointing developers to the
42 | // service worker/PWA documentation.
43 | navigator.serviceWorker.ready.then(() => {
44 | console.log(
45 | 'This web app is being served cache-first by a service ' +
46 | 'worker. To learn more, visit https://bit.ly/CRA-PWA'
47 | );
48 | });
49 | } else {
50 | // Is not localhost. Just register service worker
51 | registerValidSW(swUrl, config);
52 | }
53 | });
54 | }
55 | }
56 |
57 | function registerValidSW(swUrl, config) {
58 | navigator.serviceWorker
59 | .register(swUrl)
60 | .then(registration => {
61 | registration.onupdatefound = () => {
62 | const installingWorker = registration.installing;
63 | if (installingWorker == null) {
64 | return;
65 | }
66 | installingWorker.onstatechange = () => {
67 | if (installingWorker.state === 'installed') {
68 | if (navigator.serviceWorker.controller) {
69 | // At this point, the updated precached content has been fetched,
70 | // but the previous service worker will still serve the older
71 | // content until all client tabs are closed.
72 | console.log(
73 | 'New content is available and will be used when all ' +
74 | 'tabs for this page are closed. See https://bit.ly/CRA-PWA.'
75 | );
76 |
77 | // Execute callback
78 | if (config && config.onUpdate) {
79 | config.onUpdate(registration);
80 | }
81 | } else {
82 | // At this point, everything has been precached.
83 | // It's the perfect time to display a
84 | // "Content is cached for offline use." message.
85 | console.log('Content is cached for offline use.');
86 |
87 | // Execute callback
88 | if (config && config.onSuccess) {
89 | config.onSuccess(registration);
90 | }
91 | }
92 | }
93 | };
94 | };
95 | })
96 | .catch(error => {
97 | console.error('Error during service worker registration:', error);
98 | });
99 | }
100 |
101 | function checkValidServiceWorker(swUrl, config) {
102 | // Check if the service worker can be found. If it can't reload the page.
103 | fetch(swUrl, {
104 | headers: { 'Service-Worker': 'script' }
105 | })
106 | .then(response => {
107 | // Ensure service worker exists, and that we really are getting a JS file.
108 | const contentType = response.headers.get('content-type');
109 | if (
110 | response.status === 404 ||
111 | (contentType != null && contentType.indexOf('javascript') === -1)
112 | ) {
113 | // No service worker found. Probably a different app. Reload the page.
114 | navigator.serviceWorker.ready.then(registration => {
115 | registration.unregister().then(() => {
116 | window.location.reload();
117 | });
118 | });
119 | } else {
120 | // Service worker found. Proceed as normal.
121 | registerValidSW(swUrl, config);
122 | }
123 | })
124 | .catch(() => {
125 | console.log(
126 | 'No internet connection found. App is running in offline mode.'
127 | );
128 | });
129 | }
130 |
131 | export function unregister() {
132 | if ('serviceWorker' in navigator) {
133 | navigator.serviceWorker.ready.then(registration => {
134 | registration.unregister();
135 | });
136 | }
137 | }
138 |
--------------------------------------------------------------------------------
/Project/webpack.config.js:
--------------------------------------------------------------------------------
1 | const CommunicationIdentityClient = require("@azure/communication-identity").CommunicationIdentityClient;
2 | const { RoomsClient } = require('@azure/communication-rooms');
3 | const HtmlWebPackPlugin = require("html-webpack-plugin");
4 | const config = require("./serverConfig.json");
5 | const clientConfig = require("./clientConfig.json");
6 | const axios = require("axios");
7 | const bodyParser = require('body-parser');
8 | const msal = require('@azure/msal-node');
9 |
10 | const {authConfig, authScopes} = require('./oAuthConfig');
11 | const clientId = authConfig.auth.clientId;
12 |
13 |
14 | if(!config || !config.connectionString || config.connectionString.indexOf('endpoint=') === -1)
15 | {
16 | throw new Error("Update `serverConfig.json` with connection string");
17 | }
18 |
19 | const communicationIdentityClient = new CommunicationIdentityClient(config.connectionString);
20 |
21 | const PORT = process.env.port || 8080;
22 |
23 |
24 | const oneSignalRegistrationTokenToAcsUserAccesTokenMap = new Map();
25 | const registerCommunicationUserForOneSignal = async (communicationAccessToken, communicationUserIdentifier) => {
26 | const oneSignalRegistrationToken = generateGuid();
27 | await axios({
28 | url: config.functionAppOneSignalTokenRegistrationUrl,
29 | method: 'PUT',
30 | headers: {
31 | 'Content-Type': 'application/json'
32 | },
33 | data: JSON.stringify({
34 | communicationUserId: communicationUserIdentifier.communicationUserId,
35 | oneSignalRegistrationToken,
36 | oneSignalAppId: clientConfig.oneSignalAppId
37 | })
38 | }).then((response) => { return response.data });
39 | oneSignalRegistrationTokenToAcsUserAccesTokenMap.set(oneSignalRegistrationToken, { communicationAccessToken, communicationUserIdentifier });
40 | return oneSignalRegistrationToken;
41 | }
42 |
43 | const generateGuid = function () {
44 | function s4() {
45 | return Math.floor((Math.random() + 1) * 0x10000).toString(16).substring(1);
46 | }
47 | return `${s4()}${s4()}-${s4()}-${s4()}-${s4()}-${s4()}${s4()}${s4()}`;
48 | }
49 |
50 | function parseJWT (token) {
51 | return JSON.parse(Buffer.from(token.split('.')[1], 'base64').toString());
52 | }
53 |
54 | // Exchanging Azure AD access token of a Teams User for a Communication access token
55 | // https://learn.microsoft.com/en-us/azure/communication-services/quickstarts/manage-teams-identity?pivots=programming-language-javascript
56 | const getACSAccessTokenInfo = async (aadToken, userObjectId) => {
57 | let acsToken;
58 | try{
59 | acsToken = await communicationIdentityClient.getTokenForTeamsUser({
60 | teamsUserAadToken: aadToken,
61 | clientId,
62 | userObjectId: userObjectId
63 | });
64 | } catch(e) {
65 | console.log('ERROR', e);
66 | throw e
67 | }
68 |
69 | let parsedToken = parseJWT(acsToken.token);
70 | if (parsedToken == '') {
71 | throw (" Parsed Token is empty");
72 | }
73 | const mri = `8:${parsedToken.skypeid}`;
74 | const tokenResponse = {
75 | token: acsToken.token,
76 | userId: { communicationUserId: mri }
77 | };
78 | return tokenResponse;
79 | }
80 |
81 | module.exports = {
82 | devtool: 'inline-source-map',
83 | mode: 'development',
84 | entry: "./src/index.js",
85 | module: {
86 | rules: [
87 | {
88 | test: /\.(js|jsx)$/,
89 | exclude: /node_modules/,
90 | use: {
91 | loader: "babel-loader"
92 | }
93 | },
94 | {
95 | test: /\.(ts|tsx)?$/,
96 | use: 'ts-loader',
97 | exclude: /node_modules/,
98 | },
99 | {
100 | test: /\.html$/,
101 | use: [
102 | {
103 | loader: "html-loader"
104 | }
105 | ]
106 | },
107 | {
108 | test: /\.css$/,
109 | use: ["style-loader", "css-loader"]
110 | }
111 | ]
112 | },
113 | plugins: [
114 | new HtmlWebPackPlugin({
115 | template: "./public/index.html",
116 | filename: "./index.html"
117 | })
118 | ],
119 | devServer: {
120 | open: true,
121 | port: PORT,
122 | static:'./public',
123 | allowedHosts:[
124 | '.azurewebsites.net'
125 | ],
126 | webSocketServer: false,
127 | setupMiddlewares: (middlewares, devServer) => {
128 | if (!devServer) {
129 | throw new Error('webpack-dev-server is not defined');
130 | }
131 |
132 | devServer.app.use(bodyParser.json());
133 | devServer.app.post('/getCommunicationUserToken', async (req, res) => {
134 | try {
135 | const communicationUserId = req.body.communicationUserId;
136 | const isJoinOnlyToken = req.body.isJoinOnlyToken === true;
137 | let CommunicationUserIdentifier;
138 | if (!communicationUserId) {
139 | CommunicationUserIdentifier = await communicationIdentityClient.createUser();
140 | } else {
141 | CommunicationUserIdentifier = { communicationUserId: communicationUserId };
142 | }
143 | const communicationUserToken = await communicationIdentityClient.getToken(CommunicationUserIdentifier, [isJoinOnlyToken ? "voip.join" : "voip"]);
144 | let oneSignalRegistrationToken;
145 | if (config.functionAppOneSignalTokenRegistrationUrl) {
146 | oneSignalRegistrationToken = await registerCommunicationUserForOneSignal(communicationUserToken, CommunicationUserIdentifier);
147 | }
148 | res.setHeader('Content-Type', 'application/json');
149 | res.status(200).json({communicationUserToken, oneSignalRegistrationToken, userId: CommunicationUserIdentifier });
150 | } catch (e) {
151 | console.log('Error setting registration token', e);
152 | res.sendStatus(500);
153 | }
154 | });
155 | devServer.app.post('/getCommunicationUserTokenForOneSignalRegistrationToken', async (req, res) => {
156 | try {
157 | const oneSignalRegistrationToken = req.body.oneSignalRegistrationToken;
158 |               const { communicationAccessToken: communicationUserToken, communicationUserIdentifier } = oneSignalRegistrationTokenToAcsUserAccesTokenMap.get(oneSignalRegistrationToken);
159 | res.setHeader('Content-Type', 'application/json');
160 | res.status(200).json({ communicationUserToken, userId: communicationUserIdentifier, oneSignalRegistrationToken });
161 | } catch (e) {
162 | console.log('Error setting registration token', e);
163 | res.sendStatus(500);
164 | }
165 | });
166 | devServer.app.post('/getOneSignalRegistrationTokenForCommunicationUserToken', async (req, res) => {
167 | try {
168 | const communicationUserToken = {token: req.body.token };
169 | const communicationUserIdentifier = { communicationUserId: req.body.communicationUserId };
170 |
171 | if (!config.functionAppOneSignalTokenRegistrationUrl) {
172 | res.setHeader('Content-Type', 'application/json');
173 | res.status(200).json({
174 | communicationUserToken, userId: communicationUserIdentifier
175 | });
176 | return;
177 | }
178 |
179 | let pair = [...oneSignalRegistrationTokenToAcsUserAccesTokenMap.entries()].find((pair) => {
180 |                 return pair[1].communicationAccessToken.token === communicationUserToken.token &&
181 |                   pair[1].communicationUserIdentifier.communicationUserId === communicationUserIdentifier.communicationUserId;
182 | });
183 | let oneSignalRegistrationToken;
184 | if (pair) {
185 | oneSignalRegistrationToken = pair[0];
186 | } else {
187 | oneSignalRegistrationToken = await registerCommunicationUserForOneSignal(communicationUserToken, communicationUserIdentifier);
188 | }
189 | res.setHeader('Content-Type', 'application/json');
190 | res.status(200).json({
191 | communicationUserToken,
192 | userId: communicationUserIdentifier,
193 | oneSignalRegistrationToken
194 | });
195 | } catch (e) {
196 | console.log('Error setting registration token', e);
197 | res.sendStatus(500);
198 | }
199 | });
200 | devServer.app.post('/teamsPopupLogin', async (req, res) => {
201 | try {
202 | const aadToken = req.body.aadToken;
203 | const userObjectId = req.body.userObjectId;
204 | let acsTokenInfo = await getACSAccessTokenInfo(aadToken, userObjectId);
205 | res.setHeader('Content-Type', 'application/json');
206 | res.status(200).json({
207 | communicationUserToken: {token: acsTokenInfo.token},
208 | userId: acsTokenInfo.userId
209 | });
210 | } catch (e) {
211 | console.error(e);
212 | res.sendStatus(400);
213 | }
214 | });
215 | devServer.app.post('/teamsM365Login', async (req, res) => {
216 | try {
217 | const email = req.body.email;
218 | const password = req.body.password;
219 |
220 | const pca = new msal.PublicClientApplication(authConfig);
221 | let tokenRequest = {scopes: authScopes.m365Login}
222 |
223 | tokenRequest.username = email;
224 | tokenRequest.password = password;
225 | const response = await pca.acquireTokenByUsernamePassword(tokenRequest);
226 | let acsTokenInfo = await getACSAccessTokenInfo(response.accessToken, response.uniqueId);
227 |
228 | res.setHeader('Content-Type', 'application/json');
229 | res.status(200).json({
230 | communicationUserToken: {token: acsTokenInfo.token},
231 | userId: acsTokenInfo.userId
232 | });
233 | } catch (e) {
234 | console.error(e);
235 | res.sendStatus(400);
236 | }
237 | });
238 | devServer.app.post('/createRoom', async (req, res) => {
239 | try {
240 | let participants = [];
241 | console.log('req.body:', req.body);
242 | if (req.body.presenterUserIds && Array.isArray(req.body.presenterUserIds)) {
243 | req.body.presenterUserIds.forEach(presenterUserId => {
244 | participants.push({
245 | id: { communicationUserId: presenterUserId },
246 | role: "Presenter"
247 | });
248 | });
249 | }
250 | if (req.body.collaboratorUserIds && Array.isArray(req.body.collaboratorUserIds)) {
251 | req.body.collaboratorUserIds.forEach(collaboratorUserId => {
252 | participants.push({
253 | id: { communicationUserId: collaboratorUserId },
254 | role: "Collaborator"
255 | });
256 | });
257 | }
258 | if (req.body.attendeeUserIds && Array.isArray(req.body.attendeeUserIds)) {
259 | req.body.attendeeUserIds.forEach(attendeeUserId => {
260 | participants.push({
261 | id: { communicationUserId: attendeeUserId },
262 | role: "Attendee"
263 | });
264 | });
265 | }
266 | if (req.body.consumerUserIds && Array.isArray(req.body.consumerUserIds)) {
267 | req.body.consumerUserIds.forEach(consumerUserId => {
268 | participants.push({
269 | id: { communicationUserId: consumerUserId },
270 | role: "Consumer"
271 | });
272 | });
273 | }
274 |
275 | if (participants.length === 0) {
276 | res.status(400).json({
277 | message: "At least one participant must be provided to create a room."
278 | });
279 | return;
280 | }
281 |
282 | console.log('participants:', participants);
283 | const validFrom = new Date(Date.now());
284 | const validUntil = new Date(validFrom.getTime() + 60 * 60 * 1000);
285 | const pstnDialOutEnabled = req.body.pstnDialOutEnabled;
286 | const roomsClient = new RoomsClient(config.connectionString);
287 | const createRoom = await roomsClient.createRoom({
288 | validFrom,
289 | validUntil,
290 | pstnDialOutEnabled,
291 | participants
292 | });
293 | const roomId = createRoom.id;
294 | console.log('\nRoom successfully created');
295 | console.log('Room ID:', roomId);
296 | console.log('Participants:', participants);
297 |
298 | res.setHeader('Content-Type', 'application/json');
299 | res.status(200).json({
300 | roomId
301 | });
302 | } catch (e) {
303 | console.error(e);
304 |             res.status(500).json({ message: e.message });
305 | }
306 | });
307 | devServer.app.patch('/updateParticipant', async (req, res) => {
308 | try {
309 | const roomId = req.body.patchRoomId;
310 | const participantId = req.body.patchParticipantId;
311 | const participantRole = req.body.patchParticipantRole;
312 | const roomsClient = new RoomsClient(config.connectionString);
313 | const participant = [
314 | {
315 | id: { communicationUserId: participantId},
316 | role: participantRole,
317 | },
318 | ];
319 | await roomsClient.addOrUpdateParticipants(roomId, participant);
320 | res.setHeader('Content-Type', 'application/json');
321 | res.status(200).json({message: 'Participant updated successfully'});
322 | } catch (e) {
323 | console.error(e);
324 |             res.status(500).json({ message: e.message });
325 | }
326 | });
327 |
328 | return middlewares;
329 | }
330 | }
331 | };
332 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | ---
2 | page_type: sample
3 | languages:
4 | - javascript
5 | - nodejs
6 | products:
7 | - azure
8 | - azure-communication-services
9 | ---
10 |
11 | # ACS Calling Tutorial
12 |
13 | ## Prerequisites
14 | - An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).
15 | - A deployed Communication Services resource. [Create a Communication Services resource](https://learn.microsoft.com/en-us/azure/communication-services/quickstarts/create-communication-resource?tabs=windows&pivots=platform-azp).
16 | - [NPM](https://www.npmjs.com/get-npm)
17 | - You need to have [Node.js 18](https://nodejs.org/dist/v18.18.0/). You can use the MSI installer to install it.
18 |
19 | ## Code structure
20 | * ./Project/src: client side source code
21 | * ./Project/webpack.config.js: Project bundler. Has a simple local server for user token provisioning.
22 | * ./Project/serverConfig.json: configuration file for specifying the connection strings.
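For reference, the simple local server defined in webpack.config.js exposes the following token provisioning and Rooms endpoints:

```
POST  /getCommunicationUserToken
POST  /getCommunicationUserTokenForOneSignalRegistrationToken
POST  /getOneSignalRegistrationTokenForCommunicationUserToken
POST  /teamsPopupLogin
POST  /teamsM365Login
POST  /createRoom
PATCH /updateParticipant
```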
23 |
24 | ## Cloning the repo
25 | 1. Open a terminal or command prompt, and `cd` into a folder where you would like to clone this repo. Then run:
26 | - `git clone https://github.com/Azure-Samples/communication-services-web-calling-tutorial`
27 | - `cd communication-services-web-calling-tutorial/Project`
28 | ## Running the app locally
29 | 1. Get your Azure Communication Services resource connection string from the Azure portal, and set it as the value for `connectionString` in the serverConfig.json file (see the example below).
30 | 2. From the terminal/command prompt, run:
31 | - `npm install`
32 | - `npm run build-local`
33 | - `npm run start-local`
34 | 3. Open localhost:5000 in a browser. (Supported browsers are Chrome, Edge, and Safari)
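A minimal sketch of serverConfig.json (the values below are placeholders; `functionAppOneSignalTokenRegistrationUrl` can be left empty unless you set up web push notifications, and the file may contain other keys):

```json
{
    "connectionString": "endpoint=https://<your-resource>.communication.azure.com/;accesskey=<your-access-key>",
    "functionAppOneSignalTokenRegistrationUrl": ""
}
```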
35 |
36 | ## Deploying to Azure App Service
37 | - This app has been set up to be easily deployed to Azure App Service.
38 | - webpack.config.js (see the excerpt below).
39 | - allowedHosts: Allows this app to be hosted on \.azurewebsites.net, which is the domain Azure App Service uses to host web apps.
40 | - static (named contentBase in older webpack-dev-server versions): The folder that public assets are served from. For example, a request to your app like GET https://\.azurewebsites.net/file.txt will serve the file.txt that resides in this folder. This app has this field set to the './public' folder.
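For reference, the relevant devServer settings in this repo's webpack.config.js are:

```js
devServer: {
    open: true,
    port: PORT, // process.env.port || 8080
    static: './public',
    allowedHosts: [
        '.azurewebsites.net'
    ],
    webSocketServer: false,
    // ...
}
```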
41 | - package.json. Azure App Service will run these scripts when deploying.
42 | - "build" script. Used by Azure App Service when deploying to build the application.
43 | - "start" script. Used by Azure App Service when deploying. This will start the server on port 8080. Port 8080 is specified in webpack.config.js. Do not change this port when deploying to Azure App Service because this is the port that Azure App Service uses.
44 | [Tutorial on how to deploy a Node.js app to Azure App Service](https://learn.microsoft.com/en-us/azure/app-service/quickstart-nodejs?tabs=windows&pivots=development-environment-vscode)
45 | Note: If you deploy this application to an environment other than Azure App Service, you may need to change these configurations to match that environment's requirements.
46 |
47 | ## Troubleshooting
48 | - Make sure your ACS connection string is specified in serverConfig.json, or you won't be able to provision ACS User Access tokens for the app.
49 | - If any errors occur, check the browser console logs for errors. Also check the webpack server-side console logs.
50 | - Web Push Notifications - To test web push notifications, the app must be served over HTTPS, so you will need to deploy it to a server that serves the application over HTTPS. You will also need to specify a value for the "oneSignalAppId" key in ./clientConfig.json and a value for the "functionAppOneSignalTokenRegistrationUrl" key in ./serverConfig.json (see the sketch below). To learn how to set up a web push notification architecture for the ACS Web Calling SDK, follow the [ACS Web Calling SDK - Web push notifications tutorial](https://github.com/Azure-Samples/communication-services-javascript-quickstarts/tree/main/calling-web-push-notifications).
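As a sketch, the push notification related keys look like this (all values are placeholders; both files may contain other keys):

./clientConfig.json
```json
{
    "oneSignalAppId": "<your-onesignal-app-id>"
}
```

./serverConfig.json
```json
{
    "connectionString": "<your-acs-connection-string>",
    "functionAppOneSignalTokenRegistrationUrl": "<your-token-registration-function-url>"
}
```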
51 |
52 | ## Resources
53 | 1. Documentation on how to use the ACS Calling SDK for Javascript can be found on https://docs.microsoft.com/en-gb/azure/communication-services/quickstarts/voice-video-calling/calling-client-samples?pivots=platform-web
54 | 2. ACS Calling SDK for Javascript API reference documentation can be found on https://docs.microsoft.com/en-us/javascript/api/azure-communication-services/@azure/communication-calling/?view=azure-communication-services-js
55 | 3. Documentation on Communications Calling SDK with Teams identity can be found on https://learn.microsoft.com/en-us/azure/communication-services/concepts/teams-interop
56 | 4. Documentation on how to setup and get access tokens for teams User can be found on https://learn.microsoft.com/en-us/azure/communication-services/quickstarts/manage-teams-identity?pivots=programming-language-javascript
--------------------------------------------------------------------------------
/SECURITY.md:
--------------------------------------------------------------------------------
1 | # Security Policy
2 |
3 | ## Supported Versions
4 |
5 | Use this section to tell people about which versions of your project are
6 | currently being supported with security updates.
7 |
8 | | Version | Supported |
9 | | ------- | ------------------ |
10 | | 5.1.x | :white_check_mark: |
11 | | 5.0.x | :x: |
12 | | 4.0.x | :white_check_mark: |
13 | | < 4.0 | :x: |
14 |
15 | ## Reporting a Vulnerability
16 |
17 | Use this section to tell people how to report a vulnerability.
18 |
19 | Tell them where to go, how often they can expect to get an update on a
20 | reported vulnerability, what to expect if the vulnerability is accepted or
21 | declined, etc.
22 |
--------------------------------------------------------------------------------
/package-lock.json:
--------------------------------------------------------------------------------
1 | {
2 | "lockfileVersion": 1
3 | }
4 |
--------------------------------------------------------------------------------