├── docs
│   ├── img
│   │   ├── container.png
│   │   ├── architecture.png
│   │   ├── devops_project.png
│   │   ├── import-git-repo.png
│   │   ├── jmeter-plugin.jpg
│   │   ├── latency_example.jpg
│   │   ├── ui-run-pipeline.png
│   │   ├── dashboard_example.jpg
│   │   ├── jmeter-dashboard.png
│   │   ├── pipeline-artifacts.png
│   │   ├── azdo-test-results-fail.png
│   │   ├── create-variable-group.png
│   │   ├── azdo-test-results-success.png
│   │   └── create-service-connection.png
│   ├── estimating-costs.md
│   ├── integrating-application-insights.md
│   ├── implementation-notes.md
│   ├── jmeter-pipeline-settings.md
│   └── adding-jmeter-plugins.md
├── terraform
│   ├── provider.tf
│   ├── output.tf
│   ├── variables.tf
│   ├── variables_privateendpoint.temp
│   ├── main_privateendpoint.temp
│   └── main.tf
├── other_tools
│   ├── SearchFunctions
│   │   ├── Properties
│   │   │   ├── serviceDependencies.json
│   │   │   └── serviceDependencies.local.json
│   │   ├── host.json
│   │   ├── SearchFunctions.csproj
│   │   ├── .gitignore
│   │   └── SearchTestNode.cs
│   ├── Data
│   │   └── SemanticScholar
│   │       ├── SemanticScholarDataUploader.csproj
│   │       ├── README.md
│   │       ├── Schema.cs
│   │       └── Program.cs
│   ├── PerfTest
│   │   ├── Models.cs
│   │   ├── SearchPerfTest.csproj
│   │   ├── ManagementClient.cs
│   │   └── Program.cs
│   ├── SearchFunctions.sln
│   ├── README.md
│   └── .gitignore
├── .github
│   ├── workflows
│   │   └── docker-ci.yml
│   ├── CODE_OF_CONDUCT.md
│   ├── ISSUE_TEMPLATE.md
│   └── PULL_REQUEST_TEMPLATE.md
├── .gitignore
├── CODE_OF_CONDUCT.md
├── pipelines
│   ├── azure-pipelines.docker.yml
│   └── azure-pipelines.load-test.yml
├── docker
│   └── Dockerfile
├── LICENSE.md
├── SECURITY.md
├── jmeter
│   ├── README.md
│   ├── sample_steps.jmx
│   ├── sample.jmx
│   └── terms_to_search.csv
├── CONTRIBUTING.md
├── scripts
│   └── jtl_junit_converter.py
└── README.md
/docs/img/container.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Azure-Samples/azure-search-performance-testing/HEAD/docs/img/container.png
--------------------------------------------------------------------------------
/docs/img/architecture.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Azure-Samples/azure-search-performance-testing/HEAD/docs/img/architecture.png
--------------------------------------------------------------------------------
/docs/img/devops_project.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Azure-Samples/azure-search-performance-testing/HEAD/docs/img/devops_project.png
--------------------------------------------------------------------------------
/docs/img/import-git-repo.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Azure-Samples/azure-search-performance-testing/HEAD/docs/img/import-git-repo.png
--------------------------------------------------------------------------------
/docs/img/jmeter-plugin.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Azure-Samples/azure-search-performance-testing/HEAD/docs/img/jmeter-plugin.jpg
--------------------------------------------------------------------------------
/docs/img/latency_example.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Azure-Samples/azure-search-performance-testing/HEAD/docs/img/latency_example.jpg
--------------------------------------------------------------------------------
/docs/img/ui-run-pipeline.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Azure-Samples/azure-search-performance-testing/HEAD/docs/img/ui-run-pipeline.png
--------------------------------------------------------------------------------
/docs/img/dashboard_example.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Azure-Samples/azure-search-performance-testing/HEAD/docs/img/dashboard_example.jpg
--------------------------------------------------------------------------------
/docs/img/jmeter-dashboard.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Azure-Samples/azure-search-performance-testing/HEAD/docs/img/jmeter-dashboard.png
--------------------------------------------------------------------------------
/docs/img/pipeline-artifacts.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Azure-Samples/azure-search-performance-testing/HEAD/docs/img/pipeline-artifacts.png
--------------------------------------------------------------------------------
/docs/img/azdo-test-results-fail.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Azure-Samples/azure-search-performance-testing/HEAD/docs/img/azdo-test-results-fail.png
--------------------------------------------------------------------------------
/docs/img/create-variable-group.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Azure-Samples/azure-search-performance-testing/HEAD/docs/img/create-variable-group.png
--------------------------------------------------------------------------------
/docs/img/azdo-test-results-success.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Azure-Samples/azure-search-performance-testing/HEAD/docs/img/azdo-test-results-success.png
--------------------------------------------------------------------------------
/docs/img/create-service-connection.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Azure-Samples/azure-search-performance-testing/HEAD/docs/img/create-service-connection.png
--------------------------------------------------------------------------------
/terraform/provider.tf:
--------------------------------------------------------------------------------
1 | provider "azurerm" {
2 | version = "=2.26.0"
3 | features {}
4 | }
5 |
6 | provider "random" {
7 | version = "=2.2.1"
8 | }
9 |
--------------------------------------------------------------------------------
/other_tools/SearchFunctions/Properties/serviceDependencies.json:
--------------------------------------------------------------------------------
1 | {
2 | "dependencies": {
3 | "storage1": {
4 | "type": "storage",
5 | "connectionId": "AzureWebJobsStorage"
6 | }
7 | }
8 | }
--------------------------------------------------------------------------------
/other_tools/SearchFunctions/Properties/serviceDependencies.local.json:
--------------------------------------------------------------------------------
1 | {
2 | "dependencies": {
3 | "storage1": {
4 | "type": "storage.emulator",
5 | "connectionId": "AzureWebJobsStorage"
6 | }
7 | }
8 | }
--------------------------------------------------------------------------------
/other_tools/SearchFunctions/host.json:
--------------------------------------------------------------------------------
1 | {
2 | "version": "2.0",
3 | "logging": {
4 | "applicationInsights": {
5 | "samplingSettings": {
6 | "isEnabled": true,
7 | "excludedTypes": "Request"
8 | }
9 | }
10 | }
11 | }
12 |
--------------------------------------------------------------------------------
/other_tools/Data/SemanticScholar/SemanticScholarDataUploader.csproj:
--------------------------------------------------------------------------------
1 |
2 |
3 |
4 | Exe
5 | netcoreapp3.1
6 |
7 |
8 |
9 |
10 |
11 |
12 |
13 |
14 |
--------------------------------------------------------------------------------
/.github/workflows/docker-ci.yml:
--------------------------------------------------------------------------------
1 | name: Docker-CI
2 |
3 | on:
4 | push:
5 | branches: [ main ]
6 | pull_request:
7 | branches: [ main ]
8 |
9 | jobs:
10 | build:
11 | runs-on: ubuntu-latest
12 |
13 | steps:
14 | - uses: actions/checkout@v2
15 |
16 | - name: Build JMeter Docker image
17 | run: docker build ./docker -t jmeter-build
18 |
19 | - name: Run JMeter in Docker
20 | run: docker run jmeter-build --version
21 |
--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------
1 | # Local .terraform directories
2 | **/.terraform/*
3 |
4 | # .tfstate files
5 | *.tfstate
6 | *.tfstate.*
7 |
8 | # log files
9 | *.log
10 |
11 | *.tfvars
12 |
13 | # Ignore override files as they are usually used to override resources locally and so
14 | # are not checked in
15 | override.tf
16 | override.tf.json
17 | *_override.tf
18 | *_override.tf.json
19 |
20 | # Include tfplan files to ignore the plan output of command: terraform plan -out=tfplan
21 | *tfplan*
22 | .idea/
--------------------------------------------------------------------------------
/.github/CODE_OF_CONDUCT.md:
--------------------------------------------------------------------------------
1 | # Microsoft Open Source Code of Conduct
2 |
3 | This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/).
4 |
5 | Resources:
6 |
7 | - [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/)
8 | - [Microsoft Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/)
9 | - Contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with questions or concerns
10 |
--------------------------------------------------------------------------------
/CODE_OF_CONDUCT.md:
--------------------------------------------------------------------------------
1 | # Microsoft Open Source Code of Conduct
2 |
3 | This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/).
4 |
5 | Resources:
6 |
7 | - [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/)
8 | - [Microsoft Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/)
9 | - Contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with questions or concerns
10 |
--------------------------------------------------------------------------------
/other_tools/Data/SemanticScholar/README.md:
--------------------------------------------------------------------------------
1 | # Semantic Scholar Data Upload for Azure Cognitive Search
2 |
3 | This application shows how to create an Azure Cognitive Search index with data provided by [Semantic Scholar](http://s2-public-api-prod.us-west-2.elasticbeanstalk.com/corpus/download/).
4 |
5 | For uploading the data, we will use the Azure Cognitive Search SDK because it allows for retries and exponential backoff.
6 | The average document size is ~3 KB. Each corpus file (e.g. s2-corpus-000.gz) has ~998,537 documents and is ~1,680 MB in raw text (~1.68 KB per document).
7 |
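The uploader in this folder is the C# `SemanticScholarDataUploader` project. Purely as an illustration of the batching-with-backoff idea, here is a minimal Python sketch using the `azure-search-documents` package; the endpoint, index name, key and sample document are placeholders, and the C# project should be used for real uploads.

```python
import time

from azure.core.credentials import AzureKeyCredential
from azure.core.exceptions import HttpResponseError
from azure.search.documents import SearchClient

# Placeholders only -- substitute your own service, index, and admin key.
client = SearchClient(
    endpoint="https://<your-service>.search.windows.net",
    index_name="<your-index>",
    credential=AzureKeyCredential("<admin-key>"),
)

def upload_with_backoff(batch, max_attempts=5):
    """Upload one batch of documents, retrying with exponential backoff."""
    for attempt in range(max_attempts):
        try:
            return client.upload_documents(documents=batch)
        except HttpResponseError:
            time.sleep(2 ** attempt)  # back off 1s, 2s, 4s, ...
    raise RuntimeError("Batch failed after retries")

# Field names follow Schema.cs (id, title, ...).
upload_with_backoff([{"id": "1", "title": "Example paper"}])
```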
--------------------------------------------------------------------------------
/docs/estimating-costs.md:
--------------------------------------------------------------------------------
1 | # Estimating Costs
2 |
3 | It's recommended to use the [Azure Pricing Calculator](https://azure.microsoft.com/en-us/pricing/calculator/) to estimate the monthly costs.
4 |
5 | > The costs may change depending on your contracts with Microsoft.
6 |
7 | ### Example
8 |
9 | * 1 Basic Container Registry
10 | * 1 Standard Storage Account (General Purpose)
11 | * `N` Container Instance groups running for `M` seconds with `X` vCPUs; where:
12 | * `N` is the estimated number of instances in the load test (1 controller + `N'` workers)
13 | * `M` is the test duration in seconds
14 | * `X` is the number of vCPUs for each instance group
15 |
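As a rough, illustrative sketch of that formula (not part of the repository's tooling), the snippet below multiplies the instance count, the duration and per-second rates. The rates are placeholders: Azure Container Instances bills per vCPU-second and per GB-second, and the actual prices depend on your region and agreement, so always confirm them in the pricing calculator.

```python
# Hypothetical pay-as-you-go rates -- always check the pricing calculator.
VCPU_PER_SECOND = 0.0000135       # assumed $ per vCPU-second
MEMORY_GB_PER_SECOND = 0.0000015  # assumed $ per GB-second

def container_instances_cost(n, m, x, memory_gb):
    """Cost of N container groups running for M seconds with X vCPUs each."""
    return n * m * (x * VCPU_PER_SECOND + memory_gb * MEMORY_GB_PER_SECOND)

# Example: 1 controller + 3 workers, 30-minute test, 2 vCPUs / 8 GB each.
print(f"~${container_instances_cost(4, 30 * 60, 2.0, 8.0):.2f} per test run")
```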
--------------------------------------------------------------------------------
/other_tools/PerfTest/Models.cs:
--------------------------------------------------------------------------------
1 | // Copyright (c) Microsoft Corporation. All rights reserved.
2 | // Licensed under the MIT License.
3 |
4 | using System;
5 | using System.Collections.Generic;
6 | using System.Text;
7 |
8 | namespace SearchPerfTest
9 | {
10 | public class PerfStat
11 | {
12 | public int runStatusCode { get; set; }
13 | public int runMS { get; set; }
14 | public DateTime runTime { get; set; }
15 | public long? resultCount { get; set; }
16 | }
17 |
18 | public class Query
19 | {
20 | public int q { get; set; }
21 | }
22 |
23 | public class Result
24 | {
25 |
26 | }
27 | }
28 |
--------------------------------------------------------------------------------
/pipelines/azure-pipelines.docker.yml:
--------------------------------------------------------------------------------
1 | trigger:
2 | branches:
3 | include:
4 | - main
5 | paths:
6 | include:
7 | - docker/*
8 |
9 | pool:
10 | vmImage: 'ubuntu-latest'
11 |
12 | variables:
13 | - group: JMETER_TERRAFORM_SETTINGS
14 |
15 | steps:
16 |
17 | - task: AzureCLI@2
18 | displayName: 'Build and Push JMeter Docker image'
19 | inputs:
20 | azureSubscription: $(AZURE_SERVICE_CONNECTION_NAME)
21 | scriptType: bash
22 | scriptLocation: inlineScript
23 | inlineScript: |
24 | az acr build -t $(TF_VAR_JMETER_DOCKER_IMAGE) -r $(TF_VAR_JMETER_ACR_NAME) -f $(Build.SourcesDirectory)/docker/Dockerfile $(Build.SourcesDirectory)/docker
25 |
--------------------------------------------------------------------------------
/docker/Dockerfile:
--------------------------------------------------------------------------------
1 | FROM justb4/jmeter:5.3
2 |
3 | # We will need two useful plugins
4 | # https://jmeter-plugins.org/wiki/TestPlanCheckTool/
5 | # https://jmeter-plugins.org/wiki/ConcurrencyThreadGroup/
6 | ENV PLAN_CHECK_PLUGIN_VERSION=2.4
7 | ENV CUSTOM_THREAD_VERSION=2.8
8 | RUN wget https://jmeter-plugins.org/files/packages/jpgc-plancheck-${PLAN_CHECK_PLUGIN_VERSION}.zip
9 | RUN unzip -o jpgc-plancheck-${PLAN_CHECK_PLUGIN_VERSION}.zip -d ${JMETER_HOME}
10 | RUN wget https://jmeter-plugins.org/files/packages/jpgc-casutg-${CUSTOM_THREAD_VERSION}.zip
11 | RUN unzip -o jpgc-casutg-${CUSTOM_THREAD_VERSION}.zip -d ${JMETER_HOME}
12 |
13 | EXPOSE 1099
14 |
15 | ENTRYPOINT ["/entrypoint.sh"]
16 |
--------------------------------------------------------------------------------
/other_tools/PerfTest/SearchPerfTest.csproj:
--------------------------------------------------------------------------------
1 |
2 |
3 |
4 | Exe
5 | netcoreapp3.1
6 |
7 |
8 |
9 | x64
10 |
11 |
12 |
13 |
14 |
15 |
16 |
17 |
18 |
19 |
20 |
21 |
22 |
23 |
24 |
--------------------------------------------------------------------------------
/.github/ISSUE_TEMPLATE.md:
--------------------------------------------------------------------------------
1 |
4 | > Please provide us with the following information:
5 | > ---------------------------------------------------------------
6 |
7 | ### This issue is for a: (mark with an `x`)
8 | ```
9 | - [ ] bug report -> please search issues before submitting
10 | - [ ] feature request
11 | - [ ] documentation issue or request
12 | - [ ] regression (a behavior that used to work and stopped in a new release)
13 | ```
14 |
15 | ### Minimal steps to reproduce
16 | >
17 |
18 | ### Any log messages given by the failure
19 | >
20 |
21 | ### Expected/desired behavior
22 | >
23 |
24 | ### OS and Version?
25 | > Windows 7, 8 or 10. Linux (which distribution). macOS (Yosemite? El Capitan? Sierra?)
26 |
27 | ### Versions
28 | >
29 |
30 | ### Mention any other details that might be useful
31 |
32 | > ---------------------------------------------------------------
33 | > Thanks! We'll be in touch soon.
34 |
--------------------------------------------------------------------------------
/docs/integrating-application-insights.md:
--------------------------------------------------------------------------------
1 | # Logging JMeter Test Results Into Azure App Insights
2 |
3 | 1. Make the `Azure Backend Listener` plugin available by updating the [Dockerfile](../docker/Dockerfile) with the code snippet below:
4 |
5 | ```docker
6 | # Azure backend Listener plugin
7 | ENV AZURE_BACKEND_LISTENER_PLUGIN_VERSION=0.2.2
8 | RUN wget https://jmeter-plugins.org/files/packages/jmeter.backendlistener.azure-${AZURE_BACKEND_LISTENER_PLUGIN_VERSION}.zip
9 | RUN unzip -o jmeter.backendlistener.azure-${AZURE_BACKEND_LISTENER_PLUGIN_VERSION}.zip -d ${JMETER_HOME}
10 | ```
11 |
12 | 2. Update [sample.jmx](../jmeter/sample.jmx) by adding the Azure Backend Listener and providing the required Application Insights instrumentation key. Instructions on how to do that can be found [here](https://techcommunity.microsoft.com/t5/azure-global/send-your-jmeter-test-results-to-azure-application-insights/ba-p/1195320).
13 |
14 | > Turning on Live Metrics within this plugin has a serious performance impact on the controller instance when the solution is used for larger infrastructures.
15 |
--------------------------------------------------------------------------------
/other_tools/SearchFunctions/SearchFunctions.csproj:
--------------------------------------------------------------------------------
1 |
2 |
3 | netcoreapp3.1
4 | v3
5 | f4626fbc-2a92-4496-a2b8-6f01084dc8a8
6 |
7 |
8 |
9 |
10 |
11 |
12 |
13 |
14 | PreserveNewest
15 |
16 |
17 | Always
18 | Never
19 |
20 |
21 | Always
22 |
23 |
24 |
25 |
--------------------------------------------------------------------------------
/.github/PULL_REQUEST_TEMPLATE.md:
--------------------------------------------------------------------------------
1 | ## Purpose
2 |
3 | * ...
4 |
5 | ## Does this introduce a breaking change?
6 |
7 | ```
8 | [ ] Yes
9 | [ ] No
10 | ```
11 |
12 | ## Pull Request Type
13 | What kind of change does this Pull Request introduce?
14 |
15 |
16 | ```
17 | [ ] Bugfix
18 | [ ] Feature
19 | [ ] Code style update (formatting, local variables)
20 | [ ] Refactoring (no functional changes, no api changes)
21 | [ ] Documentation content changes
22 | [ ] Other... Please describe:
23 | ```
24 |
25 | ## How to Test
26 | * Get the code
27 |
28 | ```
29 | git clone [repo-address]
30 | cd [repo-name]
31 | git checkout [branch-name]
32 | npm install
33 | ```
34 |
35 | * Test the code
36 |
37 | ```
38 | ```
39 |
40 | ## What to Check
41 | Verify that the following are valid
42 | * ...
43 |
44 | ## Other Information
45 |
--------------------------------------------------------------------------------
/LICENSE.md:
--------------------------------------------------------------------------------
1 | MIT License
2 |
3 | Copyright (c) Microsoft Corporation.
4 |
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 |
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 |
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
--------------------------------------------------------------------------------
/terraform/output.tf:
--------------------------------------------------------------------------------
1 | output "resource_group_name" {
2 | value = "${var.RESOURCE_GROUP_NAME}"
3 | }
4 |
5 | output "storage_account_name" {
6 | value = "${azurerm_storage_account.jmeter_storage.name}"
7 | }
8 |
9 | output "storage_account_key" {
10 | value = "${azurerm_storage_account.jmeter_storage.primary_access_key}"
11 | sensitive = true
12 | }
13 |
14 | output "storage_file_share_name" {
15 | value = "${azurerm_storage_share.jmeter_share.name}"
16 | }
17 |
18 | output "storage_file_share_url" {
19 | value = "${azurerm_storage_share.jmeter_share.url}"
20 | }
21 |
22 | output "jmeter_controller_name" {
23 | value = "${azurerm_container_group.jmeter_controller.name}"
24 | }
25 |
26 | output "jmeter_controller_ip" {
27 | value = "${azurerm_container_group.jmeter_controller.ip_address}"
28 | }
29 |
30 | output "jmeter_workers_names" {
31 | value = "${join(",", "${azurerm_container_group.jmeter_workers.*.name}")}"
32 | }
33 |
34 | output "jmeter_workers_ip_list" {
35 | value = ["${azurerm_container_group.jmeter_workers.*.ip_address}"]
36 | }
37 |
38 | output "jmeter_workers_ips" {
39 | value = "${join(",", "${azurerm_container_group.jmeter_workers.*.ip_address}")}"
40 | }
41 |
42 | output "jmeter_results_file" {
43 | value = "${var.JMETER_RESULTS_FILE}"
44 | }
45 |
46 | output "jmeter_dashboard_folder" {
47 | value = "${var.JMETER_DASHBOARD_FOLDER}"
48 | }
49 |
--------------------------------------------------------------------------------
/docs/implementation-notes.md:
--------------------------------------------------------------------------------
1 | # Implementation Notes
2 |
3 | ## Repository structure
4 |
5 | | Folder | Description |
6 | |-----------|------------------------------------------------|
7 | | docker | JMeter custom image |
8 | | docs | Documentation and images |
9 | | jmeter | Contains JMX files used by JMeter agents |
10 | | pipelines | Docker and JMeter pipeline definitions |
11 | | scripts | Scripts that support pipeline execution |
12 | | terraform | Terraform template for infrastructure creation |
13 |
14 | ## Possible Modifications
15 |
16 | This sample only shows how to manually trigger a JMeter pipeline. You can easily adapt its content and incorporate it into other pipelines, apply continuous integration, or make other improvements.
17 |
18 | This sample uses static JMX files in the [jmeter](../jmeter/) directory. You can use many techniques to parameterize JMX files. Some of them are:
19 | * [CSV files](https://guide.blazemeter.com/hc/en-us/articles/206733689-Using-CSV-DATA-SET-CONFIG)
20 | * [Properties](http://jmeter.apache.org/usermanual/functions.html#__P)
21 | * [Environment Variables](https://jmeter-plugins.org/wiki/Functions/#envsupfont-color-gray-size-1-since-1-2-0-font-sup)
22 |
23 | Also, you can dynamically generate JMX files from Swagger/OpenAPI definitions using [swagger-codegen](https://github.com/swagger-api/swagger-codegen) or other similar projects.
24 |
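Another simple option is plain placeholder substitution before the test runs, similar in spirit to the DevOps step that replaces API_KEY in the sample JMX files. A minimal sketch, where the placeholder names follow the sample files and the replacement values are only illustrative:

```python
from pathlib import Path

# Replace placeholders in a JMX file before it is used by the pipeline.
# The values would normally come from pipeline variables or secrets.
replacements = {
    "SEARCH_SERVICE_NAME": "my-search-service",
    "SEARCH_INDEX_NAME": "my-index",
    "API_KEY": "<query-or-admin-key>",
}

jmx = Path("jmeter/sample.jmx").read_text()
for placeholder, value in replacements.items():
    jmx = jmx.replace(placeholder, value)
Path("jmeter/sample.generated.jmx").write_text(jmx)
```
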
25 | The current Terraform template creates a new VNET to host the JMeter installation. Instead, you can modify the template to deploy the agents in an existing VNET, or apply VNET peering to connect them to an existing infrastructure.
--------------------------------------------------------------------------------
/terraform/variables.tf:
--------------------------------------------------------------------------------
1 | variable "RESOURCE_GROUP_NAME" {
2 | type = string
3 | default = "myloadtest"
4 | }
5 |
6 | variable "LOCATION" {
7 | type = string
8 | default = "westeurope"
9 | }
10 |
11 | variable "PREFIX" {
12 | type = string
13 | default = "jmeter"
14 | }
15 |
16 | variable "VNET_ADDRESS_SPACE" {
17 | type = string
18 | default = "10.0.0.0/16"
19 | }
20 |
21 | variable "SUBNET_ADDRESS_PREFIX" {
22 | type = string
23 | default = "10.0.0.0/24"
24 | }
25 |
26 | variable "JMETER_WORKERS_COUNT" {
27 | type = number
28 | default = 1
29 | }
30 |
31 | variable "JMETER_WORKER_CPU" {
32 | type = string
33 | default = "2.0"
34 | }
35 |
36 | variable "JMETER_WORKER_MEMORY" {
37 | type = string
38 | default = "8.0"
39 | }
40 |
41 | variable "JMETER_CONTROLLER_CPU" {
42 | type = string
43 | default = "4.0"
44 | }
45 |
46 | variable "JMETER_CONTROLLER_MEMORY" {
47 | type = string
48 | default = "16.0"
49 | }
50 |
51 | variable "JMETER_DOCKER_IMAGE" {
52 | type = string
53 | default = "justb4/jmeter:5.3"
54 | }
55 |
56 | variable "JMETER_DOCKER_PORT" {
57 | type = number
58 | default = 1099
59 | }
60 |
61 | variable "JMETER_ACR_NAME" {
62 | type = string
63 | default = "loadtestmyacr"
64 | }
65 |
66 | variable "JMETER_ACR_RESOURCE_GROUP_NAME" {
67 | type = string
68 | default = "myloadtest"
69 | }
70 |
71 | variable "JMETER_STORAGE_QUOTA_GIGABYTES" {
72 | type = number
73 | default = 1
74 | }
75 |
76 | variable "JMETER_JMX_FILE" {
77 | type = string
78 | description = "JMX file"
79 | }
80 |
81 | variable "JMETER_RESULTS_FILE" {
82 | type = string
83 | default = "results.jtl"
84 | }
85 |
86 | variable "JMETER_DASHBOARD_FOLDER" {
87 | type = string
88 | default = "dashboard"
89 | }
90 |
--------------------------------------------------------------------------------
/terraform/variables_privateendpoint.temp:
--------------------------------------------------------------------------------
1 | variable "RESOURCE_GROUP_NAME" {
2 | type = string
3 | default = "myloadtest"
4 | }
5 |
6 | variable "LOCATION" {
7 | type = string
8 | default = "westeurope"
9 | }
10 |
11 | variable "PREFIX" {
12 | type = string
13 | default = "jmeter"
14 | }
15 |
16 | variable "VNET_NAME" {
17 | type = string
18 | default = "acsvnet"
19 | }
20 |
21 | variable "SUBNET_ADDRESS_PREFIX" {
22 | type = string
23 | default = "172.17.2.0/24"
24 | }
25 |
26 | variable "JMETER_WORKERS_COUNT" {
27 | type = number
28 | default = 1
29 | }
30 |
31 | variable "JMETER_WORKER_CPU" {
32 | type = string
33 | default = "2.0"
34 | }
35 |
36 | variable "JMETER_WORKER_MEMORY" {
37 | type = string
38 | default = "8.0"
39 | }
40 |
41 | variable "JMETER_CONTROLLER_CPU" {
42 | type = string
43 | default = "2.0"
44 | }
45 |
46 | variable "JMETER_CONTROLLER_MEMORY" {
47 | type = string
48 | default = "8.0"
49 | }
50 |
51 | variable "JMETER_DOCKER_IMAGE" {
52 | type = string
53 | default = "justb4/jmeter:5.1.1"
54 | }
55 |
56 | variable "JMETER_DOCKER_PORT" {
57 | type = number
58 | default = 1099
59 | }
60 |
61 | variable "JMETER_ACR_NAME" {
62 | type = string
63 | default = "loadtestmyacr"
64 | }
65 |
66 | variable "JMETER_ACR_RESOURCE_GROUP_NAME" {
67 | type = string
68 | default = "myloadtest"
69 | }
70 |
71 | variable "JMETER_STORAGE_QUOTA_GIGABYTES" {
72 | type = number
73 | default = 1
74 | }
75 |
76 | variable "JMETER_JMX_FILE" {
77 | type = string
78 | description = "JMX file"
79 | }
80 |
81 | variable "JMETER_RESULTS_FILE" {
82 | type = string
83 | default = "results.jtl"
84 | }
85 |
86 | variable "JMETER_DASHBOARD_FOLDER" {
87 | type = string
88 | default = "dashboard"
89 | }
90 |
--------------------------------------------------------------------------------
/other_tools/SearchFunctions.sln:
--------------------------------------------------------------------------------
1 |
2 | Microsoft Visual Studio Solution File, Format Version 12.00
3 | # Visual Studio Version 16
4 | VisualStudioVersion = 16.0.30517.126
5 | MinimumVisualStudioVersion = 10.0.40219.1
6 | Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "SearchFunctions", "SearchFunctions\SearchFunctions.csproj", "{080FA8FD-B2DB-4FC0-8704-CFB43166E5FB}"
7 | EndProject
8 | Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "SearchPerfTest", "PerfTest\SearchPerfTest.csproj", "{D5C78228-8970-4955-A9FC-F1ABC9121565}"
9 | EndProject
10 | Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "SemanticScholarDataUploader", "Data\SemanticScholar\SemanticScholarDataUploader.csproj", "{6DE15352-E823-4692-A93C-8F7C7B617354}"
11 | EndProject
12 | Global
13 | GlobalSection(SolutionConfigurationPlatforms) = preSolution
14 | Debug|Any CPU = Debug|Any CPU
15 | Release|Any CPU = Release|Any CPU
16 | EndGlobalSection
17 | GlobalSection(ProjectConfigurationPlatforms) = postSolution
18 | {080FA8FD-B2DB-4FC0-8704-CFB43166E5FB}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
19 | {080FA8FD-B2DB-4FC0-8704-CFB43166E5FB}.Debug|Any CPU.Build.0 = Debug|Any CPU
20 | {080FA8FD-B2DB-4FC0-8704-CFB43166E5FB}.Release|Any CPU.ActiveCfg = Release|Any CPU
21 | {080FA8FD-B2DB-4FC0-8704-CFB43166E5FB}.Release|Any CPU.Build.0 = Release|Any CPU
22 | {D5C78228-8970-4955-A9FC-F1ABC9121565}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
23 | {D5C78228-8970-4955-A9FC-F1ABC9121565}.Debug|Any CPU.Build.0 = Debug|Any CPU
24 | {D5C78228-8970-4955-A9FC-F1ABC9121565}.Release|Any CPU.ActiveCfg = Release|Any CPU
25 | {D5C78228-8970-4955-A9FC-F1ABC9121565}.Release|Any CPU.Build.0 = Release|Any CPU
26 | {6DE15352-E823-4692-A93C-8F7C7B617354}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
27 | {6DE15352-E823-4692-A93C-8F7C7B617354}.Debug|Any CPU.Build.0 = Debug|Any CPU
28 | {6DE15352-E823-4692-A93C-8F7C7B617354}.Release|Any CPU.ActiveCfg = Release|Any CPU
29 | {6DE15352-E823-4692-A93C-8F7C7B617354}.Release|Any CPU.Build.0 = Release|Any CPU
30 | EndGlobalSection
31 | GlobalSection(SolutionProperties) = preSolution
32 | HideSolutionNode = FALSE
33 | EndGlobalSection
34 | GlobalSection(ExtensibilityGlobals) = postSolution
35 | SolutionGuid = {AA50C5A8-E500-4985-8B07-80742D533220}
36 | EndGlobalSection
37 | EndGlobal
38 |
--------------------------------------------------------------------------------
/other_tools/PerfTest/ManagementClient.cs:
--------------------------------------------------------------------------------
1 | // Copyright (c) Microsoft Corporation. All rights reserved.
2 | // Licensed under the MIT License.
3 |
4 | using Microsoft.Azure.Management.ResourceManager.Fluent;
5 | using Microsoft.Azure.Management.Search;
6 | using Microsoft.Azure.Management.Search.Models;
7 | using Microsoft.Rest;
8 | using Microsoft.Rest.Azure.Authentication;
9 | using System;
10 | using System.Collections.Generic;
11 | using System.Text;
12 | using System.Threading.Tasks;
13 |
14 | namespace SearchPerfTest
15 | {
16 |
17 | class ManagementClient
18 | {
19 | static SearchManagementClient _managementClient;
20 | private string clientId = "";
21 | private string clientSecret = "";
22 | private string tenantId = "";
23 |
24 | private string resourceGroupName;
25 | private string searchServiceName;
26 |
27 | public ManagementClient(string resourceGroupName, string searchServiceName, string subscriptionId)
28 | {
29 | this.resourceGroupName = resourceGroupName;
30 | this.searchServiceName = searchServiceName;
31 |
32 | var credentials = SdkContext.AzureCredentialsFactory.FromServicePrincipal(clientId, clientSecret, tenantId, AzureEnvironment.AzureGlobalCloud);
33 |
34 | _managementClient = new SearchManagementClient(credentials);
35 | _managementClient.SubscriptionId = subscriptionId;
36 |
37 | }
38 |
39 | public AdminKeyResult GetKeys()
40 | {
41 | return _managementClient.AdminKeys.Get(resourceGroupName, searchServiceName);
42 | }
43 |
44 | public void ScaleService(int? replicas, int partitions)
45 | {
46 | var searchService = _managementClient.Services.Get(resourceGroupName, searchServiceName);
47 |
48 | searchService.ReplicaCount = replicas;
49 | searchService.PartitionCount = partitions;
50 | _managementClient.Services.CreateOrUpdateAsync(resourceGroupName, searchServiceName, searchService);
51 | }
52 |
53 | public string GetStatus()
54 | {
55 | var searchService = _managementClient.Services.Get(resourceGroupName, searchServiceName);
56 |
57 | return searchService.Status.ToString();
58 | }
59 |
60 | public int? GetReplicaCount()
61 | {
62 | var searchService = _managementClient.Services.Get(resourceGroupName, searchServiceName);
63 |
64 | return searchService.ReplicaCount;
65 | }
66 |
67 | }
68 | }
69 |
--------------------------------------------------------------------------------
/other_tools/Data/SemanticScholar/Schema.cs:
--------------------------------------------------------------------------------
1 | // Copyright (c) Microsoft Corporation. All rights reserved.
2 | // Licensed under the MIT License.
3 |
4 | using System;
5 | using System.Text.Json.Serialization;
6 | using System.Collections.Generic;
7 | using System.Text;
8 |
9 | namespace SemanticScholarDataUploader
10 | {
11 | public class SemanticScholar
12 | {
13 | [JsonPropertyName("id")]
14 | public string id { get; set; }
15 |
16 | [JsonPropertyName("magId")]
17 | public string magId { get; set; }
18 |
19 | [JsonPropertyName("entities")]
20 | public string[] entities { get; set; }
21 |
22 | [JsonPropertyName("fieldsOfStudy")]
23 | public string[] fieldsOfStudy { get; set; }
24 |
25 | [JsonPropertyName("journalVolume")]
26 | public string journalVolume { get; set; }
27 |
28 | [JsonPropertyName("journalPages")]
29 | public string journalPages { get; set; }
30 |
31 | [JsonPropertyName("pmid")]
32 | public string pmid { get; set; }
33 |
34 | [JsonPropertyName("year")]
35 | public int? year { get; set; }
36 |
37 | [JsonPropertyName("s2Url")]
38 | public string s2Url { get; set; }
39 |
40 | [JsonPropertyName("s2PdfUrl")]
41 | public string s2PdfUrl { get; set; }
42 |
43 | [JsonPropertyName("journs2PdfUrlalPages")]
44 | public string journs2PdfUrlalPages { get; set; }
45 |
46 | [JsonPropertyName("journalName")]
47 | public string journalName { get; set; }
48 |
49 | [JsonPropertyName("paperAbstract")]
50 | public string paperAbstract { get; set; }
51 |
52 | [JsonPropertyName("title")]
53 | public string title { get; set; }
54 |
55 | [JsonPropertyName("doi")]
56 | public string doi { get; set; }
57 |
58 | [JsonPropertyName("doiUrl")]
59 | public string doiUrl { get; set; }
60 |
61 | [JsonPropertyName("venue")]
62 | public string venue { get; set; }
63 |
64 | [JsonPropertyName("outCitations")]
65 | public string[] outCitations { get; set; }
66 |
67 | [JsonPropertyName("inCitations")]
68 | public string[] inCitations { get; set; }
69 |
70 | [JsonPropertyName("pdfUrls")]
71 | public string[] pdfUrls { get; set; }
72 |
73 | [JsonPropertyName("sources")]
74 | public string[] sources { get; set; }
75 |
76 | [JsonPropertyName("authors")]
77 | public Author[] authors { get; set; }
78 | }
79 |
80 | public class Author
81 | {
82 | [JsonPropertyName("name")]
83 | public string name { get; set; }
84 |
85 | [JsonPropertyName("ids")]
86 | public string[] ids { get; set; }
87 |
88 | }
89 | }
90 |
91 |
--------------------------------------------------------------------------------
/SECURITY.md:
--------------------------------------------------------------------------------
1 |
2 |
3 | ## Security
4 |
5 | Microsoft takes the security of our software products and services seriously, which includes all source code repositories managed through our GitHub organizations, which include [Microsoft](https://github.com/Microsoft), [Azure](https://github.com/Azure), [DotNet](https://github.com/dotnet), [AspNet](https://github.com/aspnet), [Xamarin](https://github.com/xamarin), and [our GitHub organizations](https://opensource.microsoft.com/).
6 |
7 | If you believe you have found a security vulnerability in any Microsoft-owned repository that meets [Microsoft's definition of a security vulnerability](https://docs.microsoft.com/en-us/previous-versions/tn-archive/cc751383(v=technet.10)), please report it to us as described below.
8 |
9 | ## Reporting Security Issues
10 |
11 | **Please do not report security vulnerabilities through public GitHub issues.**
12 |
13 | Instead, please report them to the Microsoft Security Response Center (MSRC) at [https://msrc.microsoft.com/create-report](https://msrc.microsoft.com/create-report).
14 |
15 | If you prefer to submit without logging in, send email to [secure@microsoft.com](mailto:secure@microsoft.com). If possible, encrypt your message with our PGP key; please download it from the [Microsoft Security Response Center PGP Key page](https://www.microsoft.com/en-us/msrc/pgp-key-msrc).
16 |
17 | You should receive a response within 24 hours. If for some reason you do not, please follow up via email to ensure we received your original message. Additional information can be found at [microsoft.com/msrc](https://www.microsoft.com/msrc).
18 |
19 | Please include the requested information listed below (as much as you can provide) to help us better understand the nature and scope of the possible issue:
20 |
21 | * Type of issue (e.g. buffer overflow, SQL injection, cross-site scripting, etc.)
22 | * Full paths of source file(s) related to the manifestation of the issue
23 | * The location of the affected source code (tag/branch/commit or direct URL)
24 | * Any special configuration required to reproduce the issue
25 | * Step-by-step instructions to reproduce the issue
26 | * Proof-of-concept or exploit code (if possible)
27 | * Impact of the issue, including how an attacker might exploit the issue
28 |
29 | This information will help us triage your report more quickly.
30 |
31 | If you are reporting for a bug bounty, more complete reports can contribute to a higher bounty award. Please visit our [Microsoft Bug Bounty Program](https://microsoft.com/msrc/bounty) page for more details about our active programs.
32 |
33 | ## Preferred Languages
34 |
35 | We prefer all communications to be in English.
36 |
37 | ## Policy
38 |
39 | Microsoft follows the principle of [Coordinated Vulnerability Disclosure](https://www.microsoft.com/en-us/msrc/cvd).
40 |
41 |
42 |
--------------------------------------------------------------------------------
/other_tools/README.md:
--------------------------------------------------------------------------------
1 | # Azure Cognitive Search benchmarking tool
2 |
3 | The following code was used to produce the benchmarks for Cognitive Search. The code is designed to run tests, scale the service, and collect the results.
4 |
5 | For performance testing, we generally recommend using the [JMeter solution](https://github.com/Azure-Samples/azure-search-performance-testing) that's outlined in the main README rather than this tool.
6 |
7 | ## Prerequisites
8 |
9 | + An Azure Cognitive Search Service
10 | + An Azure Function resource (preferably in the same region as your search service). To scale up to higher QPS, additional Azure Functions will be needed.
11 |
12 | ## Run the solution
13 |
14 | 1. Open **SearchFunctions.sln** in Visual Studio
15 |
16 | 1. Update **SearchFunctions/local.settings.json**. It should look like this:
17 |
18 | ```json
19 | {
20 | "IsEncrypted": false,
21 | "Values": {
22 | "AzureWebJobsStorage": "UseDevelopmentStorage=true",
23 | "FUNCTIONS_WORKER_RUNTIME": "dotnet",
24 |
25 | "SearchSerivceName": "",
26 | "SearchAdminKey": "",
27 | "SearchIndexName": ""
28 | }
29 | }
30 | ```
31 |
32 | 1. Deploy the SearchFunctions to your Azure Functions resource.
33 | - In Visual Studio, right-click the project name and select **Publish**
34 | - Under **Actions** select **Manage Azure App Service settings** and make sure `SearchServiceName`, `SearchAdminKey`, and `SearchIndexName` all have the correct values under remote. This ensures the Azure Function has the proper environment variables.
35 |
36 | 1. Select **SearchPerfTest** as the startup project
37 |
38 | 1. Update the settings for **SearchPerfTest**
39 |
40 | * First, update the function endpoint and code to connect to your Azure Function project:
41 |
42 | ```csharp
43 | static string functionEndpoint = "https://delegenz-perf.azurewebsites.net";
44 | static string code = "";
45 |
46 | static string searchServiceName = "";
47 | static string resourceGroupName = "";
48 | static string subscriptionId = "";
49 | ```
50 |
51 | * Next, update the values that control the QPS for the tests (or leave the defaults):
52 |
53 | ```csharp
54 | int startQPS = 10;
55 | int endQPS = 550;
56 | int increment = 10;
57 | int duration = 60;
58 | ```
59 |
60 | * To scale the search service, you'll also need a Service Principal with access to your search service. Once you [create the service principal](https://docs.microsoft.com/azure/active-directory/develop/howto-create-service-principal-portal), update the following values in ManagementClient.cs:
61 |
62 | ```csharp
63 | private string clientId = "";
64 | private string clientSecret = "";
65 | private string tenantId = "";
66 | ```
67 |
68 | 1. At this point, you're ready to run the solution. Run the project and then monitor the results. Results will be added to the `logs` folder.
69 |
--------------------------------------------------------------------------------
/docs/jmeter-pipeline-settings.md:
--------------------------------------------------------------------------------
1 | # JMeter Pipeline Settings
2 |
3 | The pipeline uses Terraform 0.13.x to provision JMeter and its infrastructure on Azure.
4 |
5 | All environment variables that start with the prefix `TF_VAR` can be used by Terraform to fill the template. According to the [official docs](https://www.terraform.io/docs/commands/environment-variables.html#tf_var_name):
6 |
7 | > Environment variables can be used to set variables. The environment variables must be in the format TF_VAR_name and this will be checked last for a value.
8 |
9 | Two variables can be set directly on the JMeter pipeline:
10 |
11 | | Environment Variable | Terraform Variable | Default Value |
12 | |-----------------------------|----------------------|---------------|
13 | | TF_VAR_JMETER_JMX_FILE | JMETER_JMX_FILE | |
14 | | TF_VAR_JMETER_WORKERS_COUNT | JMETER_WORKERS_COUNT | 1 |
15 |
16 | All the other variables can be set in a variable group called `JMETER_TERRAFORM_SETTINGS`. If a variable is not present in that group, a default value will be used, as the following table shows:
17 |
18 | | Environment Variable | Terraform Variable | Default Value |
19 | |---------------------------------------|--------------------------------|---------------------|
20 | | TF_VAR_RESOURCE_GROUP_NAME | RESOURCE_GROUP_NAME | jmeter |
21 | | TF_VAR_LOCATION | LOCATION | eastus |
22 | | TF_VAR_PREFIX | PREFIX | jmeter |
23 | | TF_VAR_VNET_ADDRESS_SPACE | VNET_ADDRESS_SPACE | 10.0.0.0/16 |
24 | | TF_VAR_SUBNET_ADDRESS_PREFIX | SUBNET_ADDRESS_PREFIX | 10.0.0.0/24 |
25 | | TF_VAR_JMETER_WORKER_CPU | JMETER_WORKER_CPU | 2.0 |
26 | | TF_VAR_JMETER_WORKER_MEMORY | JMETER_WORKER_MEMORY | 8.0 |
27 | | TF_VAR_JMETER_CONTROLLER_CPU | JMETER_CONTROLLER_CPU | 2.0 |
28 | | TF_VAR_JMETER_CONTROLLER_MEMORY | JMETER_CONTROLLER_MEMORY | 8.0 |
29 | | TF_VAR_JMETER_DOCKER_IMAGE | JMETER_DOCKER_IMAGE | justb4/jmeter:5.1.1 |
30 | | TF_VAR_JMETER_DOCKER_PORT | JMETER_DOCKER_PORT | 1099 |
31 | | TF_VAR_JMETER_ACR_NAME | JMETER_ACR_NAME | |
32 | | TF_VAR_JMETER_ACR_RESOURCE_GROUP_NAME | JMETER_ACR_RESOURCE_GROUP_NAME | |
33 | | TF_VAR_JMETER_STORAGE_QUOTA_GIGABYTES | JMETER_STORAGE_QUOTA_GIGABYTES | 1 |
34 | | TF_VAR_JMETER_RESULTS_FILE | JMETER_RESULTS_FILE | results.jtl |
35 | | TF_VAR_JMETER_DASHBOARD_FOLDER | JMETER_DASHBOARD_FOLDER | dashboard |
36 | | TF_VAR_JMETER_EXTRA_CLI_ARGUMENTS | JMETER_EXTRA_CLI_ARGUMENTS | |
37 |
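The same `TF_VAR_` convention applies outside the pipeline. As a minimal local sketch (assuming the Terraform CLI is installed and the template is run from the `terraform` folder), exporting prefixed environment variables is enough for Terraform to pick up the values that the variable group would otherwise provide:

```python
import os
import subprocess

# Terraform reads any TF_VAR_<name> environment variable as the value of
# variable <name>; the pipeline's variable group works the same way.
env = os.environ.copy()
env["TF_VAR_JMETER_JMX_FILE"] = "sample.jmx"
env["TF_VAR_JMETER_WORKERS_COUNT"] = "3"

subprocess.run(["terraform", "plan"], cwd="terraform", env=env, check=True)
```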
--------------------------------------------------------------------------------
/docs/adding-jmeter-plugins.md:
--------------------------------------------------------------------------------
1 | # Adding plugins to JMeter Docker image
2 |
3 | JMeter allows the use of custom plugins to improve the load testing experience and to run different load testing scenarios. The [jmeter-plugins.org](https://jmeter-plugins.org/) site contains a catalogue of community-created plugins that can be installed with the [Plugins Manager](https://jmeter-plugins.org/wiki/PluginsManager/).
4 |
5 | By default, this repository uses the [Test Plan Check Tool](https://jmeter-plugins.org/wiki/TestPlanCheckTool/) to automatically check the test plan's consistency before running load tests. This validation is done in the load testing pipeline on Azure DevOps to avoid provisioning the infrastructure (VNet + ACI instances) if the JMX file is invalid (e.g. plugins that are not installed in the JMeter root folder, or invalid test parameters).
6 |
7 | Plugins are added to the JMeter Docker image, as you can see in the `Dockerfile` located in the `docker` folder:
8 |
9 | ```docker
10 | FROM justb4/jmeter:5.1.1
11 |
12 | # https://jmeter-plugins.org/wiki/TestPlanCheckTool/
13 | ENV PLAN_CHECK_PLUGIN_VERSION=2.4
14 | RUN wget https://jmeter-plugins.org/files/packages/jpgc-plancheck-${PLAN_CHECK_PLUGIN_VERSION}.zip
15 | RUN unzip -o jpgc-plancheck-${PLAN_CHECK_PLUGIN_VERSION}.zip -d ${JMETER_HOME}
16 |
17 | EXPOSE 1099
18 |
19 | ENTRYPOINT ["/entrypoint.sh"]
20 | ```
21 |
22 | To add new plugins to JMeter, you have to:
23 |
24 | 1. Go to [jmeter-plugins.org](https://jmeter-plugins.org/) and find the plugin of your preference.
25 | 2. Choose the preferred plugin version and copy the download URL.
26 | 3. Create a step in the `Dockerfile` to download the plugin with `wget`.
27 | 4. Create a step in the `Dockerfile` to unzip the downloaded plugin into the JMeter root folder.
28 |
29 | > Note: Removing the [Test Plan Check Tool](https://jmeter-plugins.org/wiki/TestPlanCheckTool/) installation step from the `Dockerfile` will break the load testing pipeline, as it has a task that validates the JMX file. If you want to remove this plugin, also update `azure-pipelines.load-test.yml` by removing the `SETUP: Validate JMX File` step.
30 |
31 | Using the [Ultimate Thread Group](https://jmeter-plugins.org/wiki/UltimateThreadGroup/) plugin as an example:
32 |
33 | 
34 |
35 | Then, new steps are added to the `Dockerfile` to install the plugin:
36 |
37 | ```docker
38 | FROM justb4/jmeter:5.1.1
39 |
40 | # https://jmeter-plugins.org/wiki/TestPlanCheckTool/
41 | ENV PLAN_CHECK_PLUGIN_VERSION=2.4
42 | RUN wget https://jmeter-plugins.org/files/packages/jpgc-plancheck-${PLAN_CHECK_PLUGIN_VERSION}.zip
43 | RUN unzip -o jpgc-plancheck-${PLAN_CHECK_PLUGIN_VERSION}.zip -d ${JMETER_HOME}
44 |
45 | # https://jmeter-plugins.org/wiki/UltimateThreadGroup/
46 | ENV CUSTOM_PLUGIN_VERSION=2.9
47 | RUN wget https://jmeter-plugins.org/files/packages/jpgc-casutg-${CUSTOM_PLUGIN_VERSION}.zip
48 | RUN unzip -o jpgc-casutg-${CUSTOM_PLUGIN_VERSION}.zip -d ${JMETER_HOME}
49 |
50 | EXPOSE 1099
51 |
52 | ENTRYPOINT ["/entrypoint.sh"]
53 | ```
54 |
--------------------------------------------------------------------------------
/jmeter/README.md:
--------------------------------------------------------------------------------
1 | # General tips on .jmx config file
2 |
3 | ## ThreadGroup or any other plugin
4 |
5 | This section defines the test strategy. For example, "ThreadGroup.on_sample_error" controls whether the test should stop once it encounters an error, and "TargetLevel" is the final number of concurrent calls that the service will receive after a ramp-up period ("RampUp") divided into a number of steps.
6 |
7 | ```xml
8 | continue
9 | 100
10 | 1
11 | 5
12 |
13 | ```
14 |
15 | ## HTTPSamplerProxy
16 |
17 | This section includes the parameters and the body of your REST API call, which must adhere to [the expected Azure Cognitive Search syntax](https://docs.microsoft.com/en-us/azure/search/query-lucene-syntax). You can set the search instance, the index name and the api-version; the "Argument.value" itself contains the search body (in this case a random term from a defined variable that reads from a CSV list).
18 |
19 | ```xml
20 |
21 |
22 |
23 | false
24 | {
25 | "search": "${randomsearchterm}",
26 | "skip":0,
27 | "top": 5,
28 | "queryType": "full"
29 | }
30 | =
31 |
32 |
33 |
34 | your_instance.search.windows.net
35 | 443
36 | https
37 |
38 | /indexes/your_index_name/docs/search?api-version=2020-06-30
39 | ```
40 |
41 | The timeouts are optional; in this case they are set to 10 seconds (10000 ms).
42 |
43 | ```xml
44 | 10000
45 | 10000
46 | ```
47 |
48 | ## HeaderManager
49 |
50 | This section includes the header values needed for the REST API call to reach the service. API_KEY will be substituted with the real key by an Azure DevOps pipeline step.
51 |
52 | ```xml
53 | api-key
54 | API_KEY
55 |
56 |
57 | Content-Type
58 | application/json
59 | ```
60 |
61 | ## CSVDataSet
62 |
63 | The search engine has a cache: if you repeat the same query, the latency seen in the results will not be realistic compared to a scenario where your users query the system with diverse terms. This module defines the input list of terms used to query, read starting from the first line (if you need terms picked in random order from the CSV, use the [Random CSV Data Set Config plugin](https://www.blazemeter.com/blog/introducing-the-random-csv-data-set-config-plugin-on-jmeter)).
64 |
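If you need to build or refresh `terms_to_search.csv`, any process that produces one term per line will do. A minimal sketch, assuming a hypothetical `candidate_terms.txt` input file with one candidate term per line:

```python
import csv
import random

# Write terms_to_search.csv with one query term per line, shuffled so that
# consecutive requests are less likely to repeat the same cached query.
with open("candidate_terms.txt", encoding="utf-8") as f:
    terms = [line.strip() for line in f if line.strip()]

random.shuffle(terms)

with open("terms_to_search.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    for term in terms:
        writer.writerow([term])
```
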
65 | ## Examples
66 |
67 | ### Example 1: Simple test scenario using "Thread Group"
68 |
69 | [`sample.jmx`](./sample.jmx)
70 |
71 | ### Example 2: Step growth scenario using "Concurrency Thread Group"
72 |
73 | [`sample_steps.jmx`](./sample_steps.jmx)
74 |
75 | More info on [Concurrency Thread Group Plugin](https://jmeter-plugins.org/wiki/ConcurrencyThreadGroup/)
76 |
--------------------------------------------------------------------------------
/CONTRIBUTING.md:
--------------------------------------------------------------------------------
1 | # Contributing to [project-title]
2 |
3 | This project welcomes contributions and suggestions. Most contributions require you to agree to a
4 | Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us
5 | the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.
6 |
7 | When you submit a pull request, a CLA bot will automatically determine whether you need to provide
8 | a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions
9 | provided by the bot. You will only need to do this once across all repos using our CLA.
10 |
11 | This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/).
12 | For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or
13 | contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments.
14 |
15 | - [Code of Conduct](#coc)
16 | - [Issues and Bugs](#issue)
17 | - [Feature Requests](#feature)
18 | - [Submission Guidelines](#submit)
19 |
20 | ## Code of Conduct
21 | Help us keep this project open and inclusive. Please read and follow our [Code of Conduct](https://opensource.microsoft.com/codeofconduct/).
22 |
23 | ## Found an Issue?
24 | If you find a bug in the source code or a mistake in the documentation, you can help us by
25 | [submitting an issue](#submit-issue) to the GitHub Repository. Even better, you can
26 | [submit a Pull Request](#submit-pr) with a fix.
27 |
28 | ## Want a Feature?
29 | You can *request* a new feature by [submitting an issue](#submit-issue) to the GitHub
30 | Repository. If you would like to *implement* a new feature, please submit an issue with
31 | a proposal for your work first, to be sure that we can use it.
32 |
33 | * **Small Features** can be crafted and directly [submitted as a Pull Request](#submit-pr).
34 |
35 | ## Submission Guidelines
36 |
37 | ### Submitting an Issue
38 | Before you submit an issue, search the archive, maybe your question was already answered.
39 |
40 | If your issue appears to be a bug, and hasn't been reported, open a new issue.
41 | Help us to maximize the effort we can spend fixing issues and adding new
42 | features, by not reporting duplicate issues. Providing the following information will increase the
43 | chances of your issue being dealt with quickly:
44 |
45 | * **Overview of the Issue** - if an error is being thrown a non-minified stack trace helps
46 | * **Version** - what version is affected (e.g. 0.1.2)
47 | * **Motivation for or Use Case** - explain what are you trying to do and why the current behavior is a bug for you
48 | * **Browsers and Operating System** - is this a problem with all browsers?
49 | * **Reproduce the Error** - provide a live example or an unambiguous set of steps
50 | * **Related Issues** - has a similar issue been reported before?
51 | * **Suggest a Fix** - if you can't fix the bug yourself, perhaps you can point to what might be
52 | causing the problem (line of code or commit)
53 |
54 | You can file new issues by providing the above information at the corresponding repository's issues link: https://github.com/[organization-name]/[repository-name]/issues/new.
55 |
56 | ### Submitting a Pull Request (PR)
57 | Before you submit your Pull Request (PR) consider the following guidelines:
58 |
59 | * Search the repository (https://github.com/[organization-name]/[repository-name]/pulls) for an open or closed PR
60 | that relates to your submission. You don't want to duplicate effort.
61 |
62 | * Make your changes in a new git fork:
63 |
64 | * Commit your changes using a descriptive commit message
65 | * Push your fork to GitHub:
66 | * In GitHub, create a pull request
67 | * If we suggest changes then:
68 | * Make the required updates.
69 | * Rebase your fork and force push to your GitHub repository (this will update your Pull Request):
70 |
71 | ```shell
72 | git rebase master -i
73 | git push -f
74 | ```
75 |
76 | That's it! Thank you for your contribution!
77 |
--------------------------------------------------------------------------------
/terraform/main_privateendpoint.temp:
--------------------------------------------------------------------------------
1 | data "azurerm_container_registry" "jmeter_acr" {
2 | name = var.JMETER_ACR_NAME
3 | resource_group_name = var.RESOURCE_GROUP_NAME
4 | }
5 |
6 | data "azurerm_virtual_network" "jmeter_vnet" {
7 | name = var.VNET_NAME
8 | resource_group_name = var.RESOURCE_GROUP_NAME
9 | }
10 |
11 | resource "random_id" "random" {
12 | byte_length = 4
13 | }
14 |
15 | resource "azurerm_subnet" "jmeter_subnet" {
16 | name = "${var.PREFIX}mysubnet"
17 | resource_group_name = var.RESOURCE_GROUP_NAME
18 | virtual_network_name = data.azurerm_virtual_network.jmeter_vnet.name
19 | address_prefix = var.SUBNET_ADDRESS_PREFIX
20 |
21 | delegation {
22 | name = "delegation"
23 |
24 | service_delegation {
25 | name = "Microsoft.ContainerInstance/containerGroups"
26 | actions = ["Microsoft.Network/virtualNetworks/subnets/action"]
27 | }
28 | }
29 |
30 | service_endpoints = ["Microsoft.Storage"]
31 | }
32 |
33 | resource "azurerm_network_profile" "jmeter_net_profile" {
34 | name = "${var.PREFIX}mynetprofile"
35 | location = var.LOCATION
36 | resource_group_name = var.RESOURCE_GROUP_NAME
37 |
38 | container_network_interface {
39 | name = "${var.PREFIX}cnic"
40 |
41 | ip_configuration {
42 | name = "${var.PREFIX}ipconfig"
43 | subnet_id = azurerm_subnet.jmeter_subnet.id
44 | }
45 | }
46 | }
47 |
48 | resource "azurerm_storage_account" "jmeter_storage" {
49 | name = "${var.PREFIX}storage${random_id.random.hex}"
50 | resource_group_name = var.RESOURCE_GROUP_NAME
51 | location = var.LOCATION
52 |
53 | account_tier = "Standard"
54 | account_replication_type = "LRS"
55 |
56 | network_rules {
57 | default_action = "Allow"
58 | virtual_network_subnet_ids = ["${azurerm_subnet.jmeter_subnet.id}"]
59 | }
60 | }
61 |
62 | resource "azurerm_storage_share" "jmeter_share" {
63 | name = "jmeter"
64 | storage_account_name = azurerm_storage_account.jmeter_storage.name
65 | quota = var.JMETER_STORAGE_QUOTA_GIGABYTES
66 | }
67 |
68 | resource "azurerm_container_group" "jmeter_workers" {
69 | count = var.JMETER_WORKERS_COUNT
70 | name = "${var.PREFIX}-worker${count.index}"
71 | location = var.LOCATION
72 | resource_group_name = var.RESOURCE_GROUP_NAME
73 |
74 | ip_address_type = "private"
75 | os_type = "Linux"
76 |
77 | network_profile_id = azurerm_network_profile.jmeter_net_profile.id
78 |
79 | image_registry_credential {
80 | server = data.azurerm_container_registry.jmeter_acr.login_server
81 | username = data.azurerm_container_registry.jmeter_acr.admin_username
82 | password = data.azurerm_container_registry.jmeter_acr.admin_password
83 | }
84 |
85 | container {
86 | name = "jmeter"
87 | image = var.JMETER_DOCKER_IMAGE
88 | cpu = var.JMETER_WORKER_CPU
89 | memory = var.JMETER_WORKER_MEMORY
90 |
91 | ports {
92 | port = var.JMETER_DOCKER_PORT
93 | protocol = "TCP"
94 | }
95 |
96 | volume {
97 | name = "jmeter"
98 | mount_path = "/jmeter"
99 | read_only = true
100 | storage_account_name = azurerm_storage_account.jmeter_storage.name
101 | storage_account_key = azurerm_storage_account.jmeter_storage.primary_access_key
102 | share_name = azurerm_storage_share.jmeter_share.name
103 | }
104 |
105 | commands = [
106 | "/bin/sh",
107 | "-c",
108 | "cp -r /jmeter/* .; /entrypoint.sh -s -J server.rmi.ssl.disable=true",
109 | ]
110 | }
111 | }
112 |
113 | resource "azurerm_container_group" "jmeter_controller" {
114 | name = "${var.PREFIX}-controller"
115 | location = var.LOCATION
116 | resource_group_name = var.RESOURCE_GROUP_NAME
117 |
118 | ip_address_type = "private"
119 | os_type = "Linux"
120 |
121 | network_profile_id = azurerm_network_profile.jmeter_net_profile.id
122 |
123 | restart_policy = "Never"
124 |
125 | image_registry_credential {
126 | server = data.azurerm_container_registry.jmeter_acr.login_server
127 | username = data.azurerm_container_registry.jmeter_acr.admin_username
128 | password = data.azurerm_container_registry.jmeter_acr.admin_password
129 | }
130 |
131 | container {
132 | name = "jmeter"
133 | image = var.JMETER_DOCKER_IMAGE
134 | cpu = var.JMETER_CONTROLLER_CPU
135 | memory = var.JMETER_CONTROLLER_MEMORY
136 |
137 | ports {
138 | port = var.JMETER_DOCKER_PORT
139 | protocol = "TCP"
140 | }
141 |
142 | volume {
143 | name = "jmeter"
144 | mount_path = "/jmeter"
145 | read_only = false
146 | storage_account_name = azurerm_storage_account.jmeter_storage.name
147 | storage_account_key = azurerm_storage_account.jmeter_storage.primary_access_key
148 | share_name = azurerm_storage_share.jmeter_share.name
149 | }
150 |
151 | commands = [
152 | "/bin/sh",
153 | "-c",
154 |       "cd /jmeter; /entrypoint.sh -n -J server.rmi.ssl.disable=true -t ${var.JMETER_JMX_FILE} -l ${var.JMETER_RESULTS_FILE} -e -o ${var.JMETER_DASHBOARD_FOLDER} -R ${join(",", azurerm_container_group.jmeter_workers.*.ip_address)}",
155 | ]
156 | }
157 | }
158 |
--------------------------------------------------------------------------------
/jmeter/sample_steps.jmx:
--------------------------------------------------------------------------------
1 |
2 |
3 |
4 |
5 |
6 | false
7 | true
8 |
9 |
10 |
11 |
12 |
13 |
14 |
15 |
16 | continue
17 | 100
18 | 1
19 | 5
20 |
21 |
22 |
23 | M
24 |
25 |
26 |
27 | true
28 |
29 |
30 |
31 | false
32 | {
33 | "search": "${randomsearchterm}",
34 | "skip":0,
35 | "top": 5,
36 | "queryType": "full"
37 | }
38 | =
39 |
40 |
41 |
42 | SEARCH_SERVICE_NAME.search.windows.net
43 | 443
44 | https
45 |
46 | /indexes/SEARCH_INDEX_NAME/docs/search?api-version=2020-06-30
47 | POST
48 | false
49 | false
50 | true
51 | false
52 |
53 | 10000
54 | 10000
55 |
56 |
57 |
58 |
59 |
60 | User-Agent
61 | Edg/87.0.664.66
62 |
63 |
64 | api-key
65 | API_KEY
66 |
67 |
68 | Content-Type
69 | application/json
70 |
71 |
72 |
73 |
74 |
75 | terms_to_search.csv
76 |
77 | randomsearchterm
78 | false
79 |
80 | false
81 | true
82 | false
83 | shareMode.all
84 |
85 |
86 |
87 |
88 |
89 |
--------------------------------------------------------------------------------
/jmeter/sample.jmx:
--------------------------------------------------------------------------------
1 |
2 |
3 |
4 |
5 |
6 | false
7 | true
8 |
9 |
10 |
11 |
12 |
13 |
14 |
15 | continue
16 |
17 | false
18 | 100
19 |
20 | 20
21 | 2
22 | true
23 | 200
24 | 1
25 | false
26 |
27 |
28 |
29 | true
30 |
31 |
32 |
33 | false
34 | {
35 | "search": "${randomsearchterm}",
36 | "skip":0,
37 | "top": 5,
38 | "queryType": "full"
39 | }
40 | =
41 |
42 |
43 |
44 | SEARCH_SERVICE_NAME.search.windows.net
45 | 443
46 | https
47 |
48 | /indexes/SEARCH_INDEX_NAME/docs/search?api-version=2020-06-30
49 | POST
50 | false
51 | false
52 | true
53 | false
54 |
55 |
56 |
57 |
58 |
59 |
60 |
61 |
62 | User-Agent
63 | Edg/87.0.664.66
64 |
65 |
66 | api-key
67 | API_KEY
68 |
69 |
70 | Content-Type
71 | application/json
72 |
73 |
74 |
75 |
76 |
77 | terms_to_search.csv
78 |
79 | randomsearchterm
80 | false
81 |
82 | false
83 | true
84 | false
85 | shareMode.all
86 |
87 |
88 |
89 |
90 |
91 |
--------------------------------------------------------------------------------
/terraform/main.tf:
--------------------------------------------------------------------------------
1 | data "azurerm_container_registry" "jmeter_acr" {
2 | name = var.JMETER_ACR_NAME
3 | resource_group_name = var.RESOURCE_GROUP_NAME
4 | }
5 |
6 | resource "random_id" "random" {
7 | byte_length = 4
8 | }
9 |
10 | resource "azurerm_virtual_network" "jmeter_vnet" {
11 | name = "${var.PREFIX}vnet"
12 | location = var.LOCATION
13 | resource_group_name = var.RESOURCE_GROUP_NAME
14 |   address_space = [var.VNET_ADDRESS_SPACE]
15 | }
16 |
17 | resource "azurerm_subnet" "jmeter_subnet" {
18 | name = "${var.PREFIX}subnet"
19 | resource_group_name = var.RESOURCE_GROUP_NAME
20 | virtual_network_name = azurerm_virtual_network.jmeter_vnet.name
21 | address_prefix = var.SUBNET_ADDRESS_PREFIX
22 |
23 | delegation {
24 | name = "delegation"
25 |
26 | service_delegation {
27 | name = "Microsoft.ContainerInstance/containerGroups"
28 | actions = ["Microsoft.Network/virtualNetworks/subnets/action"]
29 | }
30 | }
31 |
32 | service_endpoints = ["Microsoft.Storage"]
33 | }
34 |
35 | resource "azurerm_network_profile" "jmeter_net_profile" {
36 | name = "${var.PREFIX}netprofile"
37 | location = var.LOCATION
38 | resource_group_name = var.RESOURCE_GROUP_NAME
39 |
40 | container_network_interface {
41 | name = "${var.PREFIX}cnic"
42 |
43 | ip_configuration {
44 | name = "${var.PREFIX}ipconfig"
45 | subnet_id = azurerm_subnet.jmeter_subnet.id
46 | }
47 | }
48 | }
49 |
50 | resource "azurerm_storage_account" "jmeter_storage" {
51 | name = "${var.PREFIX}storage${random_id.random.hex}"
52 | resource_group_name = var.RESOURCE_GROUP_NAME
53 | location = var.LOCATION
54 |
55 | account_tier = "Standard"
56 | account_replication_type = "LRS"
57 |
58 | network_rules {
59 | default_action = "Allow"
60 |     virtual_network_subnet_ids = [azurerm_subnet.jmeter_subnet.id]
61 | }
62 | }
63 |
64 | resource "azurerm_storage_share" "jmeter_share" {
65 | name = "jmeter"
66 | storage_account_name = azurerm_storage_account.jmeter_storage.name
67 | quota = var.JMETER_STORAGE_QUOTA_GIGABYTES
68 | }
69 |
70 | resource "azurerm_container_group" "jmeter_workers" {
71 | count = var.JMETER_WORKERS_COUNT
72 | name = "${var.PREFIX}-worker${count.index}"
73 | location = var.LOCATION
74 | resource_group_name = var.RESOURCE_GROUP_NAME
75 |
76 | ip_address_type = "private"
77 | os_type = "Linux"
78 |
79 | network_profile_id = azurerm_network_profile.jmeter_net_profile.id
80 |
81 | image_registry_credential {
82 | server = data.azurerm_container_registry.jmeter_acr.login_server
83 | username = data.azurerm_container_registry.jmeter_acr.admin_username
84 | password = data.azurerm_container_registry.jmeter_acr.admin_password
85 | }
86 |
87 | container {
88 | name = "jmeter"
89 | image = var.JMETER_DOCKER_IMAGE
90 | cpu = var.JMETER_WORKER_CPU
91 | memory = var.JMETER_WORKER_MEMORY
92 |
93 | ports {
94 | port = var.JMETER_DOCKER_PORT
95 | protocol = "TCP"
96 | }
97 |
98 | volume {
99 | name = "jmeter"
100 | mount_path = "/jmeter"
101 | read_only = true
102 | storage_account_name = azurerm_storage_account.jmeter_storage.name
103 | storage_account_key = azurerm_storage_account.jmeter_storage.primary_access_key
104 | share_name = azurerm_storage_share.jmeter_share.name
105 | }
106 |
107 | commands = [
108 | "/bin/sh",
109 | "-c",
110 | "cp -r /jmeter/* .; /entrypoint.sh -s -J server.rmi.ssl.disable=true",
111 | ]
112 | }
113 | }
114 |
115 | resource "azurerm_container_group" "jmeter_controller" {
116 | name = "${var.PREFIX}-controller"
117 | location = var.LOCATION
118 | resource_group_name = var.RESOURCE_GROUP_NAME
119 |
120 | ip_address_type = "private"
121 | os_type = "Linux"
122 |
123 | network_profile_id = azurerm_network_profile.jmeter_net_profile.id
124 |
125 | restart_policy = "Never"
126 |
127 | image_registry_credential {
128 | server = data.azurerm_container_registry.jmeter_acr.login_server
129 | username = data.azurerm_container_registry.jmeter_acr.admin_username
130 | password = data.azurerm_container_registry.jmeter_acr.admin_password
131 | }
132 |
133 | container {
134 | name = "jmeter"
135 | image = var.JMETER_DOCKER_IMAGE
136 | cpu = var.JMETER_CONTROLLER_CPU
137 | memory = var.JMETER_CONTROLLER_MEMORY
138 |
139 | ports {
140 | port = var.JMETER_DOCKER_PORT
141 | protocol = "TCP"
142 | }
143 |
144 | volume {
145 | name = "jmeter"
146 | mount_path = "/jmeter"
147 | read_only = false
148 | storage_account_name = azurerm_storage_account.jmeter_storage.name
149 | storage_account_key = azurerm_storage_account.jmeter_storage.primary_access_key
150 | share_name = azurerm_storage_share.jmeter_share.name
151 | }
152 |
153 | commands = [
154 | "/bin/sh",
155 | "-c",
156 |       "cd /jmeter; /entrypoint.sh -n -J server.rmi.ssl.disable=true -t ${var.JMETER_JMX_FILE} -l ${var.JMETER_RESULTS_FILE} -e -o ${var.JMETER_DASHBOARD_FOLDER} -R ${join(",", azurerm_container_group.jmeter_workers.*.ip_address)}",
157 | ]
158 | }
159 | }
160 |
--------------------------------------------------------------------------------
/other_tools/SearchFunctions/.gitignore:
--------------------------------------------------------------------------------
1 | ## Ignore Visual Studio temporary files, build results, and
2 | ## files generated by popular Visual Studio add-ons.
3 |
4 | # Azure Functions localsettings file
5 | local.settings.json
6 |
7 | # User-specific files
8 | *.suo
9 | *.user
10 | *.userosscache
11 | *.sln.docstates
12 |
13 | # User-specific files (MonoDevelop/Xamarin Studio)
14 | *.userprefs
15 |
16 | # Build results
17 | [Dd]ebug/
18 | [Dd]ebugPublic/
19 | [Rr]elease/
20 | [Rr]eleases/
21 | x64/
22 | x86/
23 | bld/
24 | [Bb]in/
25 | [Oo]bj/
26 | [Ll]og/
27 |
28 | # Visual Studio 2015 cache/options directory
29 | .vs/
30 | # Uncomment if you have tasks that create the project's static files in wwwroot
31 | #wwwroot/
32 |
33 | # MSTest test Results
34 | [Tt]est[Rr]esult*/
35 | [Bb]uild[Ll]og.*
36 |
37 | # NUNIT
38 | *.VisualState.xml
39 | TestResult.xml
40 |
41 | # Build Results of an ATL Project
42 | [Dd]ebugPS/
43 | [Rr]eleasePS/
44 | dlldata.c
45 |
46 | # DNX
47 | project.lock.json
48 | project.fragment.lock.json
49 | artifacts/
50 |
51 | *_i.c
52 | *_p.c
53 | *_i.h
54 | *.ilk
55 | *.meta
56 | *.obj
57 | *.pch
58 | *.pdb
59 | *.pgc
60 | *.pgd
61 | *.rsp
62 | *.sbr
63 | *.tlb
64 | *.tli
65 | *.tlh
66 | *.tmp
67 | *.tmp_proj
68 | *.log
69 | *.vspscc
70 | *.vssscc
71 | .builds
72 | *.pidb
73 | *.svclog
74 | *.scc
75 |
76 | # Chutzpah Test files
77 | _Chutzpah*
78 |
79 | # Visual C++ cache files
80 | ipch/
81 | *.aps
82 | *.ncb
83 | *.opendb
84 | *.opensdf
85 | *.sdf
86 | *.cachefile
87 | *.VC.db
88 | *.VC.VC.opendb
89 |
90 | # Visual Studio profiler
91 | *.psess
92 | *.vsp
93 | *.vspx
94 | *.sap
95 |
96 | # TFS 2012 Local Workspace
97 | $tf/
98 |
99 | # Guidance Automation Toolkit
100 | *.gpState
101 |
102 | # ReSharper is a .NET coding add-in
103 | _ReSharper*/
104 | *.[Rr]e[Ss]harper
105 | *.DotSettings.user
106 |
107 | # JustCode is a .NET coding add-in
108 | .JustCode
109 |
110 | # TeamCity is a build add-in
111 | _TeamCity*
112 |
113 | # DotCover is a Code Coverage Tool
114 | *.dotCover
115 |
116 | # NCrunch
117 | _NCrunch_*
118 | .*crunch*.local.xml
119 | nCrunchTemp_*
120 |
121 | # MightyMoose
122 | *.mm.*
123 | AutoTest.Net/
124 |
125 | # Web workbench (sass)
126 | .sass-cache/
127 |
128 | # Installshield output folder
129 | [Ee]xpress/
130 |
131 | # DocProject is a documentation generator add-in
132 | DocProject/buildhelp/
133 | DocProject/Help/*.HxT
134 | DocProject/Help/*.HxC
135 | DocProject/Help/*.hhc
136 | DocProject/Help/*.hhk
137 | DocProject/Help/*.hhp
138 | DocProject/Help/Html2
139 | DocProject/Help/html
140 |
141 | # Click-Once directory
142 | publish/
143 |
144 | # Publish Web Output
145 | *.[Pp]ublish.xml
146 | *.azurePubxml
147 | # TODO: Comment the next line if you want to checkin your web deploy settings
148 | # but database connection strings (with potential passwords) will be unencrypted
149 | #*.pubxml
150 | *.publishproj
151 |
152 | # Microsoft Azure Web App publish settings. Comment the next line if you want to
153 | # checkin your Azure Web App publish settings, but sensitive information contained
154 | # in these scripts will be unencrypted
155 | PublishScripts/
156 |
157 | # NuGet Packages
158 | *.nupkg
159 | # The packages folder can be ignored because of Package Restore
160 | **/packages/*
161 | # except build/, which is used as an MSBuild target.
162 | !**/packages/build/
163 | # Uncomment if necessary however generally it will be regenerated when needed
164 | #!**/packages/repositories.config
165 | # NuGet v3's project.json files produces more ignoreable files
166 | *.nuget.props
167 | *.nuget.targets
168 |
169 | # Microsoft Azure Build Output
170 | csx/
171 | *.build.csdef
172 |
173 | # Microsoft Azure Emulator
174 | ecf/
175 | rcf/
176 |
177 | # Windows Store app package directories and files
178 | AppPackages/
179 | BundleArtifacts/
180 | Package.StoreAssociation.xml
181 | _pkginfo.txt
182 |
183 | # Visual Studio cache files
184 | # files ending in .cache can be ignored
185 | *.[Cc]ache
186 | # but keep track of directories ending in .cache
187 | !*.[Cc]ache/
188 |
189 | # Others
190 | ClientBin/
191 | ~$*
192 | *~
193 | *.dbmdl
194 | *.dbproj.schemaview
195 | *.jfm
196 | *.pfx
197 | *.publishsettings
198 | node_modules/
199 | orleans.codegen.cs
200 |
201 | # Since there are multiple workflows, uncomment next line to ignore bower_components
202 | # (https://github.com/github/gitignore/pull/1529#issuecomment-104372622)
203 | #bower_components/
204 |
205 | # RIA/Silverlight projects
206 | Generated_Code/
207 |
208 | # Backup & report files from converting an old project file
209 | # to a newer Visual Studio version. Backup files are not needed,
210 | # because we have git ;-)
211 | _UpgradeReport_Files/
212 | Backup*/
213 | UpgradeLog*.XML
214 | UpgradeLog*.htm
215 |
216 | # SQL Server files
217 | *.mdf
218 | *.ldf
219 |
220 | # Business Intelligence projects
221 | *.rdl.data
222 | *.bim.layout
223 | *.bim_*.settings
224 |
225 | # Microsoft Fakes
226 | FakesAssemblies/
227 |
228 | # GhostDoc plugin setting file
229 | *.GhostDoc.xml
230 |
231 | # Node.js Tools for Visual Studio
232 | .ntvs_analysis.dat
233 |
234 | # Visual Studio 6 build log
235 | *.plg
236 |
237 | # Visual Studio 6 workspace options file
238 | *.opt
239 |
240 | # Visual Studio LightSwitch build output
241 | **/*.HTMLClient/GeneratedArtifacts
242 | **/*.DesktopClient/GeneratedArtifacts
243 | **/*.DesktopClient/ModelManifest.xml
244 | **/*.Server/GeneratedArtifacts
245 | **/*.Server/ModelManifest.xml
246 | _Pvt_Extensions
247 |
248 | # Paket dependency manager
249 | .paket/paket.exe
250 | paket-files/
251 |
252 | # FAKE - F# Make
253 | .fake/
254 |
255 | # JetBrains Rider
256 | .idea/
257 | *.sln.iml
258 |
259 | # CodeRush
260 | .cr/
261 |
262 | # Python Tools for Visual Studio (PTVS)
263 | __pycache__/
264 | *.pyc
--------------------------------------------------------------------------------
/jmeter/terms_to_search.csv:
--------------------------------------------------------------------------------
1 | Orthopedic Population
2 | Individuals
3 | Morquio Syndrome
4 | Maroteaux Lamy Syndrome
5 | Greenville
6 | South Carolina
7 | Greenwood Genetic Center
8 | United States
9 | Morquio syndrome type
10 | patients
11 | MPS IVA
12 | study
13 | purpose
14 | Maroteaux Lamy syndrome
15 | MPS
16 | milder course
17 | negative urine screening
18 | clinical features
19 | certain hip
20 | joint problems
21 | participants
22 | chart review process
23 | genetic conditions
24 | Shriners
25 | Greenville
26 | Children
27 | Diagnostic testing
28 | Hospital
29 | participant
30 | conditions
31 | Results
32 | legal guardians
33 | appropriate follow
34 | participants
35 | abnormal results
36 | Extension Study of Intrathecal Enzyme Replacement
37 | Cognitive Decline
38 | MPS
39 | Harbor UCLA Medical Center
40 | Torrance
41 | Los Angeles Biomedical Research Institute
42 | California
43 | United States
44 | pilot study
45 | five year extension study
46 | Intrathecal Enzyme Replacement
47 | Cognitive Decline
48 | MPS
49 | Participants
50 | pilot study
51 | study
52 | Efficacy
53 | Double Blind Study
54 | Safety of BMN
55 | Mucopolysaccharidosis IVA
56 | Morquio
57 | Patients
58 | Syndrome
59 | Oakland
60 | California
61 | United States
62 | Wilmington
63 | Delaware
64 | United States
65 | District of Columbia
66 | United States
67 | Washington
68 | Chicago
69 | Illinois
70 | United States
71 | New York
72 | United States
73 | Seattle
74 | Washington
75 | United States
76 | Cordoba
77 | Argentina
78 | Campina Grande
79 | Brazil
80 | Porto Alegre
81 | Brazil
82 | Montreal
83 | Canada
84 | Sherbrooke
85 | Canada
86 | Toronto
87 | Canada
88 | Bogota
89 | Colombia
90 | Copenhagen
91 | Denmark
92 | Lyon
93 | France
94 | Paris
95 | France
96 | Mainz
97 | Germany
98 | Monza
99 | Italy
100 | Tokyo
101 | Japan
102 | Seoul
103 | Korea
104 | Republic
105 | Amsterdam
106 | Netherlands
107 | Coimbra
108 | Portugal
109 | Doha
110 | Qatar
111 | Riyadh
112 | Saudi Arabia
113 | Taipei
114 | Taiwan
115 | Birmingham
116 | United Kingdom
117 | London
118 | United Kingdom
119 | Manchester
120 | United Kingdom
121 | study
122 | efficacy
123 | Phase
124 | safety
125 | week BMN
126 | patients
127 | mucopolysaccharidosis IVA
128 | week BMN
129 | Morquio
130 | Syndrome
131 | MPS IVA
132 | standard accepted treatment
133 | supportive care
134 | potential new treatment option
135 | ERT
136 | MPS IVA
137 | Enzyme replacement therapy
138 | patients
139 | MPS IVA patients
140 | infusion
141 | BMN
142 | phosphate receptor
143 | mannose
144 | transportation
145 | lysosomes
146 | uptake
147 | lysosomes
148 | enzyme uptake
149 | increased catabolism
150 | tissue macrophages
151 | hyaline cartilage
152 | KS
153 | connective tissues
154 | keratan sulfate
155 | progressive accumulation of KS
156 | heart valve
157 | clinical manifestations
158 | disorders Orthopedic Population
159 | Individuals
160 | Morquio Syndrome
161 | Maroteaux Lamy Syndrome
162 | Greenville
163 | South Carolina
164 | Greenwood Genetic Center
165 | United States
166 | Morquio syndrome type
167 | patients
168 | MPS IVA
169 | study
170 | purpose
171 | Maroteaux Lamy syndrome
172 | MPS
173 | milder course
174 | negative urine screening
175 | clinical features
176 | certain hip
177 | joint problems
178 | participants
179 | chart review process
180 | genetic conditions
181 | Shriners
182 | Greenville
183 | Children
184 | Diagnostic testing
185 | Hospital
186 | participant
187 | conditions
188 | Results
189 | legal guardians
190 | appropriate follow
191 | participants
192 | abnormal results
193 | Extension Study of Intrathecal Enzyme Replacement
194 | Cognitive Decline
195 | MPS
196 | Harbor UCLA Medical Center
197 | Torrance
198 | Los Angeles Biomedical Research Institute
199 | California
200 | United States
201 | pilot study
202 | five year extension study
203 | Intrathecal Enzyme Replacement
204 | Cognitive Decline
205 | MPS
206 | Participants
207 | pilot study
208 | study
209 | Efficacy
210 | Double Blind Study
211 | Safety of BMN
212 | Mucopolysaccharidosis IVA
213 | Morquio
214 | Patients
215 | Syndrome
216 | Oakland
217 | California
218 | United States
219 | Wilmington
220 | Delaware
221 | United States
222 | District of Columbia
223 | United States
224 | Washington
225 | Chicago
226 | Illinois
227 | United States
228 | New York
229 | United States
230 | Seattle
231 | Washington
232 | United States
233 | Cordoba
234 | Argentina
235 | Campina Grande
236 | Brazil
237 | Porto Alegre
238 | Brazil
239 | Montreal
240 | Canada
241 | Sherbrooke
242 | Canada
243 | Toronto
244 | Canada
245 | Bogota
246 | Colombia
247 | Copenhagen
248 | Denmark
249 | Lyon
250 | France
251 | Paris
252 | France
253 | Mainz
254 | Germany
255 | Monza
256 | Italy
257 | Tokyo
258 | Japan
259 | Seoul
260 | Korea
261 | Republic
262 | Amsterdam
263 | Netherlands
264 | Coimbra
265 | Portugal
266 | Doha
267 | Qatar
268 | Riyadh
269 | Saudi Arabia
270 | Taipei
271 | Taiwan
272 | Birmingham
273 | United Kingdom
274 | London
275 | United Kingdom
276 | Manchester
277 | United Kingdom
278 | study
279 | efficacy
280 | Phase
281 | safety
282 | week BMN
283 | patients
284 | mucopolysaccharidosis IVA
285 | week BMN
286 | Morquio
287 | Syndrome
288 | MPS IVA
289 | standard accepted treatment
290 | supportive care
291 | potential new treatment option
292 | ERT
293 | MPS IVA
294 | Enzyme replacement therapy
295 | patients
296 | MPS IVA patients
297 | infusion
298 | BMN
299 | phosphate receptor
300 | mannose
301 | transportation
302 | lysosomes
303 | uptake
304 | lysosomes
305 | enzyme uptake
306 | increased catabolism
307 | tissue macrophages
308 | hyaline cartilage
309 | KS
310 | connective tissues
311 | keratan sulfate
312 | progressive accumulation of KS
313 | heart valve
314 | clinical manifestations
315 | disorders
316 |
--------------------------------------------------------------------------------
/pipelines/azure-pipelines.load-test.yml:
--------------------------------------------------------------------------------
1 | trigger: none
2 |
3 | pool:
4 | vmImage: 'ubuntu-18.04'
5 |
6 | variables:
7 | - group: JMETER_TERRAFORM_SETTINGS
8 | - name: JMETER_DIRECTORY_INPUT
9 | value: $(System.DefaultWorkingDirectory)/jmeter
10 | - name: JMETER_DIRECTORY_OUTPUT
11 | value: $(System.DefaultWorkingDirectory)/results
12 | - name: TERRAFORM_VERSION
13 | value: 0.13.2
14 |
15 | steps:
16 |
17 | - task: AzureCLI@2
18 | displayName: 'SETUP: Validate JMeter Docker Image'
19 | inputs:
20 | azureSubscription: $(AZURE_SERVICE_CONNECTION_NAME)
21 | scriptType: bash
22 | scriptLocation: inlineScript
23 | inlineScript: |
24 | az acr login -n $(TF_VAR_JMETER_ACR_NAME)
25 | docker pull $(TF_VAR_JMETER_DOCKER_IMAGE)
26 |
27 | - script: |
28 | docker run --name=jmx-validator -v $(JMETER_DIRECTORY_INPUT):/jmeter -w /jmeter \
29 | --entrypoint "TestPlanCheck.sh" $(TF_VAR_JMETER_DOCKER_IMAGE) \
30 | --stats --tree-dump --jmx $(TF_VAR_JMETER_JMX_FILE)
31 | displayName: 'SETUP: Validate JMX File'
32 |
33 | - script: |
34 | sudo sed 's/API_KEY/'$(API_KEY)'/g; s/SEARCH_SERVICE_NAME/'$(SEARCH_SERVICE_NAME)'/g; s/SEARCH_INDEX_NAME/'$(SEARCH_INDEX_NAME)'/g' $(JMETER_DIRECTORY_INPUT)/$(TF_VAR_JMETER_JMX_FILE) > $(JMETER_DIRECTORY_INPUT)/tmp.jmx
35 | mv $(JMETER_DIRECTORY_INPUT)/tmp.jmx $(JMETER_DIRECTORY_INPUT)/$(TF_VAR_JMETER_JMX_FILE)
36 |   displayName: 'SETUP: Populate API Key, Service Name, and Index Name in the JMeter JMX File'
37 |
38 | - task: AzureCLI@2
39 | displayName: 'SETUP: Prepare Terraform Credentials'
40 | inputs:
41 | azureSubscription: $(AZURE_SERVICE_CONNECTION_NAME)
42 | scriptType: bash
43 | scriptLocation: inlineScript
44 | addSpnToEnvironment: true
45 | inlineScript: |
46 | echo "##vso[task.setvariable variable=ARM_CLIENT_ID]$servicePrincipalId"
47 | echo "##vso[task.setvariable variable=ARM_CLIENT_SECRET]$servicePrincipalKey"
48 | echo "##vso[task.setvariable variable=ARM_TENANT_ID]$tenantId"
49 | echo "##vso[task.setvariable variable=ARM_SUBSCRIPTION_ID]$AZURE_SUBSCRIPTION_ID"
50 |
51 | - script: |
52 | wget https://releases.hashicorp.com/terraform/$(TERRAFORM_VERSION)/terraform_$(TERRAFORM_VERSION)_linux_amd64.zip
53 | unzip terraform_$(TERRAFORM_VERSION)_linux_amd64.zip
54 | sudo mv ./terraform /usr/local/bin
55 | workingDirectory: $(Agent.TempDirectory)
56 | displayName: 'SETUP: Install Terraform'
57 |
58 | - script: terraform init
59 | workingDirectory: ./terraform
60 | displayName: 'SETUP: Run Terraform Init'
61 |
62 | - script: terraform apply -target azurerm_storage_share.jmeter_share -auto-approve
63 | workingDirectory: ./terraform
64 | displayName: 'SETUP: Run Terraform Apply (target=file share)'
65 |
66 | - task: AzureCLI@2
67 | displayName: 'SETUP: Transfer JMeter Files to Storage Account'
68 | inputs:
69 | azureSubscription: $(AZURE_SERVICE_CONNECTION_NAME)
70 | scriptType: 'bash'
71 | workingDirectory: ./terraform
72 | scriptLocation: 'inlineScript'
73 | inlineScript: |
74 | SHARENAME=$(terraform output storage_file_share_name)
75 | ACCOUNTNAME=$(terraform output storage_account_name)
76 | STGKEY=$(terraform output storage_account_key)
77 | az storage file upload-batch --account-key $STGKEY --account-name $ACCOUNTNAME --destination $SHARENAME --source $(JMETER_DIRECTORY_INPUT)
78 |
79 | - script: terraform apply -auto-approve
80 | workingDirectory: ./terraform
81 | displayName: 'SETUP: Run Terraform Apply (target=all)'
82 |
83 | - task: AzureCLI@2
84 | inputs:
85 | azureSubscription: $(AZURE_SERVICE_CONNECTION_NAME)
86 | workingDirectory: ./terraform
87 | scriptType: bash
88 | scriptLocation: inlineScript
89 | inlineScript: |
90 | RG=$(terraform output resource_group_name)
91 | NAME=$(terraform output jmeter_controller_name)
92 | echo "`date`: Started!"
93 | sleep 10
94 | while [ $(az container show -g $RG -n $NAME --query "containers[0].instanceView.currentState.state" -o tsv) == "Running" ]; do
95 | echo "`date`: Still Running..."
96 | sleep 20
97 | done
98 | echo "`date`: Finished!"
99 | displayName: 'TEST: Wait Test Execution'
100 |
101 | - task: AzureCLI@2
102 | inputs:
103 | azureSubscription: $(AZURE_SERVICE_CONNECTION_NAME)
104 | workingDirectory: ./terraform
105 | scriptType: bash
106 | scriptLocation: inlineScript
107 | inlineScript: |
108 | az container logs -g $(terraform output resource_group_name) -n $(terraform output jmeter_controller_name)
109 | RESOURCE_GROUP=$(terraform output resource_group_name)
110 | echo -n $(terraform output jmeter_workers_names) | xargs -t -d "," -I '{}' -n1 az container logs -g $RESOURCE_GROUP -n {}
111 | displayName: 'RESULTS: Collect JMeter Controller and Worker Logs'
112 |
113 | - task: AzureCLI@2
114 |   displayName: 'RESULTS: Download JMeter Results'
115 | inputs:
116 | azureSubscription: $(AZURE_SERVICE_CONNECTION_NAME)
117 | scriptType: 'bash'
118 | workingDirectory: ./terraform
119 | scriptLocation: 'inlineScript'
120 | inlineScript: |
121 | RG=$(terraform output resource_group_name)
122 | SHARENAME=$(terraform output storage_file_share_name)
123 | STGKEY=$(terraform output storage_account_key)
124 | ACCOUNTNAME=$(terraform output storage_account_name)
125 | mkdir $(System.DefaultWorkingDirectory)/results
126 | az storage file download-batch --account-key $STGKEY --account-name $ACCOUNTNAME --destination $(JMETER_DIRECTORY_OUTPUT) --no-progress --source $SHARENAME
127 |
128 | - script: |
129 | JMETER_RESULTS=$(JMETER_DIRECTORY_OUTPUT)/$(terraform output jmeter_results_file)
130 | JUNIT_RESULTS=$(JMETER_DIRECTORY_OUTPUT)/output.xml
131 | python3 ../scripts/jtl_junit_converter.py $JMETER_RESULTS $JUNIT_RESULTS
132 | workingDirectory: ./terraform
133 | displayName: 'RESULTS: Convert JMeter Results to JUnit Format'
134 |
135 | - task: PublishTestResults@2
136 | inputs:
137 | testResultsFormat: 'JUnit'
138 | testResultsFiles: '$(JMETER_DIRECTORY_OUTPUT)/output.xml'
139 | failTaskOnFailedTests: false
140 | displayName: 'RESULTS: Publish Load Testing Results'
141 |
142 | - publish: $(JMETER_DIRECTORY_OUTPUT)
143 | artifact: JMeterResults
144 | condition: succeededOrFailed()
145 | displayName: 'RESULTS: Publish Load Test Artifacts'
146 |
147 | - script: terraform destroy -auto-approve
148 | workingDirectory: ./terraform
149 | displayName: 'TEARDOWN: Run Terraform Destroy'
--------------------------------------------------------------------------------
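
A note on the "Populate API Key, Service Name, and Index Name" step above: the pipeline uses sed to substitute the API_KEY, SEARCH_SERVICE_NAME, and SEARCH_INDEX_NAME placeholders that appear in the sample JMX files. For local experimentation outside the pipeline, a minimal Python sketch of the same substitution is shown below; the environment-variable names and file paths are illustrative assumptions, not values defined by this repository.

```python
# Minimal local equivalent of the pipeline's sed step: inject the search service
# name, index name, and API key into a JMX file before running JMeter locally.
# The environment-variable names and file paths below are illustrative assumptions.
import os

def populate_jmx(template_path: str, output_path: str) -> None:
    placeholders = {
        "SEARCH_SERVICE_NAME": os.environ["SEARCH_SERVICE_NAME"],
        "SEARCH_INDEX_NAME": os.environ["SEARCH_INDEX_NAME"],
        "API_KEY": os.environ["API_KEY"],
    }
    with open(template_path) as f:
        contents = f.read()
    for token, value in placeholders.items():
        contents = contents.replace(token, value)
    with open(output_path, "w") as f:
        f.write(contents)

if __name__ == "__main__":
    # Hypothetical output file name; the pipeline overwrites the original JMX in place.
    populate_jmx("jmeter/sample.jmx", "jmeter/sample.populated.jmx")
```

This roughly reproduces what the pipeline does just before uploading the JMeter files to the storage file share.
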
/scripts/jtl_junit_converter.py:
--------------------------------------------------------------------------------
1 | import sys, csv
2 | from datetime import datetime
3 | from xml.etree import ElementTree
4 | from xml.dom import minidom
5 | from xml.etree.ElementTree import Element, SubElement, Comment
6 |
7 | # jtl table columns
8 | TIMESTAMP = 0
9 | ELAPSED = 1
10 | LABEL = 2
11 | RESPONSE_MESSAGE = 4
12 | SUCCESS = 7
13 |
14 | def prettify(elem):
15 | """Returns a pretty-printed XML string for the Element.
16 | """
17 | rough_string = ElementTree.tostring(elem, 'utf-8')
18 | reparsed = minidom.parseString(rough_string)
19 | return reparsed.toprettyxml(indent=" ")
20 |
21 | def retrieve_jmeter_results(jmeter_file):
22 | """Returns a list of JMeter JTL rows without header.
23 | The JMeter JTL file must be in CSV format.
24 | """
25 | csv_reader = csv.reader(jmeter_file)
26 | next(csv_reader)
27 | return list(csv_reader)
28 |
29 | def create_request_attrib(jmeter_result):
30 | """Returns a JSON with attributes for a JUnit testsuite: https://llg.cubic.org/docs/junit/
31 | The JMeter JTL file must be in CSV format.
32 | """
33 | request_time = float(jmeter_result[ELAPSED])/1000.0
34 |
35 | return {
36 | 'name': jmeter_result[LABEL],
37 | 'time': str(request_time),
38 | 'error_message': jmeter_result[RESPONSE_MESSAGE]
39 | }
40 |
41 | def create_test_case_attrib(request_attrib):
42 | """Returns a JSON with attributes for a JUnit testcase: https://llg.cubic.org/docs/junit/
43 | The JMeter JTL file must be in CSV format.
44 | """
45 | return {
46 | 'classname': 'httpSample',
47 | 'name': request_attrib['name'],
48 | 'time': request_attrib['time']
49 | }
50 |
51 | def create_test_suite_attrib(junit_results):
52 | """Returns a JSON with attributes for JUnit testsuite: https://llg.cubic.org/docs/junit/
53 | The JMeter JTL file must be in CSV format.
54 | """
55 | return {
56 | 'id': '1',
57 | 'name': 'load test',
58 | 'package': 'load test',
59 | 'hostname': 'Azure DevOps',
60 | 'time': str(junit_results['time']),
61 | 'tests': str(junit_results['tests']),
62 | 'failures': str(len(junit_results['requests']['failures'])),
63 | 'errors': '0'
64 | }
65 |
66 | def create_error_test_case_attrib(error_message):
67 | """Returns a JSON with attributes for JUnit testcase for failed requests: https://llg.cubic.org/docs/junit/
68 | The JMeter JTL file must be in CSV format.
69 | """
70 | return {
71 | 'message': error_message,
72 | 'type': 'exception'
73 | }
74 |
75 | def requests(jmeter_results):
76 | """Returns a JSON with successful and failed HTTP requests.
77 | The JMeter JTL file must be in CSV format.
78 | """
79 | failed_requests = []
80 | successful_requests = []
81 |
82 | for result in jmeter_results:
83 | request_attrib = create_request_attrib(result)
84 |
85 | if result[SUCCESS] == 'true':
86 | successful_requests.append(request_attrib)
87 | else:
88 | failed_requests.append(request_attrib)
89 |
90 | return {
91 | 'success': successful_requests,
92 | 'failures': failed_requests
93 | }
94 |
95 | def total_time_seconds(jmeter_results):
96 | """Returns the total test duration in seconds.
97 | The JMeter JTL file must be in CSV format.
98 | """
99 | max_timestamp = max(jmeter_results, key=lambda result: int(result[TIMESTAMP]))
100 | min_timestamp = min(jmeter_results, key=lambda result: int(result[TIMESTAMP]))
101 | total_timestamp = int(max_timestamp[TIMESTAMP]) - int(min_timestamp[TIMESTAMP])
102 |
103 | return float(total_timestamp)/1000.0
104 |
105 | def create_junit_results(jtl_results_filename):
106 | with open(jtl_results_filename) as jmeter_file:
107 | jmeter_results = retrieve_jmeter_results(jmeter_file)
108 | time = total_time_seconds(jmeter_results)
109 |
110 | return {
111 | 'tests': len(jmeter_results),
112 | 'time': time,
113 | 'requests': requests(jmeter_results),
114 | }
115 |
116 | def create_properties(test_suite):
117 | """Creates a JUnit properties element for testsuite: https://llg.cubic.org/docs/junit/
118 | The JMeter JTL file must be in CSV format.
119 | """
120 | return SubElement(test_suite, 'properties')
121 |
122 | def create_test_suite(test_suites, junit_results):
123 | """Creates a JUnit testsuite: https://llg.cubic.org/docs/junit/
124 | The JMeter JTL file must be in CSV format.
125 | """
126 | test_suite_attrib = create_test_suite_attrib(junit_results)
127 | test_suite = SubElement(test_suites, 'testsuite', test_suite_attrib)
128 |
129 | create_properties(test_suite)
130 |
131 | failed_requests = len(junit_results['requests']['failures'])
132 | successful_requests = len(junit_results['requests']['success'])
133 |
134 | for success_index in range(successful_requests):
135 | successful_request = junit_results['requests']['success'][success_index]
136 | create_successful_test_case(test_suite, successful_request)
137 |
138 | for error_index in range(failed_requests):
139 | failed_request = junit_results['requests']['failures'][error_index]
140 | create_failed_test_case(test_suite, failed_request)
141 |
142 | def create_failed_test_case(test_suite, failed_request):
143 | """Creates a JUnit test case for failed HTTP requests: https://llg.cubic.org/docs/junit/
144 | The JMeter JTL file must be in CSV format.
145 | """
146 | test_case_attrib = create_test_case_attrib(failed_request)
147 | error_test_case_attrib = create_error_test_case_attrib(failed_request['error_message'])
148 |
149 | test_case = SubElement(test_suite, 'testcase', test_case_attrib)
150 | test_case_error = SubElement(test_case, 'error', error_test_case_attrib)
151 |
152 | def create_successful_test_case(test_suite, successful_request):
153 | """Creates a JUnit test case for successful HTTP requests: https://llg.cubic.org/docs/junit/
154 | The JMeter JTL file must be in CSV format.
155 | """
156 | test_case_attrib = create_test_case_attrib(successful_request)
157 | test_case = SubElement(test_suite, 'testcase', test_case_attrib)
158 |
159 | def create_test_suites(jtl_results_filename):
160 | """Creates a JUnit testsuites element: https://llg.cubic.org/docs/junit/
161 | The JMeter JTL file must be in CSV format.
162 | """
163 | test_suites = Element('testsuites')
164 | junit_results = create_junit_results(jtl_results_filename)
165 | create_test_suite(test_suites, junit_results)
166 |
167 | return prettify(test_suites)
168 |
169 | def main():
170 | print('Converting...')
171 |
172 | with open(sys.argv[2], "w") as output_file:
173 | test_suites = create_test_suites(sys.argv[1])
174 | output_file.write(test_suites)
175 |
176 | print('Done!')
177 |
178 | if __name__ == '__main__':
179 | main()
--------------------------------------------------------------------------------
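
The converter above expects a JTL file written in JMeter's default CSV format and only reads the timeStamp, elapsed, label, responseMessage, and success columns (indices 0, 1, 2, 4, and 7). A small local smoke test is sketched below; the two result rows are made-up values purely for illustration, and the script path assumes you run it from the repository root.

```python
# Quick local smoke test for the converter above: write a tiny JTL file in
# JMeter's default CSV layout (timeStamp, elapsed, label, responseCode,
# responseMessage, threadName, dataType, success, ...) and convert it to JUnit XML.
# The two sample rows are made-up values purely for illustration.
import subprocess

SAMPLE_JTL = """timeStamp,elapsed,label,responseCode,responseMessage,threadName,dataType,success
1609459200000,120,HTTP Request,200,OK,Thread Group 1-1,text,true
1609459201000,950,HTTP Request,503,Service Unavailable,Thread Group 1-2,text,false
"""

with open("sample-results.csv", "w") as f:
    f.write(SAMPLE_JTL)

# Equivalent to the pipeline step:
#   python3 scripts/jtl_junit_converter.py <jtl csv> <junit xml>
subprocess.run(
    ["python3", "scripts/jtl_junit_converter.py", "sample-results.csv", "output.xml"],
    check=True,
)
print(open("output.xml").read())
```
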
/other_tools/SearchFunctions/SearchTestNode.cs:
--------------------------------------------------------------------------------
1 | // Copyright (c) Microsoft Corporation. All rights reserved.
2 | // Licensed under the MIT License.
3 |
4 | using System;
5 | using System.IO;
6 | using System.Threading.Tasks;
7 | using Microsoft.AspNetCore.Mvc;
8 | using Microsoft.Azure.WebJobs;
9 | using Microsoft.Azure.WebJobs.Extensions.Http;
10 | using Microsoft.AspNetCore.Http;
11 | using Microsoft.Extensions.Logging;
12 | using Newtonsoft.Json;
13 | using Azure;
14 | using Azure.Search.Documents;
15 | using Azure.Search.Documents.Models;
16 | using System.Collections.Concurrent;
17 | using System.Collections.Generic;
18 | using System.Threading;
19 | using System.Linq;
20 | using System.Net.Http;
21 | using System.Net.Http.Headers;
22 | using System.Web.Http;
23 | using System.Net;
24 |
25 | namespace SearchFunctions
26 | {
27 |
28 |
29 | public static class SearchTestNode
30 | {
31 | // Getting environment variables
32 | static string searchServiceName = Environment.GetEnvironmentVariable("SearchSerivceName", EnvironmentVariableTarget.Process);
33 | static string searchAdminKey = Environment.GetEnvironmentVariable("SearchAdminKey", EnvironmentVariableTarget.Process);
34 | static string searchIndexName = Environment.GetEnvironmentVariable("SearchIndexName", EnvironmentVariableTarget.Process);
35 | static string apiVersion = "2020-06-30";
36 |         static List<string> queryList = new List<string>();
37 |
38 | private static readonly Random _random = new Random();
39 |
40 | [FunctionName("SearchTestNode")]
41 |         public static async Task<IActionResult> Run(
42 | [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req,
43 | ILogger log, Microsoft.Azure.WebJobs.ExecutionContext context)
44 | {
45 | try
46 | {
47 |
48 | log.LogInformation("C# HTTP trigger function processed a request.");
49 |
50 |
51 | log.LogInformation(searchServiceName);
52 |
53 | var path = System.IO.Path.Combine(context.FunctionDirectory, "..\\semantic-scholar-queries.txt");
54 | var reader = new StreamReader(File.OpenRead(path));
55 | while (!reader.EndOfStream)
56 | {
57 | var line = reader.ReadLine();
58 | queryList.Add(line.Replace('"', ' ').Trim());
59 | }
60 |
61 | // Creating a HttpClient to send queries
62 | HttpClient client = new HttpClient();
63 | client.DefaultRequestHeaders.Add("api-key", searchAdminKey);
64 | client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
65 |
66 | log.LogInformation("Search client created");
67 |
68 | // Reading in parameters
69 | int qps = int.Parse(req.Query["qps"]);
70 | int duration = int.Parse(req.Query["duration"]);
71 | int totalQueryCount = qps * duration;
72 |
73 | // Creating a variable to hold the results
74 |             ConcurrentBag<PerfStat> processedWork = new ConcurrentBag<PerfStat>();
75 |             var queries = new List<string>();
76 | 
77 |             var taskList = new List<Task>();
78 | var testStartTime = DateTime.Now;
79 | for (int i = 0; i < totalQueryCount; i++)
80 | {
81 | taskList.Add(Task.Factory.StartNew(() => SendQuery(client, processedWork, log)));
82 |
83 | double testRunTime = DateTime.Now.Subtract(testStartTime).TotalSeconds;
84 | double currentQPS = i / testRunTime;
85 | while (currentQPS > qps)
86 | {
87 | Thread.Sleep(100);
88 | testRunTime = DateTime.Now.Subtract(testStartTime).TotalSeconds;
89 | currentQPS = i / testRunTime;
90 | }
91 | }
92 |
93 | Console.WriteLine("Waiting for all tasks to complete...");
94 | Task.WaitAll(taskList.ToArray());
95 |
96 | var successfulQueries = processedWork.Where(s => s.runStatusCode == 200).Count();
97 | var failedQueries = processedWork.Where(s => s.runStatusCode != 200).Count();
98 | var percentSuccess = 100 * successfulQueries / (qps * duration);
99 | var averageLatency = processedWork.Select(x => x.runMS).Average();
100 |
101 | string responseMessage = $"Successful Queries: {successfulQueries} \n";
102 | responseMessage += $"Failed Queries: {failedQueries} \n";
103 |             responseMessage += $"Percent Successful: {percentSuccess} \n";
104 | responseMessage += $"Average Latency: {averageLatency} \n";
105 |
106 | log.LogInformation(responseMessage);
107 |
108 | return new OkObjectResult(processedWork);
109 | }
110 | catch (Exception e)
111 | {
112 | log.LogInformation(e.ToString());
113 |
114 | return new ExceptionResult(e, true);
115 | }
116 | }
117 |
118 | static async Task SendQuery(HttpClient client, ConcurrentBag processedWork, ILogger log)
119 | {
120 | try
121 | {
122 | var startTime = DateTime.UtcNow;
123 | string query = GetSearchTerm();
124 |
125 |
126 | string url = $"https://{searchServiceName}.search.windows.net/indexes/{searchIndexName}/docs?api-version=2020-06-30&search={query}&queryType=full&$count=true&highlight=paperAbstract&facets=entities,fieldsOfStudy,year,journalName";
127 | HttpResponseMessage response = await client.GetAsync(url);
128 |
129 |                 if (!response.IsSuccessStatusCode)
130 |                 {
131 |                     log.LogInformation($"Query failed with status code {(int)response.StatusCode}");
132 |                 }
133 |
134 | processedWork.Add(new PerfStat
135 | {
136 | runStatusCode = (int)response.StatusCode,
137 | runMS = Convert.ToInt32(response.Headers.GetValues("elapsed-time").FirstOrDefault().ToString()),
138 | runTime = startTime
139 | });
140 |
141 | }
142 | catch (Exception e)
143 | {
144 | log.LogInformation(e.ToString());
145 | }
146 |
147 | }
148 |
149 | static string GetSearchTerm()
150 | {
151 |             int randomNumber = _random.Next(0, queryList.Count); // upper bound is exclusive
152 |
153 | return queryList[randomNumber];
154 | }
155 | }
156 | }
157 |
--------------------------------------------------------------------------------
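
SearchTestNode paces its load by comparing the achieved query rate against the requested qps and sleeping whenever it gets ahead of the target. The Python sketch below mirrors that pacing idea in a simplified, self-contained form; the target URL is a placeholder and the error handling is deliberately coarse, so treat it as an illustration of the throttling loop rather than a drop-in replacement.

```python
# Simplified sketch of the pacing loop used by SearchTestNode: issue queries on
# background threads and, whenever the achieved rate exceeds the target QPS,
# sleep briefly before dispatching the next one. The endpoint URL is a
# placeholder, not a value from this repository.
import time
import threading
import urllib.request

def send_query(url: str, results: list) -> None:
    start = time.time()
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            results.append((resp.status, time.time() - start))
    except Exception:
        # Record failures with a status code of 0 so they still count toward totals.
        results.append((0, time.time() - start))

def run_load(url: str, qps: int, duration: int) -> list:
    results: list = []
    threads = []
    total = qps * duration
    start = time.time()
    for i in range(total):
        t = threading.Thread(target=send_query, args=(url, results))
        t.start()
        threads.append(t)
        # Throttle: if we are ahead of the target rate, wait before continuing.
        while i / max(time.time() - start, 1e-6) > qps:
            time.sleep(0.1)
    for t in threads:
        t.join()
    return results

if __name__ == "__main__":
    stats = run_load("https://example.invalid/search", qps=5, duration=10)
    print(f"completed {len(stats)} requests")
```
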
/other_tools/.gitignore:
--------------------------------------------------------------------------------
1 | ## Ignore Visual Studio temporary files, build results, and
2 | ## files generated by popular Visual Studio add-ons.
3 | ##
4 | ## Get latest from https://github.com/github/gitignore/blob/master/VisualStudio.gitignore
5 |
6 | # User-specific files
7 | *.rsuser
8 | *.suo
9 | *.user
10 | *.userosscache
11 | *.sln.docstates
12 |
13 | # User-specific files (MonoDevelop/Xamarin Studio)
14 | *.userprefs
15 |
16 | # Mono auto generated files
17 | mono_crash.*
18 |
19 | # Build results
20 | [Dd]ebug/
21 | [Dd]ebugPublic/
22 | [Rr]elease/
23 | [Rr]eleases/
24 | x64/
25 | x86/
26 | [Ww][Ii][Nn]32/
27 | [Aa][Rr][Mm]/
28 | [Aa][Rr][Mm]64/
29 | bld/
30 | [Bb]in/
31 | [Oo]bj/
32 | [Ll]og/
33 | [Ll]ogs/
34 |
35 | # Visual Studio 2015/2017 cache/options directory
36 | .vs/
37 | # Uncomment if you have tasks that create the project's static files in wwwroot
38 | #wwwroot/
39 |
40 | # Visual Studio 2017 auto generated files
41 | Generated\ Files/
42 |
43 | # MSTest test Results
44 | [Tt]est[Rr]esult*/
45 | [Bb]uild[Ll]og.*
46 |
47 | # NUnit
48 | *.VisualState.xml
49 | TestResult.xml
50 | nunit-*.xml
51 |
52 | # Build Results of an ATL Project
53 | [Dd]ebugPS/
54 | [Rr]eleasePS/
55 | dlldata.c
56 |
57 | # Benchmark Results
58 | BenchmarkDotNet.Artifacts/
59 |
60 | # .NET Core
61 | project.lock.json
62 | project.fragment.lock.json
63 | artifacts/
64 |
65 | # ASP.NET Scaffolding
66 | ScaffoldingReadMe.txt
67 |
68 | # StyleCop
69 | StyleCopReport.xml
70 |
71 | # Files built by Visual Studio
72 | *_i.c
73 | *_p.c
74 | *_h.h
75 | *.ilk
76 | *.meta
77 | *.obj
78 | *.iobj
79 | *.pch
80 | *.pdb
81 | *.ipdb
82 | *.pgc
83 | *.pgd
84 | *.rsp
85 | *.sbr
86 | *.tlb
87 | *.tli
88 | *.tlh
89 | *.tmp
90 | *.tmp_proj
91 | *_wpftmp.csproj
92 | *.log
93 | *.vspscc
94 | *.vssscc
95 | .builds
96 | *.pidb
97 | *.svclog
98 | *.scc
99 |
100 | # Chutzpah Test files
101 | _Chutzpah*
102 |
103 | # Visual C++ cache files
104 | ipch/
105 | *.aps
106 | *.ncb
107 | *.opendb
108 | *.opensdf
109 | *.sdf
110 | *.cachefile
111 | *.VC.db
112 | *.VC.VC.opendb
113 |
114 | # Visual Studio profiler
115 | *.psess
116 | *.vsp
117 | *.vspx
118 | *.sap
119 |
120 | # Visual Studio Trace Files
121 | *.e2e
122 |
123 | # TFS 2012 Local Workspace
124 | $tf/
125 |
126 | # Guidance Automation Toolkit
127 | *.gpState
128 |
129 | # ReSharper is a .NET coding add-in
130 | _ReSharper*/
131 | *.[Rr]e[Ss]harper
132 | *.DotSettings.user
133 |
134 | # TeamCity is a build add-in
135 | _TeamCity*
136 |
137 | # DotCover is a Code Coverage Tool
138 | *.dotCover
139 |
140 | # AxoCover is a Code Coverage Tool
141 | .axoCover/*
142 | !.axoCover/settings.json
143 |
144 | # Coverlet is a free, cross platform Code Coverage Tool
145 | coverage*.json
146 | coverage*.xml
147 | coverage*.info
148 |
149 | # Visual Studio code coverage results
150 | *.coverage
151 | *.coveragexml
152 |
153 | # NCrunch
154 | _NCrunch_*
155 | .*crunch*.local.xml
156 | nCrunchTemp_*
157 |
158 | # MightyMoose
159 | *.mm.*
160 | AutoTest.Net/
161 |
162 | # Web workbench (sass)
163 | .sass-cache/
164 |
165 | # Installshield output folder
166 | [Ee]xpress/
167 |
168 | # DocProject is a documentation generator add-in
169 | DocProject/buildhelp/
170 | DocProject/Help/*.HxT
171 | DocProject/Help/*.HxC
172 | DocProject/Help/*.hhc
173 | DocProject/Help/*.hhk
174 | DocProject/Help/*.hhp
175 | DocProject/Help/Html2
176 | DocProject/Help/html
177 |
178 | # Click-Once directory
179 | publish/
180 |
181 | # Publish Web Output
182 | *.[Pp]ublish.xml
183 | *.azurePubxml
184 | # Note: Comment the next line if you want to checkin your web deploy settings,
185 | # but database connection strings (with potential passwords) will be unencrypted
186 | *.pubxml
187 | *.publishproj
188 |
189 | # Microsoft Azure Web App publish settings. Comment the next line if you want to
190 | # checkin your Azure Web App publish settings, but sensitive information contained
191 | # in these scripts will be unencrypted
192 | PublishScripts/
193 |
194 | # NuGet Packages
195 | *.nupkg
196 | # NuGet Symbol Packages
197 | *.snupkg
198 | # The packages folder can be ignored because of Package Restore
199 | **/[Pp]ackages/*
200 | # except build/, which is used as an MSBuild target.
201 | !**/[Pp]ackages/build/
202 | # Uncomment if necessary however generally it will be regenerated when needed
203 | #!**/[Pp]ackages/repositories.config
204 | # NuGet v3's project.json files produces more ignorable files
205 | *.nuget.props
206 | *.nuget.targets
207 |
208 | # Microsoft Azure Build Output
209 | csx/
210 | *.build.csdef
211 |
212 | # Microsoft Azure Emulator
213 | ecf/
214 | rcf/
215 |
216 | # Windows Store app package directories and files
217 | AppPackages/
218 | BundleArtifacts/
219 | Package.StoreAssociation.xml
220 | _pkginfo.txt
221 | *.appx
222 | *.appxbundle
223 | *.appxupload
224 |
225 | # Visual Studio cache files
226 | # files ending in .cache can be ignored
227 | *.[Cc]ache
228 | # but keep track of directories ending in .cache
229 | !?*.[Cc]ache/
230 |
231 | # Others
232 | ClientBin/
233 | ~$*
234 | *~
235 | *.dbmdl
236 | *.dbproj.schemaview
237 | *.jfm
238 | *.pfx
239 | *.publishsettings
240 | orleans.codegen.cs
241 |
242 | # Including strong name files can present a security risk
243 | # (https://github.com/github/gitignore/pull/2483#issue-259490424)
244 | #*.snk
245 |
246 | # Since there are multiple workflows, uncomment next line to ignore bower_components
247 | # (https://github.com/github/gitignore/pull/1529#issuecomment-104372622)
248 | #bower_components/
249 |
250 | # RIA/Silverlight projects
251 | Generated_Code/
252 |
253 | # Backup & report files from converting an old project file
254 | # to a newer Visual Studio version. Backup files are not needed,
255 | # because we have git ;-)
256 | _UpgradeReport_Files/
257 | Backup*/
258 | UpgradeLog*.XML
259 | UpgradeLog*.htm
260 | ServiceFabricBackup/
261 | *.rptproj.bak
262 |
263 | # SQL Server files
264 | *.mdf
265 | *.ldf
266 | *.ndf
267 |
268 | # Business Intelligence projects
269 | *.rdl.data
270 | *.bim.layout
271 | *.bim_*.settings
272 | *.rptproj.rsuser
273 | *- [Bb]ackup.rdl
274 | *- [Bb]ackup ([0-9]).rdl
275 | *- [Bb]ackup ([0-9][0-9]).rdl
276 |
277 | # Microsoft Fakes
278 | FakesAssemblies/
279 |
280 | # GhostDoc plugin setting file
281 | *.GhostDoc.xml
282 |
283 | # Node.js Tools for Visual Studio
284 | .ntvs_analysis.dat
285 | node_modules/
286 |
287 | # Visual Studio 6 build log
288 | *.plg
289 |
290 | # Visual Studio 6 workspace options file
291 | *.opt
292 |
293 | # Visual Studio 6 auto-generated workspace file (contains which files were open etc.)
294 | *.vbw
295 |
296 | # Visual Studio LightSwitch build output
297 | **/*.HTMLClient/GeneratedArtifacts
298 | **/*.DesktopClient/GeneratedArtifacts
299 | **/*.DesktopClient/ModelManifest.xml
300 | **/*.Server/GeneratedArtifacts
301 | **/*.Server/ModelManifest.xml
302 | _Pvt_Extensions
303 |
304 | # Paket dependency manager
305 | .paket/paket.exe
306 | paket-files/
307 |
308 | # FAKE - F# Make
309 | .fake/
310 |
311 | # CodeRush personal settings
312 | .cr/personal
313 |
314 | # Python Tools for Visual Studio (PTVS)
315 | __pycache__/
316 | *.pyc
317 |
318 | # Cake - Uncomment if you are using it
319 | # tools/**
320 | # !tools/packages.config
321 |
322 | # Tabs Studio
323 | *.tss
324 |
325 | # Telerik's JustMock configuration file
326 | *.jmconfig
327 |
328 | # BizTalk build output
329 | *.btp.cs
330 | *.btm.cs
331 | *.odx.cs
332 | *.xsd.cs
333 |
334 | # OpenCover UI analysis results
335 | OpenCover/
336 |
337 | # Azure Stream Analytics local run output
338 | ASALocalRun/
339 |
340 | # MSBuild Binary and Structured Log
341 | *.binlog
342 |
343 | # NVidia Nsight GPU debugger configuration file
344 | *.nvuser
345 |
346 | # MFractors (Xamarin productivity tool) working folder
347 | .mfractor/
348 |
349 | # Local History for Visual Studio
350 | .localhistory/
351 |
352 | # BeatPulse healthcheck temp database
353 | healthchecksdb
354 |
355 | # Backup folder for Package Reference Convert tool in Visual Studio 2017
356 | MigrationBackup/
357 |
358 | # Ionide (cross platform F# VS Code tools) working folder
359 | .ionide/
360 |
361 | # Fody - auto-generated XML schema
362 | FodyWeavers.xsd
--------------------------------------------------------------------------------
/other_tools/PerfTest/Program.cs:
--------------------------------------------------------------------------------
1 | // Copyright (c) Microsoft Corporation. All rights reserved.
2 | // Licensed under the MIT License.
3 |
4 | using Newtonsoft.Json;
5 | using SearchPerfTest;
6 | using System;
7 | using System.Collections.Concurrent;
8 | using System.Collections.Generic;
9 | using System.IO;
10 | using System.Linq;
11 | using System.Net;
12 | using System.Net.Http;
13 | using System.Text.Json.Serialization;
14 | using System.Threading;
15 | using System.Threading.Tasks;
16 |
17 | namespace PerfTest
18 | {
19 | class Program
20 | {
21 | static HttpClient client = new HttpClient();
22 |
23 |         static ConcurrentBag<PerfStat> processedWork = new ConcurrentBag<PerfStat>();
24 |
25 | static string functionEndpoint = "https://{function-name}.azurewebsites.net";
26 | static string code = "";
27 |
28 | static string searchServiceName = "";
29 | static string resourceGroupName = "";
30 | static string subscriptionId = "";
31 |
32 | static int? currentReplicas = 1;
33 | static int maxReplicas = 5;
34 | static int partitions = 1;
35 |
36 | static int startQPS = 10;
37 | static int endQPS = 300;
38 | static int increment = 10;
39 | static int duration = 60;
40 |
41 | static void Main(string[] args)
42 | {
43 | var sm = new ManagementClient(resourceGroupName, searchServiceName, subscriptionId);
44 |
45 | Uri uri = new Uri(functionEndpoint);
46 | ServicePoint sp = ServicePointManager.FindServicePoint(uri);
47 | sp.ConnectionLimit = 25;
48 |
49 | var logFile = Path.Combine("../../../logs", $"{searchServiceName}-{currentReplicas}r{partitions}p-{DateTime.Now.ToString("yyyyMMddHHmm")}.csv");
50 |
51 | // Re-Create file with header
52 | if (File.Exists(logFile))
53 | File.Delete(logFile);
54 | File.WriteAllText(logFile, "Service Name, Replicas, Partitions, Time (UTC), Target QPS, Target Duration, Actual Duration, Target Queries, Actual Queries, Successful Queries, Failed Queries, Avg Latency, Latency25, Latency75, Latency90, Latency95, Latency99 \r\n");
55 |
56 |
57 | Console.WriteLine("Starting the Azure Cognitive Search Performance Test!\n");
58 |
59 | string status = "";
60 | while (currentReplicas <= maxReplicas)
61 | {
62 | var replicaCheck = sm.GetReplicaCount();
63 | if (currentReplicas != replicaCheck)
64 | {
65 | status = sm.GetStatus();
66 | while (status != "Running")
67 | {
68 | Console.WriteLine($"Waiting for service to be ready...\n");
69 | Thread.Sleep(15000);
70 | status = sm.GetStatus();
71 | }
72 |
73 | Console.WriteLine($"Resizing service to {currentReplicas} replicas...");
74 | sm.ScaleService(currentReplicas, partitions);
75 | Thread.Sleep(2000);
76 | }
77 |
78 | status = sm.GetStatus();
79 | while (status != "Running")
80 | {
81 | Console.WriteLine($"Waiting for service to be ready...\n");
82 | Thread.Sleep(15000);
83 | status = sm.GetStatus();
84 | }
85 |
86 | Console.WriteLine($"Running test for {currentReplicas} replicas!!!\n");
87 | int currentQPS = startQPS;
88 | int queriesPerThread = 10;
89 |
90 | while ( currentQPS <= endQPS)
91 | {
92 | var startTime = DateTime.UtcNow;
93 | Console.WriteLine($"Starting test at {currentQPS} QPS at {startTime.TimeOfDay}");
94 |                     var taskList = new List<Task<List<PerfStat>>>();
95 | var testStartTime = DateTime.Now;
96 |
97 | for (int remainingQPS = currentQPS; remainingQPS > 0; remainingQPS = remainingQPS - queriesPerThread)
98 | {
99 | int thisQPS = Math.Min(remainingQPS, queriesPerThread);
100 | taskList.Add(KickoffJobAsync(thisQPS, duration));
101 | Thread.Sleep(50);
102 | }
103 |
104 | Console.WriteLine("Waiting for all tasks to complete...");
105 | Task.WaitAll(taskList.ToArray());
106 | var endTime = DateTime.UtcNow;
107 |
108 |                     var results = new List<PerfStat>();
109 |                     foreach (Task<List<PerfStat>> t in taskList)
110 | {
111 | results.AddRange(t.Result);
112 | }
113 |
114 | // Calculate Statistics
115 | var successfulQueries = results.Where(s => s.runStatusCode == 200).Count();
116 | var failedQueries = results.Where(s => s.runStatusCode != 200).Count();
117 | var percentSuccess = 100 * successfulQueries / (failedQueries + successfulQueries);
118 | var averageLatency = results.Select(x => x.runMS).Average();
119 | var averageLatency25 = Percentile(results.Select(x => Convert.ToDouble(x.runMS)).ToArray(), 0.25);
120 | var averageLatency75 = Percentile(results.Select(x => Convert.ToDouble(x.runMS)).ToArray(), 0.75);
121 | var averageLatency90 = Percentile(results.Select(x => Convert.ToDouble(x.runMS)).ToArray(), 0.90);
122 | var averageLatency95 = Percentile(results.Select(x => Convert.ToDouble(x.runMS)).ToArray(), 0.95);
123 | var averageLatency99 = Percentile(results.Select(x => Convert.ToDouble(x.runMS)).ToArray(), 0.99);
124 |
125 | string responseMessage = $"\nTarget QPS: {currentQPS} \n";
126 | responseMessage += $"Target Duration: {duration} \n";
127 | responseMessage += $"Actual Duration: {endTime - startTime} \n";
128 | responseMessage += $"Successful Queries: {successfulQueries} \n";
129 | responseMessage += $"Failed Queries: {failedQueries} \n";
130 |                 responseMessage += $"Percent Successful: {percentSuccess} \n";
131 | responseMessage += $"Average Latency: {averageLatency} \n";
132 | responseMessage += $"Latency 25th Percentile: {averageLatency25} \n";
133 | responseMessage += $"Latency 75th Percentile: {averageLatency75} \n";
134 | responseMessage += $"Latency 90th Percentile: {averageLatency90} \n";
135 | responseMessage += $"Latency 95th Percentile: {averageLatency95} \n";
136 | responseMessage += $"Latency 99th Percentile: {averageLatency99} \n";
137 |
138 | File.AppendAllText(logFile, $"{searchServiceName}, {currentReplicas}, {partitions}, {startTime.ToString()}, {currentQPS}, {duration}, {endTime - startTime}, {currentQPS * duration}, {failedQueries + successfulQueries}, {successfulQueries}, {failedQueries}, {averageLatency}, {averageLatency25}, {averageLatency75}, {averageLatency90}, {averageLatency95}, {averageLatency99} \r\n");
139 | Console.WriteLine(responseMessage);
140 |
141 | if (percentSuccess < 97 || averageLatency > 1000)
142 | {
143 | break;
144 | }
145 |
146 | currentQPS += increment;
147 | }
148 |
149 | currentReplicas++;
150 | }
151 |
152 | Console.WriteLine("Test finished. Press any key to exit.");
153 | Console.ReadKey();
154 | }
155 |
156 |         static async Task<List<PerfStat>> KickoffJobAsync(int qps, int duration)
157 | {
158 | Console.WriteLine($"Kicking off an Azure Function at {qps} QPS for {duration} seconds");
159 |
160 | string url = $"{functionEndpoint}/api/searchtestnode?code={code}&qps={qps}&duration={duration}";
161 |
162 | // To scale out to multiple azure function instances, follow this pattern
163 | //string url;
164 | //if (counter % 3 == 0)
165 | //{
166 | // url = $"{functionEndpoint1}/api/searchtestnode?code={code1}&qps={qps}&duration={duration}";
167 | //}
168 | //else if (counter % 3 == 1)
169 | //{
170 | // url = $"{functionEndpoint2}/api/searchtestnode?code={code2}&qps={qps}&duration={duration}";
171 | //}
172 | //else
173 | //{
174 | // url = $"{functionEndpoint3}/api/searchtestnode?code={code3}&qps={qps}&duration={duration}";
175 | //}
176 |
177 | HttpResponseMessage response = await client.GetAsync(url);
178 | response.EnsureSuccessStatusCode();
179 |
180 | string responseJson = await response.Content.ReadAsStringAsync();
181 |             List<PerfStat> performanceStats = JsonConvert.DeserializeObject<List<PerfStat>>(responseJson);
182 |
183 |
184 | return performanceStats;
185 |
186 | }
187 |
188 | // code from https://stackoverflow.com/questions/8137391/percentile-calculation
189 | static double Percentile(double[] sequence, double excelPercentile)
190 | {
191 | Array.Sort(sequence);
192 | int N = sequence.Length;
193 | double n = (N - 1) * excelPercentile + 1;
194 | // Another method: double n = (N + 1) * excelPercentile;
195 | if (n == 1d) return sequence[0];
196 | else if (n == N) return sequence[N - 1];
197 | else
198 | {
199 | int k = (int)n;
200 | double d = n - k;
201 | return sequence[k - 1] + d * (sequence[k] - sequence[k - 1]);
202 | }
203 | }
204 | }
205 | }
206 |
--------------------------------------------------------------------------------
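
The Percentile helper at the end of Program.cs implements Excel-style (PERCENTILE.INC) linear interpolation over the sorted latencies. The Python transcription below can be used to sanity-check the reported percentiles offline; the sample latencies are made-up values for illustration only.

```python
# Python transcription of the Percentile helper above (Excel PERCENTILE.INC-style
# linear interpolation), useful for sanity-checking latency percentiles offline.
# The sample latencies are made-up values for illustration only.
def percentile(sequence, p):
    values = sorted(sequence)
    n_items = len(values)
    n = (n_items - 1) * p + 1
    if n == 1:
        return values[0]
    if n == n_items:
        return values[-1]
    k = int(n)
    d = n - k
    # Interpolate between the k-th and (k+1)-th sorted values (1-based, like the C#).
    return values[k - 1] + d * (values[k] - values[k - 1])

latencies_ms = [42, 55, 61, 70, 88, 120, 135, 180, 240, 900]
for p in (0.25, 0.75, 0.90, 0.95, 0.99):
    print(f"P{int(p * 100)}: {percentile(latencies_ms, p):.1f} ms")
```
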
/other_tools/Data/SemanticScholar/Program.cs:
--------------------------------------------------------------------------------
1 | // Copyright (c) Microsoft Corporation. All rights reserved.
2 | // Licensed under the MIT License.
3 |
4 | // This application contains information on how to create an Azure Cognitive Search Index with data provided
5 | // by Semantic Scholar http://s2-public-api-prod.us-west-2.elasticbeanstalk.com/corpus/download/
6 | // For uploading the data, we will use the Azure Cognitive Search SDK because it allows for retries and
7 | // exponential backoff. The average document size is ~3 KB. Each corpus file (e.g. s2-corpus-000.gz) has
8 | // ~998,537 documents and is ~1,680 MB of raw text (~1.68 KB per document).
9 |
10 | using Azure;
11 | using Azure.Core.Serialization;
12 | using Azure.Search.Documents;
13 | using Azure.Search.Documents.Indexes;
14 | using Azure.Search.Documents.Indexes.Models;
15 | using Azure.Search.Documents.Models;
16 | using Newtonsoft.Json;
17 | using Newtonsoft.Json.Linq;
18 | using System;
19 | using System.Collections;
20 | using System.Collections.Generic;
21 | using System.IO;
22 | using System.IO.Compression;
23 | using System.Net;
24 | using System.Runtime.InteropServices;
25 | using System.Text;
26 | using System.Threading;
27 | using System.Threading.Tasks;
28 |
29 | namespace SemanticScholarDataUploader
30 | {
31 | class Program
32 | {
33 | //Set this to the number of documents you need in the search index
34 | //NOTE: One doc == ~3KB
35 | private static decimal DocumentsToUpload = 1000000;
36 |
37 | private static string ServiceName = "[Search Service Name]";
38 | private static string ApiKey = "[Search Admin API KEY]";
39 | private static string IndexName = "semanticscholar";
40 |
41 | private static string DownloadDir = @"c:\temp\delete\data";
42 | private static string SemanticScholarDownloadUrl = "https://s3-us-west-2.amazonaws.com/ai2-s2-research-public/open-corpus/2020-05-27/";
43 | private static decimal ApproxDocumentsPerFile = 1000000;
44 | private static int MaxParallelUploads = 8;
45 | private static int BatchUploadCounter = 0;
46 | private static int MaxBatchSize = 500;
47 |
48 |
49 | static void Main(string[] args)
50 | {
51 | // Create a SearchIndexClient to send create/delete index commands
52 | Uri serviceEndpoint = new Uri($"https://{ServiceName}.search.windows.net/");
53 | AzureKeyCredential credential = new AzureKeyCredential(ApiKey);
54 | SearchIndexClient idxclient = new SearchIndexClient(serviceEndpoint, credential);
55 |
56 | // Create a SearchClient to load and query documents
57 | SearchClient srchclient = new SearchClient(serviceEndpoint, IndexName, credential);
58 |
59 | // Delete index if it exists
60 | DeleteIndexIfExists(IndexName, idxclient);
61 |
62 | // Create the index
63 | CreateIndex(IndexName, idxclient);
64 |
65 | // Delete and re-create the data download dir
66 | ResetDownloadDir();
67 |
68 | var filesToDownload = Convert.ToInt32(Math.Ceiling(DocumentsToUpload / ApproxDocumentsPerFile));
69 | DownloadFiles(filesToDownload);
70 |
71 | UploadDocuments(srchclient, filesToDownload);
72 | }
73 |
74 | // Delete the index to reuse its name
75 | private static void DeleteIndexIfExists(string indexName, SearchIndexClient idxclient)
76 | {
77 | Console.WriteLine("{0}", "Deleting index...\n");
78 | // DeleteIndex throws a RequestFailedException (404) when the index does not exist,
79 | // so treat that case as "nothing to delete" and continue
80 | try { idxclient.DeleteIndex(indexName); }
81 | catch (RequestFailedException ex) when (ex.Status == 404) { }
82 | }
83 |
84 | // Create the index
85 | private static void CreateIndex(string indexName, SearchIndexClient idxclient)
86 | {
87 | // Define an index schema and create the index
88 | Console.WriteLine("{0}", "Creating index...\n");
89 | var index = new SearchIndex(indexName)
90 | {
91 | Fields =
92 | {
93 | new SimpleField("id", SearchFieldDataType.String) {IsKey=true, IsFilterable=true},
94 | new SimpleField("magId", SearchFieldDataType.String) {IsFilterable=true},
95 | new SearchableField("entities", true) {IsFilterable=true, IsFacetable=true, AnalyzerName="en.microsoft"},
96 | new SearchableField("fieldsOfStudy", true) {IsFilterable=true, IsFacetable=true },
97 | new SimpleField("journalVolume", SearchFieldDataType.String) {IsFilterable=false, IsFacetable=false},
98 | new SimpleField("journalPages", SearchFieldDataType.String) {IsFilterable=false, IsFacetable=false},
99 | new SimpleField("pmid", SearchFieldDataType.String) {IsFilterable=false, IsFacetable=false},
100 | new SimpleField("year", SearchFieldDataType.Int32) { IsFilterable = true, IsSortable = true, IsFacetable = true },
101 | new SimpleField("s2Url", SearchFieldDataType.String) {IsFilterable=false, IsFacetable=false},
102 | new SimpleField("s2PdfUrl", SearchFieldDataType.String) {IsFilterable=false, IsFacetable=false},
103 | new SimpleField("journs2PdfUrlalPages", SearchFieldDataType.String) {IsFilterable=false, IsFacetable=false},
104 | new SearchableField("journalName") {IsFilterable=true, IsFacetable=true, AnalyzerName="en.microsoft"},
105 | new SearchableField("paperAbstract") {IsFilterable=false, IsFacetable=false, AnalyzerName="en.microsoft"},
106 | new SearchableField("title") {IsFilterable=false, IsFacetable=false, AnalyzerName="en.microsoft"},
107 | new SimpleField("doi", SearchFieldDataType.String) {IsFilterable=false, IsFacetable=false},
108 | new SimpleField("doiUrl", SearchFieldDataType.String) {IsFilterable=false, IsFacetable=false},
109 | new SearchableField("venue") {IsFilterable=true, IsFacetable=true, AnalyzerName="en.microsoft"},
110 | new SearchableField("outCitations", true) {IsFilterable=true, IsFacetable=true },
111 | new SearchableField("inCitations", true) {IsFilterable=false, IsFacetable=false },
112 | new SearchableField("pdfUrls", true) {IsFilterable=false, IsFacetable=false },
113 | new SearchableField("sources", true) {IsFilterable=true, IsFacetable=true },
114 | new ComplexField("authors", collection: true)
115 | {
116 | Fields =
117 | {
118 | new SearchableField("name") {IsFilterable=true, IsFacetable=true, AnalyzerName="en.microsoft"},
119 | new SearchableField("ids", true) {IsFilterable=true, IsFacetable=true }
120 | }
121 | }
122 | }
123 | };
124 |
125 | idxclient.CreateIndex(index);
126 | }
127 |
128 | static void UploadDocuments(SearchClient srchclient, int FileCount)
129 | {
130 | var docCounter = 0;
131 | var batchJobs = new List<IndexDocumentsBatch<Schema>>(); // NOTE: generic type arguments restored; "Schema" is assumed to be the document model defined in Schema.cs
132 | var batch = new IndexDocumentsBatch<Schema>();
133 |
134 | Console.WriteLine("Creating batches for upload...");
135 | for (var fileNum = 0; fileNum < FileCount; fileNum++)
136 | {
137 | var paddedFileNum = fileNum.ToString().PadLeft(3, '0');
138 | var baseFileName = "s2-corpus-" + paddedFileNum + ".gz";
139 | var fileToProcess = Path.Combine(DownloadDir, baseFileName).Replace(".gz", "");
140 | const Int32 BufferSize = 128;
141 | using (var fileStream = File.OpenRead(fileToProcess))
142 | using (var streamReader = new StreamReader(fileStream, Encoding.UTF8, true, BufferSize))
143 | {
144 | String line;
145 |
146 | while ((line = streamReader.ReadLine()) != null)
147 | {
148 | docCounter += 1;
149 | if (docCounter == DocumentsToUpload)
150 | break;
151 | var ssDoc = JsonConvert.DeserializeObject<Schema>(line); // document model assumed from Schema.cs
152 | batch.Actions.Add(IndexDocumentsAction.Upload(ssDoc));
153 | if (docCounter % MaxBatchSize == 0)
154 | {
155 | batchJobs.Add(batch);
156 | batch = new IndexDocumentsBatch<Schema>();
157 | if (batchJobs.Count % 100 == 0)
158 | Console.WriteLine("Created {0} batches...", batchJobs.Count);
159 |
160 | }
161 |
162 | }
163 | }
164 | // Flush any partially filled batch for this file as well, so those documents are not dropped
165 | if (batch.Actions.Count > 0) batchJobs.Add(batch);
166 | ParallelBatchApplication(batchJobs, srchclient);
167 | batchJobs.Clear();
168 | batch = new IndexDocumentsBatch<Schema>();
169 |
170 | if (docCounter == DocumentsToUpload)
171 | break;
172 |
173 | }
174 |
175 | if (batch.Actions.Count > 0)
176 | batchJobs.Add(batch);
177 |
178 | ParallelBatchApplication(batchJobs, srchclient);
179 |
180 | }
181 |
182 | static void ParallelBatchApplication(List<IndexDocumentsBatch<Schema>> batchJobs, SearchClient srchclient)
183 | {
184 | // Apply a set of batches in parallel
185 | var idxoptions = new IndexDocumentsOptions { ThrowOnAnyError = true };
186 | Parallel.ForEach(batchJobs,
187 | new ParallelOptions { MaxDegreeOfParallelism = MaxParallelUploads },
188 | (b) =>
189 | {
190 | Interlocked.Increment(ref BatchUploadCounter);
191 | Console.WriteLine("Uploading Batch {0} with doc count {1}", BatchUploadCounter.ToString(), (BatchUploadCounter * MaxBatchSize).ToString());
192 | srchclient.IndexDocuments(b, idxoptions);
193 |
194 | });
195 | }
196 |
197 | static void ResetDownloadDir()
198 | {
199 | if (Directory.Exists(DownloadDir))
200 | Directory.Delete(DownloadDir, true);
201 | Directory.CreateDirectory(DownloadDir);
202 | }
203 |
204 |
205 | static void DownloadFiles(int FileCount)
206 | {
207 | // Download the semantic scholar files
208 | for (var fileNum = 0; fileNum < FileCount; fileNum++)
209 | {
210 | var client = new WebClient();
211 | var paddedFileNum = fileNum.ToString().PadLeft(3, '0');
212 | var baseFileName = "s2-corpus-" + paddedFileNum + ".gz";
213 | var downloadFileUrl = SemanticScholarDownloadUrl + baseFileName;
214 | Console.WriteLine("Downloading File: {0}", baseFileName);
215 |
216 | client.DownloadFile(downloadFileUrl, Path.Combine(DownloadDir, baseFileName));
217 |
218 | // Decompress the file
219 | Console.WriteLine("Decompressing File: {0}", baseFileName);
220 | using (FileStream fInStream = new FileStream(Path.Combine(DownloadDir, baseFileName), FileMode.Open, FileAccess.Read))
221 | {
222 | using (GZipStream zipStream = new GZipStream(fInStream, CompressionMode.Decompress))
223 | {
224 | using (FileStream fOutStream = new FileStream(Path.Combine(DownloadDir, baseFileName).Replace(".gz", ""),
225 | FileMode.Create, FileAccess.Write))
226 | {
227 | byte[] tempBytes = new byte[4096];
228 | int i;
229 | while ((i = zipStream.Read(tempBytes, 0, tempBytes.Length)) != 0)
230 | {
231 | fOutStream.Write(tempBytes, 0, i);
232 | }
233 | }
234 | }
235 | }
236 |
237 | //Remove local gz file
238 | Console.WriteLine("Removing File: {0}", baseFileName);
239 | File.Delete(Path.Combine(DownloadDir, baseFileName));
240 | }
241 | }
242 |
243 | }
244 | }
245 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | ---
2 | page_type: sample
3 | languages:
4 | - yaml
5 | - python
6 | products:
7 | - azure
8 | - azure-cognitive-search
9 | name: Azure Search Performance Tests
10 | description: "Performance testing tool for Azure Cognitive Search"
11 | urlFragment: "azure-search-perf"
12 | ---
13 |
14 | # Load Testing Pipeline with JMeter, ACI and Terraform
15 |
16 | This pipeline helps you load test Azure Cognitive Search. It leverages [Apache JMeter](https://jmeter.apache.org/), an open-source load and performance testing tool, and [Terraform](https://www.terraform.io/) to dynamically provision and destroy the required infrastructure on Azure. The JMeter workers and controller are hosted in Azure Container Instances (ACI), which also enables VNET injection and Private Endpoint scenarios.
17 |
18 | Note: This is a fork from [this original repo](https://github.com/Azure-Samples/jmeter-aci-terraform) customized for Azure Cognitive Search (ACS) REST API and syntax.
19 |
20 | ## Key concepts
21 |
22 | 
23 |
24 | The flow is triggered and controlled by an [Azure Pipeline](https://azure.microsoft.com/en-us/services/devops/pipelines/) on [Azure DevOps](https://azure.microsoft.com/en-in/services/devops/). The pipeline contains a set of tasks that are organized logically in `SETUP`, `TEST`, `RESULTS` and `TEARDOWN` groups.
25 |
26 | | Task group | Tasks |
27 | |-------------------------|--------|
28 | | SETUP | Check if the JMeter Docker image exists<br>Validate the JMX file that contains the JMeter test definition<br>Upload JMeter JMX file to Azure Storage Account File Share<br>Provision the infrastructure with Terraform |
29 | | TEST | Run JMeter test execution and wait for completion |
30 | | RESULTS | Show JMeter logs<br>Get JMeter artifacts (e.g. logs, dashboard)<br>Convert JMeter test results (JTL format) to JUnit format<br>Publish JUnit test results to Azure Pipelines<br>Publish JMeter artifacts to Azure Pipelines |
31 | | TEARDOWN | Destroy all ephemeral infrastructure with Terraform |
32 |
33 | In the `SETUP` phase, JMeter agents are provisioned as [Azure Container Instances (ACI)](https://azure.microsoft.com/en-us/services/container-instances/) by Terraform, using a [custom Docker image](./docker/Dockerfile). Following the [Remote Testing](https://jmeter.apache.org/usermanual/remote-test.html) approach, the JMeter controller is responsible for configuring all workers, consolidating their results and generating the resulting artifacts (dashboard, logs, etc.).
34 |
35 | The infrastructure provisioned by Terraform includes:
36 |
37 | * Resource Group
38 | * Virtual Network (VNet)
39 | * Storage Account File Share
40 | * 1 JMeter controller on ACI
41 | * N JMeter workers on ACI
42 |
43 | In the `RESULTS` phase, a [JMeter Report Dashboard](https://jmeter.apache.org/usermanual/generating-dashboard.html) and [Tests Results](https://docs.microsoft.com/en-us/azure/devops/pipelines/test/review-continuous-test-results-after-build?view=azure-devops) are published at the end of each load testing run.
44 |
45 | ## Prerequisites
46 |
47 | You should have the following Azure resources:
48 |
49 | * [Azure DevOps Organization](https://docs.microsoft.com/azure/devops/organizations/accounts/create-organization?view=azure-devops)
50 | * [Azure Container Registry (ACR)](https://azure.microsoft.com/services/container-registry/) with admin user enabled
51 | * An Azure Cognitive Search Service
52 |
53 | ## Getting Started using the UI
54 |
55 | ### 1. Create an Azure DevOps project and import this repo
56 |
57 | Go to [Azure DevOps](https://dev.azure.com/), create a new project, and import this repo.
58 |
59 | 
60 |
61 | Click into the **Repos** tab. You will see a warning saying that the repo is empty. Click **Import a repository**, then for the clone URL copy and paste: `https://github.com/Azure-Samples/azure-search-performance-testing`
62 |
63 | 
64 |
65 |
66 | ### 2. Create a service connection in Azure DevOps
67 |
68 | Next, create a service connection in Azure DevOps as described in the [DevOps documentation](https://docs.microsoft.com/en-us/azure/devops/pipelines/library/service-endpoints?view=azure-devops&tabs=yaml#create-a-service-connection). This service connection creates a service principal that allows Azure Pipelines to access the resource group.
69 |
70 | > Note: Make sure to add the service connection at the subscription level (don't specify a resource group) so the service connection has permission to register resource providers.
71 |
72 | 
73 |
74 | Make a note of the service connection name as it will be used in the next step.
75 |
76 | ### 3. Create the variable group
77 |
78 | Create a variable group named `JMETER_TERRAFORM_SETTINGS` as described [in the DevOps documentation](https://docs.microsoft.com/en-us/azure/devops/pipelines/library/variable-groups?view=azure-devops&tabs=classic).
79 |
80 |
81 | Add the following variables to the variable group:
82 |
83 | * TF_VAR_JMETER_ACR_NAME = `<name of your Azure Container Registry>`
84 | * TF_VAR_RESOURCE_GROUP_NAME = `<name of the resource group where the test infrastructure will be created>`
85 | * TF_VAR_JMETER_DOCKER_IMAGE = `<name of your Azure Container Registry>.azurecr.io/jmeter`
86 | * TF_VAR_LOCATION = `<Azure region>` (e.g. eastus, westeurope, etc.)
87 | * AZURE_SERVICE_CONNECTION_NAME = `<name of the service connection created in step #2>`
88 | * AZURE_SUBSCRIPTION_ID = `<your Azure subscription ID>`
89 |
90 | When you're finished, the variable group should look similar to the image below:
91 |
92 | 
93 |
94 | ### 4. Create and run the Docker pipeline
95 |
96 | Create a pipeline with **New Pipeline** (blue button, right side), choose **Azure Repos Git (YAML)**, select the repo you imported in step #1, then configure the pipeline with **Existing Azure Pipelines YAML file** and set the path to **/pipelines/azure-pipelines.docker.yml**.
97 |
98 | Rename the new pipeline to `jmeter-docker-build` so you won't confuse it with the pipeline created in the next step (in the **Pipelines** tab, open the three-dot menu on the pipeline row to rename it).
99 |
100 | Run this pipeline before proceeding to the next step.
101 |
102 | ### 5. Create the JMeter pipeline
103 |
104 | Repeat the steps from step #4, but use the YAML file **/pipelines/azure-pipelines.load-test.yml** and rename the pipeline to **jmeter-load-test**.
105 |
106 | For this pipeline you will need some additional variables:
107 |
108 | * API_KEY = `<query or admin key of your search service>` (mark it as secret in Azure DevOps)
109 | * SEARCH_SERVICE_NAME = `<name of your Azure Cognitive Search service>`
110 | * SEARCH_INDEX_NAME = `<name of the search index to test>`
111 | * TF_VAR_JMETER_JMX_FILE = sample.jmx
112 | * TF_VAR_JMETER_WORKERS_COUNT = 1 (or more if you want to scale out the JMeter workers)
113 |
114 | Save the pipeline but don't run it yet. You may want to update [`sample.jmx`](./jmeter/sample.jmx) as described in the next step.
115 |
116 | ### 6. Define the test definition inside your JMX file
117 |
118 | By default the test uses [`sample.jmx`](./jmeter/sample.jmx). This JMX file contains a test definition for performing HTTPS requests against the `<your search service name>.search.windows.net` endpoint on port `443`.
119 |
120 | You can update the JMX file with the test definition of your preference. You can find more details on updating the test definition in [jmeter/README.md](./jmeter/README.md).
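
If you want a quick sanity check of your service, index and key before editing the JMX, the snippet below mirrors the shape of the request the test performs: an HTTPS `POST` to the Cognitive Search `docs/search` REST endpoint. This is only a hedged sketch; the service name, index name, key and `api-version` are placeholders, and the exact query body in your JMX may differ.

```python
# Hedged sketch of the request shape the JMX test issues against Azure Cognitive Search.
# Service name, index name and key are placeholders; adjust api-version to match your JMX.
import requests

SERVICE = "<your-search-service>"      # placeholder
INDEX = "<your-index>"                 # placeholder
API_KEY = "<query-or-admin-key>"       # placeholder, keep it secret

url = f"https://{SERVICE}.search.windows.net/indexes/{INDEX}/docs/search"
params = {"api-version": "2020-06-30"}                        # assumed API version
headers = {"Content-Type": "application/json", "api-key": API_KEY}
body = {"search": "machine learning", "top": 10}              # sample query body

response = requests.post(url, params=params, headers=headers, json=body)
response.raise_for_status()
print(f"{response.elapsed.total_seconds():.3f}s, {len(response.json().get('value', []))} results")
```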
121 |
122 | ### 7. Run the JMeter Pipeline
123 |
124 | Now you're ready to run the pipeline:
125 |
126 | 
127 |
128 | > Note: The variables you created in step #5 might not show up on this view, but rest assured they haven't been deleted.
129 |
130 | If you're using the default test configuration this should take about 10-15 minutes to complete.
131 |
132 | ## Viewing Test Results
133 |
134 | JMeter test results are written to a [JTL](https://cwiki.apache.org/confluence/display/JMETER/JtlFiles) file (`results.jtl`) with CSV formatting. A [Python script](https://github.com/Azure-Samples/jmeter-aci-terraform/blob/main/scripts/jtl_junit_converter.py) converts the JTL output to [JUnit format](https://llg.cubic.org/docs/junit/), and the pipeline runs it so the results integrate fully with Azure DevOps test visualization.
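
For reference, the sketch below illustrates the kind of transformation the converter performs, assuming the default JTL CSV header (columns such as `label`, `elapsed`, `responseCode`, `success` and `failureMessage`). It is not the pipeline's script; see [`scripts/jtl_junit_converter.py`](./scripts/jtl_junit_converter.py) for the version the pipeline actually runs.

```python
# Minimal JTL (CSV) -> JUnit XML sketch, assuming JMeter's default CSV header.
# The pipeline uses scripts/jtl_junit_converter.py; this only illustrates the idea.
import csv
import xml.etree.ElementTree as ET

def jtl_to_junit(jtl_path: str, junit_path: str) -> None:
    with open(jtl_path, newline="") as f:
        rows = list(csv.DictReader(f))

    suite = ET.Element("testsuite", name="jmeter", tests=str(len(rows)))
    for i, row in enumerate(rows):
        case = ET.SubElement(
            suite, "testcase",
            classname=row.get("label", "sample"),
            name=f"{row.get('label', 'sample')}-{i}",
            time=str(int(row.get("elapsed", "0")) / 1000.0),  # elapsed is in milliseconds
        )
        if row.get("success", "true").lower() != "true":
            failure = ET.SubElement(case, "failure", message=row.get("responseCode", ""))
            failure.text = row.get("failureMessage", "")

    ET.ElementTree(suite).write(junit_path, encoding="utf-8", xml_declaration=True)

jtl_to_junit("results.jtl", "output.xml")
```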
135 |
136 | 
137 |
138 | Error messages generated by JMeter for failed HTTP requests can also be seen in Azure DevOps. In this case, `Service Unavailable` usually means the request was throttled by the search service.
139 |
140 | 
141 |
142 | ## Viewing Artifacts
143 |
144 | Several artifacts are published after the test ends, including a static JMeter dashboard, logs and other test outputs.
145 |
146 | 
147 |
148 | > You can also download these build artifacts using [`az pipelines runs artifact download`](https://docs.microsoft.com/en-us/cli/azure/ext/azure-devops/pipelines/runs/artifact?view=azure-cli-latest#ext-azure-devops-az-pipelines-runs-artifact-download).
149 |
150 | After downloading the dashboard artifact and unzipping it, open `dashboard/index.html` in your browser.
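
As a small convenience (a hedged sketch; the zip file name is a placeholder for whatever your artifact download produced), the snippet below unzips the artifact and opens the report:

```python
# Hedged convenience sketch: extract a downloaded dashboard artifact and open the
# JMeter report in the default browser. "dashboard.zip" is a placeholder name.
import pathlib
import webbrowser
import zipfile

artifact_zip = "dashboard.zip"                 # placeholder: your downloaded artifact
target_dir = pathlib.Path("dashboard_out")

with zipfile.ZipFile(artifact_zip) as z:
    z.extractall(target_dir)

index = next(target_dir.rglob("index.html"))   # locate the report entry point
webbrowser.open(index.resolve().as_uri())
```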
151 |
152 | Some screenshots here:
153 | 
154 |
155 | 
156 |
157 | ## JMeter Test Configuration
158 |
159 | The [sample.jmx](./jmeter/sample.jmx) includes modules that configure the HTTP request, headers and body Azure Cognitive Search expects. It also includes sections to configure the query distribution (e.g. 10 concurrent users per second for 1 minute) and to define which search terms are sent, read from an input CSV so that repeated terms don't distort latencies through caching. For more details and examples, see the [JMeter official documentation](https://jmeter.apache.org/usermanual/component_reference.html).
160 |
161 | If you struggle to add new modules to the `.jmx` (the syntax can be quite tricky), use JMeter's UI to save the configuration to a temporary JMX file, inspect the new module there, and then embed it in your JMX config file.
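
Since JMX is plain XML, a quick well-formedness check after hand-editing can save a wasted pipeline run. The sketch below is hedged: it assumes classic Thread Groups with the standard `ThreadGroup.*` JMeter property names (plugin thread groups use different properties). It parses the file and prints the load settings per thread group.

```python
# Hedged JMX inspection sketch: parse the test plan as XML and print thread group
# load settings. Assumes classic ThreadGroup elements with standard property names.
import sys
import xml.etree.ElementTree as ET

def summarize_jmx(path: str) -> None:
    tree = ET.parse(path)  # raises xml.etree.ElementTree.ParseError if the XML is malformed
    for tg in tree.iter("ThreadGroup"):
        props = {p.get("name"): (p.text or "") for p in tg.iter("stringProp")}
        print(tg.get("testname", "unnamed thread group"))
        print("  threads :", props.get("ThreadGroup.num_threads", "?"))
        print("  ramp-up :", props.get("ThreadGroup.ramp_time", "?"))
        print("  duration:", props.get("ThreadGroup.duration", "?"))

if __name__ == "__main__":
    summarize_jmx(sys.argv[1] if len(sys.argv) > 1 else "jmeter/sample.jmx")
```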
162 |
163 | ## Pipeline Configuration
164 |
165 | All Terraform parameters can be configured through the variable group `JMETER_TERRAFORM_SETTINGS`. Please read [JMeter Pipeline Settings](./docs/jmeter-pipeline-settings.md) for more details.
166 |
167 | ## Limitations
168 |
169 | * **Load Test duration**
170 | Please note that with [Microsoft-hosted agents](https://docs.microsoft.com/en-us/azure/devops/pipelines/agents/hosted?view=azure-devops#capabilities-and-limitations), pipeline runs are limited to 1 hour (private projects) or 6 hours (public projects). You can use self-hosted agents to bypass this limitation.
171 |
172 | * **ACI on VNET regions**
173 | Please note that [not all regions](https://docs.microsoft.com/en-us/azure/container-instances/container-instances-virtual-network-concepts#where-to-deploy) currently support ACI and VNET integration. If you need private JMeter agents, you can deploy them in a different region and use VNet peering between the networks. Also note that vCPU and memory limits vary by region.
174 |
175 | ## Additional Documentation
176 |
177 | * [Implementation Notes](./docs/implementation-notes.md)
178 | * [Adding plugins to JMeter Docker image](./docs/adding-jmeter-plugins.md)
179 | * [JMeter pipeline settings](./docs/jmeter-pipeline-settings.md)
180 | * [Estimating costs](./docs/estimating-costs.md)
181 | * [Integrating with Application Insights](./docs/integrating-application-insights.md)
182 |
183 | ## External References
184 |
185 | * [User Manual: Remote Testing](https://jmeter.apache.org/usermanual/remote-test.html)
186 | * [User Manual: Apache JMeter Distributed Testing Step-by-step](https://jmeter.apache.org/usermanual/jmeter_distributed_testing_step_by_step.html)
187 | * [Azure DevOps CLI reference](https://docs.microsoft.com/en-us/cli/azure/ext/azure-devops/?view=azure-cli-latest)
188 | * [Create your Azure Cognitive Search instance and populate an index with clinical trials docs](https://github.com/cynotebo/KM-Ready-Lab/blob/master/KM-Ready-Lab/workshops/Module%201.md)
189 |
190 | ## Contributing
191 |
192 | This project welcomes contributions and suggestions. Most contributions require you to agree to a
193 | Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us
194 | the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.
195 |
196 | When you submit a pull request, a CLA bot will automatically determine whether you need to provide
197 | a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions
198 | provided by the bot. You will only need to do this once across all repos using our CLA.
199 |
200 | This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/).
201 | For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or
202 | contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments.
203 |
--------------------------------------------------------------------------------