├── README.md ├── TA-user-agents ├── LICENSE ├── README.md ├── README │ └── apl_logging.conf.spec ├── UPDATING.md ├── app.manifest ├── appserver │ └── static │ │ ├── README.md │ │ ├── appIcon.png │ │ ├── appIconAlt.png │ │ ├── appIconAlt_2x.png │ │ ├── appIcon_2x.png │ │ ├── css │ │ └── dashboard.css │ │ ├── dashboard.css │ │ ├── details.md │ │ ├── documentation │ │ ├── README.md │ │ └── index.html │ │ ├── installation.md │ │ ├── third_party.md │ │ └── troubleshooting.md ├── bin │ ├── Diag.py │ ├── Utilities.py │ ├── _yaml │ │ └── __init__.py │ ├── app_properties.py │ ├── fetch_latest.sample │ ├── requirements.txt │ ├── ua_parser │ │ ├── __init__.py │ │ ├── _regexes.py │ │ ├── user_agent_parser.py │ │ └── user_agent_parser_test.py │ ├── uap-core │ │ ├── CONTRIBUTING.md │ │ ├── LICENSE │ │ ├── docs │ │ │ └── specification.md │ │ ├── package.json │ │ ├── regexes.yaml │ │ ├── test_resources │ │ │ ├── additional_os_tests.yaml │ │ │ ├── firefox_user_agent_strings.txt │ │ │ ├── firefox_user_agent_strings.yaml │ │ │ ├── opera_mini_user_agent_strings.yaml │ │ │ ├── pgts_browser_list-orig.yaml │ │ │ ├── pgts_browser_list.txt │ │ │ ├── pgts_browser_list.yaml │ │ │ ├── podcasting_user_agent_strings.yaml │ │ │ └── transform-pgts_browser_list.pl │ │ └── tests │ │ │ ├── regexes.js │ │ │ ├── sample.js │ │ │ ├── test.js │ │ │ ├── test_device.yaml │ │ │ ├── test_os.yaml │ │ │ └── test_ua.yaml │ ├── user_agents.py │ ├── version.py │ └── yaml │ │ ├── __init__.py │ │ ├── composer.py │ │ ├── constructor.py │ │ ├── cyaml.py │ │ ├── dumper.py │ │ ├── emitter.py │ │ ├── error.py │ │ ├── events.py │ │ ├── loader.py │ │ ├── nodes.py │ │ ├── parser.py │ │ ├── reader.py │ │ ├── representer.py │ │ ├── resolver.py │ │ ├── scanner.py │ │ ├── serializer.py │ │ └── tokens.py ├── default │ ├── apl_logging.conf │ ├── app.conf │ ├── checklist.conf │ ├── data │ │ └── ui │ │ │ └── nav │ │ │ └── default.xml │ ├── log.cfg │ ├── props.conf │ ├── server.conf │ └── transforms.conf ├── file.manifest ├── metadata │ └── default.meta └── static │ ├── appIcon.png │ ├── appIconAlt.png │ ├── appIconAlt_2x.png │ └── appIcon_2x.png └── manual_check_definitions.json /README.md: -------------------------------------------------------------------------------- 1 | # PAVO TA User Agents Documentation 2 | 3 | Provides an external Python lookup that parses User Agents strings. 4 | 5 | ## About PAVO TA User Agents 6 | 7 | | | | 8 | |----------------------------|---------------------------------| 9 | | Author | Aplura, LLC | 10 | | App Version | 1.7.7 | 11 | | App Build | 21 | 12 | | Creates an index | False | 13 | | Implements summarization | No | 14 | | Summary Indexing | False | 15 | | Data Model Acceleration | If Enabled | 16 | | Report Acceleration | False | 17 | | Splunk Enterprise versions | | 18 | | Platforms | Splunk Enterprise, Splunk Cloud | 19 | 20 | ## Scripts and binaries 21 | 22 | This App provides the following scripts: 23 | 24 | | | | 25 | |---------------------|---------------------------------------------------------------------------| 26 | | Diag.py | For use with the diag command. | 27 | | fetch_latest.sample | For grabbing the most recent versions of the libraries. | 28 | | user_agents.py | This is the lookup command python to parse the user agent. | 29 | | Utilities.py | This is a supporting python script for use with logging, and other needs. | 30 | | version.py | This contains the version of the package. | 31 | | app_properties.py | This contains app properties. | 32 | 33 |
34 | 35 | `fetch_latest.sample` is a bash script; before it can be run it must be renamed (for example, to `fetch_latest.sh`) and made executable (`chmod +x`). The script updates the bundled libraries for on-prem installations. 36 | 37 |
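For example, on an on-prem instance the sample can be activated roughly as follows. This is a minimal sketch: the `$SPLUNK_HOME/etc/apps/TA-user-agents` path assumes a default installation, and the `fetch_latest.sh` name follows `UPDATING.md`.

```bash
# Rename the sample and make it executable (default on-prem app path assumed).
cd "$SPLUNK_HOME/etc/apps/TA-user-agents/bin"
cp fetch_latest.sample fetch_latest.sh
chmod +x fetch_latest.sh

# 'local' is the mode intended for end users; 'git' is only for the TA maintainer.
./fetch_latest.sh local
```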
38 | 39 | ## Lookups 40 | 41 | PAVO TA User Agents contains the following lookup files. 42 | 43 | - None 44 | 45 | ## Event Generator 46 | 47 | PAVO TA User Agents does not include an event generator. 48 | 49 | ## Acceleration 50 | 51 | - Summary Indexing: No 52 | 53 | - Data Model Acceleration: No 54 | 55 | - Report Acceleration: No 56 | 57 | # Prerequisites, Installation, and Configuration 58 | 59 | Because this App runs on Splunk Enterprise, all the [Splunk Enterprise system requirements](https://docs.splunk.com/Documentation/Splunk/latest/Installation/Systemrequirements) apply. 60 | 61 | ## Installation and Configuration 62 | 63 | ### Download 64 | 65 | ## Installation Process Overview 66 | 67 | - Install the extension. 68 | 69 | ### Deploy to single server instance 70 | 71 | Follow these steps to install the app in a single server instance of Splunk Enterprise: 72 | 73 | - Deploy as you would any App, and restart Splunk. 74 | 75 | - Configure. 76 | 77 | ### Deploy to Splunk Cloud 78 | 79 | - Have your Splunk Cloud Support handle this installation. 80 | 81 | ### Deploy to a Distributed Environment 82 | 83 | - For each Search Head in the environment, deploy a copy of the App. 84 | 85 | # Support and resources 86 | 87 | ## Questions and answers 88 | 89 | Access questions and answers specific to PAVO TA User Agents at . Be sure to tag your question with the App. 90 | 91 | ## Support 92 | 93 | - Support Email: 94 | 95 | - Support Offered: Splunk Answers, Email 96 | 97 | ### Logging 98 | 99 | Copy the \`\`log.cfg\`\` file from \`\`default\`\` to \`\`local\`\` and change the settings as needed. 100 | 101 | ### Diagnostics Generation 102 | 103 | If a support representative asks for it, a support diagnostic file can be generated. Use the following command to generate the file. Send the resulting file to support. 104 | 105 | \`\`\$SPLUNK_HOME/bin/splunk diag --collect=app:TA-user-agents\`\` 106 | 107 | ## Known Issues 108 | 109 | Version 1.7.7 of PAVO TA User Agents has the following known issues: 110 | 111 | - None 112 | 113 | ## Release notes 114 | 115 | ### Version 1.7.7 116 | 117 | - Improvement 118 | - Removed Python that was flagged by Upgrade Readiness App. 119 | 120 | 121 | ### Version 1.7.5 122 | 123 | - Improvement 124 | 125 | - Modified Script for Splunk Cloud compatability. 126 | 127 | ### Version 1.7.4 128 | 129 | - Improvement 130 | 131 | - Updated for Python 3 and Splunk 8 compatability 132 | 133 | # Third Party Notices 134 | 135 | Version 1.7.7 of PAVO TA User Agents incorporates the following Third-party software or third-party services. 136 | 137 | - ua_parser 138 | 139 | - pyyaml 140 | -------------------------------------------------------------------------------- /TA-user-agents/LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2020 Aplura, LLC 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 
14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 22 | -------------------------------------------------------------------------------- /TA-user-agents/README.md: -------------------------------------------------------------------------------- 1 | # PAVO TA User Agents Documentation 2 | 3 | Provides an external Python lookup that parses User Agents strings. 4 | 5 | ## About PAVO TA User Agents 6 | 7 | | | | 8 | |----------------------------|---------------------------------| 9 | | Author | Aplura, LLC | 10 | | App Version | 1.7.7 | 11 | | App Build | 21 | 12 | | Creates an index | False | 13 | | Implements summarization | No | 14 | | Summary Indexing | False | 15 | | Data Model Acceleration | If Enabled | 16 | | Report Acceleration | False | 17 | | Splunk Enterprise versions | | 18 | | Platforms | Splunk Enterprise, Splunk Cloud | 19 | 20 | ## Scripts and binaries 21 | 22 | This App provides the following scripts: 23 | 24 | | | | 25 | |---------------------|---------------------------------------------------------------------------| 26 | | Diag.py | For use with the diag command. | 27 | | fetch_latest.sample | For grabbing the most recent versions of the libraries. | 28 | | user_agents.py | This is the lookup command python to parse the user agent. | 29 | | Utilities.py | This is a supporting python script for use with logging, and other needs. | 30 | | version.py | This contains the version of the package. | 31 | | app_properties.py | This contains app properties. | 32 | 33 |
34 | 35 | `fetch_latest.sample` is a bash script; before it can be run it must be renamed (for example, to `fetch_latest.sh`) and made executable (`chmod +x`). The script updates the bundled libraries for on-prem installations. 36 | 37 |
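The lookup provided by `user_agents.py` is driven by the bundled `ua_parser` module. As a quick, purely illustrative sanity check of what that parser returns, it can be called directly with Splunk's Python; the app path and the sample user-agent string below are assumptions, not part of the app.

```bash
# Illustrative only: run the bundled parser once with Splunk's Python 3.
# Adjust the app path if your installation differs from the default.
$SPLUNK_HOME/bin/splunk cmd python3 -c "
import sys
sys.path.insert(0, '$SPLUNK_HOME/etc/apps/TA-user-agents/bin')
from ua_parser import user_agent_parser
ua = 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36'
print(user_agent_parser.Parse(ua))
"
```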
38 | 39 | ## Lookups 40 | 41 | PAVO TA User Agents contains the following lookup files. 42 | 43 | - None 44 | 45 | ## Event Generator 46 | 47 | PAVO TA User Agents does not include an event generator. 48 | 49 | ## Acceleration 50 | 51 | - Summary Indexing: No 52 | 53 | - Data Model Acceleration: No 54 | 55 | - Report Acceleration: No 56 | 57 | # Prerequisites, Installation, and Configuration 58 | 59 | Because this App runs on Splunk Enterprise, all the [Splunk Enterprise system requirements](https://docs.splunk.com/Documentation/Splunk/latest/Installation/Systemrequirements) apply. 60 | 61 | ## Installation and Configuration 62 | 63 | ### Download 64 | 65 | ## Installation Process Overview 66 | 67 | - Install the extension. 68 | 69 | ### Deploy to single server instance 70 | 71 | Follow these steps to install the app in a single server instance of Splunk Enterprise: 72 | 73 | - Deploy as you would any App, and restart Splunk. 74 | 75 | - Configure. 76 | 77 | ### Deploy to Splunk Cloud 78 | 79 | - Have your Splunk Cloud Support handle this installation. 80 | 81 | ### Deploy to a Distributed Environment 82 | 83 | - For each Search Head in the environment, deploy a copy of the App. 84 | 85 | # Support and resources 86 | 87 | ## Questions and answers 88 | 89 | Access questions and answers specific to PAVO TA User Agents at . Be sure to tag your question with the App. 90 | 91 | ## Support 92 | 93 | - Support Email: 94 | 95 | - Support Offered: Splunk Answers, Email 96 | 97 | ### Logging 98 | 99 | Copy the \`\`log.cfg\`\` file from \`\`default\`\` to \`\`local\`\` and change the settings as needed. 100 | 101 | ### Diagnostics Generation 102 | 103 | If a support representative asks for it, a support diagnostic file can be generated. Use the following command to generate the file. Send the resulting file to support. 104 | 105 | \`\`\$SPLUNK_HOME/bin/splunk diag --collect=app:TA-user-agents\`\` 106 | 107 | ## Known Issues 108 | 109 | Version 1.7.7 of PAVO TA User Agents has the following known issues: 110 | 111 | - None 112 | 113 | ## Release notes 114 | 115 | ### Version 1.7.7 116 | 117 | - Improvement 118 | 119 | - Removed Python that was flagged by Upgrade Readiness App. 120 | 121 | ### Version 1.7.5 122 | 123 | - Improvement 124 | 125 | - Modified Script for Splunk Cloud compatability. 126 | 127 | ### Version 1.7.4 128 | 129 | - Improvement 130 | 131 | - Updated for Python 3 and Splunk 8 compatability 132 | 133 | # Third Party Notices 134 | 135 | Version 1.7.7 of PAVO TA User Agents incorporates the following Third-party software or third-party services. 136 | 137 | - ua_parser 138 | 139 | - pyyaml 140 | -------------------------------------------------------------------------------- /TA-user-agents/README/apl_logging.conf.spec: -------------------------------------------------------------------------------- 1 | [TA-user_agents] 2 | modularinput = 3 | restclient = 4 | utilities = 5 | kenny_loggins = -------------------------------------------------------------------------------- /TA-user-agents/UPDATING.md: -------------------------------------------------------------------------------- 1 | # Updating the Python modules and parsers 2 | In 1.7.2, @lowell80 was nice enough to contribute a script for updating the 3 | Python modules this TA uses. 4 | Unfortunately, there were commands in this script which caused the app to fail 5 | Splunk Cloud vetting, so that it cannot be installed by Cloud customers. 
6 | 7 | To solve this, we have repackaged the app without the script, and resubmitted it 8 | for vetting. 9 | 10 | As such, the `fetch_latest` script is no longer included in the download from 11 | Splunkbase. However, the script is still available for download from the 12 | Github repo: 13 | 14 | https://github.com/automine/TA-user-agents/blob/master/bin/fetch_latest.sh 15 | 16 | To fetch the latest user agent matching rules run the `bin/refresh_latest.sh` script. This will update not only the browser strings information (`regexes.yaml`), but the python module as well. If you'd like to run this on a regular basis, consider setting this up as a cron job, or as a scripted input at an appropriate interval. 17 | 18 | For example, to refresh every Monday at 4AM, add the following `inputs.conf` entry: 19 | 20 | [script://./bin/fetch_latest.sh local] 21 | interval = 0 4 * * 1 22 | index = _internal 23 | 24 | Note: This script on works on Unix, requires `git`, and will clobber any local customizations to `regexes.yaml`. 25 | -------------------------------------------------------------------------------- /TA-user-agents/app.manifest: -------------------------------------------------------------------------------- 1 | {"schemaVersion":"2.0.0","info":{"title":"PAVO TA User Agents","id":{"group":null,"name":"TA-user-agents","version":"1.7.7"},"author":[{"name":"Aplura, LLC","email":null,"company":null}],"releaseDate":null,"description":"Provides an external Python lookup that parses User Agents strings.","classification":{"intendedAudience":null,"categories":[],"developmentStatus":"Production/Stable"},"commonInformationModels":null,"license":{"name":null,"text":null,"uri":null},"privacyPolicy":{"name":null,"text":null,"uri":null},"releaseNotes":{"name":"README","text":"./README.md","uri":null}},"dependencies":null,"tasks":null,"inputGroups":{},"incompatibleApps":null,"platformRequirements":null,"supportedDeployments":["*"],"targetWorkloads":null} -------------------------------------------------------------------------------- /TA-user-agents/appserver/static/README.md: -------------------------------------------------------------------------------- 1 | # PAVO TA User Agents Documentation 2 | 3 | Provides an external Python lookup that parses User Agents strings. 4 | 5 | ## About PAVO TA User Agents 6 | 7 | | | | 8 | |----------------------------|---------------------------------| 9 | | Author | Aplura, LLC | 10 | | App Version | 1.7.7 | 11 | | App Build | 21 | 12 | | Creates an index | False | 13 | | Implements summarization | No | 14 | | Summary Indexing | False | 15 | | Data Model Acceleration | If Enabled | 16 | | Report Acceleration | False | 17 | | Splunk Enterprise versions | | 18 | | Platforms | Splunk Enterprise, Splunk Cloud | 19 | 20 | ## Scripts and binaries 21 | 22 | This App provides the following scripts: 23 | 24 | | | | 25 | |---------------------|---------------------------------------------------------------------------| 26 | | Diag.py | For use with the diag command. | 27 | | fetch_latest.sample | For grabbing the most recent versions of the libraries. | 28 | | user_agents.py | This is the lookup command python to parse the user agent. | 29 | | Utilities.py | This is a supporting python script for use with logging, and other needs. | 30 | | version.py | This contains the version of the package. | 31 | | app_properties.py | This contains app properties. | 32 | 33 |
34 | 35 | `fetch_latest.sample` is a bash script; before it can be run it must be renamed (for example, to `fetch_latest.sh`) and made executable (`chmod +x`). The script updates the bundled libraries for on-prem installations. 36 | 37 |
38 | 39 | ## Lookups 40 | 41 | PAVO TA User Agents contains the following lookup files. 42 | 43 | - None 44 | 45 | ## Event Generator 46 | 47 | PAVO TA User Agents does not include an event generator. 48 | 49 | ## Acceleration 50 | 51 | - Summary Indexing: No 52 | 53 | - Data Model Acceleration: No 54 | 55 | - Report Acceleration: No 56 | 57 | # Prerequisites, Installation, and Configuration 58 | 59 | Because this App runs on Splunk Enterprise, all the [Splunk Enterprise system requirements](https://docs.splunk.com/Documentation/Splunk/latest/Installation/Systemrequirements) apply. 60 | 61 | ## Installation and Configuration 62 | 63 | ### Download 64 | 65 | ## Installation Process Overview 66 | 67 | - Install the extension. 68 | 69 | ### Deploy to single server instance 70 | 71 | Follow these steps to install the app in a single server instance of Splunk Enterprise: 72 | 73 | - Deploy as you would any App, and restart Splunk. 74 | 75 | - Configure. 76 | 77 | ### Deploy to Splunk Cloud 78 | 79 | - Have your Splunk Cloud Support handle this installation. 80 | 81 | ### Deploy to a Distributed Environment 82 | 83 | - For each Search Head in the environment, deploy a copy of the App. 84 | 85 | # Support and resources 86 | 87 | ## Questions and answers 88 | 89 | Access questions and answers specific to PAVO TA User Agents at . Be sure to tag your question with the App. 90 | 91 | ## Support 92 | 93 | - Support Email: 94 | 95 | - Support Offered: Splunk Answers, Email 96 | 97 | ### Logging 98 | 99 | Copy the \`\`log.cfg\`\` file from \`\`default\`\` to \`\`local\`\` and change the settings as needed. 100 | 101 | ### Diagnostics Generation 102 | 103 | If a support representative asks for it, a support diagnostic file can be generated. Use the following command to generate the file. Send the resulting file to support. 104 | 105 | \`\`\$SPLUNK_HOME/bin/splunk diag --collect=app:TA-user-agents\`\` 106 | 107 | ## Known Issues 108 | 109 | Version 1.7.7 of PAVO TA User Agents has the following known issues: 110 | 111 | - None 112 | 113 | ## Release notes 114 | 115 | ### Version 1.7.7 116 | 117 | - Improvement 118 | 119 | - Removed Python that was flagged by Upgrade Readiness App. 120 | 121 | ### Version 1.7.5 122 | 123 | - Improvement 124 | 125 | - Modified Script for Splunk Cloud compatability. 126 | 127 | ### Version 1.7.4 128 | 129 | - Improvement 130 | 131 | - Updated for Python 3 and Splunk 8 compatability 132 | 133 | # Third Party Notices 134 | 135 | Version 1.7.7 of PAVO TA User Agents incorporates the following Third-party software or third-party services. 
136 | 137 | - ua_parser 138 | 139 | - pyyaml 140 | -------------------------------------------------------------------------------- /TA-user-agents/appserver/static/appIcon.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aplura/TA-user-agents/c6479f6dac3d17a12f82b7571f9c10feb446871a/TA-user-agents/appserver/static/appIcon.png -------------------------------------------------------------------------------- /TA-user-agents/appserver/static/appIconAlt.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aplura/TA-user-agents/c6479f6dac3d17a12f82b7571f9c10feb446871a/TA-user-agents/appserver/static/appIconAlt.png -------------------------------------------------------------------------------- /TA-user-agents/appserver/static/appIconAlt_2x.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aplura/TA-user-agents/c6479f6dac3d17a12f82b7571f9c10feb446871a/TA-user-agents/appserver/static/appIconAlt_2x.png -------------------------------------------------------------------------------- /TA-user-agents/appserver/static/appIcon_2x.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aplura/TA-user-agents/c6479f6dac3d17a12f82b7571f9c10feb446871a/TA-user-agents/appserver/static/appIcon_2x.png -------------------------------------------------------------------------------- /TA-user-agents/appserver/static/details.md: -------------------------------------------------------------------------------- 1 | # PAVO TA User Agents Documentation 2 | 3 | Provides an external Python lookup that parses User Agents strings. 4 | 5 | ## About PAVO TA User Agents 6 | 7 | | | | 8 | |----------------------------|---------------------------------| 9 | | Author | Aplura, LLC | 10 | | App Version | 1.7.7 | 11 | | App Build | 21 | 12 | | Creates an index | False | 13 | | Implements summarization | No | 14 | | Summary Indexing | False | 15 | | Data Model Acceleration | If Enabled | 16 | | Report Acceleration | False | 17 | | Splunk Enterprise versions | | 18 | | Platforms | Splunk Enterprise, Splunk Cloud | 19 | 20 | ## Scripts and binaries 21 | 22 | This App provides the following scripts: 23 | 24 | | | | 25 | |---------------------|---------------------------------------------------------------------------| 26 | | Diag.py | For use with the diag command. | 27 | | fetch_latest.sample | For grabbing the most recent versions of the libraries. | 28 | | user_agents.py | This is the lookup command python to parse the user agent. | 29 | | Utilities.py | This is a supporting python script for use with logging, and other needs. | 30 | | version.py | This contains the version of the package. | 31 | | app_properties.py | This contains app properties. | 32 | 33 |
34 | 35 | `fetch_latest.sample` is a bash script; before it can be run it must be renamed (for example, to `fetch_latest.sh`) and made executable (`chmod +x`). The script updates the bundled libraries for on-prem installations. 36 | 37 |
38 | 39 | ## Lookups 40 | 41 | PAVO TA User Agents contains the following lookup files. 42 | 43 | - None 44 | 45 | ## Event Generator 46 | 47 | PAVO TA User Agents does not include an event generator. 48 | 49 | ## Acceleration 50 | 51 | - Summary Indexing: No 52 | 53 | - Data Model Acceleration: No 54 | 55 | - Report Acceleration: No 56 | -------------------------------------------------------------------------------- /TA-user-agents/appserver/static/documentation/README.md: -------------------------------------------------------------------------------- 1 | # PAVO TA User Agents Documentation 2 | 3 | Provides an external Python lookup that parses User Agents strings. 4 | 5 | ## About PAVO TA User Agents 6 | 7 | | | | 8 | |----------------------------|---------------------------------| 9 | | Author | Aplura, LLC | 10 | | App Version | 1.7.7 | 11 | | App Build | 21 | 12 | | Creates an index | False | 13 | | Implements summarization | No | 14 | | Summary Indexing | False | 15 | | Data Model Acceleration | If Enabled | 16 | | Report Acceleration | False | 17 | | Splunk Enterprise versions | | 18 | | Platforms | Splunk Enterprise, Splunk Cloud | 19 | 20 | ## Scripts and binaries 21 | 22 | This App provides the following scripts: 23 | 24 | | | | 25 | |---------------------|---------------------------------------------------------------------------| 26 | | Diag.py | For use with the diag command. | 27 | | fetch_latest.sample | For grabbing the most recent versions of the libraries. | 28 | | user_agents.py | This is the lookup command python to parse the user agent. | 29 | | Utilities.py | This is a supporting python script for use with logging, and other needs. | 30 | | version.py | This contains the version of the package. | 31 | | app_properties.py | This contains app properties. | 32 | 33 |
34 | 35 | `fetch_latest.sample` is a bash script; before it can be run it must be renamed (for example, to `fetch_latest.sh`) and made executable (`chmod +x`). The script updates the bundled libraries for on-prem installations. 36 | 37 |
38 | 39 | ## Lookups 40 | 41 | PAVO TA User Agents contains the following lookup files. 42 | 43 | - None 44 | 45 | ## Event Generator 46 | 47 | PAVO TA User Agents does not include an event generator. 48 | 49 | ## Acceleration 50 | 51 | - Summary Indexing: No 52 | 53 | - Data Model Acceleration: No 54 | 55 | - Report Acceleration: No 56 | 57 | # Prerequisites, Installation, and Configuration 58 | 59 | Because this App runs on Splunk Enterprise, all the [Splunk Enterprise system requirements](https://docs.splunk.com/Documentation/Splunk/latest/Installation/Systemrequirements) apply. 60 | 61 | ## Installation and Configuration 62 | 63 | ### Download 64 | 65 | ## Installation Process Overview 66 | 67 | - Install the extension. 68 | 69 | ### Deploy to single server instance 70 | 71 | Follow these steps to install the app in a single server instance of Splunk Enterprise: 72 | 73 | - Deploy as you would any App, and restart Splunk. 74 | 75 | - Configure. 76 | 77 | ### Deploy to Splunk Cloud 78 | 79 | - Have your Splunk Cloud Support handle this installation. 80 | 81 | ### Deploy to a Distributed Environment 82 | 83 | - For each Search Head in the environment, deploy a copy of the App. 84 | 85 | # Support and resources 86 | 87 | ## Questions and answers 88 | 89 | Access questions and answers specific to PAVO TA User Agents at . Be sure to tag your question with the App. 90 | 91 | ## Support 92 | 93 | - Support Email: 94 | 95 | - Support Offered: Splunk Answers, Email 96 | 97 | ### Logging 98 | 99 | Copy the \`\`log.cfg\`\` file from \`\`default\`\` to \`\`local\`\` and change the settings as needed. 100 | 101 | ### Diagnostics Generation 102 | 103 | If a support representative asks for it, a support diagnostic file can be generated. Use the following command to generate the file. Send the resulting file to support. 104 | 105 | \`\`\$SPLUNK_HOME/bin/splunk diag --collect=app:TA-user-agents\`\` 106 | 107 | ## Known Issues 108 | 109 | Version 1.7.7 of PAVO TA User Agents has the following known issues: 110 | 111 | - None 112 | 113 | ## Release notes 114 | 115 | ### Version 1.7.7 116 | 117 | - Improvement 118 | 119 | - Removed Python that was flagged by Upgrade Readiness App. 120 | 121 | ### Version 1.7.5 122 | 123 | - Improvement 124 | 125 | - Modified Script for Splunk Cloud compatability. 126 | 127 | ### Version 1.7.4 128 | 129 | - Improvement 130 | 131 | - Updated for Python 3 and Splunk 8 compatability 132 | 133 | # Third Party Notices 134 | 135 | Version 1.7.7 of PAVO TA User Agents incorporates the following Third-party software or third-party services. 136 | 137 | - ua_parser 138 | 139 | - pyyaml 140 | -------------------------------------------------------------------------------- /TA-user-agents/appserver/static/documentation/index.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | README Documentation 9 | 10 | 11 | 31 | 248 | 249 | 250 |
# PAVO TA User Agents Documentation

Provides an external Python lookup that parses User Agents strings.

## About PAVO TA User Agents

|                            |                                 |
|----------------------------|---------------------------------|
| Author                     | Aplura, LLC                     |
| App Version                | 1.7.7                           |
| App Build                  | 21                              |
| Creates an index           | False                           |
| Implements summarization   | No                              |
| Summary Indexing           | False                           |
| Data Model Acceleration    | If Enabled                      |
| Report Acceleration        | False                           |
| Splunk Enterprise versions |                                 |
| Platforms                  | Splunk Enterprise, Splunk Cloud |

## Scripts and binaries

This App provides the following scripts:

|                     |                                                                            |
|---------------------|----------------------------------------------------------------------------|
| Diag.py             | For use with the diag command.                                             |
| fetch_latest.sample | For grabbing the most recent versions of the libraries.                    |
| user_agents.py      | This is the lookup command python to parse the user agent.                 |
| Utilities.py        | This is a supporting python script for use with logging, and other needs.  |
| version.py          | This contains the version of the package.                                  |
| app_properties.py   | This contains app properties.                                              |

`fetch_latest.sample` is a bash script; before it can be run it must be renamed (for example, to `fetch_latest.sh`) and made executable (`chmod +x`). The script updates the bundled libraries for on-prem installations.

## Lookups

PAVO TA User Agents contains the following lookup files.

- None

## Event Generator

PAVO TA User Agents does not include an event generator.

## Acceleration

- Summary Indexing: No
- Data Model Acceleration: No
- Report Acceleration: No

# Prerequisites, Installation, and Configuration

Because this App runs on Splunk Enterprise, all the [Splunk Enterprise system requirements](https://docs.splunk.com/Documentation/Splunk/latest/Installation/Systemrequirements) apply.

## Installation and Configuration

### Download

## Installation Process Overview

- Install the extension.

### Deploy to single server instance

Follow these steps to install the app in a single server instance of Splunk Enterprise:

- Deploy as you would any App, and restart Splunk.
- Configure.

### Deploy to Splunk Cloud

- Have your Splunk Cloud Support handle this installation.

### Deploy to a Distributed Environment

- For each Search Head in the environment, deploy a copy of the App.

# Support and resources

## Questions and answers

Access questions and answers specific to PAVO TA User Agents at https://community.splunk.com. Be sure to tag your question with the App.

## Support

- Support Email:
- Support Offered: Splunk Answers, Email

### Logging

Copy the `log.cfg` file from `default` to `local` and change the settings as needed.

### Diagnostics Generation

If a support representative asks for it, a support diagnostic file can be generated. Use the following command to generate the file. Send the resulting file to support.

`$SPLUNK_HOME/bin/splunk diag --collect=app:TA-user-agents`

## Known Issues

Version 1.7.7 of PAVO TA User Agents has the following known issues:

- None

## Release notes

### Version 1.7.7

- Improvement

  - Removed Python that was flagged by Upgrade Readiness App.

### Version 1.7.5

- Improvement

  - Modified Script for Splunk Cloud compatibility.

### Version 1.7.4

- Improvement

  - Updated for Python 3 and Splunk 8 compatibility.

# Third Party Notices

Version 1.7.7 of PAVO TA User Agents incorporates the following Third-party software or third-party services.

- ua_parser
- pyyaml

Generated on 2023-06-13 / Copyright 2023 Aplura, LLC.

425 | 426 | 427 | -------------------------------------------------------------------------------- /TA-user-agents/appserver/static/installation.md: -------------------------------------------------------------------------------- 1 | # Prerequisites, Installation, and Configuration 2 | 3 | Because this App runs on Splunk Enterprise, all the [Splunk Enterprise system requirements](https://docs.splunk.com/Documentation/Splunk/latest/Installation/Systemrequirements) apply. 4 | 5 | ## Installation and Configuration 6 | 7 | ### Download 8 | 9 | ## Installation Process Overview 10 | 11 | - Install the extension. 12 | 13 | ### Deploy to single server instance 14 | 15 | Follow these steps to install the app in a single server instance of Splunk Enterprise: 16 | 17 | - Deploy as you would any App, and restart Splunk. 18 | 19 | - Configure. 20 | 21 | ### Deploy to Splunk Cloud 22 | 23 | - Have your Splunk Cloud Support handle this installation. 24 | 25 | ### Deploy to a Distributed Environment 26 | 27 | - For each Search Head in the environment, deploy a copy of the App. 28 | -------------------------------------------------------------------------------- /TA-user-agents/appserver/static/third_party.md: -------------------------------------------------------------------------------- 1 | # Third Party Notices 2 | 3 | Version 1.7.7 of PAVO TA User Agents incorporates the following Third-party software or third-party services. 4 | 5 | - ua_parser 6 | 7 | - pyyaml 8 | -------------------------------------------------------------------------------- /TA-user-agents/appserver/static/troubleshooting.md: -------------------------------------------------------------------------------- 1 | # Support and resources 2 | 3 | ## Questions and answers 4 | 5 | Access questions and answers specific to PAVO TA User Agents at . Be sure to tag your question with the App. 6 | 7 | ## Support 8 | 9 | - Support Email: 10 | 11 | - Support Offered: Splunk Answers, Email 12 | 13 | ### Logging 14 | 15 | Copy the \`\`log.cfg\`\` file from \`\`default\`\` to \`\`local\`\` and change the settings as needed. 16 | 17 | ### Diagnostics Generation 18 | 19 | If a support representative asks for it, a support diagnostic file can be generated. Use the following command to generate the file. Send the resulting file to support. 20 | 21 | \`\`\$SPLUNK_HOME/bin/splunk diag --collect=app:TA-user-agents\`\` 22 | 23 | ## Known Issues 24 | 25 | Version 1.7.7 of PAVO TA User Agents has the following known issues: 26 | 27 | - None 28 | 29 | ## Release notes 30 | 31 | ### Version 1.7.7 32 | 33 | - Improvement 34 | 35 | - Removed Python that was flagged by Upgrade Readiness App. 36 | 37 | ### Version 1.7.5 38 | 39 | - Improvement 40 | 41 | - Modified Script for Splunk Cloud compatability. 42 | 43 | ### Version 1.7.4 44 | 45 | - Improvement 46 | 47 | - Updated for Python 3 and Splunk 8 compatability 48 | -------------------------------------------------------------------------------- /TA-user-agents/bin/Diag.py: -------------------------------------------------------------------------------- 1 | """ 2 | Written by Kyle Smith for Aplura, LLC 3 | Copyright (C) 2016-2022 Aplura, ,LLC 4 | 5 | This program is free software; you can redistribute it and/or 6 | modify it under the terms of the GNU General Public License 7 | as published by the Free Software Foundation; either version 2 8 | of the License, or (at your option) any later version. 
9 | 10 | This program is distributed in the hope that it will be useful, 11 | but WITHOUT ANY WARRANTY; without even the implied warranty of 12 | MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the 13 | GNU General Public License for more details. 14 | 15 | You should have received a copy of the GNU General Public License 16 | along with this program; if not, write to the Free Software 17 | Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA. 18 | """ 19 | from __future__ import absolute_import 20 | import logging 21 | import os 22 | from splunk.appserver.mrsparkle.lib.util import make_splunkhome_path 23 | 24 | 25 | # Use the **args pattern to ignore options we don't care about. 26 | def setup(parser=None, callback=None, **kwargs): 27 | logging.debug("setup() was called!") 28 | 29 | # Declare that we're going to use REST later 30 | callback.will_need_rest() 31 | 32 | 33 | # The options are out of order, as is possible for keyword invocation 34 | def collect_diag_info(diag, options=None, global_options=None, app_dir=None, **kwargs): 35 | app = app_dir.split(os.path.sep)[-1] 36 | logging.info("collect_diag_info() was called for app {}".format(app)) 37 | 38 | # Collect a directory from the app 39 | a_dir = os.path.join(app_dir, 'bin') 40 | logging.info("collecting bin: {}".format(a_dir)) 41 | diag.add_dir(a_dir, 'bin') 42 | 43 | # Collect a directory from the app 44 | a_dir = os.path.join(app_dir, 'appserver') 45 | logging.info("collecting appserver: {}".format(a_dir)) 46 | diag.add_dir(a_dir, 'appserver') 47 | 48 | a_dir = os.path.join(app_dir, 'default') 49 | logging.info("collecting default: {}".format(a_dir)) 50 | diag.add_dir(a_dir, "default") 51 | 52 | # Collect a directory from the app 53 | local_dir = os.path.join(app_dir, 'local') 54 | if not os.path.exists(local_dir): 55 | logging.info(f"action=collect_local_logs path={local_dir} status=404") 56 | else: 57 | logging.info("collecting local: {}".format(a_dir)) 58 | diag.add_dir(local_dir, "local") 59 | 60 | app_logs = make_splunkhome_path(["var", "log", "splunk", app]) 61 | if not os.path.exists(app_logs): 62 | logging.info(f"action=collect_app_logs path={app_logs} status=failed") 63 | app_logs = os.path.join(os.environ["SPLUNK_HOME"], "var", "log", "splunk", app) 64 | if not os.path.exists(app_logs): 65 | logging.error(f"action=collect_app_logs path={app_logs} status=not_found") 66 | else: 67 | logging.info(f"action=collect_app_logs path={app_logs} status=found") 68 | diag.add_dir(app_logs, "01_application_logs") 69 | 70 | modinputs_dir = make_splunkhome_path(["var", "lib", "splunk", "modinputs"]) 71 | if not os.path.exists(modinputs_dir): 72 | logging.info(f"action=collect_modinputs path={modinputs_dir} status=failed") 73 | modinputs_dir = os.path.join(os.environ["SPLUNK_DB"], "modinputs") 74 | if not os.path.exists(modinputs_dir): 75 | logging.error(f"action=collect_modinputs_checkpoints path={modinputs_dir} status=not_found") 76 | else: 77 | logging.info(f"action=collect_modinputs_checkpoints path={modinputs_dir} status=found") 78 | diag.add_dir(modinputs_dir, "02_modinput_checkpoints") 79 | 80 | # Collect some REST endpoint data 81 | diag.add_rest_endpoint("/services/server/info", "03_server_info.xml") 82 | -------------------------------------------------------------------------------- /TA-user-agents/bin/_yaml/__init__.py: -------------------------------------------------------------------------------- 1 | # This is a stub package designed to roughly emulate the _yaml 2 | # extension module, 
which previously existed as a standalone module 3 | # and has been moved into the `yaml` package namespace. 4 | # It does not perfectly mimic its old counterpart, but should get 5 | # close enough for anyone who's relying on it even when they shouldn't. 6 | import yaml 7 | 8 | # in some circumstances, the yaml module we imoprted may be from a different version, so we need 9 | # to tread carefully when poking at it here (it may not have the attributes we expect) 10 | if not getattr(yaml, '__with_libyaml__', False): 11 | from sys import version_info 12 | 13 | exc = ModuleNotFoundError if version_info >= (3, 6) else ImportError 14 | raise exc("No module named '_yaml'") 15 | else: 16 | from yaml._yaml import * 17 | import warnings 18 | warnings.warn( 19 | 'The _yaml extension module is now located at yaml._yaml' 20 | ' and its location is subject to change. To use the' 21 | ' LibYAML-based parser and emitter, import from `yaml`:' 22 | ' `from yaml import CLoader as Loader, CDumper as Dumper`.', 23 | DeprecationWarning 24 | ) 25 | del warnings 26 | # Don't `del yaml` here because yaml is actually an existing 27 | # namespace member of _yaml. 28 | 29 | __name__ = '_yaml' 30 | # If the module is top-level (i.e. not a part of any specific package) 31 | # then the attribute should be set to ''. 32 | # https://docs.python.org/3.8/library/types.html 33 | __package__ = '' 34 | -------------------------------------------------------------------------------- /TA-user-agents/bin/app_properties.py: -------------------------------------------------------------------------------- 1 | __app_name__ = "TA-user-agents" 2 | __version__ = "1.7.7" 3 | __build__ = "1" 4 | -------------------------------------------------------------------------------- /TA-user-agents/bin/fetch_latest.sample: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | # This script will pull down the latest python modules from the upstream 3 | # maintainers using git. This updates not only the regexes.yaml file, but all 4 | # the python code. 5 | # 6 | # This script is indented to be useful to both end-users simply wanting the 7 | # latest User Agent strings as well as the TA maintainer (or anyone else 8 | # storing their Splunk apps in git) periodically refreshing this content. 9 | # 10 | # Even though uap-core is a submodule of uap-python, to always get the latest 11 | # UA parsing configuration, this script pulls both repos independently. Since 12 | # the submodule isn't initialize, this doesn't result in duplicate work. 13 | # 14 | # WARNING: Any local customizations to regexes.yaml will be overwritten. 
15 | #
16 | # Author: Lowell Alleman (lowell@kintyre.co)
17 | 
18 | cd "$(dirname "${BASH_SOURCE[0]}")" || exit 1
19 | MYNAME=$(basename "${BASH_SOURCE[0]}")
20 | BIN_DIR=$(pwd)
21 | REPOS="$BIN_DIR/repos"
22 | 
23 | target="$1"
24 | if [[ $target != "local" ]] && [[ $target != "git" ]]
25 | then
26 |     echo "Usage: $MYNAME (local|git)" 1>&2
27 |     echo 1>&2
28 |     echo "    Unless you are the TA maintainer, pick 'local'" 1>&2
29 |     exit 1
30 | fi
31 | 
32 | function fetch_repo() {
33 |     local repo="$1"
34 |     local dir="$2"
35 |     if [[ -d $dir ]]
36 |     then
37 |         echo "Pulling down upstream changes for $repo"
38 |         git -C "$dir" pull --ff-only || exit 2
39 |     else
40 |         echo "Cloning upstream repo $repo"
41 |         git clone "$repo" "$dir" || exit 2
42 |     fi
43 | 
44 | }
45 | 
46 | function git_repo_info() {
47 |     git -C "$1" show -s --abbrev=20 --format="%h %cd %cn" --date=format:"%Y-%m-%d"
48 | }
49 | 
50 | [[ -x $(command -v git) ]] || { echo "$MYNAME requires 'git'." 1>&2; exit 2; }
51 | 
52 | [[ -d $REPOS ]] || mkdir -v "$REPOS"
53 | cd "$REPOS" || exit 1
54 | 
55 | fetch_repo https://github.com/ua-parser/uap-python.git uap-python
56 | fetch_repo https://github.com/ua-parser/uap-core.git uap-core
57 | 
58 | # Confirm that the checkout still contains all the expected folders; if not,
59 | # this script will need to be updated to reflect whatever changed upstream.
60 | [[ -d "uap-python/ua_parser" ]] || { echo "Upstream git repo missing 'ua_parser'"; exit 3; }
61 | [[ -f "uap-core/regexes.yaml" ]] || { echo "Upstream git repo missing 'regexes.yaml'"; exit 3; }
62 | 
63 | echo "Copying updated python modules into TA-user-agents"
64 | # The original 'rm -rf' cleanup of the old copies was dropped (it would fail
65 | # AppInspect), so tolerate existing directories with 'mkdir -p' instead.
66 | mkdir -p "$BIN_DIR/ua_parser" "$BIN_DIR/uap-core"
67 | # Skip all hidden files (the globs below do not match dotfiles such as .git)
68 | cp -a "$REPOS"/uap-python/ua_parser/* "$BIN_DIR/ua_parser"
69 | cp -a "$REPOS"/uap-core/* "$BIN_DIR/uap-core"
70 | 
71 | 
72 | # If you're not the git TA maintainer, you can just exit here...
73 | [[ $target == "git" ]] || exit 74 | 75 | cd "$BIN_DIR" || exit 1 76 | 77 | # Stage updates to known files (adds and deletes) 78 | git add -u ua_parser uap-core 79 | 80 | # Add any newly created files (not previously part of the upstream repo) 81 | git add ua_parser uap-core 82 | 83 | UAP_PY_VER=$(git -C "$REPOS/uap-python" describe) 84 | UAP_PY_INFO=$(git_repo_info "$REPOS/uap-python") 85 | UAP_CORE_INFO=$(git_repo_info "$REPOS/uap-core") 86 | 87 | [[ $(git status --porcelain | grep -cv '[?][?]') -eq 0 ]] && { 88 | echo "No changes to commit."; exit 0; } 89 | 90 | git commit -m "Update ua_parser $UAP_PY_VER 91 | 92 | uap-python $UAP_PY_INFO 93 | uap-core $UAP_CORE_INFO 94 | 95 | Updated-By-Script: $MYNAME" --edit 96 | -------------------------------------------------------------------------------- /TA-user-agents/bin/requirements.txt: -------------------------------------------------------------------------------- 1 | ua_parser==0.16.1 2 | pyyaml==6.0 -------------------------------------------------------------------------------- /TA-user-agents/bin/ua_parser/__init__.py: -------------------------------------------------------------------------------- 1 | VERSION = (0, 16, 1) 2 | -------------------------------------------------------------------------------- /TA-user-agents/bin/ua_parser/user_agent_parser.py: -------------------------------------------------------------------------------- 1 | # Copyright 2009 Google Inc. 2 | # 3 | # Licensed under the Apache License, Version 2.0 (the 'License') 4 | # you may not use this file except in compliance with the License. 5 | # You may obtain a copy of the License at 6 | # 7 | # http://www.apache.org/licenses/LICENSE-2.0 8 | # 9 | # Unless required by applicable law or agreed to in writing, software 10 | # distributed under the License is distributed on an 'AS IS' BASIS, 11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 12 | # See the License for the specific language governing permissions and 13 | # limitations under the License. 14 | 15 | """Python implementation of the UA parser.""" 16 | 17 | from __future__ import absolute_import 18 | 19 | import os 20 | import re 21 | import sys 22 | import warnings 23 | 24 | __author__ = "Lindsey Simon " 25 | 26 | 27 | class UserAgentParser(object): 28 | def __init__( 29 | self, pattern, family_replacement=None, v1_replacement=None, v2_replacement=None 30 | ): 31 | """Initialize UserAgentParser. 
32 | 33 | Args: 34 | pattern: a regular expression string 35 | family_replacement: a string to override the matched family (optional) 36 | v1_replacement: a string to override the matched v1 (optional) 37 | v2_replacement: a string to override the matched v2 (optional) 38 | """ 39 | self.pattern = pattern 40 | self.user_agent_re = re.compile(self.pattern) 41 | self.family_replacement = family_replacement 42 | self.v1_replacement = v1_replacement 43 | self.v2_replacement = v2_replacement 44 | 45 | def MatchSpans(self, user_agent_string): 46 | match_spans = [] 47 | match = self.user_agent_re.search(user_agent_string) 48 | if match: 49 | match_spans = [ 50 | match.span(group_index) for group_index in range(1, match.lastindex + 1) 51 | ] 52 | return match_spans 53 | 54 | def Parse(self, user_agent_string): 55 | family, v1, v2, v3 = None, None, None, None 56 | match = self.user_agent_re.search(user_agent_string) 57 | if match: 58 | if self.family_replacement: 59 | if re.search(r"\$1", self.family_replacement): 60 | family = re.sub(r"\$1", match.group(1), self.family_replacement) 61 | else: 62 | family = self.family_replacement 63 | else: 64 | family = match.group(1) 65 | 66 | if self.v1_replacement: 67 | v1 = self.v1_replacement 68 | elif match.lastindex and match.lastindex >= 2: 69 | v1 = match.group(2) or None 70 | 71 | if self.v2_replacement: 72 | v2 = self.v2_replacement 73 | elif match.lastindex and match.lastindex >= 3: 74 | v2 = match.group(3) or None 75 | 76 | if match.lastindex and match.lastindex >= 4: 77 | v3 = match.group(4) or None 78 | 79 | return family, v1, v2, v3 80 | 81 | 82 | class OSParser(object): 83 | def __init__( 84 | self, 85 | pattern, 86 | os_replacement=None, 87 | os_v1_replacement=None, 88 | os_v2_replacement=None, 89 | os_v3_replacement=None, 90 | os_v4_replacement=None, 91 | ): 92 | """Initialize UserAgentParser. 
93 | 94 | Args: 95 | pattern: a regular expression string 96 | os_replacement: a string to override the matched os (optional) 97 | os_v1_replacement: a string to override the matched v1 (optional) 98 | os_v2_replacement: a string to override the matched v2 (optional) 99 | os_v3_replacement: a string to override the matched v3 (optional) 100 | os_v4_replacement: a string to override the matched v4 (optional) 101 | """ 102 | self.pattern = pattern 103 | self.user_agent_re = re.compile(self.pattern) 104 | self.os_replacement = os_replacement 105 | self.os_v1_replacement = os_v1_replacement 106 | self.os_v2_replacement = os_v2_replacement 107 | self.os_v3_replacement = os_v3_replacement 108 | self.os_v4_replacement = os_v4_replacement 109 | 110 | def MatchSpans(self, user_agent_string): 111 | match_spans = [] 112 | match = self.user_agent_re.search(user_agent_string) 113 | if match: 114 | match_spans = [ 115 | match.span(group_index) for group_index in range(1, match.lastindex + 1) 116 | ] 117 | return match_spans 118 | 119 | def Parse(self, user_agent_string): 120 | os, os_v1, os_v2, os_v3, os_v4 = None, None, None, None, None 121 | match = self.user_agent_re.search(user_agent_string) 122 | if match: 123 | if self.os_replacement: 124 | os = MultiReplace(self.os_replacement, match) 125 | elif match.lastindex: 126 | os = match.group(1) 127 | 128 | if self.os_v1_replacement: 129 | os_v1 = MultiReplace(self.os_v1_replacement, match) 130 | elif match.lastindex and match.lastindex >= 2: 131 | os_v1 = match.group(2) 132 | 133 | if self.os_v2_replacement: 134 | os_v2 = MultiReplace(self.os_v2_replacement, match) 135 | elif match.lastindex and match.lastindex >= 3: 136 | os_v2 = match.group(3) 137 | 138 | if self.os_v3_replacement: 139 | os_v3 = MultiReplace(self.os_v3_replacement, match) 140 | elif match.lastindex and match.lastindex >= 4: 141 | os_v3 = match.group(4) 142 | 143 | if self.os_v4_replacement: 144 | os_v4 = MultiReplace(self.os_v4_replacement, match) 145 | elif match.lastindex and match.lastindex >= 5: 146 | os_v4 = match.group(5) 147 | 148 | return os, os_v1, os_v2, os_v3, os_v4 149 | 150 | 151 | def MultiReplace(string, match): 152 | def _repl(m): 153 | index = int(m.group(1)) - 1 154 | group = match.groups() 155 | if index < len(group): 156 | return group[index] 157 | return "" 158 | 159 | _string = re.sub(r"\$(\d)", _repl, string) 160 | _string = re.sub(r"^\s+|\s+$", "", _string) 161 | if _string == "": 162 | return None 163 | return _string 164 | 165 | 166 | class DeviceParser(object): 167 | def __init__( 168 | self, 169 | pattern, 170 | regex_flag=None, 171 | device_replacement=None, 172 | brand_replacement=None, 173 | model_replacement=None, 174 | ): 175 | """Initialize UserAgentParser. 
176 | 177 | Args: 178 | pattern: a regular expression string 179 | device_replacement: a string to override the matched device (optional) 180 | """ 181 | self.pattern = pattern 182 | if regex_flag == "i": 183 | self.user_agent_re = re.compile(self.pattern, re.IGNORECASE) 184 | else: 185 | self.user_agent_re = re.compile(self.pattern) 186 | self.device_replacement = device_replacement 187 | self.brand_replacement = brand_replacement 188 | self.model_replacement = model_replacement 189 | 190 | def MatchSpans(self, user_agent_string): 191 | match_spans = [] 192 | match = self.user_agent_re.search(user_agent_string) 193 | if match: 194 | match_spans = [ 195 | match.span(group_index) for group_index in range(1, match.lastindex + 1) 196 | ] 197 | return match_spans 198 | 199 | def Parse(self, user_agent_string): 200 | device, brand, model = None, None, None 201 | match = self.user_agent_re.search(user_agent_string) 202 | if match: 203 | if self.device_replacement: 204 | device = MultiReplace(self.device_replacement, match) 205 | else: 206 | device = match.group(1) 207 | 208 | if self.brand_replacement: 209 | brand = MultiReplace(self.brand_replacement, match) 210 | 211 | if self.model_replacement: 212 | model = MultiReplace(self.model_replacement, match) 213 | elif len(match.groups()) > 0: 214 | model = match.group(1) 215 | 216 | return device, brand, model 217 | 218 | 219 | MAX_CACHE_SIZE = 200 220 | _PARSE_CACHE = {} 221 | 222 | _UA_TYPES = str 223 | if sys.version_info < (3,): 224 | _UA_TYPES = (str, unicode) 225 | 226 | 227 | def _lookup(ua, args): 228 | if not isinstance(ua, _UA_TYPES): 229 | raise TypeError("Expected user agent to be a string, got %r" % ua) 230 | 231 | key = (ua, tuple(sorted(args.items()))) 232 | entry = _PARSE_CACHE.get(key) 233 | if entry is not None: 234 | return entry 235 | 236 | if len(_PARSE_CACHE) >= MAX_CACHE_SIZE: 237 | _PARSE_CACHE.clear() 238 | 239 | v = _PARSE_CACHE[key] = {"string": ua} 240 | return v 241 | 242 | 243 | def _cached(ua, args, key, fn): 244 | entry = _lookup(ua, args) 245 | r = entry.get(key) 246 | if not r: 247 | r = entry[key] = fn(ua, args) 248 | return r 249 | 250 | 251 | def Parse(user_agent_string, **jsParseBits): 252 | """Parse all the things 253 | Args: 254 | user_agent_string: the full user agent string 255 | Returns: 256 | A dictionary containing all parsed bits 257 | """ 258 | entry = _lookup(user_agent_string, jsParseBits) 259 | # entry is complete, return directly 260 | if len(entry) == 4: 261 | return entry 262 | 263 | # entry is partially or entirely empty 264 | if "user_agent" not in entry: 265 | entry["user_agent"] = _ParseUserAgent(user_agent_string, jsParseBits) 266 | if "os" not in entry: 267 | entry["os"] = _ParseOS(user_agent_string, jsParseBits) 268 | if "device" not in entry: 269 | entry["device"] = _ParseDevice(user_agent_string, jsParseBits) 270 | 271 | return entry 272 | 273 | 274 | def ParseUserAgent(user_agent_string, **jsParseBits): 275 | """Parses the user-agent string for user agent (browser) info. 276 | Args: 277 | user_agent_string: The full user-agent string. 278 | Returns: 279 | A dictionary containing parsed bits. 
280 | """ 281 | return _cached(user_agent_string, jsParseBits, "user_agent", _ParseUserAgent) 282 | 283 | 284 | def _ParseUserAgent(user_agent_string, jsParseBits): 285 | if jsParseBits: 286 | warnings.warn( 287 | "javascript overrides are deprecated and will be removed next release", 288 | category=DeprecationWarning, 289 | stacklevel=2, 290 | ) 291 | if ( 292 | "js_user_agent_family" in jsParseBits 293 | and jsParseBits["js_user_agent_family"] != "" 294 | ): 295 | family = jsParseBits["js_user_agent_family"] 296 | v1 = jsParseBits.get("js_user_agent_v1") or None 297 | v2 = jsParseBits.get("js_user_agent_v2") or None 298 | v3 = jsParseBits.get("js_user_agent_v3") or None 299 | else: 300 | for uaParser in USER_AGENT_PARSERS: 301 | family, v1, v2, v3 = uaParser.Parse(user_agent_string) 302 | if family: 303 | break 304 | 305 | # Override for Chrome Frame IFF Chrome is enabled. 306 | if "js_user_agent_string" in jsParseBits: 307 | js_user_agent_string = jsParseBits["js_user_agent_string"] 308 | if ( 309 | js_user_agent_string 310 | and js_user_agent_string.find("Chrome/") > -1 311 | and user_agent_string.find("chromeframe") > -1 312 | ): 313 | jsOverride = {} 314 | jsOverride = ParseUserAgent(js_user_agent_string) 315 | family = "Chrome Frame (%s %s)" % (family, v1) 316 | v1 = jsOverride["major"] 317 | v2 = jsOverride["minor"] 318 | v3 = jsOverride["patch"] 319 | 320 | family = family or "Other" 321 | return { 322 | "family": family, 323 | "major": v1 or None, 324 | "minor": v2 or None, 325 | "patch": v3 or None, 326 | } 327 | 328 | 329 | def ParseOS(user_agent_string, **jsParseBits): 330 | """Parses the user-agent string for operating system info 331 | Args: 332 | user_agent_string: The full user-agent string. 333 | Returns: 334 | A dictionary containing parsed bits. 335 | """ 336 | return _cached(user_agent_string, jsParseBits, "os", _ParseOS) 337 | 338 | 339 | def _ParseOS(user_agent_string, jsParseBits): 340 | if jsParseBits: 341 | warnings.warn( 342 | "javascript overrides are deprecated and will be removed next release", 343 | category=DeprecationWarning, 344 | stacklevel=2, 345 | ) 346 | for osParser in OS_PARSERS: 347 | os, os_v1, os_v2, os_v3, os_v4 = osParser.Parse(user_agent_string) 348 | if os: 349 | break 350 | os = os or "Other" 351 | return { 352 | "family": os, 353 | "major": os_v1, 354 | "minor": os_v2, 355 | "patch": os_v3, 356 | "patch_minor": os_v4, 357 | } 358 | 359 | 360 | def ParseDevice(user_agent_string, **jsParseBits): 361 | """Parses the user-agent string for device info. 362 | Args: 363 | user_agent_string: The full user-agent string. 364 | Returns: 365 | A dictionary containing parsed bits. 
366 | """ 367 | return _cached(user_agent_string, jsParseBits, "device", _ParseDevice) 368 | 369 | 370 | def _ParseDevice(user_agent_string, jsParseBits): 371 | if jsParseBits: 372 | warnings.warn( 373 | "javascript overrides are deprecated and will be removed next release", 374 | category=DeprecationWarning, 375 | stacklevel=2, 376 | ) 377 | for deviceParser in DEVICE_PARSERS: 378 | device, brand, model = deviceParser.Parse(user_agent_string) 379 | if device: 380 | break 381 | 382 | if device is None: 383 | device = "Other" 384 | 385 | return {"family": device, "brand": brand, "model": model} 386 | 387 | 388 | def PrettyUserAgent(family, v1=None, v2=None, v3=None): 389 | """Pretty user agent string.""" 390 | if v3: 391 | if v3[0].isdigit(): 392 | return "%s %s.%s.%s" % (family, v1, v2, v3) 393 | else: 394 | return "%s %s.%s%s" % (family, v1, v2, v3) 395 | elif v2: 396 | return "%s %s.%s" % (family, v1, v2) 397 | elif v1: 398 | return "%s %s" % (family, v1) 399 | return family 400 | 401 | 402 | def PrettyOS(os, os_v1=None, os_v2=None, os_v3=None, os_v4=None): 403 | """Pretty os string.""" 404 | if os_v4: 405 | return "%s %s.%s.%s.%s" % (os, os_v1, os_v2, os_v3, os_v4) 406 | if os_v3: 407 | if os_v3[0].isdigit(): 408 | return "%s %s.%s.%s" % (os, os_v1, os_v2, os_v3) 409 | else: 410 | return "%s %s.%s%s" % (os, os_v1, os_v2, os_v3) 411 | elif os_v2: 412 | return "%s %s.%s" % (os, os_v1, os_v2) 413 | elif os_v1: 414 | return "%s %s" % (os, os_v1) 415 | return os 416 | 417 | 418 | def ParseWithJSOverrides( 419 | user_agent_string, 420 | js_user_agent_string=None, 421 | js_user_agent_family=None, 422 | js_user_agent_v1=None, 423 | js_user_agent_v2=None, 424 | js_user_agent_v3=None, 425 | ): 426 | """backwards compatible. use one of the other Parse methods instead!""" 427 | warnings.warn( 428 | "Use Parse (or a specialised parser)", DeprecationWarning, stacklevel=2 429 | ) 430 | 431 | # Override via JS properties. 432 | if js_user_agent_family is not None and js_user_agent_family != "": 433 | family = js_user_agent_family 434 | v1 = None 435 | v2 = None 436 | v3 = None 437 | if js_user_agent_v1 is not None: 438 | v1 = js_user_agent_v1 439 | if js_user_agent_v2 is not None: 440 | v2 = js_user_agent_v2 441 | if js_user_agent_v3 is not None: 442 | v3 = js_user_agent_v3 443 | else: 444 | for parser in USER_AGENT_PARSERS: 445 | family, v1, v2, v3 = parser.Parse(user_agent_string) 446 | if family: 447 | break 448 | 449 | # Override for Chrome Frame IFF Chrome is enabled. 450 | if ( 451 | js_user_agent_string 452 | and js_user_agent_string.find("Chrome/") > -1 453 | and user_agent_string.find("chromeframe") > -1 454 | ): 455 | family = "Chrome Frame (%s %s)" % (family, v1) 456 | ua_dict = ParseUserAgent(js_user_agent_string) 457 | v1 = ua_dict["major"] 458 | v2 = ua_dict["minor"] 459 | v3 = ua_dict["patch"] 460 | 461 | return family or "Other", v1, v2, v3 462 | 463 | 464 | def Pretty(family, v1=None, v2=None, v3=None): 465 | """backwards compatible. 
use PrettyUserAgent instead!""" 466 | warnings.warn("Use PrettyUserAgent", DeprecationWarning, stacklevel=2) 467 | if v3: 468 | if v3[0].isdigit(): 469 | return "%s %s.%s.%s" % (family, v1, v2, v3) 470 | else: 471 | return "%s %s.%s%s" % (family, v1, v2, v3) 472 | elif v2: 473 | return "%s %s.%s" % (family, v1, v2) 474 | elif v1: 475 | return "%s %s" % (family, v1) 476 | return family 477 | 478 | 479 | def GetFilters( 480 | user_agent_string, 481 | js_user_agent_string=None, 482 | js_user_agent_family=None, 483 | js_user_agent_v1=None, 484 | js_user_agent_v2=None, 485 | js_user_agent_v3=None, 486 | ): 487 | """Return the optional arguments that should be saved and used to query. 488 | 489 | js_user_agent_string is always returned if it is present. We really only need 490 | it for Chrome Frame. However, I added it in the generally case to find other 491 | cases when it is different. When the recording of js_user_agent_string was 492 | added, we created new records for all new user agents. 493 | 494 | Since we only added js_document_mode for the IE 9 preview case, it did not 495 | cause new user agent records the way js_user_agent_string did. 496 | 497 | js_document_mode has since been removed in favor of individual property 498 | overrides. 499 | 500 | Args: 501 | user_agent_string: The full user-agent string. 502 | js_user_agent_string: JavaScript ua string from client-side 503 | js_user_agent_family: This is an override for the family name to deal 504 | with the fact that IE platform preview (for instance) cannot be 505 | distinguished by user_agent_string, but only in javascript. 506 | js_user_agent_v1: v1 override - see above. 507 | js_user_agent_v2: v1 override - see above. 508 | js_user_agent_v3: v1 override - see above. 509 | Returns: 510 | {js_user_agent_string: '[...]', js_family_name: '[...]', etc...} 511 | """ 512 | filters = {} 513 | filterdict = { 514 | "js_user_agent_string": js_user_agent_string, 515 | "js_user_agent_family": js_user_agent_family, 516 | "js_user_agent_v1": js_user_agent_v1, 517 | "js_user_agent_v2": js_user_agent_v2, 518 | "js_user_agent_v3": js_user_agent_v3, 519 | } 520 | for key, value in filterdict.items(): 521 | if value is not None and value != "": 522 | filters[key] = value 523 | return filters 524 | 525 | 526 | # Build the list of user agent parsers from YAML 527 | UA_PARSER_YAML = os.environ.get("UA_PARSER_YAML") 528 | if UA_PARSER_YAML: 529 | # This will raise an ImportError if missing, obviously since it's no 530 | # longer a requirement 531 | import yaml 532 | 533 | try: 534 | # Try and use libyaml bindings if available since faster, 535 | # pyyaml doesn't do it by default (yaml/pyyaml#436) 536 | from yaml import CSafeLoader as SafeLoader 537 | except ImportError: 538 | from yaml import SafeLoader 539 | 540 | with open(UA_PARSER_YAML, "rb") as fp: 541 | regexes = yaml.load(fp, Loader=SafeLoader) 542 | 543 | USER_AGENT_PARSERS = [] 544 | for _ua_parser in regexes["user_agent_parsers"]: 545 | _regex = _ua_parser["regex"] 546 | 547 | _family_replacement = _ua_parser.get("family_replacement") 548 | _v1_replacement = _ua_parser.get("v1_replacement") 549 | _v2_replacement = _ua_parser.get("v2_replacement") 550 | 551 | USER_AGENT_PARSERS.append( 552 | UserAgentParser( 553 | _regex, _family_replacement, _v1_replacement, _v2_replacement 554 | ) 555 | ) 556 | 557 | OS_PARSERS = [] 558 | for _os_parser in regexes["os_parsers"]: 559 | _regex = _os_parser["regex"] 560 | 561 | _os_replacement = _os_parser.get("os_replacement") 562 | _os_v1_replacement = 
_os_parser.get("os_v1_replacement") 563 | _os_v2_replacement = _os_parser.get("os_v2_replacement") 564 | _os_v3_replacement = _os_parser.get("os_v3_replacement") 565 | _os_v4_replacement = _os_parser.get("os_v4_replacement") 566 | 567 | OS_PARSERS.append( 568 | OSParser( 569 | _regex, 570 | _os_replacement, 571 | _os_v1_replacement, 572 | _os_v2_replacement, 573 | _os_v3_replacement, 574 | _os_v4_replacement, 575 | ) 576 | ) 577 | 578 | DEVICE_PARSERS = [] 579 | for _device_parser in regexes["device_parsers"]: 580 | _regex = _device_parser["regex"] 581 | 582 | _regex_flag = _device_parser.get("regex_flag") 583 | _device_replacement = _device_parser.get("device_replacement") 584 | _brand_replacement = _device_parser.get("brand_replacement") 585 | _model_replacement = _device_parser.get("model_replacement") 586 | 587 | DEVICE_PARSERS.append( 588 | DeviceParser( 589 | _regex, 590 | _regex_flag, 591 | _device_replacement, 592 | _brand_replacement, 593 | _model_replacement, 594 | ) 595 | ) 596 | 597 | # Clean our our temporary vars explicitly 598 | # so they can't be reused or imported 599 | del regexes 600 | del yaml 601 | del SafeLoader 602 | else: 603 | # Just load our pre-compiled versions 604 | from ._regexes import USER_AGENT_PARSERS, DEVICE_PARSERS, OS_PARSERS 605 | -------------------------------------------------------------------------------- /TA-user-agents/bin/ua_parser/user_agent_parser_test.py: -------------------------------------------------------------------------------- 1 | # Copyright 2008 Google Inc. 2 | # 3 | # Licensed under the Apache License, Version 2.0 (the 'License') 4 | # you may not use this file except in compliance with the License. 5 | # You may obtain a copy of the License at 6 | # 7 | # http://www.apache.org/licenses/LICENSE-2.0 8 | # 9 | # Unless required by applicable law or agreed to in writing, software 10 | # distributed under the License is distributed on an 'AS IS' BASIS, 11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 12 | # See the License for the specific language governing permissions and 13 | # limitations under the License. 14 | 15 | 16 | """User Agent Parser Unit Tests. 17 | Run: 18 | # python -m user_agent_parser_test (runs all the tests, takes awhile) 19 | or like: 20 | # python -m user_agent_parser_test ParseTest.testBrowserscopeStrings 21 | """ 22 | 23 | 24 | from __future__ import unicode_literals, absolute_import 25 | 26 | __author__ = "slamm@google.com (Stephen Lamm)" 27 | 28 | import logging 29 | import os 30 | import platform 31 | import re 32 | import sys 33 | import unittest 34 | import warnings 35 | import yaml 36 | 37 | if platform.python_implementation() == "PyPy": 38 | from yaml import SafeLoader 39 | else: 40 | try: 41 | from yaml import CSafeLoader as SafeLoader 42 | except ImportError: 43 | logging.getLogger(__name__).warning( 44 | "PyYaml C extension not available to run tests, this will result " 45 | "in dramatic tests slowdown." 
46 | ) 47 | from yaml import SafeLoader 48 | 49 | 50 | from ua_parser import user_agent_parser 51 | 52 | TEST_RESOURCES_DIR = os.path.join( 53 | os.path.abspath(os.path.dirname(__file__)), "../uap-core" 54 | ) 55 | 56 | 57 | class ParseTest(unittest.TestCase): 58 | def testBrowserscopeStrings(self): 59 | self.runUserAgentTestsFromYAML( 60 | os.path.join(TEST_RESOURCES_DIR, "tests/test_ua.yaml") 61 | ) 62 | 63 | def testBrowserscopeStringsOS(self): 64 | self.runOSTestsFromYAML(os.path.join(TEST_RESOURCES_DIR, "tests/test_os.yaml")) 65 | 66 | def testStringsOS(self): 67 | self.runOSTestsFromYAML( 68 | os.path.join(TEST_RESOURCES_DIR, "test_resources/additional_os_tests.yaml") 69 | ) 70 | 71 | def testStringsDevice(self): 72 | self.runDeviceTestsFromYAML( 73 | os.path.join(TEST_RESOURCES_DIR, "tests/test_device.yaml") 74 | ) 75 | 76 | def testMozillaStrings(self): 77 | self.runUserAgentTestsFromYAML( 78 | os.path.join( 79 | TEST_RESOURCES_DIR, "test_resources/firefox_user_agent_strings.yaml" 80 | ) 81 | ) 82 | 83 | # NOTE: The YAML file used here is one output by makePGTSComparisonYAML() 84 | # below, as opposed to the pgts_browser_list-orig.yaml file. The -orig 85 | # file is by no means perfect, but identifies many browsers that we 86 | # classify as "Other". This test itself is mostly useful to know when 87 | # somthing in UA parsing changes. An effort should be made to try and 88 | # reconcile the differences between the two YAML files. 89 | def testPGTSStrings(self): 90 | self.runUserAgentTestsFromYAML( 91 | os.path.join(TEST_RESOURCES_DIR, "test_resources/pgts_browser_list.yaml") 92 | ) 93 | 94 | def testParseAll(self): 95 | user_agent_string = "Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.4; fr; rv:1.9.1.5) Gecko/20091102 Firefox/3.5.5,gzip(gfe),gzip(gfe)" 96 | expected = { 97 | "device": {"family": "Mac", "brand": "Apple", "model": "Mac"}, 98 | "os": { 99 | "family": "Mac OS X", 100 | "major": "10", 101 | "minor": "4", 102 | "patch": None, 103 | "patch_minor": None, 104 | }, 105 | "user_agent": { 106 | "family": "Firefox", 107 | "major": "3", 108 | "minor": "5", 109 | "patch": "5", 110 | }, 111 | "string": user_agent_string, 112 | } 113 | 114 | result = user_agent_parser.Parse(user_agent_string) 115 | self.assertEqual( 116 | result, 117 | expected, 118 | "UA: {0}\n expected<{1}> != actual<{2}>".format( 119 | user_agent_string, expected, result 120 | ), 121 | ) 122 | 123 | # Run a set of test cases from a YAML file 124 | def runUserAgentTestsFromYAML(self, file_name): 125 | yamlFile = open(os.path.join(TEST_RESOURCES_DIR, file_name)) 126 | yamlContents = yaml.load(yamlFile, Loader=SafeLoader) 127 | yamlFile.close() 128 | 129 | for test_case in yamlContents["test_cases"]: 130 | # Inputs to Parse() 131 | user_agent_string = test_case["user_agent_string"] 132 | 133 | # The expected results 134 | expected = { 135 | "family": test_case["family"], 136 | "major": test_case["major"], 137 | "minor": test_case["minor"], 138 | "patch": test_case["patch"], 139 | } 140 | 141 | result = {} 142 | result = user_agent_parser.ParseUserAgent(user_agent_string) 143 | self.assertEqual( 144 | result, 145 | expected, 146 | "UA: {0}\n expected<{1}, {2}, {3}, {4}> != actual<{5}, {6}, {7}, {8}>".format( 147 | user_agent_string, 148 | expected["family"], 149 | expected["major"], 150 | expected["minor"], 151 | expected["patch"], 152 | result["family"], 153 | result["major"], 154 | result["minor"], 155 | result["patch"], 156 | ), 157 | ) 158 | self.assertLessEqual( 159 | len(user_agent_parser._PARSE_CACHE), 160 | 
user_agent_parser.MAX_CACHE_SIZE, 161 | "verify that the cache size never exceeds the configured setting", 162 | ) 163 | 164 | def runOSTestsFromYAML(self, file_name): 165 | yamlFile = open(os.path.join(TEST_RESOURCES_DIR, file_name)) 166 | yamlContents = yaml.load(yamlFile, Loader=SafeLoader) 167 | yamlFile.close() 168 | 169 | for test_case in yamlContents["test_cases"]: 170 | # Inputs to Parse() 171 | user_agent_string = test_case["user_agent_string"] 172 | 173 | # The expected results 174 | expected = { 175 | "family": test_case["family"], 176 | "major": test_case["major"], 177 | "minor": test_case["minor"], 178 | "patch": test_case["patch"], 179 | "patch_minor": test_case["patch_minor"], 180 | } 181 | 182 | result = user_agent_parser.ParseOS(user_agent_string) 183 | self.assertEqual( 184 | result, 185 | expected, 186 | "UA: {0}\n expected<{1} {2} {3} {4} {5}> != actual<{6} {7} {8} {9} {10}>".format( 187 | user_agent_string, 188 | expected["family"], 189 | expected["major"], 190 | expected["minor"], 191 | expected["patch"], 192 | expected["patch_minor"], 193 | result["family"], 194 | result["major"], 195 | result["minor"], 196 | result["patch"], 197 | result["patch_minor"], 198 | ), 199 | ) 200 | 201 | def runDeviceTestsFromYAML(self, file_name): 202 | yamlFile = open(os.path.join(TEST_RESOURCES_DIR, file_name)) 203 | yamlContents = yaml.load(yamlFile, Loader=SafeLoader) 204 | yamlFile.close() 205 | 206 | for test_case in yamlContents["test_cases"]: 207 | # Inputs to Parse() 208 | user_agent_string = test_case["user_agent_string"] 209 | 210 | # The expected results 211 | expected = { 212 | "family": test_case["family"], 213 | "brand": test_case["brand"], 214 | "model": test_case["model"], 215 | } 216 | 217 | result = user_agent_parser.ParseDevice(user_agent_string) 218 | self.assertEqual( 219 | result, 220 | expected, 221 | "UA: {0}\n expected<{1} {2} {3}> != actual<{4} {5} {6}>".format( 222 | user_agent_string, 223 | expected["family"], 224 | expected["brand"], 225 | expected["model"], 226 | result["family"], 227 | result["brand"], 228 | result["model"], 229 | ), 230 | ) 231 | 232 | 233 | class GetFiltersTest(unittest.TestCase): 234 | def testGetFiltersNoMatchesGiveEmptyDict(self): 235 | user_agent_string = "foo" 236 | filters = user_agent_parser.GetFilters( 237 | user_agent_string, js_user_agent_string=None 238 | ) 239 | self.assertEqual({}, filters) 240 | 241 | def testGetFiltersJsUaPassedThrough(self): 242 | user_agent_string = "foo" 243 | filters = user_agent_parser.GetFilters( 244 | user_agent_string, js_user_agent_string="bar" 245 | ) 246 | self.assertEqual({"js_user_agent_string": "bar"}, filters) 247 | 248 | def testGetFiltersJsUserAgentFamilyAndVersions(self): 249 | user_agent_string = ( 250 | "Mozilla/4.0 (compatible; MSIE 8.0; " 251 | "Windows NT 5.1; Trident/4.0; GTB6; .NET CLR 2.0.50727; " 252 | ".NET CLR 3.0.4506.2152; .NET CLR 3.5.30729)" 253 | ) 254 | filters = user_agent_parser.GetFilters( 255 | user_agent_string, js_user_agent_string="bar", js_user_agent_family="foo" 256 | ) 257 | self.assertEqual( 258 | {"js_user_agent_string": "bar", "js_user_agent_family": "foo"}, filters 259 | ) 260 | 261 | 262 | class TestDeprecationWarnings(unittest.TestCase): 263 | def setUp(self): 264 | """In Python 2.7, catch_warnings apparently does not do anything if 265 | the warning category is not active, whereas in 3(.6 and up) it 266 | seems to work out of the box. 
267 | """ 268 | super(TestDeprecationWarnings, self).setUp() 269 | warnings.simplefilter("always", DeprecationWarning) 270 | 271 | def tearDown(self): 272 | # not ideal as it discards all other warnings updates from the 273 | # process, should really copy the contents of 274 | # `warnings.filters`, then reset-it. 275 | warnings.resetwarnings() 276 | super(TestDeprecationWarnings, self).tearDown() 277 | 278 | def test_parser_deprecation(self): 279 | with warnings.catch_warnings(record=True) as ws: 280 | user_agent_parser.ParseWithJSOverrides("") 281 | self.assertEqual(len(ws), 1) 282 | self.assertEqual(ws[0].category, DeprecationWarning) 283 | 284 | def test_printer_deprecation(self): 285 | with warnings.catch_warnings(record=True) as ws: 286 | user_agent_parser.Pretty("") 287 | self.assertEqual(len(ws), 1) 288 | self.assertEqual(ws[0].category, DeprecationWarning) 289 | 290 | def test_js_bits_deprecation(self): 291 | for parser, count in [ 292 | (user_agent_parser.Parse, 3), 293 | (user_agent_parser.ParseUserAgent, 1), 294 | (user_agent_parser.ParseOS, 1), 295 | (user_agent_parser.ParseDevice, 1), 296 | ]: 297 | user_agent_parser._PARSE_CACHE.clear() 298 | with warnings.catch_warnings(record=True) as ws: 299 | parser("some random thing", js_attribute=True) 300 | self.assertEqual(len(ws), count) 301 | for w in ws: 302 | self.assertEqual(w.category, DeprecationWarning) 303 | 304 | 305 | class ErrTest(unittest.TestCase): 306 | @unittest.skipIf( 307 | sys.version_info < (3,), "bytes and str are not differentiated in P2" 308 | ) 309 | def test_bytes(self): 310 | with self.assertRaises(TypeError): 311 | user_agent_parser.Parse(b"") 312 | 313 | def test_int(self): 314 | with self.assertRaises(TypeError): 315 | user_agent_parser.Parse(0) 316 | 317 | def test_list(self): 318 | with self.assertRaises(TypeError): 319 | user_agent_parser.Parse([]) 320 | 321 | def test_tuple(self): 322 | with self.assertRaises(TypeError): 323 | user_agent_parser.Parse(()) 324 | 325 | 326 | if __name__ == "__main__": 327 | unittest.main() 328 | -------------------------------------------------------------------------------- /TA-user-agents/bin/uap-core/CONTRIBUTING.md: -------------------------------------------------------------------------------- 1 | Contributing Changes to regexes.yaml 2 | ------------------------------------ 3 | 4 | Contributing to the project, especially `regexes.yaml`, is both welcomed and encouraged. To do so just do the following: 5 | 6 | 1. Fork the project 7 | 2. Create a branch for your changes 8 | 3. Modify `regexes.yaml` as appropriate 9 | 4. Add relevant tests to the following files and follow their format: 10 | * `tests/test_device.yaml` 11 | * `tests/test_os.yaml` 12 | * `tests/test_ua.yaml` 13 | 5. Check that your regex is not vulnerable to [ReDoS](https://www.owasp.org/index.php/Regular_expression_Denial_of_Service_-_ReDoS) using the test in `tests/regexes.js` 14 | 6. Push your branch to GitHub and submit a pull request 15 | 7. Monitor the pull request to make sure the Travis build succeeds. If it fails simply make the necessary changes to your branch and push it. Travis will re-test the changes. 16 | 17 | That's it. If you don't feel comfortable forking the project or modifying the YAML you can also [submit an issue](https://github.com/ua-parser/uap-core/issues) that includes the appropriate user agent string and the expected results of parsing. 18 | 19 | Thanks! 
20 | -------------------------------------------------------------------------------- /TA-user-agents/bin/uap-core/LICENSE: -------------------------------------------------------------------------------- 1 | Apache License, Version 2.0 2 | =========================== 3 | 4 | Copyright 2009 Google Inc. 5 | 6 | Licensed under the Apache License, Version 2.0 (the "License"); 7 | you may not use this file except in compliance with the License. 8 | You may obtain a copy of the License at 9 | 10 | http://www.apache.org/licenses/LICENSE-2.0 11 | 12 | Unless required by applicable law or agreed to in writing, software 13 | distributed under the License is distributed on an "AS IS" BASIS, 14 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 15 | See the License for the specific language governing permissions and 16 | limitations under the License. -------------------------------------------------------------------------------- /TA-user-agents/bin/uap-core/docs/specification.md: -------------------------------------------------------------------------------- 1 | # ua-parser Specification 2 | 3 | Version 0.2 Draft 4 | 5 | This document describes the specification on how a parser must implement the `regexes.yaml` file for correctly parsing user-agent strings on basis of that file. 6 | 7 | This specification intends to help maintainers and contributors to correctly use the provided information within the `regexes.yaml` file for obtaining information from the different user-agent strings. Furthermore this specification tries to be the basis for discussions on evolving the projects and the needed parsing algorithms. 8 | 9 | This document will not provide any information on how to implement the ua-parser project on your server and how to retrieve the user-agent string for further processing. 10 | 11 | # `regexes.yaml` 12 | 13 | Any information which can be obtained from a user-agent string may contain information on: 14 | 15 | * User-Agent aka “the browser” 16 | * OS (Operating System) the User-Agent currently uses (or runs on) 17 | * Device information by means of the physical device the User-Agent is using 18 | 19 | This information is provided within the `regexes.yaml` file. Each kind of information requires a different parser which extracts the related type. These are: 20 | 21 | * `user_agent_parser` 22 | * `os_parsers` 23 | * `device_parsers` 24 | 25 | Each parser contains a list of regular-expressions which are named `regex`. For each `regex` replacements specific to the parser can be named to attribute or change information. A replacement may require a match from the regular-expression which is extracted by an expression enclosed in parenthesis `"()"`. Each match can be addressed with `$1` to `$9` and used in a parser specific replacement. 26 | 27 | **TODO**: Provide some insights into the used chars. E.g. escape `"."` as `"\."` and `"("` as `"\("`. `"/"` does not need to be escaped. 28 | 29 | ## `user_agent_parsers` 30 | 31 | The `user_agent_parsers` returns information of the `family` type of the User-Agent. 32 | If available the version information specifying the `family` may be extracted as well if available. 33 | Here major, minor and patch version information can be addressed or overwritten. 
34 | 35 | | match in regex | default replacement | placeholder in replacement | note | 36 | | ---- | ------------------- | ---- | --------------------------------------- | 37 | | 1 | family_replacement | $1 | specifies the User-Agents family | 38 | | 2 | v1_replacement | $2 | major version number/info of the family | 39 | | 3 | v2_replacement | $3 | minor version number/info of the family | 40 | | 4 | v3_replacement | $4 | patch version number/info of the family | 41 | 42 | In case that no replacement is specified, the association is given by order of the match. If in the `regex` no first match (within parenthesis) is given, the `family_replacement` shall be returned. 43 | To overwrite the respective value the replacement value needs to be named for a `regex`-item. 44 | 45 | **Parser Implementation:** 46 | 47 | The list of regular-expressions `regex` shall be evaluated for a given user-agent string beginning with the first `regex`-item in the list to the last item. The first matching `regex` stops processing the list. Regex-matching shall be case sensitive but not anchored. 48 | 49 | In case that no replacement for a match is specified for a `regex`-item, the first match defines the `family`, the second `major`, the third `minor`and the fourth `patch` information. 50 | If a `*_replacement` string is specified it shall overwrite or replace the match. 51 | 52 | As placeholder for inserting matched characters use within 53 | * `family_replacement`: `$1` 54 | * `v1_replacement`: `$2` 55 | * `v2_replacement`: `$3` 56 | * `v3_replacement`: `$4` 57 | 58 | If no matching `regex` is found the value for `family` shall be “Other”. The version information `major`, `minor` and `patch` shall not be defined. 59 | 60 | **Example:** 61 | 62 | For the User-Agent: `Mozilla/5.0 (Windows; Windows NT 5.1; rv:2.0b3pre) Gecko/20100727 Minefield/4.0.1pre` 63 | the matching `regex`: 64 | 65 | ``` 66 | - regex: '(Namoroka|Shiretoko|Minefield)/(\d+)\.(\d+)\.(\d+(?:pre)?)' 67 | family_replacement: 'Firefox ($1)' 68 | ``` 69 | 70 | resolves to: 71 | 72 | ``` 73 | family: Firefox (Minefield) 74 | major : 4 75 | minor : 0 76 | patch : 1pre 77 | ``` 78 | 79 | ## `os_parsers` 80 | 81 | The `os_parsers` return information of the `os` type of the Operating System (OS) the User-Agent runs. 82 | If available the version information specifying the `os` may be extracted as well if available. 83 | Here major, minor and patch version information can be addressed or overwritten. 84 | 85 | | match in regex | default replacement | placeholder in replacement | note | 86 | | ---- | ----------------- | ---- | ---------------------------------------- | 87 | | 1 | os_replacement | $1 | specifies the OS | 88 | | 2 | os_v1_replacement | $2 | major version number/info of OS | 89 | | 3 | os_v2_replacement | $3 | minor version number/info of the OS | 90 | | 4 | os_v3_replacement | $4 | patch version number/info of the OS | 91 | | 5 | os_v4_replacement | $5 | patchMinor version number/info of the OS | 92 | 93 | In case that no replacement is specified, the association is given by order of the match. If in the `regex` no first match (within normal brackets) is given, the `os_replacement` shall be specified! 94 | To overwrite the respective value the replacement value needs to be named for a `regex`-item. 95 | 96 | **Parser Implementation:** 97 | 98 | The list of regular-expressions `regex` shall be evaluated for a given user-agent string beginning with the first `regex`-item in the list to the last item. 
The first matching `regex` stops processing the list. Regex-matching shall be case sensitive. 99 | 100 | In case that no replacement for a match is specified for a `regex`-item, the first match defines the `os` family, the second `major`, the third `minor`, the forth `patch` and the fifth `patchMinor` version information. 101 | If a `*_replacement` string is specified it shall overwrite or replace the match. 102 | 103 | As placeholder for inserting matched characters use within 104 | * `os_replacement`: `$1` 105 | * `os_v1_replacement`: `$2` 106 | * `os_v2_replacement`: `$3` 107 | * `os_v3_replacement`: `$4` 108 | * `os_v4_replacement`: `$5` 109 | 110 | In case that no matching `regex` is found the value for `os` shall be “Other”. The version information `major`, `minor`, `patch` and `patchMinor` shall not be defined. 111 | 112 | **Example:** 113 | 114 | For the User-Agent: `Mozilla/5.0 (Windows; U; Win95; en-US; rv:1.1) Gecko/20020826` 115 | the matching `regex`: 116 | 117 | ``` 118 | - regex: 'Win(95|98|3.1|NT|ME|2000)' 119 | os_replacement: 'Windows $1' 120 | ``` 121 | 122 | resolves to: 123 | 124 | ``` 125 | os: Windows 95 126 | ``` 127 | 128 | ## `device_parsers` 129 | 130 | The `device_parsers` return information of the device `family` the User-Agent runs on. 131 | Furthermore `brand` and `model` of the device can be specified. 132 | `brand` names the manufacturer of the device, where model specifies the model of the device. 133 | 134 | | match in regex | default replacement | placeholder in replacement | note | 135 | | ---- | ------------------ | ------- | ---------------------------------------- | 136 | | 1 | device_replacement | $1...$9 | specifies the device family | 137 | | any | brand_replacement | $1...$9 | major version number/info of OS | 138 | | 1 | model_replacement | $1...$9 | minor version number/info of the OS | 139 | 140 | In case that no replacement is specified the association is given by order of the match. 141 | If in the `regex` no first match (within normal brackets) is given the `device_replacement` together with the `model_replacement` shall be specified! 142 | To overwrite the respective value the replacement value needs to be named for a given `regex`. 143 | 144 | For the `device_parsers` some `regex` require case insensitive parsing for proper matching. (E.g. Generic Feature Phones). To distinguish this from the case sensitive default case, the value `regex_flag: 'i'` is used to indicate that the regular-expression matching shall be case-insensitive for this regular expression. 145 | 146 | **Parser Implementation:** 147 | 148 | The list of regular-expressions `regex` shall be evaluated for a given user-agent string beginning with the first `regex`-item in the list to the last item. The first matching `regex` stops processing the list. Regex-matching shall be case sensitive. 149 | 150 | In case that no replacement for a match is given, the first match defines the `family` and the `model`. 151 | If a `*_replacement` string is specified it shall overwrite or replace the match. 152 | 153 | As placeholder for inserting matched characters `$1` to `$9` can be used to insert the matched characters from the regex into the replacement string. 154 | 155 | In case that no matching `regex` is found the value for `family` shall be “Other”. `brand` and `model` shall not be defined. 156 | Leading and tailing whitespaces shall be trimmed from the result. 
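
Before the worked example below, the following is a minimal Python sketch (not the reference implementation) of the first-match-wins evaluation described above: each `regex` in `device_parsers` is tried in file order, `regex_flag: 'i'` switches that single pattern to case-insensitive matching, and the family falls back to "Other" when nothing matches. Substitution of `$1`…`$9` into the `*_replacement` strings, and brand/model extraction, are intentionally omitted to keep the sketch short; the bundled Python port implements the same loop in `_ParseDevice`.

```python
import re

def parse_device_family(user_agent_string, device_parsers):
    """Schematic sketch of the evaluation order only."""
    for entry in device_parsers:
        # regex_flag: 'i' marks this one pattern as case-insensitive.
        flags = re.IGNORECASE if entry.get("regex_flag") == "i" else 0
        match = re.search(entry["regex"], user_agent_string, flags)
        if match:
            # First matching regex wins; later entries are never evaluated.
            family = entry.get("device_replacement") or (
                match.group(1) if match.groups() else None
            )
            # Leading and trailing whitespace is trimmed from the result.
            return (family or "Other").strip()
    # No regex matched: the specification mandates the "Other" family.
    return "Other"
```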
157 | 158 | **Example:** 159 | 160 | For the User-Agent: `Mozilla/5.0 (Linux; U; Android 4.2.2; de-de; PEDI_PLUS_W Build/JDQ39) AppleWebKit/534.30 (KHTML, like Gecko) Version/4.0 Safari/534.30` 161 | the matching `regex`: 162 | 163 | ```yaml 164 | - regex: '; *(PEDI)_(PLUS)_(W) Build' 165 | device_replacement: 'Odys $1 $2 $3' 166 | brand_replacement: 'Odys' 167 | model_replacement: '$1 $2 $3' 168 | ``` 169 | 170 | resolves to: 171 | 172 | ``` 173 | family: 'Odys PEDI PLUS W' 174 | brand: 'Odys' 175 | model: 'PEDI PLUS W' 176 | ``` 177 | 178 | # Parser Output 179 | 180 | To allow interoperability with code that builds upon ua-parser, it is recommended to provide the parser output in a standardized way. The structure defined in [WebIDL](http://www.w3.org/TR/WebIDL/) may follow: 181 | 182 | ``` 183 | interface ua-parser-output { 184 | attribute string string; // The "user-agent" string 185 | object ua: { // The "user_agent_parsers" result 186 | attribute string family; 187 | attribute string major; 188 | attribute string minor; 189 | attribute string patch; 190 | }; 191 | object os: { // The "os_parsers" result 192 | attribute string family; 193 | attribute string major; 194 | attribute string minor; 195 | attribute string patch; 196 | attribute string patchMinor; 197 | }; 198 | object device: { // The "device_parsers" result 199 | attribute string family; 200 | attribute string brand; 201 | attribute string model; 202 | }; 203 | }; 204 | ``` 205 | -------------------------------------------------------------------------------- /TA-user-agents/bin/uap-core/package.json: -------------------------------------------------------------------------------- 1 | { 2 | "name": "uap-core", 3 | "description": "The regex file necessary to build language ports of Browserscope's user agent parser.", 4 | "version": "0.16.0", 5 | "maintainers": [ 6 | { 7 | "name": "Tobie Langel", 8 | "email": "tobie.langel@gmail.com", 9 | "web": "http://tobielangel.com" 10 | }, 11 | { 12 | "name": "Lindsey Simon", 13 | "email": "lsimon@commoner.com", 14 | "web": "http://www.idreamofuni.com" 15 | } 16 | ], 17 | "repository": { 18 | "type": "git", 19 | "url": "http://github.com/ua-parser/uap-core.git" 20 | }, 21 | "licenses": [ 22 | { 23 | "type": "Apache-2.0", 24 | "url": "https://raw.github.com/ua-parser/uap-core/master/LICENSE" 25 | } 26 | ], 27 | "devDependencies": { 28 | "mocha": "^9.2.2", 29 | "safe-regex": "^2.1.1", 30 | "uap-ref-impl": "git+https://github.com/ua-parser/uap-ref-impl#master", 31 | "yamlparser": "^0.0.2" 32 | }, 33 | "scripts": { 34 | "test": "mocha ./tests" 35 | }, 36 | "mocha": { 37 | "ui": "tdd", 38 | "reporter": "min", 39 | "check-leaks": true 40 | } 41 | } 42 | -------------------------------------------------------------------------------- /TA-user-agents/bin/uap-core/test_resources/additional_os_tests.yaml: -------------------------------------------------------------------------------- 1 | test_cases: 2 | 3 | - user_agent_string: 'Mozilla/5.0 (Linux; U; Android Donut; de-de; HTC Tattoo 1.52.161.1 Build/Donut) AppleWebKit/528.5+ (KHTML, like Gecko) Version/3.1.2 Mobile Safari/525.20.1' 4 | family: 'Android' 5 | major: '1' 6 | minor: '2' 7 | patch: 8 | patch_minor: 9 | 10 | - user_agent_string: 'Mozilla/5.0 (Linux; U; Android 2.1-update1; en-us; Nexus One Build/ERE27) AppleWebKit/530.17 (KHTML, like Gecko) Version/4.0 Mobile Safari/530.17,gzip(gfe),gzip(gfe)' 11 | family: 'Android' 12 | major: '2' 13 | minor: '1' 14 | patch: 'update1' 15 | patch_minor: 16 | 17 | - user_agent_string: 
'BlackBerry9000/4.6.0.167 Profile/MIDP-2.0 Configuration/CLDC-1.1 VendorID/102' 18 | family: 'BlackBerry OS' 19 | major: '4' 20 | minor: '6' 21 | patch: '0' 22 | patch_minor: '167' 23 | 24 | - user_agent_string: 'Mozilla/5.0 (BlackBerry; U; BlackBerry 9780; en) AppleWebKit/534.8+ (KHTML, like Gecko) Version/6.0.0.526 Mobile Safari/534.8+,gzip(gfe),gzip(gfe),gzip(gfe)' 25 | family: 'BlackBerry OS' 26 | major: '6' 27 | minor: '0' 28 | patch: '0' 29 | patch_minor: '526' 30 | 31 | - user_agent_string: 'Mozilla/5.0 (X11; 78; CentOS; US-en) AppleWebKit/527+ (KHTML, like Gecko) Bolt/0.862 Version/3.0 Safari/523.15' 32 | family: 'CentOS' 33 | major: 34 | minor: 35 | patch: 36 | patch_minor: 37 | 38 | - user_agent_string: 'Mozilla/5.0 (X11; CrOS i686 13.587.80) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.99 Safari/535.1' 39 | family: 'Chrome OS' 40 | major: '13' 41 | minor: '587' 42 | patch: '80' 43 | patch_minor: 44 | 45 | - user_agent_string: 'Mozilla/5.0 (X11; U; Linux x86_64; cs-CZ; rv:1.9.0.4) Gecko/2008111217 Fedora/3.0.4-1.fc9 Firefox/3.0.4' 46 | family: 'Fedora' 47 | major: '3' 48 | minor: '0' 49 | patch: '4' 50 | patch_minor: 51 | 52 | - user_agent_string: 'Mozilla/5.0 (X11; Linux i686) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.696.77 Large Screen Safari/534.24 GoogleTV/000000' 53 | family: 'GoogleTV' 54 | major: 55 | minor: 56 | patch: 57 | patch_minor: 58 | 59 | - user_agent_string: 'Mozilla/5.0 (X11; U: Linux i686; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.375.127 Large Screen Safari/533.4 GoogleTV/b39389' 60 | family: 'GoogleTV' 61 | major: 62 | minor: 63 | patch: 64 | patch_minor: 65 | 66 | - user_agent_string: 'Mozilla/5.0 (Linux; GoogleTV 4.0.4; LG Google TV Build/000000) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.696.77 Safari/534.24' 67 | family: 'GoogleTV' 68 | major: '4' 69 | minor: '0' 70 | patch: '4' 71 | patch_minor: 72 | 73 | - user_agent_string: 'Opera/9.80 (iPhone; Opera Mini/5.0.019802/21.572; U; en) Presto/2.5.25 Version/10.54' 74 | family: 'iOS' 75 | major: 76 | minor: 77 | patch: 78 | patch_minor: 79 | 80 | - user_agent_string: 'Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.2.8) Gecko/20100723 Linux Mint/9 (Isadora) Firefox/3.6.8' 81 | family: 'Linux Mint' 82 | major: '9' 83 | minor: 84 | patch: 85 | patch_minor: 86 | 87 | - user_agent_string: 'Mozilla/5.0 (Macintosh; U; Intel Mac OS X Mach-O; en-en; rv:1.9.0.12) Gecko/2009070609 Firefox/3.0.12,gzip(gfe),gzip(gfe)' 88 | family: 'Mac OS X' 89 | major: 90 | minor: 91 | patch: 92 | patch_minor: 93 | 94 | - user_agent_string: 'Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.1.2) Gecko/20090807 Mandriva Linux/1.9.1.2-1.1mud2009.1 (2009.1) Firefox/3.5.2 FirePHP/0.3,gzip(gfe),gzip(gfe)' 95 | family: 'Mandriva' 96 | major: '2009' 97 | minor: '1' 98 | patch: 99 | patch_minor: 100 | 101 | - user_agent_string: '' 102 | family: 'Other' 103 | major: 104 | minor: 105 | patch: 106 | patch_minor: 107 | 108 | - user_agent_string: 'Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.2.14) Gecko/20110301 PCLinuxOS/1.9.2.14-1pclos2011 (2011) Firefox/3.6.14,gzip(gfe),gzip(gfe),gzip(gfe)' 109 | family: 'PCLinuxOS' 110 | major: '1' 111 | minor: '9' 112 | patch: '2' 113 | patch_minor: '14' 114 | 115 | - user_agent_string: 'Mozilla/5.0 ( U; Linux x86_32; en-US; rv:1.0) Gecko/20090723 Puppy/3.6.8-0.1.1 Firefox/3.6.7,gzip(gfe),gzip(gfe)' 116 | family: 'Puppy' 117 | major: '3' 118 | minor: '6' 119 | patch: '8' 120 | patch_minor: 121 | 122 | - user_agent_string: 'Mozilla/5.0 (X11; U; Linux i686; en-US; 
rv:1.9.0.1) Gecko/2008072310 Red Hat/3.0.1-3.el4 Firefox/3.0.1,gzip(gfe),gzip(gfe)' 123 | family: 'Red Hat' 124 | major: '3' 125 | minor: '0' 126 | patch: '1' 127 | patch_minor: 128 | 129 | - user_agent_string: 'Opera/9.80 (X11; Linux x86_64; U; Slackware; lt) Presto/2.8.131 Version/11.11' 130 | family: 'Slackware' 131 | major: 132 | minor: 133 | patch: 134 | patch_minor: 135 | 136 | - user_agent_string: 'Opera/9.80 (S60; SymbOS; Opera Mobi/499; U; de) Presto/2.4.18 Version/10.00,gzip(gfe),gzip(gfe)' 137 | family: 'Symbian OS' 138 | major: 139 | minor: 140 | patch: 141 | patch_minor: 142 | 143 | - user_agent_string: 'Mozilla/5.0 (SymbianOS/9.2; U; Series60/3.1 NokiaN95/12.0.013; Profile/MIDP-2.0 Configuration/CLDC-1.1 ) AppleWebKit/413 (KHTML, like Gecko) Safari/413' 144 | family: 'Symbian OS' 145 | major: '9' 146 | minor: '2' 147 | patch: 148 | patch_minor: 149 | 150 | - user_agent_string: 'Mozilla/5.0 (Symbian/3; Series60/5.2 NokiaN8-00/010.022; Profile/MIDP-2.1 Configuration/CLDC-1.1 ) AppleWebKit/525 (KHTML, like Gecko) Version/3.0 BrowserNG/7.2.6.3 3gpp-gba,gzip(gfe),gzip(gfe)' 151 | family: 'Symbian^3' 152 | major: 153 | minor: 154 | patch: 155 | patch_minor: 156 | 157 | - user_agent_string: 'Mozilla/5.0 (compatible; MSIE 9.0; Windows Phone OS 7.5; Trident/5.0; IEMobile/9.0; SAMSUNG; SGH-i917)' 158 | family: 'Windows Phone' 159 | major: '7' 160 | minor: '5' 161 | patch: 162 | patch_minor: 163 | -------------------------------------------------------------------------------- /TA-user-agents/bin/uap-core/test_resources/transform-pgts_browser_list.pl: -------------------------------------------------------------------------------- 1 | # Transform http://www.pgts.com.au/download/data/browser_list.txt 2 | # to a YAML format suitable for our needs 3 | use strict; 4 | use utf8; 5 | use open qw(:std :utf8); 6 | binmode(STDOUT, ":utf8"); 7 | 8 | open(my $fh, '<', 'pgts_browser_list.txt'); 9 | 10 | print "# From http://www.pgts.com.au/download/data/browser_list.txt 11 | # via http://www.texsoft.it/index.php?m=sw.php.useragent 12 | test_cases:\n"; 13 | foreach my $line (<$fh>) { 14 | $line =~ /(.*)\t(.*)\t(.*)/; 15 | my $family = '"' . $1 . '"'; 16 | my ($v1, $v2, $v3) = split /\./, $2; 17 | 18 | # Some UAs have double quotes in them :-/ 19 | my $ua = $3; 20 | $ua =~ s/"/\\"/g; 21 | $ua = '"' . $ua . '"'; 22 | 23 | # Where version field is something like "Camino 0.8" 24 | my @special = qw(AOL Camino Chimera Epiphany Firebird K-Meleon MultiZilla Phoenix); 25 | foreach (@special) { 26 | $family = "'$_'", $v1 = $1 if ($v1 =~ /$_ (\d+)/); 27 | } 28 | 29 | # Unversioned Firefox 30 | $family = '"Firefox"', $v1 = '' if ($v1 =~ /Firefox ?/); 31 | # Mismarked Firefox version 32 | $v1 = '1' if ($family =~ /Firefox/ && $line =~ /Firefox\/1\.0.*$/); 33 | 34 | $family = '"MultiZilla"', $v1 = '' if ($v1 =~ /MultiZilla/); 35 | $family = '"IE"' if ($family eq '"MSIE"'); 36 | 37 | print " - user_agent_string: $ua 38 | family: $family 39 | major: " . ($v1 eq '' ? "'0'" : "'$v1'") . " 40 | minor: " . ($v2 eq '' ? '' : "'$v2'") . " 41 | patch: " . ($v3 eq '' ? '' : "'$v3'") . 
"\n"; 42 | } 43 | -------------------------------------------------------------------------------- /TA-user-agents/bin/uap-core/tests/regexes.js: -------------------------------------------------------------------------------- 1 | 'use strict' 2 | 3 | var assert = require('assert') 4 | var path = require('path') 5 | var fs = require('fs') 6 | var yaml = require('yamlparser') 7 | var regexes = readYAML('../regexes.yaml') 8 | var safe = require('safe-regex') 9 | var refImpl = require('uap-ref-impl') 10 | 11 | function readYAML (fileName) { 12 | var file = path.join(__dirname, fileName) 13 | var data = fs.readFileSync(file, 'utf8') 14 | return yaml.eval(data) 15 | } 16 | 17 | suite('regexes', function () { 18 | Object.keys(regexes).forEach(function (parser) { 19 | suite(parser, function () { 20 | regexes[parser].forEach(function(item) { 21 | test(item.regex, function () { 22 | assert.ok(safe(item.regex)) 23 | }) 24 | }) 25 | }) 26 | }) 27 | 28 | Object.keys(regexes).forEach(function (parser) { 29 | suite(`no reverse lookup in ${parser}`, function () { 30 | regexes[parser].forEach(function(item) { 31 | test(item.regex, function () { 32 | if (/\(\?<[!=]/.test(item.regex)) { 33 | assert.ok(false, 'go parser does not support regex lookbehind. See https://github.com/google/re2/wiki/Syntax') 34 | } 35 | if (/\(\?[!=]/.test(item.regex)) { 36 | assert.ok(false, 'go parser does not support regex lookahead. See https://github.com/google/re2/wiki/Syntax') 37 | } 38 | }) 39 | }) 40 | }) 41 | }) 42 | 43 | }) 44 | 45 | suite('redos', function(){ 46 | const parse = refImpl(regexes).parse 47 | 48 | const tests = [ 49 | '"a"+"a"*3500+"a"', 50 | '"SmartWatch"+" "*3500+"z"', 51 | '";A Build HuaweiA"+"4"*3500+"z"', 52 | '"HbbTV/0.0.0 (;LGE;"+" "*3500+"z"', 53 | '"HbbTV/0.0.0 (;CUS:;"+" "*3500+"z"', 54 | '"HbbTV/0.0.0 (;"+" "*3500+"z"', 55 | '"HbbTV/0.0.0 (;z;"+" "*3500+"z"', 56 | '"AppleWebKit/1."+"0"*10000+"!"', 57 | '"AppleWebKit/1.1"+" Safari"*10000+"!"', 58 | '"/0"+"0"*10000+"◎"', 59 | '"Mozilla"+"Mobile"*10000+"!"', 60 | '"Mozilla"+"Mobile"*10000+"!"', 61 | '"Mozilla"+"Mobile"*10000+"!"', 62 | '"Mozilla"+"Mobile"*10000+"!"', 63 | '"Mozilla"+"Mobile"*10000+"!"', 64 | '"Opera/"+"Opera Mobi"*10000+"!"', 65 | '"Opera/1."+"0"*10000+"!"', 66 | '"Mozilla"+"Android"*10000+"!"', 67 | '"Chrome/1.1."+"0"*10000+"!"', 68 | '"Chrome/1.1."+"0"*10000+"!"', 69 | '"MSIE 1."+"0"*10000+"!"', 70 | '"iPod"+"Version/1.1"*10000+"!"', 71 | '"iPod;ACPUAOS 0_0"+"0"*10000+"◎"', 72 | '"iPod;"+"CPU"*10000+"!"', 73 | '"iPod;"+"CPUOS 1_1"*10000+"!"', 74 | '"iPod;CPU"+"OS 1_1"*10000+"!"', 75 | '"Version/0.0"+"0"*10000+"◎\n"', 76 | '"HbbTV/0.0.0"+"0"*10000+"◎"', 77 | '"HbbTV/1.1.1 ("+";a;"*10000+"!"', 78 | '"HbbTV/1.1.1 ("+";a;2011"*10000+"!"', 79 | '"HbbTV/1.1.1 (;a;"+"2011"*10000+"!"', 80 | '"CPU OS 0.0"+"0"*10000+"◎"', 81 | '"ArcGIS.iOS-0.0"+"0"*10000+"◎"', 82 | '"x86_64 1.1."+"0"*10000+"!"', 83 | '"x86_64 1.1.1"+"Chrome"*10000+"!"', 84 | '" Darwin/91"+"0"*10000+"!"', 85 | '" Darwin/101"+"0"*10000+"!"', 86 | '" Darwin/111"+"0"*10000+"!"', 87 | '" Darwin/121"+"0"*10000+"!"', 88 | '" Darwin/131"+"0"*10000+"!"', 89 | '"iPhone"+"Mac OS X"*10000+"!"', 90 | '"CFNetwork/C Darwin/17.0"+"0"*10000+"◎"', 91 | '"CFNetwork/"+" Darwin/17.1"*10000+"!"', 92 | '"CFNetwork/"+" Darwin/16.1"*10000+"!"', 93 | '"CFNetwork/ Darwin/16."+"0"*10000+"!"', 94 | '"CFNetwork/8"+" Darwin/15.1"*10000+"!"', 95 | '"CFNetwork/8 Darwin/15."+"0"*10000+"!"', 96 | '"Linux 0.0"+"0"*10000+"◎"', 97 | '" PTST/0"+"0"*10000+"◎\n"', 98 | '"Mozilla"+"Mobile"*10000+"!"', 99 | '"; 
AIRIS "+" "*10000+"◎"', 100 | '";ASUS"+"_"*10000+"!"', 101 | '";Excite "+"0"*10000+"!"', 102 | '"; "+" "*10000+"◎"', 103 | '";"+"Coolpad_"*10000+"!"', 104 | '";TAC-"+"0"*10000+"!"', 105 | '"; "+" "*10000+"◎"', 106 | '"; Fly F30"+"0"*10000+"◎"', 107 | '"; FONE 0"+"0"*10000+"◎"', 108 | '"; "+" "*10000+"◎"', 109 | '"; "+" "*10000+"◎"', 110 | '"; "+" "*10000+"◎"', 111 | '"; Build HuaweiA0"+"0"*10000+"◎"', 112 | '"; "+" "*10000+"◎\n"', 113 | '"; Ideos "+" "*10000+"◎"', 114 | '"; NT-0"+"0"*10000+"◎"', 115 | '";ImPAD"+"0"*10000+"!"', 116 | '"; Intex AQUA "+" "*10000+"◎"', 117 | '"; IBuddy Connect "+" "*10000+"◎"', 118 | '"; I-Buddy "+" "*10000+"◎"', 119 | '"; Karbonn "+" "*10000+"◎"', 120 | '"; "+" "*10000+"◎\n"', 121 | '"; LAVA IRIS "+" "*10000+"◎"', 122 | '"; VS60"+"0"*10000+"◎"', 123 | '"; VS60 "+" "*10000+"◎"', 124 | '"; PhonePad 000"+"0"*10000+"◎"', 125 | '"; SmartPad 000"+"0"*10000+"◎"', 126 | '"; meizu_ "+" "*10000+"◎"', 127 | '"; Cynus F5 "+" "*10000+"◎"', 128 | '"; NXM0"+"0"*10000+"◎"', 129 | '";Nokia"+"_"*10000+"!"', 130 | '";IM-A111"+"0"*10000+"!"', 131 | '";Pantech"+"0"*10000+"!"', 132 | '"; Polaroid MIDC0000"+"0"*10000+"◎"', 133 | '"; POMP "+" "*10000+"◎"', 134 | '"; SAMSUNG Galaxy Note II "+" "*10000+"◎"', 135 | '"; SAMSUNG-"+"0"*10000+"!"', 136 | '"; SCH-0"+"0"*10000+"◎\n"', 137 | '"; SC-0"+"0"*10000+"◎"', 138 | '" SCH-0"+"0"*10000+"◎"', 139 | '"; Xperia X0"+"0"*10000+"◎"', 140 | '"; Sprint "+" "*10000+"◎"', 141 | '"; TM-MID0"+"0"*10000+"◎"', 142 | '";TM-SM"+"0"*10000+"!"', 143 | '"; Videocon "+" "*10000+"◎"', 144 | '";XOLO "+"tab"*10000+"!"', 145 | '"; PAD 70"+"0"*10000+"◎"', 146 | '";SmartTab"+"0"*10000+"!"', 147 | '"; v89_"+"0"*10000+"◎"', 148 | '"; e0000a_v0"+"0"*10000+"◎"', 149 | '"Windows Phone ; "+"IEMobile/"*10000+"!"', 150 | '"Windows Phone ; "+"IEMobile/"*10000+"!"', 151 | '"Windows Phone ; "+"IEMobile/"*10000+"!"', 152 | '"Windows Phone ; IEMobile/ ; ARM; Touch; HTC_blocked "+" "*10000+"◎"', 153 | '"Windows Phone ; "+"IEMobile/"*10000+"!"', 154 | '"Windows Phone ; "+"IEMobile/"*10000+"!"', 155 | '"Windows Phone ; "+"IEMobile/"*10000+"!"', 156 | '"Windows Phone ; "+"IEMobile/"*10000+"!"', 157 | '"Windows Phone ; "+"IEMobile/"*10000+"!"', 158 | '"Windows Phone ; "+"IEMobile/"*10000+"!"', 159 | '"Windows Phone ; "+"IEMobile/"*10000+"!"', 160 | '"Windows Phone ; "+"IEMobile/"*10000+"!"', 161 | '"Windows Phone ; "+"IEMobile/"*10000+"!"', 162 | '"SAMSUNG-"+"0"*10000+"!"', 163 | '"(Mobile; LYF/0/ ;"+";"*10000+"◎"', 164 | '"(Mobile; LYF/A/"+";0rv:"*10000+"!"', 165 | '"(Mobile; LYF/A/1;"+"rv:"*10000+"!"', 166 | '"(Mobile; Nokia_A_"+"; rv:"*10000+"!"', 167 | '"CFNetwork/"+" Darwin/1"*10000+"!"', 168 | '"CFNetwork/ Darwin/1"+"(Mac"*10000+"!"', 169 | '"Android 0.0-update1; AA-A ; "+" "*10000+"◎\n"', 170 | '"Android 0.0.0; AA-A- ; "+" "*10000+"◎"', 171 | '"Android 0.0.0; A- ; "+" "*10000+"◎"', 172 | '"Android 0.0.0; -AA; "+" "*10000+"◎\n"', 173 | '"Android 1; "+" Build"*10000+"!"', 174 | '"Android 1; "+" Build"*10000+"!"', 175 | '"foobar!/!./"+"/"*40000+"◎"', 176 | '" #- regex: \'(iPad;(Version/0.0.08"+"8"*40000+"◎"', 177 | '" #- regex: \'(iPad;"*8000', 178 | '" #- regex: \'HbbTV/0.0.0 (; Sony;H;"+";"*40000+"◎1\n_\n"', 179 | '" #- regex: \'HbbTV/1.1.1 (; Sony;"*600', 180 | '"Windows Phone ; IEMobile/ ; ARM; Touch; HTC_blockedd"+"d"*40000+"◎1\n_\n)"', 181 | '"Windows Phone "*8000', 182 | '"HTC/HTC "+" "*20000+"\n"', 183 | '"HTC/HTC "+" "*10000+"\n"', 184 | '""+"Google"*5000+"! _1SLQ_2"', 185 | '""+"MSIE 1.1;"*10000+"! _1SLQ_2"', 186 | '""+"[FB"*10000+"! 
_1SLQ_2"', 187 | '""+"[FB"*10000+"! _1SLQ_2"', 188 | '""+"[Pinterest/"*5000+"@1 _SLQ_2"', 189 | '""+"Mobile;"*5000+"! _1SLQ_2"', 190 | '""+"FirefoxTablet browser 1."*3000+"! _1SLQ_2"', 191 | '""+"Opera TabletVersion/1."*3000+"! _1SLQ_2"', 192 | '""+"Opera/9180Version/"*5000+"! _1SLQ_2"', 193 | '""+"Chrome/1 MMS/11"*5000+"! _1SLQ_2"', 194 | '""+"PLAYSTATION 3"*5000+"! _1SLQ_2"', 195 | '""+"AppleWebKit1 NX/1."*5000+"! _1SLQ_2"', 196 | '""+"Windows Phone Edge/1."*3000+"! _1SLQ_2"', 197 | '""+"amarok/"*10000+"@1 _SLQ_2"', 198 | '""+"iPod1GSA/1."*5000+"! _1SLQ_2"', 199 | '""+"iPod"*10000+"! _1SLQ_2"', 200 | '""+"PlayBook"*10000+"! _1SLQ_2"', 201 | '""+"Blackberry1Version/"*5000+"! _1SLQ_2"', 202 | '""+"AppleWebKit/1+ "*5000+"! _1SLQ_2"', 203 | '""+"HbbTV/1.1.1 (;Samsung;SmartTV0000;"*2000+"! _1SLQ_2"', 204 | '""+"HbbTV/1.1.1 (;Samsung;SmartTV0000;"*2000+"! _1SLQ_2"', 205 | '""+"HbbTV/1.1.1 (; Philips;"*3000+"! _1SLQ_2"', 206 | '""+"HbbTV/1.1.1 (; Philips;"*3000+"! _1SLQ_2"', 207 | '""+"HbbTV/1.1.1 (; Philips;"*3000+"! _1SLQ_2"', 208 | '""+"Symbian/3"*10000+"! _1SLQ_2"', 209 | '""+"Symbian/3"*10000+"! _1SLQ_2"', 210 | '""+"BB10;1Version/"*5000+"! _1SLQ_2"', 211 | '""+"BlackBerry"*5000+"! _1SLQ_2"', 212 | '""+"(Mobile;1Gecko/1810 Firefox/"*3000+"! _1SLQ_2"', 213 | '""+"(Mobile;1Gecko/1811 Firefox/"*3000+"! _1SLQ_2"', 214 | '""+"(Mobile;1Gecko/2610 Firefox/"*3000+"! _1SLQ_2"', 215 | '""+"(Mobile;1Gecko/2810 Firefox/"*3000+"! _1SLQ_2"', 216 | '""+"(Mobile;1Gecko/3010 Firefox/"*3000+"! _1SLQ_2"', 217 | '""+"(Mobile;1Gecko/3210 Firefox/"*3000+"! _1SLQ_2"', 218 | '""+"(Mobile;1Gecko/3410 Firefox/"*3000+"! _1SLQ_2"', 219 | '""+"(Mobile;1Firefox/1."*5000+"! _1SLQ_2"', 220 | '""+"DoCoMo"*10000+"! _1SLQ_2"', 221 | '""+"Mozilla"*10000+"! _1SLQ_2"', 222 | '""+"SmartWatch("*10000+"@1 _SLQ_2"', 223 | '""+"Android Application - Sony 1 "*3000+"! _1SLQ_2"', 224 | '""+"Android Application - HTC HTC 1 "*3000+"! _1SLQ_2"', 225 | '""+"Android Application - - 1 "*3000+"! _1SLQ_2"', 226 | '""+"Android 3"*10000+"! _1SLQ_2"', 227 | '""+";Vega"*10000+"! _1SLQ_2"', 228 | '""+";ALLVIEWSpeed"*5000+"! _1SLQ_2"', 229 | '""+";ARCHOSGAMEPAD"*5000+"! _1SLQ_2"', 230 | '""+";BlackBird I8"*5000+"! _1SLQ_2"', 231 | '""+";BlackBird "*5000+"! _1SLQ_2"', 232 | '""+";CatNova"*10000+"! _1SLQ_2"', 233 | '""+";P-1"*1000+"@1 _SLQ_2"', 234 | '""+";Explay_"*10000+"! _1SLQ_2"', 235 | '""+";IQ"*10000+"! _1SLQ_2"', 236 | '""+";Pixel"*10000+"! _1SLQ_2"', 237 | '""+";GSmart "*10000+"! _1SLQ_2"', 238 | '""+";imx51_"*10000+"! _1SLQ_2"', 239 | '""+";Haier "*10000+"! _1SLQ_2"', 240 | '""+"Build/HCL ME Tablet "*3000+"@1 _SLQ_2"', 241 | '""+";HP "*10000+"! _1SLQ_2"', 242 | '""+"HTC Streaming Player "*3000+"@1 _SLQ_2"', 243 | '""+";HYUNDAI T1"*5000+"! _1SLQ_2"', 244 | '""+";imobile "*10000+"! _1SLQ_2"', 245 | '""+";i-style"*10000+"! _1SLQ_2"', 246 | '""+";iOCEAN "*10000+"! _1SLQ_2"', 247 | '""+";NEC-"*10000+"! _1SLQ_2"', 248 | '""+";Pantech "*10000+"! _1SLQ_2"', 249 | '""+"Android 4."*5000+"! _1SLQ_2"', 250 | '""+";PLT0000"*10000+"! _1SLQ_2"', 251 | '""+";SAMSUNG "*10000+"@1 _SLQ_2"', 252 | '""+";GT-B1111"*10000+"@1 _SLQ_2"', 253 | '""+"; SAMSUNG-"*5000+"! _1SLQ_2"', 254 | '""+";SK-"*10000+"! _1SLQ_2"', 255 | '""+";ST1111"*10000+"! _1SLQ_2"', 256 | '""+";ST1111"*10000+"! _1SLQ_2"', 257 | '""+";Build/"*10000+"! _1SLQ_2"', 258 | '""+"TOOKY "*10000+"! _1SLQ_2"', 259 | '""+"TOUCHTAB"*10000+"! _1SLQ_2"', 260 | '""+"VERTU "*10000+"! _1SLQ_2"', 261 | '""+";GTablet"*10000+"! _1SLQ_2"', 262 | '""+"Vodafone "*10000+"! 
_1SLQ_2"', 263 | '""+"sprd-"*10000+"@1 _SLQ_2"', 264 | '""+"Windows Phone "*5000+"@1 _SLQ_2)"', 265 | '""+"Windows Phone "*5000+"@1 _SLQ_2)"', 266 | '""+"Windows Phone "*5000+"@1 _SLQ_2)"', 267 | '""+"Windows Phone "*5000+"@1 _SLQ_2)"', 268 | '""+"Windows Phone "*5000+"@1 _SLQ_2)"', 269 | '""+"Windows Phone "*5000+"@1 _SLQ_2)"', 270 | '""+"Windows Phone "*5000+"@1 _SLQ_2)"', 271 | '""+"Windows Phone "*5000+"@1 _SLQ_2)"', 272 | '""+"Windows Phone "*5000+"@1 _SLQ_2)"', 273 | '""+"Windows Phone "*5000+"@1 _SLQ_2)"', 274 | '""+"(Mobile; ALCATELOneTouch ; rv:"*3000+"@1 _SLQ_2"', 275 | '""+"(Mobile; ZTE ; rv:"*5000+"@1 _SLQ_2"', 276 | '""+"(Mobile; ALCATELA; rv:"*3000+"@1 _SLQ_2"', 277 | '""+"PlayBook"*10000+"! _1SLQ_2"', 278 | '""+"webOS"*10000+"! _1SLQ_2"', 279 | '""+"HPiPAQ"*10000+"!1 _SLQ_2"', 280 | '""+"HbbTV;CE-HTML;"*5000+"@1 _SLQ_2"', 281 | '""+"InettvBrowser/0.0 (;Sony"*3000+"@1 _SLQ_2"', 282 | '""+"InettvBrowser/0.0 ("*5000+"@1 _SLQ_2"', 283 | '""+"MSIE"*10000+"! _1SLQ_2"', 284 | '""+"SMART-TV; "*10000+"! _1SLQ_2"', 285 | '""+"SymbianOS/9.1"*5000+"! _1SLQ_2"', 286 | '""+"Android "*10000+"@1 _SLQ_2"', 287 | '""+"Android-1.1; AA-; WOWMobile "*3000+"! _1SLQ_2"', 288 | '""+"Android-1.1-update1; AA-;"*3000+"! _1SLQ_2"', 289 | '""+"Android-1.1;AA_;"*5000+"! _1SLQ_2"', 290 | '""+"Android-1.1;-;"*5000+"! _1SLQ_2"', 291 | '""+"Android 1; Build"*3000+"! _1SLQ_2"', 292 | '""+"Android 1; Build"*3000+"! _1SLQ_2"' 293 | ] 294 | 295 | function buildAttackString (attackString) { 296 | const uas = attackString 297 | .replace(/^"|"$/g, '') 298 | .replace(/"?\+"([^"]+)"\*(\d+)/, (m, str, i) => { 299 | // console.error({m, str, i}) 300 | const size = parseInt(i, 10) 301 | if (!isNaN(size)) { 302 | return (new Array(size).fill(str).join('')) 303 | } else { 304 | return m 305 | } 306 | }) 307 | .replace(/\+"/, '') 308 | return uas 309 | } 310 | 311 | tests.forEach(attackString => { 312 | test(attackString, function() { 313 | const ua = buildAttackString(attackString) 314 | const start = Date.now() 315 | const parsed = parse(ua) 316 | // console.log(parsed) 317 | const diff = Date.now() - start 318 | assert.ok(diff < 500) 319 | }) 320 | }) 321 | }) 322 | -------------------------------------------------------------------------------- /TA-user-agents/bin/uap-core/tests/sample.js: -------------------------------------------------------------------------------- 1 | 'use strict' 2 | 3 | var path = require('path') 4 | var fs = require('fs') 5 | var yaml = require('yamlparser') 6 | var refImpl = require('uap-ref-impl')(readYAML('../regexes.yaml')) 7 | 8 | function readYAML (fileName) { 9 | var file = path.join(__dirname, fileName) 10 | var data = fs.readFileSync(file, 'utf8') 11 | return yaml.eval(data) 12 | } 13 | 14 | if (require.main === module) { 15 | var user_agent_string = process.argv[2] 16 | console.log(refImpl.parse(user_agent_string)) 17 | } 18 | -------------------------------------------------------------------------------- /TA-user-agents/bin/uap-core/tests/test.js: -------------------------------------------------------------------------------- 1 | var assert = require('assert'), 2 | path = require('path'), 3 | fs = require('fs'), 4 | yaml = require('yamlparser'), 5 | refImpl = require('uap-ref-impl')(readYAML('../regexes.yaml')); 6 | 7 | function readYAML(fileName) { 8 | var file = path.join(__dirname, fileName); 9 | var data = fs.readFileSync(file, 'utf8'); 10 | return yaml.eval(data); 11 | } 12 | 13 | function msg(name, actual, expected) { 14 | return "Expected " + name + " to be " + 
JSON.stringify(expected) + " got " + JSON.stringify(actual) + " instead."; 15 | } 16 | 17 | ['../test_resources/firefox_user_agent_strings.yaml', '../tests/test_ua.yaml', '../test_resources/pgts_browser_list.yaml', 18 | '../test_resources/opera_mini_user_agent_strings.yaml','../test_resources/podcasting_user_agent_strings.yaml'].forEach(function(fileName) { 19 | var fixtures = readYAML(fileName).test_cases; 20 | suite(fileName, function() { 21 | fixtures.forEach(function(f) { 22 | test(f.user_agent_string, function() { 23 | var ua = refImpl.parse(f.user_agent_string).ua; 24 | fixFixture(f, ['major', 'minor', 'patch']); 25 | assert.strictEqual(ua.family, f.family, msg('ua.family', ua.family, f.family)); 26 | assert.strictEqual(ua.major, f.major, msg('ua.major', ua.major, f.major)); 27 | assert.strictEqual(ua.minor, f.minor, msg('ua.minor', ua.minor, f.minor)); 28 | assert.strictEqual(ua.patch, f.patch, msg('ua.patch', ua.patch, f.patch)); 29 | }); 30 | }); 31 | }); 32 | }); 33 | 34 | ['../tests/test_os.yaml', '../test_resources/additional_os_tests.yaml'].forEach(function(fileName) { 35 | var fixtures = readYAML(fileName).test_cases; 36 | suite(fileName, function() { 37 | fixtures.forEach(function(f) { 38 | test(f.user_agent_string, function() { 39 | var os = refImpl.parse(f.user_agent_string).os; 40 | fixFixture(f, ['major', 'minor', 'patch', 'patch_minor']); 41 | assert.strictEqual(os.family, f.family, msg('os.family', os.family, f.family)); 42 | assert.strictEqual(os.major, f.major, msg('os.major', os.major, f.major)); 43 | assert.strictEqual(os.minor, f.minor, msg('os.minor', os.minor, f.minor)); 44 | assert.strictEqual(os.patch, f.patch, msg('os.patch', os.patch, f.patch)); 45 | assert.strictEqual(os.patchMinor, f.patch_minor, msg('os.patchMinor', os.patchMinor, f.patch_minor)); 46 | }); 47 | }); 48 | }); 49 | }); 50 | 51 | ['../tests/test_device.yaml'].forEach(function(fileName) { 52 | var fixtures = readYAML(fileName).test_cases; 53 | suite(fileName, function() { 54 | fixtures.forEach(function(f) { 55 | test(f.user_agent_string, function() { 56 | var device = refImpl.parse(f.user_agent_string).device; 57 | fixFixture(f, ['family', 'brand', 'model']); 58 | assert.strictEqual(device.family, f.family, msg('device.family', device.family, f.family)); 59 | assert.strictEqual(device.brand, f.brand, msg('device.brand', device.brand, f.brand)); 60 | assert.strictEqual(device.model, f.model, msg('device.model', device.model, f.model)); 61 | }); 62 | }); 63 | }); 64 | }); 65 | 66 | function fixFixture(f, props) { 67 | // A bug in the YAML parser makes empty fixture props 68 | // return a vanila object. 
69 | props.forEach(function(p) { 70 | if (typeof f[p] === 'object') { 71 | f[p] = null; 72 | } 73 | }) 74 | return f; 75 | } 76 | -------------------------------------------------------------------------------- /TA-user-agents/bin/user_agents.py: -------------------------------------------------------------------------------- 1 | import csv 2 | import logging 3 | import os 4 | import sys 5 | from urllib.parse import unquote_plus 6 | from Utilities import KennyLoggins 7 | 8 | ROOT_DIR = os.path.abspath(os.path.dirname(__file__)) 9 | DATA_FILE = os.path.abspath(os.path.join(ROOT_DIR, 'uap-core', 'regexes.yaml')) 10 | os.environ['UA_PARSER_YAML'] = DATA_FILE 11 | 12 | from ua_parser import user_agent_parser 13 | 14 | # The following log levels are available: 15 | # Debug (NOISY!!!): 16 | # LOG_LEVEL = logging.DEBUG 17 | # Info: 18 | # LOG_LEVEL = logging.INFO 19 | # Error: 20 | LOG_LEVEL = logging.ERROR 21 | 22 | LOG_FILENAME = 'TA-user-agents' 23 | kl = KennyLoggins() 24 | logger = kl.get_logger(app_name="TA-user-agents", file_name=LOG_FILENAME, log_level=logging.INFO) 25 | 26 | # Main routine - basically it's the standard python recipe for handling 27 | # Splunk lookups 28 | # 29 | if __name__ == '__main__': 30 | r = csv.reader(sys.stdin) 31 | w = csv.writer(sys.stdout) 32 | have_header: bool = False 33 | 34 | header = [] 35 | idx = -1 36 | for row in r: 37 | if not have_header: 38 | header = row 39 | logger.debug('fields found: %s' % header) 40 | have_header = True 41 | z = 0 42 | for h in row: 43 | if h == "http_user_agent": 44 | idx = z 45 | z = z + 1 46 | w.writerow(row) 47 | continue 48 | 49 | # We only care about the cs_user_agent field - everything else is filled in 50 | http_user_agent = row[idx] 51 | useragent = unquote_plus(http_user_agent) 52 | logger.debug('found useragent %s' % http_user_agent) 53 | 54 | logger.debug('sending to ua-parser') 55 | results = [] 56 | try: 57 | results = user_agent_parser.Parse(http_user_agent) 58 | except Exception as err: 59 | logger.error(err) 60 | continue 61 | logger.debug('back from ua-parser') 62 | 63 | # create our results for Splunk 64 | # using the full results 65 | forSplunk = { 66 | 'ua_os_family': 'unknown', 67 | 'ua_os_major': 'unknown', 68 | 'ua_os_minor': 'unknown', 69 | 'ua_os_patch': 'unknown', 70 | 'ua_os_patch_minor': 'unknown', 71 | 'ua_family': 'unknown', 72 | 'ua_major': 'unknown', 73 | 'ua_minor': 'unknown', 74 | 'ua_patch': 'unknown', 75 | 'ua_device': 'unknown' 76 | } 77 | 78 | # UA 79 | if results['user_agent']['family'] is not None: 80 | if results['user_agent']['family'] != 'Other': 81 | forSplunk['ua_family'] = results['user_agent']['family'] 82 | if results['user_agent']['major'] is not None: 83 | forSplunk['ua_major'] = results['user_agent']['major'] 84 | if results['user_agent']['minor'] is not None: 85 | forSplunk['ua_minor'] = results['user_agent']['minor'] 86 | if results['user_agent']['patch'] is not None: 87 | forSplunk['ua_patch'] = results['user_agent']['patch'] 88 | # Device 89 | if results['device']['family'] is not None: 90 | if results['device']['family'] != 'Other': 91 | forSplunk['ua_device'] = results['device']['family'] 92 | # OS 93 | if results['os']['family'] is not None: 94 | if results['os']['family'] != 'Other': 95 | forSplunk['ua_os_family'] = results['os']['family'] 96 | if results['os']['major'] is not None: 97 | forSplunk['ua_os_major'] = results['os']['major'] 98 | if results['os']['minor'] is not None: 99 | forSplunk['ua_os_minor'] = results['os']['minor'] 100 | if results['os']['patch'] is 
not None: 101 | forSplunk['ua_os_patch'] = results['os']['patch'] 102 | if results['os']['patch_minor'] is not None: 103 | forSplunk['ua_os_patch_minor'] = results['os']['patch_minor'] 104 | 105 | logger.debug('for return: %s' % forSplunk) 106 | # Now write it out 107 | logger.debug('outputting') 108 | orow = [] 109 | for header_name in header: 110 | if header_name == "http_user_agent": 111 | orow.append(http_user_agent) 112 | else: 113 | orow.append(forSplunk[header_name]) 114 | w.writerow(orow) 115 | logger.debug('done output') 116 | -------------------------------------------------------------------------------- /TA-user-agents/bin/version.py: -------------------------------------------------------------------------------- 1 | __version__='TA-user-agents.1.7.7b21' 2 | -------------------------------------------------------------------------------- /TA-user-agents/bin/yaml/__init__.py: -------------------------------------------------------------------------------- 1 | 2 | from .error import * 3 | 4 | from .tokens import * 5 | from .events import * 6 | from .nodes import * 7 | 8 | from .loader import * 9 | from .dumper import * 10 | 11 | __version__ = '6.0' 12 | try: 13 | from .cyaml import * 14 | __with_libyaml__ = True 15 | except ImportError: 16 | __with_libyaml__ = False 17 | 18 | import io 19 | 20 | #------------------------------------------------------------------------------ 21 | # XXX "Warnings control" is now deprecated. Leaving in the API function to not 22 | # break code that uses it. 23 | #------------------------------------------------------------------------------ 24 | def warnings(settings=None): 25 | if settings is None: 26 | return {} 27 | 28 | #------------------------------------------------------------------------------ 29 | def scan(stream, Loader=Loader): 30 | """ 31 | Scan a YAML stream and produce scanning tokens. 32 | """ 33 | loader = Loader(stream) 34 | try: 35 | while loader.check_token(): 36 | yield loader.get_token() 37 | finally: 38 | loader.dispose() 39 | 40 | def parse(stream, Loader=Loader): 41 | """ 42 | Parse a YAML stream and produce parsing events. 43 | """ 44 | loader = Loader(stream) 45 | try: 46 | while loader.check_event(): 47 | yield loader.get_event() 48 | finally: 49 | loader.dispose() 50 | 51 | def compose(stream, Loader=Loader): 52 | """ 53 | Parse the first YAML document in a stream 54 | and produce the corresponding representation tree. 55 | """ 56 | loader = Loader(stream) 57 | try: 58 | return loader.get_single_node() 59 | finally: 60 | loader.dispose() 61 | 62 | def compose_all(stream, Loader=Loader): 63 | """ 64 | Parse all YAML documents in a stream 65 | and produce corresponding representation trees. 66 | """ 67 | loader = Loader(stream) 68 | try: 69 | while loader.check_node(): 70 | yield loader.get_node() 71 | finally: 72 | loader.dispose() 73 | 74 | def load(stream, Loader): 75 | """ 76 | Parse the first YAML document in a stream 77 | and produce the corresponding Python object. 78 | """ 79 | loader = Loader(stream) 80 | try: 81 | return loader.get_single_data() 82 | finally: 83 | loader.dispose() 84 | 85 | def load_all(stream, Loader): 86 | """ 87 | Parse all YAML documents in a stream 88 | and produce corresponding Python objects. 89 | """ 90 | loader = Loader(stream) 91 | try: 92 | while loader.check_data(): 93 | yield loader.get_data() 94 | finally: 95 | loader.dispose() 96 | 97 | def full_load(stream): 98 | """ 99 | Parse the first YAML document in a stream 100 | and produce the corresponding Python object. 
101 | 102 | Resolve all tags except those known to be 103 | unsafe on untrusted input. 104 | """ 105 | return load(stream, FullLoader) 106 | 107 | def full_load_all(stream): 108 | """ 109 | Parse all YAML documents in a stream 110 | and produce corresponding Python objects. 111 | 112 | Resolve all tags except those known to be 113 | unsafe on untrusted input. 114 | """ 115 | return load_all(stream, FullLoader) 116 | 117 | def safe_load(stream): 118 | """ 119 | Parse the first YAML document in a stream 120 | and produce the corresponding Python object. 121 | 122 | Resolve only basic YAML tags. This is known 123 | to be safe for untrusted input. 124 | """ 125 | return load(stream, SafeLoader) 126 | 127 | def safe_load_all(stream): 128 | """ 129 | Parse all YAML documents in a stream 130 | and produce corresponding Python objects. 131 | 132 | Resolve only basic YAML tags. This is known 133 | to be safe for untrusted input. 134 | """ 135 | return load_all(stream, SafeLoader) 136 | 137 | def unsafe_load(stream): 138 | """ 139 | Parse the first YAML document in a stream 140 | and produce the corresponding Python object. 141 | 142 | Resolve all tags, even those known to be 143 | unsafe on untrusted input. 144 | """ 145 | return load(stream, UnsafeLoader) 146 | 147 | def unsafe_load_all(stream): 148 | """ 149 | Parse all YAML documents in a stream 150 | and produce corresponding Python objects. 151 | 152 | Resolve all tags, even those known to be 153 | unsafe on untrusted input. 154 | """ 155 | return load_all(stream, UnsafeLoader) 156 | 157 | def emit(events, stream=None, Dumper=Dumper, 158 | canonical=None, indent=None, width=None, 159 | allow_unicode=None, line_break=None): 160 | """ 161 | Emit YAML parsing events into a stream. 162 | If stream is None, return the produced string instead. 163 | """ 164 | getvalue = None 165 | if stream is None: 166 | stream = io.StringIO() 167 | getvalue = stream.getvalue 168 | dumper = Dumper(stream, canonical=canonical, indent=indent, width=width, 169 | allow_unicode=allow_unicode, line_break=line_break) 170 | try: 171 | for event in events: 172 | dumper.emit(event) 173 | finally: 174 | dumper.dispose() 175 | if getvalue: 176 | return getvalue() 177 | 178 | def serialize_all(nodes, stream=None, Dumper=Dumper, 179 | canonical=None, indent=None, width=None, 180 | allow_unicode=None, line_break=None, 181 | encoding=None, explicit_start=None, explicit_end=None, 182 | version=None, tags=None): 183 | """ 184 | Serialize a sequence of representation trees into a YAML stream. 185 | If stream is None, return the produced string instead. 186 | """ 187 | getvalue = None 188 | if stream is None: 189 | if encoding is None: 190 | stream = io.StringIO() 191 | else: 192 | stream = io.BytesIO() 193 | getvalue = stream.getvalue 194 | dumper = Dumper(stream, canonical=canonical, indent=indent, width=width, 195 | allow_unicode=allow_unicode, line_break=line_break, 196 | encoding=encoding, version=version, tags=tags, 197 | explicit_start=explicit_start, explicit_end=explicit_end) 198 | try: 199 | dumper.open() 200 | for node in nodes: 201 | dumper.serialize(node) 202 | dumper.close() 203 | finally: 204 | dumper.dispose() 205 | if getvalue: 206 | return getvalue() 207 | 208 | def serialize(node, stream=None, Dumper=Dumper, **kwds): 209 | """ 210 | Serialize a representation tree into a YAML stream. 211 | If stream is None, return the produced string instead. 
212 | """ 213 | return serialize_all([node], stream, Dumper=Dumper, **kwds) 214 | 215 | def dump_all(documents, stream=None, Dumper=Dumper, 216 | default_style=None, default_flow_style=False, 217 | canonical=None, indent=None, width=None, 218 | allow_unicode=None, line_break=None, 219 | encoding=None, explicit_start=None, explicit_end=None, 220 | version=None, tags=None, sort_keys=True): 221 | """ 222 | Serialize a sequence of Python objects into a YAML stream. 223 | If stream is None, return the produced string instead. 224 | """ 225 | getvalue = None 226 | if stream is None: 227 | if encoding is None: 228 | stream = io.StringIO() 229 | else: 230 | stream = io.BytesIO() 231 | getvalue = stream.getvalue 232 | dumper = Dumper(stream, default_style=default_style, 233 | default_flow_style=default_flow_style, 234 | canonical=canonical, indent=indent, width=width, 235 | allow_unicode=allow_unicode, line_break=line_break, 236 | encoding=encoding, version=version, tags=tags, 237 | explicit_start=explicit_start, explicit_end=explicit_end, sort_keys=sort_keys) 238 | try: 239 | dumper.open() 240 | for data in documents: 241 | dumper.represent(data) 242 | dumper.close() 243 | finally: 244 | dumper.dispose() 245 | if getvalue: 246 | return getvalue() 247 | 248 | def dump(data, stream=None, Dumper=Dumper, **kwds): 249 | """ 250 | Serialize a Python object into a YAML stream. 251 | If stream is None, return the produced string instead. 252 | """ 253 | return dump_all([data], stream, Dumper=Dumper, **kwds) 254 | 255 | def safe_dump_all(documents, stream=None, **kwds): 256 | """ 257 | Serialize a sequence of Python objects into a YAML stream. 258 | Produce only basic YAML tags. 259 | If stream is None, return the produced string instead. 260 | """ 261 | return dump_all(documents, stream, Dumper=SafeDumper, **kwds) 262 | 263 | def safe_dump(data, stream=None, **kwds): 264 | """ 265 | Serialize a Python object into a YAML stream. 266 | Produce only basic YAML tags. 267 | If stream is None, return the produced string instead. 268 | """ 269 | return dump_all([data], stream, Dumper=SafeDumper, **kwds) 270 | 271 | def add_implicit_resolver(tag, regexp, first=None, 272 | Loader=None, Dumper=Dumper): 273 | """ 274 | Add an implicit scalar detector. 275 | If an implicit scalar value matches the given regexp, 276 | the corresponding tag is assigned to the scalar. 277 | first is a sequence of possible initial characters or None. 278 | """ 279 | if Loader is None: 280 | loader.Loader.add_implicit_resolver(tag, regexp, first) 281 | loader.FullLoader.add_implicit_resolver(tag, regexp, first) 282 | loader.UnsafeLoader.add_implicit_resolver(tag, regexp, first) 283 | else: 284 | Loader.add_implicit_resolver(tag, regexp, first) 285 | Dumper.add_implicit_resolver(tag, regexp, first) 286 | 287 | def add_path_resolver(tag, path, kind=None, Loader=None, Dumper=Dumper): 288 | """ 289 | Add a path based resolver for the given tag. 290 | A path is a list of keys that forms a path 291 | to a node in the representation tree. 292 | Keys can be string values, integers, or None. 293 | """ 294 | if Loader is None: 295 | loader.Loader.add_path_resolver(tag, path, kind) 296 | loader.FullLoader.add_path_resolver(tag, path, kind) 297 | loader.UnsafeLoader.add_path_resolver(tag, path, kind) 298 | else: 299 | Loader.add_path_resolver(tag, path, kind) 300 | Dumper.add_path_resolver(tag, path, kind) 301 | 302 | def add_constructor(tag, constructor, Loader=None): 303 | """ 304 | Add a constructor for the given tag. 
305 | Constructor is a function that accepts a Loader instance 306 | and a node object and produces the corresponding Python object. 307 | """ 308 | if Loader is None: 309 | loader.Loader.add_constructor(tag, constructor) 310 | loader.FullLoader.add_constructor(tag, constructor) 311 | loader.UnsafeLoader.add_constructor(tag, constructor) 312 | else: 313 | Loader.add_constructor(tag, constructor) 314 | 315 | def add_multi_constructor(tag_prefix, multi_constructor, Loader=None): 316 | """ 317 | Add a multi-constructor for the given tag prefix. 318 | Multi-constructor is called for a node if its tag starts with tag_prefix. 319 | Multi-constructor accepts a Loader instance, a tag suffix, 320 | and a node object and produces the corresponding Python object. 321 | """ 322 | if Loader is None: 323 | loader.Loader.add_multi_constructor(tag_prefix, multi_constructor) 324 | loader.FullLoader.add_multi_constructor(tag_prefix, multi_constructor) 325 | loader.UnsafeLoader.add_multi_constructor(tag_prefix, multi_constructor) 326 | else: 327 | Loader.add_multi_constructor(tag_prefix, multi_constructor) 328 | 329 | def add_representer(data_type, representer, Dumper=Dumper): 330 | """ 331 | Add a representer for the given type. 332 | Representer is a function accepting a Dumper instance 333 | and an instance of the given data type 334 | and producing the corresponding representation node. 335 | """ 336 | Dumper.add_representer(data_type, representer) 337 | 338 | def add_multi_representer(data_type, multi_representer, Dumper=Dumper): 339 | """ 340 | Add a representer for the given type. 341 | Multi-representer is a function accepting a Dumper instance 342 | and an instance of the given data type or subtype 343 | and producing the corresponding representation node. 344 | """ 345 | Dumper.add_multi_representer(data_type, multi_representer) 346 | 347 | class YAMLObjectMetaclass(type): 348 | """ 349 | The metaclass for YAMLObject. 350 | """ 351 | def __init__(cls, name, bases, kwds): 352 | super(YAMLObjectMetaclass, cls).__init__(name, bases, kwds) 353 | if 'yaml_tag' in kwds and kwds['yaml_tag'] is not None: 354 | if isinstance(cls.yaml_loader, list): 355 | for loader in cls.yaml_loader: 356 | loader.add_constructor(cls.yaml_tag, cls.from_yaml) 357 | else: 358 | cls.yaml_loader.add_constructor(cls.yaml_tag, cls.from_yaml) 359 | 360 | cls.yaml_dumper.add_representer(cls, cls.to_yaml) 361 | 362 | class YAMLObject(metaclass=YAMLObjectMetaclass): 363 | """ 364 | An object that can dump itself to a YAML stream 365 | and load itself from a YAML stream. 366 | """ 367 | 368 | __slots__ = () # no direct instantiation, so allow immutable subclasses 369 | 370 | yaml_loader = [Loader, FullLoader, UnsafeLoader] 371 | yaml_dumper = Dumper 372 | 373 | yaml_tag = None 374 | yaml_flow_style = None 375 | 376 | @classmethod 377 | def from_yaml(cls, loader, node): 378 | """ 379 | Convert a representation node to a Python object. 380 | """ 381 | return loader.construct_yaml_object(node, cls) 382 | 383 | @classmethod 384 | def to_yaml(cls, dumper, data): 385 | """ 386 | Convert a Python object to a representation node. 
387 | """ 388 | return dumper.represent_yaml_object(cls.yaml_tag, data, cls, 389 | flow_style=cls.yaml_flow_style) 390 | 391 | -------------------------------------------------------------------------------- /TA-user-agents/bin/yaml/composer.py: -------------------------------------------------------------------------------- 1 | 2 | __all__ = ['Composer', 'ComposerError'] 3 | 4 | from .error import MarkedYAMLError 5 | from .events import * 6 | from .nodes import * 7 | 8 | class ComposerError(MarkedYAMLError): 9 | pass 10 | 11 | class Composer: 12 | 13 | def __init__(self): 14 | self.anchors = {} 15 | 16 | def check_node(self): 17 | # Drop the STREAM-START event. 18 | if self.check_event(StreamStartEvent): 19 | self.get_event() 20 | 21 | # If there are more documents available? 22 | return not self.check_event(StreamEndEvent) 23 | 24 | def get_node(self): 25 | # Get the root node of the next document. 26 | if not self.check_event(StreamEndEvent): 27 | return self.compose_document() 28 | 29 | def get_single_node(self): 30 | # Drop the STREAM-START event. 31 | self.get_event() 32 | 33 | # Compose a document if the stream is not empty. 34 | document = None 35 | if not self.check_event(StreamEndEvent): 36 | document = self.compose_document() 37 | 38 | # Ensure that the stream contains no more documents. 39 | if not self.check_event(StreamEndEvent): 40 | event = self.get_event() 41 | raise ComposerError("expected a single document in the stream", 42 | document.start_mark, "but found another document", 43 | event.start_mark) 44 | 45 | # Drop the STREAM-END event. 46 | self.get_event() 47 | 48 | return document 49 | 50 | def compose_document(self): 51 | # Drop the DOCUMENT-START event. 52 | self.get_event() 53 | 54 | # Compose the root node. 55 | node = self.compose_node(None, None) 56 | 57 | # Drop the DOCUMENT-END event. 
58 | self.get_event() 59 | 60 | self.anchors = {} 61 | return node 62 | 63 | def compose_node(self, parent, index): 64 | if self.check_event(AliasEvent): 65 | event = self.get_event() 66 | anchor = event.anchor 67 | if anchor not in self.anchors: 68 | raise ComposerError(None, None, "found undefined alias %r" 69 | % anchor, event.start_mark) 70 | return self.anchors[anchor] 71 | event = self.peek_event() 72 | anchor = event.anchor 73 | if anchor is not None: 74 | if anchor in self.anchors: 75 | raise ComposerError("found duplicate anchor %r; first occurrence" 76 | % anchor, self.anchors[anchor].start_mark, 77 | "second occurrence", event.start_mark) 78 | self.descend_resolver(parent, index) 79 | if self.check_event(ScalarEvent): 80 | node = self.compose_scalar_node(anchor) 81 | elif self.check_event(SequenceStartEvent): 82 | node = self.compose_sequence_node(anchor) 83 | elif self.check_event(MappingStartEvent): 84 | node = self.compose_mapping_node(anchor) 85 | self.ascend_resolver() 86 | return node 87 | 88 | def compose_scalar_node(self, anchor): 89 | event = self.get_event() 90 | tag = event.tag 91 | if tag is None or tag == '!': 92 | tag = self.resolve(ScalarNode, event.value, event.implicit) 93 | node = ScalarNode(tag, event.value, 94 | event.start_mark, event.end_mark, style=event.style) 95 | if anchor is not None: 96 | self.anchors[anchor] = node 97 | return node 98 | 99 | def compose_sequence_node(self, anchor): 100 | start_event = self.get_event() 101 | tag = start_event.tag 102 | if tag is None or tag == '!': 103 | tag = self.resolve(SequenceNode, None, start_event.implicit) 104 | node = SequenceNode(tag, [], 105 | start_event.start_mark, None, 106 | flow_style=start_event.flow_style) 107 | if anchor is not None: 108 | self.anchors[anchor] = node 109 | index = 0 110 | while not self.check_event(SequenceEndEvent): 111 | node.value.append(self.compose_node(node, index)) 112 | index += 1 113 | end_event = self.get_event() 114 | node.end_mark = end_event.end_mark 115 | return node 116 | 117 | def compose_mapping_node(self, anchor): 118 | start_event = self.get_event() 119 | tag = start_event.tag 120 | if tag is None or tag == '!': 121 | tag = self.resolve(MappingNode, None, start_event.implicit) 122 | node = MappingNode(tag, [], 123 | start_event.start_mark, None, 124 | flow_style=start_event.flow_style) 125 | if anchor is not None: 126 | self.anchors[anchor] = node 127 | while not self.check_event(MappingEndEvent): 128 | #key_event = self.peek_event() 129 | item_key = self.compose_node(node, None) 130 | #if item_key in node.value: 131 | # raise ComposerError("while composing a mapping", start_event.start_mark, 132 | # "found duplicate key", key_event.start_mark) 133 | item_value = self.compose_node(node, item_key) 134 | #node.value[item_key] = item_value 135 | node.value.append((item_key, item_value)) 136 | end_event = self.get_event() 137 | node.end_mark = end_event.end_mark 138 | return node 139 | 140 | -------------------------------------------------------------------------------- /TA-user-agents/bin/yaml/cyaml.py: -------------------------------------------------------------------------------- 1 | 2 | __all__ = [ 3 | 'CBaseLoader', 'CSafeLoader', 'CFullLoader', 'CUnsafeLoader', 'CLoader', 4 | 'CBaseDumper', 'CSafeDumper', 'CDumper' 5 | ] 6 | 7 | from yaml._yaml import CParser, CEmitter 8 | 9 | from .constructor import * 10 | 11 | from .serializer import * 12 | from .representer import * 13 | 14 | from .resolver import * 15 | 16 | class CBaseLoader(CParser, BaseConstructor, 
BaseResolver): 17 | 18 | def __init__(self, stream): 19 | CParser.__init__(self, stream) 20 | BaseConstructor.__init__(self) 21 | BaseResolver.__init__(self) 22 | 23 | class CSafeLoader(CParser, SafeConstructor, Resolver): 24 | 25 | def __init__(self, stream): 26 | CParser.__init__(self, stream) 27 | SafeConstructor.__init__(self) 28 | Resolver.__init__(self) 29 | 30 | class CFullLoader(CParser, FullConstructor, Resolver): 31 | 32 | def __init__(self, stream): 33 | CParser.__init__(self, stream) 34 | FullConstructor.__init__(self) 35 | Resolver.__init__(self) 36 | 37 | class CUnsafeLoader(CParser, UnsafeConstructor, Resolver): 38 | 39 | def __init__(self, stream): 40 | CParser.__init__(self, stream) 41 | UnsafeConstructor.__init__(self) 42 | Resolver.__init__(self) 43 | 44 | class CLoader(CParser, Constructor, Resolver): 45 | 46 | def __init__(self, stream): 47 | CParser.__init__(self, stream) 48 | Constructor.__init__(self) 49 | Resolver.__init__(self) 50 | 51 | class CBaseDumper(CEmitter, BaseRepresenter, BaseResolver): 52 | 53 | def __init__(self, stream, 54 | default_style=None, default_flow_style=False, 55 | canonical=None, indent=None, width=None, 56 | allow_unicode=None, line_break=None, 57 | encoding=None, explicit_start=None, explicit_end=None, 58 | version=None, tags=None, sort_keys=True): 59 | CEmitter.__init__(self, stream, canonical=canonical, 60 | indent=indent, width=width, encoding=encoding, 61 | allow_unicode=allow_unicode, line_break=line_break, 62 | explicit_start=explicit_start, explicit_end=explicit_end, 63 | version=version, tags=tags) 64 | Representer.__init__(self, default_style=default_style, 65 | default_flow_style=default_flow_style, sort_keys=sort_keys) 66 | Resolver.__init__(self) 67 | 68 | class CSafeDumper(CEmitter, SafeRepresenter, Resolver): 69 | 70 | def __init__(self, stream, 71 | default_style=None, default_flow_style=False, 72 | canonical=None, indent=None, width=None, 73 | allow_unicode=None, line_break=None, 74 | encoding=None, explicit_start=None, explicit_end=None, 75 | version=None, tags=None, sort_keys=True): 76 | CEmitter.__init__(self, stream, canonical=canonical, 77 | indent=indent, width=width, encoding=encoding, 78 | allow_unicode=allow_unicode, line_break=line_break, 79 | explicit_start=explicit_start, explicit_end=explicit_end, 80 | version=version, tags=tags) 81 | SafeRepresenter.__init__(self, default_style=default_style, 82 | default_flow_style=default_flow_style, sort_keys=sort_keys) 83 | Resolver.__init__(self) 84 | 85 | class CDumper(CEmitter, Serializer, Representer, Resolver): 86 | 87 | def __init__(self, stream, 88 | default_style=None, default_flow_style=False, 89 | canonical=None, indent=None, width=None, 90 | allow_unicode=None, line_break=None, 91 | encoding=None, explicit_start=None, explicit_end=None, 92 | version=None, tags=None, sort_keys=True): 93 | CEmitter.__init__(self, stream, canonical=canonical, 94 | indent=indent, width=width, encoding=encoding, 95 | allow_unicode=allow_unicode, line_break=line_break, 96 | explicit_start=explicit_start, explicit_end=explicit_end, 97 | version=version, tags=tags) 98 | Representer.__init__(self, default_style=default_style, 99 | default_flow_style=default_flow_style, sort_keys=sort_keys) 100 | Resolver.__init__(self) 101 | 102 | -------------------------------------------------------------------------------- /TA-user-agents/bin/yaml/dumper.py: -------------------------------------------------------------------------------- 1 | 2 | __all__ = ['BaseDumper', 'SafeDumper', 'Dumper'] 3 | 4 
| from .emitter import * 5 | from .serializer import * 6 | from .representer import * 7 | from .resolver import * 8 | 9 | class BaseDumper(Emitter, Serializer, BaseRepresenter, BaseResolver): 10 | 11 | def __init__(self, stream, 12 | default_style=None, default_flow_style=False, 13 | canonical=None, indent=None, width=None, 14 | allow_unicode=None, line_break=None, 15 | encoding=None, explicit_start=None, explicit_end=None, 16 | version=None, tags=None, sort_keys=True): 17 | Emitter.__init__(self, stream, canonical=canonical, 18 | indent=indent, width=width, 19 | allow_unicode=allow_unicode, line_break=line_break) 20 | Serializer.__init__(self, encoding=encoding, 21 | explicit_start=explicit_start, explicit_end=explicit_end, 22 | version=version, tags=tags) 23 | Representer.__init__(self, default_style=default_style, 24 | default_flow_style=default_flow_style, sort_keys=sort_keys) 25 | Resolver.__init__(self) 26 | 27 | class SafeDumper(Emitter, Serializer, SafeRepresenter, Resolver): 28 | 29 | def __init__(self, stream, 30 | default_style=None, default_flow_style=False, 31 | canonical=None, indent=None, width=None, 32 | allow_unicode=None, line_break=None, 33 | encoding=None, explicit_start=None, explicit_end=None, 34 | version=None, tags=None, sort_keys=True): 35 | Emitter.__init__(self, stream, canonical=canonical, 36 | indent=indent, width=width, 37 | allow_unicode=allow_unicode, line_break=line_break) 38 | Serializer.__init__(self, encoding=encoding, 39 | explicit_start=explicit_start, explicit_end=explicit_end, 40 | version=version, tags=tags) 41 | SafeRepresenter.__init__(self, default_style=default_style, 42 | default_flow_style=default_flow_style, sort_keys=sort_keys) 43 | Resolver.__init__(self) 44 | 45 | class Dumper(Emitter, Serializer, Representer, Resolver): 46 | 47 | def __init__(self, stream, 48 | default_style=None, default_flow_style=False, 49 | canonical=None, indent=None, width=None, 50 | allow_unicode=None, line_break=None, 51 | encoding=None, explicit_start=None, explicit_end=None, 52 | version=None, tags=None, sort_keys=True): 53 | Emitter.__init__(self, stream, canonical=canonical, 54 | indent=indent, width=width, 55 | allow_unicode=allow_unicode, line_break=line_break) 56 | Serializer.__init__(self, encoding=encoding, 57 | explicit_start=explicit_start, explicit_end=explicit_end, 58 | version=version, tags=tags) 59 | Representer.__init__(self, default_style=default_style, 60 | default_flow_style=default_flow_style, sort_keys=sort_keys) 61 | Resolver.__init__(self) 62 | 63 | -------------------------------------------------------------------------------- /TA-user-agents/bin/yaml/error.py: -------------------------------------------------------------------------------- 1 | 2 | __all__ = ['Mark', 'YAMLError', 'MarkedYAMLError'] 3 | 4 | class Mark: 5 | 6 | def __init__(self, name, index, line, column, buffer, pointer): 7 | self.name = name 8 | self.index = index 9 | self.line = line 10 | self.column = column 11 | self.buffer = buffer 12 | self.pointer = pointer 13 | 14 | def get_snippet(self, indent=4, max_length=75): 15 | if self.buffer is None: 16 | return None 17 | head = '' 18 | start = self.pointer 19 | while start > 0 and self.buffer[start-1] not in '\0\r\n\x85\u2028\u2029': 20 | start -= 1 21 | if self.pointer-start > max_length/2-1: 22 | head = ' ... 
' 23 | start += 5 24 | break 25 | tail = '' 26 | end = self.pointer 27 | while end < len(self.buffer) and self.buffer[end] not in '\0\r\n\x85\u2028\u2029': 28 | end += 1 29 | if end-self.pointer > max_length/2-1: 30 | tail = ' ... ' 31 | end -= 5 32 | break 33 | snippet = self.buffer[start:end] 34 | return ' '*indent + head + snippet + tail + '\n' \ 35 | + ' '*(indent+self.pointer-start+len(head)) + '^' 36 | 37 | def __str__(self): 38 | snippet = self.get_snippet() 39 | where = " in \"%s\", line %d, column %d" \ 40 | % (self.name, self.line+1, self.column+1) 41 | if snippet is not None: 42 | where += ":\n"+snippet 43 | return where 44 | 45 | class YAMLError(Exception): 46 | pass 47 | 48 | class MarkedYAMLError(YAMLError): 49 | 50 | def __init__(self, context=None, context_mark=None, 51 | problem=None, problem_mark=None, note=None): 52 | self.context = context 53 | self.context_mark = context_mark 54 | self.problem = problem 55 | self.problem_mark = problem_mark 56 | self.note = note 57 | 58 | def __str__(self): 59 | lines = [] 60 | if self.context is not None: 61 | lines.append(self.context) 62 | if self.context_mark is not None \ 63 | and (self.problem is None or self.problem_mark is None 64 | or self.context_mark.name != self.problem_mark.name 65 | or self.context_mark.line != self.problem_mark.line 66 | or self.context_mark.column != self.problem_mark.column): 67 | lines.append(str(self.context_mark)) 68 | if self.problem is not None: 69 | lines.append(self.problem) 70 | if self.problem_mark is not None: 71 | lines.append(str(self.problem_mark)) 72 | if self.note is not None: 73 | lines.append(self.note) 74 | return '\n'.join(lines) 75 | 76 | -------------------------------------------------------------------------------- /TA-user-agents/bin/yaml/events.py: -------------------------------------------------------------------------------- 1 | 2 | # Abstract classes. 3 | 4 | class Event(object): 5 | def __init__(self, start_mark=None, end_mark=None): 6 | self.start_mark = start_mark 7 | self.end_mark = end_mark 8 | def __repr__(self): 9 | attributes = [key for key in ['anchor', 'tag', 'implicit', 'value'] 10 | if hasattr(self, key)] 11 | arguments = ', '.join(['%s=%r' % (key, getattr(self, key)) 12 | for key in attributes]) 13 | return '%s(%s)' % (self.__class__.__name__, arguments) 14 | 15 | class NodeEvent(Event): 16 | def __init__(self, anchor, start_mark=None, end_mark=None): 17 | self.anchor = anchor 18 | self.start_mark = start_mark 19 | self.end_mark = end_mark 20 | 21 | class CollectionStartEvent(NodeEvent): 22 | def __init__(self, anchor, tag, implicit, start_mark=None, end_mark=None, 23 | flow_style=None): 24 | self.anchor = anchor 25 | self.tag = tag 26 | self.implicit = implicit 27 | self.start_mark = start_mark 28 | self.end_mark = end_mark 29 | self.flow_style = flow_style 30 | 31 | class CollectionEndEvent(Event): 32 | pass 33 | 34 | # Implementations. 
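# Editor's note (illustrative only, not part of the upstream PyYAML module): the event
# classes below are what yaml.parse() yields. As a sketch, assuming this vendored copy
# is importable as `yaml`, parsing the document "ua_family: Firefox" produces, in order:
#
#     StreamStartEvent, DocumentStartEvent, MappingStartEvent,
#     ScalarEvent(value='ua_family'), ScalarEvent(value='Firefox'),
#     MappingEndEvent, DocumentEndEvent, StreamEndEvent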
35 | 36 | class StreamStartEvent(Event): 37 | def __init__(self, start_mark=None, end_mark=None, encoding=None): 38 | self.start_mark = start_mark 39 | self.end_mark = end_mark 40 | self.encoding = encoding 41 | 42 | class StreamEndEvent(Event): 43 | pass 44 | 45 | class DocumentStartEvent(Event): 46 | def __init__(self, start_mark=None, end_mark=None, 47 | explicit=None, version=None, tags=None): 48 | self.start_mark = start_mark 49 | self.end_mark = end_mark 50 | self.explicit = explicit 51 | self.version = version 52 | self.tags = tags 53 | 54 | class DocumentEndEvent(Event): 55 | def __init__(self, start_mark=None, end_mark=None, 56 | explicit=None): 57 | self.start_mark = start_mark 58 | self.end_mark = end_mark 59 | self.explicit = explicit 60 | 61 | class AliasEvent(NodeEvent): 62 | pass 63 | 64 | class ScalarEvent(NodeEvent): 65 | def __init__(self, anchor, tag, implicit, value, 66 | start_mark=None, end_mark=None, style=None): 67 | self.anchor = anchor 68 | self.tag = tag 69 | self.implicit = implicit 70 | self.value = value 71 | self.start_mark = start_mark 72 | self.end_mark = end_mark 73 | self.style = style 74 | 75 | class SequenceStartEvent(CollectionStartEvent): 76 | pass 77 | 78 | class SequenceEndEvent(CollectionEndEvent): 79 | pass 80 | 81 | class MappingStartEvent(CollectionStartEvent): 82 | pass 83 | 84 | class MappingEndEvent(CollectionEndEvent): 85 | pass 86 | 87 | -------------------------------------------------------------------------------- /TA-user-agents/bin/yaml/loader.py: -------------------------------------------------------------------------------- 1 | 2 | __all__ = ['BaseLoader', 'FullLoader', 'SafeLoader', 'Loader', 'UnsafeLoader'] 3 | 4 | from .reader import * 5 | from .scanner import * 6 | from .parser import * 7 | from .composer import * 8 | from .constructor import * 9 | from .resolver import * 10 | 11 | class BaseLoader(Reader, Scanner, Parser, Composer, BaseConstructor, BaseResolver): 12 | 13 | def __init__(self, stream): 14 | Reader.__init__(self, stream) 15 | Scanner.__init__(self) 16 | Parser.__init__(self) 17 | Composer.__init__(self) 18 | BaseConstructor.__init__(self) 19 | BaseResolver.__init__(self) 20 | 21 | class FullLoader(Reader, Scanner, Parser, Composer, FullConstructor, Resolver): 22 | 23 | def __init__(self, stream): 24 | Reader.__init__(self, stream) 25 | Scanner.__init__(self) 26 | Parser.__init__(self) 27 | Composer.__init__(self) 28 | FullConstructor.__init__(self) 29 | Resolver.__init__(self) 30 | 31 | class SafeLoader(Reader, Scanner, Parser, Composer, SafeConstructor, Resolver): 32 | 33 | def __init__(self, stream): 34 | Reader.__init__(self, stream) 35 | Scanner.__init__(self) 36 | Parser.__init__(self) 37 | Composer.__init__(self) 38 | SafeConstructor.__init__(self) 39 | Resolver.__init__(self) 40 | 41 | class Loader(Reader, Scanner, Parser, Composer, Constructor, Resolver): 42 | 43 | def __init__(self, stream): 44 | Reader.__init__(self, stream) 45 | Scanner.__init__(self) 46 | Parser.__init__(self) 47 | Composer.__init__(self) 48 | Constructor.__init__(self) 49 | Resolver.__init__(self) 50 | 51 | # UnsafeLoader is the same as Loader (which is and was always unsafe on 52 | # untrusted input). Use of either Loader or UnsafeLoader should be rare, since 53 | # FullLoad should be able to load almost all YAML safely. Loader is left intact 54 | # to ensure backwards compatibility. 
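# Editor's note (illustrative only, not part of the upstream PyYAML module): in practice
# the guidance above means callers should prefer SafeLoader or FullLoader, e.g.
#
#     import yaml
#     data = yaml.load(stream, Loader=yaml.SafeLoader)   # or simply yaml.safe_load(stream)
#
# Note that in this 6.0 copy yaml.load() requires an explicit Loader argument.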
55 | class UnsafeLoader(Reader, Scanner, Parser, Composer, Constructor, Resolver): 56 | 57 | def __init__(self, stream): 58 | Reader.__init__(self, stream) 59 | Scanner.__init__(self) 60 | Parser.__init__(self) 61 | Composer.__init__(self) 62 | Constructor.__init__(self) 63 | Resolver.__init__(self) 64 | -------------------------------------------------------------------------------- /TA-user-agents/bin/yaml/nodes.py: -------------------------------------------------------------------------------- 1 | 2 | class Node(object): 3 | def __init__(self, tag, value, start_mark, end_mark): 4 | self.tag = tag 5 | self.value = value 6 | self.start_mark = start_mark 7 | self.end_mark = end_mark 8 | def __repr__(self): 9 | value = self.value 10 | #if isinstance(value, list): 11 | # if len(value) == 0: 12 | # value = '' 13 | # elif len(value) == 1: 14 | # value = '<1 item>' 15 | # else: 16 | # value = '<%d items>' % len(value) 17 | #else: 18 | # if len(value) > 75: 19 | # value = repr(value[:70]+u' ... ') 20 | # else: 21 | # value = repr(value) 22 | value = repr(value) 23 | return '%s(tag=%r, value=%s)' % (self.__class__.__name__, self.tag, value) 24 | 25 | class ScalarNode(Node): 26 | id = 'scalar' 27 | def __init__(self, tag, value, 28 | start_mark=None, end_mark=None, style=None): 29 | self.tag = tag 30 | self.value = value 31 | self.start_mark = start_mark 32 | self.end_mark = end_mark 33 | self.style = style 34 | 35 | class CollectionNode(Node): 36 | def __init__(self, tag, value, 37 | start_mark=None, end_mark=None, flow_style=None): 38 | self.tag = tag 39 | self.value = value 40 | self.start_mark = start_mark 41 | self.end_mark = end_mark 42 | self.flow_style = flow_style 43 | 44 | class SequenceNode(CollectionNode): 45 | id = 'sequence' 46 | 47 | class MappingNode(CollectionNode): 48 | id = 'mapping' 49 | 50 | -------------------------------------------------------------------------------- /TA-user-agents/bin/yaml/reader.py: -------------------------------------------------------------------------------- 1 | # This module contains abstractions for the input stream. You don't have to 2 | # looks further, there are no pretty code. 3 | # 4 | # We define two classes here. 5 | # 6 | # Mark(source, line, column) 7 | # It's just a record and its only use is producing nice error messages. 8 | # Parser does not use it for any other purposes. 9 | # 10 | # Reader(source, data) 11 | # Reader determines the encoding of `data` and converts it to unicode. 12 | # Reader provides the following methods and attributes: 13 | # reader.peek(length=1) - return the next `length` characters 14 | # reader.forward(length=1) - move the current position to `length` characters. 15 | # reader.index - the number of the current character. 16 | # reader.line, stream.column - the line and the column of the current character. 
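# Editor's note (illustrative only, not part of the upstream PyYAML module): a minimal
# sketch of the Reader interface described above:
#
#     r = Reader("ua: Mozilla/5.0\n")
#     r.peek()        # 'u'
#     r.prefix(3)     # 'ua:'
#     r.forward(3)    # now r.index == 3, r.line == 0, r.column == 3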
17 | 18 | __all__ = ['Reader', 'ReaderError'] 19 | 20 | from .error import YAMLError, Mark 21 | 22 | import codecs, re 23 | 24 | class ReaderError(YAMLError): 25 | 26 | def __init__(self, name, position, character, encoding, reason): 27 | self.name = name 28 | self.character = character 29 | self.position = position 30 | self.encoding = encoding 31 | self.reason = reason 32 | 33 | def __str__(self): 34 | if isinstance(self.character, bytes): 35 | return "'%s' codec can't decode byte #x%02x: %s\n" \ 36 | " in \"%s\", position %d" \ 37 | % (self.encoding, ord(self.character), self.reason, 38 | self.name, self.position) 39 | else: 40 | return "unacceptable character #x%04x: %s\n" \ 41 | " in \"%s\", position %d" \ 42 | % (self.character, self.reason, 43 | self.name, self.position) 44 | 45 | class Reader(object): 46 | # Reader: 47 | # - determines the data encoding and converts it to a unicode string, 48 | # - checks if characters are in allowed range, 49 | # - adds '\0' to the end. 50 | 51 | # Reader accepts 52 | # - a `bytes` object, 53 | # - a `str` object, 54 | # - a file-like object with its `read` method returning `str`, 55 | # - a file-like object with its `read` method returning `unicode`. 56 | 57 | # Yeah, it's ugly and slow. 58 | 59 | def __init__(self, stream): 60 | self.name = None 61 | self.stream = None 62 | self.stream_pointer = 0 63 | self.eof = True 64 | self.buffer = '' 65 | self.pointer = 0 66 | self.raw_buffer = None 67 | self.raw_decode = None 68 | self.encoding = None 69 | self.index = 0 70 | self.line = 0 71 | self.column = 0 72 | if isinstance(stream, str): 73 | self.name = "" 74 | self.check_printable(stream) 75 | self.buffer = stream+'\0' 76 | elif isinstance(stream, bytes): 77 | self.name = "" 78 | self.raw_buffer = stream 79 | self.determine_encoding() 80 | else: 81 | self.stream = stream 82 | self.name = getattr(stream, 'name', "") 83 | self.eof = False 84 | self.raw_buffer = None 85 | self.determine_encoding() 86 | 87 | def peek(self, index=0): 88 | try: 89 | return self.buffer[self.pointer+index] 90 | except IndexError: 91 | self.update(index+1) 92 | return self.buffer[self.pointer+index] 93 | 94 | def prefix(self, length=1): 95 | if self.pointer+length >= len(self.buffer): 96 | self.update(length) 97 | return self.buffer[self.pointer:self.pointer+length] 98 | 99 | def forward(self, length=1): 100 | if self.pointer+length+1 >= len(self.buffer): 101 | self.update(length+1) 102 | while length: 103 | ch = self.buffer[self.pointer] 104 | self.pointer += 1 105 | self.index += 1 106 | if ch in '\n\x85\u2028\u2029' \ 107 | or (ch == '\r' and self.buffer[self.pointer] != '\n'): 108 | self.line += 1 109 | self.column = 0 110 | elif ch != '\uFEFF': 111 | self.column += 1 112 | length -= 1 113 | 114 | def get_mark(self): 115 | if self.stream is None: 116 | return Mark(self.name, self.index, self.line, self.column, 117 | self.buffer, self.pointer) 118 | else: 119 | return Mark(self.name, self.index, self.line, self.column, 120 | None, None) 121 | 122 | def determine_encoding(self): 123 | while not self.eof and (self.raw_buffer is None or len(self.raw_buffer) < 2): 124 | self.update_raw() 125 | if isinstance(self.raw_buffer, bytes): 126 | if self.raw_buffer.startswith(codecs.BOM_UTF16_LE): 127 | self.raw_decode = codecs.utf_16_le_decode 128 | self.encoding = 'utf-16-le' 129 | elif self.raw_buffer.startswith(codecs.BOM_UTF16_BE): 130 | self.raw_decode = codecs.utf_16_be_decode 131 | self.encoding = 'utf-16-be' 132 | else: 133 | self.raw_decode = codecs.utf_8_decode 134 | 
self.encoding = 'utf-8' 135 | self.update(1) 136 | 137 | NON_PRINTABLE = re.compile('[^\x09\x0A\x0D\x20-\x7E\x85\xA0-\uD7FF\uE000-\uFFFD\U00010000-\U0010ffff]') 138 | def check_printable(self, data): 139 | match = self.NON_PRINTABLE.search(data) 140 | if match: 141 | character = match.group() 142 | position = self.index+(len(self.buffer)-self.pointer)+match.start() 143 | raise ReaderError(self.name, position, ord(character), 144 | 'unicode', "special characters are not allowed") 145 | 146 | def update(self, length): 147 | if self.raw_buffer is None: 148 | return 149 | self.buffer = self.buffer[self.pointer:] 150 | self.pointer = 0 151 | while len(self.buffer) < length: 152 | if not self.eof: 153 | self.update_raw() 154 | if self.raw_decode is not None: 155 | try: 156 | data, converted = self.raw_decode(self.raw_buffer, 157 | 'strict', self.eof) 158 | except UnicodeDecodeError as exc: 159 | character = self.raw_buffer[exc.start] 160 | if self.stream is not None: 161 | position = self.stream_pointer-len(self.raw_buffer)+exc.start 162 | else: 163 | position = exc.start 164 | raise ReaderError(self.name, position, character, 165 | exc.encoding, exc.reason) 166 | else: 167 | data = self.raw_buffer 168 | converted = len(data) 169 | self.check_printable(data) 170 | self.buffer += data 171 | self.raw_buffer = self.raw_buffer[converted:] 172 | if self.eof: 173 | self.buffer += '\0' 174 | self.raw_buffer = None 175 | break 176 | 177 | def update_raw(self, size=4096): 178 | data = self.stream.read(size) 179 | if self.raw_buffer is None: 180 | self.raw_buffer = data 181 | else: 182 | self.raw_buffer += data 183 | self.stream_pointer += len(data) 184 | if not data: 185 | self.eof = True 186 | -------------------------------------------------------------------------------- /TA-user-agents/bin/yaml/representer.py: -------------------------------------------------------------------------------- 1 | 2 | __all__ = ['BaseRepresenter', 'SafeRepresenter', 'Representer', 3 | 'RepresenterError'] 4 | 5 | from .error import * 6 | from .nodes import * 7 | 8 | import datetime, copyreg, types, base64, collections 9 | 10 | class RepresenterError(YAMLError): 11 | pass 12 | 13 | class BaseRepresenter: 14 | 15 | yaml_representers = {} 16 | yaml_multi_representers = {} 17 | 18 | def __init__(self, default_style=None, default_flow_style=False, sort_keys=True): 19 | self.default_style = default_style 20 | self.sort_keys = sort_keys 21 | self.default_flow_style = default_flow_style 22 | self.represented_objects = {} 23 | self.object_keeper = [] 24 | self.alias_key = None 25 | 26 | def represent(self, data): 27 | node = self.represent_data(data) 28 | self.serialize(node) 29 | self.represented_objects = {} 30 | self.object_keeper = [] 31 | self.alias_key = None 32 | 33 | def represent_data(self, data): 34 | if self.ignore_aliases(data): 35 | self.alias_key = None 36 | else: 37 | self.alias_key = id(data) 38 | if self.alias_key is not None: 39 | if self.alias_key in self.represented_objects: 40 | node = self.represented_objects[self.alias_key] 41 | #if node is None: 42 | # raise RepresenterError("recursive objects are not allowed: %r" % data) 43 | return node 44 | #self.represented_objects[alias_key] = None 45 | self.object_keeper.append(data) 46 | data_types = type(data).__mro__ 47 | if data_types[0] in self.yaml_representers: 48 | node = self.yaml_representers[data_types[0]](self, data) 49 | else: 50 | for data_type in data_types: 51 | if data_type in self.yaml_multi_representers: 52 | node = 
self.yaml_multi_representers[data_type](self, data) 53 | break 54 | else: 55 | if None in self.yaml_multi_representers: 56 | node = self.yaml_multi_representers[None](self, data) 57 | elif None in self.yaml_representers: 58 | node = self.yaml_representers[None](self, data) 59 | else: 60 | node = ScalarNode(None, str(data)) 61 | #if alias_key is not None: 62 | # self.represented_objects[alias_key] = node 63 | return node 64 | 65 | @classmethod 66 | def add_representer(cls, data_type, representer): 67 | if not 'yaml_representers' in cls.__dict__: 68 | cls.yaml_representers = cls.yaml_representers.copy() 69 | cls.yaml_representers[data_type] = representer 70 | 71 | @classmethod 72 | def add_multi_representer(cls, data_type, representer): 73 | if not 'yaml_multi_representers' in cls.__dict__: 74 | cls.yaml_multi_representers = cls.yaml_multi_representers.copy() 75 | cls.yaml_multi_representers[data_type] = representer 76 | 77 | def represent_scalar(self, tag, value, style=None): 78 | if style is None: 79 | style = self.default_style 80 | node = ScalarNode(tag, value, style=style) 81 | if self.alias_key is not None: 82 | self.represented_objects[self.alias_key] = node 83 | return node 84 | 85 | def represent_sequence(self, tag, sequence, flow_style=None): 86 | value = [] 87 | node = SequenceNode(tag, value, flow_style=flow_style) 88 | if self.alias_key is not None: 89 | self.represented_objects[self.alias_key] = node 90 | best_style = True 91 | for item in sequence: 92 | node_item = self.represent_data(item) 93 | if not (isinstance(node_item, ScalarNode) and not node_item.style): 94 | best_style = False 95 | value.append(node_item) 96 | if flow_style is None: 97 | if self.default_flow_style is not None: 98 | node.flow_style = self.default_flow_style 99 | else: 100 | node.flow_style = best_style 101 | return node 102 | 103 | def represent_mapping(self, tag, mapping, flow_style=None): 104 | value = [] 105 | node = MappingNode(tag, value, flow_style=flow_style) 106 | if self.alias_key is not None: 107 | self.represented_objects[self.alias_key] = node 108 | best_style = True 109 | if hasattr(mapping, 'items'): 110 | mapping = list(mapping.items()) 111 | if self.sort_keys: 112 | try: 113 | mapping = sorted(mapping) 114 | except TypeError: 115 | pass 116 | for item_key, item_value in mapping: 117 | node_key = self.represent_data(item_key) 118 | node_value = self.represent_data(item_value) 119 | if not (isinstance(node_key, ScalarNode) and not node_key.style): 120 | best_style = False 121 | if not (isinstance(node_value, ScalarNode) and not node_value.style): 122 | best_style = False 123 | value.append((node_key, node_value)) 124 | if flow_style is None: 125 | if self.default_flow_style is not None: 126 | node.flow_style = self.default_flow_style 127 | else: 128 | node.flow_style = best_style 129 | return node 130 | 131 | def ignore_aliases(self, data): 132 | return False 133 | 134 | class SafeRepresenter(BaseRepresenter): 135 | 136 | def ignore_aliases(self, data): 137 | if data is None: 138 | return True 139 | if isinstance(data, tuple) and data == (): 140 | return True 141 | if isinstance(data, (str, bytes, bool, int, float)): 142 | return True 143 | 144 | def represent_none(self, data): 145 | return self.represent_scalar('tag:yaml.org,2002:null', 'null') 146 | 147 | def represent_str(self, data): 148 | return self.represent_scalar('tag:yaml.org,2002:str', data) 149 | 150 | def represent_binary(self, data): 151 | if hasattr(base64, 'encodebytes'): 152 | data = 
base64.encodebytes(data).decode('ascii') 153 | else: 154 | data = base64.encodestring(data).decode('ascii') 155 | return self.represent_scalar('tag:yaml.org,2002:binary', data, style='|') 156 | 157 | def represent_bool(self, data): 158 | if data: 159 | value = 'true' 160 | else: 161 | value = 'false' 162 | return self.represent_scalar('tag:yaml.org,2002:bool', value) 163 | 164 | def represent_int(self, data): 165 | return self.represent_scalar('tag:yaml.org,2002:int', str(data)) 166 | 167 | inf_value = 1e300 168 | while repr(inf_value) != repr(inf_value*inf_value): 169 | inf_value *= inf_value 170 | 171 | def represent_float(self, data): 172 | if data != data or (data == 0.0 and data == 1.0): 173 | value = '.nan' 174 | elif data == self.inf_value: 175 | value = '.inf' 176 | elif data == -self.inf_value: 177 | value = '-.inf' 178 | else: 179 | value = repr(data).lower() 180 | # Note that in some cases `repr(data)` represents a float number 181 | # without the decimal parts. For instance: 182 | # >>> repr(1e17) 183 | # '1e17' 184 | # Unfortunately, this is not a valid float representation according 185 | # to the definition of the `!!float` tag. We fix this by adding 186 | # '.0' before the 'e' symbol. 187 | if '.' not in value and 'e' in value: 188 | value = value.replace('e', '.0e', 1) 189 | return self.represent_scalar('tag:yaml.org,2002:float', value) 190 | 191 | def represent_list(self, data): 192 | #pairs = (len(data) > 0 and isinstance(data, list)) 193 | #if pairs: 194 | # for item in data: 195 | # if not isinstance(item, tuple) or len(item) != 2: 196 | # pairs = False 197 | # break 198 | #if not pairs: 199 | return self.represent_sequence('tag:yaml.org,2002:seq', data) 200 | #value = [] 201 | #for item_key, item_value in data: 202 | # value.append(self.represent_mapping(u'tag:yaml.org,2002:map', 203 | # [(item_key, item_value)])) 204 | #return SequenceNode(u'tag:yaml.org,2002:pairs', value) 205 | 206 | def represent_dict(self, data): 207 | return self.represent_mapping('tag:yaml.org,2002:map', data) 208 | 209 | def represent_set(self, data): 210 | value = {} 211 | for key in data: 212 | value[key] = None 213 | return self.represent_mapping('tag:yaml.org,2002:set', value) 214 | 215 | def represent_date(self, data): 216 | value = data.isoformat() 217 | return self.represent_scalar('tag:yaml.org,2002:timestamp', value) 218 | 219 | def represent_datetime(self, data): 220 | value = data.isoformat(' ') 221 | return self.represent_scalar('tag:yaml.org,2002:timestamp', value) 222 | 223 | def represent_yaml_object(self, tag, data, cls, flow_style=None): 224 | if hasattr(data, '__getstate__'): 225 | state = data.__getstate__() 226 | else: 227 | state = data.__dict__.copy() 228 | return self.represent_mapping(tag, state, flow_style=flow_style) 229 | 230 | def represent_undefined(self, data): 231 | raise RepresenterError("cannot represent an object", data) 232 | 233 | SafeRepresenter.add_representer(type(None), 234 | SafeRepresenter.represent_none) 235 | 236 | SafeRepresenter.add_representer(str, 237 | SafeRepresenter.represent_str) 238 | 239 | SafeRepresenter.add_representer(bytes, 240 | SafeRepresenter.represent_binary) 241 | 242 | SafeRepresenter.add_representer(bool, 243 | SafeRepresenter.represent_bool) 244 | 245 | SafeRepresenter.add_representer(int, 246 | SafeRepresenter.represent_int) 247 | 248 | SafeRepresenter.add_representer(float, 249 | SafeRepresenter.represent_float) 250 | 251 | SafeRepresenter.add_representer(list, 252 | SafeRepresenter.represent_list) 253 | 254 | 
SafeRepresenter.add_representer(tuple, 255 | SafeRepresenter.represent_list) 256 | 257 | SafeRepresenter.add_representer(dict, 258 | SafeRepresenter.represent_dict) 259 | 260 | SafeRepresenter.add_representer(set, 261 | SafeRepresenter.represent_set) 262 | 263 | SafeRepresenter.add_representer(datetime.date, 264 | SafeRepresenter.represent_date) 265 | 266 | SafeRepresenter.add_representer(datetime.datetime, 267 | SafeRepresenter.represent_datetime) 268 | 269 | SafeRepresenter.add_representer(None, 270 | SafeRepresenter.represent_undefined) 271 | 272 | class Representer(SafeRepresenter): 273 | 274 | def represent_complex(self, data): 275 | if data.imag == 0.0: 276 | data = '%r' % data.real 277 | elif data.real == 0.0: 278 | data = '%rj' % data.imag 279 | elif data.imag > 0: 280 | data = '%r+%rj' % (data.real, data.imag) 281 | else: 282 | data = '%r%rj' % (data.real, data.imag) 283 | return self.represent_scalar('tag:yaml.org,2002:python/complex', data) 284 | 285 | def represent_tuple(self, data): 286 | return self.represent_sequence('tag:yaml.org,2002:python/tuple', data) 287 | 288 | def represent_name(self, data): 289 | name = '%s.%s' % (data.__module__, data.__name__) 290 | return self.represent_scalar('tag:yaml.org,2002:python/name:'+name, '') 291 | 292 | def represent_module(self, data): 293 | return self.represent_scalar( 294 | 'tag:yaml.org,2002:python/module:'+data.__name__, '') 295 | 296 | def represent_object(self, data): 297 | # We use __reduce__ API to save the data. data.__reduce__ returns 298 | # a tuple of length 2-5: 299 | # (function, args, state, listitems, dictitems) 300 | 301 | # For reconstructing, we calls function(*args), then set its state, 302 | # listitems, and dictitems if they are not None. 303 | 304 | # A special case is when function.__name__ == '__newobj__'. In this 305 | # case we create the object with args[0].__new__(*args). 306 | 307 | # Another special case is when __reduce__ returns a string - we don't 308 | # support it. 309 | 310 | # We produce a !!python/object, !!python/object/new or 311 | # !!python/object/apply node. 
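# Editor's note (illustrative only, not part of the upstream PyYAML module): for a plain
# instance whose state is just its __dict__, the node produced here typically renders as
#
#     !!python/object:some_module.SomeClass {family: Firefox, major: 115}
#
# where some_module.SomeClass is a hypothetical class; this is why python/object tags
# must only ever be loaded from trusted input.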
312 | 313 | cls = type(data) 314 | if cls in copyreg.dispatch_table: 315 | reduce = copyreg.dispatch_table[cls](data) 316 | elif hasattr(data, '__reduce_ex__'): 317 | reduce = data.__reduce_ex__(2) 318 | elif hasattr(data, '__reduce__'): 319 | reduce = data.__reduce__() 320 | else: 321 | raise RepresenterError("cannot represent an object", data) 322 | reduce = (list(reduce)+[None]*5)[:5] 323 | function, args, state, listitems, dictitems = reduce 324 | args = list(args) 325 | if state is None: 326 | state = {} 327 | if listitems is not None: 328 | listitems = list(listitems) 329 | if dictitems is not None: 330 | dictitems = dict(dictitems) 331 | if function.__name__ == '__newobj__': 332 | function = args[0] 333 | args = args[1:] 334 | tag = 'tag:yaml.org,2002:python/object/new:' 335 | newobj = True 336 | else: 337 | tag = 'tag:yaml.org,2002:python/object/apply:' 338 | newobj = False 339 | function_name = '%s.%s' % (function.__module__, function.__name__) 340 | if not args and not listitems and not dictitems \ 341 | and isinstance(state, dict) and newobj: 342 | return self.represent_mapping( 343 | 'tag:yaml.org,2002:python/object:'+function_name, state) 344 | if not listitems and not dictitems \ 345 | and isinstance(state, dict) and not state: 346 | return self.represent_sequence(tag+function_name, args) 347 | value = {} 348 | if args: 349 | value['args'] = args 350 | if state or not isinstance(state, dict): 351 | value['state'] = state 352 | if listitems: 353 | value['listitems'] = listitems 354 | if dictitems: 355 | value['dictitems'] = dictitems 356 | return self.represent_mapping(tag+function_name, value) 357 | 358 | def represent_ordered_dict(self, data): 359 | # Provide uniform representation across different Python versions. 360 | data_type = type(data) 361 | tag = 'tag:yaml.org,2002:python/object/apply:%s.%s' \ 362 | % (data_type.__module__, data_type.__name__) 363 | items = [[key, value] for key, value in data.items()] 364 | return self.represent_sequence(tag, [items]) 365 | 366 | Representer.add_representer(complex, 367 | Representer.represent_complex) 368 | 369 | Representer.add_representer(tuple, 370 | Representer.represent_tuple) 371 | 372 | Representer.add_multi_representer(type, 373 | Representer.represent_name) 374 | 375 | Representer.add_representer(collections.OrderedDict, 376 | Representer.represent_ordered_dict) 377 | 378 | Representer.add_representer(types.FunctionType, 379 | Representer.represent_name) 380 | 381 | Representer.add_representer(types.BuiltinFunctionType, 382 | Representer.represent_name) 383 | 384 | Representer.add_representer(types.ModuleType, 385 | Representer.represent_module) 386 | 387 | Representer.add_multi_representer(object, 388 | Representer.represent_object) 389 | 390 | -------------------------------------------------------------------------------- /TA-user-agents/bin/yaml/resolver.py: -------------------------------------------------------------------------------- 1 | 2 | __all__ = ['BaseResolver', 'Resolver'] 3 | 4 | from .error import * 5 | from .nodes import * 6 | 7 | import re 8 | 9 | class ResolverError(YAMLError): 10 | pass 11 | 12 | class BaseResolver: 13 | 14 | DEFAULT_SCALAR_TAG = 'tag:yaml.org,2002:str' 15 | DEFAULT_SEQUENCE_TAG = 'tag:yaml.org,2002:seq' 16 | DEFAULT_MAPPING_TAG = 'tag:yaml.org,2002:map' 17 | 18 | yaml_implicit_resolvers = {} 19 | yaml_path_resolvers = {} 20 | 21 | def __init__(self): 22 | self.resolver_exact_paths = [] 23 | self.resolver_prefix_paths = [] 24 | 25 | @classmethod 26 | def add_implicit_resolver(cls, 
tag, regexp, first): 27 | if not 'yaml_implicit_resolvers' in cls.__dict__: 28 | implicit_resolvers = {} 29 | for key in cls.yaml_implicit_resolvers: 30 | implicit_resolvers[key] = cls.yaml_implicit_resolvers[key][:] 31 | cls.yaml_implicit_resolvers = implicit_resolvers 32 | if first is None: 33 | first = [None] 34 | for ch in first: 35 | cls.yaml_implicit_resolvers.setdefault(ch, []).append((tag, regexp)) 36 | 37 | @classmethod 38 | def add_path_resolver(cls, tag, path, kind=None): 39 | # Note: `add_path_resolver` is experimental. The API could be changed. 40 | # `new_path` is a pattern that is matched against the path from the 41 | # root to the node that is being considered. `node_path` elements are 42 | # tuples `(node_check, index_check)`. `node_check` is a node class: 43 | # `ScalarNode`, `SequenceNode`, `MappingNode` or `None`. `None` 44 | # matches any kind of a node. `index_check` could be `None`, a boolean 45 | # value, a string value, or a number. `None` and `False` match against 46 | # any _value_ of sequence and mapping nodes. `True` matches against 47 | # any _key_ of a mapping node. A string `index_check` matches against 48 | # a mapping value that corresponds to a scalar key which content is 49 | # equal to the `index_check` value. An integer `index_check` matches 50 | # against a sequence value with the index equal to `index_check`. 51 | if not 'yaml_path_resolvers' in cls.__dict__: 52 | cls.yaml_path_resolvers = cls.yaml_path_resolvers.copy() 53 | new_path = [] 54 | for element in path: 55 | if isinstance(element, (list, tuple)): 56 | if len(element) == 2: 57 | node_check, index_check = element 58 | elif len(element) == 1: 59 | node_check = element[0] 60 | index_check = True 61 | else: 62 | raise ResolverError("Invalid path element: %s" % element) 63 | else: 64 | node_check = None 65 | index_check = element 66 | if node_check is str: 67 | node_check = ScalarNode 68 | elif node_check is list: 69 | node_check = SequenceNode 70 | elif node_check is dict: 71 | node_check = MappingNode 72 | elif node_check not in [ScalarNode, SequenceNode, MappingNode] \ 73 | and not isinstance(node_check, str) \ 74 | and node_check is not None: 75 | raise ResolverError("Invalid node checker: %s" % node_check) 76 | if not isinstance(index_check, (str, int)) \ 77 | and index_check is not None: 78 | raise ResolverError("Invalid index checker: %s" % index_check) 79 | new_path.append((node_check, index_check)) 80 | if kind is str: 81 | kind = ScalarNode 82 | elif kind is list: 83 | kind = SequenceNode 84 | elif kind is dict: 85 | kind = MappingNode 86 | elif kind not in [ScalarNode, SequenceNode, MappingNode] \ 87 | and kind is not None: 88 | raise ResolverError("Invalid node kind: %s" % kind) 89 | cls.yaml_path_resolvers[tuple(new_path), kind] = tag 90 | 91 | def descend_resolver(self, current_node, current_index): 92 | if not self.yaml_path_resolvers: 93 | return 94 | exact_paths = {} 95 | prefix_paths = [] 96 | if current_node: 97 | depth = len(self.resolver_prefix_paths) 98 | for path, kind in self.resolver_prefix_paths[-1]: 99 | if self.check_resolver_prefix(depth, path, kind, 100 | current_node, current_index): 101 | if len(path) > depth: 102 | prefix_paths.append((path, kind)) 103 | else: 104 | exact_paths[kind] = self.yaml_path_resolvers[path, kind] 105 | else: 106 | for path, kind in self.yaml_path_resolvers: 107 | if not path: 108 | exact_paths[kind] = self.yaml_path_resolvers[path, kind] 109 | else: 110 | prefix_paths.append((path, kind)) 111 | 
self.resolver_exact_paths.append(exact_paths) 112 | self.resolver_prefix_paths.append(prefix_paths) 113 | 114 | def ascend_resolver(self): 115 | if not self.yaml_path_resolvers: 116 | return 117 | self.resolver_exact_paths.pop() 118 | self.resolver_prefix_paths.pop() 119 | 120 | def check_resolver_prefix(self, depth, path, kind, 121 | current_node, current_index): 122 | node_check, index_check = path[depth-1] 123 | if isinstance(node_check, str): 124 | if current_node.tag != node_check: 125 | return 126 | elif node_check is not None: 127 | if not isinstance(current_node, node_check): 128 | return 129 | if index_check is True and current_index is not None: 130 | return 131 | if (index_check is False or index_check is None) \ 132 | and current_index is None: 133 | return 134 | if isinstance(index_check, str): 135 | if not (isinstance(current_index, ScalarNode) 136 | and index_check == current_index.value): 137 | return 138 | elif isinstance(index_check, int) and not isinstance(index_check, bool): 139 | if index_check != current_index: 140 | return 141 | return True 142 | 143 | def resolve(self, kind, value, implicit): 144 | if kind is ScalarNode and implicit[0]: 145 | if value == '': 146 | resolvers = self.yaml_implicit_resolvers.get('', []) 147 | else: 148 | resolvers = self.yaml_implicit_resolvers.get(value[0], []) 149 | wildcard_resolvers = self.yaml_implicit_resolvers.get(None, []) 150 | for tag, regexp in resolvers + wildcard_resolvers: 151 | if regexp.match(value): 152 | return tag 153 | implicit = implicit[1] 154 | if self.yaml_path_resolvers: 155 | exact_paths = self.resolver_exact_paths[-1] 156 | if kind in exact_paths: 157 | return exact_paths[kind] 158 | if None in exact_paths: 159 | return exact_paths[None] 160 | if kind is ScalarNode: 161 | return self.DEFAULT_SCALAR_TAG 162 | elif kind is SequenceNode: 163 | return self.DEFAULT_SEQUENCE_TAG 164 | elif kind is MappingNode: 165 | return self.DEFAULT_MAPPING_TAG 166 | 167 | class Resolver(BaseResolver): 168 | pass 169 | 170 | Resolver.add_implicit_resolver( 171 | 'tag:yaml.org,2002:bool', 172 | re.compile(r'''^(?:yes|Yes|YES|no|No|NO 173 | |true|True|TRUE|false|False|FALSE 174 | |on|On|ON|off|Off|OFF)$''', re.X), 175 | list('yYnNtTfFoO')) 176 | 177 | Resolver.add_implicit_resolver( 178 | 'tag:yaml.org,2002:float', 179 | re.compile(r'''^(?:[-+]?(?:[0-9][0-9_]*)\.[0-9_]*(?:[eE][-+][0-9]+)? 180 | |\.[0-9][0-9_]*(?:[eE][-+][0-9]+)? 181 | |[-+]?[0-9][0-9_]*(?::[0-5]?[0-9])+\.[0-9_]* 182 | |[-+]?\.(?:inf|Inf|INF) 183 | |\.(?:nan|NaN|NAN))$''', re.X), 184 | list('-+0123456789.')) 185 | 186 | Resolver.add_implicit_resolver( 187 | 'tag:yaml.org,2002:int', 188 | re.compile(r'''^(?:[-+]?0b[0-1_]+ 189 | |[-+]?0[0-7_]+ 190 | |[-+]?(?:0|[1-9][0-9_]*) 191 | |[-+]?0x[0-9a-fA-F_]+ 192 | |[-+]?[1-9][0-9_]*(?::[0-5]?[0-9])+)$''', re.X), 193 | list('-+0123456789')) 194 | 195 | Resolver.add_implicit_resolver( 196 | 'tag:yaml.org,2002:merge', 197 | re.compile(r'^(?:<<)$'), 198 | ['<']) 199 | 200 | Resolver.add_implicit_resolver( 201 | 'tag:yaml.org,2002:null', 202 | re.compile(r'''^(?: ~ 203 | |null|Null|NULL 204 | | )$''', re.X), 205 | ['~', 'n', 'N', '']) 206 | 207 | Resolver.add_implicit_resolver( 208 | 'tag:yaml.org,2002:timestamp', 209 | re.compile(r'''^(?:[0-9][0-9][0-9][0-9]-[0-9][0-9]-[0-9][0-9] 210 | |[0-9][0-9][0-9][0-9] -[0-9][0-9]? -[0-9][0-9]? 211 | (?:[Tt]|[ \t]+)[0-9][0-9]? 212 | :[0-9][0-9] :[0-9][0-9] (?:\.[0-9]*)? 
213 | (?:[ \t]*(?:Z|[-+][0-9][0-9]?(?::[0-9][0-9])?))?)$''', re.X), 214 | list('0123456789')) 215 | 216 | Resolver.add_implicit_resolver( 217 | 'tag:yaml.org,2002:value', 218 | re.compile(r'^(?:=)$'), 219 | ['=']) 220 | 221 | # The following resolver is only for documentation purposes. It cannot work 222 | # because plain scalars cannot start with '!', '&', or '*'. 223 | Resolver.add_implicit_resolver( 224 | 'tag:yaml.org,2002:yaml', 225 | re.compile(r'^(?:!|&|\*)$'), 226 | list('!&*')) 227 | 228 | -------------------------------------------------------------------------------- /TA-user-agents/bin/yaml/serializer.py: -------------------------------------------------------------------------------- 1 | 2 | __all__ = ['Serializer', 'SerializerError'] 3 | 4 | from .error import YAMLError 5 | from .events import * 6 | from .nodes import * 7 | 8 | class SerializerError(YAMLError): 9 | pass 10 | 11 | class Serializer: 12 | 13 | ANCHOR_TEMPLATE = 'id%03d' 14 | 15 | def __init__(self, encoding=None, 16 | explicit_start=None, explicit_end=None, version=None, tags=None): 17 | self.use_encoding = encoding 18 | self.use_explicit_start = explicit_start 19 | self.use_explicit_end = explicit_end 20 | self.use_version = version 21 | self.use_tags = tags 22 | self.serialized_nodes = {} 23 | self.anchors = {} 24 | self.last_anchor_id = 0 25 | self.closed = None 26 | 27 | def open(self): 28 | if self.closed is None: 29 | self.emit(StreamStartEvent(encoding=self.use_encoding)) 30 | self.closed = False 31 | elif self.closed: 32 | raise SerializerError("serializer is closed") 33 | else: 34 | raise SerializerError("serializer is already opened") 35 | 36 | def close(self): 37 | if self.closed is None: 38 | raise SerializerError("serializer is not opened") 39 | elif not self.closed: 40 | self.emit(StreamEndEvent()) 41 | self.closed = True 42 | 43 | #def __del__(self): 44 | # self.close() 45 | 46 | def serialize(self, node): 47 | if self.closed is None: 48 | raise SerializerError("serializer is not opened") 49 | elif self.closed: 50 | raise SerializerError("serializer is closed") 51 | self.emit(DocumentStartEvent(explicit=self.use_explicit_start, 52 | version=self.use_version, tags=self.use_tags)) 53 | self.anchor_node(node) 54 | self.serialize_node(node, None, None) 55 | self.emit(DocumentEndEvent(explicit=self.use_explicit_end)) 56 | self.serialized_nodes = {} 57 | self.anchors = {} 58 | self.last_anchor_id = 0 59 | 60 | def anchor_node(self, node): 61 | if node in self.anchors: 62 | if self.anchors[node] is None: 63 | self.anchors[node] = self.generate_anchor(node) 64 | else: 65 | self.anchors[node] = None 66 | if isinstance(node, SequenceNode): 67 | for item in node.value: 68 | self.anchor_node(item) 69 | elif isinstance(node, MappingNode): 70 | for key, value in node.value: 71 | self.anchor_node(key) 72 | self.anchor_node(value) 73 | 74 | def generate_anchor(self, node): 75 | self.last_anchor_id += 1 76 | return self.ANCHOR_TEMPLATE % self.last_anchor_id 77 | 78 | def serialize_node(self, node, parent, index): 79 | alias = self.anchors[node] 80 | if node in self.serialized_nodes: 81 | self.emit(AliasEvent(alias)) 82 | else: 83 | self.serialized_nodes[node] = True 84 | self.descend_resolver(parent, index) 85 | if isinstance(node, ScalarNode): 86 | detected_tag = self.resolve(ScalarNode, node.value, (True, False)) 87 | default_tag = self.resolve(ScalarNode, node.value, (False, True)) 88 | implicit = (node.tag == detected_tag), (node.tag == default_tag) 89 | self.emit(ScalarEvent(alias, node.tag, implicit, 
node.value, 90 | style=node.style)) 91 | elif isinstance(node, SequenceNode): 92 | implicit = (node.tag 93 | == self.resolve(SequenceNode, node.value, True)) 94 | self.emit(SequenceStartEvent(alias, node.tag, implicit, 95 | flow_style=node.flow_style)) 96 | index = 0 97 | for item in node.value: 98 | self.serialize_node(item, node, index) 99 | index += 1 100 | self.emit(SequenceEndEvent()) 101 | elif isinstance(node, MappingNode): 102 | implicit = (node.tag 103 | == self.resolve(MappingNode, node.value, True)) 104 | self.emit(MappingStartEvent(alias, node.tag, implicit, 105 | flow_style=node.flow_style)) 106 | for key, value in node.value: 107 | self.serialize_node(key, node, None) 108 | self.serialize_node(value, node, key) 109 | self.emit(MappingEndEvent()) 110 | self.ascend_resolver() 111 | 112 | -------------------------------------------------------------------------------- /TA-user-agents/bin/yaml/tokens.py: -------------------------------------------------------------------------------- 1 | 2 | class Token(object): 3 | def __init__(self, start_mark, end_mark): 4 | self.start_mark = start_mark 5 | self.end_mark = end_mark 6 | def __repr__(self): 7 | attributes = [key for key in self.__dict__ 8 | if not key.endswith('_mark')] 9 | attributes.sort() 10 | arguments = ', '.join(['%s=%r' % (key, getattr(self, key)) 11 | for key in attributes]) 12 | return '%s(%s)' % (self.__class__.__name__, arguments) 13 | 14 | #class BOMToken(Token): 15 | # id = '' 16 | 17 | class DirectiveToken(Token): 18 | id = '' 19 | def __init__(self, name, value, start_mark, end_mark): 20 | self.name = name 21 | self.value = value 22 | self.start_mark = start_mark 23 | self.end_mark = end_mark 24 | 25 | class DocumentStartToken(Token): 26 | id = '' 27 | 28 | class DocumentEndToken(Token): 29 | id = '' 30 | 31 | class StreamStartToken(Token): 32 | id = '' 33 | def __init__(self, start_mark=None, end_mark=None, 34 | encoding=None): 35 | self.start_mark = start_mark 36 | self.end_mark = end_mark 37 | self.encoding = encoding 38 | 39 | class StreamEndToken(Token): 40 | id = '' 41 | 42 | class BlockSequenceStartToken(Token): 43 | id = '' 44 | 45 | class BlockMappingStartToken(Token): 46 | id = '' 47 | 48 | class BlockEndToken(Token): 49 | id = '' 50 | 51 | class FlowSequenceStartToken(Token): 52 | id = '[' 53 | 54 | class FlowMappingStartToken(Token): 55 | id = '{' 56 | 57 | class FlowSequenceEndToken(Token): 58 | id = ']' 59 | 60 | class FlowMappingEndToken(Token): 61 | id = '}' 62 | 63 | class KeyToken(Token): 64 | id = '?' 
65 | 66 | class ValueToken(Token): 67 | id = ':' 68 | 69 | class BlockEntryToken(Token): 70 | id = '-' 71 | 72 | class FlowEntryToken(Token): 73 | id = ',' 74 | 75 | class AliasToken(Token): 76 | id = '' 77 | def __init__(self, value, start_mark, end_mark): 78 | self.value = value 79 | self.start_mark = start_mark 80 | self.end_mark = end_mark 81 | 82 | class AnchorToken(Token): 83 | id = '' 84 | def __init__(self, value, start_mark, end_mark): 85 | self.value = value 86 | self.start_mark = start_mark 87 | self.end_mark = end_mark 88 | 89 | class TagToken(Token): 90 | id = '' 91 | def __init__(self, value, start_mark, end_mark): 92 | self.value = value 93 | self.start_mark = start_mark 94 | self.end_mark = end_mark 95 | 96 | class ScalarToken(Token): 97 | id = '' 98 | def __init__(self, value, plain, start_mark, end_mark, style=None): 99 | self.value = value 100 | self.plain = plain 101 | self.start_mark = start_mark 102 | self.end_mark = end_mark 103 | self.style = style 104 | 105 | -------------------------------------------------------------------------------- /TA-user-agents/default/apl_logging.conf: -------------------------------------------------------------------------------- 1 | # #::HDR:: ;; # App: TA-user-agents ;; # File: apl_logging.conf ;; # Updated: 2023-06-13 14:37:48 2 | 3 | [TA-user-agents] 4 | kenny_loggins = WARN 5 | modularinput = INFO 6 | restclient = INFO 7 | utilities = INFO -------------------------------------------------------------------------------- /TA-user-agents/default/app.conf: -------------------------------------------------------------------------------- 1 | [ui] 2 | is_visible = 0 3 | label = PAVO TA User Agents 4 | 5 | [diag] 6 | extension_script = Diag.py 7 | 8 | [triggers] 9 | reload.apl_logging = simple 10 | 11 | 12 | [launcher] 13 | author = Aplura, LLC 14 | description = Provides an external Python lookup that parses User Agents strings. 15 | version = 1.7.7 16 | 17 | [package] 18 | id = TA-user-agents 19 | check_for_updates = 1 20 | 21 | [install] 22 | build = 21 23 | 24 | 25 | [id] 26 | version = 1.7.7 27 | name = TA-user-agents 28 | -------------------------------------------------------------------------------- /TA-user-agents/default/checklist.conf: -------------------------------------------------------------------------------- 1 | # #::HDR:: ;; # App: TA-user-agents ;; # File: checklist.conf ;; # Updated: 2023-06-13 14:37:48 2 | 3 | [apl_taua_checklist] 4 | category = Aplura_Security 5 | description = This checks for proper parsing of the user agent. 6 | failure_text = Parsing incorrect. 7 | search = | makeresults | eval http_user_agent="Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1; Trident/4.0; GTB7.4; .NET CLR 1.1.4322; .NET CLR 2.0.50727; .NET CLR 3.0.4506.2152; .NET CLR 3.5.30" | lookup user_agents http_user_agent | fields ua_device,ua_family,ua_major,ua_minor,ua_os_family,ua_os_major| eval total_failures = 4 - sum(if(ua_family=="IE",1,0), if(ua_major==8,1,0),if(ua_minor==0,1,0),if(ua_os_major=="XP",1,0)), message="Parsed User Agent", severity_level=if(total_failures==0,"0","3"), instance="local" | fields instance total_failures message severity_level 8 | suggested_action = Review System Logs. 
9 | tags = security 10 | title = PAVO TA User Agents -------------------------------------------------------------------------------- /TA-user-agents/default/data/ui/nav/default.xml: -------------------------------------------------------------------------------- 1 | -------------------------------------------------------------------------------- /TA-user-agents/default/log.cfg: -------------------------------------------------------------------------------- 1 | [TA-user-agents] 2 | modularinput=INFO 3 | restclient=INFO 4 | utilities=INFO 5 | kenny_loggins=WARN -------------------------------------------------------------------------------- /TA-user-agents/default/props.conf: -------------------------------------------------------------------------------- 1 | # #::HDR:: ;; # App: TA-user-agents ;; # File: props.conf ;; # Updated: 2023-06-13 14:37:48 2 | 3 | [source::...TA-user-agents.log] 4 | sourcetype = ta-user-agents -------------------------------------------------------------------------------- /TA-user-agents/default/server.conf: -------------------------------------------------------------------------------- 1 | # #::HDR:: ;; # App: TA-user-agents ;; # File: server.conf ;; # Updated: 2023-06-13 14:37:48 2 | 3 | [shclustering] 4 | conf_replication_include.apl_logging = true 5 | conf_replication_include.app = true -------------------------------------------------------------------------------- /TA-user-agents/default/transforms.conf: -------------------------------------------------------------------------------- 1 | # #::HDR:: ;; # App: TA-user-agents ;; # File: transforms.conf ;; # Updated: 2023-06-13 14:37:48 2 | 3 | [user_agents] 4 | external_cmd = user_agents.py http_user_agent ua_os_family ua_os_major ua_os_minor ua_os_patch ua_os_patch_minor ua_family ua_major ua_minor ua_patch ua_device 5 | external_type = python 6 | fields_list = http_user_agent,ua_os_family,ua_os_major,ua_os_minor,ua_os_patch,ua_os_patch_minor,ua_family,ua_major,ua_minor,ua_patch,ua_device 7 | python.version = python3 -------------------------------------------------------------------------------- /TA-user-agents/file.manifest: -------------------------------------------------------------------------------- 1 | fe18f5a6ddf995b1caea599b660553b6,LICENSE 2 | c44b982c9b008afc7d9b8a7a4d70b02e,README/apl_logging.conf.spec 3 | e0df325c28da07aaa002a42dfd9588cd,README.md 4 | 9835165a065b6ea59681b6cc19566ba6,UPDATING.md 5 | 98340f91c58596daa2bf4f0a7b7420d5,app.manifest 6 | 7fe225aaa2170539ac295feb37c9498f,appserver/static/appIcon.png 7 | 80dd6d813310bd5d21139ed0359b45af,appserver/static/css/dashboard.css 8 | 7fe225aaa2170539ac295feb37c9498f,appserver/static/appIconAlt.png 9 | 80dd6d813310bd5d21139ed0359b45af,appserver/static/dashboard.css 10 | 53c6597a5c3e93b5973d77e2831cc2ee,appserver/static/appIconAlt_2x.png 11 | 53c6597a5c3e93b5973d77e2831cc2ee,appserver/static/appIcon_2x.png 12 | e0df325c28da07aaa002a42dfd9588cd,appserver/static/documentation/README.md 13 | 8bbc1083193774baadd127fa74faf286,appserver/static/documentation/index.html 14 | e0df325c28da07aaa002a42dfd9588cd,appserver/static/README.md 15 | 24dfc001b11f9020f62f0b09cc8bd184,appserver/static/details.md 16 | a6c6cfe794800b659f85b82854ffa31c,appserver/static/installation.md 17 | 854a43ee8e929977ade3917f7f7a2b8e,appserver/static/third_party.md 18 | dc4c1398aa22a7b3f05b2f756d2fec17,appserver/static/troubleshooting.md 19 | bb7676b8f2675149614872b232eca1c5,bin/.gitignore 20 | cf63f6a3e92349ccd781f40e1c103d4c,bin/Utilities.py 21 | 
4d2710b3db8d4ea3fd88ec6bc0129627,bin/_yaml/__init__.py 22 | ea6851dceaf87025a10bed0607203afe,bin/fetch_latest.sample 23 | ce581bc56d968a99468c372c9dbcdad2,bin/requirements.txt 24 | 8b0098d9eb5d75f5cea748b9187f736e,bin/ua_parser/__init__.py 25 | 10c38b1697bd3a858cb8517c547db815,bin/ua_parser/_regexes.py 26 | fd9878ae4b58a081df48a78734ac043c,bin/ua_parser/user_agent_parser.py 27 | e96fee060b7375014279aa668f9242e0,bin/ua_parser/user_agent_parser_test.py 28 | 2064b5229a7df1abf581a595bc6048ba,bin/uap-core/CONTRIBUTING.md 29 | 56a33776fdfb4656520ced7bd1034292,bin/uap-core/LICENSE 30 | cdb59f64fe9069c88f7fe14c2d463374,bin/uap-core/docs/specification.md 31 | 7e8f15943469c24a1316388d9765ffc4,bin/uap-core/package.json 32 | 8a6e05307448ea49f18258ff852a7a52,bin/uap-core/regexes.yaml 33 | 84fba9544c63f14b19d7eb2437358c8e,bin/uap-core/test_resources/additional_os_tests.yaml 34 | 5dca707671994574f178bfdaab7a77eb,bin/uap-core/test_resources/firefox_user_agent_strings.txt 35 | 7d9fcffb3c25d62d681ee9ab471eb363,bin/uap-core/test_resources/firefox_user_agent_strings.yaml 36 | b9084c0d2c4992753e5df9beadde15c1,bin/uap-core/test_resources/opera_mini_user_agent_strings.yaml 37 | d3ead853c59982f37eb44cc719b6a101,bin/uap-core/test_resources/pgts_browser_list-orig.yaml 38 | c2f435618e7a85993466a23b28e3c746,bin/uap-core/test_resources/pgts_browser_list.txt 39 | 186a7c43ec717b9312edec01ecda28a2,bin/uap-core/test_resources/pgts_browser_list.yaml 40 | 9c4759264929cbe412c30e2a1ca5784e,bin/uap-core/test_resources/podcasting_user_agent_strings.yaml 41 | 885c6087e2348961b37b1dd850ab9b40,bin/uap-core/test_resources/transform-pgts_browser_list.pl 42 | 6168a49cadb90a656757b28836dda3e4,bin/uap-core/tests/regexes.js 43 | 25102e6d75809c1ba5222724de5d6215,bin/uap-core/tests/sample.js 44 | 7fc93cf55b57338c03a81326d831843d,bin/uap-core/tests/test.js 45 | 02e4f940bc8cebb9f48c6e553023b41a,bin/uap-core/tests/test_device.yaml 46 | 1afc432ce9f79d5e4becf43f69371047,bin/uap-core/tests/test_os.yaml 47 | df538932af095539f7f9e7b126d106a4,bin/uap-core/tests/test_ua.yaml 48 | 7d5972507d004695eb0e17c4363b66d7,bin/user_agents.py 49 | 893a619fabf92e96a6877561ee162c1c,bin/version.py 50 | ca22489c5afa041f4a0b33fa6b71cbf2,bin/yaml/__init__.py 51 | c6e483eed9e1974ef2f01c8a7260276f,bin/yaml/composer.py 52 | 3722e375c216e7b1703de5973f6f0ad6,bin/yaml/constructor.py 53 | 601ef9aed47d0db72c34206680e2e344,bin/yaml/cyaml.py 54 | e0f0ca9c666a9a01791edbd817348a3f,bin/yaml/dumper.py 55 | 38e45073c42b4d3a89d25757577a9f5d,bin/yaml/emitter.py 56 | f2e05076835b7979ea3306bc49e9d70a,bin/yaml/error.py 57 | 040482aa0aa48c6f93a860a3bdba15f6,bin/yaml/events.py 58 | 11df43922cff707581230e7696e4a057,bin/yaml/loader.py 59 | f6e521b283d7539fb2bd48cb5ade5365,bin/yaml/nodes.py 60 | 76162f1345a16482938965d80a699e45,bin/yaml/parser.py 61 | ad6598cbeb6f768738d992fd6a27f1a4,bin/yaml/reader.py 62 | 37c9d5574052eb49f499fe2aba76c0df,bin/yaml/representer.py 63 | 5d424730938bfb35c7d6a7be6edc7f2b,bin/yaml/resolver.py 64 | 429dc0706c6f3606643a8ab749fd6f8b,bin/yaml/scanner.py 65 | ac5b86cbaa857699312176cba7490cc2,bin/yaml/serializer.py 66 | 33423c7f46708cc3884a32b8ed937ac6,bin/yaml/tokens.py 67 | 33b627d1833d1f2298f80230a4ff13e8,bin/Diag.py 68 | 0dc458c7607af6aa2fff2b573280d67a,bin/app_properties.py 69 | f85503236c2b5aa916ef2ee0b2dec727,default/apl_logging.conf 70 | 544f0dde7b436ab7d5814562f166baee,default/app.conf 71 | 1f227b6ecfedb83a74f812523fc8b10c,default/checklist.conf 72 | 4be98579c60224b2464c3a03437f01b1,default/data/ui/nav/default.xml 73 | 
84d856d52f7cb703c3f6d3f2a8b6a10b,default/log.cfg 74 | 661afce2f8f450a02f7f0a1b7af7f88f,default/props.conf 75 | 2166347dddbfa0538e5ad2d7fcfe561c,default/server.conf 76 | d76ab46b3e77b6d586b5966108163d04,default/transforms.conf 77 | 5509d464ee8f3704a2f24ddaec58302f,metadata/default.meta 78 | 7fe225aaa2170539ac295feb37c9498f,static/appIcon.png 79 | 7fe225aaa2170539ac295feb37c9498f,static/appIconAlt.png 80 | 53c6597a5c3e93b5973d77e2831cc2ee,static/appIconAlt_2x.png 81 | 53c6597a5c3e93b5973d77e2831cc2ee,static/appIcon_2x.png 82 | -------------------------------------------------------------------------------- /TA-user-agents/metadata/default.meta: -------------------------------------------------------------------------------- 1 | # #::HDR:: ;; # App: TA-user-agents ;; # File: default.meta ;; # Updated: 2023-06-13 14:37:48 2 | 3 | [] 4 | access = read: [ * ], write: [ admin, sc_admin ] 5 | export = system 6 | 7 | [checklist] 8 | access = read: [ * ], write: [ admin, sc_admin ] 9 | export = system -------------------------------------------------------------------------------- /TA-user-agents/static/appIcon.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aplura/TA-user-agents/c6479f6dac3d17a12f82b7571f9c10feb446871a/TA-user-agents/static/appIcon.png -------------------------------------------------------------------------------- /TA-user-agents/static/appIconAlt.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aplura/TA-user-agents/c6479f6dac3d17a12f82b7571f9c10feb446871a/TA-user-agents/static/appIconAlt.png -------------------------------------------------------------------------------- /TA-user-agents/static/appIconAlt_2x.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aplura/TA-user-agents/c6479f6dac3d17a12f82b7571f9c10feb446871a/TA-user-agents/static/appIconAlt_2x.png -------------------------------------------------------------------------------- /TA-user-agents/static/appIcon_2x.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aplura/TA-user-agents/c6479f6dac3d17a12f82b7571f9c10feb446871a/TA-user-agents/static/appIcon_2x.png -------------------------------------------------------------------------------- /manual_check_definitions.json: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aplura/TA-user-agents/c6479f6dac3d17a12f82b7571f9c10feb446871a/manual_check_definitions.json --------------------------------------------------------------------------------
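
The `bin/yaml` package dumped above appears to be a standard PyYAML distribution; `resolver.py` is the piece that decides which tag an untagged plain scalar receives. Each `Resolver.add_implicit_resolver(...)` call registers a regular expression indexed by the scalar's possible first characters, and `resolve()` returns the first matching tag, falling back to `tag:yaml.org,2002:str`. A minimal sketch of that behavior, assuming the vendored copy acts like stock PyYAML:

```python
import yaml  # stock PyYAML here; the copy vendored under bin/yaml appears equivalent

doc = """
enabled: yes     # first char 'y' -> bool resolver regex matches -> True
build: 21        # int resolver -> 21
ratio: 1.5       # float resolver -> 1.5
nothing: ~       # null resolver -> None
version: 1.7.7   # no implicit resolver matches -> falls back to str
"""

print(yaml.safe_load(doc))
# {'enabled': True, 'build': 21, 'ratio': 1.5, 'nothing': None, 'version': '1.7.7'}
```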
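
The `user_agents` lookup defined in `default/transforms.conf` passes `http_user_agent` to `bin/user_agents.py`, which parses the string with the vendored `ua_parser` library and returns the `ua_*` output fields. A minimal sketch of the parse step, assuming the vendored copy exposes the standard uap-python `user_agent_parser.Parse()` API; the exact field mapping lives in `bin/user_agents.py`:

```python
from ua_parser import user_agent_parser  # vendored under bin/ua_parser

ua = ("Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1; Trident/4.0; "
      "GTB7.4; .NET CLR 1.1.4322)")

parsed = user_agent_parser.Parse(ua)

# The returned sub-dicts correspond roughly to the lookup's output fields:
#   'user_agent' -> ua_family / ua_major / ua_minor / ua_patch
#   'os'         -> ua_os_family / ua_os_major / ua_os_minor / ua_os_patch / ua_os_patch_minor
#   'device'     -> ua_device
print(parsed["user_agent"]["family"], parsed["user_agent"]["major"])  # e.g. IE 8
print(parsed["os"]["family"], parsed["os"]["major"])                  # e.g. Windows XP
print(parsed["device"]["family"])                                     # e.g. Other
```

Within a search, the same result is reached with `| lookup user_agents http_user_agent`, as the checklist search in `default/checklist.conf` demonstrates.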