├── .flake8 ├── .gitignore ├── .gitlab-ci.yml ├── .gitmodules ├── LICENSE ├── Pipfile ├── Pipfile.lock ├── README.md ├── debug_adapter ├── __init__.py ├── __main__.py ├── base_schema.py ├── cli.py ├── command_processor.py ├── debug_adapter.py ├── internal_classes.py ├── log.py ├── schema.py ├── threads.py └── tools.py ├── debug_adapter_main.py ├── docs ├── src │ ├── Doxyfile │ ├── doxygen.md │ └── img │ │ └── testing │ │ ├── Part1ConnectionСhecking.png │ │ ├── Part2RequestsChecking.png │ │ └── Part3ManualEnd-to-endTesting.png ├── start_modes_and_arguments.md └── testing.md ├── esp_dap_adapter.code-workspace ├── requirements.txt ├── scripts ├── gen_debugger_json.py └── gen_debugger_protocol.py └── tests ├── __init__.py ├── conftest.py ├── environment.py ├── helpers.py ├── patterns ├── __init__.py ├── _impl.py ├── dap.py └── some.py ├── pytest_fixtures.py ├── requirements.txt ├── standard_requests.py ├── t_session.py ├── target ├── blink.elf ├── build_all.py ├── coredump.elf └── host │ ├── test_app.c │ └── test_app_src2.c ├── test_arguments.py ├── test_basic.py ├── test_debug_actions.py ├── test_patterns.py ├── test_timeline.py ├── timeline.md └── timeline.py /.flake8: -------------------------------------------------------------------------------- 1 | [flake8] 2 | max-line-length = 120 3 | exclude = .venv 4 | .git 5 | __pycache__ 6 | debug_backend 7 | scripts 8 | debug_adapter/schema.py 9 | debug_adapter/base_schema.py 10 | tests/debugpy 11 | tests/patterns 12 | tests/test_timeline.py 13 | tests/timeline.py 14 | max-complexity = 10 15 | ignore = F405 16 | per-file-ignores = 17 | */__init__.py:F401,F403 18 | -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | # ide and env dirs 2 | /.idea 3 | /.venv 4 | /.vscode 5 | /.venv 6 | 7 | # ci related 8 | flake8_output.txt 9 | 10 | # temp folders 11 | /temp* 12 | /tmp 13 | 14 | # test generated files 15 | /.pytest_cache/* 16 | /tests/results/* 17 | /tests/target/host/test_app 18 | /tests/target/host/test_app.exe 19 | TEST*.xml 20 | 21 | # files 22 | *.pyc 23 | *.log 24 | -------------------------------------------------------------------------------- /.gitlab-ci.yml: -------------------------------------------------------------------------------- 1 | stages: 2 | - checks 3 | - tests 4 | 5 | cache: 6 | paths: 7 | - node_modules 8 | 9 | variables: 10 | GIT_SUBMODULE_STRATEGY: normal 11 | CI_DEBUG_TRACE: "true" 12 | 13 | before_script: 14 | - export LC_ALL=C.UTF-8 15 | - export LANG=C.UTF-8 16 | 17 | check_python_style: 18 | stage: checks 19 | image: python:3.7-slim-buster 20 | artifacts: 21 | when: on_failure 22 | paths: 23 | - flake8_output.txt 24 | expire_in: 1 week 25 | before_script: 26 | - python3 -m pip install flake8 27 | script: 28 | - python3 -m flake8 --config=.flake8 --output-file=flake8_output.txt --tee 29 | 30 | 31 | tests_unit: 32 | image: $CI_DOCKER_REGISTRY/target-test-env-v5.2:1 33 | stage: tests 34 | tags: 35 | - gdb_amd64_test 36 | artifacts: 37 | paths: 38 | - "*.log" 39 | - tests/*.log 40 | - tests/results/* 41 | reports: 42 | junit: 43 | - tests/results/* 44 | when: always 45 | expire_in: 1 week 46 | script: 47 | - apt-get update && apt-get install -y build-essential gdb gcc 48 | - python3 -m pip install -r ./requirements.txt 49 | - python3 -m pip install -r ./tests/requirements.txt 50 | - python3 ./tests/target/build_all.py 51 | # make 'xtensa-esp32-elf-gdb' accessible via PATH 52 | - export 
PATH=$(dirname $(find /root -name xtensa-esp32-elf-gdb)):$PATH 53 | - cd $CI_PROJECT_DIR/tests 54 | - python3 -m pytest 55 | -------------------------------------------------------------------------------- /.gitmodules: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/espressif/esp-debug-adapter/0cabfa41372ae7b8d8133b0f82c7eb8e6025adc3/.gitmodules -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | Eclipse Public License - v 2.0 2 | 3 | THE ACCOMPANYING PROGRAM IS PROVIDED UNDER THE TERMS OF THIS ECLIPSE 4 | PUBLIC LICENSE ("AGREEMENT"). ANY USE, REPRODUCTION OR DISTRIBUTION 5 | OF THE PROGRAM CONSTITUTES RECIPIENT'S ACCEPTANCE OF THIS AGREEMENT. 6 | 7 | 1. DEFINITIONS 8 | 9 | "Contribution" means: 10 | 11 | a) in the case of the initial Contributor, the initial content 12 | Distributed under this Agreement, and 13 | 14 | b) in the case of each subsequent Contributor: 15 | i) changes to the Program, and 16 | ii) additions to the Program; 17 | where such changes and/or additions to the Program originate from 18 | and are Distributed by that particular Contributor. A Contribution 19 | "originates" from a Contributor if it was added to the Program by 20 | such Contributor itself or anyone acting on such Contributor's behalf. 21 | Contributions do not include changes or additions to the Program that 22 | are not Modified Works. 23 | 24 | "Contributor" means any person or entity that Distributes the Program. 25 | 26 | "Licensed Patents" mean patent claims licensable by a Contributor which 27 | are necessarily infringed by the use or sale of its Contribution alone 28 | or when combined with the Program. 29 | 30 | "Program" means the Contributions Distributed in accordance with this 31 | Agreement. 32 | 33 | "Recipient" means anyone who receives the Program under this Agreement 34 | or any Secondary License (as applicable), including Contributors. 35 | 36 | "Derivative Works" shall mean any work, whether in Source Code or other 37 | form, that is based on (or derived from) the Program and for which the 38 | editorial revisions, annotations, elaborations, or other modifications 39 | represent, as a whole, an original work of authorship. 40 | 41 | "Modified Works" shall mean any work in Source Code or other form that 42 | results from an addition to, deletion from, or modification of the 43 | contents of the Program, including, for purposes of clarity any new file 44 | in Source Code form that contains any contents of the Program. Modified 45 | Works shall not include works that contain only declarations, 46 | interfaces, types, classes, structures, or files of the Program solely 47 | in each case in order to link to, bind by name, or subclass the Program 48 | or Modified Works thereof. 49 | 50 | "Distribute" means the acts of a) distributing or b) making available 51 | in any manner that enables the transfer of a copy. 52 | 53 | "Source Code" means the form of a Program preferred for making 54 | modifications, including but not limited to software source code, 55 | documentation source, and configuration files. 56 | 57 | "Secondary License" means either the GNU General Public License, 58 | Version 2.0, or any later versions of that license, including any 59 | exceptions or additional permissions as identified by the initial 60 | Contributor. 61 | 62 | 2. 
GRANT OF RIGHTS 63 | 64 | a) Subject to the terms of this Agreement, each Contributor hereby 65 | grants Recipient a non-exclusive, worldwide, royalty-free copyright 66 | license to reproduce, prepare Derivative Works of, publicly display, 67 | publicly perform, Distribute and sublicense the Contribution of such 68 | Contributor, if any, and such Derivative Works. 69 | 70 | b) Subject to the terms of this Agreement, each Contributor hereby 71 | grants Recipient a non-exclusive, worldwide, royalty-free patent 72 | license under Licensed Patents to make, use, sell, offer to sell, 73 | import and otherwise transfer the Contribution of such Contributor, 74 | if any, in Source Code or other form. This patent license shall 75 | apply to the combination of the Contribution and the Program if, at 76 | the time the Contribution is added by the Contributor, such addition 77 | of the Contribution causes such combination to be covered by the 78 | Licensed Patents. The patent license shall not apply to any other 79 | combinations which include the Contribution. No hardware per se is 80 | licensed hereunder. 81 | 82 | c) Recipient understands that although each Contributor grants the 83 | licenses to its Contributions set forth herein, no assurances are 84 | provided by any Contributor that the Program does not infringe the 85 | patent or other intellectual property rights of any other entity. 86 | Each Contributor disclaims any liability to Recipient for claims 87 | brought by any other entity based on infringement of intellectual 88 | property rights or otherwise. As a condition to exercising the 89 | rights and licenses granted hereunder, each Recipient hereby 90 | assumes sole responsibility to secure any other intellectual 91 | property rights needed, if any. For example, if a third party 92 | patent license is required to allow Recipient to Distribute the 93 | Program, it is Recipient's responsibility to acquire that license 94 | before distributing the Program. 95 | 96 | d) Each Contributor represents that to its knowledge it has 97 | sufficient copyright rights in its Contribution, if any, to grant 98 | the copyright license set forth in this Agreement. 99 | 100 | e) Notwithstanding the terms of any Secondary License, no 101 | Contributor makes additional grants to any Recipient (other than 102 | those set forth in this Agreement) as a result of such Recipient's 103 | receipt of the Program under the terms of a Secondary License 104 | (if permitted under the terms of Section 3). 105 | 106 | 3. 
REQUIREMENTS 107 | 108 | 3.1 If a Contributor Distributes the Program in any form, then: 109 | 110 | a) the Program must also be made available as Source Code, in 111 | accordance with section 3.2, and the Contributor must accompany 112 | the Program with a statement that the Source Code for the Program 113 | is available under this Agreement, and informs Recipients how to 114 | obtain it in a reasonable manner on or through a medium customarily 115 | used for software exchange; and 116 | 117 | b) the Contributor may Distribute the Program under a license 118 | different than this Agreement, provided that such license: 119 | i) effectively disclaims on behalf of all other Contributors all 120 | warranties and conditions, express and implied, including 121 | warranties or conditions of title and non-infringement, and 122 | implied warranties or conditions of merchantability and fitness 123 | for a particular purpose; 124 | 125 | ii) effectively excludes on behalf of all other Contributors all 126 | liability for damages, including direct, indirect, special, 127 | incidental and consequential damages, such as lost profits; 128 | 129 | iii) does not attempt to limit or alter the recipients' rights 130 | in the Source Code under section 3.2; and 131 | 132 | iv) requires any subsequent distribution of the Program by any 133 | party to be under a license that satisfies the requirements 134 | of this section 3. 135 | 136 | 3.2 When the Program is Distributed as Source Code: 137 | 138 | a) it must be made available under this Agreement, or if the 139 | Program (i) is combined with other material in a separate file or 140 | files made available under a Secondary License, and (ii) the initial 141 | Contributor attached to the Source Code the notice described in 142 | Exhibit A of this Agreement, then the Program may be made available 143 | under the terms of such Secondary Licenses, and 144 | 145 | b) a copy of this Agreement must be included with each copy of 146 | the Program. 147 | 148 | 3.3 Contributors may not remove or alter any copyright, patent, 149 | trademark, attribution notices, disclaimers of warranty, or limitations 150 | of liability ("notices") contained within the Program from any copy of 151 | the Program which they Distribute, provided that Contributors may add 152 | their own appropriate notices. 153 | 154 | 4. COMMERCIAL DISTRIBUTION 155 | 156 | Commercial distributors of software may accept certain responsibilities 157 | with respect to end users, business partners and the like. While this 158 | license is intended to facilitate the commercial use of the Program, 159 | the Contributor who includes the Program in a commercial product 160 | offering should do so in a manner which does not create potential 161 | liability for other Contributors. Therefore, if a Contributor includes 162 | the Program in a commercial product offering, such Contributor 163 | ("Commercial Contributor") hereby agrees to defend and indemnify every 164 | other Contributor ("Indemnified Contributor") against any losses, 165 | damages and costs (collectively "Losses") arising from claims, lawsuits 166 | and other legal actions brought by a third party against the Indemnified 167 | Contributor to the extent caused by the acts or omissions of such 168 | Commercial Contributor in connection with its distribution of the Program 169 | in a commercial product offering. 
The obligations in this section do not 170 | apply to any claims or Losses relating to any actual or alleged 171 | intellectual property infringement. In order to qualify, an Indemnified 172 | Contributor must: a) promptly notify the Commercial Contributor in 173 | writing of such claim, and b) allow the Commercial Contributor to control, 174 | and cooperate with the Commercial Contributor in, the defense and any 175 | related settlement negotiations. The Indemnified Contributor may 176 | participate in any such claim at its own expense. 177 | 178 | For example, a Contributor might include the Program in a commercial 179 | product offering, Product X. That Contributor is then a Commercial 180 | Contributor. If that Commercial Contributor then makes performance 181 | claims, or offers warranties related to Product X, those performance 182 | claims and warranties are such Commercial Contributor's responsibility 183 | alone. Under this section, the Commercial Contributor would have to 184 | defend claims against the other Contributors related to those performance 185 | claims and warranties, and if a court requires any other Contributor to 186 | pay any damages as a result, the Commercial Contributor must pay 187 | those damages. 188 | 189 | 5. NO WARRANTY 190 | 191 | EXCEPT AS EXPRESSLY SET FORTH IN THIS AGREEMENT, AND TO THE EXTENT 192 | PERMITTED BY APPLICABLE LAW, THE PROGRAM IS PROVIDED ON AN "AS IS" 193 | BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, EITHER EXPRESS OR 194 | IMPLIED INCLUDING, WITHOUT LIMITATION, ANY WARRANTIES OR CONDITIONS OF 195 | TITLE, NON-INFRINGEMENT, MERCHANTABILITY OR FITNESS FOR A PARTICULAR 196 | PURPOSE. Each Recipient is solely responsible for determining the 197 | appropriateness of using and distributing the Program and assumes all 198 | risks associated with its exercise of rights under this Agreement, 199 | including but not limited to the risks and costs of program errors, 200 | compliance with applicable laws, damage to or loss of data, programs 201 | or equipment, and unavailability or interruption of operations. 202 | 203 | 6. DISCLAIMER OF LIABILITY 204 | 205 | EXCEPT AS EXPRESSLY SET FORTH IN THIS AGREEMENT, AND TO THE EXTENT 206 | PERMITTED BY APPLICABLE LAW, NEITHER RECIPIENT NOR ANY CONTRIBUTORS 207 | SHALL HAVE ANY LIABILITY FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, 208 | EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING WITHOUT LIMITATION LOST 209 | PROFITS), HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN 210 | CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) 211 | ARISING IN ANY WAY OUT OF THE USE OR DISTRIBUTION OF THE PROGRAM OR THE 212 | EXERCISE OF ANY RIGHTS GRANTED HEREUNDER, EVEN IF ADVISED OF THE 213 | POSSIBILITY OF SUCH DAMAGES. 214 | 215 | 7. GENERAL 216 | 217 | If any provision of this Agreement is invalid or unenforceable under 218 | applicable law, it shall not affect the validity or enforceability of 219 | the remainder of the terms of this Agreement, and without further 220 | action by the parties hereto, such provision shall be reformed to the 221 | minimum extent necessary to make such provision valid and enforceable. 
222 | 223 | If Recipient institutes patent litigation against any entity 224 | (including a cross-claim or counterclaim in a lawsuit) alleging that the 225 | Program itself (excluding combinations of the Program with other software 226 | or hardware) infringes such Recipient's patent(s), then such Recipient's 227 | rights granted under Section 2(b) shall terminate as of the date such 228 | litigation is filed. 229 | 230 | All Recipient's rights under this Agreement shall terminate if it 231 | fails to comply with any of the material terms or conditions of this 232 | Agreement and does not cure such failure in a reasonable period of 233 | time after becoming aware of such noncompliance. If all Recipient's 234 | rights under this Agreement terminate, Recipient agrees to cease use 235 | and distribution of the Program as soon as reasonably practicable. 236 | However, Recipient's obligations under this Agreement and any licenses 237 | granted by Recipient relating to the Program shall continue and survive. 238 | 239 | Everyone is permitted to copy and distribute copies of this Agreement, 240 | but in order to avoid inconsistency the Agreement is copyrighted and 241 | may only be modified in the following manner. The Agreement Steward 242 | reserves the right to publish new versions (including revisions) of 243 | this Agreement from time to time. No one other than the Agreement 244 | Steward has the right to modify this Agreement. The Eclipse Foundation 245 | is the initial Agreement Steward. The Eclipse Foundation may assign the 246 | responsibility to serve as the Agreement Steward to a suitable separate 247 | entity. Each new version of the Agreement will be given a distinguishing 248 | version number. The Program (including Contributions) may always be 249 | Distributed subject to the version of the Agreement under which it was 250 | received. In addition, after a new version of the Agreement is published, 251 | Contributor may elect to Distribute the Program (including its 252 | Contributions) under the new version. 253 | 254 | Except as expressly stated in Sections 2(a) and 2(b) above, Recipient 255 | receives no rights or licenses to the intellectual property of any 256 | Contributor under this Agreement, whether expressly, by implication, 257 | estoppel or otherwise. All rights in the Program not expressly granted 258 | under this Agreement are reserved. Nothing in this Agreement is intended 259 | to be enforceable by any entity that is not a Contributor or Recipient. 260 | No third-party beneficiary rights are created under this Agreement. 261 | 262 | Exhibit A - Form of Secondary Licenses Notice 263 | 264 | "This Source Code may also be made available under the following 265 | Secondary Licenses when the conditions for such availability set forth 266 | in the Eclipse Public License, v. 2.0 are satisfied: {name license(s), 267 | version(s), and exceptions or additional permissions here}." 268 | 269 | Simply including a copy of this Agreement, including this Exhibit A 270 | is not sufficient to license the Source Code under Secondary Licenses. 271 | 272 | If it is not possible or desirable to put the notice in a particular 273 | file, then You may include the notice in a location (such as a LICENSE 274 | file in a relevant directory) where a recipient would be likely to 275 | look for such a notice. 276 | 277 | You may add additional accurate notices of copyright ownership. 
278 | -------------------------------------------------------------------------------- /Pipfile: -------------------------------------------------------------------------------- 1 | [[source]] 2 | url = "https://pypi.org/simple" 3 | verify_ssl = true 4 | name = "pypi" 5 | 6 | [packages] 7 | click = "*" 8 | esp-debug-backend = "*" 9 | requests = ">=2.21.0" 10 | 11 | [dev-packages] 12 | 13 | [requires] 14 | python_version = ">3.7" 15 | -------------------------------------------------------------------------------- /Pipfile.lock: -------------------------------------------------------------------------------- 1 | { 2 | "_meta": { 3 | "hash": { 4 | "sha256": "4eb266e74b7b5be1ce33448cfd2640037f11cf75fdc7cfbe946a48c563925c44" 5 | }, 6 | "pipfile-spec": 6, 7 | "requires": { 8 | "python_version": ">3.7" 9 | }, 10 | "sources": [ 11 | { 12 | "name": "pypi", 13 | "url": "https://pypi.org/simple", 14 | "verify_ssl": true 15 | } 16 | ] 17 | }, 18 | "default": { 19 | "certifi": { 20 | "hashes": [ 21 | "sha256:35824b4c3a97115964b408844d64aa14db1cc518f6562e8d7261699d1350a9e3", 22 | "sha256:4ad3232f5e926d6718ec31cfc1fcadfde020920e278684144551c91769c7bc18" 23 | ], 24 | "markers": "python_version >= '3.6'", 25 | "version": "==2022.12.7" 26 | }, 27 | "charset-normalizer": { 28 | "hashes": [ 29 | "sha256:04afa6387e2b282cf78ff3dbce20f0cc071c12dc8f685bd40960cc68644cfea6", 30 | "sha256:04eefcee095f58eaabe6dc3cc2262f3bcd776d2c67005880894f447b3f2cb9c1", 31 | "sha256:0be65ccf618c1e7ac9b849c315cc2e8a8751d9cfdaa43027d4f6624bd587ab7e", 32 | "sha256:0c95f12b74681e9ae127728f7e5409cbbef9cd914d5896ef238cc779b8152373", 33 | "sha256:0ca564606d2caafb0abe6d1b5311c2649e8071eb241b2d64e75a0d0065107e62", 34 | "sha256:10c93628d7497c81686e8e5e557aafa78f230cd9e77dd0c40032ef90c18f2230", 35 | "sha256:11d117e6c63e8f495412d37e7dc2e2fff09c34b2d09dbe2bee3c6229577818be", 36 | "sha256:11d3bcb7be35e7b1bba2c23beedac81ee893ac9871d0ba79effc7fc01167db6c", 37 | "sha256:12a2b561af122e3d94cdb97fe6fb2bb2b82cef0cdca131646fdb940a1eda04f0", 38 | "sha256:12d1a39aa6b8c6f6248bb54550efcc1c38ce0d8096a146638fd4738e42284448", 39 | "sha256:1435ae15108b1cb6fffbcea2af3d468683b7afed0169ad718451f8db5d1aff6f", 40 | "sha256:1c60b9c202d00052183c9be85e5eaf18a4ada0a47d188a83c8f5c5b23252f649", 41 | "sha256:1e8fcdd8f672a1c4fc8d0bd3a2b576b152d2a349782d1eb0f6b8e52e9954731d", 42 | "sha256:20064ead0717cf9a73a6d1e779b23d149b53daf971169289ed2ed43a71e8d3b0", 43 | "sha256:21fa558996782fc226b529fdd2ed7866c2c6ec91cee82735c98a197fae39f706", 44 | "sha256:22908891a380d50738e1f978667536f6c6b526a2064156203d418f4856d6e86a", 45 | "sha256:3160a0fd9754aab7d47f95a6b63ab355388d890163eb03b2d2b87ab0a30cfa59", 46 | "sha256:322102cdf1ab682ecc7d9b1c5eed4ec59657a65e1c146a0da342b78f4112db23", 47 | "sha256:34e0a2f9c370eb95597aae63bf85eb5e96826d81e3dcf88b8886012906f509b5", 48 | "sha256:3573d376454d956553c356df45bb824262c397c6e26ce43e8203c4c540ee0acb", 49 | "sha256:3747443b6a904001473370d7810aa19c3a180ccd52a7157aacc264a5ac79265e", 50 | "sha256:38e812a197bf8e71a59fe55b757a84c1f946d0ac114acafaafaf21667a7e169e", 51 | "sha256:3a06f32c9634a8705f4ca9946d667609f52cf130d5548881401f1eb2c39b1e2c", 52 | "sha256:3a5fc78f9e3f501a1614a98f7c54d3969f3ad9bba8ba3d9b438c3bc5d047dd28", 53 | "sha256:3d9098b479e78c85080c98e1e35ff40b4a31d8953102bb0fd7d1b6f8a2111a3d", 54 | "sha256:3dc5b6a8ecfdc5748a7e429782598e4f17ef378e3e272eeb1340ea57c9109f41", 55 | "sha256:4155b51ae05ed47199dc5b2a4e62abccb274cee6b01da5b895099b61b1982974", 56 | "sha256:49919f8400b5e49e961f320c735388ee686a62327e773fa5b3ce6721f7e785ce", 57 | 
"sha256:53d0a3fa5f8af98a1e261de6a3943ca631c526635eb5817a87a59d9a57ebf48f", 58 | "sha256:5f008525e02908b20e04707a4f704cd286d94718f48bb33edddc7d7b584dddc1", 59 | "sha256:628c985afb2c7d27a4800bfb609e03985aaecb42f955049957814e0491d4006d", 60 | "sha256:65ed923f84a6844de5fd29726b888e58c62820e0769b76565480e1fdc3d062f8", 61 | "sha256:6734e606355834f13445b6adc38b53c0fd45f1a56a9ba06c2058f86893ae8017", 62 | "sha256:6baf0baf0d5d265fa7944feb9f7451cc316bfe30e8df1a61b1bb08577c554f31", 63 | "sha256:6f4f4668e1831850ebcc2fd0b1cd11721947b6dc7c00bf1c6bd3c929ae14f2c7", 64 | "sha256:6f5c2e7bc8a4bf7c426599765b1bd33217ec84023033672c1e9a8b35eaeaaaf8", 65 | "sha256:6f6c7a8a57e9405cad7485f4c9d3172ae486cfef1344b5ddd8e5239582d7355e", 66 | "sha256:7381c66e0561c5757ffe616af869b916c8b4e42b367ab29fedc98481d1e74e14", 67 | "sha256:73dc03a6a7e30b7edc5b01b601e53e7fc924b04e1835e8e407c12c037e81adbd", 68 | "sha256:74db0052d985cf37fa111828d0dd230776ac99c740e1a758ad99094be4f1803d", 69 | "sha256:75f2568b4189dda1c567339b48cba4ac7384accb9c2a7ed655cd86b04055c795", 70 | "sha256:78cacd03e79d009d95635e7d6ff12c21eb89b894c354bd2b2ed0b4763373693b", 71 | "sha256:80d1543d58bd3d6c271b66abf454d437a438dff01c3e62fdbcd68f2a11310d4b", 72 | "sha256:830d2948a5ec37c386d3170c483063798d7879037492540f10a475e3fd6f244b", 73 | "sha256:891cf9b48776b5c61c700b55a598621fdb7b1e301a550365571e9624f270c203", 74 | "sha256:8f25e17ab3039b05f762b0a55ae0b3632b2e073d9c8fc88e89aca31a6198e88f", 75 | "sha256:9a3267620866c9d17b959a84dd0bd2d45719b817245e49371ead79ed4f710d19", 76 | "sha256:a04f86f41a8916fe45ac5024ec477f41f886b3c435da2d4e3d2709b22ab02af1", 77 | "sha256:aaf53a6cebad0eae578f062c7d462155eada9c172bd8c4d250b8c1d8eb7f916a", 78 | "sha256:abc1185d79f47c0a7aaf7e2412a0eb2c03b724581139193d2d82b3ad8cbb00ac", 79 | "sha256:ac0aa6cd53ab9a31d397f8303f92c42f534693528fafbdb997c82bae6e477ad9", 80 | "sha256:ac3775e3311661d4adace3697a52ac0bab17edd166087d493b52d4f4f553f9f0", 81 | "sha256:b06f0d3bf045158d2fb8837c5785fe9ff9b8c93358be64461a1089f5da983137", 82 | "sha256:b116502087ce8a6b7a5f1814568ccbd0e9f6cfd99948aa59b0e241dc57cf739f", 83 | "sha256:b82fab78e0b1329e183a65260581de4375f619167478dddab510c6c6fb04d9b6", 84 | "sha256:bd7163182133c0c7701b25e604cf1611c0d87712e56e88e7ee5d72deab3e76b5", 85 | "sha256:c36bcbc0d5174a80d6cccf43a0ecaca44e81d25be4b7f90f0ed7bcfbb5a00909", 86 | "sha256:c3af8e0f07399d3176b179f2e2634c3ce9c1301379a6b8c9c9aeecd481da494f", 87 | "sha256:c84132a54c750fda57729d1e2599bb598f5fa0344085dbde5003ba429a4798c0", 88 | "sha256:cb7b2ab0188829593b9de646545175547a70d9a6e2b63bf2cd87a0a391599324", 89 | "sha256:cca4def576f47a09a943666b8f829606bcb17e2bc2d5911a46c8f8da45f56755", 90 | "sha256:cf6511efa4801b9b38dc5546d7547d5b5c6ef4b081c60b23e4d941d0eba9cbeb", 91 | "sha256:d16fd5252f883eb074ca55cb622bc0bee49b979ae4e8639fff6ca3ff44f9f854", 92 | "sha256:d2686f91611f9e17f4548dbf050e75b079bbc2a82be565832bc8ea9047b61c8c", 93 | "sha256:d7fc3fca01da18fbabe4625d64bb612b533533ed10045a2ac3dd194bfa656b60", 94 | "sha256:dd5653e67b149503c68c4018bf07e42eeed6b4e956b24c00ccdf93ac79cdff84", 95 | "sha256:de5695a6f1d8340b12a5d6d4484290ee74d61e467c39ff03b39e30df62cf83a0", 96 | "sha256:e0ac8959c929593fee38da1c2b64ee9778733cdf03c482c9ff1d508b6b593b2b", 97 | "sha256:e1b25e3ad6c909f398df8921780d6a3d120d8c09466720226fc621605b6f92b1", 98 | "sha256:e633940f28c1e913615fd624fcdd72fdba807bf53ea6925d6a588e84e1151531", 99 | "sha256:e89df2958e5159b811af9ff0f92614dabf4ff617c03a4c1c6ff53bf1c399e0e1", 100 | "sha256:ea9f9c6034ea2d93d9147818f17c2a0860d41b71c38b9ce4d55f21b6f9165a11", 101 | 
"sha256:f645caaf0008bacf349875a974220f1f1da349c5dbe7c4ec93048cdc785a3326", 102 | "sha256:f8303414c7b03f794347ad062c0516cee0e15f7a612abd0ce1e25caf6ceb47df", 103 | "sha256:fca62a8301b605b954ad2e9c3666f9d97f63872aa4efcae5492baca2056b74ab" 104 | ], 105 | "markers": "python_version >= '3.7'", 106 | "version": "==3.1.0" 107 | }, 108 | "click": { 109 | "hashes": [ 110 | "sha256:7682dc8afb30297001674575ea00d1814d808d6a36af415a82bd481d37ba7b8e", 111 | "sha256:bb4d8133cb15a609f44e8213d9b391b0809795062913b383c62be0ee95b1db48" 112 | ], 113 | "index": "pypi", 114 | "version": "==8.1.3" 115 | }, 116 | "esp-debug-backend": { 117 | "hashes": [ 118 | "sha256:24a14edacc8ea91c461a7ff739ad8a2c907458c69e1b079a44e33c1d81868eb3", 119 | "sha256:bff2d720e39a3c89f4652b0a9c6d119be6ca6b0ec1c140b4a4bc5133da205f90" 120 | ], 121 | "index": "pypi", 122 | "version": "==1.0.2" 123 | }, 124 | "idna": { 125 | "hashes": [ 126 | "sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4", 127 | "sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2" 128 | ], 129 | "markers": "python_version >= '3.5'", 130 | "version": "==3.4" 131 | }, 132 | "pygdbmi": { 133 | "hashes": [ 134 | "sha256:7a286be2fcf25650d9f66e11adc46e972cf078a466864a700cd44739ad261fb0", 135 | "sha256:f7cac28e1d558927444c880ed1e65da1a5d8686121a3aac16f42fb84d3ceb60d" 136 | ], 137 | "version": "==0.11.0.0" 138 | }, 139 | "requests": { 140 | "hashes": [ 141 | "sha256:64299f4909223da747622c030b781c0d7811e359c37124b4bd368fb8c6518baa", 142 | "sha256:98b1b2782e3c6c4904938b84c0eb932721069dfdb9134313beff7c83c2df24bf" 143 | ], 144 | "index": "pypi", 145 | "version": "==2.28.2" 146 | }, 147 | "urllib3": { 148 | "hashes": [ 149 | "sha256:8a388717b9476f934a21484e8c8e61875ab60644d29b9b39e11e4b9dc1c6b305", 150 | "sha256:aa751d169e23c7479ce47a0cb0da579e3ede798f994f5816a74e4f4500dcea42" 151 | ], 152 | "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4, 3.5'", 153 | "version": "==1.26.15" 154 | } 155 | }, 156 | "develop": {} 157 | } 158 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # esp_debug_adapter 2 | 3 | esp_debug_adapter works as connecting part between IDE and an Debug server. From the IDE side it is needed to follow [DAP](https://microsoft.github.io/debug-adapter-protocol/), from Debug server this adapter is compatible with [GDB/MI](https://www.sourceware.org/gdb/onlinedocs/gdb/GDB_002fMI.html) protocol. 4 | 5 | Table of Content: 6 | 7 | * [Usage](#Usage) 8 | * [References](#References) 9 | 10 | See also: 11 | 12 | * [Start Modes and Arguments](docs/start_modes_and_arguments.md) 13 | * [Testing](docs/testing.md) 14 | * [Auto-generated documentation](docs/src/doxygen.md) 15 | 16 | ## Usage 17 | 18 | For advanced arguments description follow to specific section: [Start Modes and Arguments](docs/start_modes_and_arguments.md) 19 | 20 | Basic description is bellow: 21 | 22 | ```bash 23 | Usage: debug_adapter_main.py [OPTIONS] 24 | 25 | Options: 26 | -a, --app_flash_off INTEGER Program start address offset 27 | (ESP32_APP_FLASH_OFF) [default: 65536] 28 | -b, --board-type TEXT Type of the board to run tests on (you could 29 | use OOCD_TEST_BOARD envvar by default) 30 | -d, --debug INTEGER Debug level (0-4), 5 - for a full OOCD log 31 | [default: 2] 32 | -dn, --device-name TEXT The name of used hardware to debug 33 | (currently Esp32 or Esp32_S2). 
It defines 34 | --toolchain-prefix 35 | -p, --port INTEGER Listen on given port for VS Code connections 36 | [default: 43474] 37 | -pm, --postmortem Run the adapter without target in 'read- 38 | only' mode 39 | --developer-mode [none|connection-check|x86-test] 40 | Modes for development purposes [default: 41 | none] 42 | -l, --log-file TEXT Path to log file. 43 | -lm, --log-mult-files Log to separated files 44 | -t, --toolchain-prefix TEXT (If not set, controlled by --device-name!) 45 | Toolchain prefix. If set, rewrites the value 46 | specified by --device-name. 47 | -e, --elfpath TEXT A path to elf files for debugging. You can 48 | use several elf files e.g. `-e file1.elf -e 49 | file2.elf` 50 | -c, --core-file TEXT Use a file as a core dump to examine. 51 | -x, --cmdfile TEXT Path to a command file containing commands 52 | to automatic execute during a program 53 | startup 54 | -tsf, --tmo-scale-factor Scale factor for gdb timeout [default:1] 55 | -o, --oocd TEXT Path to OpenOCD binary file, (used 56 | OPENOCD_BIN envvar or (if not set) 'openocd' 57 | by default) [default: openocd] 58 | -oa, --oocd-args TEXT (If not set, drives by --device-name!) 59 | Specifies custom OpenOCD args. If set, 60 | rewrites the value specified by --device- 61 | name. 62 | -om, --oocd-mode [run_and_connect|connect_to_instance|without_oocd] 63 | Cooperation with OpenOCD [default: 64 | connect_to_instance] 65 | -ip, --oocd-ip TEXT Ip for remote OpenOCD connection [default: 66 | localhost] 67 | -s, --oocd-scripts TEXT Path to OpenOCD TCL scripts (use 68 | OPENOCD_SCRIPTS envvar by default) 69 | --help Show this message and exit. 70 | ``` 71 | 72 | 73 | 74 | ## References 75 | 76 | * This software is used in Espressif's vscode-plugin, but it can also be run standalone 77 | 78 | * The project contains a debug adapter written in Python following a protocol by Microsoft: https://microsoft.github.io/debug-adapter-protocol/ 79 | 80 | 81 | ## Credits and licensing 82 | 83 | This software is based on a [tutorial](https://github.com/fabioz/python_debug_adapter_tutorial) by Fabio Zadrozny. 84 | 85 | All original code in this repository is Copyright 2020 Espressif Systems (Shanghai) Co. Ltd. 86 | 87 | The project is released under Eclipse Public License 2.0. 88 | 89 | In case an individual source file is also available under a different license, this is indicated in the file itself. 90 | -------------------------------------------------------------------------------- /debug_adapter/__init__.py: -------------------------------------------------------------------------------- 1 | # MIT License 2 | # 3 | # Copyright (c) 2020 Espressif Systems (Shanghai) Co. Ltd. 4 | # 5 | # Permission is hereby granted, free of charge, to any person obtaining a copy 6 | # of this software and associated documentation files (the "Software"), to deal 7 | # in the Software without restriction, including without limitation the rights to 8 | # use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies 9 | # of the Software, and to permit persons to whom the Software is furnished to do 10 | # so, subject to the following conditions: 11 | # 12 | # The above copyright notice and this permission notice shall be included in all 13 | # copies or substantial portions of the Software. 14 | # 15 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE 18 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | # SOFTWARE. 22 | # 23 | # SPDX-License-Identifier: MIT 24 | 25 | from .debug_adapter import DebugAdapter, A2VSC_READY2CONNECT_STRING, A2VSC_STOPPED_STRING, A2VSC_STARTED_STRING 26 | from .internal_classes import DaArgs, DaDevModes, DaRunState, DaStates, DaOpenOcdModes 27 | from . import schema 28 | from .cli import cli 29 | from .tools import * 30 | __version__ = "1.1.0" 31 | -------------------------------------------------------------------------------- /debug_adapter/__main__.py: -------------------------------------------------------------------------------- 1 | # MIT License 2 | # 3 | # Copyright (c) 2020 Espressif Systems (Shanghai) Co. Ltd. 4 | # 5 | # Permission is hereby granted, free of charge, to any person obtaining a copy 6 | # of this software and associated documentation files (the "Software"), to deal 7 | # in the Software without restriction, including without limitation the rights to 8 | # use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies 9 | # of the Software, and to permit persons to whom the Software is furnished to do 10 | # so, subject to the following conditions: 11 | # 12 | # The above copyright notice and this permission notice shall be included in all 13 | # copies or substantial portions of the Software. 14 | # 15 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | # SOFTWARE. 
22 | # 23 | # SPDX-License-Identifier: MIT 24 | 25 | import sys 26 | from .cli import cli 27 | 28 | if __name__ == '__main__': 29 | cli(sys.argv[1:]) 30 | -------------------------------------------------------------------------------- /debug_adapter/base_schema.py: -------------------------------------------------------------------------------- 1 | # Copyright (c) 2019 Fabio Zadrozny 2 | # 3 | # This program and the accompanying materials are made 4 | # available under the terms of the Eclipse Public License 2.0 5 | # which is available at https://www.eclipse.org/legal/epl-2.0/ 6 | # 7 | # SPDX-License-Identifier: EPL-2.0 8 | from .log import debug_exception 9 | 10 | class BaseSchema(object): 11 | 12 | def to_json(self): 13 | import json 14 | return json.dumps(self.to_dict()) 15 | 16 | 17 | _requests_to_types = {} 18 | _responses_to_types = {} 19 | _all_messages = {} 20 | 21 | 22 | def register(cls): 23 | _all_messages[cls.__name__] = cls 24 | return cls 25 | 26 | 27 | def register_request(command): 28 | 29 | def do_register(cls): 30 | _requests_to_types[command] = cls 31 | return cls 32 | 33 | return do_register 34 | 35 | 36 | def register_response(command): 37 | 38 | def do_register(cls): 39 | _responses_to_types[command] = cls 40 | return cls 41 | 42 | return do_register 43 | 44 | 45 | def from_dict(dct): 46 | msg_type = dct.get('type') 47 | if msg_type is None: 48 | raise ValueError('Unable to make sense of message: %s' % (dct,)) 49 | 50 | if msg_type == 'request': 51 | cls = _requests_to_types[dct['command']] 52 | try: 53 | return cls(**dct) 54 | except: 55 | msg = 'Error creating %s from %s' % (cls, dct) 56 | debug_exception(msg) 57 | raise ValueError(msg) 58 | 59 | elif msg_type == 'response': 60 | cls = _responses_to_types[dct['command']] 61 | try: 62 | return cls(**dct) 63 | except: 64 | msg = 'Error creating %s from %s' % (cls, dct) 65 | debug_exception(msg) 66 | raise ValueError(msg) 67 | 68 | raise ValueError('Unable to create message from dict: %s' % (dct,)) 69 | 70 | 71 | def from_json(json_msg): 72 | import json 73 | return from_dict(json.loads(json_msg)) 74 | 75 | 76 | def build_response(request, kwargs=None): 77 | if kwargs is None: 78 | kwargs = {} 79 | response_class = _responses_to_types[request.command] 80 | kwargs.setdefault('seq', -1) # To be overwritten before sending 81 | return response_class(command=request.command, request_seq=request.seq, success=True, **kwargs) 82 | -------------------------------------------------------------------------------- /debug_adapter/cli.py: -------------------------------------------------------------------------------- 1 | # MIT License 2 | # 3 | # Copyright (c) 2020 Espressif Systems (Shanghai) Co. Ltd. 4 | # 5 | # Permission is hereby granted, free of charge, to any person obtaining a copy 6 | # of this software and associated documentation files (the "Software"), to deal 7 | # in the Software without restriction, including without limitation the rights to 8 | # use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies 9 | # of the Software, and to permit persons to whom the Software is furnished to do 10 | # so, subject to the following conditions: 11 | # 12 | # The above copyright notice and this permission notice shall be included in all 13 | # copies or substantial portions of the Software. 
14 | # 15 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | # SOFTWARE. 22 | # 23 | # SPDX-License-Identifier: MIT 24 | 25 | import os 26 | import sys 27 | import click 28 | from typing import Union 29 | from .debug_adapter import A2VSC_STARTED_STRING, DebugAdapter 30 | from .internal_classes import DaDevModes, DaOpenOcdModes, DaArgs 31 | 32 | h = { 33 | "--app_flash_off": 'Program start address offset (ESP32_APP_FLASH_OFF). The value can be in decimal, binary, octal \ 34 | or hexadecimal formats', 35 | "--board-type": 'Type of the board to run tests on (you could use OOCD_TEST_BOARD envvar by default)', 36 | "--debug": 'Debug level (0-4), 5 - for a full OOCD log', 37 | "--developer-mode": 'Modes for development purposes', 38 | "--device-name": 'The name of the hardware to debug (esp32, esp32s2, etc.).', 39 | "--port": "Listen on given port for VS Code connections", 40 | "--log-file": 'Path to log file.', 41 | "--log-mult-files": 'Log to separate files', 42 | "--log-no-debug-console": 'Turn off output to the debug console of the IDE', 43 | "--toolchain-prefix": 'Toolchain prefix.', 44 | "--elfpath": 'A path to elf files for debugging. You can use several elf files e.g. `-e file1.elf -e ' 45 | 'file2.elf`', 46 | "--core-file": 'Use a file as a core dump to examine.', 47 | "--oocd": 'Path to OpenOCD binary file, (used OPENOCD_BIN envvar or (if not set) ' 48 | '\'openocd\' by default)', 49 | "--oocd-args": "(If not set, driven by --device-name!) Specifies custom OpenOCD args. 
If set, rewrites the" 50 | " value specified by --device-name.", 51 | "--oocd-mode": 'Cooperation with OpenOCD', 52 | "--oocd-ip": "Ip for remote OpenOCD connection", 53 | "--postmortem": "Run the adapter without target in \'read-only\' mode", 54 | "--oocd-scripts": 'Path to OpenOCD TCL scripts (use OPENOCD_SCRIPTS envvar by default)', 55 | "--cmdfile": 'Path to a command file containing commands to automatic execute during a program startup', 56 | "--tmo-scale-factor": 'Scale factor for gdb timeout (default: 1)' 57 | } 58 | 59 | 60 | class IntegerWithPrefix(click.ParamType): 61 | """ 62 | Custom Cick type accepting numbers in different basis 63 | """ 64 | name = 'integer_with_prefix' 65 | 66 | def convert(self, value, param, ctx): 67 | try: 68 | if isinstance(value, int): 69 | return value 70 | if (isinstance(value, str) and len(value) > 1 and value[0] == '0' and value[1].isnumeric()): 71 | # convert 0123 as 8-based 123 72 | return int(value, 8) 73 | return int(value, 0) 74 | except ValueError: 75 | self.fail('%s is not a valid integer' % value, param, ctx) 76 | 77 | 78 | INT_PREF = IntegerWithPrefix() 79 | 80 | 81 | # TODO Toolchain from "idf.xtensaEsp32Path" settings.json 82 | # TODO xtensaEsp32Path -> xtensaEspToolchainPath, espToolchainPath 83 | @click.command() 84 | # Basic parameters: 85 | @click.option("--app_flash_off", "-a", help=h["--app_flash_off"], type=INT_PREF, default=0x10000, show_default=True) 86 | @click.option('--board-type', '-b', help=h['--board-type'], type=Union[str]) 87 | @click.option('--debug', '-d', help=h['--debug'], type=INT_PREF, default=2, show_default=True) 88 | @click.option('--device-name', '-dn', help=h['--device-name'], type=Union[str], default=None, show_default=True) 89 | @click.option('--port', '-p', help=h['--port'], default=43474, show_default=True, type=INT_PREF) 90 | @click.option('--tmo-scale-factor', '-tsf', help=h["--tmo-scale-factor"], type=INT_PREF, default=1) 91 | # 92 | # Specific modes: 93 | @click.option('--postmortem', '-pm', help=h['--postmortem'], is_flag=True) 94 | @click.option('--developer-mode', 95 | help=h['--developer-mode'], 96 | type=click.Choice(DaDevModes.get_modes()), 97 | default=DaDevModes.NONE, 98 | show_default=True) 99 | # logging parameters: 100 | @click.option('--log-file', '-l', help=h['--log-file'], type=Union[str]) 101 | @click.option('--log-mult-files', '-lm', help=h['--log-mult-files'], default=None, is_flag=True) 102 | @click.option('--log-no-debug-console', '-ln', help=h['--log-no-debug-console'], default=None, is_flag=True) 103 | # 104 | # GDB parameters: 105 | @click.option('--toolchain-prefix', 106 | '-t', 107 | help=h['--toolchain-prefix'], 108 | type=Union[str], 109 | default=None, 110 | show_default=True) 111 | @click.option('--elfpath', '-e', help=h['--elfpath'], multiple=True, default=None, type=Union[str]) 112 | @click.option('--core-file', '-c', help=h['--core-file'], multiple=True, default=None, type=Union[str]) 113 | @click.option('--cmdfile', '-x', help=h['--cmdfile'], default=None, type=Union[str]) 114 | # 115 | # OpenOCD parameters: 116 | @click.option('--oocd', '-o', help=h['--oocd'], default=os.environ.get("OPENOCD_BIN", "openocd"), show_default=True) 117 | @click.option('--oocd-args', '-oa', help=h['--oocd-args'], default=None, show_default=True) 118 | @click.option('--oocd-mode', 119 | '-om', 120 | help=h['--oocd-mode'], 121 | type=click.Choice(DaOpenOcdModes.get_modes()), 122 | default=DaOpenOcdModes.CONNECT, 123 | show_default=True) 124 | @click.option('--oocd-ip', '-ip', 
help=h['--oocd-ip'], default='localhost', show_default=True, type=Union[str]) 125 | @click.option('--oocd-scripts', '-s', help=h['--oocd-scripts'], default=None, show_default=True) 126 | # 127 | @click.pass_context 128 | def cli(ctx, **kwargs): 129 | args_main = DaArgs(**ctx.params) 130 | # Modificators 131 | if args_main.developer_mode == DaDevModes.X86: 132 | args_main.debug = 5 133 | args_main.log_file = "debug.log" 134 | args_main.oocd_mode = DaOpenOcdModes.NO_OOCD 135 | args_main.toolchain_prefix = "" 136 | args_main.elfpath = "" 137 | 138 | if args_main.developer_mode == DaDevModes.CON_CHECK: 139 | args_main.debug = 4 140 | args_main.port = 43474 141 | args_main.oocd_mode = DaOpenOcdModes.NO_OOCD 142 | args_main.toolchain_prefix = "" 143 | args_main.log_file = "debug.log" 144 | # Real work starts here 145 | dbg_a = DebugAdapter(args=args_main) 146 | dbg_a.log_cmd(A2VSC_STARTED_STRING) 147 | dbg_a.adapter_run() 148 | 149 | 150 | if __name__ == '__main__': 151 | cli(sys.argv[1:]) 152 | -------------------------------------------------------------------------------- /debug_adapter/internal_classes.py: -------------------------------------------------------------------------------- 1 | # MIT License 2 | # 3 | # Copyright (c) 2020 Espressif Systems (Shanghai) Co. Ltd. 4 | # 5 | # Permission is hereby granted, free of charge, to any person obtaining a copy 6 | # of this software and associated documentation files (the "Software"), to deal 7 | # in the Software without restriction, including without limitation the rights to 8 | # use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies 9 | # of the Software, and to permit persons to whom the Software is furnished to do 10 | # so, subject to the following conditions: 11 | # 12 | # The above copyright notice and this permission notice shall be included in all 13 | # copies or substantial portions of the Software. 14 | # 15 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | # SOFTWARE. 22 | # 23 | # SPDX-License-Identifier: MIT 24 | 25 | from enum import IntEnum 26 | from re import match 27 | 28 | # `typing` is used for provide Python2-compatible typing hint for IDEs like PyCharm. 29 | # Some other tools like flake8, language server of VSCode - Pylance do not sopport it 30 | # and can consider imports unused. To bypass the warning message `noqa` comment is used. 
31 | from typing import Tuple, Union # noqa: F401 32 | import esp_debug_backend as dbg 33 | 34 | 35 | class Modes(object): 36 | def __init__(self, modes_list=[]): 37 | for m in modes_list: 38 | setattr(self, m, m) 39 | 40 | @classmethod 41 | def get_dict(cls): 42 | public = {} 43 | for k, v in cls.__dict__.items(): 44 | if k.startswith('_'): 45 | continue 46 | else: 47 | public[k] = v 48 | return public 49 | 50 | @classmethod 51 | def get_modes(cls): 52 | return list(cls.get_dict().values()) 53 | 54 | 55 | class DaOpenOcdModes(Modes): 56 | RUN_AND_CONNECT = 'run_and_connect' 57 | CONNECT = 'connect_to_instance' 58 | NO_OOCD = 'without_oocd' 59 | 60 | 61 | class DaDevModes(Modes): 62 | NONE = "none" 63 | CON_CHECK = 'connection-check' 64 | X86 = 'x86-test' 65 | 66 | 67 | class DaRunState(IntEnum): 68 | STOPPED = -2 69 | STOP_PREPARATION = -1 70 | UNKNOWN = 0 71 | RUNNING = 1 72 | READY_TO_CONNECT = 2 73 | CONNECTED = 3 74 | INITIALIZED = 4 75 | CONFIGURED = 5 76 | READY = 6 77 | 78 | 79 | class DaVariableReference(IntEnum): 80 | LOCALS = 12 81 | REGISTERS = 24 82 | WATCH = 36 83 | 84 | 85 | class DaStates(object): 86 | general_state = DaRunState.STOPPED # type: DaRunState 87 | ready = False # TODO: replace to a lock 88 | no_debug = False # argument of a launch request 89 | gdb_started = False 90 | ocd_started = False 91 | threads_updated = False # True if something called a get_threads() method 92 | threads_are_stopped = None # type: Union[bool, None] 93 | # sets to False after the update processed (for example, stopEvent generated) 94 | error = False 95 | start_time = None # type: Union[str, None] 96 | wait_target_state = dbg.TARGET_STATE_UNKNOWN 97 | 98 | 99 | class DaArgs(object): 100 | """ 101 | Contains mandatory set of Da arguments. Can be extended with **kwargs 102 | """ 103 | 104 | def __init__(self, 105 | app_flash_off=None, 106 | board_type="", 107 | cmdfile="", 108 | core_file=(), 109 | debug=2, 110 | developer_mode=None, 111 | device_name="", 112 | elfpath=(), 113 | log_file=None, 114 | log_mult_files=False, 115 | log_no_debug_console=False, 116 | oocd_args=None, 117 | oocd_ip="", 118 | oocd_mode="", 119 | oocd_scripts="", 120 | oocd="", 121 | port=43474, 122 | postmortem=False, 123 | tmo_scale_factor=1, 124 | toolchain_prefix="", 125 | **kwargs): 126 | """ 127 | 128 | Parameters 129 | ---------- 130 | app_flash_off: Union[int, None] 131 | board_type: str 132 | cmdfile: str 133 | core_file: Tuple[str] 134 | debug: int 135 | developer_mode: Union[str, None] 136 | device_name: str 137 | elfpath: Tuple[str] 138 | log_file: str 139 | log_mult_files: bool 140 | log_no_debug_console:bool 141 | oocd: str 142 | oocd_args: str 143 | oocd_ip: str 144 | oocd_mode: str 145 | oocd_scripts: str 146 | port: int 147 | postmortem: bool 148 | tmo_scale_factor: int 149 | toolchain_prefix: str 150 | """ 151 | self.app_flash_off = app_flash_off 152 | self.board_type = board_type 153 | self.debug = debug 154 | self.developer_mode = developer_mode 155 | self.device_name = device_name 156 | self.elfpath = elfpath 157 | self.cmdfile = cmdfile 158 | self.log_file = log_file 159 | self.log_mult_files = log_mult_files 160 | self.log_no_debug_console = log_no_debug_console 161 | self.oocd = oocd 162 | self.oocd_args = oocd_args 163 | self.oocd_ip = oocd_ip 164 | self.oocd_mode = oocd_mode 165 | self.oocd_scripts = oocd_scripts 166 | self.port = port 167 | self.toolchain_prefix = toolchain_prefix 168 | self.postmortem = postmortem 169 | self.core_file = core_file 170 | self.tmo_scale_factor = 
tmo_scale_factor 171 | # for key in kwargs: 172 | for key, value in kwargs.items(): 173 | setattr(self, key, value) 174 | 175 | def get_dict(self): 176 | return dict(self.__dict__) 177 | 178 | 179 | class Handle: 180 | START_HANDLE = 1000 181 | _next_handler = START_HANDLE + 1 182 | _handle_map = {} 183 | 184 | def __init__(self, start_handle: int) -> None: 185 | self._next_handler = start_handle if start_handle else self.START_HANDLE 186 | 187 | def create(self, val: list): 188 | self._next_handler = self._next_handler + 1 189 | self._handle_map[self._next_handler] = val 190 | return self._next_handler 191 | 192 | def get(self, ref): 193 | return self._handle_map[ref] if ref in self._handle_map and self._handle_map[ref] else None 194 | 195 | def get_handler(self): 196 | return self._next_handler 197 | 198 | def get_map(self): 199 | return self._handle_map 200 | 201 | def reset(self, start_handle: int): 202 | self._handle_map = {} 203 | self._next_handler = start_handle if start_handle else self.START_HANDLE 204 | 205 | 206 | class VariableParser: 207 | result_regex = r'^([a-zA-Z_\-][a-zA-Z0-9_\-]*|\[\d+\])\s*=\s*' 208 | unknown_regex = r'\(.*?\)' 209 | variable_regex = r'^[a-zA-Z_\-][a-zA-Z0-9_\-]*' 210 | error_regex = r'^\<.+?\>' 211 | reference_string_regex = r'^(0x[0-9a-fA-F]+\s*)"' 212 | reference_regex = r'^0x[0-9a-fA-F]+' 213 | nullpointer_regex = r'^0x0+\\b' 214 | char_regex = r'^(\d+) [\'"]' 215 | number_regex = r'^\d+(\.\d+)?' 216 | pointer_combine_char_regex = '.' 217 | value_str = '' 218 | addr = 10000 219 | 220 | def __init__(self, value_str: str, addr: int = 10000) -> None: 221 | self.value_str = value_str 222 | self.handler = Handle(addr) 223 | self.handler.reset(addr) 224 | 225 | def parse_variable_value(self): 226 | self.value_str = self.value_str.strip() 227 | if self.value_str[0] == '{': 228 | result_dict = self.parse_tuple_dict() 229 | return result_dict 230 | elif self.value_str[0] == '"': 231 | return self.parse_cstring() 232 | else: 233 | return self.parse_primitive() 234 | 235 | def parse_tuple_dict(self): 236 | self.value_str = self.value_str.strip() 237 | if self.value_str[0] != '{': 238 | return None 239 | 240 | self.value_str = self.value_str[1:].strip() 241 | if self.value_str[0] == '}': 242 | self.value_str = self.value_str[1:].strip() 243 | return [] 244 | if self.value_str.startswith('...'): 245 | self.value_str = self.value_str[3:].strip() 246 | if self.value_str[0] == '}': 247 | self.value_str = self.value_str[1:].strip() 248 | return '<...>' 249 | 250 | equalPos = self.value_str.find('=') 251 | newValPos1 = self.value_str.find('{') 252 | newValPos2 = self.value_str.find(',') 253 | newValPos = newValPos1 254 | if newValPos2 != -1 and newValPos2 < newValPos1: 255 | newValPos = newValPos2 256 | 257 | # Value list 258 | if ((newValPos != -1) and (equalPos > newValPos)) or equalPos == -1: 259 | return self.parse_tuple_dict_value_list() 260 | 261 | result = self.parse_result() 262 | if result: 263 | results = [] 264 | results.append(result) 265 | comma_result = self.parse_comma_result() 266 | while comma_result: 267 | results.append(comma_result) 268 | comma_result = self.parse_comma_result() 269 | self.value_str = self.value_str[1:].strip() 270 | return results 271 | 272 | def parse_tuple_dict_value_list(self): 273 | result_values = [] 274 | val = self.parse_variable_value() 275 | result_values.append(self.create_value('[0]', val)) 276 | i = 0 277 | while True: 278 | i = i + 1 279 | val = self.parse_comma_value() 280 | if val is None: 281 | break 282 | 
result_values.append( 283 | self.create_value('[' + str(i) + ']', val)) 284 | self.value_str = self.value_str[1:].strip() 285 | return result_values 286 | 287 | def parse_comma_result(self): 288 | self.value_str = self.value_str.strip() 289 | if self.value_str[0] != ',': 290 | return None 291 | self.value_str = self.value_str[1:].strip() 292 | # Remove comma values like \'\\000\' , 293 | empty_comma_regex = r"\'\\\d+\'+.*?>," 294 | var_match = match(empty_comma_regex, self.value_str) 295 | while var_match: 296 | self.value_str = self.value_str[len(var_match[0]):].strip() 297 | var_match = match(empty_comma_regex, self.value_str) 298 | return self.parse_result() 299 | 300 | def parse_comma_value(self): 301 | self.value_str = self.value_str.strip() 302 | if self.value_str[0] != ',': 303 | return None 304 | self.value_str = self.value_str[1:].strip() 305 | return self.parse_variable_value() 306 | 307 | def parse_cstring(self): 308 | self.value_str = self.value_str.strip() 309 | if self.value_str[0] != '"' and self.value_str[0] != '\'': 310 | return '' 311 | str_end = 1 312 | in_str = True 313 | char_str = self.value_str[0] 314 | remaining = self.value_str[1:] 315 | escaped = False 316 | while in_str: 317 | if escaped: 318 | escaped = False 319 | elif remaining[0] == '\\': 320 | escaped = True 321 | elif remaining[0] == char_str: 322 | in_str = False 323 | remaining = remaining[1:] 324 | str_end = str_end + 1 325 | result_str = self.value_str[:str_end].strip() 326 | self.value_str = self.value_str[str_end:] 327 | return result_str 328 | 329 | def parse_primitive(self): 330 | self.value_str = self.value_str.strip() 331 | if len(self.value_str) == 0: 332 | return None 333 | elif self.value_str.startswith('true'): 334 | primitive = 'true' 335 | self.value_str = self.value_str[4:].strip() 336 | elif self.value_str.startswith('false'): 337 | primitive = 'false' 338 | self.value_str = self.value_str[5:].strip() 339 | else: 340 | primitive = self.parse_primitive_regex() 341 | 342 | if primitive is None: 343 | primitive = self.value_str 344 | 345 | return primitive 346 | 347 | def parse_primitive_regex(self): 348 | primitive = None 349 | regex_match = None 350 | for patt in ( 351 | self.nullpointer_regex, 352 | self.reference_string_regex, 353 | self.reference_regex, 354 | self.char_regex, 355 | self.number_regex, 356 | self.variable_regex, 357 | self.unknown_regex, 358 | self.error_regex 359 | ): 360 | regex_match = match(patt, self.value_str) 361 | if regex_match: 362 | if patt == self.nullpointer_regex: 363 | primitive = '' 364 | self.value_str = self.value_str[len( 365 | regex_match[0]):].strip() 366 | elif patt == self.reference_string_regex: 367 | self.value_str = self.value_str[len( 368 | regex_match[1]):].strip() 369 | primitive = self.parse_cstring() 370 | elif patt == self.reference_regex: 371 | primitive = '*' + regex_match[0] 372 | self.value_str = self.value_str[len( 373 | regex_match[0]):].strip() 374 | elif patt == self.char_regex: 375 | primitive = regex_match[1] 376 | self.value_str = self.value_str[len( 377 | regex_match[0]) - 1:].strip() 378 | primitive += ' ' + self.parse_cstring() 379 | elif patt == self.number_regex: 380 | primitive = regex_match[0] 381 | self.value_str = self.value_str[len( 382 | regex_match[0]):].strip() 383 | elif patt == self.variable_regex or patt == self.error_regex or patt == self.unknown_regex: 384 | primitive = regex_match[0] 385 | self.value_str = self.value_str[len(regex_match[0]):].strip() 386 | return primitive 387 | 388 | def print_handles(self): 
389 | print(self.get_variables()) 390 | 391 | def get_variables(self): 392 | return self.handler.get_map() 393 | 394 | def parse_result(self): 395 | self.value_str = self.value_str.strip() 396 | var_match = match(self.result_regex, self.value_str) 397 | if var_match: 398 | self.value_str = self.value_str[len(var_match[0]):].strip() 399 | name = var_match[1] 400 | val = self.parse_variable_value() 401 | return self.create_value(name, val) 402 | 403 | def create_value(self, name: str, val): 404 | var_type = type(val) 405 | ref = 0 406 | return_val = val 407 | if var_type is list: 408 | ref = self.handler.create(val) 409 | return_val = '{...}' 410 | 411 | return { 412 | 'name': name, 413 | 'ref': ref, 414 | 'value': return_val, 415 | 'mem_addr': None 416 | } 417 | -------------------------------------------------------------------------------- /debug_adapter/log.py: -------------------------------------------------------------------------------- 1 | # MIT License 2 | # 3 | # Copyright (c) 2020 Espressif Systems (Shanghai) Co. Ltd. 4 | # 5 | # Permission is hereby granted, free of charge, to any person obtaining a copy 6 | # of this software and associated documentation files (the "Software"), to deal 7 | # in the Software without restriction, including without limitation the rights to 8 | # use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies 9 | # of the Software, and to permit persons to whom the Software is furnished to do 10 | # so, subject to the following conditions: 11 | # 12 | # The above copyright notice and this permission notice shall be included in all 13 | # copies or substantial portions of the Software. 14 | # 15 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | # SOFTWARE. 22 | # 23 | # SPDX-License-Identifier: MIT 24 | 25 | from logging import CRITICAL, ERROR, WARNING, INFO, DEBUG, Formatter, FileHandler, Logger, StreamHandler, NOTSET 26 | from .internal_classes import DaArgs 27 | import os 28 | import sys 29 | import threading 30 | from typing import Any 31 | 32 | # CONFIG: ************************************************************************************************************** 33 | 34 | CFG_NO_DEBUG_CONSOLE = False 35 | CGF_LOG_TO_MULT_FILES = False 36 | CFG_BACKUP_OLD_LOG = False 37 | CFG_LOG_FORMAT = '%(asctime)-15s - %(name)s - %(levelname)s - %(message)s' 38 | 39 | # .CONFIG ************************************************************************************************************** 40 | formatter = None 41 | stream_handler = None 42 | file_handler = None 43 | level = DEBUG 44 | args = None # type: DaArgs 45 | __da = None # placeholder for debug adapter instance. 
Need for avoiding circular imports 46 | start_time = None # type: Any[str, None] 47 | _top_logger = None # type: Any[Logger, None] 48 | argval2loglevel = {0: CRITICAL, 1: ERROR, 2: WARNING, 3: INFO, 4: DEBUG} 49 | 50 | 51 | def da(): 52 | global __da 53 | return __da 54 | 55 | 56 | class DebugAdapterLogger(Logger): 57 | def __init__(self, name, level=NOTSET, da_inst=None, with_console_output=True): 58 | """ 59 | Parameters 60 | ---------- 61 | name : str 62 | Name of the logger 63 | level : int 64 | Log level 65 | da_inst 66 | Debug adapter instance 67 | with_console_output : bool 68 | Print to debug console of VSCode 69 | """ 70 | super(DebugAdapterLogger, self).__init__(name=name, level=level) 71 | self.con_out = with_console_output 72 | self.__da = da_inst 73 | 74 | def critical(self, msg, *args, **kwargs): 75 | if (not CFG_NO_DEBUG_CONSOLE) and self.con_out and da() and self.isEnabledFor(CRITICAL): 76 | da().output(msg) 77 | super(DebugAdapterLogger, self).critical(msg, *args, **kwargs) 78 | 79 | def warning(self, msg, *args, **kwargs): 80 | if (not CFG_NO_DEBUG_CONSOLE) and self.con_out and da() and self.isEnabledFor(WARNING): 81 | da().output(msg) 82 | super(DebugAdapterLogger, self).warning(msg, *args, **kwargs) 83 | 84 | def exception(self, msg, *args, **kwargs): 85 | if (not CFG_NO_DEBUG_CONSOLE) and self.con_out and da(): 86 | da().output(msg) 87 | super(DebugAdapterLogger, self).exception(msg, *args, **kwargs) 88 | 89 | def error(self, msg, *args, **kwargs): 90 | if (not CFG_NO_DEBUG_CONSOLE) and self.con_out and da() and self.isEnabledFor(ERROR): 91 | da().output(msg) 92 | super(DebugAdapterLogger, self).error(msg, *args, **kwargs) 93 | 94 | def info(self, msg, *args, **kwargs): 95 | if (not CFG_NO_DEBUG_CONSOLE) and self.con_out and da() and self.isEnabledFor(INFO): 96 | da().output(msg) 97 | super(DebugAdapterLogger, self).info(msg, *args, **kwargs) 98 | 99 | def debug(self, msg, *args, **kwargs): 100 | if (not CFG_NO_DEBUG_CONSOLE) and self.con_out and da() and self.isEnabledFor(DEBUG): 101 | da().output(msg) 102 | super(DebugAdapterLogger, self).debug(msg, *args, **kwargs) 103 | 104 | def warning_no_con(self, msg, *args, **kwargs): 105 | super(DebugAdapterLogger, self).warning(msg, *args, **kwargs) 106 | 107 | def error_no_con(self, msg, *args, **kwargs): 108 | super(DebugAdapterLogger, self).error(msg, *args, **kwargs) 109 | 110 | def info_no_con(self, msg, *args, **kwargs): 111 | super(DebugAdapterLogger, self).info(msg, *args, **kwargs) 112 | 113 | def debug_no_con(self, msg, *args, **kwargs): 114 | super(DebugAdapterLogger, self).debug(msg, *args, **kwargs) 115 | 116 | def exception_no_con(self, msg, *args, **kwargs): 117 | super(DebugAdapterLogger, self).exception(msg, *args, **kwargs) 118 | 119 | 120 | def getLogger(name=None, with_console_output=True): 121 | return DebugAdapterLogger(name, with_console_output=with_console_output) 122 | 123 | 124 | def init(in_args, 125 | start_time_str='', 126 | add_to_file_name='', 127 | da_inst=None, 128 | backup_old_log=CFG_BACKUP_OLD_LOG, 129 | no_debug_console=False): 130 | err_msg = None 131 | global CGF_LOG_TO_MULT_FILES 132 | global formatter 133 | global stream_handler 134 | global file_handler 135 | global level 136 | global args 137 | global _top_logger 138 | global start_time 139 | global __da 140 | args = in_args 141 | start_time = start_time_str 142 | __da = da_inst 143 | level = argval2loglevel.get(args.debug, DEBUG) 144 | 145 | if args.log_mult_files: 146 | CGF_LOG_TO_MULT_FILES = True 147 | formatter = 
Formatter(CFG_LOG_FORMAT) 148 | stream_handler = StreamHandler(sys.stdout) 149 | stream_handler.setFormatter(formatter) 150 | pref = '' 151 | if args.log_file is not None: 152 | try: 153 | if os.path.isfile(args.log_file) and backup_old_log: # if old log file exists 154 | st = "" 155 | with open(args.log_file) as f: 156 | first_line = f.readline() 157 | st_mark = first_line[-31:-20] 158 | if st_mark == 'START_TIME_': 159 | st = first_line[-20:-1] 160 | if st: 161 | path_a, path_b = os.path.split(args.log_file) 162 | path_b = st + "_" + path_b 163 | renamed_path = os.path.join(path_a, path_b) 164 | os.rename(args.log_file, renamed_path) 165 | except Exception as e: 166 | err_msg = e 167 | 168 | file_handler = FileHandler(pref + args.log_file, 'w') 169 | file_handler.setFormatter(formatter) 170 | else: 171 | file_handler = None 172 | _top_logger = new_logger(log_name='Debug Adapter (main)', 173 | stream_handler_=stream_handler, 174 | file_handler_=file_handler, 175 | with_console_output=(not args.log_no_debug_console)) 176 | debug_no_con("START_TIME_" + start_time) 177 | debug_no_con("Init errors:" + str(err_msg)) 178 | 179 | 180 | def new_logger(log_name='Logger', 181 | logging_level_=None, 182 | stream_handler_=None, 183 | file_handler_=None, 184 | with_console_output=True): 185 | """ 186 | 187 | Parameters 188 | ---------- 189 | log_name : str 190 | logging_level_ : int 191 | stream_handler_ : StreamHandler 192 | file_handler_ : FileHandler 193 | with_console_output :bool 194 | 195 | Returns 196 | ------- 197 | 198 | """ 199 | global CGF_LOG_TO_MULT_FILES 200 | logger = getLogger(log_name, with_console_output=with_console_output) 201 | if logging_level_: 202 | logger.setLevel(logging_level_) 203 | else: 204 | global level 205 | logger.setLevel(level) 206 | if stream_handler_: 207 | logger.addHandler(stream_handler_) 208 | else: 209 | global stream_handler 210 | logger.addHandler(stream_handler) 211 | if file_handler_: 212 | if CGF_LOG_TO_MULT_FILES: 213 | file_handler_ = get_file_handler(log_name) 214 | else: 215 | file_handler_ = get_file_handler() 216 | logger.addHandler(file_handler_) 217 | else: 218 | global file_handler # check if global file_handler is defined 219 | if file_handler: 220 | if CGF_LOG_TO_MULT_FILES: 221 | file_handler_mult = get_file_handler(log_name) 222 | logger.addHandler(file_handler_mult) 223 | else: 224 | logger.addHandler(file_handler) 225 | logger.propagate = False 226 | return logger 227 | 228 | 229 | def get_file_handler(file_pref=''): 230 | if not file_pref: 231 | global file_handler 232 | return file_handler 233 | else: 234 | global start_time 235 | file_pref = file_pref \ 236 | .replace(" ", "_") \ 237 | .lower() 238 | if len(start_time): 239 | file_pref = str(start_time) + '_' + file_pref 240 | folder, filename = os.path.split(args.log_file) 241 | filepath = os.path.join(folder, file_pref + filename) 242 | fh = FileHandler(filepath, 'w') 243 | fh.setFormatter(formatter) 244 | return fh 245 | 246 | 247 | _debug_lock = threading.Lock() 248 | # 249 | # strings 250 | dbg_msg_empty = "debug_exception without any message" 251 | dbg_msg_convert_error = "DEBUG OUTPUT ERROR: debug_exception(msg) got msg which can not to handle" 252 | 253 | 254 | def dbg_msg_to_str(msg): 255 | if msg: # if there is some message 256 | if type(msg) is not str: 257 | # TODO: processing of non str 258 | try: 259 | msg_str = str(msg) + '\n' 260 | except TypeError: 261 | msg_str = dbg_msg_convert_error 262 | return msg_str 263 | else: 264 | return msg 265 | else: 266 | return 
dbg_msg_empty 267 | 268 | 269 | def debug(msg): 270 | _top_logger.debug(msg) 271 | 272 | 273 | def debug_no_con(msg): 274 | _top_logger.debug_no_con(msg) 275 | 276 | 277 | def info(msg): 278 | _top_logger.info(msg) 279 | 280 | 281 | def info_no_con(msg): 282 | _top_logger.info_no_con(msg) 283 | 284 | 285 | def warning(msg): 286 | _top_logger.warning(msg) 287 | 288 | 289 | def warning_no_con(msg): 290 | _top_logger.warning_no_con(msg) 291 | 292 | 293 | def cmd(msg): 294 | _top_logger.critical(msg) 295 | 296 | 297 | def cmd_no_con(msg): 298 | _top_logger.critical_no_con(msg) 299 | 300 | 301 | def log(msg, level): 302 | _top_logger.log(level, msg) 303 | 304 | 305 | def debug_exception(msg=None): 306 | if not msg: 307 | import traceback 308 | traceback.print_exc() 309 | else: 310 | _top_logger.exception(msg) 311 | 312 | 313 | def debug_exception_no_con(msg=None): 314 | if not msg: 315 | import traceback 316 | traceback.print_exc() 317 | else: 318 | _top_logger.exception_no_con(msg) 319 | 320 | 321 | level = DEBUG 322 | formatter = Formatter(CFG_LOG_FORMAT) 323 | stream_handler = StreamHandler(sys.stdout) 324 | stream_handler.setFormatter(formatter) 325 | _top_logger = new_logger(log_name='early Debug Adapter (main)', 326 | stream_handler_=stream_handler, 327 | file_handler_=file_handler) 328 | -------------------------------------------------------------------------------- /debug_adapter/threads.py: -------------------------------------------------------------------------------- 1 | # MIT License 2 | # 3 | # Copyright (c) 2020 Espressif Systems (Shanghai) Co. Ltd. 4 | # 5 | # Permission is hereby granted, free of charge, to any person obtaining a copy 6 | # of this software and associated documentation files (the "Software"), to deal 7 | # in the Software without restriction, including without limitation the rights to 8 | # use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies 9 | # of the Software, and to permit persons to whom the Software is furnished to do 10 | # so, subject to the following conditions: 11 | # 12 | # The above copyright notice and this permission notice shall be included in all 13 | # copies or substantial portions of the Software. 14 | # 15 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | # SOFTWARE. 22 | # 23 | # SPDX-License-Identifier: MIT 24 | 25 | import threading 26 | import json 27 | import itertools 28 | from . 
import base_schema, log 29 | from .tools import PY3 30 | 31 | if PY3: 32 | _next_seq = itertools.count().__next__ 33 | else: 34 | _next_seq = itertools.count().next 35 | 36 | 37 | class ReaderThread(threading.Thread): 38 | def __init__(self, stream, process_command): 39 | self.request_stop = False 40 | self._logger = log.new_logger("Debug Adapter (ReaderThread)", with_console_output=False) 41 | self.stream = stream # stream to read 42 | self.process_command = process_command 43 | threading.Thread.__init__(self, name="ReaderThread") 44 | 45 | def run(self): 46 | data = None 47 | try: 48 | while not self.request_stop: 49 | try: 50 | data = self.read() 51 | except RuntimeError as e: 52 | self._logger.debug(e) 53 | if data is None: # hence, EOF 54 | break 55 | protocol_message = base_schema.from_dict(data) 56 | self.process_command(protocol_message) 57 | except Exception as e: 58 | if e == SystemExit: 59 | return 60 | log.debug_exception_no_con(e) 61 | 62 | def read(self): 63 | """ 64 | Reads one message from the stream and returns the related dict (or None if EOF was reached). 65 | """ 66 | headers = {} 67 | while True: 68 | # Interpret the http protocol headers 69 | try: 70 | line = self.stream.readline() # The trailing \r\n should be there. 71 | except ConnectionResetError: 72 | self._logger.warning("Connection lost") 73 | break 74 | if not line: # EOF 75 | self._logger.debug("EOF") 76 | return None 77 | self._logger.debug('read line: >>%s<<\n' % (line.replace(b'\r', b'\\r').replace(b'\n', b'\\n')), ) 78 | line = line.strip().decode('ascii') 79 | if not line: # Read just a new line without any contents 80 | break 81 | try: 82 | name, value = line.split(': ', 1) 83 | except ValueError: 84 | raise RuntimeError('invalid header line: {}'.format(line)) 85 | headers[name] = value 86 | 87 | if not headers: 88 | raise RuntimeError('got message without headers') 89 | 90 | size = int(headers['Content-Length']) 91 | 92 | # Get the actual json 93 | body = self.stream.read(size) 94 | 95 | return json.loads(body.decode('utf-8')) 96 | 97 | def stop(self, blocking=True): 98 | self.request_stop = True 99 | if blocking: 100 | try: 101 | self.join() 102 | except RuntimeError: 103 | pass 104 | 105 | 106 | class WriterThread(threading.Thread): 107 | def __init__(self, stream, queue): 108 | self.request_stop = False 109 | self._logger = log.new_logger("Debug Adapter (WriterThread)", with_console_output=False) 110 | self.stream = stream 111 | self.queue = queue 112 | threading.Thread.__init__(self, name="WriterThread") 113 | 114 | def run(self): 115 | try: 116 | while not self.request_stop: 117 | to_write = self.queue.get() 118 | to_json = getattr(to_write, 'to_json', None) 119 | if to_json is not None: 120 | # Some protocol message 121 | to_write.seq = _next_seq() 122 | try: 123 | to_write = to_json() 124 | except Exception as e: 125 | log.debug_exception_no_con(e) 126 | log.debug_exception_no_con('Error serializing %s to json.' 
% (to_write, )) 127 | continue 128 | 129 | self._logger.debug_no_con('Writing: %s\n' % (to_write, )) 130 | 131 | if to_write.__class__ == bytes: 132 | as_bytes = to_write 133 | else: 134 | as_bytes = to_write.encode('utf-8') 135 | 136 | self.stream.write(b'Content-Length: %d\r\n\r\n' % (len(as_bytes))) 137 | self.stream.write(as_bytes) 138 | self.stream.flush() 139 | except Exception as e: 140 | log.debug_exception_no_con(e) 141 | 142 | def stop(self, blocking=True): 143 | self.request_stop = True 144 | self.queue.put('exit') 145 | if blocking: 146 | try: 147 | self.join() 148 | except RuntimeError: 149 | pass 150 | -------------------------------------------------------------------------------- /debug_adapter/tools.py: -------------------------------------------------------------------------------- 1 | # MIT License 2 | # 3 | # Copyright (c) 2020 Espressif Systems (Shanghai) Co. Ltd. 4 | # 5 | # Permission is hereby granted, free of charge, to any person obtaining a copy 6 | # of this software and associated documentation files (the "Software"), to deal 7 | # in the Software without restriction, including without limitation the rights to 8 | # use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies 9 | # of the Software, and to permit persons to whom the Software is furnished to do 10 | # so, subject to the following conditions: 11 | # 12 | # The above copyright notice and this permission notice shall be included in all 13 | # copies or substantial portions of the Software. 14 | # 15 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | # SOFTWARE. 22 | # 23 | # SPDX-License-Identifier: MIT 24 | 25 | import os 26 | import sys 27 | import time 28 | from datetime import datetime 29 | from . 
import log 30 | 31 | PY3 = sys.version_info[0] == 3 32 | PY2 = sys.version_info[0] == 2 33 | WIN32 = sys.platform == "win32" 34 | 35 | if WIN32: 36 | import win32api 37 | 38 | 39 | class ObjFromDict(object): 40 | """ 41 | @DynamicAttrs 42 | Turns a dictionary into a class 43 | """ 44 | 45 | def __init__(self, dictionary): 46 | """Constructor""" 47 | for key in dictionary: 48 | setattr(self, key, dictionary[key]) 49 | 50 | def __repr__(self): 51 | """""" 52 | return "" % str(self.__dict__) 53 | 54 | def get_dict(self): 55 | return self.__dict__ 56 | 57 | 58 | def path_disassemble(in_path): 59 | """ 60 | 61 | Parameters 62 | ---------- 63 | in_path: str 64 | 65 | Returns 66 | ------- 67 | win_drive: str 68 | path: str 69 | name: str 70 | extension: str 71 | exists: bool 72 | """ 73 | path_abs = os.path.abspath(in_path) 74 | 75 | win_drive, tail = os.path.splitdrive(path_abs) 76 | path, f_ext = os.path.split(tail) 77 | name, extension = os.path.splitext(f_ext) 78 | exists = os.path.exists(path_abs) 79 | return win_drive, path, name, extension, exists 80 | 81 | 82 | def get_good_path(srs_path): 83 | """ 84 | 85 | Parameters 86 | ---------- 87 | srs_path 88 | 89 | Returns 90 | ------- 91 | srs_path 92 | """ 93 | r = "" 94 | try: 95 | if not WIN32: 96 | r = str(srs_path) 97 | else: 98 | r = win32api.GetLongPathName(win32api.GetShortPathName(srs_path)) 99 | except Exception: 100 | r = srs_path 101 | return r 102 | 103 | 104 | class Measurement(object): 105 | def __init__(self): 106 | self.start = time.time() # type: float 107 | self.end = None # type: float 108 | self.time = None # type: float 109 | 110 | def _ending(self): 111 | self.end = time.time() 112 | self.time = self.end - self.start 113 | 114 | def _logging(self, level): 115 | log.log("Measured %f s ( from ~ %s)" % ( 116 | self.time, 117 | datetime.fromtimestamp(self.start).strftime("%H:%M:%S,%f") 118 | ), level) 119 | 120 | def _warning(self, warn_msg): 121 | log.warning("%s - Measured %f s ( from ~ %s)" % ( 122 | warn_msg, 123 | self.time, 124 | datetime.fromtimestamp(self.start).strftime("%H:%M:%S,%f"))) 125 | 126 | def stop(self): 127 | self._ending() 128 | self._logging(log.INFO) 129 | 130 | def stop_n_check(self, warn_time_thr, warn_msg="The operation took too long!"): 131 | """ 132 | 133 | Parameters 134 | ---------- 135 | warn_time_thr : float 136 | in seconds 137 | warn_msg : str 138 | in words 139 | 140 | Returns 141 | ------- 142 | 143 | """ 144 | self._ending() 145 | if self.time > warn_time_thr: 146 | self._warning(warn_msg) 147 | return True 148 | return False 149 | -------------------------------------------------------------------------------- /debug_adapter_main.py: -------------------------------------------------------------------------------- 1 | # MIT License 2 | # 3 | # Copyright (c) 2020 Espressif Systems (Shanghai) Co. Ltd. 4 | # 5 | # Permission is hereby granted, free of charge, to any person obtaining a copy 6 | # of this software and associated documentation files (the "Software"), to deal 7 | # in the Software without restriction, including without limitation the rights to 8 | # use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies 9 | # of the Software, and to permit persons to whom the Software is furnished to do 10 | # so, subject to the following conditions: 11 | # 12 | # The above copyright notice and this permission notice shall be included in all 13 | # copies or substantial portions of the Software. 
14 | # 15 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | # SOFTWARE. 22 | # 23 | # SPDX-License-Identifier: MIT 24 | 25 | import sys 26 | from debug_adapter import cli 27 | 28 | 29 | if __name__ == '__main__': 30 | cli(sys.argv[1:]) 31 | -------------------------------------------------------------------------------- /docs/src/doxygen.md: -------------------------------------------------------------------------------- 1 | ## Important! 2 | 3 | To generate this part of the documentation in a local repository, execute the following in the repo's root directory: 4 | 5 | ```bash 6 | pushd ./docs/src 7 | doxygen 8 | popd 9 | doxybook -i ./docs/doxygen/xml/ -o ./docs/doxygen/md -t mkdocs 10 | ``` 11 | 12 | Then you can follow the link to the doxygen-generated pages: 13 | 14 | [Got it!](../doxygen/md/files.md) -------------------------------------------------------------------------------- /docs/src/img/testing/Part1ConnectionСhecking.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/espressif/esp-debug-adapter/0cabfa41372ae7b8d8133b0f82c7eb8e6025adc3/docs/src/img/testing/Part1ConnectionСhecking.png -------------------------------------------------------------------------------- /docs/src/img/testing/Part2RequestsChecking.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/espressif/esp-debug-adapter/0cabfa41372ae7b8d8133b0f82c7eb8e6025adc3/docs/src/img/testing/Part2RequestsChecking.png -------------------------------------------------------------------------------- /docs/src/img/testing/Part3ManualEnd-to-endTesting.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/espressif/esp-debug-adapter/0cabfa41372ae7b8d8133b0f82c7eb8e6025adc3/docs/src/img/testing/Part3ManualEnd-to-endTesting.png -------------------------------------------------------------------------------- /docs/start_modes_and_arguments.md: -------------------------------------------------------------------------------- 1 | # Start Modes and Arguments 2 | 3 | ## Minimal config 4 | 5 | The minimal configuration assumes you have the following set up: 6 | 7 | * Installed [OpenOCD_esp32](https://github.com/espressif/openocd-esp32) 8 | * Installed [Python](https://www.python.org) 9 | * Installed Python dependencies from `requirements.txt` (`pip install -r requirements.txt`) 10 | * Variables set: 11 | * $PATH: path/to/openocd/bin/folder (e.g. `%USERPROFILE%/esp/openocd-esp32/bin`) 12 | * $OPENOCD_SCRIPTS: path/to/openocd/scripts (e.g. `%USERPROFILE%/esp/openocd-esp32/share/openocd/scripts`) 13 | * esp-wrover-kit connected to USB 14 | 15 | To run, execute 16 | 17 | `debug_adapter_main.py -e PATH/2/ELF.elf` 18 | 19 | After that, connect to `localhost:43474` or `my.custom.i.p:43474` via any DAP-compatible debugger (e.g. VSCode)
20 | 21 | ## Specific developer modes 22 | 23 | For development purposes there are the following keys, which set the debugger up in a specific configuration: 24 | 25 | ### --conn_check/-cc 26 | 27 | Turns the adapter into a mode that handles only the `initialize` request and then disconnects - needed for testing. 28 | 29 | ### --dev-defaults / -dd 30 | 31 | Equivalent to running with: 32 | 33 | `-d 5 -e "C:\esp\branches\vsc_adapter_testing\blink\build\blink.elf" -l ./debug.log` 34 | 35 | ### --dev-x86rq / -dr 36 | 37 | Equivalent to running with: 38 | 39 | `-d 5 -e './testing/target_x86_app/main' -om without_oocd -l debug.log` 40 | 41 | ### --dev_dbg / -dd 42 | 43 | Launches with the main class `DebugAdapterTests` instead of `DebugAdapter` (see `debug_adapter/tests/debug_adapter_tests_class.py`). 44 | -------------------------------------------------------------------------------- /docs/testing.md: -------------------------------------------------------------------------------- 1 | # Testing 2 | 3 | ## VSCode<->DebugAdapter Connection checking 4 | 5 | ![Part1ConnectionСhecking](./src/img/testing/Part1ConnectionСhecking.png) 6 | 7 | ## Testing scenarios 8 | 9 | ### Main idea 10 | 11 | ![Part2RequestsChecking](./src/img/testing/Part2RequestsChecking.png) 12 | 13 | We have a program working as a server (the Debug Adapter in this case) and a client program that reads JSON files with a specific structure (called scenarios), sends the data described in those files to the server's input, and expects output in the format that is also described there. 14 | 15 | Each scenario is a JSON file with the following structure: 16 | 17 | ``` json 18 | { 19 | "name": "...", 20 | "previous": "...", 21 | "script": [ 22 | ... 23 | ] 24 | } 25 | ``` 26 | 27 | **name** - the scenario's name 28 | 29 | **previous** - the name of the scenario considered to have been launched previously ("" if this is the first one) 30 | 31 | **script** - an ordered list of DAP JSON messages that will be sent ("request" types) and received ("response" and "event" types). 32 | 33 | We send requests to the adapter's input and wait for the appropriate response to each of them according to the scenario. If we read a JSON of the event type, we wait for that specific event, as described by the scenario, without sending any request. 34 | 35 | ### Script field's jsons 36 | 37 | Each request is expected to get a matching response. If an event was read, then, following the scenario, we wait for that event within some timeout. 38 | 39 | Responses and events may have `__ANY__` content in their fields in the scenario. That means that during testing we do not care about the field's value. 40 | 41 | Example: 42 | 43 | ``` json 44 | { 45 | "body": { 46 | "reason": "breakpoint", 47 | "description": "ANY", 48 | "allThreadsStopped": true, 49 | "threadId": 8 50 | }, 51 | "type": "event", 52 | "event": "stopped", 53 | "seq": "ANY" 54 | } 55 | ``` 56 | 57 | In the example, we are not interested in the field *seq* (because, e.g., we don't know the specific sequence number of the response) or in *body.description* (because this information could be program specific, while we expect program-insensitive behavior). 
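A minimal sketch of how such wildcard matching could work is shown below. This is an illustrative example only, not the project's actual test client; the helper name `matches_expected` and the sample messages are made up for this sketch.

``` python
def matches_expected(expected, actual):
    """Recursively compare a scenario entry with a received DAP message.

    The wildcard marker ("ANY" or "__ANY__") means the field only has to be
    present in the received message; its concrete value is not checked.
    """
    if expected in ("ANY", "__ANY__"):
        return True
    if isinstance(expected, dict):
        # Every expected key must exist and match; extra keys in the
        # received message are ignored.
        return isinstance(actual, dict) and all(
            key in actual and matches_expected(value, actual[key])
            for key, value in expected.items()
        )
    if isinstance(expected, list):
        return (isinstance(actual, list) and len(expected) == len(actual)
                and all(matches_expected(e, a) for e, a in zip(expected, actual)))
    return expected == actual


if __name__ == "__main__":
    expected = {"type": "event", "event": "stopped",
                "body": {"reason": "breakpoint", "description": "ANY"},
                "seq": "ANY"}
    received = {"type": "event", "event": "stopped",
                "body": {"reason": "breakpoint", "description": "app_main.c:42",
                         "threadId": 8},
                "seq": 37}
    assert matches_expected(expected, received)
```

With a matcher like this, the concrete `seq` number and program-specific details such as `body.description` are accepted even though the scenario does not pin them down.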
58 | 59 | ## End-to-end testing 60 | 61 | For that type of testing is used [TESTING_SCRIPT.md](https://gitlab.espressif.cn:6688/idf/vscode-plugin/blob/feature/debug-adapter-integration/TESTING_SCRIPT.md) from the vscode-plugin project 62 | 63 | ![Part3ManualEnd-to-endTesting](./src/img/testing/Part3ManualEnd-to-endTesting.png) -------------------------------------------------------------------------------- /esp_dap_adapter.code-workspace: -------------------------------------------------------------------------------- 1 | { 2 | "folders": [ 3 | { 4 | "path": "." 5 | } 6 | ], 7 | "settings": { 8 | "python.testing.pytestEnabled": true, 9 | "python.testing.unittestEnabled": false, 10 | "python.testing.nosetestsEnabled": false, 11 | "python.analysis.extraPaths": [ 12 | "tests/debugpy_tests", 13 | "tests/debugpy", 14 | "tests/debugpy/_vendored", 15 | "tests/debugpy/_vendored/pydevd", 16 | "tests" 17 | ], 18 | "python.testing.pytestArgs": [ 19 | "${workspaceFolder}/tests/", 20 | // "-s" // uncomment to full output 21 | ], 22 | }, 23 | "extensions": { 24 | "recommendations": [ 25 | "ms-python.python", 26 | "ms-python.vscode-pylance", 27 | ] 28 | }, 29 | "tasks": { 30 | "version": "2.0.0", 31 | "tasks": [ 32 | { 33 | "label": "Python Style Check", 34 | "type": "shell", 35 | "command": "python3 -m flake8 --config=${workspaceFolder}/.flake8 --tee" 36 | }, 37 | { 38 | "label": "Build test targets", 39 | "type": "shell", 40 | "command": "python3 ${workspaceFolder}/tests/target/build_all.py" 41 | } 42 | ] 43 | }, 44 | "launch": { 45 | "configurations": [ 46 | { 47 | "name": "Python: Current File", 48 | "type": "python", 49 | "request": "launch", 50 | "program": "${file}", 51 | "console": "integratedTerminal", 52 | "env": { 53 | "PYTHONPATH": "${workspaceFolder}" 54 | }, 55 | "cwd": "${workspaceFolder}" 56 | }, 57 | { 58 | "name": "Debug Adapter with coredump", 59 | "type": "python", 60 | "request": "launch", 61 | "program": "${workspaceFolder}/debug_adapter_main.py", 62 | "args": [ 63 | "-e", 64 | "${workspaceFolder}/tests/target/blink.elf", 65 | "-c", 66 | "${workspaceFolder}/tests/target/coredump.elf", 67 | "-l", 68 | "debug.log", 69 | "-ln", 70 | "-d", 71 | "4", 72 | "-dn", 73 | "esp32" 74 | ], 75 | "console": "integratedTerminal", 76 | "env": { 77 | "PYTHONPATH": "${workspaceFolder}" 78 | } 79 | }, 80 | { 81 | "name": "Debug Adapter with host application", 82 | "type": "python", 83 | "request": "launch", 84 | "program": "${workspaceFolder}/debug_adapter_main.py", 85 | "args": [ 86 | "-l", "debug.log", 87 | "-ln", 88 | "-d", "4", 89 | "-om", "without_oocd", 90 | "-e", "${workspaceFolder}/tests/target/x86/test_app.exe" 91 | ], 92 | "console": "integratedTerminal", 93 | "env": { 94 | "PYTHONPATH" : "${workspaceFolder}" 95 | } 96 | }, 97 | ], 98 | "compounds": [] 99 | } 100 | } -------------------------------------------------------------------------------- /requirements.txt: -------------------------------------------------------------------------------- 1 | setuptools>=21 2 | # Version 21 is required to handle PEP 508 environment markers. 
3 | click 4 | esp-debug-backend 5 | pywin32>=227; sys.platform == 'win32' 6 | requests>=2.21.0 7 | -------------------------------------------------------------------------------- /scripts/gen_debugger_json.py: -------------------------------------------------------------------------------- 1 | # Copyright (c) 2019 Fabio Zadrozny 2 | # 3 | # This program and the accompanying materials are made 4 | # available under the terms of the Eclipse Public License 2.0 5 | # which is available at https://www.eclipse.org/legal/epl-2.0/ 6 | # 7 | # SPDX-License-Identifier: EPL-2.0 8 | 9 | ''' 10 | Notes: 11 | 12 | The full debug protocol as a json may be found at: 13 | 14 | https://raw.githubusercontent.com/Microsoft/vscode-debugadapter-node/master/debugProtocol.json 15 | 16 | It's also provided as a typescript interface in: 17 | 18 | https://github.com/Microsoft/vscode-debugadapter-node/blob/master/protocol/src/debugProtocol.ts 19 | 20 | 21 | The implementation of the ts debugger is at: 22 | 23 | https://github.com/Microsoft/vscode-debugadapter-node/ 24 | 25 | Mono example: https://github.com/Microsoft/vscode-mono-debug/blob/master/src/typescript/extension.ts 26 | 27 | https://code.visualstudio.com/docs/extensionAPI/api-debugging has a brief overview. 28 | 29 | https://code.visualstudio.com/docs/extensionAPI/extension-points provides a place with more info on the debugger package.json bits. 30 | 31 | https://code.visualstudio.com/docs/extensions/example-debuggers has some examples. 32 | 33 | https://github.com/Microsoft/ptvsd has the microsoft wrapper for pydevd. 34 | ''' 35 | 36 | 37 | def generate_debugger(): 38 | return { 39 | 'type': 'PyDev', 40 | 'label': 'PyDev (Python)', 41 | 'languages': ['python'], 42 | 'adapterExecutableCommand': 'pydev.start.debugger', 43 | 'enableBreakpointsFor': { 44 | 'languageIds': ['python', 'html'], 45 | }, 46 | 'configurationAttributes': { 47 | 'launch': { 48 | 'properties': { 49 | 50 | 'program': { 51 | 'type': 'string', 52 | 'description': 'The .py file that should be debugged (i.e.: python `program.py`).', 53 | }, 54 | 55 | 'module': { 56 | 'type': 'string', 57 | 'description': 'The module to be debugged (i.e.: python -m `module`).', 58 | }, 59 | 60 | 'args': { 61 | 'type': ['string', 'array'], 62 | 'description': 'The command line arguments passed to the program.' 63 | }, 64 | 65 | "cwd": { 66 | "type": "string", 67 | "description": "The working directory of the program.", 68 | "default": "${workspaceFolder}" 69 | }, 70 | 71 | "console": { 72 | "type": "string", 73 | "enum": [ 74 | "none", 75 | "integratedTerminal", 76 | "externalTerminal" 77 | ], 78 | "enumDescriptions": [ 79 | "VS Code debug console.", 80 | "VS Code integrated terminal.", 81 | "External terminal that can be configured in user settings." 
82 | ], 83 | "description": "The specified console to launch the program.", 84 | "default": "none" 85 | }, 86 | } 87 | } 88 | }, 89 | 90 | "configurationSnippets": [ 91 | { 92 | "label": "PyDev: Launch Python Program", 93 | "description": "Add a new configuration for launching a python program with the PyDev debugger.", 94 | "body": { 95 | "type": "PyDev", 96 | "name": "PyDev Debug (Launch)", 97 | "request": "launch", 98 | "cwd": "^\"\\${workspaceFolder}\"", 99 | "console": "none", 100 | "mainModule": "", 101 | "args": [], 102 | } 103 | }, 104 | ] 105 | } 106 | -------------------------------------------------------------------------------- /scripts/gen_debugger_protocol.py: -------------------------------------------------------------------------------- 1 | # Copyright (c) 2019 Fabio Zadrozny 2 | # 3 | # This program and the accompanying materials are made 4 | # available under the terms of the Eclipse Public License 2.0 5 | # which is available at https://www.eclipse.org/legal/epl-2.0/ 6 | # 7 | # SPDX-License-Identifier: EPL-2.0 8 | 9 | class _OrderedSet(object): 10 | # Not a good ordered set (just something to be small without adding any deps) 11 | 12 | def __init__(self, initial_contents=None): 13 | self._contents = [] 14 | self._contents_as_set = set() 15 | if initial_contents is not None: 16 | for x in initial_contents: 17 | self.add(x) 18 | 19 | def add(self, x): 20 | if x not in self._contents_as_set: 21 | self._contents_as_set.add(x) 22 | self._contents.append(x) 23 | 24 | def copy(self): 25 | return _OrderedSet(self._contents) 26 | 27 | def update(self, contents): 28 | for x in contents: 29 | self.add(x) 30 | 31 | def __iter__(self): 32 | return iter(self._contents) 33 | 34 | def __contains__(self, item): 35 | return item in self._contents_as_set 36 | 37 | def __len__(self): 38 | return len(self._contents) 39 | 40 | def set_repr(self): 41 | if len(self) == 0: 42 | return 'set()' 43 | 44 | lst = [repr(x) for x in self] 45 | return '{' + ', '.join(lst) + '}' 46 | 47 | 48 | class Ref(object): 49 | 50 | def __init__(self, ref): 51 | self.ref = ref 52 | 53 | def __str__(self): 54 | return self.ref 55 | 56 | 57 | def load_schema_data(): 58 | import os.path 59 | import json 60 | 61 | json_file = os.path.join(os.path.dirname(__file__), 'debugProtocol.json') 62 | if not os.path.exists(json_file): 63 | import requests 64 | req = requests.get('https://raw.githubusercontent.com/Microsoft/vscode-debugadapter-node/master/debugProtocol.json') 65 | assert req.status_code == 200 66 | with open(json_file, 'wb') as stream: 67 | stream.write(req.content) 68 | 69 | with open(json_file, 'rb') as json_contents: 70 | json_schema_data = json.loads(json_contents.read()) 71 | return json_schema_data 72 | 73 | 74 | def create_classes_to_generate_structure(json_schema_data): 75 | definitions = json_schema_data['definitions'] 76 | 77 | class_to_generatees = {} 78 | 79 | for name, definition in definitions.items(): 80 | all_of = definition.get('allOf') 81 | description = definition.get('description') 82 | properties = {} 83 | properties.update(definition.get('properties', {})) 84 | required = _OrderedSet(definition.get('required', _OrderedSet())) 85 | base_definitions = [] 86 | 87 | if all_of is not None: 88 | for definition in all_of: 89 | ref = definition.get('$ref') 90 | if ref is not None: 91 | assert ref.startswith('#/definitions/') 92 | ref = ref[len('#/definitions/'):] 93 | base_definitions.append(ref) 94 | else: 95 | if not description: 96 | description = definition.get('description') 97 | 
properties.update(definition.get('properties', {})) 98 | required.update(_OrderedSet(definition.get('required', _OrderedSet()))) 99 | 100 | class_to_generatees[name] = dict( 101 | name=name, 102 | properties=properties, 103 | base_definitions=base_definitions, 104 | description=description, 105 | required=required, 106 | ) 107 | return class_to_generatees 108 | 109 | 110 | def collect_bases(curr_class, classes_to_generate, memo=None): 111 | ret = [] 112 | if memo is None: 113 | memo = {} 114 | 115 | base_definitions = curr_class['base_definitions'] 116 | for base_definition in base_definitions: 117 | if base_definition not in memo: 118 | ret.append(base_definition) 119 | ret.extend(collect_bases(classes_to_generate[base_definition], classes_to_generate, memo)) 120 | 121 | return ret 122 | 123 | 124 | def fill_properties_and_required_from_base(classes_to_generate): 125 | # Now, resolve properties based on refs 126 | for class_to_generate in classes_to_generate.values(): 127 | dct = {} 128 | s = _OrderedSet() 129 | 130 | for base_definition in reversed(collect_bases(class_to_generate, classes_to_generate)): 131 | # Note: go from base to current so that the initial order of the properties has that 132 | # same order. 133 | dct.update(classes_to_generate[base_definition].get('properties', {})) 134 | s.update(classes_to_generate[base_definition].get('required', _OrderedSet())) 135 | 136 | dct.update(class_to_generate['properties']) 137 | class_to_generate['properties'] = dct 138 | 139 | s.update(class_to_generate['required']) 140 | class_to_generate['required'] = s 141 | 142 | return class_to_generate 143 | 144 | 145 | def update_class_to_generate_description(class_to_generate): 146 | import textwrap 147 | description = class_to_generate['description'] 148 | lines = [] 149 | for line in description.splitlines(): 150 | wrapped = textwrap.wrap(line.strip(), 100) 151 | lines.extend(wrapped) 152 | lines.append('') 153 | 154 | while lines and lines[-1] == '': 155 | lines = lines[:-1] 156 | 157 | class_to_generate['description'] = ' ' + ('\n '.join(lines)) 158 | 159 | 160 | def update_class_to_generate_type(class_to_generate): 161 | properties = class_to_generate.get('properties') 162 | for _prop_name, prop_val in properties.items(): 163 | prop_type = prop_val.get('type', '') 164 | if not prop_type: 165 | prop_type = prop_val.pop('$ref', '') 166 | if prop_type: 167 | assert prop_type.startswith('#/definitions/') 168 | prop_type = prop_type[len('#/definitions/'):] 169 | prop_val['type'] = Ref(prop_type) 170 | 171 | 172 | def update_class_to_generate_register_dec(classes_to_generate, class_to_generate): 173 | # Default 174 | class_to_generate['register_request'] = '' 175 | class_to_generate['register_dec'] = '@register' 176 | 177 | properties = class_to_generate.get('properties') 178 | enum_type = properties.get('type', {}).get('enum') 179 | command = None 180 | if enum_type and len(enum_type) == 1 and next(iter(enum_type)) in ("request", "response"): 181 | msg_type = next(iter(enum_type)) 182 | if msg_type == 'response': 183 | # The actual command is typed in the request 184 | response_name = class_to_generate['name'] 185 | request_name = response_name[:-len('Response')] + 'Request' 186 | if request_name in classes_to_generate: 187 | command = classes_to_generate[request_name]['properties'].get('command') 188 | else: 189 | if response_name == 'ErrorResponse': 190 | command = {'enum' : ['error']} 191 | else: 192 | raise AssertionError('Unhandled: %s' % (response_name,)) 193 | 194 | else: 195 | command 
= properties.get('command') 196 | 197 | if command: 198 | enum = command.get('enum') 199 | if enum and len(enum) == 1: 200 | class_to_generate['register_request'] = '@register_%s(%r)\n' % (msg_type, enum[0]) 201 | 202 | 203 | def extract_prop_name_and_prop(class_to_generate): 204 | properties = class_to_generate.get('properties') 205 | required = _OrderedSet(class_to_generate.get('required', _OrderedSet())) 206 | 207 | # Sort so that required come first 208 | prop_name_and_prop = list(properties.items()) 209 | 210 | def compute_sort_key(x): 211 | key = x[0] 212 | if key in required: 213 | if key == 'seq': 214 | return 0.5 # seq when required is after the other required keys (to have a default of -1). 215 | return 0 216 | return 1 217 | 218 | prop_name_and_prop.sort(key=compute_sort_key) 219 | 220 | return prop_name_and_prop 221 | 222 | 223 | def update_class_to_generate_to_json(class_to_generate): 224 | required = _OrderedSet(class_to_generate.get('required', _OrderedSet())) 225 | prop_name_and_prop = extract_prop_name_and_prop(class_to_generate) 226 | 227 | to_dict_body = ['def to_dict(self):'] 228 | to_dict_body.append(' dct = {') 229 | first_not_required = False 230 | 231 | for prop_name, prop in prop_name_and_prop: 232 | namespace = dict(prop_name=prop_name) 233 | is_ref = prop['type'].__class__ == Ref 234 | if prop_name in required: 235 | if is_ref: 236 | to_dict_body.append(' %(prop_name)r: self.%(prop_name)s.to_dict(),' % namespace) 237 | else: 238 | to_dict_body.append(' %(prop_name)r: self.%(prop_name)s,' % namespace) 239 | else: 240 | if not first_not_required: 241 | first_not_required = True 242 | to_dict_body.append(' }') 243 | 244 | to_dict_body.append(' if self.%(prop_name)s is not None:' % namespace) 245 | if is_ref: 246 | to_dict_body.append(' dct[%(prop_name)r] = self.%(prop_name)s.to_dict()' % namespace) 247 | else: 248 | to_dict_body.append(' dct[%(prop_name)r] = self.%(prop_name)s' % namespace) 249 | 250 | if not first_not_required: 251 | first_not_required = True 252 | to_dict_body.append(' }') 253 | 254 | to_dict_body.append(' dct.update(self.kwargs)') 255 | to_dict_body.append(' return dct') 256 | 257 | class_to_generate['to_dict'] = _indent_lines('\n'.join(to_dict_body)) 258 | 259 | 260 | def update_class_to_generate_init(class_to_generate): 261 | args = [] 262 | init_body = [] 263 | docstring = [] 264 | 265 | required = _OrderedSet(class_to_generate.get('required', _OrderedSet())) 266 | prop_name_and_prop = extract_prop_name_and_prop(class_to_generate) 267 | 268 | for prop_name, prop in prop_name_and_prop: 269 | enum = prop.get('enum') 270 | if enum and len(enum) == 1: 271 | init_body.append(' self.%(prop_name)s = %(enum)r' % dict(prop_name=prop_name, enum=next(iter(enum)))) 272 | else: 273 | if prop_name in required: 274 | if prop_name == 'seq': 275 | args.append(prop_name + '=-1') 276 | else: 277 | args.append(prop_name) 278 | else: 279 | args.append(prop_name + '=None') 280 | 281 | if prop['type'].__class__ == Ref: 282 | namespace = dict( 283 | prop_name=prop_name, 284 | ref_name=str(prop['type']) 285 | ) 286 | init_body.append(' if %(prop_name)s is None:' % namespace) 287 | init_body.append(' self.%(prop_name)s = %(ref_name)s()' % namespace) 288 | init_body.append(' else:') 289 | init_body.append(' self.%(prop_name)s = %(ref_name)s(**%(prop_name)s) if %(prop_name)s.__class__ != %(ref_name)s else %(prop_name)s' % namespace 290 | ) 291 | 292 | else: 293 | init_body.append(' self.%(prop_name)s = %(prop_name)s' % dict( 294 | prop_name=prop_name)) 295 | 296 | 
prop_type = prop['type'] 297 | prop_description = prop.get('description', '') 298 | 299 | docstring.append(':param %(prop_type)s %(prop_name)s: %(prop_description)s' % dict( 300 | prop_type=prop_type, prop_name=prop_name, prop_description=prop_description)) 301 | 302 | docstring = _indent_lines('\n'.join(docstring)) 303 | init_body = '\n'.join(init_body) 304 | 305 | # Actually bundle the whole __init__ from the parts. 306 | args = ', '.join(args) 307 | if args: 308 | args = ', ' + args 309 | 310 | # Note: added kwargs because some messages are expected to be extended by the user (so, we'll actually 311 | # make all extendable so that we don't have to worry about which ones -- we loose a little on typing, 312 | # but may be better than doing a whitelist based on something only pointed out in the documentation). 313 | class_to_generate['init'] = '''def __init__(self%(args)s, **kwargs): 314 | """ 315 | %(docstring)s 316 | """ 317 | %(init_body)s 318 | self.kwargs = kwargs 319 | ''' % dict(args=args, init_body=init_body, docstring=docstring) 320 | 321 | class_to_generate['init'] = _indent_lines(class_to_generate['init']) 322 | 323 | 324 | def update_class_to_generate_props(class_to_generate): 325 | import json 326 | 327 | def default(o): 328 | if isinstance(o, Ref): 329 | return o.ref 330 | raise AssertionError('Unhandled: %s' % (o,)) 331 | 332 | properties = class_to_generate['properties'] 333 | class_to_generate['props'] = ' __props__ = %s' % _indent_lines( 334 | json.dumps(properties, indent=4, default=default)).strip() 335 | 336 | 337 | def update_class_to_generate_refs(class_to_generate): 338 | properties = class_to_generate['properties'] 339 | class_to_generate['refs'] = ' __refs__ = %s' % _OrderedSet(key for (key, val) in properties.items() if val['type'].__class__ == Ref).set_repr() 340 | 341 | 342 | def update_class_to_generate_objects(classes_to_generate, class_to_generate): 343 | properties = class_to_generate['properties'] 344 | for key, val in properties.items(): 345 | if val['type'] == 'object': 346 | create_new = val.copy() 347 | create_new.update({ 348 | 'name': '%s%s' % (class_to_generate['name'], key.title()), 349 | 'description': ' "%s" of %s' % (key, class_to_generate['name']) 350 | }) 351 | if 'properties' not in create_new: 352 | create_new['properties'] = {} 353 | 354 | assert create_new['name'] not in classes_to_generate 355 | classes_to_generate[create_new['name']] = create_new 356 | 357 | update_class_to_generate_type(create_new) 358 | update_class_to_generate_props(create_new) 359 | 360 | # Update nested object types 361 | update_class_to_generate_objects(classes_to_generate, create_new) 362 | 363 | val['type'] = Ref(create_new['name']) 364 | val.pop('properties', None) 365 | 366 | 367 | def prepend_encoding(filename, encoding): 368 | with open(filename, 'r+') as f: 369 | content = f.read() 370 | f.seek(0, 0) 371 | f.write("# -*- coding: %s -*-\n" % encoding + content) 372 | 373 | 374 | def clean(): 375 | import os 376 | script_dir = os.path.dirname(os.path.realpath(__file__)) 377 | try: 378 | os.remove(os.path.join(script_dir, "debugProtocol.json")) 379 | except FileNotFoundError: 380 | pass 381 | 382 | 383 | def gen_debugger_protocol(): 384 | import os.path 385 | import sys 386 | 387 | if sys.version_info[:2] < (3, 6): 388 | raise AssertionError('Must be run with Python 3.6 onwards (to keep dict order).') 389 | 390 | classes_to_generate = create_classes_to_generate_structure(load_schema_data()) 391 | 392 | class_to_generate = 
fill_properties_and_required_from_base(classes_to_generate) 393 | 394 | for class_to_generate in list(classes_to_generate.values()): 395 | update_class_to_generate_description(class_to_generate) 396 | update_class_to_generate_type(class_to_generate) 397 | update_class_to_generate_props(class_to_generate) 398 | update_class_to_generate_objects(classes_to_generate, class_to_generate) 399 | 400 | for class_to_generate in classes_to_generate.values(): 401 | update_class_to_generate_refs(class_to_generate) 402 | update_class_to_generate_init(class_to_generate) 403 | update_class_to_generate_to_json(class_to_generate) 404 | update_class_to_generate_register_dec(classes_to_generate, class_to_generate) 405 | 406 | class_template = ''' 407 | %(register_request)s%(register_dec)s 408 | class %(name)s(BaseSchema): 409 | """ 410 | %(description)s 411 | 412 | Note: automatically generated code. Do not edit manually. 413 | """ 414 | 415 | %(props)s 416 | %(refs)s 417 | 418 | __slots__ = list(__props__.keys()) + ['kwargs'] 419 | 420 | %(init)s 421 | 422 | %(to_dict)s 423 | ''' 424 | 425 | contents = [] 426 | contents.append('# Note: automatically generated code. Do not edit manually.') 427 | contents.append('from .base_schema import BaseSchema, register, register_request, register_response') 428 | contents.append('') 429 | for class_to_generate in classes_to_generate.values(): 430 | contents.append(class_template % class_to_generate) 431 | 432 | parent_dir = os.path.dirname(os.path.dirname(__file__)) 433 | schema = os.path.join(parent_dir, 'debug_adapter', 'schema.py') 434 | with open(schema, 'w', encoding='utf-8') as stream: 435 | stream.write('\n'.join(contents)) 436 | prepend_encoding(schema, "utf-8") 437 | clean() 438 | 439 | 440 | 441 | def _indent_lines(lines, indent=' '): 442 | out_lines = [] 443 | for line in lines.splitlines(keepends=True): 444 | out_lines.append(indent + line) 445 | 446 | return ''.join(out_lines) 447 | 448 | 449 | if __name__ == '__main__': 450 | 451 | 452 | gen_debugger_protocol() 453 | -------------------------------------------------------------------------------- /tests/__init__.py: -------------------------------------------------------------------------------- 1 | from . import environment 2 | -------------------------------------------------------------------------------- /tests/conftest.py: -------------------------------------------------------------------------------- 1 | # MIT License 2 | # 3 | # Copyright (c) 2020 Espressif Systems (Shanghai) Co. Ltd. 4 | # 5 | # Permission is hereby granted, free of charge, to any person obtaining a copy 6 | # of this software and associated documentation files (the "Software"), to deal 7 | # in the Software without restriction, including without limitation the rights to 8 | # use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies 9 | # of the Software, and to permit persons to whom the Software is furnished to do 10 | # so, subject to the following conditions: 11 | # 12 | # The above copyright notice and this permission notice shall be included in all 13 | # copies or substantial portions of the Software. 14 | # 15 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE 18 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | # SOFTWARE. 22 | # 23 | # SPDX-License-Identifier: MIT 24 | 25 | from tests import environment 26 | from debug_adapter.internal_classes import DaOpenOcdModes 27 | from datetime import datetime 28 | import debug_adapter 29 | import pytest 30 | import os 31 | 32 | 33 | @pytest.hookimpl(tryfirst=True) 34 | def pytest_configure(config): 35 | """ 36 | see: https://docs.pytest.org/en/stable/reference.html#ini-options-ref 37 | """ 38 | # set custom options only if none are provided from command line 39 | now = datetime.now() 40 | # create report target dir 41 | reports_dir = environment.ADAPTER_TOP_PATH / "tests" / "results" 42 | reports_dir.mkdir(parents=True, exist_ok=True) 43 | # custom report file 44 | config.option.xmlpath = str(reports_dir / ("report_%s.xml" % now.strftime('%H%M'))) 45 | 46 | 47 | @pytest.fixture() 48 | def setup_teardown(): 49 | print("setup") 50 | yield "setup_teardown" 51 | print("teardown") 52 | 53 | 54 | @pytest.fixture 55 | def coredump_args(): 56 | da_args = debug_adapter.DaArgs() 57 | da_args.debug = 4 58 | da_args.elfpath = str(environment.ADAPTER_TOP_PATH / "tests" / "target" / "blink.elf") 59 | da_args.core_file = str(environment.ADAPTER_TOP_PATH / "tests" / "target" / "coredump.elf") 60 | da_args.port = 43474 61 | da_args.device_name = "esp32" 62 | da_args.postmortem = True 63 | da_args.log_file = "debug.log" 64 | da_args.log_no_debug_console = True 65 | return da_args 66 | 67 | 68 | @pytest.fixture 69 | def hostapp_args(): 70 | da_args = debug_adapter.DaArgs() 71 | da_args.debug = 4 72 | testapp_name = "test_app" 73 | if os.name == 'nt': 74 | testapp_name += ".exe" 75 | da_args.elfpath = str(environment.ADAPTER_TOP_PATH / "tests" / "target" / "host" / testapp_name) 76 | da_args.log_file = "debug.log" 77 | da_args.log_no_debug_console = True 78 | da_args.oocd_mode = DaOpenOcdModes.NO_OOCD 79 | return da_args 80 | -------------------------------------------------------------------------------- /tests/environment.py: -------------------------------------------------------------------------------- 1 | # MIT License 2 | # 3 | # Copyright (c) 2020 Espressif Systems (Shanghai) Co. Ltd. 4 | # 5 | # Permission is hereby granted, free of charge, to any person obtaining a copy 6 | # of this software and associated documentation files (the "Software"), to deal 7 | # in the Software without restriction, including without limitation the rights to 8 | # use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies 9 | # of the Software, and to permit persons to whom the Software is furnished to do 10 | # so, subject to the following conditions: 11 | # 12 | # The above copyright notice and this permission notice shall be included in all 13 | # copies or substantial portions of the Software. 14 | # 15 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE 18 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | # SOFTWARE. 22 | # 23 | # SPDX-License-Identifier: MIT 24 | 25 | import sys 26 | import pathlib 27 | 28 | ADAPTER_TOP_PATH = pathlib.Path(__file__).parent.parent 29 | TEST_APP_HOST_SRC = ADAPTER_TOP_PATH / "tests" / "target" / "host" / "test_app.c" 30 | 31 | sys.path.append(str(ADAPTER_TOP_PATH.joinpath("tests"))) 32 | -------------------------------------------------------------------------------- /tests/helpers.py: -------------------------------------------------------------------------------- 1 | # MIT License 2 | # 3 | # Copyright (c) 2020 Espressif Systems (Shanghai) Co. Ltd. 4 | # 5 | # Permission is hereby granted, free of charge, to any person obtaining a copy 6 | # of this software and associated documentation files (the "Software"), to deal 7 | # in the Software without restriction, including without limitation the rights to 8 | # use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies 9 | # of the Software, and to permit persons to whom the Software is furnished to do 10 | # so, subject to the following conditions: 11 | # 12 | # The above copyright notice and this permission notice shall be included in all 13 | # copies or substantial portions of the Software. 14 | # 15 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | # SOFTWARE. 
22 | # 23 | # SPDX-License-Identifier: MIT 24 | 25 | import pathlib 26 | from debug_adapter import schema 27 | import timeline 28 | from tests.patterns import some 29 | 30 | 31 | def build_setbp_request(path, bps): 32 | if isinstance(path, pathlib.Path): 33 | path = str(path) 34 | rq = schema.SetBreakpointsRequest(arguments={ 35 | "source": schema.Source(path=path), 36 | "breakpoints": bps, 37 | }) 38 | return rq 39 | 40 | 41 | def set_breakpoints(ts, path, bps): 42 | rq = build_setbp_request(path, bps) 43 | resp = ts.send_request(rq) 44 | assert resp.success 45 | 46 | 47 | def continue_till_stopped(ts, stop_reason, thread_id=None, timeout=5): 48 | rq = schema.ContinueRequest(arguments=schema.ContinueArguments(0)) 49 | resp = ts.send_request(rq) 50 | assert resp.success 51 | 52 | expect_body = {"reason": stop_reason} 53 | if thread_id is not None: 54 | expect_body['threadId'] = thread_id 55 | expectation = timeline.Event(event="stopped", body=some.dict.containing(expect_body)) 56 | result = ts.wait_for(expectation, timeout_s=timeout) 57 | assert result 58 | 59 | 60 | def get_stack_trace(ts, thread_id, start, num): 61 | rq = schema.StackTraceRequest(arguments=schema.StackTraceArguments(threadId=thread_id, startFrame=0, levels=20)) 62 | resp = ts.send_request(rq) 63 | return resp.body.get("stackFrames") 64 | 65 | 66 | def get_top_frame_info(ts, thread_id): 67 | stack = get_stack_trace(ts, thread_id, 0, 1) 68 | assert len(stack) > 0 69 | line = stack[0].get("line") 70 | name = stack[0].get("name") 71 | return line, name 72 | -------------------------------------------------------------------------------- /tests/patterns/__init__.py: -------------------------------------------------------------------------------- 1 | # Copyright (c) Microsoft Corporation. All rights reserved. 2 | # Licensed under the MIT License. See LICENSE in the project root 3 | # for license information. 4 | 5 | from __future__ import absolute_import, division, print_function, unicode_literals 6 | 7 | """Do not import this package directly - import tests.patterns.some instead. 8 | """ 9 | 10 | # Wire up some.dap to be an alias for dap, to allow writing some.dap.id etc. 11 | from tests.patterns import some 12 | from tests.patterns import dap 13 | 14 | some.dap = dap 15 | -------------------------------------------------------------------------------- /tests/patterns/_impl.py: -------------------------------------------------------------------------------- 1 | # Copyright (c) Microsoft Corporation. All rights reserved. 2 | # Licensed under the MIT License. See LICENSE in the project root 3 | # for license information. 4 | 5 | from __future__ import absolute_import, division, print_function, unicode_literals 6 | 7 | # The actual patterns are defined here, so that tests.patterns.some can redefine 8 | # builtin names like str, int etc without affecting the implementations in this 9 | # file - some.* then provides shorthand aliases. 10 | 11 | import collections 12 | import itertools 13 | import py.path 14 | import re 15 | import sys 16 | 17 | from debugpy.common import compat, fmt 18 | from debugpy.common.compat import unicode, xrange 19 | import pydevd_file_utils 20 | 21 | 22 | class Some(object): 23 | """A pattern that can be tested against a value with == to see if it matches. 
24 | """ 25 | 26 | def matches(self, value): 27 | raise NotImplementedError 28 | 29 | def __repr__(self): 30 | try: 31 | return self.name 32 | except AttributeError: 33 | raise NotImplementedError 34 | 35 | def __eq__(self, value): 36 | return self.matches(value) 37 | 38 | def __ne__(self, value): 39 | return not self.matches(value) 40 | 41 | def __invert__(self): 42 | """The inverse pattern - matches everything that this one doesn't. 43 | """ 44 | return Not(self) 45 | 46 | def __or__(self, pattern): 47 | """Union pattern - matches if either of the two patterns match. 48 | """ 49 | return Either(self, pattern) 50 | 51 | def such_that(self, condition): 52 | """Same pattern, but it only matches if condition() is true. 53 | """ 54 | return SuchThat(self, condition) 55 | 56 | def in_range(self, start, stop): 57 | """Same pattern, but it only matches if the start <= value < stop. 58 | """ 59 | return InRange(self, start, stop) 60 | 61 | def equal_to(self, obj): 62 | return EqualTo(self, obj) 63 | 64 | def not_equal_to(self, obj): 65 | return NotEqualTo(self, obj) 66 | 67 | def same_as(self, obj): 68 | return SameAs(self, obj) 69 | 70 | def matching(self, regex, flags=0): 71 | """Same pattern, but it only matches if re.match(regex, flags) produces 72 | a match that corresponds to the entire string. 73 | """ 74 | return Matching(self, regex, flags) 75 | 76 | # Used to obtain the JSON representation for logging. This is a hack, because 77 | # JSON serialization doesn't allow to customize raw output - this function can 78 | # only substitute for another object that is normally JSON-serializable. But 79 | # for patterns, we want <...> in the logs, not'"<...>". Thus, we insert dummy 80 | # marker chars here, such that it looks like "\002<...>\003" in serialized JSON - 81 | # and then tests.timeline._describe_message does a string substitution on the 82 | # result to strip out '"\002' and '\003"'. 83 | def __getstate__(self): 84 | return "\002" + repr(self) + "\003" 85 | 86 | 87 | class Not(Some): 88 | """Matches the inverse of the pattern. 89 | """ 90 | 91 | def __init__(self, pattern): 92 | self.pattern = pattern 93 | 94 | def __repr__(self): 95 | return fmt("~{0!r}", self.pattern) 96 | 97 | def matches(self, value): 98 | return value != self.pattern 99 | 100 | 101 | class Either(Some): 102 | """Matches either of the patterns. 103 | """ 104 | 105 | def __init__(self, *patterns): 106 | assert len(patterns) > 0 107 | self.patterns = tuple(patterns) 108 | 109 | def __repr__(self): 110 | try: 111 | return self.name 112 | except AttributeError: 113 | return fmt("({0})", " | ".join(repr(pat) for pat in self.patterns)) 114 | 115 | def matches(self, value): 116 | return any(pattern == value for pattern in self.patterns) 117 | 118 | def __or__(self, pattern): 119 | return Either(*(self.patterns + (pattern,))) 120 | 121 | 122 | class Object(Some): 123 | """Matches anything. 124 | """ 125 | 126 | name = "" 127 | 128 | def matches(self, value): 129 | return True 130 | 131 | 132 | class Thing(Some): 133 | """Matches anything that is not None. 134 | """ 135 | 136 | name = "<>" 137 | 138 | def matches(self, value): 139 | return value is not None 140 | 141 | 142 | class InstanceOf(Some): 143 | """Matches any object that is an instance of the specified type. 
144 | """ 145 | 146 | def __init__(self, classinfo, name=None): 147 | if isinstance(classinfo, type): 148 | classinfo = (classinfo,) 149 | assert len(classinfo) > 0 and all( 150 | (isinstance(cls, type) for cls in classinfo) 151 | ), "classinfo must be a type or a tuple of types" 152 | 153 | self.name = name 154 | self.classinfo = classinfo 155 | 156 | def __repr__(self): 157 | if self.name: 158 | name = self.name 159 | else: 160 | name = " | ".join(cls.__name__ for cls in self.classinfo) 161 | return fmt("<{0}>", name) 162 | 163 | def matches(self, value): 164 | return isinstance(value, self.classinfo) 165 | 166 | 167 | class Path(Some): 168 | """Matches any string that matches the specified path. 169 | 170 | Uses os.path.normcase() to normalize both strings before comparison. 171 | 172 | If one string is unicode, but the other one is not, both strings are normalized 173 | to unicode using sys.getfilesystemencoding(). 174 | """ 175 | 176 | def __init__(self, path): 177 | if isinstance(path, py.path.local): 178 | path = path.strpath 179 | if isinstance(path, bytes): 180 | path = path.encode(sys.getfilesystemencoding()) 181 | assert isinstance(path, unicode) 182 | self.path = path 183 | 184 | def __repr__(self): 185 | return fmt("path({0!r})", self.path) 186 | 187 | def __str__(self): 188 | return compat.filename_str(self.path) 189 | 190 | def __unicode__(self): 191 | return self.path 192 | 193 | def __getstate__(self): 194 | return self.path 195 | 196 | def matches(self, other): 197 | if isinstance(other, py.path.local): 198 | other = other.strpath 199 | 200 | if isinstance(other, unicode): 201 | pass 202 | elif isinstance(other, bytes): 203 | other = other.encode(sys.getfilesystemencoding()) 204 | else: 205 | return NotImplemented 206 | 207 | left = pydevd_file_utils.get_path_with_real_case(self.path) 208 | right = pydevd_file_utils.get_path_with_real_case(other) 209 | return left == right 210 | 211 | 212 | class ListContaining(Some): 213 | """Matches any list that contains the specified subsequence of elements. 214 | """ 215 | 216 | def __init__(self, *items): 217 | self.items = tuple(items) 218 | 219 | def __repr__(self): 220 | if not self.items: 221 | return "[...]" 222 | s = repr(list(self.items)) 223 | return fmt("[..., {0}, ...]", s[1:-1]) 224 | 225 | def __getstate__(self): 226 | items = ["\002...\003"] 227 | if not self.items: 228 | return items 229 | items *= 2 230 | items[1:1] = self.items 231 | return items 232 | 233 | def matches(self, other): 234 | if not isinstance(other, list): 235 | return NotImplemented 236 | 237 | items = self.items 238 | if not items: 239 | return True # every list contains an empty sequence 240 | if len(items) == 1: 241 | return self.items[0] in other 242 | 243 | # Zip the other list with itself, shifting by one every time, to produce 244 | # tuples of equal length with items - i.e. all potential subsequences. So, 245 | # given other=[1, 2, 3, 4, 5] and items=(2, 3, 4), we want to get a list 246 | # like [(1, 2, 3), (2, 3, 4), (3, 4, 5)] - and then search for items in it. 
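        # Note: the element-wise comparison in "subseq == items" below goes through
        # Some.__eq__, so the expected items may themselves be nested patterns
        # (e.g. some.dict.containing(...)) rather than plain values.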
247 | iters = [itertools.islice(other, i, None) for i in xrange(0, len(items))] 248 | subseqs = compat.izip(*iters) 249 | return any(subseq == items for subseq in subseqs) 250 | 251 | 252 | class DictContaining(Some): 253 | """Matches any dict that contains the specified key-value pairs:: 254 | 255 | d1 = {'a': 1, 'b': 2, 'c': 3} 256 | d2 = {'a': 1, 'b': 2} 257 | assert d1 == some.dict.containing(d2) 258 | assert d2 != some.dict.containing(d1) 259 | """ 260 | 261 | def __init__(self, items): 262 | self.items = collections.OrderedDict(items) 263 | 264 | def __repr__(self): 265 | return dict.__repr__(self.items)[:-1] + ", ...}" 266 | 267 | def __getstate__(self): 268 | items = self.items.copy() 269 | items["\002..."] = "...\003" 270 | return items 271 | 272 | def matches(self, other): 273 | if not isinstance(other, dict): 274 | return NotImplemented 275 | any = Object() 276 | d = {key: any for key in other} 277 | d.update(self.items) 278 | return d == other 279 | 280 | 281 | class Also(Some): 282 | """Base class for patterns that narrow down another pattern. 283 | """ 284 | 285 | def __init__(self, pattern): 286 | self.pattern = pattern 287 | 288 | def matches(self, value): 289 | return self.pattern == value and self._also(value) 290 | 291 | def _also(self, value): 292 | raise NotImplementedError 293 | 294 | 295 | class SuchThat(Also): 296 | """Matches only if condition is true. 297 | """ 298 | 299 | def __init__(self, pattern, condition): 300 | super(SuchThat, self).__init__(pattern) 301 | self.condition = condition 302 | 303 | def __repr__(self): 304 | try: 305 | return self.name 306 | except AttributeError: 307 | return fmt("({0!r} if {1})", self.pattern, compat.nameof(self.condition)) 308 | 309 | def _also(self, value): 310 | return self.condition(value) 311 | 312 | 313 | class InRange(Also): 314 | """Matches only if the value is within the specified range. 315 | """ 316 | 317 | def __init__(self, pattern, start, stop): 318 | super(InRange, self).__init__(pattern) 319 | self.start = start 320 | self.stop = stop 321 | 322 | def __repr__(self): 323 | try: 324 | return self.name 325 | except AttributeError: 326 | return fmt("({0!r} <= {1!r} < {2!r})", self.start, self.pattern, self.stop) 327 | 328 | def _also(self, value): 329 | return self.start <= value < self.stop 330 | 331 | 332 | class EqualTo(Also): 333 | """Matches any object that is equal to the specified object. 334 | """ 335 | 336 | def __init__(self, pattern, obj): 337 | super(EqualTo, self).__init__(pattern) 338 | self.obj = obj 339 | 340 | def __repr__(self): 341 | return repr(self.obj) 342 | 343 | def __str__(self): 344 | return str(self.obj) 345 | 346 | def __unicode__(self): 347 | return unicode(self.obj) 348 | 349 | def __getstate__(self): 350 | return self.obj 351 | 352 | def _also(self, value): 353 | return self.obj == value 354 | 355 | 356 | class NotEqualTo(Also): 357 | """Matches any object that is not equal to the specified object. 358 | """ 359 | 360 | def __init__(self, pattern, obj): 361 | super(NotEqualTo, self).__init__(pattern) 362 | self.obj = obj 363 | 364 | def __repr__(self): 365 | return fmt("", self.obj) 366 | 367 | def _also(self, value): 368 | return self.obj != value 369 | 370 | 371 | class SameAs(Also): 372 | """Matches one specific object only (i.e. makes '==' behave like 'is'). 
373 | """ 374 | 375 | def __init__(self, pattern, obj): 376 | super(SameAs, self).__init__(pattern) 377 | self.obj = obj 378 | 379 | def __repr__(self): 380 | return fmt("", self.obj) 381 | 382 | def _also(self, value): 383 | return self.obj is value 384 | 385 | 386 | class Matching(Also): 387 | """Matches any string that matches the specified regular expression. 388 | """ 389 | 390 | def __init__(self, pattern, regex, flags=0): 391 | assert isinstance(regex, bytes) or isinstance(regex, unicode) 392 | super(Matching, self).__init__(pattern) 393 | self.regex = regex 394 | self.flags = flags 395 | 396 | def __repr__(self): 397 | s = repr(self.regex) 398 | if s[0] in "bu": 399 | return s[0] + "/" + s[2:-1] + "/" 400 | else: 401 | return "/" + s[1:-1] + "/" 402 | 403 | def _also(self, value): 404 | regex = self.regex 405 | 406 | # re.match() always starts matching at the beginning, but does not require 407 | # a complete match of the string - append "$" to ensure the latter. 408 | if isinstance(regex, bytes): 409 | if not isinstance(value, bytes): 410 | return NotImplemented 411 | regex += b"$" 412 | elif isinstance(regex, unicode): 413 | if not isinstance(value, unicode): 414 | return NotImplemented 415 | regex += "$" 416 | else: 417 | raise AssertionError() 418 | 419 | return re.match(regex, value, self.flags) is not None 420 | -------------------------------------------------------------------------------- /tests/patterns/dap.py: -------------------------------------------------------------------------------- 1 | # Copyright (c) Microsoft Corporation. All rights reserved. 2 | # Licensed under the MIT License. See LICENSE in the project root 3 | # for license information. 4 | 5 | from __future__ import absolute_import, division, print_function, unicode_literals 6 | 7 | """Patterns that are specific to the Debug Adapter Protocol. 8 | """ 9 | 10 | import py.path 11 | 12 | from debugpy.common.compat import unicode 13 | # from tests import code 14 | from tests.patterns import some, _impl 15 | 16 | 17 | id = some.int.in_range(0, 10000) 18 | """Matches a DAP "id", assuming some reasonable range for an implementation that 19 | generates those ids sequentially. 20 | """ 21 | 22 | 23 | def source(path, **kwargs): 24 | """Matches DAP Source objects. 25 | """ 26 | if isinstance(path, py.path.local): 27 | path = some.path(path) 28 | d = {"path": path} 29 | d.update(kwargs) 30 | return some.dict.containing(d) 31 | 32 | 33 | # def frame(source, line, **kwargs): 34 | # """Matches DAP Frame objects. 35 | 36 | # If source is py.path.local, it's automatically wrapped with some.dap.source(). 37 | 38 | # If line is unicode, it is treated as a line marker, and translated to a line 39 | # number via get_marked_line_numbers(source["path"]) if possible. 
40 | # """ 41 | 42 | # if isinstance(source, py.path.local): 43 | # source = some.dap.source(source) 44 | 45 | # if isinstance(line, unicode): 46 | # if isinstance(source, dict): 47 | # path = source["path"] 48 | # elif isinstance(source, _impl.DictContaining): 49 | # path = source.items["path"] 50 | # else: 51 | # path = None 52 | # assert isinstance( 53 | # path, _impl.Path 54 | # ), "source must be some.dap.source() to use line markers in some.dap.frame()" 55 | # line = code.get_marked_line_numbers(path.path)[line] 56 | 57 | # d = {"id": some.dap.id, "source": source, "line": line, "column": 1} 58 | # d.update(kwargs) 59 | # return some.dict.containing(d) 60 | -------------------------------------------------------------------------------- /tests/patterns/some.py: -------------------------------------------------------------------------------- 1 | # Copyright (c) Microsoft Corporation. All rights reserved. 2 | # Licensed under the MIT License. See LICENSE in the project root 3 | # for license information. 4 | 5 | from __future__ import absolute_import, division, print_function, unicode_literals 6 | 7 | """Pattern matching for recursive Python data structures. 8 | 9 | Usage:: 10 | 11 | from tests.patterns import some 12 | 13 | assert object() == some.object 14 | assert None == some.object 15 | assert None != some.thing 16 | assert None == ~some.thing # inverse 17 | assert None == some.thing | None 18 | 19 | assert 123 == some.thing.in_range(0, 200) 20 | assert "abc" == some.thing.such_that(lambda s: s.startswith("ab")) 21 | 22 | xs = [] 23 | assert xs == some.specific_object(xs) # xs is xs 24 | assert xs != some.specific_object([]) # xs is not [] 25 | 26 | assert Exception() == some.instanceof(Exception) 27 | assert 123 == some.instanceof((int, str)) 28 | assert "abc" == some.instanceof((int, str)) 29 | 30 | assert True == some.bool 31 | assert 123.456 == some.number 32 | assert 123 == some.int 33 | assert Exception() == some.error 34 | assert object() == some.object.same_as(object()) 35 | 36 | assert b"abc" == some.bytes 37 | assert u"abc" == some.str 38 | if sys.version_info < (3,): 39 | assert b"abc" == some.str 40 | else: 41 | assert b"abc" != some.str 42 | 43 | assert "abbc" == some.str.starting_with("ab") 44 | assert "abbc" == some.str.ending_with("bc") 45 | assert "abbc" == some.str.containing("bb") 46 | 47 | assert "abbc" == some.str.matching(r".(b+).") 48 | assert "abbc" != some.str.matching(r"ab") 49 | assert "abbc" != some.str.matching(r"bc") 50 | 51 | if sys.platform == "win32": 52 | assert "\\Foo\\Bar" == some.path("/foo/bar") 53 | else: 54 | assert "/Foo/Bar" != some.path("/foo/bar") 55 | 56 | assert { 57 | "bool": True, 58 | "list": [None, True, 123], 59 | "dict": { 60 | "int": 123, 61 | "str": "abc", 62 | }, 63 | } == some.dict.containing({ 64 | "list": [None, some.bool, some.int | some.str], 65 | "dict": some.dict.containing({ 66 | "int": some.int.in_range(100, 200), 67 | }) 68 | }) 69 | """ 70 | 71 | __all__ = [ 72 | "bool", 73 | "bytes", 74 | "dap", 75 | "dict", 76 | "error", 77 | "instanceof", 78 | "int", 79 | "list", 80 | "number", 81 | "path", 82 | "str", 83 | "thing", 84 | "tuple", 85 | ] 86 | 87 | import numbers 88 | import re 89 | import sys 90 | 91 | from debugpy.common.compat import builtins 92 | from tests.patterns import _impl 93 | 94 | 95 | object = _impl.Object() 96 | thing = _impl.Thing() 97 | instanceof = _impl.InstanceOf 98 | path = _impl.Path 99 | 100 | 101 | bool = instanceof(builtins.bool) 102 | number = instanceof(numbers.Real, "number") 103 | int = 
instanceof(numbers.Integral, "int") 104 | tuple = instanceof(builtins.tuple) 105 | error = instanceof(Exception) 106 | 107 | 108 | bytes = instanceof(builtins.bytes) 109 | bytes.starting_with = lambda prefix: bytes.matching( 110 | re.escape(prefix) + b".*", re.DOTALL 111 | ) 112 | bytes.ending_with = lambda suffix: bytes.matching(b".*" + re.escape(suffix), re.DOTALL) 113 | bytes.containing = lambda sub: bytes.matching(b".*" + re.escape(sub) + b".*", re.DOTALL) 114 | 115 | 116 | """In Python 2, matches both str and unicode. In Python 3, only matches str. 117 | """ 118 | if sys.version_info < (3,): 119 | str = instanceof((builtins.str, builtins.unicode), "str") 120 | else: 121 | str = instanceof(builtins.str) 122 | 123 | str.starting_with = lambda prefix: str.matching(re.escape(prefix) + ".*", re.DOTALL) 124 | str.ending_with = lambda suffix: str.matching(".*" + re.escape(suffix), re.DOTALL) 125 | str.containing = lambda sub: str.matching(".*" + re.escape(sub) + ".*", re.DOTALL) 126 | 127 | 128 | list = instanceof(builtins.list) 129 | list.containing = _impl.ListContaining 130 | 131 | 132 | dict = instanceof(builtins.dict) 133 | dict.containing = _impl.DictContaining 134 | 135 | 136 | # Set in __init__.py to avoid circular dependency. 137 | dap = None 138 | -------------------------------------------------------------------------------- /tests/pytest_fixtures.py: -------------------------------------------------------------------------------- 1 | # Copyright (c) Microsoft Corporation. All rights reserved. 2 | # Additions Copyright (c) 2020, Espressif Systems (Shanghai) Co. Ltd. 3 | # Licensed under the MIT License. See LICENSE in the project root 4 | # for license information. 5 | 6 | from __future__ import absolute_import, division, print_function, unicode_literals 7 | 8 | import py 9 | import pytest 10 | import sys 11 | import threading 12 | from debugpy.common import compat 13 | 14 | 15 | @pytest.fixture 16 | def daemon(request): 17 | """Provides a factory function for daemon threads. The returned thread is 18 | started immediately, and it must not be alive by the time the test returns. 19 | """ 20 | 21 | daemons = [] 22 | 23 | def factory(func, name_suffix=""): 24 | name = func.__name__ + name_suffix 25 | thread = threading.Thread(target=func, name=name) 26 | thread.daemon = True 27 | daemons.append(thread) 28 | thread.start() 29 | return thread 30 | 31 | yield factory 32 | 33 | try: 34 | failed = request.node.call_report.failed 35 | except AttributeError: 36 | pass 37 | else: 38 | if not failed: 39 | for thread in daemons: 40 | assert not thread.is_alive() 41 | 42 | 43 | if sys.platform != "win32": 44 | 45 | @pytest.fixture 46 | def long_tmpdir(request, tmpdir): 47 | return tmpdir 48 | 49 | 50 | else: 51 | import ctypes 52 | 53 | GetLongPathNameW = ctypes.windll.kernel32.GetLongPathNameW 54 | GetLongPathNameW.argtypes = [ctypes.c_wchar_p, ctypes.c_wchar_p, ctypes.c_uint32] 55 | GetLongPathNameW.restype = ctypes.c_uint32 56 | 57 | @pytest.fixture 58 | def long_tmpdir(request, tmpdir): 59 | """Like tmpdir, but ensures that it's a long rather than short filename on Win32. 
60 | """ 61 | path = compat.filename(tmpdir.strpath) 62 | buffer = ctypes.create_unicode_buffer(512) 63 | if GetLongPathNameW(path, buffer, len(buffer)): 64 | path = buffer.value 65 | return py.path.local(path) 66 | -------------------------------------------------------------------------------- /tests/requirements.txt: -------------------------------------------------------------------------------- 1 | ## Used to run the tests: 2 | debugpy==1.6.0 3 | pydevd 4 | pytest 5 | pytest-xdist 6 | pytest-cov 7 | pytest-timeout 8 | 9 | ## Used by CI tasks: 10 | flake8 11 | -------------------------------------------------------------------------------- /tests/standard_requests.py: -------------------------------------------------------------------------------- 1 | # MIT License 2 | # 3 | # Copyright (c) 2020 Espressif Systems (Shanghai) Co. Ltd. 4 | # 5 | # Permission is hereby granted, free of charge, to any person obtaining a copy 6 | # of this software and associated documentation files (the "Software"), to deal 7 | # in the Software without restriction, including without limitation the rights to 8 | # use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies 9 | # of the Software, and to permit persons to whom the Software is furnished to do 10 | # so, subject to the following conditions: 11 | # 12 | # The above copyright notice and this permission notice shall be included in all 13 | # copies or substantial portions of the Software. 14 | # 15 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | # SOFTWARE. 22 | # 23 | # SPDX-License-Identifier: MIT 24 | 25 | from debug_adapter import schema 26 | 27 | REQUEST_INIT = schema.InitializeRequest(arguments={ 28 | "adapterID": "espidf", 29 | "clientID": "vscode", 30 | "clientName": "Visual Studio Code", 31 | "columnsStartAt1": True, 32 | "linesStartAt1": True, 33 | "locale": "en-us", 34 | "pathFormat": "path", 35 | "supportsRunInTerminalRequest": True, 36 | "supportsVariablePaging": True, 37 | "supportsVariableType": True 38 | }) 39 | 40 | REQUEST_LAUNCH = schema.LaunchRequest(arguments={ 41 | "__sessionId": "79b3673d-5b08-44e5-98ca-429a521464cb", 42 | "externalConsole": False, 43 | "logging": { 44 | "engineLogging": True 45 | }, 46 | "name": "Launch", 47 | "preLaunchTask": "adapter", 48 | "request": "launch", 49 | "type": "espidf" 50 | }) 51 | -------------------------------------------------------------------------------- /tests/t_session.py: -------------------------------------------------------------------------------- 1 | # MIT License 2 | # 3 | # Copyright (c) 2020 Espressif Systems (Shanghai) Co. Ltd. 
4 | # 5 | # Permission is hereby granted, free of charge, to any person obtaining a copy 6 | # of this software and associated documentation files (the "Software"), to deal 7 | # in the Software without restriction, including without limitation the rights to 8 | # use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies 9 | # of the Software, and to permit persons to whom the Software is furnished to do 10 | # so, subject to the following conditions: 11 | # 12 | # The above copyright notice and this permission notice shall be included in all 13 | # copies or substantial portions of the Software. 14 | # 15 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | # SOFTWARE. 22 | # 23 | # SPDX-License-Identifier: MIT 24 | 25 | from debug_adapter import DebugAdapter, schema, log 26 | from debug_adapter.internal_classes import DaArgs, DaRunState 27 | from debugpy.common import messaging 28 | from debugpy.common.messaging import JsonIOStream 29 | from debugpy.common.sockets import create_client 30 | from tests import timeline 31 | from threading import Thread, Lock 32 | from typing import Union 33 | import collections 34 | import functools 35 | import itertools 36 | import json 37 | import select 38 | import socket 39 | import time 40 | 41 | 42 | class MessageFactory(object): 43 | """A factory for DAP messages that are not bound to a message channel. 44 | """ 45 | def __init__(self): 46 | self._seq_iter = itertools.count(1) 47 | self.messages = collections.OrderedDict() 48 | self.event = functools.partial(self._make, messaging.Event) 49 | self.request = functools.partial(self._make, messaging.Request) 50 | self.response = functools.partial(self._make, messaging.Response) 51 | 52 | def _make(self, message_type, *args, **kwargs): 53 | message = message_type(None, next(self._seq_iter), *args, **kwargs) 54 | self.messages[message.seq] = message 55 | return message 56 | 57 | 58 | class TSession: 59 | def __init__(self, adapter_args, name="noname"): 60 | """ 61 | Test session. 62 | 63 | Parameters 64 | ---------- 65 | adapter_args : DaArgs 66 | [description] 67 | """ 68 | self.session_name = name # type: str 69 | self._adapter_args = adapter_args # type: DaArgs 70 | self._adapter_obj = None # type: Union[DebugAdapter, None] 71 | self._adapter_thread = None # type: Union[Thread, None] 72 | # self._client = None # type: ClientDa 73 | self._client_stream = None # type: Union[JsonIOStream, None] 74 | self._client_socket = None # type: Union[socket.socket, None] 75 | self.session_timeline = timeline.Timeline("Timeline of \"%s\"" % self.session_name) 76 | self._message_factory = MessageFactory() 77 | self._reader_thread = None # type: Union[Thread, None] 78 | self._ongoing_request = None # used for catching responses. 
Will set to None after every recorded responce 79 | self._last_response = None 80 | self.__stop = False 81 | self.client_busy = Lock() 82 | 83 | def __enter__(self): 84 | log.info("================ Debug session ENTER: %s ================ " % self.session_name) 85 | self.session_start() 86 | return self 87 | 88 | def __exit__(self, exception_type, exception_value, traceback): 89 | if self.session_timeline.is_frozen: 90 | self.session_timeline.unfreeze() 91 | 92 | if exception_type is not None: 93 | print("closed with exception: " + str(exception_type)) 94 | self.session_stop() 95 | 96 | # Inherited from debugpy ----------------------------------------------- 97 | # # Work around https://bugs.python.org/issue37380 98 | # for popen in self.debuggee, self.adapter: 99 | # if popen is not None and popen.returncode is None: 100 | # popen.returncode = -1 101 | 102 | def is_input_message(self, timeout_s): 103 | ready, _, _ = select.select([self._client_socket], [], [], timeout_s) 104 | if len(ready): 105 | return True 106 | return False 107 | 108 | def session_start(self): 109 | self.start_adapter() 110 | while not self._adapter_obj.state.general_state == DaRunState.READY_TO_CONNECT: 111 | pass 112 | self.start_client() 113 | self._reader_thread = Thread(target=self._client_stream_monitor) 114 | self._reader_thread.start() 115 | 116 | def session_stop(self): 117 | self.__stop = True 118 | while self.client_busy.locked(): 119 | pass # waiting for finishing of operations 120 | if self._adapter_obj: 121 | disconnect_rq = schema.Request(command="disconnect", seq=-1, arguments={"restart": False}) 122 | self.send_request(disconnect_rq, False) 123 | # give adapter time to cleanup 124 | time.sleep(2.0) 125 | if self._client_stream and not self._client_stream._closed: 126 | self._client_stream.close() 127 | if self._client_socket and not self._client_socket._closed: 128 | self._client_socket.close() 129 | # avoid errors due to unobserved responses and events 130 | # TODO: avoid using 'TSession.wait_for' in tests. Instead of it, 131 | # compose expected message sequences and use 'timeline.wait_until_realized'. 
132 | # Using that approach we will ensure that communication is done as expected and 133 | # do not need to call `timeline.observe_all` 134 | with self.session_timeline.frozen(): 135 | self.session_timeline.observe_all() 136 | self.session_timeline.close() 137 | 138 | def start_client(self): 139 | self._client_socket = create_client() 140 | self._client_socket.connect(("127.0.0.1", self._adapter_args.port)) 141 | self._client_stream = JsonIOStream.from_socket(self._client_socket) 142 | 143 | def start_adapter(self): 144 | self._adapter_obj = DebugAdapter(self._adapter_args) 145 | self._adapter_thread = Thread(target=self._adapter_obj.adapter_run) 146 | self._adapter_thread.start() 147 | 148 | def send_request(self, request: Union[schema.Request, dict], wait_responce=True): 149 | """ 150 | Parameters 151 | ---------- 152 | request : Union[schema.Request, dict] 153 | """ 154 | if isinstance(request, dict): 155 | request_dict = request 156 | else: 157 | request_dict = request.to_dict() 158 | self._client_stream.write_json(request_dict) 159 | request_msg = self._message_factory.request(request.command, request.arguments) 160 | self._ongoing_request = self.session_timeline.record_request(request_msg) 161 | 162 | log.warning("--->>> SENT JSON: %s" % request_dict) 163 | if wait_responce: 164 | while self._ongoing_request: 165 | pass 166 | assert request.command == self._last_response.command 167 | # TODO: Enable check below, currently DebugAdapter does not reflect request_seq in responses 168 | # assert self._ongoing_request.message.seq == self._last_response.request_seq 169 | return self._last_response 170 | 171 | def is_in_timeline(self, expectation): 172 | with self.session_timeline.frozen(): 173 | return expectation in self.session_timeline 174 | 175 | def wait_for(self, expectation, timeout_s=0): 176 | if timeout_s is not None: 177 | end = time.process_time() + timeout_s 178 | while not self.is_in_timeline(expectation): 179 | if timeout_s is not None and (time.process_time() >= end): 180 | return False 181 | return True 182 | 183 | def _client_stream_monitor(self): 184 | while not self.__stop: 185 | try: 186 | with self.client_busy: 187 | if self.is_input_message(0.05): 188 | new_json = self._client_stream.read_json() 189 | log.warning("<<<--- NEW JSON: %s" % new_json) 190 | new_json_type = new_json.get("type") 191 | if new_json_type == "event": 192 | event_msg = self._message_factory.event(new_json["event"], new_json.get("body", {})) 193 | event_msg.seq = new_json["seq"] 194 | self.session_timeline.record_event(event_msg) 195 | elif new_json_type == "response": 196 | assert self._ongoing_request 197 | response_msg = self._message_factory.response(new_json["command"], new_json.get("body", {})) 198 | response_msg.seq = new_json["request_seq"] 199 | response_msg.request = self._ongoing_request.message 200 | self._last_response = schema.Response( 201 | seq=response_msg.seq, 202 | request_seq=self._ongoing_request.seq, 203 | success=response_msg.success, 204 | body=response_msg.body, 205 | command=response_msg.request.command 206 | ) 207 | self.session_timeline.record_response(self._ongoing_request, response_msg) 208 | self._ongoing_request = None 209 | else: 210 | log.warning("Unknown message: %s" % new_json) 211 | else: 212 | pass 213 | except json.decoder.JSONDecodeError as e: 214 | log.warning("Failed to decode JSON: %s!" 
% e) 215 | except messaging.NoMoreMessages: 216 | pass 217 | -------------------------------------------------------------------------------- /tests/target/blink.elf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/espressif/esp-debug-adapter/0cabfa41372ae7b8d8133b0f82c7eb8e6025adc3/tests/target/blink.elf -------------------------------------------------------------------------------- /tests/target/build_all.py: -------------------------------------------------------------------------------- 1 | import os 2 | import pathlib 3 | 4 | HERE = pathlib.Path(__file__).parent 5 | test_app_srcs = [ 6 | HERE / "host" / "test_app.c", 7 | HERE / "host" / "test_app_src2.c" 8 | ] 9 | test_app_out = HERE / "host" / "test_app" 10 | 11 | os.system("gcc -O0 -g -pthread %s -o %s" % (' '.join(str(e) for e in test_app_srcs), str(test_app_out))) 12 | -------------------------------------------------------------------------------- /tests/target/coredump.elf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/espressif/esp-debug-adapter/0cabfa41372ae7b8d8133b0f82c7eb8e6025adc3/tests/target/coredump.elf -------------------------------------------------------------------------------- /tests/target/host/test_app.c: -------------------------------------------------------------------------------- 1 | #include 2 | #include 3 | #include 4 | 5 | extern int print_hamlet(void); 6 | 7 | static void start_new_thread(long num); 8 | 9 | static void *print_hamlet_thread(void *arg) 10 | { 11 | long num = (long)arg; 12 | int lines = print_hamlet(); 13 | printf("\n[Printed lines of the play: %d]\n", lines ); 14 | printf("[Bye!] from thread %ld\n", num); 15 | if (num < 5) 16 | start_new_thread(num + 1); 17 | } 18 | 19 | static void start_new_thread(long num) 20 | { 21 | pthread_t thread_id; 22 | 23 | sleep(1); 24 | int ret = pthread_create(&thread_id, NULL, print_hamlet_thread, (void *)num); 25 | if (ret != 0) { 26 | printf("Failed to create thread %ld\n", num); 27 | } else { 28 | printf("Created thread %ld, id 0x%lx\n", num, thread_id); 29 | } 30 | } 31 | 32 | int main() 33 | { 34 | start_new_thread(0); 35 | while(1); 36 | return 0; 37 | } -------------------------------------------------------------------------------- /tests/target/host/test_app_src2.c: -------------------------------------------------------------------------------- 1 | #include 2 | 3 | int print_hamlet(void){ 4 | int lines = 0; 5 | lines++; 6 | printf("HAMLET\n"); 7 | lines++; 8 | printf("\n"); 9 | lines++; 10 | printf("O, I die, Horatio;\n"); 11 | lines++; 12 | printf("The potent poison quite o'er-crows my spirit:\n"); 13 | lines++; 14 | printf("I cannot live to hear the news from England;\n"); 15 | lines++; 16 | printf("But I do prophesy the election lights\n"); 17 | lines++; 18 | printf("On Fortinbras: he has my dying voice;\n"); 19 | lines++; 20 | printf("So tell him, with the occurrents, more and less,\n"); 21 | lines++; 22 | printf("Which have solicited. The rest is silence.\n"); 23 | lines++; 24 | printf("\n"); 25 | lines++; 26 | printf("Dies\n"); 27 | return lines; 28 | } 29 | -------------------------------------------------------------------------------- /tests/test_arguments.py: -------------------------------------------------------------------------------- 1 | # MIT License 2 | # 3 | # Copyright (c) 2020 Espressif Systems (Shanghai) Co. Ltd. 
4 | # 5 | # Permission is hereby granted, free of charge, to any person obtaining a copy 6 | # of this software and associated documentation files (the "Software"), to deal 7 | # in the Software without restriction, including without limitation the rights to 8 | # use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies 9 | # of the Software, and to permit persons to whom the Software is furnished to do 10 | # so, subject to the following conditions: 11 | # 12 | # The above copyright notice and this permission notice shall be included in all 13 | # copies or substantial portions of the Software. 14 | # 15 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | # SOFTWARE. 22 | # 23 | # SPDX-License-Identifier: MIT 24 | 25 | from tests.conftest import setup_teardown # noqa: F401 26 | import debug_adapter 27 | import pytest 28 | 29 | 30 | @pytest.mark.timeout(30) 31 | def test_daargs_class(setup_teardown): # noqa: F811 32 | da_args = debug_adapter.DaArgs() 33 | assert isinstance(da_args, debug_adapter.DaArgs) 34 | 35 | val = "test_args.log" 36 | da_args.port = val 37 | assert da_args.port == val 38 | 39 | val = "test_args.log" 40 | da_args = debug_adapter.DaArgs(log_file=val) 41 | assert da_args.log_file == val 42 | 43 | da_args_new_int = debug_adapter.DaArgs(new_arg_int=123) 44 | assert da_args_new_int.new_arg_int == 123 45 | 46 | val = 123 47 | da_args = debug_adapter.DaArgs(new_arg_int=val) 48 | assert da_args.new_arg_int == val 49 | 50 | val = "123" 51 | da_args = debug_adapter.DaArgs(new_arg2_str=val) 52 | assert da_args.new_arg2_str == val 53 | 54 | 55 | if __name__ == "__main__": 56 | # run tests from this file; print all output 57 | pytest.main([__file__, "-s"]) 58 | -------------------------------------------------------------------------------- /tests/test_basic.py: -------------------------------------------------------------------------------- 1 | # MIT License 2 | # 3 | # Copyright (c) 2020 Espressif Systems (Shanghai) Co. Ltd. 4 | # 5 | # Permission is hereby granted, free of charge, to any person obtaining a copy 6 | # of this software and associated documentation files (the "Software"), to deal 7 | # in the Software without restriction, including without limitation the rights to 8 | # use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies 9 | # of the Software, and to permit persons to whom the Software is furnished to do 10 | # so, subject to the following conditions: 11 | # 12 | # The above copyright notice and this permission notice shall be included in all 13 | # copies or substantial portions of the Software. 14 | # 15 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE 18 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | # SOFTWARE. 22 | # 23 | # SPDX-License-Identifier: MIT 24 | 25 | from tests.conftest import setup_teardown, coredump_args, hostapp_args # noqa: F401 26 | from t_session import TSession 27 | from tests import timeline 28 | import pytest 29 | # from tests.patterns import some 30 | from tests.standard_requests import REQUEST_INIT, REQUEST_LAUNCH 31 | 32 | 33 | @pytest.mark.timeout(30) 34 | def test_init_launch_coredump(setup_teardown, coredump_args): # noqa: F811 35 | with TSession(coredump_args) as ts: 36 | print("Session is run") 37 | ts.send_request(REQUEST_INIT) 38 | ts.send_request(REQUEST_LAUNCH) 39 | # if we have this event, GDB successfully loaded and started the application: 40 | expectation = timeline.Event("initialized") 41 | result = ts.wait_for(expectation, timeout_s=5) 42 | assert result 43 | print("Session is stopped") 44 | 45 | 46 | @pytest.mark.timeout(30) 47 | def test_init_launch_hostapp(setup_teardown, hostapp_args): # noqa: F811 48 | with TSession(hostapp_args) as ts: 49 | print("Session is run") 50 | ts.send_request(REQUEST_INIT) 51 | ts.send_request(REQUEST_LAUNCH) 52 | # if we have this event, GDB successfully loaded and started the application: 53 | expectation = timeline.Event("initialized") 54 | result = ts.wait_for(expectation, timeout_s=5) 55 | assert result 56 | print("Session is stopped") 57 | 58 | 59 | if __name__ == "__main__": 60 | # run tests from this file; print all output; 61 | pytest.main([__file__, "-s"]) 62 | -------------------------------------------------------------------------------- /tests/test_debug_actions.py: -------------------------------------------------------------------------------- 1 | # MIT License 2 | # 3 | # Copyright (c) 2020 Espressif Systems (Shanghai) Co. Ltd. 4 | # 5 | # Permission is hereby granted, free of charge, to any person obtaining a copy 6 | # of this software and associated documentation files (the "Software"), to deal 7 | # in the Software without restriction, including without limitation the rights to 8 | # use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies 9 | # of the Software, and to permit persons to whom the Software is furnished to do 10 | # so, subject to the following conditions: 11 | # 12 | # The above copyright notice and this permission notice shall be included in all 13 | # copies or substantial portions of the Software. 14 | # 15 | # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | # SOFTWARE. 
22 | # 23 | # SPDX-License-Identifier: MIT 24 | 25 | from tests.conftest import setup_teardown, coredump_args, hostapp_args # noqa: F401 26 | from debug_adapter import schema 27 | from t_session import TSession 28 | import pytest 29 | from tests.helpers import continue_till_stopped, get_top_frame_info, set_breakpoints 30 | import timeline 31 | from tests.standard_requests import REQUEST_INIT, REQUEST_LAUNCH 32 | from tests.patterns import some 33 | import time 34 | 35 | 36 | @pytest.mark.timeout(30) 37 | def test_threads(setup_teardown, coredump_args): # noqa: F811 38 | with TSession(coredump_args, "test_threads") as ts: 39 | ts.send_request(REQUEST_INIT) 40 | ts.send_request(REQUEST_LAUNCH) 41 | resp = ts.send_request(schema.ThreadsRequest()) 42 | assert len(resp.body.get("threads")) == 4 43 | 44 | 45 | @pytest.mark.timeout(30) 46 | def test_stack(setup_teardown, coredump_args): # noqa: F811 47 | with TSession(coredump_args, "test_stack") as ts: 48 | ts.send_request(REQUEST_INIT) 49 | ts.send_request(REQUEST_LAUNCH) 50 | rq = schema.Request(command="stackTrace", 51 | arguments={ 52 | "format": {}, 53 | "levels": 20, 54 | "startFrame": 0, 55 | "threadId": 1 56 | }) 57 | resp = ts.send_request(rq) 58 | stack = resp.body.get("stackFrames") 59 | stack_sz = len(stack) 60 | assert stack[stack_sz - 1]["name"] == "app_main" 61 | 62 | 63 | @pytest.mark.timeout(30) 64 | def test_source_breakpoints(setup_teardown, hostapp_args): # noqa: F811 65 | with TSession(hostapp_args, "test_source_breakpoints") as ts: 66 | ts.send_request(REQUEST_INIT) 67 | ts.send_request(REQUEST_LAUNCH) 68 | set_breakpoints(ts, "test_app.c", [{"line": 11}]) 69 | # run to the first breakpoint 70 | continue_till_stopped(ts, stop_reason="breakpoint", thread_id=2, timeout=None) 71 | line, name = get_top_frame_info(ts, thread_id=2) 72 | assert name == "print_hamlet_thread" 73 | assert line == 11 74 | set_breakpoints(ts, "test_app_src2.c", [{"line": 4}, {"line": 17}]) 75 | # run to the next breakpoint 76 | continue_till_stopped(ts, stop_reason="breakpoint", thread_id=2, timeout=None) 77 | line, name = get_top_frame_info(ts, thread_id=2) 78 | assert name == "print_hamlet" 79 | assert line == 4 80 | # run to the next breakpoint 81 | continue_till_stopped(ts, stop_reason="breakpoint", thread_id=2, timeout=None) 82 | line, name = get_top_frame_info(ts, thread_id=2) 83 | assert name == "print_hamlet" 84 | assert line == 17 85 | 86 | 87 | @pytest.mark.timeout(30) 88 | def test_continue(setup_teardown, hostapp_args): # noqa: F811 89 | with TSession(hostapp_args, "test_continue") as ts: 90 | ts.send_request(REQUEST_INIT) 91 | ts.send_request(REQUEST_LAUNCH) 92 | 93 | set_breakpoints(ts, "test_app.c", [{"line": 11}]) 94 | # run to the first breakpoint 95 | continue_till_stopped(ts, stop_reason="breakpoint", thread_id=2, timeout=None) 96 | line, name = get_top_frame_info(ts, thread_id=2) 97 | assert name == "print_hamlet_thread" 98 | assert line == 11 99 | 100 | 101 | def continue_wait_pause(ts, thread_id=0, wait_time=1.0): 102 | rq = schema.ContinueRequest(arguments=schema.ContinueArguments(thread_id)) 103 | resp = ts.send_request(rq) 104 | assert resp.success 105 | time.sleep(wait_time) 106 | rq = schema.PauseRequest(arguments=schema.PauseArguments(threadId=thread_id)) 107 | resp = ts.send_request(rq) 108 | assert resp.success 109 | expectation = timeline.Event(event="stopped", body=some.dict.containing({"reason": "pause"})) 110 | result = ts.wait_for(expectation, timeout_s=5) 111 | assert result 112 | 113 | 114 | @pytest.mark.timeout(30) 
115 | def test_pause(setup_teardown, hostapp_args): # noqa: F811 116 | with TSession(hostapp_args, "test_pause") as ts: 117 | ts.send_request(REQUEST_INIT) 118 | ts.send_request(REQUEST_LAUNCH) 119 | continue_wait_pause(ts) 120 | 121 | 122 | @pytest.mark.timeout(30) 123 | def test_step(setup_teardown, hostapp_args): # noqa: F811 124 | with TSession(hostapp_args, "test_step") as ts: 125 | ts.send_request(REQUEST_INIT) 126 | ts.send_request(REQUEST_LAUNCH) 127 | 128 | # Set BP on `int lines = 0;` of `test_app_src2.c` 129 | bpline = 4 130 | set_breakpoints(ts, "test_app_src2.c", [{"line": bpline}]) 131 | # Continue to the BP 132 | continue_till_stopped(ts, stop_reason="breakpoint", thread_id=2, timeout=None) 133 | line, name = get_top_frame_info(ts, thread_id=2) 134 | assert line == bpline 135 | assert name == "print_hamlet" 136 | 137 | # StepIn 138 | rq = schema.StepInRequest(schema.StepInArguments(threadId=2)) 139 | resp = ts.send_request(rq) 140 | assert resp.success 141 | # Check the state 142 | line, name = get_top_frame_info(ts, thread_id=2) 143 | assert line == bpline + 1 144 | assert name == "print_hamlet" 145 | 146 | # StepOut 147 | rq = schema.StepOutRequest(schema.StepOutArguments(threadId=2)) 148 | resp = ts.send_request(rq) 149 | assert resp.success 150 | # Check the state 151 | _, name = get_top_frame_info(ts, thread_id=2) 152 | assert name == "print_hamlet_thread" 153 | 154 | 155 | # test thread ids changing: set break in thread, cont2break 156 | @pytest.mark.timeout(60) 157 | def test_threads_sequence(setup_teardown, hostapp_args): # noqa: F811 158 | with TSession(hostapp_args, "test_threads_sequence") as ts: 159 | ts.send_request(REQUEST_INIT) 160 | ts.send_request(REQUEST_LAUNCH) 161 | set_breakpoints(ts, "test_app.c", [{"line": 11}]) 162 | # run to the breakpoint at thread entry, 6 threads in total 163 | for tid in range(2, 8): 164 | continue_till_stopped(ts, stop_reason="breakpoint", thread_id=tid, timeout=None) 165 | rq = schema.Request(command="threads") 166 | resp = ts.send_request(rq) 167 | assert len(resp.body.get("threads")) == 2 168 | line, name = get_top_frame_info(ts, thread_id=tid) 169 | assert name == "print_hamlet_thread" 170 | assert line == 11 171 | line, name = get_top_frame_info(ts, thread_id=1) 172 | assert name == "main" 173 | assert line == 35 174 | 175 | 176 | def restart(ts, timeout=10): 177 | rq = schema.Request(command="restart") 178 | resp = ts.send_request(rq) 179 | assert resp.success 180 | expectation = timeline.Event(event="stopped", body=some.dict.containing({"reason": "breakpoint"})) 181 | result = ts.wait_for(expectation, timeout_s=timeout) 182 | assert result 183 | 184 | 185 | def debug_test_scenario(ts): 186 | set_breakpoints(ts, "test_app.c", [{"line": 11}]) 187 | continue_till_stopped(ts, stop_reason="breakpoint", thread_id=2, timeout=None) 188 | set_breakpoints(ts, "test_app.c", []) 189 | continue_wait_pause(ts) 190 | set_breakpoints(ts, "test_app.c", [{"line": 11}]) 191 | continue_till_stopped(ts, stop_reason="breakpoint", timeout=None) 192 | 193 | 194 | # test debug session restart 195 | # test general debug session start, cont2break, remove break, cont, pause, set break, cont2break` 196 | @pytest.mark.timeout(60) 197 | def test_complex_debug_scenario(setup_teardown, hostapp_args): # noqa: F811 198 | with TSession(hostapp_args, "test_complex_debug_scenario") as ts: 199 | ts.send_request(REQUEST_INIT) 200 | ts.send_request(REQUEST_LAUNCH) 201 | debug_test_scenario(ts) 202 | restart(ts) 203 | debug_test_scenario(ts) 204 | 205 | 206 | if 
__name__ == "__main__": 207 | # run tests from this file; print all output; 208 | pytest.main([__file__, "-s"]) 209 | -------------------------------------------------------------------------------- /tests/test_patterns.py: -------------------------------------------------------------------------------- 1 | # Copyright (c) Microsoft Corporation. All rights reserved. 2 | # Licensed under the MIT License. See LICENSE in the project root 3 | # for license information. 4 | 5 | from __future__ import absolute_import, division, print_function, unicode_literals 6 | 7 | import pytest 8 | import sys 9 | 10 | from debugpy.common import log 11 | from tests.patterns import some 12 | 13 | 14 | NONE = None 15 | NAN = float("nan") 16 | 17 | 18 | def log_repr(x): 19 | s = repr(x) 20 | log.info("{0}", s) 21 | 22 | 23 | VALUES = [ 24 | object(), 25 | True, 26 | False, 27 | 0, 28 | -1, 29 | -1.0, 30 | 1.23, 31 | b"abc", 32 | b"abcd", 33 | "abc", 34 | "abcd", 35 | (), 36 | (1, 2, 3), 37 | [], 38 | [1, 2, 3], 39 | {}, 40 | {"a": 1, "b": 2}, 41 | ] 42 | 43 | 44 | @pytest.mark.parametrize("x", VALUES) 45 | def test_value(x): 46 | log_repr(some.object) 47 | assert x == some.object 48 | 49 | log_repr(some.object.equal_to(x)) 50 | assert x == some.object.equal_to(x) 51 | 52 | log_repr(some.object.not_equal_to(x)) 53 | assert x != some.object.not_equal_to(x) 54 | 55 | log_repr(some.object.same_as(x)) 56 | assert x == some.object.same_as(x) 57 | 58 | log_repr(some.thing) 59 | assert x == some.thing 60 | 61 | log_repr(~some.thing) 62 | assert x != ~some.thing 63 | 64 | log_repr(~some.object) 65 | assert x != ~some.object 66 | 67 | log_repr(~some.object | x) 68 | assert x == ~some.object | x 69 | 70 | 71 | def test_none(): 72 | assert NONE == some.object 73 | assert NONE == some.object.equal_to(None) 74 | assert NONE == some.object.same_as(None) 75 | assert NONE != some.thing 76 | assert NONE == some.thing | None 77 | 78 | 79 | def test_equal(): 80 | assert 123.0 == some.object.equal_to(123) 81 | assert NAN != some.object.equal_to(NAN) 82 | 83 | 84 | def test_not_equal(): 85 | assert 123.0 != some.object.not_equal_to(123) 86 | assert NAN == some.object.not_equal_to(NAN) 87 | 88 | 89 | def test_same(): 90 | assert 123.0 != some.object.same_as(123) 91 | assert NAN == some.object.same_as(NAN) 92 | 93 | 94 | def test_inverse(): 95 | pattern = ~some.object.equal_to(2) 96 | log_repr(pattern) 97 | 98 | assert pattern == 1 99 | assert pattern != 2 100 | assert pattern == 3 101 | assert pattern == "2" 102 | assert pattern == NONE 103 | 104 | 105 | def test_either(): 106 | pattern = some.number | some.str 107 | log_repr(pattern) 108 | assert pattern == 123 109 | 110 | pattern = some.str | 123 | some.bool 111 | log_repr(pattern) 112 | assert pattern == 123 113 | 114 | 115 | def test_in_range(): 116 | pattern = some.int.in_range(-5, 5) 117 | log_repr(pattern) 118 | 119 | assert all([pattern == x for x in range(-5, 5)]) 120 | assert pattern != -6 121 | assert pattern != 5 122 | 123 | 124 | def test_str(): 125 | log_repr(some.str) 126 | assert some.str == "abc" 127 | 128 | if sys.version_info < (3,): 129 | assert b"abc" == some.str 130 | else: 131 | assert b"abc" != some.str 132 | 133 | 134 | def test_matching(): 135 | pattern = some.str.matching(r".(b+).") 136 | log_repr(pattern) 137 | assert pattern == "abbbc" 138 | 139 | pattern = some.str.matching(r"bbb") 140 | log_repr(pattern) 141 | assert pattern != "abbbc" 142 | 143 | pattern = some.bytes.matching(br".(b+).") 144 | log_repr(pattern) 145 | assert pattern == b"abbbc" 146 | 147 
| pattern = some.bytes.matching(br"bbb") 148 | log_repr(pattern) 149 | assert pattern != b"abbbc" 150 | 151 | 152 | def test_starting_with(): 153 | pattern = some.str.starting_with("aa") 154 | log_repr(pattern) 155 | assert pattern == "aabbbb" 156 | assert pattern != "bbbbaa" 157 | assert pattern != "bbaabb" 158 | assert pattern != "ababab" 159 | 160 | pattern = some.bytes.starting_with(b"aa") 161 | log_repr(pattern) 162 | assert pattern == b"aabbbb" 163 | assert pattern != b"bbbbaa" 164 | assert pattern != b"bbaabb" 165 | assert pattern != b"ababab" 166 | 167 | 168 | def test_ending_with(): 169 | pattern = some.str.ending_with("aa") 170 | log_repr(pattern) 171 | assert pattern == "bbbbaa" 172 | assert pattern == "bb\nbb\naa" 173 | assert pattern != "aabbbb" 174 | assert pattern != "bbaabb" 175 | assert pattern != "ababab" 176 | 177 | pattern = some.bytes.ending_with(b"aa") 178 | log_repr(pattern) 179 | assert pattern == b"bbbbaa" 180 | assert pattern == b"bb\nbb\naa" 181 | assert pattern != b"aabbbb" 182 | assert pattern != b"bbaabb" 183 | assert pattern != b"ababab" 184 | 185 | 186 | def test_containing(): 187 | pattern = some.str.containing("aa") 188 | log_repr(pattern) 189 | assert pattern == "aabbbb" 190 | assert pattern == "bbbbaa" 191 | assert pattern == "bbaabb" 192 | assert pattern == "bb\naa\nbb" 193 | assert pattern != "ababab" 194 | 195 | pattern = some.bytes.containing(b"aa") 196 | log_repr(pattern) 197 | assert pattern == b"aabbbb" 198 | assert pattern == b"bbbbaa" 199 | assert pattern == b"bbaabb" 200 | assert pattern == b"bb\naa\nbb" 201 | assert pattern != b"ababab" 202 | 203 | 204 | def test_list(): 205 | assert [1, 2, 3] == [1, some.thing, 3] 206 | assert [1, 2, 3, 4] != [1, some.thing, 4] 207 | 208 | assert [1, 2, 3, 4] == some.list.containing(1) 209 | assert [1, 2, 3, 4] == some.list.containing(2) 210 | assert [1, 2, 3, 4] == some.list.containing(3) 211 | assert [1, 2, 3, 4] == some.list.containing(4) 212 | assert [1, 2, 3, 4] == some.list.containing(1, 2) 213 | assert [1, 2, 3, 4] == some.list.containing(2, 3) 214 | assert [1, 2, 3, 4] == some.list.containing(3, 4) 215 | assert [1, 2, 3, 4] == some.list.containing(1, 2, 3) 216 | assert [1, 2, 3, 4] == some.list.containing(2, 3, 4) 217 | assert [1, 2, 3, 4] == some.list.containing(1, 2, 3, 4) 218 | 219 | assert [1, 2, 3, 4] != some.list.containing(5) 220 | assert [1, 2, 3, 4] != some.list.containing(1, 3) 221 | assert [1, 2, 3, 4] != some.list.containing(1, 2, 4) 222 | assert [1, 2, 3, 4] != some.list.containing(2, 3, 5) 223 | 224 | 225 | def test_dict(): 226 | pattern = {"a": some.thing, "b": 2} 227 | log_repr(pattern) 228 | assert pattern == {"a": 1, "b": 2} 229 | 230 | pattern = some.dict.containing({"a": 1}) 231 | log_repr(pattern) 232 | assert pattern == {"a": 1, "b": 2} 233 | 234 | 235 | def test_such_that(): 236 | pattern = some.thing.such_that(lambda x: x != 1) 237 | log_repr(pattern) 238 | 239 | assert 0 == pattern 240 | assert 1 != pattern 241 | assert 2 == pattern 242 | 243 | 244 | def test_error(): 245 | log_repr(some.error) 246 | assert some.error == Exception("error!") 247 | assert some.error != {} 248 | 249 | 250 | def test_recursive(): 251 | pattern = some.dict.containing( 252 | { 253 | "dict": some.dict.containing({"int": some.int.in_range(100, 200)}), 254 | "list": [None, ~some.error, some.number | some.str], 255 | } 256 | ) 257 | 258 | log_repr(pattern) 259 | 260 | assert pattern == { 261 | "list": [None, False, 123], 262 | "bool": True, 263 | "dict": {"int": 123, "str": "abc"}, 264 | } 265 | 
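# ---------------------------------------------------------------------------
# Illustrative sketch (not part of this repository): the Some/Also machinery in
# tests/patterns/_impl.py can be extended with custom narrowing patterns.
# "DivisibleBy" is a hypothetical name used here only to demonstrate the
# mechanism; a real pattern would normally also be exposed as a some.* alias.
# ---------------------------------------------------------------------------

from tests.patterns import some, _impl


class DivisibleBy(_impl.Also):
    """Matches only if the value is divisible by the given divisor."""

    def __init__(self, pattern, divisor):
        super(DivisibleBy, self).__init__(pattern)
        self.divisor = divisor

    def __repr__(self):
        # Shown in logs and assertion messages, like the built-in patterns.
        return "({0!r} % {1} == 0)".format(self.pattern, self.divisor)

    def _also(self, value):
        # Also.matches() only calls this after the wrapped pattern matched, so
        # with some.int as the wrapped pattern 'value' is an integer here.
        return value % self.divisor == 0


def test_custom_pattern_sketch():
    assert 12 == DivisibleBy(some.int, 3)
    assert 12 != DivisibleBy(some.int, 5)
    assert "12" != DivisibleBy(some.int, 3)  # the wrapped pattern rejects non-ints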
-------------------------------------------------------------------------------- /tests/test_timeline.py: -------------------------------------------------------------------------------- 1 | # Copyright (c) Microsoft Corporation. All rights reserved. 2 | # Additions Copyright (c) 2020, Espressif Systems (Shanghai) Co. Ltd. 3 | # Licensed under the MIT License. See LICENSE in the project root 4 | # for license information. 5 | 6 | from __future__ import absolute_import, division, print_function, unicode_literals 7 | 8 | import collections 9 | import functools 10 | import itertools 11 | import pytest 12 | import threading 13 | import time 14 | 15 | from debugpy.common import log, messaging 16 | from tests.patterns import some 17 | from tests.timeline import Timeline, Mark, Event, Request, Response 18 | from tests.pytest_fixtures import daemon 19 | 20 | 21 | class MessageFactory(object): 22 | """A factory for DAP messages that are not bound to a message channel. 23 | """ 24 | 25 | def __init__(self): 26 | self._seq_iter = itertools.count(1) 27 | self.messages = collections.OrderedDict() 28 | self.event = functools.partial(self._make, messaging.Event) 29 | self.request = functools.partial(self._make, messaging.Request) 30 | self.response = functools.partial(self._make, messaging.Response) 31 | 32 | def _make(self, message_type, *args, **kwargs): 33 | message = message_type(None, next(self._seq_iter), *args, **kwargs) 34 | self.messages[message.seq] = message 35 | return message 36 | 37 | 38 | @pytest.fixture 39 | def make_timeline(request): 40 | """Provides a timeline factory. All timelines created by this factory 41 | are automatically finalized and checked for basic consistency after the 42 | end of the test. 43 | """ 44 | 45 | timelines = [] 46 | 47 | def factory(): 48 | timeline = Timeline() 49 | timelines.append(timeline) 50 | with timeline.frozen(): 51 | assert timeline.beginning is not None 52 | initial_history = [some.object.same_as(timeline.beginning)] 53 | assert timeline.history() == initial_history 54 | return timeline, initial_history 55 | 56 | yield factory 57 | log.newline() 58 | 59 | try: 60 | failed = request.node.call_result.failed 61 | except AttributeError: 62 | pass 63 | else: 64 | if not failed: 65 | for timeline in timelines: 66 | timeline.finalize() 67 | history = timeline.history() 68 | history.sort(key=lambda occ: occ.timestamp) 69 | assert history == timeline.history() 70 | assert history[-1] == Mark("FINISH") 71 | timeline.close() 72 | 73 | 74 | @pytest.mark.timeout(1) 75 | def test_occurrences(make_timeline): 76 | timeline, initial_history = make_timeline() 77 | 78 | mark1 = timeline.mark("dum") 79 | mark2 = timeline.mark("dee") 80 | mark3 = timeline.mark("dum") 81 | 82 | assert mark1 == Mark("dum") 83 | assert mark1.id == "dum" 84 | assert mark2 == Mark("dee") 85 | assert mark2.id == "dee" 86 | assert mark3 == Mark("dum") 87 | assert mark3.id == "dum" 88 | 89 | with timeline.frozen(): 90 | assert timeline.history() == initial_history + [ 91 | some.object.same_as(mark1), 92 | some.object.same_as(mark2), 93 | some.object.same_as(mark3), 94 | ] 95 | timeline.finalize() 96 | 97 | assert timeline.all_occurrences_of(Mark("dum")) == ( 98 | some.object.same_as(mark1), 99 | some.object.same_as(mark3), 100 | ) 101 | assert timeline.all_occurrences_of(Mark("dee")) == (some.object.same_as(mark2),) 102 | 103 | assert timeline[:].all_occurrences_of(Mark("dum")) == ( 104 | some.object.same_as(mark1), 105 | some.object.same_as(mark3), 106 | ) 107 | 108 | # Lower boundary is 
inclusive. 109 | assert timeline[mark1:].all_occurrences_of(Mark("dum")) == ( 110 | some.object.same_as(mark1), 111 | some.object.same_as(mark3), 112 | ) 113 | assert timeline[mark2:].all_occurrences_of(Mark("dum")) == ( 114 | some.object.same_as(mark3), 115 | ) 116 | assert timeline[mark3:].all_occurrences_of(Mark("dum")) == ( 117 | some.object.same_as(mark3), 118 | ) 119 | 120 | # Upper boundary is exclusive. 121 | assert timeline[:mark1].all_occurrences_of(Mark("dum")) == () 122 | assert timeline[:mark2].all_occurrences_of(Mark("dum")) == ( 123 | some.object.same_as(mark1), 124 | ) 125 | assert timeline[:mark3].all_occurrences_of(Mark("dum")) == ( 126 | some.object.same_as(mark1), 127 | ) 128 | 129 | 130 | def expectation_of(message): 131 | if isinstance(message, messaging.Event): 132 | return Event(some.str.equal_to(message.event), some.dict.equal_to(message.body)) 133 | elif isinstance(message, messaging.Request): 134 | return Request( 135 | some.str.equal_to(message.command), some.dict.equal_to(message.arguments) 136 | ) 137 | else: 138 | raise AssertionError("Unsupported message type") 139 | 140 | 141 | def test_event(make_timeline): 142 | timeline, initial_history = make_timeline() 143 | messages = MessageFactory() 144 | 145 | event_msg = messages.event("stopped", {"reason": "pause"}) 146 | event_exp = expectation_of(event_msg) 147 | event = timeline.record_event(event_msg) 148 | 149 | assert event == event_exp 150 | assert event.message is event_msg 151 | assert event.circumstances == ("event", event_msg.event, event_msg.body) 152 | 153 | with timeline.frozen(): 154 | assert timeline.last is event 155 | assert timeline.history() == initial_history + [some.object.same_as(event)] 156 | timeline.expect_realized(event_exp) 157 | 158 | 159 | @pytest.mark.parametrize("outcome", ["success", "failure"]) 160 | def test_request_response(make_timeline, outcome): 161 | timeline, initial_history = make_timeline() 162 | messages = MessageFactory() 163 | 164 | request_msg = messages.request("next", {"threadId": 3}) 165 | request_exp = expectation_of(request_msg) 166 | request = timeline.record_request(request_msg) 167 | 168 | assert request == request_exp 169 | assert request.command == request_msg.command 170 | assert request.arguments == request_msg.arguments 171 | assert request.circumstances == ( 172 | "request", 173 | request_msg.command, 174 | request_msg.arguments, 175 | ) 176 | 177 | with timeline.frozen(): 178 | assert timeline.last is request 179 | assert timeline.history() == initial_history + [some.object.same_as(request)] 180 | timeline.expect_realized(request_exp) 181 | timeline.expect_realized(Request("next")) 182 | 183 | response_msg = messages.response( 184 | request_msg, {} if outcome == "success" else Exception("error!") 185 | ) 186 | response = timeline.record_response(request, response_msg) 187 | assert response.request is request 188 | assert response.body == response_msg.body 189 | if outcome == "success": 190 | assert response.success 191 | else: 192 | assert not response.success 193 | assert response.circumstances == ("response", request, response_msg.body) 194 | 195 | assert response == Response(request, response_msg.body) 196 | assert response == Response(request, some.object) 197 | assert response == Response(request) 198 | 199 | if outcome == "success": 200 | assert response == Response(request, some.dict) 201 | assert response != Response(request, some.error) 202 | else: 203 | assert response != Response(request, some.dict) 204 | assert response == Response(request, 
some.error) 205 | 206 | assert response == Response(request_exp, response_msg.body) 207 | assert response == Response(request_exp, some.object) 208 | assert response == Response(request_exp) 209 | 210 | if outcome == "success": 211 | assert response == Response(request_exp, some.dict) 212 | assert response != Response(request_exp, some.error) 213 | else: 214 | assert response != Response(request_exp, some.dict) 215 | assert response == Response(request_exp, some.error) 216 | 217 | with timeline.frozen(): 218 | assert timeline.last is response 219 | assert timeline.history() == initial_history + [ 220 | some.object.same_as(request), 221 | some.object.same_as(response), 222 | ] 223 | 224 | timeline.expect_realized(Response(request)) 225 | timeline.expect_realized(Response(request, response_msg.body)) 226 | timeline.expect_realized(Response(request, some.object)) 227 | 228 | if outcome == "success": 229 | timeline.expect_realized(Response(request, some.dict)) 230 | timeline.expect_not_realized(Response(request, some.error)) 231 | else: 232 | timeline.expect_not_realized(Response(request, some.dict)) 233 | timeline.expect_realized(Response(request, some.error)) 234 | 235 | timeline.expect_realized(Response(request)) 236 | timeline.expect_realized(Response(request_exp, response_msg.body)) 237 | timeline.expect_realized(Response(request_exp, some.object)) 238 | 239 | if outcome == "success": 240 | timeline.expect_realized(Response(request_exp, some.dict)) 241 | timeline.expect_not_realized(Response(request_exp, some.error)) 242 | else: 243 | timeline.expect_not_realized(Response(request_exp, some.dict)) 244 | timeline.expect_realized(Response(request_exp, some.error)) 245 | 246 | 247 | def test_after(make_timeline): 248 | timeline, _ = make_timeline() 249 | first = timeline.mark("first") 250 | 251 | second_exp = first >> Mark("second") 252 | with timeline.frozen(): 253 | assert second_exp not in timeline 254 | 255 | timeline.mark("second") 256 | with timeline.frozen(): 257 | assert second_exp in timeline 258 | 259 | timeline.mark("first") 260 | with timeline.frozen(): 261 | assert Mark("second") >> Mark("first") in timeline 262 | 263 | 264 | def test_before(make_timeline): 265 | timeline, _ = make_timeline() 266 | t = timeline.beginning 267 | 268 | first = timeline.mark("first") 269 | timeline.mark("second") 270 | 271 | with timeline.frozen(): 272 | assert t >> Mark("second") >> Mark("first") not in timeline 273 | assert Mark("second") >> first not in timeline 274 | 275 | third = timeline.mark("third") 276 | 277 | with timeline.frozen(): 278 | assert t >> Mark("second") >> Mark("first") not in timeline 279 | assert Mark("second") >> first not in timeline 280 | assert t >> Mark("second") >> Mark("third") in timeline 281 | assert Mark("second") >> third in timeline 282 | 283 | 284 | def test_and(make_timeline): 285 | eggs_exp = Mark("eggs") 286 | ham_exp = Mark("ham") 287 | cheese_exp = Mark("cheese") 288 | 289 | timeline, _ = make_timeline() 290 | t = timeline.beginning 291 | 292 | with timeline.frozen(): 293 | assert t >> (eggs_exp & ham_exp) not in timeline 294 | assert t >> (ham_exp & eggs_exp) not in timeline 295 | assert t >> (cheese_exp & ham_exp & eggs_exp) not in timeline 296 | 297 | timeline.mark("eggs") 298 | with timeline.frozen(): 299 | assert t >> (eggs_exp & ham_exp) not in timeline 300 | assert t >> (ham_exp & eggs_exp) not in timeline 301 | assert t >> (cheese_exp & ham_exp & eggs_exp) not in timeline 302 | 303 | timeline.mark("ham") 304 | with timeline.frozen(): 305 | assert t >> 
(eggs_exp & ham_exp) in timeline 306 | assert t >> (ham_exp & eggs_exp) in timeline 307 | assert t >> (cheese_exp & ham_exp & eggs_exp) not in timeline 308 | 309 | timeline.mark("cheese") 310 | with timeline.frozen(): 311 | assert t >> (eggs_exp & ham_exp) in timeline 312 | assert t >> (ham_exp & eggs_exp) in timeline 313 | assert t >> (cheese_exp & ham_exp & eggs_exp) in timeline 314 | 315 | 316 | def test_or(make_timeline): 317 | eggs_exp = Mark("eggs") 318 | ham_exp = Mark("ham") 319 | cheese_exp = Mark("cheese") 320 | 321 | timeline, _ = make_timeline() 322 | t = timeline.beginning 323 | 324 | with timeline.frozen(): 325 | assert t >> (eggs_exp | ham_exp) not in timeline 326 | assert t >> (ham_exp | eggs_exp) not in timeline 327 | assert t >> (cheese_exp | ham_exp | eggs_exp) not in timeline 328 | 329 | timeline.mark("eggs") 330 | with timeline.frozen(): 331 | assert t >> (eggs_exp | ham_exp) in timeline 332 | assert t >> (ham_exp | eggs_exp) in timeline 333 | assert t >> (cheese_exp | ham_exp | eggs_exp) in timeline 334 | 335 | timeline.mark("cheese") 336 | with timeline.frozen(): 337 | assert t >> (eggs_exp | ham_exp) in timeline 338 | assert t >> (ham_exp | eggs_exp) in timeline 339 | assert t >> (cheese_exp | ham_exp | eggs_exp) in timeline 340 | 341 | timeline.mark("ham") 342 | with timeline.frozen(): 343 | assert t >> (eggs_exp | ham_exp) in timeline 344 | assert t >> (ham_exp | eggs_exp) in timeline 345 | assert t >> (cheese_exp | ham_exp | eggs_exp) in timeline 346 | t = timeline.last 347 | 348 | timeline.mark("cheese") 349 | with timeline.frozen(): 350 | assert t >> (eggs_exp | ham_exp) not in timeline 351 | assert t >> (ham_exp | eggs_exp) not in timeline 352 | assert t >> (cheese_exp | ham_exp | eggs_exp) in timeline 353 | 354 | 355 | def test_xor(make_timeline): 356 | eggs_exp = Mark("eggs") 357 | ham_exp = Mark("ham") 358 | cheese_exp = Mark("cheese") 359 | 360 | timeline, _ = make_timeline() 361 | t1 = timeline.beginning 362 | 363 | with timeline.frozen(): 364 | assert t1 >> (eggs_exp ^ ham_exp) not in timeline 365 | assert t1 >> (ham_exp ^ eggs_exp) not in timeline 366 | assert t1 >> (cheese_exp ^ ham_exp ^ eggs_exp) not in timeline 367 | 368 | timeline.mark("eggs") 369 | with timeline.frozen(): 370 | assert t1 >> (eggs_exp ^ ham_exp) in timeline 371 | assert t1 >> (ham_exp ^ eggs_exp) in timeline 372 | assert t1 >> (cheese_exp ^ ham_exp ^ eggs_exp) in timeline 373 | t2 = timeline.last 374 | 375 | timeline.mark("ham") 376 | with timeline.frozen(): 377 | assert t1 >> (eggs_exp ^ ham_exp) not in timeline 378 | assert t2 >> (eggs_exp ^ ham_exp) in timeline 379 | assert t1 >> (ham_exp ^ eggs_exp) not in timeline 380 | assert t2 >> (ham_exp ^ eggs_exp) in timeline 381 | assert t1 >> (cheese_exp ^ ham_exp ^ eggs_exp) not in timeline 382 | assert t2 >> (cheese_exp ^ ham_exp ^ eggs_exp) in timeline 383 | 384 | 385 | def test_conditional(make_timeline): 386 | def is_exciting(occ): 387 | return occ == Event(some.str, "exciting") 388 | 389 | messages = MessageFactory() 390 | 391 | something = Event("something", some.object) 392 | something_exciting = something.when(is_exciting) 393 | timeline, _ = make_timeline() 394 | t = timeline.beginning 395 | 396 | timeline.record_event(messages.event("something", "boring")) 397 | with timeline.frozen(): 398 | timeline.expect_realized(t >> something) 399 | timeline.expect_not_realized(t >> something_exciting) 400 | 401 | timeline.record_event(messages.event("something", "exciting")) 402 | with timeline.frozen(): 403 | 
timeline.expect_realized(t >> something_exciting) 404 | 405 | 406 | def test_lower_bound(make_timeline): 407 | timeline, _ = make_timeline() 408 | timeline.mark("1") 409 | timeline.mark("2") 410 | timeline.mark("3") 411 | timeline.freeze() 412 | 413 | assert (Mark("2") >> (Mark("1") >> Mark("3"))) not in timeline 414 | assert (Mark("2") >> (Mark("1") & Mark("3"))) not in timeline 415 | assert (Mark("2") >> (Mark("1") ^ Mark("3"))) in timeline 416 | 417 | 418 | @pytest.mark.timeout(3) 419 | def test_frozen(make_timeline, daemon): 420 | timeline, initial_history = make_timeline() 421 | assert not timeline.is_frozen 422 | 423 | timeline.freeze() 424 | assert timeline.is_frozen 425 | 426 | timeline.unfreeze() 427 | assert not timeline.is_frozen 428 | 429 | with timeline.frozen(): 430 | assert timeline.is_frozen 431 | assert not timeline.is_frozen 432 | 433 | timeline.mark("dum") 434 | 435 | timeline.freeze() 436 | assert timeline.is_frozen 437 | 438 | worker_started = threading.Event() 439 | worker_can_proceed = threading.Event() 440 | 441 | @daemon 442 | def worker(): 443 | worker_started.set() 444 | worker_can_proceed.wait() 445 | timeline.mark("dee") 446 | 447 | worker_started.wait() 448 | 449 | assert Mark("dum") in timeline 450 | assert Mark("dee") not in timeline 451 | 452 | with timeline.unfrozen(): 453 | worker_can_proceed.set() 454 | worker.join() 455 | 456 | assert Mark("dee") in timeline 457 | 458 | 459 | @pytest.mark.timeout(3) 460 | def test_unobserved(make_timeline, daemon): 461 | timeline, initial_history = make_timeline() 462 | 463 | worker_can_proceed = threading.Event() 464 | 465 | @daemon 466 | def worker(): 467 | messages = MessageFactory() 468 | 469 | worker_can_proceed.wait() 470 | worker_can_proceed.clear() 471 | timeline.record_event(messages.event("dum", {})) 472 | log.debug("dum") 473 | 474 | worker_can_proceed.wait() 475 | timeline.record_event(messages.event("dee", {})) 476 | log.debug("dee") 477 | 478 | timeline.record_event(messages.event("dum", {})) 479 | log.debug("dum") 480 | 481 | timeline.freeze() 482 | assert timeline.is_frozen 483 | 484 | timeline.proceed() 485 | assert not timeline.is_frozen 486 | worker_can_proceed.set() 487 | 488 | dum = timeline.wait_for_next(Event("dum")) 489 | assert timeline.is_frozen 490 | assert dum.observed 491 | 492 | # Should be fine since we observed 'dum' by waiting for it. 493 | timeline.proceed() 494 | assert not timeline.is_frozen 495 | 496 | worker_can_proceed.set() 497 | worker.join() 498 | 499 | dum2 = timeline.wait_for_next(Event("dum"), freeze=False) 500 | assert not timeline.is_frozen 501 | assert dum2.observed 502 | 503 | timeline.wait_until_realized(Event("dum") >> Event("dum"), freeze=True) 504 | assert timeline.is_frozen 505 | 506 | dee = dum.next 507 | assert not dee.observed 508 | 509 | # Should complain since 'dee' is unobserved. 510 | with pytest.raises(Exception): 511 | timeline.proceed() 512 | 513 | # Observe it! 514 | timeline.expect_realized(Event("dee")) 515 | assert dee.observed 516 | 517 | # Should be good now. 
518 | timeline.proceed() 519 | 520 | 521 | def test_new(make_timeline): 522 | timeline, _ = make_timeline() 523 | 524 | m1 = timeline.mark("1") 525 | timeline.freeze() 526 | 527 | assert timeline.expect_new(Mark("1")) is m1 528 | with pytest.raises(Exception): 529 | timeline.expect_new(Mark("2")) 530 | 531 | timeline.proceed() 532 | m2 = timeline.mark("2") 533 | timeline.freeze() 534 | 535 | with pytest.raises(Exception): 536 | timeline.expect_new(Mark("1")) 537 | assert timeline.expect_new(Mark("2")) is m2 538 | with pytest.raises(Exception): 539 | timeline.expect_new(Mark("3")) 540 | 541 | timeline.unfreeze() 542 | m3 = timeline.mark("3") 543 | timeline.freeze() 544 | 545 | with pytest.raises(Exception): 546 | timeline.expect_new(Mark("1")) 547 | assert timeline.expect_new(Mark("2")) is m2 548 | assert timeline.expect_new(Mark("3")) is m3 549 | 550 | timeline.proceed() 551 | m4 = timeline.mark("4") 552 | timeline.mark("4") 553 | timeline.freeze() 554 | 555 | with pytest.raises(Exception): 556 | timeline.expect_new(Mark("1")) 557 | with pytest.raises(Exception): 558 | timeline.expect_new(Mark("2")) 559 | with pytest.raises(Exception): 560 | timeline.expect_new(Mark("3")) 561 | assert timeline.expect_new(Mark("4")) is m4 562 | 563 | 564 | @pytest.mark.timeout(3) 565 | @pytest.mark.parametrize("order", ["mark_then_wait", "wait_then_mark"]) 566 | def test_concurrency(make_timeline, daemon, order): 567 | timeline, initial_history = make_timeline() 568 | 569 | occurrences = [] 570 | worker_can_proceed = threading.Event() 571 | 572 | @daemon 573 | def worker(): 574 | worker_can_proceed.wait() 575 | mark = timeline.mark("tada") 576 | occurrences.append(mark) 577 | 578 | if order == "mark_then_wait": 579 | worker_can_proceed.set() 580 | unblock_worker_later = None 581 | else: 582 | 583 | @daemon 584 | def unblock_worker_later(): 585 | time.sleep(0.1) 586 | worker_can_proceed.set() 587 | 588 | mark = timeline.wait_for_next(Mark("tada"), freeze=True) 589 | 590 | worker.join() 591 | assert mark is occurrences[0] 592 | assert timeline.last is mark 593 | assert timeline.history() == initial_history + occurrences 594 | 595 | if unblock_worker_later: 596 | unblock_worker_later.join() 597 | -------------------------------------------------------------------------------- /tests/timeline.md: -------------------------------------------------------------------------------- 1 | # Timeline testing 2 | 3 | A ptvsd debug session consists of the DAP ([Debug Adapter Protocol](https://microsoft.github.io/debug-adapter-protocol/specification)) requests, responses and events, which, in general, don't have any specific *absolute* ordering that can be meaningfully expected. Related requests, responses and events have *relative* ordering, and so tests for a debug session have to be able to express such ordering. For example: event E1, and either event E2 or event E3, all happened after request R was sent, but before response S to R was received. 
4 | 5 | The timeline framework, as implemented by the [timeline module](timeline.py), allows such tests to be expressed straightforwardly, in a declarative fashion, in either blocking or non-blocking mode: 6 | ```py 7 | expectation = ( 8 | Request(R) 9 | >> 10 | (Event(E1) & (Event(E2) | Event(E3))) 11 | >> 12 | Response(Request(R)) 13 | ) 14 | timeline.wait_until(expectation) 15 | assert expectation in timeline 16 | ``` 17 | 18 | ## Basic terms and concepts 19 | 20 | ### Occurrence 21 | 22 | An *occurrence* is something that occurs on a [timeline](#Timeline); it is represented by an immutable instance of `Occurrence`. An occurrence is described by its *timestamp* (`occurrence.timestamp`) and its *circumstances* (`occurrence.circumstances`), the latter being a tuple containing everything that describes the occurrence. By convention, the first element of the tuple is a string that identifies the *kind* of occurrence, while the following elements have different meanings depending on the kind. 23 | 24 | In a ptvsd debug session, as seen from the perspective of a test, the fundamental occurrences and their respective circumstances are: 25 | 26 | - request sent: `('Request', command, arguments)` 27 | - response received: `('Response', request, body)` 28 | - event received: `('Event', event, body)` 29 | 30 | (Note that ptvsd itself never sends requests - it only receives and handles them.) 31 | 32 | For a response, `body` is `RequestFailure(error_message)` if the request failed, and the actual body of the response if it succeeded. 33 | 34 | There's a pseudo-occurrence called *mark*, which is never caused by ptvsd itself, and exists solely so that tests can mark important points on a timeline for ordering and debugging purposes. Its circumstances are `('Mark', id)`, where `id` is an arbitrary value. 35 | 36 | For objects representing these occurrences, named members are provided to extract individual components of `circumstances`. Thus, if it is a request, you can write `request.command` instead of `request.circumstances[1]`. 37 | 38 | Every occurrence belongs to a specific timeline. Furthermore, every occurrence has a *preceding* occurrence (`occurrence.preceding`), except for the very first one on a timeline, for which the preceding occurrence is `None`. There is a helper method `occurrence.backtrack()` that "walks back" from an occurrence to the beginning of the timeline, returning an iterator over `[occurrence, occurrence.preceding, occurrence.preceding.preceding, ...]`. Relative ordering can be tested with `occurrence.precedes(other)` and `occurrence.follows(other)`. 39 | 40 | ### Timeline 41 | 42 | A *timeline* is a sequence of [occurrences](#Occurrence), in the order in which they happened; it is represented by an instance of `Timeline`. Every timeline has a *beginning* (`timeline.beginning`), which is always `('Mark', 'beginning')`; thus, timelines are never empty. Every timeline also has a *last occurrence* (`timeline.last()`). 43 | 44 | A timeline can be grown by recording new occurrences in it. This is done automatically by the test infrastructure for requests, responses and events occurring in a debug session. Marks are recorded with the `timeline.mark(id)` method, which returns the recorded mark occurrence. It is not possible to "rewrite the history" - once recorded, occurrences can never be forgotten, and do not change. 45 | 46 | Timelines are completely thread-safe for both recording and inspection.
However, because a timeline during an active debug session can grow asynchronously as new events and responses are received, it cannot be inspected directly, other than asking for the last occurrence via `timeline.last()` - which is a function rather than a property, indicating that it may return a different value on every subsequent call. 47 | 48 | It is, however, possible to take a snapshot of a timeline via `timeline.history()`; the returned value is a list of all occurrences that were in the timeline at the point of the call, in order from first to last. This is just a shortcut for `reversed(list(timeline.last()))`. 49 | 50 | Note that, since instances of `Occurrence` are immutable, it is safe to inspect them even as the timeline grows. 51 | 52 | ### Expectation 53 | 54 | An *expectation* is to an [occurrence](#Occurrence) as a regex is to a string; it is represented by an instance of `Expectation`. An expectation can be *realized* at a specific occurrence. It can also be said that an expectation is realized in a [timeline](#Timeline), which means that it is realized at the last occurrence in the timeline. 55 | 56 | Testing an occurrence against an expectation is done with `has_occurred_by`: 57 | ```py 58 | expectation.has_occurred_by(occurrence) 59 | ``` 60 | Alternatively, if we have a timeline, then operator `in` can be used to check the expectation against it: 61 | ```py 62 | expectation in timeline # expectation.has_occurred_by(timeline.last()) 63 | ``` 64 | Finally, a timeline can also perform a blocking wait for an expectation to be realized with `wait_until()`: 65 | ```py 66 | t = timeline.wait_until(expectation) # blocks this thread 67 | assert expectation in timeline 68 | ``` 69 | The return value of `wait_until()` is the first occurrence that realized the expectation. If that occurrence was already in the timeline when `wait_until()` was invoked, it returns immediately. 70 | 71 | 72 | ### Basic expectations 73 | 74 | A *basic* expectation is described by the circumstances of the occurrence at which the expectation is to be realized (`expectation.circumstances`). Whereas the circumstances of an occurrence is a data object, the circumstances of the expectation is a *pattern*, as represented by a `Pattern` object from the `pattern` module. An expectation is realized by an occurrence if `occurrence.circumstances in expectation.circumstances` is true (where `in` is an overloaded operator of the `Pattern` object that is used to match values against it; see the docstrings for the `pattern` module for details on patterns). For example, given a basic expectation with these circumstances: 75 | ```py 76 | ('Event', ANY, some.dict.containing({'threadId': 1})) 77 | ``` 78 | It can be realized by any of these occurrences: 79 | ```py 80 | ('Event', 'stopped', {'reason': 'breakpoint', 'threadId': 1}) 81 | ('Event', 'continued', {'threadId': 1}) 82 | ``` 83 | but not by any of these: 84 | ```py 85 | ('Request', 'continue', {'threadId': 1}) 86 | ('Event', 'thread', {'reason': 'exited', 'threadId': 2}) 87 | ``` 88 | 89 | From this definition it follows that if a basic expectation is realized at some occurrence `now`, then it is also realized at any future occurrence `later` in the same timeline such that `later.follows(now)`. Thus, once a basic expectation is realized in a timeline, it cannot be un-realized. Note that this is not necessarily true for other expectations!
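
To make this persistence property concrete, here is a minimal sketch. It assumes a standalone `Timeline` constructed directly and the `Mark` expectation helper, exactly as exercised by `tests/test_timeline.py` above (the mark ids are arbitrary placeholders); it is an illustration rather than a prescribed usage:

```py
from tests.timeline import Timeline, Mark

timeline = Timeline()
timeline.mark('handshake-done')        # record a mark occurrence

with timeline.frozen():
    # The basic expectation is realized as soon as the mark is recorded...
    assert Mark('handshake-done') in timeline

timeline.mark('breakpoint-hit')        # the timeline keeps growing

with timeline.frozen():
    # ...and the earlier expectation stays realized at every later occurrence.
    assert Mark('handshake-done') in timeline

timeline.finalize()                    # as done in the test fixture teardown
timeline.close()
```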
90 | 91 | Basic expectations can be created by instantiating `BasicExpectation` directly with the desired circumstances, but it is more common to use the helper functions: 92 | ```py 93 | Request(command, arguments) # BasicExpectation('Request', command, arguments) 94 | Response(request, body) # BasicExpectation('Response', request, body) 95 | Event(event, body) # BasicExpectation('Event', event, body) 96 | Mark(id) # BasicExpectation('Mark', id) 97 | ``` 98 | 99 | For responses, it is often desirable to specify success or failure in general, without details. This can be done by using `SUCCESS` and `FAILURE` in the pattern: 100 | ```py 101 | Response(request, SUCCESS) 102 | Response(request, FAILURE) 103 | ``` 104 | Note that you don't need to do this if you specify the body of the response explicitly, since a successful response will always have a dict as a body, and a failed response will have an exception as a body. 105 | 106 | Since it is very common to wait for a response to a particular request, there is a shortcut to do it directly via the request occurrence: 107 | ```py 108 | initialize = debug_session.send_request('initialize', {'adapterID': 'test'}) 109 | initialize.wait_for_response() # timeline.wait_until(Response(initialize, ANY)) 110 | ``` 111 | 112 | 113 | ## Expectation algebra 114 | 115 | Basic expectations can be combined to form more complicated ones. The four basic operators on expectations are *sequencing* (`>>`), *conjunction* aka "and" (`&`), *disjunction* aka "or" (`|`), and *exclusive disjunction* aka "xor" (`^`). In addition to those, an expectation can be made *conditional*. 116 | 117 | ### Sequencing (`>>`) 118 | 119 | When two expectations are sequenced: `(A >> B)` - the resulting expectation is realized at the occurrence at which `A` and `B` are both realized, but only if `A` was realized before `B`. For example, given an expectation: 120 | ```py 121 | Event('stopped', ANY) >> Event('continued', ANY) 122 | ``` 123 | it will **not** be realized in this timeline: 124 | ```py 125 | ('Event', 'continued', {'threadId': 1}) 126 | ('Event', 'stopped', {'reason': 'breakpoint', 'threadId': 2}) 127 | ``` 128 | because "stopped" happened after "continued", and the expectation was for it to happen before. However, it will be realized in: 129 | ```py 130 | ('Event', 'continued', {'threadId': 1}) 131 | ('Event', 'stopped', {'reason': 'breakpoint', 'threadId': 2}) 132 | ('Event', 'thread', {'reason': 'exited', 'threadId': 1}) 133 | ('Event', 'continued', {'threadId': 2}) 134 | ``` 135 | Note that in this case, there is an unrelated event "thread" in between "stopped" and "continued", which does not affect the result of the operation - by the end of the timeline, both "stopped" and "continued" happened, and they did so in the requested order, so whatever else happened in the timeline does not affect the realization of our expectation. 136 | 137 | Sequencing can also be done with respect to occurrences. Given occurrence `O` and expectation `X`, `(O >> X)` is an expectation that is realized at the first occurrence at which `X` is realized, and which **follows** `O` (note that it cannot be `O` itself!).
For example, given: 138 | ```py 139 | something = timeline.mark('something') 140 | something >> Event('stopped', ANY) 141 | ``` 142 | a timeline like this: 143 | ```py 144 | ('Event', 'stopped', {'reason': 'breakpoint', 'threadId': 1}) 145 | ('Mark', 'something') 146 | ``` 147 | will not realize this expectation, but this one will: 148 | ```py 149 | ('Event', 'stopped', {'reason': 'breakpoint', 'threadId': 1}) 150 | ('Mark', 'something') 151 | ('Event', 'stopped', {'reason': 'breakpoint', 'threadId': 2}) 152 | ``` 153 | 154 | Conversely, `(X >> O)` is an expectation that is realized at the first occurrence at which `X` is realized, and which **precedes** `O` (and, again, it cannot be `O` itself!). So: 155 | ```py 156 | something = timeline.mark('something') 157 | Event('stopped', ANY) >> something 158 | ``` 159 | will be realized at `something`, but only if the timeline already had an event "stopped" at that moment - since timelines only grow into the future, it is impossible for an event necessary to realize this expectation to appear after `mark()` was invoked. 160 | 161 | In practice, this is most often used with requests and responses; for example, to describe an event that should occur between a specific request and its response: 162 | ```py 163 | initialize = debug_session.send_request('initialize', {'adapterID': 'test'}) 164 | initialize_response = initialize.wait_for_response() 165 | assert ( 166 | initialize 167 | >> 168 | Event('initialized', {}) 169 | >> 170 | initialize_response 171 | ) in debug_session.timeline 172 | ``` 173 | Another useful pattern is `>>` combined with `wait_until`: 174 | ```py 175 | initialize = debug_session.send_request('initialize', {'adapterID': 'test'}) 176 | initialized = debug_session.wait_until(Event('initialized')) 177 | assert ( 178 | initialize 179 | >> 180 | Event('output', some.dict.containing({'category': 'telemetry'})) 181 | >> 182 | initialized 183 | ) in debug_session.timeline 184 | ``` 185 | 186 | ### Conjunction (`&`) 187 | 188 | When two expectations are conjuncted: `(A & B)` - the resulting expectation is realized at the occurrence at which `A` and `B` are both realized, regardless of their relative order. Thus, this expectation: 189 | ```py 190 | Event('stopped', ANY) & Event('continued', ANY) 191 | ``` 192 | will be realized in this timeline: 193 | ```py 194 | ('Event', 'continued', {'threadId': 1}) 195 | ('Event', 'thread', {'reason': 'exited', 'threadId': 1}) 196 | ('Event', 'stopped', {'reason': 'breakpoint', 'threadId': 2}) 197 | ``` 198 | but also in a differently ordered timeline: 199 | ```py 200 | ('Event', 'stopped', {'reason': 'breakpoint', 'threadId': 2}) 201 | ('Event', 'continued', {'threadId': 1}) 202 | ('Event', 'thread', {'reason': 'exited', 'threadId': 1}) 203 | ``` 204 | This is most commonly used as seen above, with related events where their relative ordering is unspecified (usually also framed by `>>` to narrow it down to a specific request/response pair). It can also be used to concurrently send multiple requests, and wait until they all get their responses: 205 | ```py 206 | pause1 = debug_session.send_request('pause', {'threadId': '1'}) 207 | pause2 = debug_session.send_request('pause', {'threadId': '2'}) 208 | debug_session.wait_until(Response(pause1) & Response(pause2)) 209 | ``` 210 | 211 | ### Disjunction (`|`) 212 | 213 | When two expectations are disjuncted: `(A | B)` - the resulting expectation is realized at the first occurrence at which either `A` or `B` is realized, or both are realized.
Thus, the expectation: 214 | ```py 215 | Event('stopped', ANY) | Event('continued', ANY) 216 | ``` 217 | will be realized in any of these timelines: 218 | ```py 219 | ('Event', 'continued', {'threadId': 1}) 220 | ('Event', 'thread', {'reason': 'exited', 'threadId': 1}) 221 | 222 | ('Event', 'stopped', {'reason': 'breakpoint', 'threadId': 2}) 223 | 224 | ('Event', 'continued', {'threadId': 1}) 225 | ('Event', 'thread', {'reason': 'exited', 'threadId': 1}) 226 | ('Event', 'stopped', {'reason': 'breakpoint', 'threadId': 2}) 227 | 228 | ('Event', 'thread', {'reason': 'exited', 'threadId': 1}) 229 | ('Event', 'stopped', {'reason': 'breakpoint', 'threadId': 2}) 230 | ('Event', 'continued', {'threadId': 1}) 231 | ``` 232 | 233 | It is usually used to concurrently send multiple requests, and wait until the first one gets a response: 234 | ```py 235 | pause1 = debug_session.send_request('pause', {'threadId': '1'}) 236 | pause2 = debug_session.send_request('pause', {'threadId': '2'}) 237 | pause_response = debug_session.wait_until(Response(pause1) | Response(pause2)) 238 | handled_request = pause_response.request 239 | if handled_request is pause1: 240 | ... 241 | ``` 242 | 243 | ### Exclusive disjunction (`^`) 244 | 245 | When two expectations are exclusively disjuncted: `(A ^ B)` - the resulting expectation is realized at the first occurrence at which either `A` or `B` is realized, but not both. Thus, the expectation: 246 | ```py 247 | Event('stopped', ANY) ^ Event('continued', ANY) 248 | ``` 249 | is realized in: 250 | ```py 251 | ('Event', 'continued', {'threadId': 1}) 252 | ('Event', 'thread', {'reason': 'exited', 'threadId': 1}) 253 | 254 | ('Event', 'thread', {'reason': 'exited', 'threadId': 1}) 255 | ('Event', 'stopped', {'reason': 'breakpoint', 'threadId': 2}) 256 | ``` 257 | but not in: 258 | ```py 259 | ('Event', 'continued', {'threadId': 1}) 260 | ('Event', 'thread', {'reason': 'exited', 'threadId': 1}) 261 | ('Event', 'stopped', {'reason': 'breakpoint', 'threadId': 2}) 262 | ``` 263 | This can be used, for example, to test a request that can produce one of several different events depending on various conditions, but should never produce more than one of them at the same time. 264 | 265 | ### Conditional expectation 266 | 267 | Given an expectation `X`, a conditional expectation `X.when(condition)` is realized at the occurrence `O` at which `X` is realized, but only if `condition(O)` returns `True`. In practice, it is typically used with a lambda to check for events that are caused by requests, but only happen when a request is successful, e.g.: 268 | ```py 269 | initialize = debug_session.send_request('initialize', {'adapterID': 'test'}) 270 | assert Event('initialized', {}).when( 271 | lambda occ: Response(initialize, SUCCESS) in occ.timeline 272 | ) in debug_session.timeline 273 | ``` 274 | 275 | ## Debug session 276 | 277 | A debug session runs ptvsd, and records requests, responses and events on a timeline as they occur. It is an instance of `DebugSession`. A test normally obtains an instance by using the `debug_session` fixture: 278 | ```py 279 | def test_run(debug_session): 280 | ... 281 | ``` 282 | 283 | The timeline is exposed as `debug_session.timeline`. In addition to that, the following methods map directly to the corresponding `debug_session.timeline` methods: 284 | - `debug_session.mark()` 285 | - `debug_session.wait_until()` 286 | - `debug_session.history()` 287 | 288 | A freshly obtained session is dormant - there's no ptvsd running, and nothing to record.
To make it useful, it needs to be primed to run some code: 289 | ```py 290 | debug_session.configure('program', 'example.py') 291 | ``` 292 | or, alternatively: 293 | ```py 294 | debug_session.configure('module', 'example') 295 | ``` 296 | 297 | At this point ptvsd is spun up and connected to the test process, and the initial handshake sequence ("initialize" and "configurationDone" requests) has been performed. The timeline is also live, containing the recording of the handshake, and waiting for further occurrences. However, the debugger is still idle - the script or module we specified isn't actually running yet. This is a good time to issue requests to set any breakpoints: 298 | ```py 299 | debug_session.send_request( 300 | 'setBreakpoints', { 301 | 'source': {'path': 'example.py'}, 302 | 'breakpoints': [{'line': 3}, {'line': 5}] 303 | } 304 | ).wait_for_response() 305 | ``` 306 | 307 | Once all the initial setup is performed, we can start execution: 308 | ```py 309 | debug_session.start_debugging() 310 | debug_session.wait_until(Event('stopped', some.dict.containing({'reason': 'breakpoint'}))) 311 | ``` 312 | 313 | 314 | --------------------------------------------------------------------------------