├── .github ├── CODEOWNERS ├── CONTRIBUTING.md └── workflows │ ├── release.yml │ └── tests.yml ├── .gitignore ├── .husky └── pre-commit ├── .prettierignore ├── .prettierrc ├── LICENSE ├── README.md ├── bin ├── mo-dev.js └── mo-test.js ├── jest.config.js ├── package-lock.json ├── package.json ├── src ├── cache.ts ├── canister.ts ├── commands │ ├── mo-dev.ts │ └── mo-test.ts ├── dfx.ts ├── index.ts ├── replica.ts ├── settings.ts ├── testing.ts ├── utils │ └── motoko.ts ├── wasm.ts └── watch.ts ├── tests ├── mo-dev.test.ts ├── mo-test.test.ts └── project │ ├── .gitignore │ ├── dfx.json │ ├── mops.toml │ ├── motoko_canister │ ├── Main.mo │ ├── lib │ │ └── Echo.mo │ └── test │ │ ├── DefaultFail.test.mo │ │ ├── DefaultPass.test.mo │ │ ├── ImportPass.test.mo │ │ ├── WasiError.test.mo │ │ ├── WasiFail.test.mo │ │ └── WasiPass.test.mo │ └── vm │ ├── dfx.json │ ├── tests │ └── Main.test.mo │ └── vm_canister │ └── Main.mo ├── tsconfig.json └── wasm ├── .gitignore ├── Cargo.toml ├── src └── lib.rs └── tests └── test.rs /.github/CODEOWNERS: -------------------------------------------------------------------------------- 1 | * @dfinity/dx 2 | -------------------------------------------------------------------------------- /.github/CONTRIBUTING.md: -------------------------------------------------------------------------------- 1 | # Contributing 2 | 3 | Thank you for your interest in contributing to this repo. As a member of the community, you are invited and encouraged to contribute by submitting issues, offering suggestions for improvements, adding review comments to existing pull requests, or creating new pull requests to fix issues. 4 | 5 | All contributions to DFINITY documentation and the developer community are respected and appreciated. 6 | Your participation is an important factor in the success of the Internet Computer. 7 | 8 | ## Before you contribute 9 | 10 | Before contributing, please take a few minutes to review these contributor guidelines. 11 | The contributor guidelines are intended to make the contribution process easy and effective for everyone involved in addressing your issue, assessing changes, and finalizing your pull requests. 12 | 13 | Before contributing, consider the following: 14 | 15 | - If you want to report an issue, click **Issues**. 16 | 17 | - If you have a general question, post a message to the [community forum](https://forum.dfinity.org/) or submit a [support request](mailto://support@dfinity.org). 18 | 19 | - If you are reporting a bug, provide as much information about the problem as possible. 20 | 21 | - If you want to contribute directly to this repository, typical fixes might include any of the following: 22 | 23 | - Fixes to resolve bugs or documentation errors 24 | - Code improvements 25 | - Feature requests 26 | 27 | Note that any contribution to this repository must be submitted in the form of a **pull request**. 28 | 29 | - If you are creating a pull request, be sure that the pull request only implements one fix or suggestion. 30 | 31 | If you are new to working with GitHub repositories and creating pull requests, consider exploring [First Contributions](https://github.com/firstcontributions/first-contributions) or [How to Contribute to an Open Source Project on GitHub](https://egghead.io/courses/how-to-contribute-to-an-open-source-project-on-github). 32 | 33 | # How to make a contribution 34 | 35 | Depending on the type of contribution you want to make, you might follow different workflows. 
36 | 37 | This section describes the most common workflow scenarios: 38 | 39 | - Reporting an issue 40 | - Submitting a pull request 41 | 42 | ### Reporting an issue 43 | 44 | To open a new issue: 45 | 46 | 1. Click **Issues**. 47 | 48 | 1. Click **New Issue**. 49 | 50 | 1. Click **Open a blank issue**. 51 | 52 | 1. Type a title and description, then click **Submit new issue**. 53 | 54 | Be as clear and descriptive as possible. 55 | 56 | For any problem, describe it in detail, including details about the crate, the version of the code you are using, the results you expected, and how the actual results differed from your expectations. 57 | 58 | ### Submitting a pull request 59 | 60 | If you want to submit a pull request to fix an issue or add a feature, here's a summary of what you need to do: 61 | 62 | 1. Make sure you have a GitHub account, an internet connection, and access to a terminal shell or GitHub Desktop application for running commands. 63 | 64 | 2. Navigate to the official repository in a web browser. 65 | 66 | 3. Click **Fork** to create a copy of the repository associated with the issue you want to address under your GitHub account or organization name. 67 | 68 | 4. Clone the repository to your local machine. 69 | 70 | 5. Create a new branch for your fix by running a command similar to the following: 71 | 72 | ```bash 73 | git checkout -b my-branch-name-here 74 | ``` 75 | 76 | 6. Open the file you want to fix in a text editor and make the appropriate changes for the issue you are trying to address. 77 | 78 | 7. Add the file contents of the changed files to the index `git` uses to manage the state of the project by running a command similar to the following: 79 | 80 | ```bash 81 | git add path-to-changed-file 82 | ``` 83 | 8. Commit your changes to store the contents you added to the index along with a descriptive message by running a command similar to the following: 84 | 85 | ```bash 86 | git commit -m "Description of the fix being committed." 87 | ``` 88 | 89 | 9. Push the changes to the remote repository by running a command similar to the following: 90 | 91 | ```bash 92 | git push origin my-branch-name-here 93 | ``` 94 | 95 | 10. Create a new pull request for the branch you pushed to the upstream GitHub repository. 96 | 97 | Provide a title that includes a short description of the changes made. 98 | 99 | 11. Wait for the pull request to be reviewed. 100 | 101 | 12. Make changes to the pull request, if requested. 102 | 103 | 13. Celebrate your success after your pull request is merged! 
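
Putting steps 4 through 9 together, the command sequence might look like the following (`<your-username>`, the branch name, and the file path are placeholders for your own values):

```bash
# Clone your fork (replace <your-username> with your GitHub username or organization)
git clone https://github.com/<your-username>/motoko-dev-server.git
cd motoko-dev-server

# Create a branch, make your changes, then stage, commit, and push them
git checkout -b my-branch-name-here
git add path-to-changed-file
git commit -m "Description of the fix being committed."
git push origin my-branch-name-here
```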
104 | -------------------------------------------------------------------------------- /.github/workflows/release.yml: -------------------------------------------------------------------------------- 1 | name: Release 2 | 3 | on: 4 | push: 5 | branches: 6 | - main 7 | paths: 8 | - package.json 9 | 10 | workflow_dispatch: 11 | 12 | jobs: 13 | tag-release: 14 | name: Create release tag 15 | runs-on: ubuntu-latest 16 | steps: 17 | - uses: actions/checkout@v3 18 | - id: create-tag 19 | run: RELEASE_TAG=v$(node -e "console.log(require('./package.json').version)") && git tag $RELEASE_TAG && git push origin $RELEASE_TAG 20 | continue-on-error: true 21 | outputs: 22 | tag-outcome: ${{ steps.create-tag.outcome }} 23 | 24 | release: 25 | name: Release mo-dev 26 | needs: [tag-release] 27 | if: needs.tag-release.outputs.tag-outcome == 'success' 28 | runs-on: ubuntu-latest 29 | steps: 30 | - uses: actions/checkout@v3 31 | - name: Setup Node.js 32 | uses: actions/setup-node@v3 33 | with: 34 | node-version: 16.x 35 | cache: 'npm' 36 | - name: Install dfx 37 | uses: dfinity/setup-dfx@main 38 | - name: Install Mops 39 | run: npm i -g ic-mops 40 | - run: dfx cache install 41 | - run: npm ci 42 | - run: npm run package 43 | - name: Get version 44 | id: version 45 | run: echo ::set-output\ name=TAG_NAME::v$(node -e "console.log(require('./package.json').version)") 46 | - name: Create release 47 | id: create_release 48 | uses: actions/create-release@v1 49 | env: 50 | GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # This token is provided by GitHub Actions 51 | with: 52 | tag_name: ${{ steps.version.outputs.TAG_NAME }} 53 | release_name: mo-dev ${{ steps.version.outputs.TAG_NAME }} 54 | body: Portable `mo-dev` binaries for Linux, macOS, and Windows. 55 | draft: false 56 | prerelease: false 57 | - name: Upload release (Linux) 58 | id: upload-release-linux 59 | uses: actions/upload-release-asset@v1 60 | env: 61 | GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} 62 | with: 63 | upload_url: ${{ steps.create_release.outputs.upload_url }} 64 | asset_path: pkg/mo-dev-linux.tar.gz 65 | asset_name: mo-dev-linux.tar.gz 66 | asset_content_type: application/octet-stream 67 | - name: Upload release (macOS) 68 | id: upload-release-macos 69 | uses: actions/upload-release-asset@v1 70 | env: 71 | GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} 72 | with: 73 | upload_url: ${{ steps.create_release.outputs.upload_url }} 74 | asset_path: pkg/mo-dev-macos.tar.gz 75 | asset_name: mo-dev-macos.tar.gz 76 | asset_content_type: application/octet-stream 77 | - name: Upload release (Windows) 78 | id: upload-release-windows 79 | uses: actions/upload-release-asset@v1 80 | env: 81 | GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} 82 | with: 83 | upload_url: ${{ steps.create_release.outputs.upload_url }} 84 | asset_path: pkg/mo-dev-windows.zip 85 | asset_name: mo-dev-windows.zip 86 | asset_content_type: application/octet-stream 87 | -------------------------------------------------------------------------------- /.github/workflows/tests.yml: -------------------------------------------------------------------------------- 1 | # This workflow will do a clean installation of node dependencies, cache/restore them, build the source code and run tests across different versions of node 2 | # For more information see: https://help.github.com/actions/language-and-framework-guides/using-nodejs-with-github-actions 3 | 4 | name: Tests 5 | 6 | on: 7 | push: 8 | branches: [ "main" ] 9 | pull_request: 10 | branches: [ "main" ] 11 | 12 | jobs: 13 | build: 14 | runs-on: ubuntu-latest 
15 | 16 | strategy: 17 | matrix: 18 | node-version: [18.x, 20.x] 19 | # See supported Node.js release schedule at https://nodejs.org/en/about/releases/ 20 | 21 | steps: 22 | - uses: actions/checkout@v3 23 | - name: Use Node.js ${{ matrix.node-version }} 24 | uses: actions/setup-node@v3 25 | with: 26 | node-version: ${{ matrix.node-version }} 27 | cache: 'npm' 28 | # - run: git clone https://github.com/dfinity/motoko.rs ../motoko.rs --depth 1 29 | - name: Install dfx 30 | uses: dfinity/setup-dfx@main 31 | - name: Install Mops 32 | run: npm i -g ic-mops 33 | - run: dfx cache install 34 | - run: npm ci 35 | - run: npm run build --if-present 36 | - run: npm test 37 | -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | node_modules/ 2 | 3 | /lib 4 | /pkg 5 | 6 | .dfx 7 | .vessel 8 | .mops 9 | 10 | *.test.wasm 11 | -------------------------------------------------------------------------------- /.husky/pre-commit: -------------------------------------------------------------------------------- 1 | #!/bin/sh 2 | . "$(dirname "$0")/_/husky.sh" 3 | 4 | npm run precommit 5 | -------------------------------------------------------------------------------- /.prettierignore: -------------------------------------------------------------------------------- 1 | /lib/generated 2 | -------------------------------------------------------------------------------- /.prettierrc: -------------------------------------------------------------------------------- 1 | { 2 | "singleQuote": true, 3 | "semi": true, 4 | "tabWidth": 4, 5 | "bracketSpacing": true, 6 | "trailingComma": "all" 7 | } 8 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | Apache License 2 | Version 2.0, January 2004 3 | http://www.apache.org/licenses/ 4 | 5 | TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 6 | 7 | 1. Definitions. 8 | 9 | "License" shall mean the terms and conditions for use, reproduction, 10 | and distribution as defined by Sections 1 through 9 of this document. 11 | 12 | "Licensor" shall mean the copyright owner or entity authorized by 13 | the copyright owner that is granting the License. 14 | 15 | "Legal Entity" shall mean the union of the acting entity and all 16 | other entities that control, are controlled by, or are under common 17 | control with that entity. For the purposes of this definition, 18 | "control" means (i) the power, direct or indirect, to cause the 19 | direction or management of such entity, whether by contract or 20 | otherwise, or (ii) ownership of fifty percent (50%) or more of the 21 | outstanding shares, or (iii) beneficial ownership of such entity. 22 | 23 | "You" (or "Your") shall mean an individual or Legal Entity 24 | exercising permissions granted by this License. 25 | 26 | "Source" form shall mean the preferred form for making modifications, 27 | including but not limited to software source code, documentation 28 | source, and configuration files. 29 | 30 | "Object" form shall mean any form resulting from mechanical 31 | transformation or translation of a Source form, including but 32 | not limited to compiled object code, generated documentation, 33 | and conversions to other media types. 
34 | 35 | "Work" shall mean the work of authorship, whether in Source or 36 | Object form, made available under the License, as indicated by a 37 | copyright notice that is included in or attached to the work 38 | (an example is provided in the Appendix below). 39 | 40 | "Derivative Works" shall mean any work, whether in Source or Object 41 | form, that is based on (or derived from) the Work and for which the 42 | editorial revisions, annotations, elaborations, or other modifications 43 | represent, as a whole, an original work of authorship. For the purposes 44 | of this License, Derivative Works shall not include works that remain 45 | separable from, or merely link (or bind by name) to the interfaces of, 46 | the Work and Derivative Works thereof. 47 | 48 | "Contribution" shall mean any work of authorship, including 49 | the original version of the Work and any modifications or additions 50 | to that Work or Derivative Works thereof, that is intentionally 51 | submitted to Licensor for inclusion in the Work by the copyright owner 52 | or by an individual or Legal Entity authorized to submit on behalf of 53 | the copyright owner. For the purposes of this definition, "submitted" 54 | means any form of electronic, verbal, or written communication sent 55 | to the Licensor or its representatives, including but not limited to 56 | communication on electronic mailing lists, source code control systems, 57 | and issue tracking systems that are managed by, or on behalf of, the 58 | Licensor for the purpose of discussing and improving the Work, but 59 | excluding communication that is conspicuously marked or otherwise 60 | designated in writing by the copyright owner as "Not a Contribution." 61 | 62 | "Contributor" shall mean Licensor and any individual or Legal Entity 63 | on behalf of whom a Contribution has been received by Licensor and 64 | subsequently incorporated within the Work. 65 | 66 | 2. Grant of Copyright License. Subject to the terms and conditions of 67 | this License, each Contributor hereby grants to You a perpetual, 68 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 69 | copyright license to reproduce, prepare Derivative Works of, 70 | publicly display, publicly perform, sublicense, and distribute the 71 | Work and such Derivative Works in Source or Object form. 72 | 73 | 3. Grant of Patent License. Subject to the terms and conditions of 74 | this License, each Contributor hereby grants to You a perpetual, 75 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 76 | (except as stated in this section) patent license to make, have made, 77 | use, offer to sell, sell, import, and otherwise transfer the Work, 78 | where such license applies only to those patent claims licensable 79 | by such Contributor that are necessarily infringed by their 80 | Contribution(s) alone or by combination of their Contribution(s) 81 | with the Work to which such Contribution(s) was submitted. If You 82 | institute patent litigation against any entity (including a 83 | cross-claim or counterclaim in a lawsuit) alleging that the Work 84 | or a Contribution incorporated within the Work constitutes direct 85 | or contributory patent infringement, then any patent licenses 86 | granted to You under this License for that Work shall terminate 87 | as of the date such litigation is filed. 88 | 89 | 4. Redistribution. 
You may reproduce and distribute copies of the 90 | Work or Derivative Works thereof in any medium, with or without 91 | modifications, and in Source or Object form, provided that You 92 | meet the following conditions: 93 | 94 | (a) You must give any other recipients of the Work or 95 | Derivative Works a copy of this License; and 96 | 97 | (b) You must cause any modified files to carry prominent notices 98 | stating that You changed the files; and 99 | 100 | (c) You must retain, in the Source form of any Derivative Works 101 | that You distribute, all copyright, patent, trademark, and 102 | attribution notices from the Source form of the Work, 103 | excluding those notices that do not pertain to any part of 104 | the Derivative Works; and 105 | 106 | (d) If the Work includes a "NOTICE" text file as part of its 107 | distribution, then any Derivative Works that You distribute must 108 | include a readable copy of the attribution notices contained 109 | within such NOTICE file, excluding those notices that do not 110 | pertain to any part of the Derivative Works, in at least one 111 | of the following places: within a NOTICE text file distributed 112 | as part of the Derivative Works; within the Source form or 113 | documentation, if provided along with the Derivative Works; or, 114 | within a display generated by the Derivative Works, if and 115 | wherever such third-party notices normally appear. The contents 116 | of the NOTICE file are for informational purposes only and 117 | do not modify the License. You may add Your own attribution 118 | notices within Derivative Works that You distribute, alongside 119 | or as an addendum to the NOTICE text from the Work, provided 120 | that such additional attribution notices cannot be construed 121 | as modifying the License. 122 | 123 | You may add Your own copyright statement to Your modifications and 124 | may provide additional or different license terms and conditions 125 | for use, reproduction, or distribution of Your modifications, or 126 | for any such Derivative Works as a whole, provided Your use, 127 | reproduction, and distribution of the Work otherwise complies with 128 | the conditions stated in this License. 129 | 130 | 5. Submission of Contributions. Unless You explicitly state otherwise, 131 | any Contribution intentionally submitted for inclusion in the Work 132 | by You to the Licensor shall be under the terms and conditions of 133 | this License, without any additional terms or conditions. 134 | Notwithstanding the above, nothing herein shall supersede or modify 135 | the terms of any separate license agreement you may have executed 136 | with Licensor regarding such Contributions. 137 | 138 | 6. Trademarks. This License does not grant permission to use the trade 139 | names, trademarks, service marks, or product names of the Licensor, 140 | except as required for reasonable and customary use in describing the 141 | origin of the Work and reproducing the content of the NOTICE file. 142 | 143 | 7. Disclaimer of Warranty. Unless required by applicable law or 144 | agreed to in writing, Licensor provides the Work (and each 145 | Contributor provides its Contributions) on an "AS IS" BASIS, 146 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or 147 | implied, including, without limitation, any warranties or conditions 148 | of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A 149 | PARTICULAR PURPOSE. 
You are solely responsible for determining the 150 | appropriateness of using or redistributing the Work and assume any 151 | risks associated with Your exercise of permissions under this License. 152 | 153 | 8. Limitation of Liability. In no event and under no legal theory, 154 | whether in tort (including negligence), contract, or otherwise, 155 | unless required by applicable law (such as deliberate and grossly 156 | negligent acts) or agreed to in writing, shall any Contributor be 157 | liable to You for damages, including any direct, indirect, special, 158 | incidental, or consequential damages of any character arising as a 159 | result of this License or out of the use or inability to use the 160 | Work (including but not limited to damages for loss of goodwill, 161 | work stoppage, computer failure or malfunction, or any and all 162 | other commercial damages or losses), even if such Contributor 163 | has been advised of the possibility of such damages. 164 | 165 | 9. Accepting Warranty or Additional Liability. While redistributing 166 | the Work or Derivative Works thereof, You may choose to offer, 167 | and charge a fee for, acceptance of support, warranty, indemnity, 168 | or other liability obligations and/or rights consistent with this 169 | License. However, in accepting such obligations, You may act only 170 | on Your own behalf and on Your sole responsibility, not on behalf 171 | of any other Contributor, and only if You agree to indemnify, 172 | defend, and hold each Contributor harmless for any liability 173 | incurred by, or claims asserted against, such Contributor by reason 174 | of your accepting any such warranty or additional liability. 175 | 176 | END OF TERMS AND CONDITIONS 177 | 178 | APPENDIX: How to apply the Apache License to your work. 179 | 180 | To apply the Apache License to your work, attach the following 181 | boilerplate notice, with the fields enclosed by brackets "[]" 182 | replaced with your own identifying information. (Don't include 183 | the brackets!) The text should be enclosed in the appropriate 184 | comment syntax for the file format. We also recommend that a 185 | file or class name and description of purpose be included on the 186 | same "printed page" as the copyright notice for easier 187 | identification within third-party archives. 188 | 189 | Copyright 2023 DFINITY 190 | 191 | Licensed under the Apache License, Version 2.0 (the "License"); 192 | you may not use this file except in compliance with the License. 193 | You may obtain a copy of the License at 194 | 195 | http://www.apache.org/licenses/LICENSE-2.0 196 | 197 | Unless required by applicable law or agreed to in writing, software 198 | distributed under the License is distributed on an "AS IS" BASIS, 199 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 200 | See the License for the specific language governing permissions and 201 | limitations under the License. 
202 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | 2 | # `mo-dev`  [![npm version](https://img.shields.io/npm/v/mo-dev.svg?logo=npm)](https://www.npmjs.com/package/mo-dev) [![GitHub license](https://img.shields.io/badge/license-Apache%202.0-blue.svg)](https://opensource.org/licenses/Apache-2.0) [![PRs Welcome](https://img.shields.io/badge/PRs-welcome-brightgreen.svg)](https://github.com/dfinity/motoko-dev-server/issues) 3 | 4 | > ### A [live reload](https://blog.logrocket.com/complete-guide-full-stack-live-reload/) development server for [Motoko](https://smartcontracts.org/) smart contracts. 5 | 6 | --- 7 | 8 | `mo-dev` is a flexible command-line tool for speeding up your Motoko development workflow. 9 | 10 | - [Announcement blog post](https://ryanvandersmith.medium.com/announcing-the-motoko-dev-server-live-reloading-for-web3-dapps-20363088afb4) 11 | - [Developer forum topic](https://forum.dfinity.org/t/announcing-mo-dev-live-reloading-for-motoko-dapps/21007) 12 | 13 | ## Try Online 14 | 15 | Get started with a full-stack [Vite + React + Motoko](https://github.com/rvanasa/vite-react-motoko#readme) project directly in your browser: 16 | 17 | [![Open in Gitpod](https://gitpod.io/button/open-in-gitpod.svg)](https://gitpod.io/#https://github.com/rvanasa/vite-react-motoko) 18 | 19 | ## Quick Start 20 | 21 | Run the following command (requires [Node.js](https://nodejs.org/en/) ≥ 16): 22 | 23 | ```sh 24 | npm i -g mo-dev 25 | ``` 26 | 27 | > Note: standalone `mo-dev` binaries are also available as [GitHub releases](https://github.com/dfinity/motoko-dev-server/releases). 28 | 29 | Once installed, view the available command-line options by passing the `--help` flag: 30 | 31 | ```sh 32 | mo-dev --help 33 | ``` 34 | 35 | Check out the [Vite + React + Motoko](https://github.com/rvanasa/vite-react-motoko#readme) starter project for an example of how to integrate `mo-dev` into a modern full-stack webapp. 
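
In a typical project, you might keep `mo-dev` running in one terminal while your frontend dev server runs in another. The sketch below is one possible combination of flags, and it assumes your frontend exposes a `dev` npm script (for example, a Vite project); adjust both to your workflow:

```sh
# Terminal 1: watch Motoko files, regenerate bindings, and redeploy on change
mo-dev --generate --deploy -y

# Terminal 2: run your frontend dev server (e.g. Vite)
npm run dev
```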
36 | 37 | ## Basic Features 38 | 39 | Regenerate type declarations on Motoko file change (`--generate` or `-g`): 40 | 41 | ```sh 42 | mo-dev --generate 43 | ``` 44 | 45 | Deploy canisters on Motoko file change (`--deploy` or `-d`): 46 | 47 | ```sh 48 | mo-dev --deploy 49 | ``` 50 | 51 | Automatically respond "yes" to reinstall prompts (`--yes` or `-y`; may clear canister state): 52 | 53 | ```sh 54 | mo-dev --deploy -y 55 | ``` 56 | 57 | Run unit tests (`*.test.mo`) on Motoko file change (`--test` or `-t`): 58 | 59 | ```sh 60 | mo-dev --test 61 | ``` 62 | 63 | Run an arbitrary command on Motoko file change (`--exec` or `-x`): 64 | 65 | ```sh 66 | mo-dev --exec 'npm run my-reload-script' 67 | ``` 68 | 69 | Specify the working directory (`--cwd` or `-C`; should contain a `dfx.json` file): 70 | 71 | ```sh 72 | mo-dev --cwd path/to/dfx_project 73 | ``` 74 | 75 | Only run the dev server for specific canisters (`--canister` or `-c`): 76 | 77 | ```sh 78 | mo-dev --canister foo --canister bar --deploy 79 | ``` 80 | 81 | Pass an installation argument to `dfx deploy` (`--argument` or `-a`): 82 | 83 | ```sh 84 | mo-dev --deploy --argument '()' 85 | ``` 86 | 87 | ## Advanced Features 88 | 89 | Show additional debug output in the console (`--verbose` or `-v`): 90 | 91 | ```sh 92 | mo-dev -v # more verbose 93 | mo-dev -vv # extra verbose 94 | ``` 95 | 96 | Programmatically start `mo-dev` using JavaScript: 97 | 98 | ```js 99 | import devServer from 'mo-dev'; 100 | 101 | // Default settings 102 | devServer(); 103 | 104 | // Custom settings 105 | devServer({ 106 | directory: '.', 107 | port: 7700, 108 | verbosity: 0, 109 | // ... 110 | }); 111 | ``` 112 | 113 | ## `mo-test` 114 | 115 | The `mo-dev` npm package includes a `mo-test` command which can be used to run unit tests in CI workflows. 116 | 117 | View all available options: 118 | 119 | ```sh 120 | mo-test --help 121 | ``` 122 | 123 | Run all Motoko unit tests (`*.test.mo`): 124 | 125 | ```sh 126 | mo-test 127 | ``` 128 | 129 | Run all Motoko unit tests using a WASI runtime by default (faster but requires installing [Wasmtime](https://wasmtime.dev/) on your system): 130 | 131 | ```sh 132 | mo-test --testmode wasi 133 | ``` 134 | 135 | Configure the runtime of an individual unit test by including the following comment in a `*.test.mo` file: 136 | 137 | ```motoko 138 | // @testmode wasi 139 | ``` 140 | 141 | Run specific unit tests by passing a file name prefix (`-f` or `--testfile`): 142 | 143 | ```sh 144 | mo-test -f Foo -f Bar # (only run tests for files starting with `Foo` or `Bar`) 145 | ``` 146 | 147 | These options may also be passed directly into the `mo-dev` command (e.g. `mo-dev --testmode wasi -f SomeTest`). 148 | 149 | --- 150 | 151 | `mo-dev` is early in development. Please feel free to report a bug, ask a question, or request a feature on the project's [GitHub issues](https://github.com/dfinity/motoko-dev-server/issues) page. 152 | 153 | Contributions are welcome! Please check out the [contributor guidelines](https://github.com/dfinity/motoko-dev-server/blob/main/.github/CONTRIBUTING.md) for more information. 154 | -------------------------------------------------------------------------------- /bin/mo-dev.js: -------------------------------------------------------------------------------- 1 | #! 
/usr/bin/env node 2 | 'use strict'; 3 | 4 | require('../lib/commands/mo-dev'); 5 | -------------------------------------------------------------------------------- /bin/mo-test.js: -------------------------------------------------------------------------------- 1 | #! /usr/bin/env node 2 | 'use strict'; 3 | 4 | require('../lib/commands/mo-test'); 5 | -------------------------------------------------------------------------------- /jest.config.js: -------------------------------------------------------------------------------- 1 | module.exports = { 2 | preset: 'ts-jest', 3 | }; 4 | -------------------------------------------------------------------------------- /package.json: -------------------------------------------------------------------------------- 1 | { 2 | "name": "mo-dev", 3 | "version": "0.13.0", 4 | "description": "A live reload development server for Motoko smart contracts.", 5 | "author": "DFINITY Foundation (https://dfinity.org)", 6 | "license": "Apache-2.0", 7 | "main": "lib/index.js", 8 | "types": "lib/index.d.ts", 9 | "bin": { 10 | "mo-dev": "bin/mo-dev.js", 11 | "mo-test": "bin/mo-test.js" 12 | }, 13 | "scripts": { 14 | "start": "ts-node src/commands/mo-dev", 15 | "mo-test": "ts-node src/commands/mo-test", 16 | "build:ts": "rimraf ./lib && tsc -p .", 17 | "build:wasm": "", 18 | "build:wasm_": "wasm-pack build wasm --target nodejs --out-dir pkg/nodejs", 19 | "build": "run-s build:wasm build:ts", 20 | "prepare": "husky install", 21 | "prepare_": "husky install && cd wasm && cargo update", 22 | "test": "cross-env NODE_OPTIONS=--experimental-vm-modules jest", 23 | "package": "rimraf ./pkg && run-s build test package:pkg package:zip", 24 | "package:pkg": "pkg -t node18-linux,node18-win,node18-macos lib/commands/mo-dev.js -o pkg/mo-dev", 25 | "package:zip": "run-s package:zip:linux package:zip:macos package:zip:win", 26 | "package:zip:linux": "cd pkg && mkdir linux && cp mo-dev-linux linux/mo-dev && cd linux && tar -czvf ../mo-dev-linux.tar.gz mo-dev", 27 | "package:zip:macos": "cd pkg && mkdir macos && cp mo-dev-macos macos/mo-dev && cd macos && tar -czvf ../mo-dev-macos.tar.gz mo-dev", 28 | "package:zip:win": "cd pkg && mkdir win && cp mo-dev-win.exe win/mo-dev.exe && cd win && zip ../mo-dev-windows.zip mo-dev.exe", 29 | "precommit": "lint-staged", 30 | "prepublishOnly": "run-s build test" 31 | }, 32 | "dependencies": { 33 | "@dfinity/candid": "^1.2.0", 34 | "@dfinity/principal": "^1.2.0", 35 | "body-parser": "^1.20.2", 36 | "chokidar": "^3.5.3", 37 | "commander": "^10.0.0", 38 | "cookie-parser": "^1.4.6", 39 | "cors": "^2.8.5", 40 | "debug": "^4.3.4", 41 | "execa": "^5.1.1", 42 | "express": "^4.19.2", 43 | "fast-glob": "^3.2.12", 44 | "morgan": "^1.10.0", 45 | "motoko": "^3.6.16", 46 | "node-cleanup": "^2.1.2", 47 | "picocolors": "^1.0.0", 48 | "shell-escape": "^0.2.0" 49 | }, 50 | "devDependencies": { 51 | "@types/body-parser": "^1.19.2", 52 | "@types/bytes": "^3.1.1", 53 | "@types/compression": "^1.7.2", 54 | "@types/cookie-parser": "^1.4.3", 55 | "@types/cors": "^2.8.13", 56 | "@types/express": "^4.17.17", 57 | "@types/jest": "^29.4.4", 58 | "@types/morgan": "^1.9.4", 59 | "@types/node-cleanup": "^2.1.2", 60 | "@types/shell-escape": "^0.2.1", 61 | "@types/swagger-jsdoc": "^6.0.1", 62 | "@types/swagger-ui-express": "^4.1.3", 63 | "@types/type-is": "^1.6.3", 64 | "@types/which": "^2.0.2", 65 | "@wasmer/wasi": "^1.2.2", 66 | "axios": "^1.3.4", 67 | "cross-env": "^7.0.3", 68 | "eslint-config-prettier": "^8.7.0", 69 | "husky": "^8.0.1", 70 | "jest": "^29.4.4", 71 | 
"lint-staged": "^13.2.0", 72 | "npm-run-all": "^4.1.5", 73 | "pkg": "^5.8.1", 74 | "prettier": "^2.8.4", 75 | "ts-jest": "^29.0.5", 76 | "ts-node": "^10.9.1", 77 | "typescript": "^4.9.5", 78 | "wasm-pack": "^0.10.3" 79 | }, 80 | "repository": { 81 | "type": "git", 82 | "url": "https://github.com/dfinity/motoko-dev-server.git" 83 | }, 84 | "lint-staged": { 85 | "{lib,contrib,utils}/**/*.{js,ts,jsx,tsx}": [ 86 | "prettier --write" 87 | ] 88 | }, 89 | "directories": { 90 | "lib": "lib", 91 | "example": "examples" 92 | }, 93 | "files": [ 94 | "index.js", 95 | "bin/**/*", 96 | "lib/**/*", 97 | "wasm/pkg/**/*" 98 | ], 99 | "keywords": [ 100 | "cli", 101 | "motoko", 102 | "language", 103 | "programming-language", 104 | "dfinity", 105 | "smart-contract", 106 | "canister", 107 | "browser", 108 | "ic", 109 | "icp", 110 | "internet-computer", 111 | "blockchain", 112 | "cryptocurrency", 113 | "nft", 114 | "token", 115 | "webpack", 116 | "live-reload", 117 | "server", 118 | "hot-module-reload" 119 | ] 120 | } 121 | -------------------------------------------------------------------------------- /src/cache.ts: -------------------------------------------------------------------------------- 1 | import { existsSync, readFileSync } from 'fs'; 2 | import { resolve } from 'path'; 3 | 4 | interface Config { 5 | directory: string; 6 | } 7 | 8 | export class FileCache { 9 | private readonly _config: Config; 10 | private readonly _map = new Map(); 11 | 12 | constructor(config: Config) { 13 | this._config = config; 14 | } 15 | 16 | _resolvePath(path: string) { 17 | return resolve(this._config.directory, path); 18 | } 19 | 20 | /** 21 | * Find the file content for a given path. 22 | * @param path File system path 23 | */ 24 | get(path: string): string | undefined { 25 | path = this._resolvePath(path); 26 | if (!existsSync(path)) { 27 | return; 28 | } 29 | const cached = this._map.get(path); 30 | if (cached !== undefined) { 31 | return cached; 32 | } 33 | const data = readFileSync(path, 'utf8'); 34 | this._map.set(path, data); 35 | return data; 36 | } 37 | 38 | /** 39 | * Update the source code for a given path. Returns `true` if changed, otherwise `false`. 40 | * @param path File system path 41 | */ 42 | update(path: string): boolean { 43 | path = this._resolvePath(path); 44 | const previous = this._map.get(path); 45 | this._map.delete(path); 46 | const current = this.get(path); 47 | return current !== previous; 48 | } 49 | 50 | /** 51 | * Reset the cache for the given path. Returns `true` if changed, otherwise `false`. 
52 | * @returns File system path 53 | */ 54 | invalidate(path: string): boolean { 55 | path = this._resolvePath(path); 56 | return this._map.delete(path); 57 | } 58 | } 59 | -------------------------------------------------------------------------------- /src/canister.ts: -------------------------------------------------------------------------------- 1 | import { resolve } from 'path'; 2 | import { DfxConfig } from './dfx'; 3 | 4 | export interface Canister { 5 | alias: string; 6 | file: string; 7 | config: any; 8 | } 9 | 10 | export function getDfxCanisters( 11 | directory: string, 12 | dfxConfig: DfxConfig, 13 | ): Canister[] { 14 | const canisters: Canister[] = []; 15 | if (dfxConfig?.canisters) { 16 | Object.entries(dfxConfig.canisters).forEach(([alias, config]) => { 17 | if (!config) { 18 | return; 19 | } 20 | const { type, main } = config; 21 | if (main && (!type || type === 'motoko')) { 22 | canisters.push({ 23 | alias, 24 | file: resolve(directory, main), 25 | config, 26 | }); 27 | } 28 | }); 29 | } 30 | return canisters; 31 | } 32 | -------------------------------------------------------------------------------- /src/commands/mo-dev.ts: -------------------------------------------------------------------------------- 1 | import { program, Option } from 'commander'; 2 | import { resolve } from 'path'; 3 | import { Settings, defaultSettings } from '../settings'; 4 | import devServer from '..'; 5 | import { TestMode, asTestMode } from '../testing'; 6 | 7 | let verbosity = defaultSettings.verbosity; 8 | const increaseVerbosity = () => verbosity++; 9 | 10 | const testModes: TestMode[] = []; 11 | const addTestMode = (mode: string) => testModes.push(asTestMode(mode)); 12 | 13 | const testFiles: string[] = []; 14 | const addTestFile = (file: string) => testFiles.push(file); 15 | 16 | const canisterNames: string[] = []; 17 | const addCanisterName = (name: string) => canisterNames.push(name); 18 | 19 | const deployArgs: string[] = []; 20 | const addDeployArg = (arg: string) => deployArgs.push(arg); 21 | 22 | const examples: [string, string][] = [ 23 | ['-d', 'redeploy canisters on file change'], 24 | ['-d -y', 'upgrade canisters on file change'], 25 | ['-g', 'generate TypeScript bindings on file change'], 26 | ['-t', 'run unit tests on file change'], 27 | [ 28 | '-r -c foo_canister -c bar_canister', 29 | 'redeploy `foo_canister` and `bar_canister` on file change', 30 | ], 31 | ]; 32 | 33 | const { 34 | cwd, 35 | version, 36 | port, 37 | delay, 38 | exec, 39 | generate, 40 | deploy, 41 | test, 42 | yes, 43 | hotReload, 44 | } = program 45 | .name('mo-dev') 46 | .description( 47 | `Examples:\n${examples 48 | .map( 49 | ([usage, description]) => 50 | ` $ mo-dev ${usage} # ${description}`, 51 | ) 52 | .join('\n')}`, 53 | ) 54 | .option('-V, --version', `show installed version`) 55 | .option('-C, --cwd ', 'directory containing a `dfx.json` file') 56 | .option('-d, --deploy', `run \`dfx deploy\` on file change`) 57 | .option('-t, --test', `run unit tests on file change`) 58 | .option( 59 | '--testmode ', 60 | `default test mode (interpreter, wasi)`, 61 | addTestMode, 62 | ) 63 | .option( 64 | '-f, --testfile ', 65 | `only run tests with the given file name prefix`, 66 | addTestFile, 67 | ) 68 | .option( 69 | '-c, --canister ', 70 | `use the given Motoko canister`, 71 | addCanisterName, 72 | ) 73 | .option( 74 | '-a, --argument ', 75 | `pass an install argument to \`dfx deploy\``, 76 | addDeployArg, 77 | ) 78 | .option( 79 | '-y, --yes', 80 | `respond "yes" to reinstall prompts (may reset canister 
state)`, 81 | ) 82 | .option('-g, --generate', `run \`dfx generate\` on file change`) 83 | .option('-x, --exec ', `execute command on file change`) 84 | .option('-v, --verbose', `show more details in console`, increaseVerbosity) 85 | .parse() 86 | .opts(); 87 | 88 | if (version) { 89 | console.log('mo-dev', require('../../package.json').version); 90 | process.exit(0); 91 | } 92 | 93 | const settings: Settings = { 94 | directory: resolve(cwd || defaultSettings.directory), 95 | port: port ? +port : defaultSettings.port, 96 | delay: !!delay || defaultSettings.delay, 97 | execute: exec || defaultSettings.execute, 98 | verbosity, 99 | generate: !!generate || defaultSettings.generate, 100 | deploy: !!deploy || defaultSettings.deploy, 101 | deployArgs: deployArgs.length ? deployArgs : defaultSettings.deployArgs, 102 | test: !!test || defaultSettings.test, 103 | testModes: testModes.length ? testModes : defaultSettings.testModes, 104 | testFiles: testFiles.length ? testFiles : defaultSettings.testFiles, 105 | canisterNames: canisterNames.length 106 | ? canisterNames 107 | : defaultSettings.canisterNames, 108 | reinstall: !!yes || defaultSettings.reinstall, 109 | hotReload: !!hotReload || defaultSettings.hotReload, 110 | // ci: !!ci || defaultSettings.ci, 111 | ci: defaultSettings.ci, 112 | }; 113 | 114 | devServer(settings).catch((err) => { 115 | console.error(err.stack || err); 116 | process.exit(1); 117 | }); 118 | -------------------------------------------------------------------------------- /src/commands/mo-test.ts: -------------------------------------------------------------------------------- 1 | import { program } from 'commander'; 2 | import { resolve } from 'path'; 3 | import { defaultSettings } from '../settings'; 4 | import { TestMode, asTestMode, runTests } from '../testing'; 5 | 6 | let verbosity = defaultSettings.verbosity; 7 | const increaseVerbosity = () => verbosity++; 8 | 9 | const testModes: TestMode[] = []; 10 | const addTestMode = (mode: string) => testModes.push(asTestMode(mode)); 11 | const testFiles: string[] = []; 12 | const addTestFile = (file: string) => testFiles.push(file); 13 | 14 | const examples: [string, string][] = [ 15 | [ 16 | '--testmode wasi', 17 | 'Use the WASI runtime by default (faster but requires `wasmtime` on your system path)', 18 | ], 19 | [ 20 | '--testmode wasi --testmode interpreter', 21 | 'Use both the interpreter and WASI runtimes by default', 22 | ], 23 | ['-f MyTest', 'Only run tests with file names starting with `MyTest`'], 24 | [ 25 | '-f Foo -f Bar', 26 | 'Only run tests with file names starting with either `Foo` or `Bar`', 27 | ], 28 | ]; 29 | 30 | const { cwd, version } = program 31 | .name('mo-test') 32 | .description( 33 | `Examples:\n${examples 34 | .map( 35 | ([usage, description]) => 36 | ` $ mo-test ${usage} # ${description}`, 37 | ) 38 | .join('\n')}`, 39 | ) 40 | .option('-V, --version', `show installed version`) 41 | .option('-C, --cwd ', 'directory containing a `dfx.json` file') 42 | .option( 43 | '--testmode ', 44 | `default test mode (interpreter, wasi)`, 45 | addTestMode, 46 | ) 47 | .option( 48 | '-f, --testfile ', 49 | `only run tests with the given file name prefix`, 50 | addTestFile, 51 | ) 52 | .option('-v, --verbose', `show more details in console`, increaseVerbosity) 53 | .parse() 54 | .opts(); 55 | 56 | if (version) { 57 | console.log('mo-test', require('../../package.json').version); 58 | process.exit(0); 59 | } 60 | 61 | const settings = { 62 | directory: resolve(cwd || defaultSettings.directory), 63 | testModes: 
testModes.length ? testModes : defaultSettings.testModes, 64 | testFiles: testFiles.length ? testFiles : defaultSettings.testFiles, 65 | verbosity, 66 | }; 67 | 68 | runTests(settings) 69 | .then((runs) => { 70 | if ( 71 | runs.length === 0 || 72 | runs.some( 73 | (run) => run.status !== 'passed' && run.status !== 'skipped', 74 | ) 75 | ) { 76 | process.exit(1); 77 | } 78 | }) 79 | .catch((err) => { 80 | console.error(err.stack || err); 81 | process.exit(1); 82 | }); 83 | -------------------------------------------------------------------------------- /src/dfx.ts: -------------------------------------------------------------------------------- 1 | import { existsSync, readFileSync } from 'fs'; 2 | import { resolve } from 'path'; 3 | import execa from 'execa'; 4 | 5 | export interface DfxConfig { 6 | dfx?: string; 7 | canisters?: Record; 8 | defaults?: { 9 | build?: { 10 | packtool?: string; 11 | }; 12 | }; 13 | } 14 | 15 | export interface CanisterConfig { 16 | type?: string; 17 | main?: string; 18 | dependencies?: string[]; 19 | } 20 | 21 | export async function loadDfxConfig( 22 | directory: string, 23 | ): Promise { 24 | const dfxPath = resolve(directory, 'dfx.json'); 25 | if (!existsSync(dfxPath)) { 26 | return; 27 | } 28 | return JSON.parse(readFileSync(dfxPath, 'utf8')); 29 | } 30 | 31 | export async function loadDfxSources( 32 | directory: string, 33 | ): Promise { 34 | const dfxConfig = await loadDfxConfig(directory); 35 | const packtool = dfxConfig?.defaults?.build?.packtool; 36 | if (!packtool) { 37 | return; 38 | } 39 | const packtoolResult = await execa(packtool, { 40 | shell: true, 41 | cwd: directory, 42 | reject: false, 43 | }); 44 | if (packtoolResult.failed) { 45 | throw new Error( 46 | `Error while running 'defaults.build.packtool' command from dfx.json file in ${directory}`, 47 | ); 48 | } 49 | return packtoolResult.stdout.replace(/\n/g, ' '); 50 | } 51 | -------------------------------------------------------------------------------- /src/index.ts: -------------------------------------------------------------------------------- 1 | import { Settings, defaultSettings, validateSettings } from './settings'; 2 | import wasm from './wasm'; 3 | import { watch } from './watch'; 4 | 5 | export { Settings, defaultSettings }; 6 | 7 | export default async function devServer(options: Partial = {}) { 8 | const settings = await validateSettings(options); 9 | 10 | wasm.update_settings(settings); 11 | 12 | const output = { 13 | watcher: await watch(settings), 14 | close() { 15 | output.watcher.close(); 16 | }, 17 | }; 18 | return output; 19 | } 20 | -------------------------------------------------------------------------------- /src/replica.ts: -------------------------------------------------------------------------------- 1 | import { spawn } from 'child_process'; 2 | 3 | // Replica proxy fallback (currently unused) 4 | 5 | export function startReplica(directory: string, port: number) { 6 | const replica = spawn( 7 | 'dfx', 8 | [ 9 | 'start', 10 | // '--clean', 11 | '--host', 12 | `127.0.0.1:${port}`, 13 | ], 14 | { 15 | cwd: directory, 16 | }, 17 | ); 18 | replica.stdout.pipe(process.stdout); 19 | replica.stderr.pipe(process.stderr); 20 | 21 | return replica; 22 | } 23 | -------------------------------------------------------------------------------- /src/settings.ts: -------------------------------------------------------------------------------- 1 | import pc from 'picocolors'; 2 | import { loadDfxConfig } from './dfx'; 3 | import { TestMode } from './testing'; 4 | 5 | export 
interface Settings { 6 | directory: string; 7 | port: number; 8 | delay: boolean; 9 | execute: string; 10 | verbosity: number; 11 | generate: boolean; 12 | deploy: boolean; 13 | deployArgs: string[]; 14 | test: boolean; 15 | testModes: TestMode[]; 16 | testFiles: string[]; 17 | canisterNames: string[]; 18 | reinstall: boolean; 19 | hotReload: boolean; 20 | ci: boolean; 21 | } 22 | 23 | export const defaultSettings: Settings = { 24 | directory: '.', 25 | port: 7700, 26 | delay: false, 27 | execute: '', 28 | verbosity: 0, 29 | generate: false, 30 | deploy: false, 31 | deployArgs: [], 32 | test: false, 33 | testModes: ['interpreter'], 34 | testFiles: [], 35 | canisterNames: [], 36 | reinstall: false, 37 | hotReload: false, 38 | ci: process.env.CI && process.env.CI !== '0' && process.env.CI !== 'false', 39 | }; 40 | 41 | export async function validateSettings( 42 | settings: Partial, 43 | ): Promise { 44 | const resolvedSettings: Settings = { 45 | ...defaultSettings, 46 | ...settings, 47 | }; 48 | if (!(await loadDfxConfig(resolvedSettings.directory))) { 49 | console.error( 50 | pc.yellow( 51 | `Please specify a directory containing a \`dfx.json\` config file.`, 52 | ), 53 | ); 54 | console.error(); 55 | console.error( 56 | pc.bold(`Example:`), 57 | '$ mo-dev -c path/to/my/dfx_project', 58 | ); 59 | console.error(); 60 | process.exit(1); 61 | } 62 | return resolvedSettings; 63 | } 64 | -------------------------------------------------------------------------------- /src/testing.ts: -------------------------------------------------------------------------------- 1 | import execa from 'execa'; 2 | import glob from 'fast-glob'; 3 | import { existsSync, unlinkSync } from 'fs'; 4 | import { readFile } from 'fs/promises'; 5 | import onCleanup from 'node-cleanup'; 6 | import { basename, dirname, join, relative } from 'path'; 7 | import pc from 'picocolors'; 8 | import shellEscape from 'shell-escape'; 9 | import { loadDfxSources } from './dfx'; 10 | import { validateSettings } from './settings'; 11 | 12 | export interface TestSettings { 13 | directory: string; 14 | verbosity: number; 15 | testModes: TestMode[]; 16 | testFiles: string[]; 17 | } 18 | 19 | export interface Test { 20 | path: string; 21 | } 22 | 23 | export type TestMode = 'interpreter' | 'wasi'; 24 | export type TestStatus = 'passed' | 'failed' | 'errored' | 'skipped'; 25 | 26 | export interface TestRun { 27 | test: Test; 28 | mode: TestMode; 29 | status: TestStatus; 30 | message?: string | undefined; 31 | stdout: string; 32 | stderr: string; 33 | } 34 | 35 | export function asTestMode(mode: string): TestMode { 36 | // TODO: possibly validate here 37 | return mode as TestMode; 38 | } 39 | 40 | export async function runTests( 41 | options: Partial, 42 | callback?: (result: TestRun) => void | Promise, 43 | ) { 44 | const settings = (await validateSettings(options)) as TestSettings; 45 | const { directory } = settings; 46 | 47 | const testFilePattern = '*.test.mo'; 48 | const paths = ( 49 | await glob(`**/${testFilePattern}`, { 50 | cwd: directory, 51 | dot: false, 52 | ignore: ['**/node_modules/**'], 53 | }) 54 | ).filter((path) => { 55 | const filename = basename(path); 56 | return ( 57 | !settings.testFiles.length || 58 | settings.testFiles.some((f) => filename.startsWith(f)) 59 | ); 60 | }); 61 | 62 | const mocPath = await findMocPath(settings); 63 | const dfxSources = await loadDfxSources(directory); 64 | 65 | if (settings.verbosity >= 1) { 66 | console.log(pc.magenta(dfxSources ?? 
'(no package sources)')); 67 | } 68 | 69 | if (settings.verbosity >= 0) { 70 | console.log( 71 | `Running ${paths.length} unit test file${ 72 | paths.length === 1 ? '' : 's' 73 | } ${pc.dim( 74 | `(${ 75 | settings.testFiles.length 76 | ? settings.testFiles.map((f) => `${f}*`).join(', ') 77 | : testFilePattern 78 | })`, 79 | )}\n`, 80 | ); 81 | } 82 | 83 | const defaultStatusEmoji = '❓'; 84 | const statusEmojis: Record = { 85 | passed: '✅', 86 | failed: '❌', 87 | skipped: '⏩', 88 | errored: '❗', 89 | }; 90 | 91 | const runs: TestRun[] = []; 92 | for (const path of paths) { 93 | const test = { 94 | path: join(directory, path), 95 | }; 96 | const fileRuns = await runTestFile(test, settings, { 97 | mocPath, 98 | dfxSources, 99 | }); 100 | for (const run of fileRuns) { 101 | runs.push(run); 102 | if (callback) { 103 | await callback(run); 104 | } 105 | if (settings.verbosity >= 0) { 106 | const important = 107 | run.status === 'errored' || run.status === 'failed'; 108 | if (important) { 109 | if (run.stdout?.trim()) { 110 | console.error(run.stdout); 111 | } 112 | if (run.stderr?.trim()) { 113 | console.error(pc.bold(pc.red(run.stderr))); 114 | } 115 | } 116 | const showTestMode = 117 | settings.testModes.length > 1 || 118 | !settings.testModes.includes(run.mode); 119 | const decorateExtension = '.test.mo'; 120 | const decoratedPath = run.test.path.endsWith(decorateExtension) 121 | ? `${run.test.path.slice(0, -decorateExtension.length)}` 122 | : run.test.path; 123 | console.log( 124 | pc.white( 125 | `${ 126 | statusEmojis[run.status] ?? defaultStatusEmoji 127 | } ${relative(settings.directory, decoratedPath)}${ 128 | showTestMode ? pc.dim(` (${run.mode})`) : '' 129 | }`, 130 | ), 131 | ); 132 | } 133 | } 134 | } 135 | 136 | if (runs.length && settings.verbosity >= 0) { 137 | const counts = {} as Record; 138 | runs.forEach( 139 | (run) => (counts[run.status] = (counts[run.status] || 0) + 1), 140 | ); 141 | console.log(); 142 | console.log( 143 | (['passed', 'failed', 'errored', 'skipped']) 144 | .filter((s) => s in counts) 145 | .map((s) => `${counts[s]} ${s}`) 146 | .concat([`${runs.length} total`]) 147 | .join(', '), 148 | ); 149 | console.log(); 150 | } 151 | return runs; 152 | } 153 | 154 | interface SharedTestInfo { 155 | mocPath: string; 156 | dfxSources: string | undefined; 157 | } 158 | 159 | async function runTestFile( 160 | test: Test, 161 | settings: TestSettings, 162 | shared: SharedTestInfo, 163 | ): Promise { 164 | const source = await readFile(test.path, 'utf8'); 165 | const modes: TestMode[] = []; 166 | const modeRegex = /\/\/[^\S\n]*@testmode[^\S\n]*([a-zA-Z]+)/g; 167 | let nextMode: string; 168 | while ((nextMode = modeRegex.exec(source)?.[1])) { 169 | modes.push(asTestMode(nextMode)); 170 | } 171 | if (!modes.length) { 172 | modes.push(...settings.testModes); 173 | } 174 | 175 | const runs: TestRun[] = []; 176 | for (const mode of modes) { 177 | runs.push(await runTest(test, mode, settings, shared)); 178 | } 179 | return runs; 180 | } 181 | 182 | async function runTest( 183 | test: Test, 184 | mode: TestMode, 185 | settings: TestSettings, 186 | { mocPath, dfxSources }: SharedTestInfo, 187 | ): Promise { 188 | const { path } = test; 189 | 190 | if (settings.verbosity >= 1) { 191 | console.log( 192 | pc.blue( 193 | `Running test: ${relative( 194 | settings.directory, 195 | test.path, 196 | )} (${mode})`, 197 | ), 198 | ); 199 | } 200 | 201 | try { 202 | if (mode === 'interpreter') { 203 | const interpretResult = await execa( 204 | `${shellEscape([mocPath, '-r', path])}${ 205 | 
dfxSources ? ` ${dfxSources}` : '' 206 | }`, 207 | { shell: true, cwd: settings.directory, reject: false }, 208 | ); 209 | return { 210 | test, 211 | mode, 212 | status: interpretResult.failed ? 'failed' : 'passed', 213 | stdout: interpretResult.stdout, 214 | stderr: interpretResult.stderr, 215 | }; 216 | } else if (mode === 'wasi') { 217 | const wasmPath = `${path.replace(/\.mo$/i, '')}.wasm`; 218 | const cleanup = () => { 219 | if (existsSync(wasmPath)) { 220 | unlinkSync(wasmPath); 221 | } 222 | }; 223 | cleanupHandlers.add(cleanup); 224 | try { 225 | await execa( 226 | `${shellEscape([ 227 | mocPath, 228 | '-wasi-system-api', 229 | '-o', 230 | wasmPath, 231 | path, 232 | ])}${dfxSources ? ` ${dfxSources}` : ''}`, 233 | { shell: true, cwd: settings.directory }, 234 | ); 235 | const wasmtimeResult = await execa( 236 | 'wasmtime', 237 | [ 238 | basename(wasmPath), 239 | '--enable-cranelift-nan-canonicalization', 240 | '--wasm-features', 241 | 'multi-memory,bulk-memory', 242 | ], 243 | { 244 | cwd: dirname(path), 245 | reject: false, 246 | }, 247 | ); 248 | return { 249 | test, 250 | mode, 251 | status: wasmtimeResult.failed ? 'failed' : 'passed', 252 | stdout: wasmtimeResult.stdout, 253 | stderr: wasmtimeResult.stderr, 254 | }; 255 | } finally { 256 | cleanup(); 257 | cleanupHandlers.delete(cleanup); 258 | } 259 | } else { 260 | throw new Error(`Invalid test mode: '${mode}'`); 261 | } 262 | } catch (err) { 263 | return { 264 | test, 265 | mode, 266 | status: 'errored', 267 | stdout: err.stdout, 268 | stderr: err.stderr || String(err.stack || err), 269 | }; 270 | } 271 | } 272 | 273 | const dfxCacheLocationMap = new Map(); 274 | async function findDfxCacheLocation(directory: string): Promise { 275 | let cacheLocation = dfxCacheLocationMap.get(directory); 276 | if (!cacheLocation) { 277 | cacheLocation = ( 278 | await execa('dfx', ['cache', 'show'], { 279 | cwd: directory, 280 | }) 281 | ).stdout; 282 | dfxCacheLocationMap.set(directory, cacheLocation); 283 | } 284 | return cacheLocation; 285 | } 286 | 287 | async function findMocPath(settings: TestSettings): Promise { 288 | const mocCommand = 'moc'; 289 | return ( 290 | // (await which(mocCommand, { nothrow: true })) || 291 | process.env.MOC_BINARY || 292 | join(await findDfxCacheLocation(settings.directory), mocCommand) 293 | ); 294 | } 295 | 296 | type CleanupHandler = ( 297 | exitCode: number | null, 298 | signal: string | null, 299 | ) => boolean | undefined | void; 300 | 301 | const cleanupHandlers = new Set(); 302 | onCleanup((...args) => { 303 | for (const handler of cleanupHandlers) { 304 | handler(...args); 305 | } 306 | }); 307 | -------------------------------------------------------------------------------- /src/utils/motoko.ts: -------------------------------------------------------------------------------- 1 | import motoko from 'motoko'; 2 | import { sep } from 'path'; 3 | 4 | export function getVirtualFile(file: string) { 5 | return motoko.file(file.replace(sep, '/')); // TODO: test on Windows 6 | } 7 | -------------------------------------------------------------------------------- /src/wasm.ts: -------------------------------------------------------------------------------- 1 | // import wasm from '../wasm/pkg/nodejs/wasm'; 2 | // export default wasm; 3 | 4 | // Wasm functionality is currently unused 5 | export default { 6 | update_settings(settings: any) {}, 7 | update_canister(path: string, alias: string, source: string) { 8 | throw new Error('Unsupported runtime'); 9 | }, 10 | remove_canister(alias: string) { 11 | 
throw new Error('Unsupported runtime'); 12 | }, 13 | call_canister(...args: any[]) { 14 | throw new Error('Unsupported runtime'); 15 | }, 16 | candid_to_js(candid: string) { 17 | throw new Error('Unsupported runtime'); 18 | }, 19 | }; 20 | -------------------------------------------------------------------------------- /src/watch.ts: -------------------------------------------------------------------------------- 1 | import { spawn, spawnSync } from 'child_process'; 2 | import chokidar from 'chokidar'; 3 | import glob from 'fast-glob'; 4 | import { existsSync, readFileSync, writeFileSync } from 'fs'; 5 | import { resolve } from 'path'; 6 | import pc from 'picocolors'; 7 | import { FileCache } from './cache'; 8 | import { Canister, getDfxCanisters } from './canister'; 9 | import { loadDfxConfig } from './dfx'; 10 | import { Settings } from './settings'; 11 | import { runTests } from './testing'; 12 | import { getVirtualFile } from './utils/motoko'; 13 | import wasm from './wasm'; 14 | 15 | // Pattern to watch for file changes 16 | const watchGlob = '**/*.mo'; 17 | 18 | // Paths to always exclude from the file watcher 19 | const watchIgnore = ['**/node_modules/**/*', '**/.vessel/.tmp/**/*']; 20 | 21 | // Canisters loaded from `dfx.json` 22 | let canisters: Canister[] | undefined; 23 | 24 | // Inject dev server logic into bindings created by `dfx generate` 25 | const editJSBinding = ( 26 | canister: Canister, 27 | source: string, 28 | ) => `// Modified by 'mo-dev' 29 | ${source.replace('export const createActor', 'export const _createActor')} 30 | export function createActor(canisterId, ...args) { 31 | const alias = ${JSON.stringify(canister.alias)}; 32 | const actor = _createActor(canisterId, ...args); 33 | if (process.env.NODE_ENV === 'development') { 34 | Object.keys(actor).forEach((methodName) => { 35 | actor[methodName] = async (...args) => { 36 | const response = await fetch( 37 | new URL(\`http://localhost:7700/call/\${alias}/\${methodName}\`), 38 | { 39 | method: 'POST', 40 | headers: { 41 | 'Content-Type': 'application/json', 42 | }, 43 | body: JSON.stringify({ args }), 44 | }, 45 | ); 46 | if (!response.ok) { 47 | throw new Error( 48 | \`Received status code \${response.status} from Motoko HMR server\`, 49 | ); 50 | } 51 | const data = await response.json(); 52 | return data.value; 53 | }; 54 | }); 55 | } 56 | return actor; 57 | } 58 | `; 59 | 60 | export function findCanister(alias: string): Canister | undefined { 61 | return canisters.find((c) => c.alias === alias); 62 | } 63 | 64 | export async function watch(settings: Settings) { 65 | const { 66 | directory, 67 | execute, 68 | verbosity, 69 | generate, 70 | deploy, 71 | reinstall, 72 | test, 73 | hotReload, 74 | } = settings; 75 | 76 | const deployArgs: string[] = []; 77 | settings.deployArgs.forEach((arg) => { 78 | deployArgs.push('--argument', arg); 79 | }); 80 | 81 | const log = (level: number, ...args: any[]) => { 82 | if (verbosity >= level) { 83 | const time = new Date().toLocaleTimeString(); 84 | console.log( 85 | `${pc.dim(time)} ${pc.cyan(pc.bold('[mo-dev]'))}`, 86 | ...args, 87 | ); 88 | } 89 | }; 90 | 91 | const fileCache = new FileCache({ directory }); 92 | 93 | const loadCanisterIds = (): Record | undefined => { 94 | try { 95 | const network = 'local'; 96 | const canisterIdsPath = resolve( 97 | directory, 98 | '.dfx', 99 | network, 100 | 'canister_ids.json', 101 | ); 102 | if (!existsSync(canisterIdsPath)) { 103 | return; 104 | } 105 | log(1, 'Loading canister IDs...'); 106 | const canisterIds: { 107 | [alias: 
string]: { [network: string]: string }; 108 | } = JSON.parse(readFileSync(canisterIdsPath, 'utf8')); 109 | return Object.fromEntries( 110 | Object.entries(canisterIds) 111 | .map(([alias, ids]) => [alias, ids[network]]) 112 | .filter(([, id]) => id), 113 | ); 114 | } catch (err) { 115 | console.error('Error while loading `canister_ids.json`:', err); 116 | } 117 | }; 118 | 119 | const updateDfxConfig = async () => { 120 | try { 121 | const dfxConfig = await loadDfxConfig(directory); 122 | if (!dfxConfig) { 123 | console.error( 124 | `${pc.bold( 125 | 'Could not find a `dfx.json` file in directory:', 126 | )} ${directory}`, 127 | ); 128 | return; 129 | } 130 | canisters = getDfxCanisters(directory, dfxConfig); 131 | if (settings.canisterNames.length) { 132 | const showExpected = settings.canisterNames.some((name) => { 133 | if (!canisters.some((c) => name === c.alias)) { 134 | log( 135 | 0, 136 | 'unexpected canister:', 137 | pc.yellow(pc.bold(name)), 138 | ); 139 | return true; 140 | } 141 | }); 142 | if (showExpected) { 143 | log( 144 | 0, 145 | `options:`, 146 | pc.gray( 147 | canisters 148 | .map((c) => pc.bold(pc.green(c.alias))) 149 | .join(', '), 150 | ), 151 | ); 152 | } 153 | canisters = canisters.filter((canister) => 154 | settings.canisterNames.some( 155 | (name) => name === canister.alias, 156 | ), 157 | ); 158 | } 159 | } catch (err) { 160 | console.error( 161 | `Error while loading 'dfx.json' file:\n${err.message || err}`, 162 | ); 163 | } 164 | 165 | const runDfx = (parts: string[], verbosity: number) => { 166 | spawnSync('dfx', parts, { 167 | cwd: directory, 168 | stdio: verbosity >= 1 ? 'inherit' : 'ignore', 169 | encoding: 'utf-8', 170 | }); 171 | }; 172 | 173 | if (generate) { 174 | runDfx(['canister', 'create', '--all'], verbosity); 175 | } 176 | if (deploy || reinstall) { 177 | let canisterIds = loadCanisterIds(); 178 | 179 | const uiAlias = '__Candid_UI'; 180 | const uiAddress = canisterIds?.[uiAlias]; 181 | if (!uiAddress) { 182 | if (generate) { 183 | // Find asset canister dependencies 184 | const dfxConfig = await loadDfxConfig(directory); 185 | const dependencies: string[] = []; 186 | Object.values(dfxConfig.canisters)?.forEach((config) => { 187 | if (config.type === 'assets') { 188 | config.dependencies?.forEach((dependency) => { 189 | if (!dependencies.includes(dependency)) { 190 | dependencies.push(dependency); 191 | } 192 | }); 193 | } 194 | }); 195 | // TODO: handle deploy args 196 | dependencies.forEach((alias) => { 197 | log(0, pc.green('prepare'), pc.gray(alias)); 198 | runDfx(['deploy', alias], 1); 199 | }); 200 | runDfx(['generate'], 1); 201 | runDfx(['deploy'], 1); 202 | } else { 203 | // Deploy initial canisters 204 | canisters.forEach((canister) => { 205 | log(0, pc.green('prepare'), pc.gray(canister.alias)); 206 | runDfx( 207 | [ 208 | 'deploy', 209 | canister.alias, 210 | ...(reinstall ? 
['-y'] : []), 211 | ...deployArgs, 212 | ], 213 | 1, 214 | ); 215 | }); 216 | } 217 | } else if (canisters.length) { 218 | // Skip for environments where Candid UI is not available 219 | if (!process.env.MO_DEV_HIDE_URLS) { 220 | // Show Candid UI addresses 221 | canisters.forEach((canister) => { 222 | const id = canisterIds[canister.alias]; 223 | if (id) { 224 | log( 225 | 0, 226 | pc.cyan( 227 | `${pc.bold( 228 | `${canister.alias}`, 229 | )} → ${pc.white( 230 | `http://127.0.0.1:4943?canisterId=${uiAddress}&id=${id}`, 231 | )}`, 232 | ), 233 | ); 234 | } 235 | }); 236 | } 237 | } 238 | } 239 | }; 240 | await updateDfxConfig(); 241 | 242 | const runCommand = ( 243 | command: string, 244 | { args, pipe }: { args?: string[]; pipe?: boolean } = {}, 245 | ) => { 246 | if (verbosity >= 1) { 247 | log( 248 | 1, 249 | pc.magenta( 250 | `${pc.bold('run')} ${ 251 | args 252 | ? `${command} [${args 253 | .map((arg) => `'${arg}'`) 254 | .join(', ')}]` 255 | : command 256 | }`, 257 | ), 258 | ); 259 | } 260 | const commandProcess = spawn(command, args, { 261 | shell: !args, 262 | cwd: directory, 263 | }); 264 | if (pipe) { 265 | process.stdin.pipe(commandProcess.stdin); 266 | commandProcess.stdout.pipe(process.stdout); 267 | commandProcess.stderr.pipe(process.stderr); 268 | } else { 269 | commandProcess.on('exit', (code) => { 270 | if (code) { 271 | commandProcess.stdout.pipe(process.stdout); 272 | commandProcess.stderr.pipe(process.stderr); 273 | } 274 | }); 275 | } 276 | return commandProcess; 277 | }; 278 | 279 | const finishProcess = async ( 280 | process: ReturnType<typeof runCommand>, 281 | ): Promise<number | null> => { 282 | if (typeof process.exitCode === 'number') { 283 | return process.exitCode; 284 | } 285 | return new Promise((resolve) => 286 | process.on('close', () => resolve(process.exitCode)), 287 | ); 288 | }; 289 | 290 | let changeTimeout: ReturnType<typeof setTimeout> | undefined; 291 | let execProcess: ReturnType<typeof runCommand> | undefined; 292 | let deployPromise: Promise<void> | undefined; 293 | const deployProcesses: ReturnType<typeof runCommand>[] = []; 294 | const notifyChange = () => { 295 | clearTimeout(changeTimeout); 296 | changeTimeout = setTimeout(async () => { 297 | try { 298 | if (execute) { 299 | if (execProcess) { 300 | // kill(execProcess.pid); 301 | await finishProcess(execProcess); 302 | } 303 | execProcess = runCommand(execute, { pipe: true }); 304 | // commandProcess.on('close', (code) => { 305 | // if (verbosity >= 1) { 306 | // console.log( 307 | // pc.dim('Command exited with code'), 308 | // code ?
pc.yellow(code) : pc.gray(0), 309 | // ); 310 | // } 311 | // }); 312 | } 313 | 314 | // Restart deployment 315 | // deployProcesses.forEach((p) => kill(p.pid)); 316 | for (const process of deployProcesses) { 317 | await finishProcess(process); 318 | } 319 | await deployPromise; 320 | deployProcesses.length = 0; 321 | 322 | deployPromise = (async () => { 323 | const pipe = verbosity >= 1; 324 | let testsPassed = true; 325 | if (test) { 326 | try { 327 | const runs = await runTests(settings); 328 | testsPassed = runs.every( 329 | (run) => 330 | run.status === 'passed' || 331 | run.status === 'skipped', 332 | ); 333 | } catch (err) { 334 | testsPassed = false; 335 | console.error( 336 | 'Error while running unit tests:', 337 | err.stack || err, 338 | ); 339 | } 340 | } 341 | if (!testsPassed) { 342 | if (settings.ci) { 343 | process.exit(1); 344 | } 345 | return; 346 | } 347 | if (generate) { 348 | const generateProcess = runCommand('dfx', { 349 | // args: ['generate', canister.alias], 350 | args: ['generate'], 351 | pipe, 352 | }); 353 | const exitCode = await finishProcess(generateProcess); 354 | if (exitCode === 0) { 355 | log( 356 | 0, 357 | pc.green( 358 | // `generate ${pc.gray(canister.alias)}`, 359 | 'generate', 360 | ), 361 | ); 362 | // if (hotReload) { 363 | // const outputPath = resolve( 364 | // directory, 365 | // canister.config.declarations?.output || 366 | // `src/declarations/${canister.alias}`, 367 | // ); 368 | // const jsPath = resolve( 369 | // outputPath, 370 | // 'index.js', 371 | // ); 372 | // if (existsSync(jsPath)) { 373 | // try { 374 | // const binding = readFileSync( 375 | // jsPath, 376 | // 'utf8', 377 | // ); 378 | // const newBinding = editJSBinding( 379 | // canister, 380 | // binding, 381 | // ); 382 | // writeFileSync( 383 | // jsPath, 384 | // newBinding, 385 | // 'utf8', 386 | // ); 387 | // } catch (err) { 388 | // console.error( 389 | // 'Error while generating hot reload bindings:', 390 | // err, 391 | // ); 392 | // } 393 | // } else { 394 | // console.error( 395 | // 'Error while generating hot reload bindings. File expected at path:', 396 | // jsPath, 397 | // ); 398 | // } 399 | // } 400 | } else if (settings.ci) { 401 | process.exit(1); 402 | } 403 | } 404 | for (const canister of canisters) { 405 | if (deploy) { 406 | const deployProcess = runCommand('dfx', { 407 | args: [ 408 | 'deploy', 409 | canister.alias, 410 | ...(verbosity >= 1 ? [] : ['-qq']), 411 | ...(reinstall ? ['-y'] : []), 412 | ...deployArgs, 413 | ], 414 | // TODO: hide 'Module hash ... 
is already installed' warnings 415 | pipe: pipe || !reinstall, 416 | }); 417 | deployProcesses.push(deployProcess); 418 | const exitCode = await finishProcess(deployProcess); 419 | if (exitCode === 0) { 420 | log( 421 | 0, 422 | pc.green( 423 | `deploy ${pc.gray(canister.alias)}`, 424 | ), 425 | ); 426 | } else if (settings.ci) { 427 | process.exit(1); 428 | } 429 | } 430 | } 431 | })(); 432 | } catch (err) { 433 | throw err; 434 | // console.error(err); 435 | } 436 | }, 100); 437 | }; 438 | 439 | // Update a canister on the HMR server 440 | const updateCanister = (canister: Canister, { quiet = false } = {}) => { 441 | try { 442 | if (hotReload) { 443 | const source = fileCache.get(canister.file); 444 | if (!source) { 445 | console.warn( 446 | 'Missing source file for HMR server:', 447 | canister.file, 448 | ); 449 | return; 450 | } 451 | wasm.update_canister(canister.file, canister.alias, source); 452 | const file = getVirtualFile(canister.file); 453 | file.write(source); 454 | if (!quiet) { 455 | log(0, pc.green(`update ${pc.gray(canister.alias)}`)); 456 | } 457 | } 458 | return true; 459 | } catch (err) { 460 | console.error( 461 | pc.bold( 462 | pc.red( 463 | `Error while updating canister '${canister.alias}':\n${ 464 | err.message || err 465 | }`, 466 | ), 467 | ), 468 | ); 469 | return false; 470 | } 471 | }; 472 | 473 | // Remove a canister on the HMR server 474 | const removeCanister = (canister: Canister, { quiet = false } = {}) => { 475 | try { 476 | if (hotReload) { 477 | wasm.remove_canister(canister.alias); 478 | const file = getVirtualFile(canister.file); 479 | file.delete(); 480 | if (!quiet) { 481 | log(0, pc.green(`remove ${pc.gray(canister.alias)}`)); 482 | } 483 | } 484 | } catch (err) { 485 | console.error( 486 | pc.bold( 487 | pc.red( 488 | `Error while removing canister '${canister.alias}':\n${ 489 | err.message || err 490 | }`, 491 | ), 492 | ), 493 | ); 494 | } 495 | }; 496 | 497 | // Synchronously prepare files 498 | glob.sync(watchGlob, { cwd: directory, ignore: watchIgnore }).forEach( 499 | (path) => { 500 | const resolvedPath = resolve(directory, path); 501 | canisters?.forEach((canister) => { 502 | if (resolvedPath === canister.file) { 503 | fileCache.update(resolvedPath); 504 | updateCanister(canister, { quiet: true }); 505 | } 506 | }); 507 | }, 508 | ); 509 | 510 | if (settings.ci) { 511 | notifyChange(); 512 | } else { 513 | if (verbosity >= 1) { 514 | console.log(pc.gray('Waiting for Motoko file changes...')); 515 | } 516 | 517 | const dfxWatcher = chokidar 518 | .watch('./dfx.json', { cwd: directory, ignored: watchIgnore }) 519 | .on('change', async (path) => { 520 | if (!path.endsWith('dfx.json')) { 521 | console.warn('Received unexpected `dfx.json` path:', path); 522 | return; 523 | } 524 | notifyChange(); 525 | console.log(pc.blue(`${pc.bold('change')} ${path}`)); 526 | const previousCanisters = canisters; 527 | await updateDfxConfig(); 528 | previousCanisters?.forEach((canister) => { 529 | if (!canisters?.some((c) => c.alias === canister.alias)) { 530 | removeCanister(canister); 531 | } 532 | }); 533 | canisters?.forEach((canister) => updateCanister(canister)); 534 | }); 535 | 536 | const moWatcher = chokidar 537 | .watch(watchGlob, { cwd: directory, ignored: watchIgnore }) 538 | .on('all', (event, path) => { 539 | if (!path.endsWith('.mo')) { 540 | return; 541 | } 542 | log(1, pc.blue(`${pc.bold(event)} ${path}`)); 543 | let shouldNotify = true; 544 | const resolvedPath = resolve(directory, path); 545 | if (event === 'unlink') { 546 | 
fileCache.invalidate(resolvedPath); 547 | } else { 548 | if (!fileCache.update(resolvedPath)) { 549 | shouldNotify = false; 550 | } 551 | log(2, 'cache', resolvedPath); 552 | } 553 | canisters?.forEach((canister) => { 554 | if (resolvedPath === canister.file) { 555 | if (event === 'unlink') { 556 | removeCanister(canister); 557 | } else if (!updateCanister(canister)) { 558 | shouldNotify = false; 559 | } 560 | } 561 | }); 562 | if (shouldNotify) { 563 | notifyChange(); 564 | } 565 | }); 566 | 567 | return { 568 | dfxJson: dfxWatcher, 569 | motoko: moWatcher, 570 | close() { 571 | dfxWatcher.close(); 572 | moWatcher.close(); 573 | }, 574 | }; 575 | } 576 | } 577 | -------------------------------------------------------------------------------- /tests/mo-dev.test.ts: -------------------------------------------------------------------------------- 1 | import axios from 'axios'; 2 | import { existsSync, readFileSync, unlinkSync } from 'fs'; 3 | import { join } from 'path'; 4 | import devServer from '../src'; 5 | 6 | const projectPath = join(__dirname, 'project'); 7 | 8 | const waitUntilLoaded = async () => { 9 | await new Promise((resolve) => setTimeout(resolve, 1000)); // TODO: deterministic 10 | }; 11 | 12 | describe('mo-dev', () => { 13 | test('detects Motoko files', async () => { 14 | const { watcher, close } = await devServer({ 15 | directory: projectPath, 16 | ci: false, 17 | }); 18 | 19 | const files: string[] = []; 20 | watcher!.dfxJson.on('add', (file) => { 21 | files.push(file); 22 | }); 23 | watcher!.motoko.on('add', (file) => { 24 | files.push(file); 25 | }); 26 | 27 | await waitUntilLoaded(); 28 | try { 29 | files.sort(); 30 | expect( 31 | files.filter((file) => !file.startsWith('.mops')), 32 | ).toStrictEqual([ 33 | 'dfx.json', 34 | 'motoko_canister/Main.mo', 35 | 'motoko_canister/lib/Echo.mo', 36 | 'motoko_canister/test/DefaultFail.test.mo', 37 | 'motoko_canister/test/DefaultPass.test.mo', 38 | 'motoko_canister/test/ImportPass.test.mo', 39 | 'motoko_canister/test/WasiError.test.mo', 40 | 'motoko_canister/test/WasiFail.test.mo', 41 | 'motoko_canister/test/WasiPass.test.mo', 42 | 'vm/tests/Main.test.mo', 43 | 'vm/vm_canister/Main.mo', 44 | ]); 45 | } finally { 46 | close(); 47 | } 48 | }); 49 | 50 | test('runs a provided command', async () => { 51 | const outputPath = join(projectPath, 'generated.txt'); 52 | if (existsSync(outputPath)) { 53 | unlinkSync(outputPath); 54 | } 55 | 56 | const { close } = await devServer({ 57 | directory: projectPath, 58 | execute: `echo "ran command" >> generated.txt`, 59 | ci: false, 60 | }); 61 | 62 | await waitUntilLoaded(); 63 | try { 64 | expect(readFileSync(outputPath, 'utf8')).toEqual('ran command\n'); 65 | } finally { 66 | close(); 67 | } 68 | }); 69 | 70 | test.skip('generates type bindings', async () => { 71 | const declarationPath = join( 72 | projectPath, 73 | 'src/declarations/motoko_canister/index.js', 74 | ); 75 | if (existsSync(declarationPath)) { 76 | unlinkSync(declarationPath); 77 | } 78 | 79 | const { close } = await devServer({ 80 | directory: projectPath, 81 | generate: true, 82 | ci: false, 83 | }); 84 | 85 | await waitUntilLoaded(); 86 | try { 87 | expect(existsSync(declarationPath)); 88 | } finally { 89 | close(); 90 | } 91 | }); 92 | 93 | test.skip('starts the VM server on a custom port', async () => { 94 | const port = 56789; 95 | const { close } = await devServer({ 96 | directory: join(projectPath, 'vm'), 97 | hotReload: true, 98 | port, 99 | ci: false, 100 | }); 101 | await waitUntilLoaded(); 102 | try { 103 | const 
response = await axios.post( 104 | `http://localhost:${port}/call/vm_canister/main`, 105 | { args: [] }, 106 | ); 107 | expect(response.data).toStrictEqual({ value: '123' }); 108 | } finally { 109 | close(); 110 | } 111 | }); 112 | }); 113 | -------------------------------------------------------------------------------- /tests/mo-test.test.ts: -------------------------------------------------------------------------------- 1 | import { join } from 'path'; 2 | import { TestRun, runTests } from '../src/testing'; 3 | 4 | const projectPath = join(__dirname, 'project'); 5 | 6 | describe('mo-test', () => { 7 | test('basic usage', async () => { 8 | const testRuns: TestRun[] = []; 9 | await runTests( 10 | { 11 | directory: projectPath, 12 | }, 13 | (result) => { 14 | testRuns.push(result); 15 | }, 16 | ); 17 | expect(testRuns.length).toEqual(7); 18 | }, 20000); 19 | 20 | test('--testmode, wasi', async () => { 21 | const testRuns: TestRun[] = []; 22 | await runTests( 23 | { 24 | directory: projectPath, 25 | testModes: ['wasi'], 26 | testFiles: ['ImportPass'], 27 | }, 28 | (result) => { 29 | testRuns.push(result); 30 | }, 31 | ); 32 | expect(testRuns.length).toEqual(1); 33 | }); 34 | 35 | test('--testfile, basic', async () => { 36 | const testRuns: TestRun[] = []; 37 | await runTests( 38 | { 39 | directory: projectPath, 40 | testFiles: ['ImportPass'], 41 | }, 42 | (result) => { 43 | testRuns.push(result); 44 | }, 45 | ); 46 | expect(testRuns.length).toEqual(1); 47 | }); 48 | 49 | test('--testfile, multiple', async () => { 50 | const testRuns: TestRun[] = []; 51 | await runTests( 52 | { 53 | directory: projectPath, 54 | testFiles: ['DefaultPass', 'Import'], 55 | }, 56 | (result) => { 57 | testRuns.push(result); 58 | }, 59 | ); 60 | expect(testRuns.length).toEqual(2); 61 | }); 62 | 63 | test('--testfile, empty', async () => { 64 | const testRuns: TestRun[] = []; 65 | await runTests( 66 | { 67 | directory: projectPath, 68 | testFiles: ['__nonexistent__'], 69 | }, 70 | (result) => { 71 | testRuns.push(result); 72 | }, 73 | ); 74 | expect(testRuns.length).toEqual(0); 75 | }); 76 | }); 77 | -------------------------------------------------------------------------------- /tests/project/.gitignore: -------------------------------------------------------------------------------- 1 | *generated* 2 | /src/declarations 3 | -------------------------------------------------------------------------------- /tests/project/dfx.json: -------------------------------------------------------------------------------- 1 | { 2 | "canisters": { 3 | "motoko_canister": { 4 | "type": "motoko", 5 | "main": "motoko_canister/Main.mo" 6 | }, 7 | "vm_canister": { 8 | "type": "motoko", 9 | "main": "vm/vm_canister/Main.mo" 10 | } 11 | }, 12 | "defaults": { 13 | "build": { 14 | "packtool": "mops sources" 15 | } 16 | } 17 | } 18 | -------------------------------------------------------------------------------- /tests/project/mops.toml: -------------------------------------------------------------------------------- 1 | [dependencies] 2 | base = "0.7.4" 3 | -------------------------------------------------------------------------------- /tests/project/motoko_canister/Main.mo: -------------------------------------------------------------------------------- 1 | import Lib "lib/Echo"; 2 | 3 | actor { 4 | public func main() : async Nat { 5 | Lib.echo(123); 6 | }; 7 | }; 8 | -------------------------------------------------------------------------------- /tests/project/motoko_canister/lib/Echo.mo: 
-------------------------------------------------------------------------------- 1 | module { 2 | public func echo(value : Nat) : Nat { 3 | value; 4 | }; 5 | }; 6 | -------------------------------------------------------------------------------- /tests/project/motoko_canister/test/DefaultFail.test.mo: -------------------------------------------------------------------------------- 1 | assert false 2 | -------------------------------------------------------------------------------- /tests/project/motoko_canister/test/DefaultPass.test.mo: -------------------------------------------------------------------------------- 1 | import { print } "mo:base/Debug"; 2 | 3 | print("OUTPUT"); 4 | 5 | assert true; 6 | -------------------------------------------------------------------------------- /tests/project/motoko_canister/test/ImportPass.test.mo: -------------------------------------------------------------------------------- 1 | import Debug "mo:base/Debug"; 2 | 3 | Debug.print("Hello"); 4 | -------------------------------------------------------------------------------- /tests/project/motoko_canister/test/WasiError.test.mo: -------------------------------------------------------------------------------- 1 | // @testmode wasi 2 | 3 | ExpectedError 4 | -------------------------------------------------------------------------------- /tests/project/motoko_canister/test/WasiFail.test.mo: -------------------------------------------------------------------------------- 1 | // @testmode wasi 2 | 3 | assert 1 == 2; 4 | -------------------------------------------------------------------------------- /tests/project/motoko_canister/test/WasiPass.test.mo: -------------------------------------------------------------------------------- 1 | // @testmode wasi 2 | 3 | assert 1 == 1; 4 | -------------------------------------------------------------------------------- /tests/project/vm/dfx.json: -------------------------------------------------------------------------------- 1 | { 2 | "canisters": { 3 | "vm_canister": { 4 | "type": "motoko", 5 | "main": "vm_canister/Main.mo" 6 | } 7 | } 8 | } 9 | -------------------------------------------------------------------------------- /tests/project/vm/tests/Main.test.mo: -------------------------------------------------------------------------------- 1 | import { Main } "../vm_canister/Main"; 2 | 3 | let canister = await Main(); 4 | 5 | assert (await canister.main()) == 123; 6 | -------------------------------------------------------------------------------- /tests/project/vm/vm_canister/Main.mo: -------------------------------------------------------------------------------- 1 | actor class Main() { 2 | public func main() : async Nat { 3 | 123; 4 | }; 5 | }; 6 | -------------------------------------------------------------------------------- /tsconfig.json: -------------------------------------------------------------------------------- 1 | { 2 | "compilerOptions": { 3 | "module": "commonjs", 4 | "esModuleInterop": true, 5 | "allowSyntheticDefaultImports": true, 6 | "target": "es6", 7 | "noImplicitAny": true, 8 | "moduleResolution": "node", 9 | "sourceMap": true, 10 | "declaration": true, 11 | "declarationMap": true, 12 | "outDir": "lib", 13 | "declarationDir": "lib", 14 | "baseUrl": ".", 15 | "alwaysStrict": true, 16 | "paths": { 17 | "*": ["node_modules/*", "src/types/*"] 18 | } 19 | }, 20 | "include": ["src/**/*"] 21 | } 22 | -------------------------------------------------------------------------------- /wasm/.gitignore: 
-------------------------------------------------------------------------------- 1 | /target 2 | **/*.rs.bk 3 | Cargo.lock 4 | bin/ 5 | pkg/ 6 | wasm-pack.log 7 | -------------------------------------------------------------------------------- /wasm/Cargo.toml: -------------------------------------------------------------------------------- 1 | [package] 2 | name = "wasm" 3 | version = "0.1.0" 4 | authors = ["Ryan Vandersmith "] 5 | edition = "2018" 6 | 7 | [lib] 8 | crate-type = ["cdylib", "rlib"] 9 | 10 | [features] 11 | default = ["console_error_panic_hook"] 12 | 13 | [dependencies] 14 | console_error_panic_hook = { version = "0.1.6", optional = true } 15 | serde = { version = "1.0.143", features = ["derive"] } 16 | serde_json = "1.0.83" 17 | serde-wasm-bindgen = "0.4.5" 18 | serde-transcode = "1.1.1" 19 | wasm-bindgen = { version = "0.2.82", features = ["serde-serialize"] } 20 | motoko = { path = "../../motoko.rs/crates/motoko" } 21 | candid = "0.8.3" 22 | 23 | [dev-dependencies] 24 | wasm-bindgen-test = "0.3.13" 25 | 26 | [profile.release] 27 | # Tell `rustc` to optimize for small code size. 28 | opt-level = "s" 29 | -------------------------------------------------------------------------------- /wasm/src/lib.rs: -------------------------------------------------------------------------------- 1 | use motoko::{ 2 | ast::ToId, 3 | vm_types::{Core, Limits}, 4 | Interruption, ToMotoko, Value, 5 | }; 6 | use serde::{de::DeserializeOwned, Deserialize, Serialize}; 7 | use serde_wasm_bindgen::{from_value, to_value}; 8 | use std::cell::RefCell; 9 | use wasm_bindgen::prelude::*; 10 | 11 | type Result<T> = std::result::Result<T, JsError>; 12 | 13 | thread_local! { 14 | static SETTINGS: RefCell<Option<Settings>> = RefCell::new(None); 15 | static CORE: RefCell<Core> = RefCell::new(Core::empty()); 16 | } 17 | 18 | #[derive(Clone, Debug, Eq, PartialEq, Serialize, Deserialize)] 19 | struct Message { 20 | arg: Vec<u8>, 21 | canister_id: Vec<u8>, 22 | ingress_expiry: String, 23 | method_name: String, 24 | nonce: Vec<u8>, 25 | request_type: String, 26 | sender: Vec<u8>, 27 | } 28 | 29 | #[derive(Clone, Debug, Eq, PartialEq, Serialize, Deserialize)] 30 | struct Response { 31 | value: Vec<u8>, 32 | } 33 | 34 | #[derive(Clone, Debug, Eq, PartialEq, Serialize, Deserialize)] 35 | struct LiveCanister { 36 | alias: String, 37 | file: String, 38 | source: String, 39 | } 40 | 41 | #[derive(Clone, Debug, Eq, PartialEq, Serialize, Deserialize)] 42 | struct Settings { 43 | verbosity: usize, 44 | } 45 | 46 | #[wasm_bindgen] 47 | extern "C" { 48 | #[wasm_bindgen(js_namespace = console, js_name = log)] 49 | fn js_log(s: &str); 50 | } 51 | 52 | fn js_input<T: DeserializeOwned>(input: JsValue) -> Result<T> { 53 | from_value(input).map_err(|e| JsError::new(&format!("Deserialization error ({:?})", e))) 54 | } 55 | 56 | fn js_return<T: Serialize>(value: &T) -> Result<JsValue> { 57 | to_value(value).map_err(|e| JsError::new(&format!("Serialization error ({:?})", e))) 58 | } 59 | 60 | fn js_error(error: impl Into<Interruption>) -> JsError { 61 | JsError::new(&format!("{:?}", error.into())) 62 | } 63 | 64 | fn motoko_to_js_value(value: &Value) -> Result<JsValue> { 65 | // TODO: replace with Candid 66 | Ok(match value { 67 | Value::Null => JsValue::null(), 68 | Value::Bool(b) => JsValue::from_bool(*b), 69 | Value::Unit => JsValue::undefined(), 70 | Value::Nat(n) => JsValue::from_str(&n.to_string()), 71 | Value::Int(i) => JsValue::from_str(&i.to_string()), 72 | Value::Float(f) => JsValue::from_f64(f.0), 73 | Value::Char(c) => JsValue::from_str(&c.to_string()), 74 | Value::Text(s) => JsValue::from_str(&s.to_string()), 75 | Value::Option(o) =>
motoko_to_js_value(o)?, 76 | _ => todo!(), 77 | }) 78 | } 79 | 80 | fn with_settings<R>(f: impl FnOnce(&Settings) -> R) -> R { 81 | SETTINGS.with(|settings| { 82 | f(&mut settings 83 | .borrow_mut() 84 | .as_ref() 85 | .expect("Uninitialized settings")) 86 | }) 87 | } 88 | 89 | macro_rules! log { 90 | ($($input:tt)+) => { 91 | with_settings(|settings| { 92 | if settings.verbosity >= 2 { 93 | js_log(&format!($($input)+)) 94 | } 95 | }) 96 | }; 97 | } 98 | 99 | #[wasm_bindgen(start)] 100 | pub fn start() { 101 | #[cfg(feature = "console_error_panic_hook")] 102 | console_error_panic_hook::set_once(); 103 | } 104 | 105 | #[wasm_bindgen] 106 | pub fn update_settings(settings: JsValue) -> Result<JsValue> { 107 | let settings: Settings = js_input(settings)?; 108 | SETTINGS.with(|s| *s.borrow_mut() = Some(settings)); 109 | Ok(JsValue::TRUE) 110 | } 111 | 112 | /// Handle a message directed at the IC replica. 113 | #[wasm_bindgen] 114 | pub fn handle_message(_alias: String, _method: String, _message: JsValue) -> Result<JsValue> { 115 | // let message: Vec<u8> = from_value(message)?; 116 | // let args = candid::decode_args(&message)?; 117 | // log!("Candid args: {:?}", args); 118 | // js_return(&candid::encode_one("abc")?) 119 | unimplemented!() 120 | } 121 | 122 | /// Directly call a canister from JavaScript. 123 | #[wasm_bindgen] 124 | pub fn call_canister(alias: String, method: String, _args: Vec<u8>) -> Result<JsValue> { 125 | log!("[wasm] calling canister: {}.{}", alias, method); 126 | // let args = motoko::candid_utils::decode_candid_args(&args)?.share(); // TODO 127 | let args = ().to_shared().map_err(js_error)?; //// 128 | log!("[wasm] input: {:?}", args); 129 | CORE.with(|core| { 130 | let mut new_core = core.borrow().clone(); 131 | let id = motoko::value::ActorId::Alias(alias.to_id()); 132 | let limits = Limits::none(); 133 | let value = new_core 134 | .call(&id, &method.to_id(), args, &limits) 135 | .map_err(js_error)?; 136 | // TODO: don't update the core for `query` methods 137 | *core.borrow_mut() = new_core; 138 | motoko_to_js_value(&value) 139 | }) 140 | } 141 | 142 | /// Create or update a canister. Returns `true` if a canister was successfully updated. 143 | #[wasm_bindgen] 144 | pub fn update_canister(path: String, alias: String, source: String) -> Result<JsValue> { 145 | log!("[wasm] updating canister: {}", alias); 146 | CORE.with(|core| { 147 | // let mut core = core.borrow_mut(); 148 | let id = motoko::value::ActorId::Alias(alias.to_id()); 149 | let mut new_core = core.borrow().clone(); 150 | new_core.set_actor(path, id, &source).map_err(js_error)?; 151 | *core.borrow_mut() = new_core; 152 | js_return(&()) 153 | // js_return(&result.is_ok()) 154 | }) 155 | } 156 | 157 | /// Remove a canister if it exists. Returns `true` if a canister was successfully removed.
158 | #[wasm_bindgen] 159 | pub fn remove_canister(alias: String) -> Result<JsValue> { 160 | log!("[wasm] removing canister: {}", alias); 161 | CORE.with(|core| { 162 | let mut core = core.borrow_mut(); 163 | let id = motoko::value::ActorId::Alias(alias.to_id()); 164 | js_return(&core.actors.map.remove(&id).is_some()) 165 | }) 166 | } 167 | 168 | #[wasm_bindgen] 169 | pub fn candid_to_js(candid: String) -> Result<JsValue> { 170 | use candid::{check_prog, IDLProg, TypeEnv}; 171 | let ast = candid 172 | .parse::<IDLProg>() 173 | .map_err(|err| JsError::new(&err.to_string()))?; 174 | let mut env = TypeEnv::new(); 175 | let actor = check_prog(&mut env, &ast).map_err(|err| JsError::new(&err.to_string()))?; 176 | js_return(&candid::bindings::javascript::compile(&env, &actor)) 177 | } 178 | -------------------------------------------------------------------------------- /wasm/tests/test.rs: -------------------------------------------------------------------------------- 1 | //! Test suite for the Web and headless browsers. 2 | 3 | #![cfg(target_arch = "wasm32")] 4 | 5 | extern crate wasm_bindgen_test; 6 | use wasm_bindgen_test::*; 7 | 8 | wasm_bindgen_test_configure!(run_in_browser); 9 | 10 | #[wasm_bindgen_test] 11 | fn pass() { 12 | assert_eq!(1 + 1, 2); 13 | } 14 | --------------------------------------------------------------------------------
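
Usage note (illustrative, not a file in the repository): the suites in `tests/mo-dev.test.ts` and `tests/mo-test.test.ts` exercise the two programmatic entry points of this package — the `devServer` default export from `src/index.ts` and `runTests` from `src/testing.ts`. The sketch below shows how one might drive both from a script; it assumes only the call shapes visible in those test files, and the project path and the `generate`/`deploy` flags are placeholder choices rather than documented configuration.

```ts
// Minimal sketch based on the usage shown in tests/mo-dev.test.ts and tests/mo-test.test.ts.
// The project directory below is a placeholder; point it at a real dfx project.
import { join } from 'path';
import devServer from './src';
import { TestRun, runTests } from './src/testing';

async function main() {
    const directory = join(__dirname, 'tests', 'project');

    // Run the Motoko unit tests once, collecting each result as it completes.
    const runs: TestRun[] = [];
    await runTests({ directory }, (run) => runs.push(run));
    runs.forEach((run) => console.log(run.mode, run.status));

    // Start the file watcher (roughly equivalent to `mo-dev --generate --deploy`).
    const { close } = await devServer({
        directory,
        generate: true,
        deploy: true,
        ci: false,
    });
    // ... watch for changes, then shut down:
    close();
}

main().catch((err) => {
    console.error(err);
    process.exit(1);
});
```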