├── .gitbook └── assets │ ├── CompilerVersion.png │ ├── azure-storage-contributor (1).png │ ├── azure-storage-contributor.png │ ├── azure-storage-iam (1).png │ ├── azure-storage-iam.png │ ├── compilerversion.png │ ├── image (1) (1).png │ ├── image (1) (2).png │ ├── image (1).png │ ├── image (10) (1).png │ ├── image (10).png │ ├── image (11) (1).png │ ├── image (11).png │ ├── image (12).png │ ├── image (2) (1).png │ ├── image (2).png │ ├── image (3) (1).png │ ├── image (3).png │ ├── image (4) (1).png │ ├── image (4).png │ ├── image (5) (1).png │ ├── image (5).png │ ├── image (6) (1).png │ ├── image (6).png │ ├── image (7) (1).png │ ├── image (7).png │ ├── image (8) (1).png │ ├── image (8).png │ ├── image (9).png │ └── image.png ├── .github └── workflows │ └── deploy.yml ├── .gitignore ├── README.md ├── SUMMARY.md ├── azure ├── c-script-function-apps-beyond-azure-portal.md ├── code-less-redirection-with-serverless-azure-functions.md ├── exploring-azure-data-with-kusto-and-dashboards.md ├── how-to-launch-multiple-azure-functions-apps-on-different-ports.md ├── publishing-function-app-from-github-folder.md ├── shared-secret-authorization-with-azure-signalr-service.md └── using-azure-file-copy-from-devops-yaml-pipeline.md ├── book.json ├── devops-ci-cd ├── how-to-run-azure-storage-unit-tests-in-ci.md ├── how-to-skip-steps-or-jobs-in-github-actions-for-prs-from-forks.md ├── push-to-protected-branch-from-github-actions.md └── update-version-and-publish-npm-from-gh.md ├── dotnet ├── accessing-tor-via-httpclient-with-.net6.md ├── asynclocal-never-leaks-and-is-safe-for-callcontext-like-state.md ├── disable-diagnostic-analyzers-for-entire-folder-submodules.md ├── how-to-emit-descriptions-for-exported-json-schema-using-jsonschemaexporter.md ├── how-to-locate-dotnet.md ├── ignore-folder-from-dotnet-format.md ├── installing-.net-5.0-on-raspberry-pi-4.md ├── nuget │ ├── README.md │ ├── hide-contentfiles-from-your-nuget-packages.md │ ├── packaging-transitive-analyzers-with-nuget.md │ ├── populate-repositorybranch-in-ci-for-nuget-pack.md │ ├── suppress-dependencies-when-packing.md │ └── use-dotnet-vs-to-get-developer-prompt-in-terminal.md ├── persisting-output-files-from-source-generators.md ├── quickly-check-c-compiler-and-language-version.md ├── use-c-9-records-in-non-net5.0-projects.md └── using-hashcode-in-.netframework.md ├── msbuild ├── detect-ci-builds-for-every-ci-system.md ├── how-to-build-project-when-content-files-change.md ├── how-to-get-user-home-dir-cross-platform.md ├── how-to-include-commit-url-in-nuget-package-description.md ├── how-to-include-package-reference-files-in-your-nuget-package.md ├── how-to-select-first-item-in-an-itemgroup.md ├── modify-all-command-line-builds-in-entire-repo.md ├── modifying-the-build-for-every-solution-in-a-repository.md └── write-entire-xml-fragments-in-msbuild-with-xmlpoke.md ├── styles └── website.css └── testing ├── conditional-unit-tests.md └── skip-tagged-scenarios-in-specflow-with-xunit.md /.gitbook/assets/CompilerVersion.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kzu/til/7ab5047084e08310af4157e7dedd126e1ed8fbc0/.gitbook/assets/CompilerVersion.png -------------------------------------------------------------------------------- /.gitbook/assets/azure-storage-contributor (1).png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/kzu/til/7ab5047084e08310af4157e7dedd126e1ed8fbc0/.gitbook/assets/azure-storage-contributor (1).png -------------------------------------------------------------------------------- /.gitbook/assets/azure-storage-contributor.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kzu/til/7ab5047084e08310af4157e7dedd126e1ed8fbc0/.gitbook/assets/azure-storage-contributor.png -------------------------------------------------------------------------------- /.gitbook/assets/azure-storage-iam (1).png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kzu/til/7ab5047084e08310af4157e7dedd126e1ed8fbc0/.gitbook/assets/azure-storage-iam (1).png -------------------------------------------------------------------------------- /.gitbook/assets/azure-storage-iam.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kzu/til/7ab5047084e08310af4157e7dedd126e1ed8fbc0/.gitbook/assets/azure-storage-iam.png -------------------------------------------------------------------------------- /.gitbook/assets/compilerversion.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kzu/til/7ab5047084e08310af4157e7dedd126e1ed8fbc0/.gitbook/assets/compilerversion.png -------------------------------------------------------------------------------- /.gitbook/assets/image (1) (1).png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kzu/til/7ab5047084e08310af4157e7dedd126e1ed8fbc0/.gitbook/assets/image (1) (1).png -------------------------------------------------------------------------------- /.gitbook/assets/image (1) (2).png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kzu/til/7ab5047084e08310af4157e7dedd126e1ed8fbc0/.gitbook/assets/image (1) (2).png -------------------------------------------------------------------------------- /.gitbook/assets/image (1).png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kzu/til/7ab5047084e08310af4157e7dedd126e1ed8fbc0/.gitbook/assets/image (1).png -------------------------------------------------------------------------------- /.gitbook/assets/image (10) (1).png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kzu/til/7ab5047084e08310af4157e7dedd126e1ed8fbc0/.gitbook/assets/image (10) (1).png -------------------------------------------------------------------------------- /.gitbook/assets/image (10).png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kzu/til/7ab5047084e08310af4157e7dedd126e1ed8fbc0/.gitbook/assets/image (10).png -------------------------------------------------------------------------------- /.gitbook/assets/image (11) (1).png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kzu/til/7ab5047084e08310af4157e7dedd126e1ed8fbc0/.gitbook/assets/image (11) (1).png -------------------------------------------------------------------------------- /.gitbook/assets/image (11).png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/kzu/til/7ab5047084e08310af4157e7dedd126e1ed8fbc0/.gitbook/assets/image (11).png -------------------------------------------------------------------------------- /.gitbook/assets/image (12).png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kzu/til/7ab5047084e08310af4157e7dedd126e1ed8fbc0/.gitbook/assets/image (12).png -------------------------------------------------------------------------------- /.gitbook/assets/image (2) (1).png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kzu/til/7ab5047084e08310af4157e7dedd126e1ed8fbc0/.gitbook/assets/image (2) (1).png -------------------------------------------------------------------------------- /.gitbook/assets/image (2).png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kzu/til/7ab5047084e08310af4157e7dedd126e1ed8fbc0/.gitbook/assets/image (2).png -------------------------------------------------------------------------------- /.gitbook/assets/image (3) (1).png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kzu/til/7ab5047084e08310af4157e7dedd126e1ed8fbc0/.gitbook/assets/image (3) (1).png -------------------------------------------------------------------------------- /.gitbook/assets/image (3).png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kzu/til/7ab5047084e08310af4157e7dedd126e1ed8fbc0/.gitbook/assets/image (3).png -------------------------------------------------------------------------------- /.gitbook/assets/image (4) (1).png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kzu/til/7ab5047084e08310af4157e7dedd126e1ed8fbc0/.gitbook/assets/image (4) (1).png -------------------------------------------------------------------------------- /.gitbook/assets/image (4).png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kzu/til/7ab5047084e08310af4157e7dedd126e1ed8fbc0/.gitbook/assets/image (4).png -------------------------------------------------------------------------------- /.gitbook/assets/image (5) (1).png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kzu/til/7ab5047084e08310af4157e7dedd126e1ed8fbc0/.gitbook/assets/image (5) (1).png -------------------------------------------------------------------------------- /.gitbook/assets/image (5).png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kzu/til/7ab5047084e08310af4157e7dedd126e1ed8fbc0/.gitbook/assets/image (5).png -------------------------------------------------------------------------------- /.gitbook/assets/image (6) (1).png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kzu/til/7ab5047084e08310af4157e7dedd126e1ed8fbc0/.gitbook/assets/image (6) (1).png -------------------------------------------------------------------------------- /.gitbook/assets/image (6).png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kzu/til/7ab5047084e08310af4157e7dedd126e1ed8fbc0/.gitbook/assets/image (6).png 
-------------------------------------------------------------------------------- /.gitbook/assets/image (7) (1).png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kzu/til/7ab5047084e08310af4157e7dedd126e1ed8fbc0/.gitbook/assets/image (7) (1).png -------------------------------------------------------------------------------- /.gitbook/assets/image (7).png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kzu/til/7ab5047084e08310af4157e7dedd126e1ed8fbc0/.gitbook/assets/image (7).png -------------------------------------------------------------------------------- /.gitbook/assets/image (8) (1).png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kzu/til/7ab5047084e08310af4157e7dedd126e1ed8fbc0/.gitbook/assets/image (8) (1).png -------------------------------------------------------------------------------- /.gitbook/assets/image (8).png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kzu/til/7ab5047084e08310af4157e7dedd126e1ed8fbc0/.gitbook/assets/image (8).png -------------------------------------------------------------------------------- /.gitbook/assets/image (9).png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kzu/til/7ab5047084e08310af4157e7dedd126e1ed8fbc0/.gitbook/assets/image (9).png -------------------------------------------------------------------------------- /.gitbook/assets/image.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/kzu/til/7ab5047084e08310af4157e7dedd126e1ed8fbc0/.gitbook/assets/image.png -------------------------------------------------------------------------------- /.github/workflows/deploy.yml: -------------------------------------------------------------------------------- 1 | name: 'deploy website' 2 | 3 | on: 4 | push: 5 | branches: 6 | - master 7 | 8 | jobs: 9 | job_deploy_website: 10 | name: 'deploy gh-pages' 11 | runs-on: ubuntu-latest 12 | steps: 13 | - uses: actions/checkout@v1 14 | - uses: actions/setup-node@v1 15 | with: 16 | node-version: '10.x' 17 | - name: 'Installing gitbook cli' 18 | run: npm install -g gitbook-cli 19 | - name: 'Generating distributable files' 20 | run: | 21 | gitbook install 22 | gitbook build 23 | - name: Deploying to gh-pages 24 | uses: peaceiris/actions-gh-pages@v3 25 | with: 26 | github_token: ${{ secrets.GITHUB_TOKEN }} 27 | publish_dir: ./_book -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | _book 2 | node_modules 3 | package-lock.json -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | --- 2 | description: >- 3 | This is the site where I share little snippets and tidbits of learnings that 4 | don't deserve a full blog post on https://cazzulino.com, grouped by areas, 5 | rather than by date. 
6 | --- 7 | 8 | # Today I Learned 9 | 10 | -------------------------------------------------------------------------------- /SUMMARY.md: -------------------------------------------------------------------------------- 1 | # Table of contents 2 | 3 | * [Today I Learned](README.md) 4 | 5 | ## dotnet 6 | 7 | * [How to emit descriptions for exported JSON schema using JsonSchemaExporter](dotnet/how-to-emit-descriptions-for-exported-json-schema-using-jsonschemaexporter.md) 8 | * [NuGet](dotnet/nuget/README.md) 9 | * [Suppress dependencies when packing](dotnet/nuget/suppress-dependencies-when-packing.md) 10 | * [Hide contentFiles from your nuget packages](dotnet/nuget/hide-contentfiles-from-your-nuget-packages.md) 11 | * [Packaging transitive analyzers with NuGet](dotnet/nuget/packaging-transitive-analyzers-with-nuget.md) 12 | * [How to add search to static nuget feed](dotnet/nuget/use-dotnet-vs-to-get-developer-prompt-in-terminal.md) 13 | * [Populate RepositoryBranch in CI for NuGet Pack](dotnet/nuget/populate-repositorybranch-in-ci-for-nuget-pack.md) 14 | * [Ignore folder from dotnet-format](dotnet/ignore-folder-from-dotnet-format.md) 15 | * [Accessing Tor .onion URLs via HttpClient with .NET6](dotnet/accessing-tor-via-httpclient-with-.net6.md) 16 | * [Installing .NET 5.0 on Raspberry Pi 4](dotnet/installing-.net-5.0-on-raspberry-pi-4.md) 17 | * [Quickly check C# compiler and language version](dotnet/quickly-check-c-compiler-and-language-version.md) 18 | * [Disable diagnostic analyzers for entire folder/submodules](dotnet/disable-diagnostic-analyzers-for-entire-folder-submodules.md) 19 | * [Persisting output files from source generators](dotnet/persisting-output-files-from-source-generators.md) 20 | * [Use C# 9 records in non-net5.0 projects](dotnet/use-c-9-records-in-non-net5.0-projects.md) 21 | * [AsyncLocal never leaks and is safe for CallContext-like state](dotnet/asynclocal-never-leaks-and-is-safe-for-callcontext-like-state.md) 22 | * [Using HashCode in .NETFramework](dotnet/using-hashcode-in-.netframework.md) 23 | * [How to locate dotnet](dotnet/how-to-locate-dotnet.md) 24 | 25 | ## testing 26 | 27 | * [Conditional unit tests](testing/conditional-unit-tests.md) 28 | * [Skip tagged scenarios in SpecFlow with Xunit](testing/skip-tagged-scenarios-in-specflow-with-xunit.md) 29 | 30 | ## msbuild 31 | 32 | * [How to get user home dir \~ cross-platform](msbuild/how-to-get-user-home-dir-cross-platform.md) 33 | * [Modifying the build for every solution in a repository](msbuild/modifying-the-build-for-every-solution-in-a-repository.md) 34 | * [Detect CI builds for every CI system](msbuild/detect-ci-builds-for-every-ci-system.md) 35 | * [Modify all command-line builds in entire repo](msbuild/modify-all-command-line-builds-in-entire-repo.md) 36 | * [Write entire XML fragments in MSBuild with XmlPoke](msbuild/write-entire-xml-fragments-in-msbuild-with-xmlpoke.md) 37 | * [How to select first item in an ItemGroup](msbuild/how-to-select-first-item-in-an-itemgroup.md) 38 | * [How to include commit URL in nuget package description](msbuild/how-to-include-commit-url-in-nuget-package-description.md) 39 | * [How to include package reference files in your nuget](msbuild/how-to-include-package-reference-files-in-your-nuget-package.md) 40 | * [How to build project when content files change](msbuild/how-to-build-project-when-content-files-change.md) 41 | 42 | ## azure 43 | 44 | * [How to launch multiple Azure Functions apps on different 
ports](azure/how-to-launch-multiple-azure-functions-apps-on-different-ports.md) 45 | * [C# script function apps beyond Azure portal](azure/c-script-function-apps-beyond-azure-portal.md) 46 | * [Publishing function app from GitHub folder](azure/publishing-function-app-from-github-folder.md) 47 | * [Exploring Azure Data with Kusto and Dashboards](azure/exploring-azure-data-with-kusto-and-dashboards.md) 48 | * [Shared secret authorization with Azure SignalR Service](azure/shared-secret-authorization-with-azure-signalr-service.md) 49 | * [Using Azure File Copy from DevOps yaml pipeline](azure/using-azure-file-copy-from-devops-yaml-pipeline.md) 50 | * [Code-less redirection with serverless Azure Functions](azure/code-less-redirection-with-serverless-azure-functions.md) 51 | 52 | ## DevOps/CI/CD 53 | 54 | * [How to run Azure Storage unit tests in CI](devops-ci-cd/how-to-run-azure-storage-unit-tests-in-ci.md) 55 | * [How to skip steps or jobs in GitHub Actions for PRs from forks](devops-ci-cd/how-to-skip-steps-or-jobs-in-github-actions-for-prs-from-forks.md) 56 | * [Update version and publish npm from GH](devops-ci-cd/update-version-and-publish-npm-from-gh.md) 57 | * [Push to protected branch from GitHub actions](devops-ci-cd/push-to-protected-branch-from-github-actions.md) 58 | -------------------------------------------------------------------------------- /azure/c-script-function-apps-beyond-azure-portal.md: -------------------------------------------------------------------------------- 1 | # C# script function apps beyond Azure portal 2 | 3 | Creating a C# [function app in the Azure portal](https://docs.microsoft.com/en-us/azure/azure-functions/functions-create-first-azure-function) is incredibly simple and great for playing and learning. I love the simplicity and light-ness that comes from just having a single `.csx` script file with everything you need for the function. The more "serious" approach with a "proper" [C# project, the functions SDK](https://docs.microsoft.com/en-us/azure/azure-functions/functions-create-your-first-function-visual-studio) and the corresponding CI/CD setup seems quite the leap, in comparison. 4 | 5 | I [recently had to move](https://twitter.com/kzu/status/1300481855565725696) a bunch of functions from one subscription to another and wanted to take the chance to improve the maintainability for some in-portal functions I had. Moving them also failed in the portal, so I looking at a non-enjoyable time copying/pasting files over. Made me question why I used in-portal functions at all. 6 | 7 | So, while trying to save myself some time doing that, I got a zip of the existing in-portal function app using the Kudu debug console at [_https://\[APP\_NAME\].scm.azurewebsites.net/DebugConsole_](https://azdo-api.scm.azurewebsites.net/DebugConsole), and simply clicking the download icon next to the _wwwroot_ folder: 8 | 9 | ![](<../.gitbook/assets/image (8) (1).png>) 10 | 11 | And that's where it clicked: why not just put all those files in my own [GitHub repo](https://github.com/kzu/azdo/tree/main/api) and deploy straight from there via a [subfolder](https://til.cazzulino.com/azure/publishing-function-app-from-github-folder)? 12 | 13 | ![](<../.gitbook/assets/image (7) (1).png>) 14 | 15 | Well, that \*totally\* works, and you can keep the simplicity of the `.csx` while having a proper deployment history (and rollback capabilities) you're likely going to need sooner or later, no matter how simple the function. 
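For reference, what ends up in the repo is just the portal's _wwwroot_ content: a `host.json` (plus `proxies.json` if you use proxies) at the root, and one folder per function holding its `run.csx` and `function.json`. Roughly like this (the `negotiate` folder name is just an illustrative example):

```text
api/
├── host.json
├── proxies.json
└── negotiate/
    ├── function.json
    └── run.csx
```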
16 | 17 | I couldn't get rid of the _function.json_ file though, so placing the Functions SDK attributes on the code in the .csx doesn't work, but I'd say it's an acceptable trade-off. 18 | -------------------------------------------------------------------------------- /azure/code-less-redirection-with-serverless-azure-functions.md: -------------------------------------------------------------------------------- 1 | --- 2 | description: >- 3 | How to quickly and simply configure redirections without writing code in Azure 4 | Functions 5 | --- 6 | 7 | # Code-less redirection with serverless Azure Functions 8 | 9 | Say you want to have a nicer URI for something \(like an Azure storage blob, a feed or something else\). You likely have a nice short custom domain \(i.e. I use kzu.io for things like this\), and would like to set up arbitrary \(temporary or permanent\) redirections. This can trivially be achieved by creating an empty Functions App and leveraging [Functions Proxies](https://docs.microsoft.com/en-us/azure/azure-functions/functions-proxies). 10 | 11 | The `proxies.json` file for code-less redirects looks as follows: 12 | 13 | ```javascript 14 | { 15 | "$schema": "http://json.schemastore.org/proxies", 16 | "proxies": { 17 | "[SOME_ID]": { 18 | "matchCondition": { 19 | "methods": [ "GET" ], 20 | "route": "[SHORT_PATH_HERE]" 21 | }, 22 | "responseOverrides": { 23 | "response.statusCode": "[301|302|307|308|", 24 | "response.headers.location": "[LONG_URL_HERE]" 25 | } 26 | } 27 | } 28 | } 29 | ``` 30 | 31 | You can have as many of those IDs/entries as needed. For example, this is one I use to set up `https://pkg.kzu.io/index.json` > `https://kzu.blob.core.windows.net/nuget/index.json`: 32 | 33 | ```javascript 34 | { 35 | "$schema": "http://json.schemastore.org/proxies", 36 | "proxies": { 37 | "default": { 38 | "matchCondition": { 39 | "methods": [ "GET" ], 40 | "route": "index.json" 41 | }, 42 | "responseOverrides": { 43 | "response.statusCode": "307", 44 | "response.headers.location": "https://kzu.blob.core.windows.net/nuget/index.json" 45 | } 46 | } 47 | } 48 | } 49 | ``` 50 | 51 | 52 | 53 | -------------------------------------------------------------------------------- /azure/exploring-azure-data-with-kusto-and-dashboards.md: -------------------------------------------------------------------------------- 1 | # Exploring Azure Data with Kusto and Dashboards 2 | 3 | In order to more effectively learn [Kusto](https://docs.microsoft.com/en-us/azure/azure-monitor/log-query/query-language) (the query language powering Azure analytics, log querying and PowerBI) and data visualization capabilites in Azure, I did the following: 4 | 5 | 1. [Created a cluster and database](https://docs.microsoft.com/en-us/azure/data-explorer/create-cluster-database-portal) (this takes quite a while) 6 | 2. Open the [Azure Data Explorer](https://aka.ms/kwe) (a.k.a. Kusto Web Explorer) at [https://aka.ms/kwe](https://aka.ms/kwe) 7 | 3. Add the cluster with the full URI or alternatively just the `name.region` parts (i.e. `kzukusto.centralus` vs [`https://kzukusto.centralus.kusto.windows.net`](https://kzukusto.centralus.kusto.windows.net)) 8 | 4. Optionally add an arbitrary Application Insights app as a "virtual cluster", [using a url with the format](https://docs.microsoft.com/en-us/azure/data-explorer/query-monitor-data#connect-to-the-proxy) `https://ade.applicationinsights.io/subscriptions//resourcegroups//providers/microsoft.insights/components/` 9 | 5. 
Right--lick database and select Ingest new data 10 | 11 | ![Ingest new data context menu](<../.gitbook/assets/image (12).png>) 12 | 6. Find some interesting [Azure Open Dataset from the catalog](https://azure.microsoft.com/en-us/services/open-datasets/catalog/) that has an Azure storage URL readily available from the Azure Open Datasets catalog, such as the [Bing COVID-19 Data](https://azure.microsoft.com/en-us/services/open-datasets/catalog/bing-covid-19-data/) (I used the `.jsonl` link). NOTE: the `.json` one will not properly infer schema because it has a root object of type array. The `.jsonl` is actually a "JSON fragment" (if that even exists, would be the equivalent of an XML fragment) where each entry is just a JSON entry/line in the file. 13 | 14 | ![](<../.gitbook/assets/image (1) (1).png>) 15 | 16 | The JSON version is preferable to `.csv` because it properly infer the data type for columns. 17 | 7. Click on `Dashboards` for the new stuff here. Parameters driven by queries are particularly handy: 18 | 19 | ![Query-driven multi-select parameter for widget](<../.gitbook/assets/image (2) (1).png>) 20 | 21 | While editing the query/widget, if the parameter is used in the query, you can interactively change its value to explore the visualizations. For example: 22 | 23 | ![Used parameter becomes enabled for selection](<../.gitbook/assets/image (3) (1).png>) 24 | 25 | Whereas if it wasn't used: 26 | 27 | ![Unused parameter unavailable](<../.gitbook/assets/image (4) (1).png>) 28 | 29 | After shaping the `Results`the way you want, clicking the Visual tab allows configuring a bunch of visualizations. Inference works quite nicely if the data/results are filtered to just what you want to display. 30 | 31 | ![Many chart options and inference that works great](<../.gitbook/assets/image (5) (1).png>) 32 | -------------------------------------------------------------------------------- /azure/how-to-launch-multiple-azure-functions-apps-on-different-ports.md: -------------------------------------------------------------------------------- 1 | # How to launch multiple Azure Functions apps on different ports 2 | 3 | Turns out setting up `Hosts.LocalHostPort` via local.settings.json doesn't work \(consistently at least, I keep getting semi-random _Value cannot be null. \(Parameter 'provider'\)_ errors on function startup\), [contrary to what the documentation says](https://docs.microsoft.com/en-us/azure/azure-functions/functions-run-local?tabs=windows%2Ccsharp%2Cbash#local-settings-file). If you override the command line args directly, via a `launchSettings.json` , it works consistently and I even get faster startup ¯\_ \(ツ\)\_/¯: 4 | 5 | ```javascript 6 | { 7 | "profiles": { 8 | "api": { 9 | "commandName": "Project", 10 | "commandLineArgs": "start --port 7072" 11 | } 12 | } 13 | } 14 | ``` 15 | 16 | You might want to add a `--pause-on-error` in there too. 17 | 18 | -------------------------------------------------------------------------------- /azure/publishing-function-app-from-github-folder.md: -------------------------------------------------------------------------------- 1 | # Publishing function app from GitHub folder 2 | 3 | For very simple functions, I really liked just hacking them in the Azure portal. This is super brittle, however, since changes that break the app can't be undone. Great for learning, not so much once you start depending on those functions. 
4 | 5 | While you can deploy using [GitHub Actions](https://docs.microsoft.com/en-us/azure/azure-functions/functions-how-to-github-actions?tabs=javascript) and [Azure Pipelines](https://docs.microsoft.com/en-us/azure/azure-functions/functions-how-to-azure-devops?tabs=csharp), both seem a massive leap in complexity from the beauty of in-portal editing and updating. [Kudu](https://docs.microsoft.com/en-us/azure/app-service/deploy-continuous-deployment#option-1-kudu-app-service) (or _App Service build service_) is way simpler and generally sufficient for simple functions. You just select GitHub as the source: 6 | 7 | ![](<../.gitbook/assets/image (10) (1).png>) 8 | 9 | And App Service build service as the build provider: 10 | 11 | ![](<../.gitbook/assets/image (6) (1).png>) 12 | 13 | You next simply connect it to your GitHub repository and that's it. But what if you want to deploy a [subfolder](https://github.com/kzu/azdo/tree/main/redir) from the repository as the function app? 14 | 15 | Turns out you can just add an application setting to the function app, named `DEPLOYMENT_SOURCE`, pointing to the right subfolder (i.e. `.\redir` or `.\api` in my case) and that's it! Here you can see it in action in the logs, where just the `redir` subfolder is being sync'ed to the `wwwroot` for the function app: 16 | 17 | ![](<../.gitbook/assets/image (11) (1).png>) 18 | -------------------------------------------------------------------------------- /azure/shared-secret-authorization-with-azure-signalr-service.md: -------------------------------------------------------------------------------- 1 | # Shared secret authorization with Azure SignalR Service 2 | 3 | While testing out [Azure Functions development and configuration with Azure SignalR Service](https://docs.microsoft.com/en-us/azure/azure-signalr/signalr-concept-serverless-development-config#negotiate-experience-in-class-based-model), I needed a very simple key-based \(shared secret\) authorization mechanism so that my console-based SignalR client could connect to my Azure SignalR Service-powered hub using a very simple mechanism. 
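For context, the client in all of this is just a console app using the regular SignalR client package, with the shared secret travelling as a query string on the connection URL; the client passes that query string along to the `negotiate` call shown below, where it gets validated. A minimal sketch (URL, key and event name are placeholders, and this assumes the default client behavior of forwarding the query string to negotiate):

```csharp
using System;
using Microsoft.AspNetCore.SignalR.Client;

var connection = new HubConnectionBuilder()
    // The query string is preserved on the POST to /api/negotiate,
    // so the function can read and validate the accessKey value.
    .WithUrl("https://<your-functions-app>.azurewebsites.net/api?accessKey=<shared-secret>")
    .Build();

connection.On<string>("someEvent", data => Console.WriteLine(data));

await connection.StartAsync();
```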
4 | 5 | The docs and the sample showcased the `negotiate` endpoint returning the connection info directly: 6 | 7 | ```text 8 | [FunctionName("negotiate")] 9 | public SignalRConnectionInfo Negotiate([HttpTrigger(AuthorizationLevel.Anonymous)]HttpRequest req) 10 | ``` 11 | 12 | But a [stackoverflow answer](https://stackoverflow.com/a/55586165/24684) pointed me to the solution: just the proper `IActionResult` using `OkObjectResult` with the connection info when the access key is properly validated: 13 | 14 | ```text 15 | [FunctionName("negotiate")] 16 | public IActionResult Negotiate( 17 | [HttpTrigger(AuthorizationLevel.Anonymous, "post")] HttpRequest req, 18 | [SignalRConnectionInfo(HubName = "events")] SignalRConnectionInfo connectionInfo) 19 | { 20 | var expectedKey = Environment.GetEnvironmentVariable("AccessKey"); 21 | if (string.IsNullOrEmpty(expectedKey)) 22 | return new OkObjectResult(connectionInfo); 23 | 24 | var accessKey = req.Query["accessKey"]; 25 | if (StringValues.IsNullOrEmpty(accessKey) || 26 | !StringValues.Equals(expectedKey, accessKey)) 27 | return new UnauthorizedResult(); 28 | 29 | return new OkObjectResult(connectionInfo); 30 | } 31 | ``` 32 | 33 | 34 | 35 | 36 | 37 | 38 | 39 | 40 | 41 | -------------------------------------------------------------------------------- /azure/using-azure-file-copy-from-devops-yaml-pipeline.md: -------------------------------------------------------------------------------- 1 | --- 2 | description: >- 3 | I learned that it's not enough to authorize Azure Resource Manager access from 4 | DevOps 5 | --- 6 | 7 | # Using Azure File Copy from DevOps yaml pipeline 8 | 9 | Oh boy, did I waste time on this one :(. So I had my pipeline pretty naively doing an upload to blob storage: 10 | 11 | ``` 12 | - task: AzureFileCopy@4 13 | displayName: Upload Vsix 14 | inputs: 15 | SourcePath: '$(Pipeline.Workspace)\vsix\RoslynDeployment.$(Build.BuildId).vsix' 16 | azureSubscription: 'roslyn-Azure' 17 | Destination: 'AzureBlob' 18 | storage: 'roslyn' 19 | ContainerName: 'vsix' 20 | BlobPrefix: '$(Build.SourceBranchName)/RoslynDeployment.$(Build.BuildId).vsix' 21 | ``` 22 | 23 | I used a service principal [managed by DevOps which is the recommended approach](https://docs.microsoft.com/en-us/azure/devops/pipelines/library/connect-to-azure?view=azure-devops#create-an-azure-resource-manager-service-connection-using-automated-security). The blob storage account was under the same subscription, where the automatically created app properly showed up in IAM: 24 | 25 | ![Access control (IAM) pane for storage account](<../.gitbook/assets/azure-storage-iam (1).png>) 26 | 27 | as a contributor: 28 | 29 | ![DevOps-managed app as contributor to the storage account](<../.gitbook/assets/azure-storage-contributor (1).png>) 30 | 31 | I kept getting a 403 response when the task run, with the message `This request is not authorized to perform this operation using this permission.` 32 | 33 | Turns out being a **Contributor** is not enough. I tried [changing guest user permissions](https://docs.microsoft.com/en-us/azure/devops/pipelines/release/azure-rm-endpoint?view=azure-devops#insufficient-privileges-to-complete-the-operation), but in the end the only thing that worked was manually adding the [Storage Blob Data Contributor role](https://github.com/MicrosoftDocs/azure-docs/issues/36454), which I found mentioned in a[ blog post](https://www.catrina.me/azcopy-403-error/). 
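If you'd rather script that last bit than click through the portal, the same role assignment can be done with the Azure CLI — a sketch, where the assignee and scope are placeholders for the service connection's principal and your storage account's resource id:

```text
az role assignment create \
  --assignee <service-principal-object-id> \
  --role "Storage Blob Data Contributor" \
  --scope "/subscriptions/<subscription>/resourceGroups/<group>/providers/Microsoft.Storage/storageAccounts/<account>"
```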
34 | 35 | In the process I learned how DevOps creates the app registration and what-not, but still, not fun. 36 | 37 | [Submitted a doc fix](https://github.com/MicrosoftDocs/azure-devops-docs/pull/8622) for the [AzureFileCopy task docs](https://docs.microsoft.com/en-us/azure/devops/pipelines/tasks/deploy/azure-file-copy-version3?view=azure-devops) so this is more easily discoverable. 38 | -------------------------------------------------------------------------------- /book.json: -------------------------------------------------------------------------------- 1 | { 2 | "plugins": [ "rss" ], 3 | "pluginsConfig": { 4 | "rss": { 5 | "title": "Kzu's Today I Learned", 6 | "description": "My daily learning tidbits.", 7 | "author": "Daniel Cazzulino", 8 | "site_url": "https://til.cazzulino.com/", 9 | "feed_url": "https://til.cazzulino.com/rss.xml", 10 | "image_url": "https://github.com/kzu/kzu.github.io/raw/master/img/kzu.jpg" 11 | }, 12 | "styles": { 13 | "website": "styles/website.css" 14 | } 15 | } 16 | } -------------------------------------------------------------------------------- /devops-ci-cd/how-to-run-azure-storage-unit-tests-in-ci.md: -------------------------------------------------------------------------------- 1 | # How to run Azure Storage unit tests in CI 2 | 3 | If you have tests that need to exercise Blob, Queue or Table storage from Azure Storage, you can use [Azurite](https://github.com/Azure/Azurite) \(v3\) in CI, which can be configured for GitHub Actions as follows: 4 | 5 | ```text 6 | - name: ⚙ azurite 7 | run: | 8 | npm install azurite 9 | npx azurite-table & 10 | ``` 11 | 12 | The first line will install it, and the second will start it in a background process. Tests will need to use `CloudStorageAccount.DevelopmentStorageAccount` to access the locally running instance automatically. Note that this is compatible with the older Azure Storage Emulator included with the Azure SDK with Visual Studio, so no changes are needed between local test runs and CI. 13 | 14 | You can see this in action in the following run: [https://github.com/devlooped/TableStorage/actions/runs/829193167](https://github.com/devlooped/TableStorage/actions/runs/829193167) which is running on all supported .NET platforms via the workflow definition at [https://github.com/devlooped/TableStorage/blob/main/.github/workflows/build.yml](https://github.com/devlooped/TableStorage/blob/main/.github/workflows/build.yml). 15 | 16 | -------------------------------------------------------------------------------- /devops-ci-cd/how-to-skip-steps-or-jobs-in-github-actions-for-prs-from-forks.md: -------------------------------------------------------------------------------- 1 | # How to skip steps or jobs in GitHub Actions for PRs from forks 2 | 3 | [Encrypted secrets in GitHub](https://docs.github.com/en/free-pro-team@latest/actions/reference/encrypted-secrets#using-encrypted-secrets-in-a-workflow) actions aren't available for builds from forks, so if your build script includes PRs, like [Avatar](https://github.com/kzu/avatar/blob/main/.github/workflows/build.yml#L2-L6): 4 | 5 | ```yaml 6 | on: 7 | push: 8 | branches: [ main, dev, 'feature/*', 'rel/*' ] 9 | pull_request: 10 | types: [opened, synchronize, reopened] 11 | ``` 12 | 13 | You may need to limit steps or entire jobs from running when PRs are coming from forks due to missing secrets \(i.e. pushing an output package to a CI nuget feed, say\). 
14 | 15 | The "magic" thing is to use an `if` condition with the following expression: 16 | 17 | ```yaml 18 | push: 19 | name: push nuget.ci 20 | runs-on: ubuntu-latest 21 | needs: [build, acceptance] 22 | if: ${{ !github.event.pull_request.head.repo.fork }} 23 | steps: 24 | ``` 25 | 26 | Note that in the above case, an entire job is skipped, but you can also apply it to a step instead. 27 | 28 | -------------------------------------------------------------------------------- /devops-ci-cd/push-to-protected-branch-from-github-actions.md: -------------------------------------------------------------------------------- 1 | # Push to protected branch from GitHub actions 2 | 3 | It turns out that you really can't just `git push` from your GitHub actions [if the repository has branch protection turned on](https://github.community/t/how-to-push-to-protected-branches-in-a-github-action/16101) or required checks before merging. Sorta makes sense, but still a PITA. 4 | 5 | The solution that worked for me was to [use a different token on checkout](https://github.community/t/how-to-push-to-protected-branches-in-a-github-action/16101/34). Since the awesome GitHub CLI [allows using a separate, higher-permissions token](https://github.com/cli/cli/issues/1229) named `GH_TOKEN` \(since depending on the command you use, you might need a different one than `GITHUB_TOKEN`\), I decided to \(ab\)use the same: 6 | 7 | An [example workflow](https://github.com/devlooped/oss/blob/main/.github/workflows/changelog.yml) that uses this to generate a full changelog and push it to main on releases looks like this: 8 | 9 | ```text 10 | name: changelog 11 | on: 12 | release: 13 | types: [released] 14 | 15 | env: 16 | GH_TOKEN: ${{ secrets.GH_TOKEN }} 17 | 18 | jobs: 19 | changelog: 20 | runs-on: ubuntu-latest 21 | steps: 22 | - name: 🔍 GH_TOKEN 23 | if: env.GH_TOKEN == '' 24 | env: 25 | GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} 26 | run: echo "GH_TOKEN=${GITHUB_TOKEN}" >> $GITHUB_ENV 27 | 28 | - name: 🤘 checkout 29 | uses: actions/checkout@v2 30 | with: 31 | fetch-depth: 0 32 | ref: main 33 | token: ${{ env.GH_TOKEN }} 34 | 35 | - name: ⚙ changelog 36 | uses: faberNovel/github-changelog-generator-action@master 37 | with: 38 | options: --token ${{ secrets.GITHUB_TOKEN }} --o changelog.md 39 | 40 | - name: 🚀 changelog 41 | run: | 42 | git config --local user.name github-actions 43 | git config --local user.email github-actions@github.com 44 | git add changelog.md 45 | git commit -m "🖉 Update changelog with ${GITHUB_REF#refs/*/}" 46 | git push 47 | ``` 48 | 49 | Important parts: 50 | 51 | * I default the `GH_TOKEN` envvar to a same-name secret, if present 52 | * If it's not present, I default it to `GITHUB_TOKEN` 53 | * Checkout always uses `GH_TOKEN`, which now may be a higher-permissions PAT than the default 54 | * I do the defaulting since the push **will** succeed if the repository doesn't use branch protection for `main` and in that case I don't want to always force the presence of a `GH_TOKEN` secret. 
55 | 56 | -------------------------------------------------------------------------------- /devops-ci-cd/update-version-and-publish-npm-from-gh.md: -------------------------------------------------------------------------------- 1 | --- 2 | description: >- 3 | Setting up CI with GitHub Actions to update a node.js package version from 4 | GitHub release tag and publish it to npm 5 | --- 6 | 7 | # Update version and publish npm from GH 8 | 9 | Super proud of this, since it's my first node.js package ever \(created brand-new account on [https://www.npmjs.com/](https://www.npmjs.com/) and all 😁\), for doing [syntax highlighting](https://github.com/dotnetconfig/highlightjs-dotnetconfig) for [dotnetconfig](https://dotnetconfig.org/) that works in [docfx](https://github.com/dotnetconfig/dotnet-config/tree/dev/docs). 10 | 11 | Assuming you already have a `package.json` in the root repo dir, add the `.github\workflows\npm.yml` as follows: 12 | 13 | ```text 14 | name: publish 15 | 16 | on: 17 | release: 18 | types: [created] 19 | 20 | jobs: 21 | publish-npm: 22 | runs-on: ubuntu-latest 23 | steps: 24 | - uses: actions/checkout@v2 25 | - uses: actions/setup-node@v1 26 | with: 27 | node-version: 12 28 | registry-url: https://registry.npmjs.org/ 29 | - name: Set version from tag 30 | run: npm --no-git-tag-version version ${GITHUB_REF#refs/*/} 31 | - run: npm publish 32 | env: 33 | NODE_AUTH_TOKEN: ${{secrets.npm_token}} 34 | ``` 35 | 36 | 1. I'm running only on release creation 37 | 2. Using the GitHub envvar for the [checked out tag](https://stackoverflow.com/questions/58177786/get-the-current-pushed-tag-in-github-actions/58178121#58178121), I'm [updating the version](https://docs.npmjs.com/cli/version) but opting out of the git behavior \(which fails on CI because it's a detached head at that point\) 38 | 3. Finally you'll need to create that secret in [GH](https://docs.github.com/en/actions/configuring-and-managing-workflows/creating-and-storing-encrypted-secrets) for the publish step. 39 | 40 | -------------------------------------------------------------------------------- /dotnet/accessing-tor-via-httpclient-with-.net6.md: -------------------------------------------------------------------------------- 1 | # Accessing Tor .onion URLs via HttpClient with .NET6 2 | 3 | In .NET6, there is built-in [support for SOCKS4/5 proxies](https://github.com/dotnet/runtime/pull/48883)! This means you can install the [Tor](https://dist.torproject.org/torbrowser/) service and without anything else, access any `.onion` URL with plain `HttpClient`: 4 | 5 | ```csharp 6 | using System; 7 | using System.Net; 8 | using System.Net.Http; 9 | 10 | var http = new HttpClient(new HttpClientHandler 11 | { 12 | Proxy = new WebProxy("socks5://127.0.0.1:1338") 13 | }); 14 | 15 | Console.WriteLine(await http.GetStringAsync("https://bbcnewsv2vjtpsuy.onion")); 16 | ``` 17 | 18 | NOTE: I could not get this to work with gRPC, which requires HTTP/2. 
19 | 20 | You can get the daily .NET6 SDK installer permalinks from: 21 | 22 | | Platform | Main | 23 | | :--- | :---: | 24 | | **Windows \(x64\)** | [https://aka.ms/dotnet/6.0/daily/dotnet-sdk-win-x64.exe](https://aka.ms/dotnet/6.0/daily/dotnet-sdk-win-x64.exe) | 25 | | **Windows \(x86\)** | [https://aka.ms/dotnet/6.0/daily/dotnet-sdk-win-x86.exe](https://aka.ms/dotnet/6.0/daily/dotnet-sdk-win-x86.exe) | 26 | | **Windows \(arm64\)** | [https://aka.ms/dotnet/6.0/daily/dotnet-sdk-win-arm64.exe](https://aka.ms/dotnet/6.0/daily/dotnet-sdk-win-arm64.exe) | 27 | | **macOS \(x64\)** | [https://aka.ms/dotnet/6.0/daily/dotnet-sdk-osx-x64.pkg](https://aka.ms/dotnet/6.0/daily/dotnet-sdk-osx-x64.pkg) | 28 | | **macOS \(arm64\)** | [https://aka.ms/dotnet/6.0/daily/dotnet-sdk-osx-arm64.pkg](https://aka.ms/dotnet/6.0/daily/dotnet-sdk-osx-arm64.pkg) | 29 | 30 | -------------------------------------------------------------------------------- /dotnet/asynclocal-never-leaks-and-is-safe-for-callcontext-like-state.md: -------------------------------------------------------------------------------- 1 | --- 2 | description: >- 3 | Even if it's typically used in a static field, the values never leak since 4 | they are bound to a transient ExecutionContext 5 | --- 6 | 7 | # AsyncLocal never leaks and is safe for CallContext-like state 8 | 9 | Some time ago I wrote on [How to migrate CallContext to .NETStandard and .NETCore](https://www.cazzulino.com/callcontext-netstandard-netcore.html), and one question mentioned that the values themselves might leak, which actually is not the case, as shown here, due to the "magic" that is AsyncLocal in combination with the [transient nature of the ExecutionContext](https://github.com/dotnet/runtime/blob/master/src/libraries/System.Private.CoreLib/src/System/Threading/ExecutionContext.cs#L133-L200): 10 | 11 | ```csharp 12 | static AsyncLocal<object> local = new AsyncLocal<object>(); 13 | 14 | async Task Main() 15 | { 16 | WeakReference data = null; 17 | 18 | await Task.Run(() => 19 | { 20 | var o = new object(); 21 | data = new WeakReference(o); 22 | // We assign to the static async local, to see if it leaks. 23 | local.Value = o; 24 | }); 25 | 26 | // After execution is finished, we do have a live reference still. 27 | System.Diagnostics.Debug.Assert(data.IsAlive == true); 28 | 29 | GC.Collect(); 30 | 31 | // But a GC proves nobody is holding a strong reference to it. 32 | System.Diagnostics.Debug.Assert(data.IsAlive == false); 33 | } 34 | 35 | ``` 36 | 37 | -------------------------------------------------------------------------------- /dotnet/disable-diagnostic-analyzers-for-entire-folder-submodules.md: -------------------------------------------------------------------------------- 1 | # Disable diagnostic analyzers for entire folder/submodules 2 | 3 | When you have submodules in your repository, it's not uncommon for those to be subject to a different quality bar than your main project. You may not even have a say in fixing issues upstream, or conventions may be totally different even. 4 | 5 | In this case, you can introduce an `.editorconfig` in your repository which turns off all analyzer diagnostics for an entire directory path \(and all subdirectories\) by leveraging wildcards.
Say your submodules are in a folder `ext` a couple levels up from your `.editorconfig`: 6 | 7 | ```text 8 | [../../ext/**/*.cs] 9 | dotnet_analyzer_diagnostic.severity = none 10 | ``` 11 | 12 | Learned from [this commit](https://github.com/AArnott/Nerdbank.Streams/commit/50e322a90903283191ccaf67a8f27cf5aba6784c) after following it from a related [roslyn issue](https://github.com/dotnet/roslyn/issues/42762). 13 | 14 | -------------------------------------------------------------------------------- /dotnet/how-to-emit-descriptions-for-exported-json-schema-using-jsonschemaexporter.md: -------------------------------------------------------------------------------- 1 | # How to emit descriptions for exported JSON schema using JsonSchemaExporter 2 | 3 | We now (.NET 9+) have an API to export a JSON schema from a .NET type: 4 | 5 |
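A minimal sketch of the call (the `Product` type here is a hypothetical sample, annotated with the `[Description]` attributes discussed next):

```csharp
using System;
using System.ComponentModel;
using System.Text.Json;
using System.Text.Json.Schema;

var options = JsonSerializerOptions.Default;
var node = JsonSchemaExporter.GetJsonSchemaAsNode(options, typeof(Product));

// Prints the inferred JSON schema for the type.
Console.WriteLine(node.ToJsonString(new JsonSerializerOptions { WriteIndented = true }));

// Hypothetical sample type used throughout this note.
record Product(
    [property: Description("The product name.")] string Name,
    [property: Description("Unit price in USD.")] decimal Price);
```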
6 | 7 | The default exporter will not, however, provide descriptions for those properties, even if a `[Description(...)]` attribute is provided. The way to fix that is to provide a `TransformSchemaNode` callback like so: 8 | 9 | ```csharp 10 | var node = JsonSchemaExporter.GetJsonSchemaAsNode(options, typeof(Product), new JsonSchemaExporterOptions 11 | { 12 | TreatNullObliviousAsNonNullable = true, 13 | TransformSchemaNode = (context, node) => 14 | { 15 | var description = context.PropertyInfo?.AttributeProvider?.GetCustomAttributes(typeof(DescriptionAttribute), false) 16 | .OfType<DescriptionAttribute>() 17 | .FirstOrDefault()?.Description; 18 | 19 | if (description != null) 20 | node["description"] = description; 21 | 22 | return node; 23 | }, 24 | }); 25 | ``` 26 | 27 | -------------------------------------------------------------------------------- /dotnet/how-to-locate-dotnet.md: -------------------------------------------------------------------------------- 1 | --- 2 | description: How to locate dotnet from the currently running .NET Core application 3 | --- 4 | 5 | # How to locate dotnet 6 | 7 | I've seen a bunch of `DotNetMuxer` implementations, but they all look quite similar \(if not exactly the same\). 8 | 9 | * [ASP.NET Core](https://github.com/dotnet/aspnetcore/blob/master/src/Shared/CommandLineUtils/Utilities/DotNetMuxer.cs) \(same used by others, like [Orleans](https://github.com/dotnet/orleans/blob/master/src/Orleans.CodeGenerator.MSBuild.Tasks/DotNetMuxer.cs)\) 10 | * [.NET runtime](https://github.com/dotnet/runtime/blob/master/src/libraries/Common/src/Extensions/CommandLineUtils/Utilities/DotNetMuxer.cs), [.NET extensions](https://github.com/dotnet/extensions/blob/master/src/Shared/src/CommandLineUtils/Utilities/DotNetMuxer.cs) 11 | * [Azure WebJobs SDK](https://github.com/Azure/azure-webjobs-sdk/blob/master/src/Analyzer/ExtensionViewer/DotNetMuxer.cs) \(same as [Azure Functions SDK](https://github.com/Azure/azure-functions-vs-build-sdk/blob/master/src/Microsoft.NET.Sdk.Functions.MSBuild/Tasks/DotNetMuxer.cs)\) 12 | * [.NET command-line-api](https://github.com/dotnet/command-line-api/blob/master/src/System.CommandLine.Suggest/DotnetMuxer.cs) 13 | 14 | Seems like there are two approaches: those that try to locate a `FX_DEPS_FILE` file and those that just use the current process' `MainModule` \(which would be `dotnet[.exe]` itself for a .NET Core app\). The latter looks simpler and the former seems unnecessary, until you take into account [self-contained .NET Core apps](https://docs.microsoft.com/en-us/dotnet/core/deploying/#publish-self-contained) or even [trimmed self-contained](https://docs.microsoft.com/en-us/dotnet/core/deploying/trim-self-contained), but even that version doesn't work in those cases :\( 15 | 16 | -------------------------------------------------------------------------------- /dotnet/ignore-folder-from-dotnet-format.md: -------------------------------------------------------------------------------- 1 | # Ignore folder from dotnet-format 2 | 3 | The easiest way to ignore an entire folder (i.e.
one containing files you're syncing from an external repository such as [catbag](https://github.com/devlooped/catbag) using [dotnet-file](https://www.nuget.org/packages/dotnet-file)) is to create a `.editorconfig` file with the following content: 4 | 5 | ```editorconfig 6 | [*] 7 | generated_code = true 8 | ``` 9 | -------------------------------------------------------------------------------- /dotnet/installing-.net-5.0-on-raspberry-pi-4.md: -------------------------------------------------------------------------------- 1 | # Installing .NET 5.0 on Raspberry Pi 4 2 | 3 | None of the official methods worked for me \(not even [involving horrid fixes](https://github.com/dotnet/core/issues/4446#issuecomment-843162084) ;\)\). So I ended up going the super manual route \(mostly copying [https://elbruno.com/2019/08/27/raspberrypi-how-to-install-dotnetcore-in-a-raspberrypi4-and-test-with-helloworld-of-course/](https://elbruno.com/2019/08/27/raspberrypi-how-to-install-dotnetcore-in-a-raspberrypi4-and-test-with-helloworld-of-course/) and [https://elbruno.com/2020/01/05/raspberrypi-how-to-solve-dotnet-core-not-recognized-after-reboot/](https://elbruno.com/2020/01/05/raspberrypi-how-to-solve-dotnet-core-not-recognized-after-reboot/)\). 4 | 5 | ```text 6 | $ sudo apt-get install lshw 7 | $ sudo lshw 8 | description: Computer 9 | product: Raspberry Pi 4 Model B Rev 1.4 10 | serial: 10000000d5e618a2 11 | width: 64 bits 12 | 13 | ; OK, it's 64bits 14 | ; Get Arm64 download URL from https://dotnet.microsoft.com/download/dotnet/5.0 15 | 16 | $ mkdir temp && cd temp 17 | $ curl [URL] --output [FILENAME] 18 | $ mkdir $HOME/dotnet 19 | $ sudo tar zxf [FILENAME] -C $HOME/dotnet/ 20 | $ sudo ln -s $HOME/dotnet/dotnet /usr/local/bin 21 | $ dotnet --version 22 | 5.0.203 23 | ; or whatever version you downloaded 24 | 25 | ; Now we need to add it to PATH. Add the exports 26 | $ sudo nano ~/.bashrc 27 | 28 | export DOTNET_ROOT=$HOME/dotnet 29 | export PATH=$PATH:$HOME/dotnet 30 | export PATH=$PATH:$HOME/.dotnet/tools 31 | 32 | ; then Ctrl+X to save and exit. finally, load the new exports 33 | $ source ~/.bashrc 34 | ``` 35 | 36 | 37 | 38 | 39 | 40 | -------------------------------------------------------------------------------- /dotnet/nuget/README.md: -------------------------------------------------------------------------------- 1 | --- 2 | description: NuGet-related learnings 3 | --- 4 | 5 | # NuGet 6 | 7 | -------------------------------------------------------------------------------- /dotnet/nuget/hide-contentfiles-from-your-nuget-packages.md: -------------------------------------------------------------------------------- 1 | # Hide contentFiles from your nuget packages 2 | 3 | When you package compile items with your package \(i.e. `.cs`\), they are visible by default in your consumer's project. That's not always what you want. From [a github issue on NuGet](https://github.com/NuGet/Home/issues/4856#issuecomment-288151716), I learned a neat trick to hide all the files in your package, automatically! All you need is to include a .props file in your package `build` folder \(i.e. if your package is named `Foo`, make sure the file ends up as `build\Foo.props`\): 4 | 5 | ```markup 6 | <Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003"> 7 | 8 | <ItemGroup> 9 | <Compile Update="@(Compile)"> 10 | <Visible Condition="'%(NuGetItemType)' == 'Compile' and '%(NuGetPackageId)' == 'Foo'">false</Visible> 11 | </Compile> 12 | </ItemGroup> 13 | 14 | </Project> 15 | ``` 16 | 17 | This works because NuGet generates a .props file containing all your contentFiles items, under the `obj` folder, which contains both pieces of metadata used above \(`NuGetItemType` and `NuGetPackageId`\) to filter the visibility on items coming from our package.
18 | 19 | -------------------------------------------------------------------------------- /dotnet/nuget/packaging-transitive-analyzers-with-nuget.md: -------------------------------------------------------------------------------- 1 | # Packaging transitive analyzers with NuGet 2 | 3 | Consider the scenario of [ThisAssembly](https://github.com/kzu/ThisAssembly) and its referenced packages: the main package is essentially a meta-package so that anyone wanting to leverage all the codegen in all the ThisAssembly.\* packages can reference a single one. By default, NuGet pack will create a package that declares the project reference dependencies like so: 4 | 5 | ```markup 6 | <dependencies> 7 | <group targetFramework=".NETStandard2.0"> 8 | <dependency id="ThisAssembly.AssemblyInfo" version="1.0.0" exclude="Build,Analyzers" /> 9 | <dependency id="ThisAssembly.Constants" version="1.0.0" exclude="Build,Analyzers" /> 10 | <dependency id="ThisAssembly.Metadata" version="1.0.0" exclude="Build,Analyzers" /> 11 | <dependency id="ThisAssembly.Strings" version="1.0.0" exclude="Build,Analyzers" /> 12 | </group> 13 | </dependencies> 14 | ``` 15 | 16 | Note all those `exclude`. Not good since now it means projects referencing this package will not get the transitive analyzers, which are precisely the point of this meta-package. 17 | 18 | The fix is [highly non-obvious](https://github.com/NuGet/Home/issues/3697#issuecomment-342983009): you must explicitly state that _none_ of the referenced project assets are to be flagged as private: 19 | 20 | ```markup 21 | <ItemGroup> 22 | <ProjectReference Include="..\ThisAssembly.AssemblyInfo\ThisAssembly.AssemblyInfo.csproj" PrivateAssets="none" /> 23 | <ProjectReference Include="..\ThisAssembly.Constants\ThisAssembly.Constants.csproj" PrivateAssets="none" /> 24 | <ProjectReference Include="..\ThisAssembly.Metadata\ThisAssembly.Metadata.csproj" PrivateAssets="none" /> 25 | <ProjectReference Include="..\ThisAssembly.Strings\ThisAssembly.Strings.csproj" PrivateAssets="none" /> 26 | </ItemGroup> 27 | ``` 28 | 29 | This allows the transitive build and analyzer assets to be properly installed in the referencing project. 30 | 31 | 32 | 33 | -------------------------------------------------------------------------------- /dotnet/nuget/populate-repositorybranch-in-ci-for-nuget-pack.md: -------------------------------------------------------------------------------- 1 | # Populate RepositoryBranch in CI for NuGet Pack 2 | 3 | I learned that [RepositoryBranch](https://github.com/dotnet/sourcelink/issues/188#issuecomment-427975234) is the only NuGet Pack-related property not populated already by the .NET SDK+SourceLink. So instead of going for a full-blown \(and typically over-blown\) solution for build/assembly versioning from Git information \(such as GitInfo or GitVersion or the myriad others\), you can trivially pass in this value from your CI script with: 4 | 5 | ```text 6 | dotnet pack -p:RepositoryBranch=${GITHUB_REF#refs/*/} 7 | ``` 8 | 9 | This works in all OSes if you're running with a bash shell. On a Windows agent, you'd need to opt-in to that using `shell: bash` on the `run` action \(if you're using GitHub Actions\). Or you can also just [default to bash for all run actions](https://github.com/kzu/oss/blob/main/.github/workflows/build.yml#L14-L16). 10 | 11 | NOTE: the wildcard instead of `heads` is so that this also works for tags, in which case the "branch" will be the tag name. 12 | 13 | 14 | 15 | 16 | 17 | -------------------------------------------------------------------------------- /dotnet/nuget/suppress-dependencies-when-packing.md: -------------------------------------------------------------------------------- 1 | # Suppress dependencies when packing 2 | 3 | Some packing scenarios, especially those involving tools, require ignoring all dependencies \(PackageReference as well as framework references\) from the resulting package dependencies.
4 | 5 | I [learned of a property](https://github.com/NuGet/Home/issues/6354) supported by `dotnet pack` for this: 6 | 7 | ```markup 8 | <PropertyGroup> 9 | <SuppressDependenciesWhenPacking>true</SuppressDependenciesWhenPacking> 10 | </PropertyGroup> 11 | ``` 12 | 13 | NuGetizer has more flexibility in this regard, providing both a `$(PackFrameworkReferences)` property as well as granular control over referenced packages via the `Pack` metadata on each `PackageReference`: 14 | 15 | ```markup 16 | <PropertyGroup> 17 | <!-- Don't add framework references as package dependencies --> 18 | <PackFrameworkReferences>false</PackFrameworkReferences> 19 | </PropertyGroup> 20 | 21 | <ItemGroup> 22 | <!-- e.g. keep a given dependency out of the generated package --> 23 | <PackageReference Include="Newtonsoft.Json" Version="12.0.3"> 24 | <Pack>false</Pack> 25 | </PackageReference> 26 | </ItemGroup> 27 | 28 | 29 | 30 | 31 | 32 | 33 | ``` 34 | 35 | But in order to simplify this scenario further, both the compatibility `SuppressDependenciesWhenPacking` property as well as a new `PackDependencies` property are supported in [nugetizer](https://www.nuget.org/packages/nugetizer) to achieve the same. 36 | 37 | -------------------------------------------------------------------------------- /dotnet/nuget/use-dotnet-vs-to-get-developer-prompt-in-terminal.md: -------------------------------------------------------------------------------- 1 | # How to add search to static nuget feed 2 | 3 | I have been using [static serverless nuget](https://www.cazzulino.com/serverless-nuget-feed.html) feeds for a while now. [Sleet](https://github.com/emgarten/Sleet) is totally awesome. One missing piece was search: searching a static feed would just return everything, always. [Not anymore](https://github.com/emgarten/Sleet/pull/142)! 4 | 5 | To set it up: 6 | 7 | 1. Fork [Sleet.Search](https://github.com/emgarten/Sleet.Search): this is the Azure function project that will perform the search on your static feeds. You'll host your own in a consumption \(aka serverless\) plan in Azure. 8 | 2. Create an Azure Functions app and follow the [deployment steps from my blog](https://www.cazzulino.com/minimalist-shortlinks.html#deployment): that's basically setting up the simplest CI/CD for a GH repo. 9 | 3. By default, the project allows passing in the static feed "search" url \(which is a blob named under `/search/query` in your sleet blob container\) and requires function authentication. In my case, I wanted a simpler search URL and anonymous access, so I [made that change to Sleet.Search](https://github.com/kzu/Sleet.Search/commit/99c736b). Basically the function can now be accessed at `/query`, without URL-encoding another URL parameter there. 10 | 4. Follow instructions to [enable search for your sleet feed via settings](https://github.com/emgarten/Sleet/blob/master/doc/external-search.md). 11 | 12 | That's it. Now you can configure my feed https://pkg.kzu.io/index.json and it will be fully searchable. Plus, I used [Azure Functions Proxies](https://docs.microsoft.com/en-us/azure/azure-functions/functions-proxies) to make the blob storage URL that much nicer. I even made the query URL nice, since it was trivial too: https://pkg.kzu.io/search 😍. 13 | 14 | -------------------------------------------------------------------------------- /dotnet/persisting-output-files-from-source-generators.md: -------------------------------------------------------------------------------- 1 | # Persisting output files from source generators 2 | 3 | Some time back, I had my own I/O code that [based on some MSBuild property would persist my generated source](https://github.com/kzu/ThisAssembly/commit/65d43b2f6) for troubleshooting. This is no longer needed since you can now set 4 | 5 | ```markup 6 | <EmitCompilerGeneratedFiles>true</EmitCompilerGeneratedFiles> 7 | ``` 8 | 9 | That will emit the sources to the `$(IntermediateOutputPath)/generated/[GeneratorAssembly]/[GeneratorTypeFullName]` folder by default.
--------------------------------------------------------------------------------
/dotnet/quickly-check-c-compiler-and-language-version.md:
--------------------------------------------------------------------------------
# Quickly check C# compiler and language version

Just add `#error version` anywhere in a C# file and you'll see a tooltip with the information:

![](../.gitbook/assets/compilerversion.png)

--------------------------------------------------------------------------------
/dotnet/use-c-9-records-in-non-net5.0-projects.md:
--------------------------------------------------------------------------------
# Use C# 9 records in non-net5.0 projects

The new C# 9 records syntax is quite nice:

```csharp
[DebuggerDisplay("{Name} = {Value}")]
record ResourceValue(string Name, string Value, bool HasFormat)
{
    public bool IsIndexed { get; init; }
    public List<string> Format { get; } = new List<string>();
}
```

When using it in a non-_net5.0_ project (i.e. _netstandard2.0_), [compilation fails with an error](https://github.com/dotnet/roslyn/issues/45510):

```text
Predefined type 'System.Runtime.CompilerServices.IsExternalInit' is not defined or imported
```

To work around the issue, simply declare the missing type in your project:

```csharp
// Licensed to the .NET Foundation under one or more agreements.
// The .NET Foundation licenses this file to you under the MIT license.
// See the LICENSE file in the project root for more information.

using System.ComponentModel;

namespace System.Runtime.CompilerServices
{
    /// <summary>
    /// Reserved to be used by the compiler for tracking metadata.
    /// This class should not be used by developers in source code.
    /// </summary>
    [EditorBrowsable(EditorBrowsableState.Never)]
    sealed class IsExternalInit
    {
    }
}
```

([as seen elsewhere too](https://github.com/dotnet/runtime/blob/master/src/libraries/System.Text.Json/tests/Serialization/IsExternalInit.cs)). Note the class doesn't even need to be public.

If you're multitargeting net5.0 and other TFMs, just add this bit of MSBuild to remove it for net5.0, since it's built-in there:

```markup
<ItemGroup Condition="'$(TargetFramework)' == 'net5.0'">
  <Compile Remove="IsExternalInit.cs" />
</ItemGroup>
```

--------------------------------------------------------------------------------
/dotnet/using-hashcode-in-.netframework.md:
--------------------------------------------------------------------------------
---
description: 'How to use the HashCode type in full .NET, which doesn''t support it'
---

# Using HashCode in .NETFramework

So, I just learned that not even in .net472/net48 can you leverage the cool [HashCode](https://github.com/dotnet/coreclr/blob/master/src/System.Private.CoreLib/shared/System/HashCode.cs) type from .NETCore. It **is** [mentioned in the docs](https://docs.microsoft.com/en-us/dotnet/api/system.hashcode?view=dotnet-plat-ext-3.1) as available in ".NET Platform Extensions 3.1", which isn't quite clear about what it means. After a quick search here and there, I found the [Microsoft.Bcl.HashCode](https://www.nuget.org/packages/Microsoft.Bcl.HashCode/) package, which seems to be one such ".NET platform extension" (although the version # doesn't match either :/).
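Getting it into a classic .NET Framework project is then just a package reference along these lines (the version shown is only an example, check nuget.org for the current one):

```markup
<ItemGroup>
  <PackageReference Include="Microsoft.Bcl.HashCode" Version="1.1.0" />
</ItemGroup>
```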
With that, I can now make a `KeyValuePairComparer` to compare two dictionaries for equality, like:

```csharp
public class KeyValuePairComparer<TKey, TValue> : IEqualityComparer<KeyValuePair<TKey, TValue>>
{
    public static KeyValuePairComparer<TKey, TValue> Default { get; } = new KeyValuePairComparer<TKey, TValue>();

    public bool Equals(KeyValuePair<TKey, TValue> x, KeyValuePair<TKey, TValue> y)
        => object.Equals(x.Key, y.Key) && object.Equals(x.Value, y.Value);

    public int GetHashCode(KeyValuePair<TKey, TValue> obj)
        => HashCode.Combine(obj.Key, obj.Value);
}
```

To be used as:

```csharp
IReadOnlyDictionary<string, string> first = ...
IReadOnlyDictionary<string, string> second = ...

return first.SequenceEqual(second, KeyValuePairComparer<string, string>.Default);
```

--------------------------------------------------------------------------------
/msbuild/detect-ci-builds-for-every-ci-system.md:
--------------------------------------------------------------------------------
# Detect CI builds for every CI system

In order to unify the environment variable used to detect whether a build is a CI build as simply `CI` (as is done already in [GitHub Actions](https://docs.github.com/en/free-pro-team@latest/actions/reference/environment-variables), [Circle CI](https://circleci.com/docs/2.0/env-vars/#built-in-environment-variables) and [GitLab](https://docs.gitlab.com/ee/ci/variables/predefined_variables.html)), you just place this bit of MSBuild in every project's `Directory.Build.props` to make things consistent everywhere:

```markup
<PropertyGroup>
  <CI>false</CI>
  <CI Condition="...">true</CI>
</PropertyGroup>
```

--------------------------------------------------------------------------------
/msbuild/how-to-build-project-when-content-files-change.md:
--------------------------------------------------------------------------------
# How to build project when content files change

Visual Studio will typically just report the project is up-to-date and doesn't need to be built if you just changed a `Content` or `None` item. If you want those file types to also trigger a build, just add the relevant items as `UpToDateCheckInput`:

```markup
<ItemGroup>
  <UpToDateCheckInput Include="@(Content);@(None)" />
</ItemGroup>
```

--------------------------------------------------------------------------------
/msbuild/how-to-get-user-home-dir-cross-platform.md:
--------------------------------------------------------------------------------
# How to get user home dir ~ cross-platform

It would be great if you could just use `$(~)`, say, but that would be too good to be true :).

Here's how you can get the user profile directory in both Unix-like OSes (Linux/Mac) and Windows:

```markup
<PropertyGroup>
  <UserHomeDir Condition="'$(OS)' != 'Windows_NT'">$(HOME)</UserHomeDir>
  <UserHomeDir Condition="'$(OS)' == 'Windows_NT'">$(USERPROFILE)</UserHomeDir>
</PropertyGroup>
```

--------------------------------------------------------------------------------
/msbuild/how-to-include-commit-url-in-nuget-package-description.md:
--------------------------------------------------------------------------------
# How to include commit URL in nuget package description

It's quite helpful to include a [direct link to the source code](https://www.nuget.org/packages/Microsoft.AspnetCore.Authorization) that produced a given package version, so your users can quickly see what's included (i.e. they may be tracking a bug fix).

If you are using [Microsoft.SourceLink](https://www.nuget.org/packages?q=Microsoft.SourceLink), you already have all the information you need automatically populated for you.
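If SourceLink isn't wired up yet, a typical reference looks something like the following (the GitHub flavor and the version are illustrative, pick the provider package that matches your host):

```markup
<ItemGroup>
  <PackageReference Include="Microsoft.SourceLink.GitHub" Version="1.0.0" PrivateAssets="All" />
</ItemGroup>
```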
The following target will update the description just before the nuget is produced:

```markup
$(Description)

Built from $(RepositoryUrl)/tree/$(SourceRevisionId.Substring(0, 9))

$(RepositoryUrl)
$(Description)
```

This will result in a short-ish link (we trim it to 9 chars, which is the common short SHA in Git) to the repo, similar to many [ASP.NET Core](https://www.nuget.org/packages/Microsoft.AspNetCore.Http/) [packages](https://www.nuget.org/packages/Microsoft.AspNetCore.Mvc.Core/).

If you're using [GitInfo](https://www.nuget.org/packages/GitInfo/) instead, you'll have to populate the `RepositoryUrl` yourself, and replace `SourceRevisionId` with `GitSha`.

--------------------------------------------------------------------------------
/msbuild/how-to-include-package-reference-files-in-your-nuget-package.md:
--------------------------------------------------------------------------------
# How to include package reference files in your nuget

This is a particularly common scenario if you're developing MSBuild tasks or Roslyn code analyzers: all the dependencies you use in your task, analyzer or source generator need to be included in your nuget package too, alongside your project's primary output (i.e. under the proper `analyzers`, `tools` or `build` folders). A very convenient way (if not comprehensive, since it won't include transitive dependencies) to do so is by annotating the `PackageReference` items themselves, like:

```markup
<PackageReference Include="..." Pack="true" />
```

The `Pack` metadata value can then be used to include the assets in the package with the following target:

```markup
@(NuGetPackageId -> Distinct())

@(PackageReferenceDependency -> '%(Pack)')

<_PackageFiles Include="@(RuntimeCopyLocalItems)" PackagePath="$(BuildOutputTargetFolder)/$(TargetFramework)/%(Filename)%(Extension)" />
```

Quite a few things to note in the above target that aren't too obvious (a sketch of how the pieces fit together follows the list):

1. We use the `RuntimeCopyLocalItems` item group, which is resolved by `ResolvePackageAssets` and contains the stuff that is needed for the dependency to run (i.e. the actual binaries, not reference assemblies, if it includes them).
2. We use Inputs/Outputs on it so we can batch by `%(RuntimeCopyLocalItems.NuGetPackageId)`: this makes processing simpler inside the target, since we'll be dealing with a single `NuGetPackageId` for each batch, regardless of how many `RuntimeCopyLocalItems` there are.
3. We next get that package ID as a property, and find the `@(PackageReference)` with that ID, to determine whether it needs to be packed or not.
4. Note we use property syntax next, since we know there can be at most one such `@(PackageReferenceDependency)`, in which case we'd get either an empty value or `true` for `%(Pack)`.
5. If we have to pack, we include all the `@(RuntimeCopyLocalItems)` (in the current batch; MSBuild does this for us for free thanks to the Inputs/Outputs) and use as the package path the same one the primary project output will use. These are added directly as `_PackageFiles`, which is what the SDK-style project `Pack` uses.
6. We update the items' metadata so they are also flagged as copy-local.
7. Finally, the `ResolvedFileToPublish` is useful when creating dotnet tools, since packing those is slightly different and includes a publish operation.
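Here is a rough sketch of how the pieces described in the list can fit together. The item names follow the description above, but the sample package reference, the target name and the exact wiring (`BeforeTargets`, `DependsOnTargets`) are illustrative assumptions rather than the original source, and the copy-local metadata update and `ResolvedFileToPublish` handling from points 6 and 7 are left out:

```markup
<ItemGroup>
  <!-- Illustrative dependency: Pack=true opts its runtime assets into the package -->
  <PackageReference Include="Newtonsoft.Json" Version="13.0.3" Pack="true" />
</ItemGroup>

<Target Name="PackReferenceDependencies"
        BeforeTargets="GenerateNuspec"
        DependsOnTargets="ResolvePackageAssets"
        Inputs="@(RuntimeCopyLocalItems)"
        Outputs="%(RuntimeCopyLocalItems.NuGetPackageId)">

  <PropertyGroup>
    <!-- single package id for the current batch (see point 2) -->
    <NuGetPackageId>@(RuntimeCopyLocalItems -> '%(NuGetPackageId)' -> Distinct())</NuGetPackageId>
  </PropertyGroup>

  <ItemGroup>
    <!-- find the PackageReference that produced this batch, carrying its Pack metadata (point 3) -->
    <PackageReferenceDependency Include="@(PackageReference)" Condition="'%(Identity)' == '$(NuGetPackageId)'" />
  </ItemGroup>

  <PropertyGroup>
    <!-- either empty or 'true' (point 4) -->
    <ShouldPack>@(PackageReferenceDependency -> '%(Pack)')</ShouldPack>
  </PropertyGroup>

  <ItemGroup Condition="'$(ShouldPack)' == 'true'">
    <!-- ship the dependency binaries next to the primary output in the package (point 5) -->
    <_PackageFiles Include="@(RuntimeCopyLocalItems)"
                   PackagePath="$(BuildOutputTargetFolder)/$(TargetFramework)/%(Filename)%(Extension)" />
  </ItemGroup>
</Target>
```

Everything keys off `RuntimeCopyLocalItems`, so only the runtime binaries of the annotated dependency (not its reference assemblies) end up in the package, which is what point 1 is about.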
--------------------------------------------------------------------------------
/msbuild/how-to-select-first-item-in-an-itemgroup.md:
--------------------------------------------------------------------------------
# How to select first item in an ItemGroup

I was entirely unaware of this [simple trick](https://gist.github.com/shadow-cs/cb5499b010bdacd1f778be29daf7f04c)! Turns out that when you assign an item's metadata to a property, and there are multiple items, all items are iterated and the property is assigned consecutively for each item, leaving you with the last item's metadata as the property value.

If you want the first item, you reverse the item group using the built-in item function and then assign the property:

```markup
%(Reversed.Identity)
```

--------------------------------------------------------------------------------
/msbuild/modify-all-command-line-builds-in-entire-repo.md:
--------------------------------------------------------------------------------
# Modify all command-line builds in entire repo

I used to place `msbuild.rsp` [response files](https://docs.microsoft.com/en-us/visualstudio/msbuild/msbuild-response-files?view=vs-2019#msbuildrsp) alongside solution and project files to avoid repeating `-nr:false -v:m -bl` (no node reuse, minimal verbosity, binlogs). It was super annoying to have it all over the place. While [browsing another repo](https://github.com/microsoft/vs-streamjsonrpc/blob/master/Directory.Build.rsp), I just learned that since MSBuild 15.6+ you can now have a [Directory.Build.rsp response file](https://docs.microsoft.com/en-us/visualstudio/msbuild/msbuild-response-files?view=vs-2019#directorybuildrsp) that affects every build in every descendant folder. This sounds like a great default one for me:

```text
# See https://docs.microsoft.com/en-us/visualstudio/msbuild/msbuild-response-files
-nr:false
-m:1
-v:m
-clp:Summary;ForceNoAlign
```

Note I also much prefer the `-` syntax over `/` for switches.

--------------------------------------------------------------------------------
/msbuild/modifying-the-build-for-every-solution-in-a-repository.md:
--------------------------------------------------------------------------------
# Modifying the build for every solution in a repository

Just like you can have [Directory.Build.props and Directory.Build.targets](https://docs.microsoft.com/en-us/visualstudio/msbuild/customize-your-build?view=vs-2019#directorybuildprops-and-directorybuildtargets) to customize your projects' build, you can also use `Directory.Solution.props` and `Directory.Solution.targets` to customize your solutions' (command-line) builds. Just like the original (older?) mechanisms, however, [Visual Studio will not load those customizations either](https://docs.microsoft.com/en-us/visualstudio/msbuild/customize-your-build?view=vs-2019#customize-the-solution-build).

In order to inspect how and where they are included in the build, it's useful to use the [troubleshooting technique](https://docs.microsoft.com/en-us/visualstudio/msbuild/how-to-build-specific-targets-in-solutions-by-using-msbuild-exe?view=vs-2019#troubleshooting) of setting the envvar `MSBUILDEMITSOLUTION=1` and running a build. You can then inspect the _.metaproj_ MSBuild project generated from the solution, where you will see the imported projects.

For a Directory.Solution.props with:

```markup
from-solution.props
```

And a Directory.Solution.targets with:

```markup
from-solution.targets
```

You will see a _.metaproj_ similar to:

```markup
C:\Program Files\dotnet\sdk\5.0.100-rc.2.20479.15\Roslyn
<_DirectorySolutionPropsFile>Directory.Solution.props
<_DirectorySolutionPropsBasePath>C:\Code\kzu\moq
C:\Code\kzu\moq\Directory.Solution.props
from-solution.props
Debug
Any CPU
...
<_DirectorySolutionTargetsFile>Directory.Solution.targets
<_DirectorySolutionTargetsBasePath>C:\Code\kzu\moq
C:\Code\kzu\moq\Directory.Solution.targets
from-solution.targets
...
```

Notice how the targets aren't imported but rather embedded in specific places (top of the property group for .props-declared properties, bottom of the property group for .targets-declared properties, and before the Build target for targets). If you have properties, they are actually evaluated even before being embedded in the file, i.e.:

```markup
$([System.DateTime]::Now)
```

is embedded in the .metaproj as the actually evaluated value, such as:

```markup
10/21/2020 4:36:30 AM
```

--------------------------------------------------------------------------------
/msbuild/write-entire-xml-fragments-in-msbuild-with-xmlpoke.md:
--------------------------------------------------------------------------------
# Write entire XML fragments in MSBuild with XmlPoke

I recently needed to write an entire project file via a targets file (weird, I know). I was dreading all the crazy angle-bracket escaping as `&lt;` and `&gt;` when I decided to check the [latest official docs on XmlPoke](https://docs.microsoft.com/en-us/visualstudio/msbuild/xmlpoke-task?view=vs-2019) to refresh the parameters and format. To my surprise, I found this example:

```markup
```

Notice how the `$(Namespace)` property has **beautiful** XML inside. Which only makes sense, since there had to be **some** advantage in MSBuild being XML, right? So I figured, if the namespaces can be a full nested XML element, could the `Value` be too? And the answer was a [resounding YEAHHH](https://github.com/kzu/SmallSharp/blob/main/src/SmallSharp/SmallSharp.targets#L88-L100):

```markup
$(StartupFile)
```

That writes an entire `<PropertyGroup>` element into a .user project file!

For the attentive reader: you'd think I made a mistake, since, in XML-land, elements inherit the XML namespace of their parent element. Seems like the XML namespace management in XmlPoke is a bit on the loose side of things: the above snippets are in a .targets file without any xmlns (it's an SDK-style `<Project>` without a namespace, to keep it clean). Yet, the property group is added without messing up the namespace:

```markup
Program.cs
```

If it had inserted it without a namespace, per the XML spec, it should have added an `xmlns=""` to clear the parent Project node's namespace. But alas, it didn't, and in this particular case, it makes it way more convenient this way :).

I'm using this in [SmallSharp](https://github.com/kzu/SmallSharp) to initialize the .user project options properly, with the right startup file for multi-startup top-level-statement scripts in a single project :).

--------------------------------------------------------------------------------
/styles/website.css:
--------------------------------------------------------------------------------
p a {
    border-bottom: dashed 1px blue !important;
}

--------------------------------------------------------------------------------
/testing/conditional-unit-tests.md:
--------------------------------------------------------------------------------
---
description: >-
  How to run xunit tests conditionally depending on the environment that is
  running the tests
---

# Conditional unit tests

Turns out that there isn't any conditional `FactAttribute` in [xunit](https://github.com/xunit/xunit), but you can pretty much copy wholesale the extensions created by the [ASP.NET Core team from their repo](https://github.com/dotnet/runtime/tree/master/src/libraries/Common/tests/Extensions/TestingUtils/Microsoft.AspNetCore.Testing/src/xunit).

It allows you to do things like:

```csharp
[SkipOnCI]
[ConditionalFact]
public async Task SomeTest()
```

to avoid running a test entirely if you're on CI. The implementation didn't take into account CI builds running via GH actions/workflows, so I tweaked the [OnCI() method](https://github.com/dotnet/runtime/blob/master/src/libraries/Common/tests/Extensions/TestingUtils/Microsoft.AspNetCore.Testing/src/xunit/SkipOnCIAttribute.cs#L38) like so, to also consider the commonly used `CI` envvar:

```csharp
public static bool OnCI() =>
    (bool.TryParse(Environment.GetEnvironmentVariable("CI"), out var ci) && ci) ||
    OnHelix() || OnAzdo();
```

I didn't think it was necessary to create an `OnGitHub` method, since the envvar is just `CI` and, well, the method is already called `OnCI` :)

Another alternative is using the [Xunit.SkippableFact nuget package](https://github.com/AArnott/Xunit.SkippableFact).

--------------------------------------------------------------------------------
/testing/skip-tagged-scenarios-in-specflow-with-xunit.md:
--------------------------------------------------------------------------------
---
description: How to skip scenarios with a certain tag from executing
---

# Skip tagged scenarios in SpecFlow with Xunit

First apply a tag of your choosing to flag scenarios that are still not ready for running:

```text
Feature: Skipping scenarios by tag

@Draft
Scenario: This is not ready yet
    Given Something not done yet
    Then It should not fail the CI build (yet)
```

Now, in the `[Binding]` class for any of the steps in the scenario, create a constructor that receives the `ScenarioContext` as follows:

```csharp
public Steps(ScenarioContext context)
{
    Skip.If(context.ScenarioInfo.Tags.Contains("Draft"));
}
```

That's it. Turns out that the generated test for a scenario is annotated with `[SkippableFact]`, so you can just skip from anywhere during the test run. This comes from the [SkippableFact](https://www.nuget.org/packages/Xunit.SkippableFact/) package.

--------------------------------------------------------------------------------