├── .github └── workflows │ ├── deploy.yml │ └── readme.yml ├── .gitignore ├── 1password └── use-specific-ssh-keys.md ├── README.md ├── astro └── generate-a-static-svg-sprite-sheet.md ├── bash ├── control-terminal-appearance-with-tput.md └── run-commands-when-a-shell-script-exits.md ├── blender └── export-a-blender-file-to-glb-from-the-command-line.md ├── c └── prevent-clangformat-from-breaking-before-curly-braces.md ├── caddy ├── run-a-local-static-file-server-over-https.md ├── run-an-https-reverse-proxy-for-local-development.md └── serve-markdown-files-as-html.md ├── css ├── anchor-scroll-position-to-the-bottom.md ├── dynamically-change-styles-from-html-or-jsx.md ├── make-a-css-variable-color-translucent.md ├── set-default-styles-for-tags.md ├── swap-between-black-and-white-text-based-on-background-color.md ├── use-css-variables-in-a-dialog-backdrop.md └── use-grid-template-to-set-grid-columns-rows-and-areas.md ├── django └── conditionally-extend-a-template.md ├── docker ├── add-root-certificates-to-a-debian-container.md └── fix-at-least-one-invalid-signature-was-encountered.md ├── entr ├── reload-a-webpage-when-a-file-changes.md └── rerun-a-command-when-files-change.md ├── esbuild └── run-a-development-server-with-live-reload.md ├── fzf └── make-a-tui-for-switching-and-deleting-git-branches.md ├── git ├── add-a-git-hook-on-windows.md ├── add-a-global-gitignore.md ├── list-all-files-tracked-by-git.md ├── run-a-command-if-there-are-unstaged-changes.md └── update-all-submodules-to-latest-commit-on-origin.md ├── github ├── run-github-actions-locally.md ├── trigger-a-workflow-run-in-another-repo.md └── write-an-inline-script-in-a-github-actions-workflow.md ├── gltf └── extract-data-from-a-mesh-primitive.md ├── html └── define-a-custom-element.md ├── htmx ├── attach-attributes-to-dynamically-added-elements.md └── load-modal-content-when-shoelace-dialog-opens.md ├── javascript ├── access-css-variables-from-javascript.md ├── 
load-a-user-created-javascript-file-in-the-browser.md └── programmatically-create-svg-elements.md ├── logic └── send-stereo-output-through-specific-output-channels.md ├── make ├── build-all-files-with-a-given-extension.md └── list-all-commands-in-a-makefile.md ├── math ├── 3d-coordinate-systems.png ├── check-whether-an-angle-is-between-two-other-angles.md ├── convert-between-3d-coordinate-systems.md ├── find-a-point-on-a-sphere.md └── rotate-a-point-around-a-circle.md ├── nextjs └── dont-server-side-render-a-client-component.md ├── nodejs ├── execute-typescript-files-in-nodejs.md └── get-dirname-in-esm.md ├── pnpm └── patch-a-node_modules-dependency.md ├── poetry └── create-venv-folders-within-projects.md ├── prosemirror ├── prevent-extra-whitespace-in-nodeviews.md └── use-a-svelte-component-as-a-nodeview.md ├── redbean └── bundle-files-into-a-redbean-zip-archive.md ├── rsbuild └── migrate-from-create-react-app.md ├── rust └── link-against-a-cpp-file.md ├── svelte ├── bail-out-of-a-reactive-block.md └── force-reactive-state-to-reevaluate.md ├── svg └── create-an-svg-sprite-sheet.md ├── systemd └── set-up-a-basic-service.md ├── tailwind ├── style-conditionally-based-on-data-and-aria-attributes.md └── style-shadow-trees-from-the-light-dom.md ├── threejs ├── display-an-objects-bounding-box.md ├── pick-objects-with-the-mouse-cursor.md └── set-default-camera-in-react-three-fiber.md ├── typescript ├── add-custom-element-to-jsx-intrinsic-elements.md ├── assert-that-a-variable-is-not-null-or-undefined.md ├── tsconfig-flags-to-prevent-common-errors.md ├── type-concrete-subclasses-of-an-abstract-class.md └── types-and-variables-can-share-names.md ├── valibot └── parse-a-date-or-string-into-a-date.md ├── volta └── install-volta-with-homebrew.md └── windows └── create-a-file-without-an-extension.md /.github/workflows/deploy.yml: -------------------------------------------------------------------------------- 1 | name: Deploy 2 | on: 3 | push: 4 | branches: 5 | - main 6 | 7 | 
jobs: 8 | deploy: 9 | runs-on: ubuntu-latest 10 | steps: 11 | - name: Dispatch to workflows 12 | run: | 13 | curl -H "Accept: application/vnd.github.everest-preview+json" \ 14 | -H "Authorization: token ${{ secrets.PERSONAL_ACCESS_TOKEN }}" \ 15 | --request POST \ 16 | --data '{ "event_type": "deploy" }' https://api.github.com/repos/jakelazaroff/todayilearned/dispatches 17 | -------------------------------------------------------------------------------- /.github/workflows/readme.yml: -------------------------------------------------------------------------------- 1 | # This script creates a nice README.md with an index of all TIL files. 2 | # TIL files are Markdown files named anything other than README.md. 3 | # 4 | # The readme is split into two sections: the "header" and the "index", 5 | # separated by three hyphens on their own line. 6 | # Anything above the three hyphens is the "header" and will be kept as is. 7 | # Anything below will be replaced by the "index". 8 | 9 | name: Build README 10 | 11 | on: 12 | push: 13 | branches: 14 | - main 15 | 16 | jobs: 17 | readme: 18 | runs-on: ubuntu-latest 19 | steps: 20 | - name: Check out repo 21 | uses: actions/checkout@v3 22 | with: 23 | fetch-depth: 0 # get full history or else it'll be overwritten 24 | 25 | - name: Regenerate README 26 | uses: actions/github-script@v6 27 | with: 28 | script: | 29 | const { readFile, writeFile } = require("node:fs/promises"); 30 | 31 | console.log("Building index…"); 32 | 33 | // load readme 34 | let readme = await readFile("README.md").then(file => file.toString()); 35 | 36 | // add header separator 37 | const separator = "\n---\n"; 38 | const index = readme.indexOf(separator); 39 | if (index === -1) readme += separator; 40 | else readme = readme.substring(0, index + separator.length); 41 | 42 | // collect entries 43 | const files = await glob.create("./**/*.md").then(globber => globber.glob()); 44 | const entries = files 45 | .filter(name => !name.endsWith("/README.md")) // 
exclude README.md 46 | .sort() 47 | .map(name => name.split("/").slice(-2)); 48 | 49 | // add summary 50 | readme += `\n${entries.length} TILs so far:\n`; 51 | 52 | // create category map 53 | const categories = new Map(); 54 | for (const [category, file] of entries) { 55 | const list = categories.get(category) || []; 56 | categories.set(category, [...list, file]); 57 | } 58 | 59 | // create a section for each category 60 | for (const [category, entries] of categories.entries()) { 61 | // write category header 62 | readme += `\n## ${category}\n\n`; 63 | 64 | // write link for each file 65 | for (const file of entries) { 66 | const filepath = [category, file].join("/"); 67 | const contents = await readFile(filepath).then(file => file.toString()); 68 | const [, title] = contents.match(/^# (.+)$/m); 69 | readme += `- [${title}](/${filepath})\n`; 70 | } 71 | } 72 | 73 | // write readme 74 | await writeFile("README.md", readme); 75 | 76 | - name: Commit and push if README changed 77 | run: |- 78 | git config --global user.email "actions@users.noreply.github.com" 79 | git config --global user.name "tilbot" 80 | git diff --quiet || git commit --all --message "Update README.md" 81 | git push 82 | -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | .DS_Store 2 | -------------------------------------------------------------------------------- /1password/use-specific-ssh-keys.md: -------------------------------------------------------------------------------- 1 | # Use specific SSH keys 2 | 3 | One reason I use 1Password over Apple Passwords is that it also [manages your SSH keys](https://developer.1password.com/docs/ssh/). 4 | 5 | I recently got a new job, and because _all_ my passwords live in 1Password (including employee benefits and such, which use my personal email) I logged into my personal vault on my work computer. 
6 | 7 | That meant both my personal computer's SSH key (from my personal vault) and my work computer's SSH key (from my employer's vault) were synced on that computer. And annoyingly, 1Password kept asking if I wanted to use my personal computer's SSH key on my work computer. 8 | 9 | It turns out that you can fix that! In the menu on the top right of an SSH key in 1Password, there's an option to "Configure for SSH Agent…". That opens a file called `agent.toml` in your text editor. 10 | 11 | Most of the file is a long comment explaining how it works, and it's also [painstakingly documented online](https://developer.1password.com/docs/ssh/agent/config/), but here's the gist: 12 | 13 | ```toml 14 | [[ssh-keys]] 15 | account = "My Work Account Name" 16 | ``` 17 | 18 | Basically, you have as many of those `[[ssh-keys]]` items as you want with the criteria for which keys to include. You can specify them by `account`, `vault` or `item`. In my case, it's enough to just specify the account, but if the company leans more into 1Password I could get even more specific. 19 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Today I Learned 2 | 3 | A collection of useful things I've learned. Inspired by [simonw/til](https://github.com/simonw/til). 4 | 5 | See the full website at [til.jakelazaroff.com](https://til.jakelazaroff.com). 6 | 7 | Subscribe via [RSS](https://til.jakelazaroff.com/rss.xml)! 
8 | 9 | --- 10 | 11 | 72 TILs so far: 12 | 13 | ## 1password 14 | 15 | - [Use specific SSH keys](/1password/use-specific-ssh-keys.md) 16 | 17 | ## astro 18 | 19 | - [Generate a static SVG sprite sheet](/astro/generate-a-static-svg-sprite-sheet.md) 20 | 21 | ## bash 22 | 23 | - [Control terminal appearance with `tput`](/bash/control-terminal-appearance-with-tput.md) 24 | - [Run commands when a shell script exits](/bash/run-commands-when-a-shell-script-exits.md) 25 | 26 | ## blender 27 | 28 | - [Export a Blender file to GLB from the command line](/blender/export-a-blender-file-to-glb-from-the-command-line.md) 29 | 30 | ## c 31 | 32 | - [Prevent ClangFormat from breaking before curly braces](/c/prevent-clangformat-from-breaking-before-curly-braces.md) 33 | 34 | ## caddy 35 | 36 | - [Run a local static file server over HTTPS](/caddy/run-a-local-static-file-server-over-https.md) 37 | - [Run an HTTPS reverse proxy for local development](/caddy/run-an-https-reverse-proxy-for-local-development.md) 38 | - [Serve Markdown files as HTML](/caddy/serve-markdown-files-as-html.md) 39 | 40 | ## css 41 | 42 | - [Anchor scroll position to the bottom](/css/anchor-scroll-position-to-the-bottom.md) 43 | - [Dynamically change styles from HTML or JSX](/css/dynamically-change-styles-from-html-or-jsx.md) 44 | - [Make a CSS variable color translucent](/css/make-a-css-variable-color-translucent.md) 45 | - [Set default styles for tags](/css/set-default-styles-for-tags.md) 46 | - [Swap between black and white text based on background color](/css/swap-between-black-and-white-text-based-on-background-color.md) 47 | - [Use CSS variables in a `<dialog>` backdrop](/css/use-css-variables-in-a-dialog-backdrop.md) 48 | - [Use `grid-template` to set grid columns, rows and areas](/css/use-grid-template-to-set-grid-columns-rows-and-areas.md) 49 | 50 | ## django 51 | 52 | - [Conditionally extend a template](/django/conditionally-extend-a-template.md) 53 | 54 | ## docker 55 | 56 | - [Add root certificates to a
Debian container](/docker/add-root-certificates-to-a-debian-container.md) 57 | - [Fix "At least one invalid signature was encountered"](/docker/fix-at-least-one-invalid-signature-was-encountered.md) 58 | 59 | ## entr 60 | 61 | - [Reload a webpage when a file changes](/entr/reload-a-webpage-when-a-file-changes.md) 62 | - [Rerun a command when files change](/entr/rerun-a-command-when-files-change.md) 63 | 64 | ## esbuild 65 | 66 | - [Run a development server with live reload](/esbuild/run-a-development-server-with-live-reload.md) 67 | 68 | ## fzf 69 | 70 | - [Make a TUI for switching and deleting git branches](/fzf/make-a-tui-for-switching-and-deleting-git-branches.md) 71 | 72 | ## git 73 | 74 | - [Add a Git hook on Windows](/git/add-a-git-hook-on-windows.md) 75 | - [Add a global `.gitignore`](/git/add-a-global-gitignore.md) 76 | - [List all files tracked by git](/git/list-all-files-tracked-by-git.md) 77 | - [Run a command if there are unstaged changes](/git/run-a-command-if-there-are-unstaged-changes.md) 78 | - [Update all submodules to latest commit on origin](/git/update-all-submodules-to-latest-commit-on-origin.md) 79 | 80 | ## github 81 | 82 | - [Run GitHub Actions locally](/github/run-github-actions-locally.md) 83 | - [Trigger a workflow run in another repo](/github/trigger-a-workflow-run-in-another-repo.md) 84 | - [Write an inline script in a GitHub Actions workflow](/github/write-an-inline-script-in-a-github-actions-workflow.md) 85 | 86 | ## gltf 87 | 88 | - [Extract data from a mesh primitive](/gltf/extract-data-from-a-mesh-primitive.md) 89 | 90 | ## html 91 | 92 | - [Define a custom element](/html/define-a-custom-element.md) 93 | 94 | ## htmx 95 | 96 | - [Attach attributes to dynamically added elements](/htmx/attach-attributes-to-dynamically-added-elements.md) 97 | - [Load modal content when a Shoelace dialog opens](/htmx/load-modal-content-when-shoelace-dialog-opens.md) 98 | 99 | ## javascript 100 | 101 | - [Access CSS variables from 
JavaScript](/javascript/access-css-variables-from-javascript.md) 102 | - [Load a user-created JavaScript file in the browser](/javascript/load-a-user-created-javascript-file-in-the-browser.md) 103 | - [Programmatically create SVG elements](/javascript/programmatically-create-svg-elements.md) 104 | 105 | ## logic 106 | 107 | - [Send stereo output through specific output channels](/logic/send-stereo-output-through-specific-output-channels.md) 108 | 109 | ## make 110 | 111 | - [Build all files with a given extension](/make/build-all-files-with-a-given-extension.md) 112 | - [List all commands in a Makefile](/make/list-all-commands-in-a-makefile.md) 113 | 114 | ## math 115 | 116 | - [Check whether an angle is between two other angles](/math/check-whether-an-angle-is-between-two-other-angles.md) 117 | - [Convert between 3D coordinate systems](/math/convert-between-3d-coordinate-systems.md) 118 | - [Find a point on a sphere](/math/find-a-point-on-a-sphere.md) 119 | - [Rotate a point around a circle](/math/rotate-a-point-around-a-circle.md) 120 | 121 | ## nextjs 122 | 123 | - [Don't server-side render a client component](/nextjs/dont-server-side-render-a-client-component.md) 124 | 125 | ## nodejs 126 | 127 | - [Execute TypeScript files in Node.js](/nodejs/execute-typescript-files-in-nodejs.md) 128 | - [Get `__dirname` in ESM](/nodejs/get-dirname-in-esm.md) 129 | 130 | ## pnpm 131 | 132 | - [Patch a `node_modules` dependency](/pnpm/patch-a-node_modules-dependency.md) 133 | 134 | ## poetry 135 | 136 | - [Create `.venv` folders within projects](/poetry/create-venv-folders-within-projects.md) 137 | 138 | ## prosemirror 139 | 140 | - [Prevent extra whitespace in NodeViews](/prosemirror/prevent-extra-whitespace-in-nodeviews.md) 141 | - [Use a Svelte component as a NodeView](/prosemirror/use-a-svelte-component-as-a-nodeview.md) 142 | 143 | ## redbean 144 | 145 | - [Bundle files into a redbean zip archive](/redbean/bundle-files-into-a-redbean-zip-archive.md) 146 | 147 | ## rsbuild 
148 | 149 | - [Migrate from Create React App](/rsbuild/migrate-from-create-react-app.md) 150 | 151 | ## rust 152 | 153 | - [Link against a C++ file](/rust/link-against-a-cpp-file.md) 154 | 155 | ## svelte 156 | 157 | - [Bail out of a reactive block](/svelte/bail-out-of-a-reactive-block.md) 158 | - [Force reactive state to reevaluate](/svelte/force-reactive-state-to-reevaluate.md) 159 | 160 | ## svg 161 | 162 | - [Create an SVG sprite sheet](/svg/create-an-svg-sprite-sheet.md) 163 | 164 | ## systemd 165 | 166 | - [Set up a basic service](/systemd/set-up-a-basic-service.md) 167 | 168 | ## tailwind 169 | 170 | - [Style conditionally based on data and ARIA attributes](/tailwind/style-conditionally-based-on-data-and-aria-attributes.md) 171 | - [Style shadow trees from the light DOM](/tailwind/style-shadow-trees-from-the-light-dom.md) 172 | 173 | ## threejs 174 | 175 | - [Display an object's bounding box](/threejs/display-an-objects-bounding-box.md) 176 | - [Pick objects with the mouse cursor](/threejs/pick-objects-with-the-mouse-cursor.md) 177 | - [Set default camera in React Three Fiber](/threejs/set-default-camera-in-react-three-fiber.md) 178 | 179 | ## typescript 180 | 181 | - [Add custom element to `JSX.IntrinsicElements`](/typescript/add-custom-element-to-jsx-intrinsic-elements.md) 182 | - [Assert that a variable is not `null` or `undefined`](/typescript/assert-that-a-variable-is-not-null-or-undefined.md) 183 | - [tsconfig flags to prevent common errors](/typescript/tsconfig-flags-to-prevent-common-errors.md) 184 | - [Type concrete subclasses of an abstract class](/typescript/type-concrete-subclasses-of-an-abstract-class.md) 185 | - [Types and variables can share names](/typescript/types-and-variables-can-share-names.md) 186 | 187 | ## valibot 188 | 189 | - [Parse a `Date` or `string` into a `Date`](/valibot/parse-a-date-or-string-into-a-date.md) 190 | 191 | ## volta 192 | 193 | - [Install Volta with Homebrew](/volta/install-volta-with-homebrew.md) 194 | 195 | ## 
windows 196 | 197 | - [Create a file without an extension](/windows/create-a-file-without-an-extension.md) 198 | -------------------------------------------------------------------------------- /astro/generate-a-static-svg-sprite-sheet.md: -------------------------------------------------------------------------------- 1 | # Generate a static SVG sprite sheet 2 | 3 | Since learning about SVG sprite sheets, I've used them rather than SVGs inlined in HTML. 4 | 5 | Generally, my process looks something like this: 6 | 7 | 1. Put all my SVGs in a folder. 8 | 2. Run [`svg-sprite`](https://www.npmjs.com/package/svg-sprite) on the command line to combine them into a single SVG sprite. 9 | 3. Include them in HTML with the `<use>` tag. 10 | 11 | I usually run it out of band, immediately before actually building the website. My build command ends up looking something like `pnpm svg && pnpm build` (where those are both `package.json` scripts that build the sprite sheet and the rest of the site). 12 | 13 | No longer! At least for Astro sites. Arne Bahlo has a great tutorial on [statically generating open graph images using Astro API routes](https://arne.me/articles/static-og-images-in-astro), and I realized that the same technique will work to generate SVG sprite sheets. 14 | 15 | The key insight is that in static mode, [API routes](https://docs.astro.build/en/core-concepts/endpoints/#server-endpoints-api-routes) get rendered to static files. So this is roughly the same workflow, except `svg-sprite` gets called programmatically instead of on the command line. 16 | 17 | Enough introduction! Let's get to it. 18 | 19 | First, install `svg-sprite` (and `@types/svg-sprite` if you're using TypeScript).
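The post doesn't show the install command itself; with pnpm (which the build scripts mentioned above already use), it would presumably look something like this:

```sh
pnpm add --save-dev svg-sprite @types/svg-sprite
```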
20 | 21 | Then, add this `icons.svg.js` file to your `pages` directory: 22 | 23 | ```js 24 | import { readdir, readFile } from "node:fs/promises"; 25 | import { resolve } from "node:path"; 26 | import SVGSpriter from "svg-sprite"; 27 | 28 | const ICON_DIR = "./assets/icons"; 29 | 30 | export const GET = async function GET() { 31 | // create an `svg-sprite` instance 32 | const spriter = new SVGSpriter({ mode: { symbol: true } }); 33 | 34 | // add all the svgs 35 | for (const svg of await readdir(ICON_DIR)) { 36 | const path = resolve(ICON_DIR, svg); 37 | spriter.add(path, svg, await readFile(path).then(file => file.toString())); 38 | } 39 | 40 | // compile the svgs into a sprite sheet 41 | const { result } = await spriter.compileAsync(); 42 | 43 | // respond with the compiled svg 44 | const svg = result.symbol.sprite.contents; 45 | return new Response(svg, { headers: { "content-type": "image/svg+xml" } }); 46 | }; 47 | ``` 48 | 49 | Every time there's a request to `/icons.svg`, that will read all the SVGs in `/assets/icons`, compile them to a sprite sheet and respond with it. That sounds inefficient, but remember that "every time there's a request" only happens in development; for production, the handler runs once at build time and the generated SVG file is written to disk. 50 | 51 | To actually use the resulting sprite, I have this Astro component: 52 | 53 | ```astro 54 | --- 55 | interface Props { 56 | icon: string; 57 | size?: number; 58 | } 59 | 60 | const { icon, size = 16 } = Astro.props; 61 | --- 62 | 63 | <svg width={size} height={size}> 64 |   <use href={`/icons.svg#${icon}`} /> 65 | </svg> 66 | ``` 67 | 68 | The `icon` prop is equal to whatever the file name of the original SVG was, minus the extension.
So you'd use it like this: 69 | 70 | ```astro 71 | <Icon icon="arrow" /> 72 | ``` 73 | -------------------------------------------------------------------------------- /bash/control-terminal-appearance-with-tput.md: -------------------------------------------------------------------------------- 1 | # Control terminal appearance with `tput` 2 | 3 | To control things like text color and cursor position in a terminal emulator, you need to use [ANSI escape codes](https://en.wikipedia.org/wiki/ANSI_escape_code). These are mostly-inscrutable character sequences that can be printed to perform specific actions. 4 | 5 | For example, here's how to print some text in red followed by some text in black: 6 | 7 | ```bash 8 | echo "\033[0;31m" red text "\033[0;30m" black text 9 | ``` 10 | 11 | The strings `\033[0;31m` and `\033[0;30m` are the escape codes for red and black foregrounds, respectively. 12 | 13 | One problem is that not all terminals support all codes. There's a database called [terminfo](https://en.wikipedia.org/wiki/Terminfo) that describes the capabilities of each terminal, determined by an identifier (you can see which identifier your terminal is using by running `echo $TERM`). Each terminal's capabilities are identified by "capnames". 14 | 15 | Luckily, there's a command to handle all this called [`tput`](https://www.gnu.org/software/termutils/manual/termutils-2.0/html_chapter/tput_1.html). Here's how to use it to print the same text as the previous example: 16 | 17 | ```bash 18 | echo "$(tput setaf 1)" red text "$(tput setaf 0)" black text 19 | ``` 20 | 21 | `setaf` is the capname for changing the foreground color, and 1 and 0 are the standard color codes for red and black. I'm not sure whether there's a list mapping human-readable color names to numbers, but you can easily fix that with shell variables: 22 | 23 | ```bash 24 | RED=1 25 | BLACK=0 26 | echo "$(tput setaf $RED)" red text "$(tput setaf $BLACK)" black text 27 | ``` 28 | 29 | `tput` can do more than set colors.
For example, you might run `tput civis` and `tput cnorm` to hide the cursor during a progress animation, then show it again after: 30 | 31 | ```bash 32 | # hide the cursor 33 | tput civis 34 | 35 | # fancy progress animation code goes here 36 | 37 | # show the cursor 38 | tput cnorm 39 | ``` -------------------------------------------------------------------------------- /bash/run-commands-when-a-shell-script-exits.md: -------------------------------------------------------------------------------- 1 | # Run commands when a shell script exits 2 | 3 | Sometimes it's useful to run commands when a shell script exits. For example, we might want to hide the cursor while playing a progress animation, and show it again when finished. But what if the user hits `ctrl-c` to exit the application? If we don't explicitly re-enable the cursor, it'll remain hidden even after the script has exited. 4 | 5 | A bit of searching turned up the [`trap`](https://man7.org/linux/man-pages/man1/trap.1p.html) command, which runs another command when the program receives a signal from the operating system. 6 | 7 | The syntax is `trap [action condition...]`. Here's an example of running some cleanup code when the program receives `SIGINT` (which is triggered by `ctrl-c`) or `SIGTERM`: 8 | 9 | ```bash 10 | cleanup () { 11 | # commands go here 12 | exit 13 | } 14 | 15 | trap cleanup INT TERM 16 | ``` -------------------------------------------------------------------------------- /blender/export-a-blender-file-to-glb-from-the-command-line.md: -------------------------------------------------------------------------------- 1 | # Export a Blender file to GLB from the command line 2 | 3 | Blender includes a [command line interface](https://docs.blender.org/manual/en/latest/advanced/command_line/index.html). 
The included arguments mostly have to do with rendering, but it also provides a flag to execute a Python expression, which makes [its entire Python API](https://docs.blender.org/api/current/index.html) available. 4 | 5 | Here's a one-liner that takes a `.blend` file and exports it to a binary glTF (`.glb`) file: 6 | 7 | ```sh 8 | /Applications/Blender.app/Contents/MacOS/Blender -b input.blend \ 9 | --python-expr "import bpy; bpy.ops.export_scene.gltf(filepath='output.glb')" 10 | ``` 11 | 12 | - `/Applications/Blender.app/Contents/MacOS/Blender` is the path to the Blender executable. 13 | - `-b` tells Blender to run headlessly. 14 | - `input.blend` is the input file. 15 | - `--python-expr` is the command line flag that has Blender run the expression as Python. 16 | - `import bpy` imports the Blender library into the Python script. 17 | - [`bpy.ops.export_scene.gltf`](https://docs.blender.org/api/current/bpy.ops.export_scene.html#bpy.ops.export_scene.gltf) is the Blender API function that exports the scene as a binary glTF. 18 | - `filepath='output.glb'` tells Blender to save the exported file as `output.glb`. 19 | -------------------------------------------------------------------------------- /c/prevent-clangformat-from-breaking-before-curly-braces.md: -------------------------------------------------------------------------------- 1 | # Prevent ClangFormat from breaking before curly braces 2 | 3 | For some unfathomable reason, C and C++ developers sometimes put line breaks before curly braces: 4 | 5 | ```c 6 | int main() 7 | { 8 | for (int i = 0; i < 10; i++) 9 | { 10 | printf("%d\n", i); 11 | } 12 | } 13 | ``` 14 | 15 | It's also the default formatting style in the code formatter [ClangFormat](https://clang.llvm.org/docs/ClangFormat.html). 
16 | 17 | Fortunately, it's configurable by setting the [`BreakBeforeBraces` option](https://clang.llvm.org/docs/ClangFormatStyleOptions.html#breakbeforebraces) to `Attach` in your `.clang-format` file: 18 | 19 | ``` 20 | BreakBeforeBraces: Attach 21 | ``` 22 | 23 | Much more readable: 24 | 25 | ```c 26 | int main() { 27 | for (int i = 0; i < 10; i++) { 28 | printf("%d\n", i); 29 | } 30 | } 31 | ``` 32 | -------------------------------------------------------------------------------- /caddy/run-a-local-static-file-server-over-https.md: -------------------------------------------------------------------------------- 1 | # Run a local static file server over HTTPS 2 | 3 | It feels like every server-side runtime has a one-shot command to run a local web server. Node has `npx http-server`, Python has `python -m http.server`, etc. That works fine if you can serve your files over HTTP, but what if you need to use HTTPS? 4 | 5 | [Caddy](https://caddyserver.com) is a web server written in Go. It made a name for itself by being the first (I think?) to automatically handle SSL certificates via Let's Encrypt. On macOS, you can install it via [Homebrew](https://formulae.brew.sh/formula/caddy), which adds a command you can run from anywhere. 6 | 7 | Here's how to use Caddy to serve static files in your current directory over HTTPS: 8 | 9 | ```bash 10 | caddy file-server --domain localhost 11 | ``` 12 | 13 | Yup, [that's it](https://caddyserver.com/docs/quick-starts/https#the-file-server-command). There are a few more options for the [`file-server`](https://caddyserver.com/docs/quick-starts/static-files) subcommand if you need them, but an HTTPS static file server is a single short command with no config. How cool is that? 
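If you'd rather commit the setup to a project instead of remembering the flag, the same thing can be expressed as a two-line Caddyfile (my sketch, not from the post) and started with `caddy run`:

```caddy
localhost

file_server
```

The bare `localhost` site address is what triggers Caddy's automatic local HTTPS; `file_server` then serves the current directory, just like the subcommand.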
-------------------------------------------------------------------------------- /caddy/run-an-https-reverse-proxy-for-local-development.md: -------------------------------------------------------------------------------- 1 | # Run an HTTPS reverse proxy for local development 2 | 3 | Really the title should be something like "run an SSL-terminating reverse proxy", but most of the resources on the subject talk about using "HTTPS for local development", so here we are. 4 | 5 | The standard advice is to use a tool called [mkcert](https://github.com/FiloSottile/mkcert), which installs a root CA on your computer and issues locally trusted certificates for local domains. That works, but the drawback is that your app now needs to deal with SSL termination. It's not the end of the world, but in production SSL termination probably happens at the network layer before your app receives any requests. 6 | 7 | That's where web-server-Swiss-army-knife Caddy comes in! Rather than using HTTPS for your local development server, have Caddy handle the SSL stuff and proxy the request back to your local development server, which (none the wiser) serves plain old unencrypted HTTP. 8 | 9 | Unsurprisingly, [Caddy's documentation](https://caddyserver.com/docs/quick-starts/reverse-proxy) reveals that there's a one-liner to do this: 10 | 11 | ```sh 12 | caddy reverse-proxy --to http://localhost:8080 13 | ``` 14 | 15 | Replace `8080` with whatever your development server's local port is. Boom: an SSL-terminating reverse proxy (or, if you prefer, "HTTPS") in local development, with no changes to your code! 16 | -------------------------------------------------------------------------------- /caddy/serve-markdown-files-as-html.md: -------------------------------------------------------------------------------- 1 | # Serve Markdown files as HTML 2 | 3 | [Caddy](https://caddyserver.com) is a simple web server that nonetheless has a bunch of cool features.
One example is pre-processing responses using Go's [`text/template` package](https://pkg.go.dev/text/template). 4 | 5 | Here's a Caddyfile that uses a template to render `.md` files as HTML: 6 | 7 | ```caddy 8 | localhost:8080 9 | 10 | file_server /* 11 | templates 12 | 13 | @md { 14 | file 15 | path *.md 16 | } 17 | 18 | rewrite @md /markdown.html 19 | ``` 20 | 21 | The first few lines set up the server: 22 | 23 | - `localhost:8080` tells Caddy to serve at `localhost` on port 8080 24 | - `file_server /*` tells it to run a file server for all paths below the current directory 25 | - `templates` enables the [template middleware](https://caddyserver.com/docs/modules/http.handlers.templates), which treats the responses as Go templates 26 | 27 | The `@md` block is a [named matcher](https://caddyserver.com/docs/caddyfile/matchers#named-matchers). It captures a set of constraints that can be reused elsewhere. In this case, it matches all paths that end in `.md`. 28 | 29 | The final line `rewrite @md /markdown.html` serves the file `/markdown.html` whenever a request matches the `@md` block — i.e. whenever a Markdown file is requested. 30 | 31 | Here's a simple example of what `markdown.html` could look like: 32 | 33 | ```html 34 | {{$md := (include .OriginalReq.URL.Path | splitFrontMatter)}} 35 | <!DOCTYPE html> 36 | <html> 37 | <head> 38 | <title>{{$md.Meta.title}}</title> 39 | </head> 40 | <body> 41 | <h1>{{$md.Meta.title}}</h1>
42 | {{markdown $md.Body}} 43 | 44 | 45 | ``` 46 | 47 | At the top, we get the original request path, include the Markdown file, split the front matter and store it in the variable `$md`. 48 | 49 | The page title is set to the `title` property in the front matter. 50 | 51 | The body renders the `title` from the front matter in an `

` tag, then uses the [`markdown`](https://caddyserver.com/docs/modules/http.handlers.templates#markdown) template function to render the Markdown as HTML. 52 | -------------------------------------------------------------------------------- /css/anchor-scroll-position-to-the-bottom.md: -------------------------------------------------------------------------------- 1 | # Anchor scroll position to the bottom 2 | 3 | Recently, I was building a vertical "timeline" for an explorable explanation, in which new elements would be added below any existing ones. As elements were added, I wanted the scroll position anchored to the bottom of the timeline, as it would be in a chat app. 4 | 5 | Specifically: 6 | 7 | - The first element should start at the top. 8 | - New elements should enter the timeline below any existing elements. 9 | - If the container is scrolled to the bottom, it should remain scrolled to the bottom to reveal new elements as they're added. 10 | - But if the container is scrolled _up_, the scroll position should remain the same such that new elements are added below the fold. 11 | 12 | Initially, I thought I might have to mess around with [`MutationObserver`](https://developer.mozilla.org/en-US/docs/Web/API/MutationObserver), but I found [this StackOverflow answer](https://stackoverflow.com/a/44051405) with a neat CSS-only solution. 13 | 14 | The answer used `flex-direction: column-reverse`. That mostly works, with two caveats: 15 | 16 | 1. Elements start at the bottom, rather than the top. 17 | 2. More seriously, the markup needs to be in reverse order. 18 | 19 | Luckily, a commenter pointed out a trick to fix both of those: wrap the list in another element, and apply the flexbox properties to the _wrapper_. Then, to force the list to the top of the wrapper, add a shim before it in the markup (which places it below it in the wrapper's `column-reverse` flexbox flow). 20 | 21 | All in all, the markup would look something like this: 22 | 23 | ```html 24 |
<div class="wrapper">
  <div class="shim"></div>
  <ul class="list">
    <!-- new items get appended here -->
  </ul>
</div>
```

And the CSS would look like this:

```css
.wrapper {
  display: flex;
  flex-direction: column-reverse;
  overflow: auto;
}

.shim {
  flex: 1;
}
```

-------------------------------------------------------------------------------- /css/dynamically-change-styles-from-html-or-jsx.md: --------------------------------------------------------------------------------

# Dynamically change styles from HTML or JSX

(No, it's not Tailwind.)

I wanted a button on my website to be "RSS orange" even though that color wasn't anywhere else in my color scheme. Hardcoding that within the button component itself felt overly specific; the button should mostly be agnostic to what's inside of it. But the button component encapsulated its own markup and styles, and I didn't want to break that encapsulation by reaching in from outside to override things.

The solution, surprisingly, was to use inline styles! Not by setting any properties, but by setting a [custom property](https://developer.mozilla.org/en-US/docs/Web/CSS/Using_CSS_custom_properties) — often known as a CSS variable.

```html
<button class="button" style="--accent: #ff6600">RSS</button>
```

That sets the custom property `--accent` to `#ff6600` within **that specific button element**. The corresponding CSS might look something like this:

```css
.button {
  background-color: var(--accent, var(--color-primary));
}
```

The key is to specify a [fallback value](https://developer.mozilla.org/en-US/docs/Web/CSS/Using_CSS_custom_properties#custom_property_fallback_values) when accessing `--accent`. So by default, everything with the class `button` would have a background color of `--color-primary` — **except** when `--accent` is defined, in which case they'll use `--accent`.

This pattern pairs particularly well with component-based frameworks like React, where it's common to change behavior by exposing props.
A `Button` component using this pattern might look like this:

```jsx
function Button({ accent, children }) {
  // only set `--accent` if a value was actually passed;
  // otherwise it'd be set to the literal string "undefined" and everything would break
  const style = accent ? { "--accent": accent } : {};

  return (
    <button className="button" style={style}>
      {children}
    </button>
  );
}
```

To use it, just pass something to the `accent` prop:

```jsx
<Button accent="#ff6600">RSS</Button>
```

-------------------------------------------------------------------------------- /css/make-a-css-variable-color-translucent.md: --------------------------------------------------------------------------------

# Make a CSS variable color translucent

Given a color stored in a CSS variable, how do you change the opacity? This [Stack Overflow answer](https://stackoverflow.com/a/71098929) has a couple good suggestions.

First is `color-mix`, which is [widely supported in browsers today](https://caniuse.com/?search=color-mix). It lets you mix two colors in a given color space. The trick is to mix some percentage of the color variable with the color `transparent`:

```css
.classname {
  color: color-mix(in srgb, var(--some-color) 50%, transparent);
}
```

Next is relative color syntax, which [as of August 2023 is only supported in Safari](https://caniuse.com/css-relative-colors).
The keyword `from` lets you convert colors between formats; you can then fill in the missing alpha value when passing it to [a color function](https://developer.mozilla.org/en-US/docs/Web/CSS/color_value). Note that you still need to spell out the `r g b` channels before the slash:

```css
.classname {
  color: rgb(from var(--some-color) r g b / 50%);
}
```

-------------------------------------------------------------------------------- /css/set-default-styles-for-tags.md: --------------------------------------------------------------------------------

# Set default styles for tags

Yes, you could just use tag selectors, but here's [a nifty trick from Stephanie Eckles](https://moderncss.dev/modern-css-for-dynamic-component-based-architecture/) so that you don't need to unset the default styles when you're customizing the same tags:

```css
a:not([class]) {
  /* styles go here */
}
```

The `:not([class])` pseudo-class restricts this selector to matching only elements with no class name — meaning any time an `<a>` tag is customized with a class, it won't have the default styles.

-------------------------------------------------------------------------------- /css/swap-between-black-and-white-text-based-on-background-color.md: --------------------------------------------------------------------------------

# Swap between black and white text based on background color

The title is overselling it a little, but not by much.
Devon Govett posted [this clever trick](https://bsky.app/profile/devongovett.bsky.social/post/3lcedcdj4qk2y) on Bluesky using [CSS relative colors](https://developer.chrome.com/blog/css-relative-color-syntax) and [LCH](https://developer.mozilla.org/en-US/docs/Web/CSS/color_value/lch):

```css
.magic {
  --bg: red;
  background: var(--bg);
  color: lch(from var(--bg) calc((49.44 - l) * infinity) 0 0);
}
```

So it won't automagically work with any color behind the element; you need to have it stored as a CSS variable.

LCH has three properties: lightness, chroma and hue.

- Lightness is a number between `0` and `100` representing how bright a color is. `0` corresponds to black, while `100` corresponds to white.
- Chroma is a technically unbounded number that represents "how much" color is present.
- Hue is an angle representing the hue, as in HSL or HSV.

You supply them to the `lch` function like this:

```css
.magic {
  --lightness: 50;
  --chroma: 72.2;
  --hue: 56.2;
  color: lch(var(--lightness) var(--chroma) var(--hue));
}
```

When used with relative color syntax, the color gets "broken up" into its constituent parts, which are referred to by `l`, `c` and `h`. So, for example, this would set the text color to the same as the background color; `l`, `c` and `h` take their values from the `from` color and are placed in the appropriate slots unmodified.

```css
.magic {
  --bg: red;
  background: var(--bg);
  color: lch(from var(--bg) l c h);
}
```

Devon's example makes two big changes:

1. It discards the chroma and hue values, replacing them with `0`.
2. It inverts the color's lightness and multiplies it by `infinity` to obtain white or black.

#2 might be confusing, so let's dig into some examples.
The basic idea is that if a color's lightness is _above_ some threshold value, we want the text to be black; if it's _below_ that value, we want the text to be white.

Remember, the calculation is `(49.44 - l) * infinity`, clamped within the range `[0, 100]`:

- CSS `red` has an LCH lightness of `54.29`.
  1. `49.44` - `54.29` = `-4.85`
  2. `-4.85` \* `infinity` = `-infinity`
  3. `-infinity` gets clamped to `0` (black)
- CSS `blue` has an LCH lightness of `29.57`.
  1. `49.44` - `29.57` = `19.87`
  2. `19.87` \* `infinity` = `infinity`
  3. `infinity` gets clamped to `100` (white)
- CSS `white` has an LCH lightness of `100`.
  1. `49.44` - `100` = `-50.56`
  2. `-50.56` \* `infinity` = `-infinity`
  3. `-infinity` gets clamped to `0` (black)

Why `49.44`? Devon tested it with all RGB colors and found it had the fewest WCAG 4.5:1 contrast failures.

Addendum: After publishing, Noah Liebman [pointed out to me](https://mastodon.social/@noleli/113586705788122139) that Lea Verou had [independently come up with the same technique](https://lea.verou.me/blog/2024/contrast-color/) earlier this year!

-------------------------------------------------------------------------------- /css/use-css-variables-in-a-dialog-backdrop.md: --------------------------------------------------------------------------------

# Use CSS variables in a `<dialog>` backdrop

I was building a native modal using the new(ish) [`<dialog>` element](https://developer.mozilla.org/en-US/docs/Web/HTML/Element/dialog) when I realized that I couldn't use CSS variables (officially called [custom properties](https://developer.mozilla.org/en-US/docs/Web/CSS/Using_CSS_custom_properties)) in the `::backdrop` pseudo-element.
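To illustrate the problem, here's a sketch with a made-up `--overlay` variable. In browsers that implement the old spec behavior described below, the backdrop resolves the fallback instead of the variable:

```css
:root {
  --overlay: rgb(0 0 0 / 0.5);
}

dialog::backdrop {
  /* under the old behavior, --overlay is not inherited here,
     so this resolves to the `transparent` fallback */
  background: var(--overlay, transparent);
}
```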
I found [this Stack Overflow post](https://stackoverflow.com/questions/58818299/css-variables-not-working-in-dialogbackdrop) that links to a (now-removed) paragraph in the spec for `::backdrop`:

> It does not inherit from any element and is not inherited from. No restrictions are made on what properties apply to this pseudo-element either.

Since variables propagate via inheritance, that means that the `::backdrop` doesn't have access to variables defined on the `:root` selector.

Fortunately, the fix is pretty simple: just define any variables that should be globally available on both the `:root` and `::backdrop` selectors.

```css
:root,
::backdrop {
  --some-variable-name: 10px;
  --some-other-variable-name: #f00;
  /* etc etc */
}
```

-------------------------------------------------------------------------------- /css/use-grid-template-to-set-grid-columns-rows-and-areas.md: --------------------------------------------------------------------------------

# Use `grid-template` to set grid columns, rows and areas

I'm always forgetting how the [grid-template](https://developer.mozilla.org/en-US/docs/Web/CSS/grid-template) shorthand works, so here's a quick reference.

The simplest usage is just as a shorthand for `grid-template-rows` and `grid-template-columns`:

```css
grid-template: 2rem 1fr / 20rem 1fr;
```

Before the slash is rows; after the slash is columns. It's equivalent to this longhand:

```css
grid-template-rows: 2rem 1fr;
grid-template-columns: 20rem 1fr;
```

Things get a little more interesting when you also use it as a shorthand for `grid-template-areas`:

```css
grid-template:
  "toolbar toolbar" 2rem
  "sidebar content" 1fr
  / 20rem 1fr;
```

Each line before the slash contains the grid area names and height of a row; after the slash contains the column sizes.
It's equivalent to this longhand:

```css
grid-template-rows: 2rem 1fr;
grid-template-columns: 20rem 1fr;
grid-template-areas:
  "toolbar toolbar"
  "sidebar content";
```

-------------------------------------------------------------------------------- /django/conditionally-extend-a-template.md: --------------------------------------------------------------------------------

# Conditionally extend a template

Django supports [template inheritance](https://docs.djangoproject.com/en/5.0/ref/templates/language/#id1) via the `extends` tag. Often, a template inheritance chain follows all the way up to a "base" template that contains the full markup for a page, from the doctype to the closing HTML tag.

A common pattern when using libraries like htmx is returning HTML fragments. In these cases, it can make sense to conditionally extend a template: return the full page when visited directly but [return a fragment when requested via Ajax](https://htmx.org/docs/#progressive_enhancement).

Let's say we have this page layout, `layout.html`:

```html
<!doctype html>
<html>
  <head>
    <!-- metadata, stylesheets, etc. -->
  </head>
  <body>
    {% block content %}{% endblock %}
  </body>
</html>
```

A template rendered within this layout would look something like this:

```html
{% extends "layout.html" %}

{% block content %}
<h1>Hello, world!</h1>
{% endblock %}
```

To render the template standalone, we can create a "stub" layout (say, `ajax.html`) that declares the block and nothing else:

```html
{% block content %}{% endblock %}
```

Then, instead of using a string literal for the name of the template to extend, we use a variable. We can default to `layout.html` if it's omitted:

```html
{% extends layout|default:"layout.html" %}

{% block content %}
<h1>Hello, world!</h1>
{% endblock %}
```

The view that renders the template might look something like this:

```python
def view(request: HttpRequest):
    return render(
        request,
        template_name="some_template.html",
        context={ "layout": "ajax.html" if request.headers.get("hx-request") else None },
    )
```

-------------------------------------------------------------------------------- /docker/add-root-certificates-to-a-debian-container.md: --------------------------------------------------------------------------------

# Add root certificates to a Debian container

I suppose this is potentially a problem with many images, but it definitely is with Debian (at least, as of December 2022): [Debian Docker images don't install the package `ca-certificates` by default](https://github.com/debuerreotype/docker-debian-artifacts/issues/15), which means any network requests that use TLS will fail with an error that looks something like this:

```
x509: certificate signed by unknown authority
```

The fix is pretty simple: just install the `ca-certificates` package, which adds root certificates so the OS can verify signatures.

```sh
apt install -y ca-certificates
```

That's the shell command to install the package.
In a Dockerfile, it should probably look something like this (combining the update and install into one `RUN` so the package lists are never stale):

```dockerfile
RUN apt-get update && apt-get install -y ca-certificates
```

-------------------------------------------------------------------------------- /docker/fix-at-least-one-invalid-signature-was-encountered.md: --------------------------------------------------------------------------------

# Fix "At least one invalid signature was encountered"

When trying to build a Debian container, I encountered a bunch of errors like this upon running `apt update`:

```
W: GPG error: http://deb.debian.org/debian buster InRelease: At least one invalid signature was encountered.
E: The repository 'http://deb.debian.org/debian buster InRelease' is not signed.
```

The issue seems to be a lack of disk space. [This StackOverflow answer](https://stackoverflow.com/a/62510927) was indeed correct: running `docker image prune -f` and `docker container prune -f` fixed the problem.

-------------------------------------------------------------------------------- /entr/reload-a-webpage-when-a-file-changes.md: --------------------------------------------------------------------------------

# Reload a webpage when a file changes

I store my blog in a git repo of plain Markdown files. It's a pretty spartan setup, which I like — it forces me to keep all my writing extremely portable between website generators.

To preview the files in my browser, I [render the Markdown with Caddy](/caddy/serve-markdown-files-as-html/). I wanted to add live reloading when I made a change, which was tricky because Caddy is just acting as a plain file server here. With the help of ChatGPT, I was able to come up with a solution involving a bunch of shell scripts.

The general workflow:

1. Create a temporary file using [`mktemp`](https://www.gnu.org/software/autogen/mktemp.html).
2. 
Use the network utility [`socat`](http://www.dest-unreach.org/socat/) to listen to network requests, setting up a server-sent events connection and sending any lines appended to the file.
3. Use [`entr`](http://eradman.com/entrproject/) to append events to the temporary file whenever a watched file changes.
4. On the webpage, listen for server-sent events and reload the page when one comes through.
5. Use [`trap`](https://tldp.org/LDP/Bash-Beginners-Guide/html/sect_12_02.html) to clean up the temporary file when the process exits.

Let's go through the steps one by one.

First, we run `mktemp` to make a temporary file, saving the path in a variable called `$TMPFILE`:

```sh
TMPFILE=$(mktemp)
```

Next, we set up step 5 using `trap` to remove `$TMPFILE` when the process exits:

```sh
trap 'rm -f $TMPFILE; exit' INT TERM EXIT
```

Now we're getting to the meat of things. We use `socat` to listen for requests on port 8081. When a client connects, we initially respond with `HTTP/1.1 200 OK` and a bunch of headers, then keep the connection open and tail any lines appended to `$TMPFILE`.

```sh
socat TCP-LISTEN:8081,fork,reuseaddr SYSTEM:' \
  echo HTTP/1.1 200 OK; \
  echo Content-Type\: text/event-stream; \
  echo Cache-Control\: no-cache; \
  echo Connection\: keep-alive; \
  echo; \
  tail -n0 -F $TMPFILE'
```

The `-n0` flag to `tail` makes it start at the _end_ of `$TMPFILE`, so the client only receives events sent after opening the connection.

While that's running, we run `entr`, appending `reload` events to `$TMPFILE` whenever a watched file changes. In this case, it's watching files that are tracked by git.

```sh
git ls-files | entr -s 'echo "event: reload\ndata:\n" >> $TMPFILE'
```

One last important thing: these all need to run concurrently.
You can do that by chaining them together using `&`.

ChatGPT tells me that I need to run [`wait`](https://linux.die.net/man/2/wait) at the end of the `&` chain to ensure that `socat` and `entr` don't quit prematurely, but in my testing that doesn't seem to matter.

Here's the full shell script:

```sh
# Create a temporary file
TMPFILE=$(mktemp)

# Set the trap to remove the temporary file on INT, TERM and EXIT
trap 'rm -f $TMPFILE; exit' INT TERM EXIT

# Start the server-sent events server and tail lines from the temporary file
socat -v -v TCP-LISTEN:8081,fork,reuseaddr SYSTEM:'echo HTTP/1.1 200 OK; echo Content-Type\: text/event-stream; echo Cache-Control\: no-cache; echo Connection\: keep-alive; echo; tail -n0 -F '"$TMPFILE"'' &

# Append lines to the temporary file whenever a file tracked by git changes
git ls-files | entr -s 'echo "event: reload\ndata:\n" >> '"$TMPFILE"'' &

# Wait for background processes to finish
wait
```

On the webpage side, we can connect to the server, listen for `reload` events and reload the page when we receive one:

```js
new EventSource("http://localhost:8081").addEventListener("reload", () => location.reload());
```

-------------------------------------------------------------------------------- /entr/rerun-a-command-when-files-change.md: --------------------------------------------------------------------------------

# Rerun a command when files change

A pretty common programming workflow: make a change, save a file, rebuild the app. When I'm writing JavaScript, if whatever tool I'm working with doesn't support this on its own, I usually reach for [nodemon](https://nodemon.io/). When I'm writing something else, like Go, I usually… uh, just restart manually every time I make a change.
Well, not anymore: there's an awesome little command line tool called [`entr`](http://eradman.com/entrproject/) that I learned about from [Julia Evans' blog](https://jvns.ca/blog/2020/06/28/entr/). It's not installed by default on macOS, but it's available on [Homebrew](https://formulae.brew.sh/formula/entr).

The TL;DR is that you take a list of files to watch and pipe it to `entr` plus some command, and `entr` will rerun that command whenever one of those files changes. Like if you want to rerun tests when your files change:

```sh
find . -name "*.ts" | entr npm test
```

A useful command to use in combination is `git ls-files`, which lists all the files that git is tracking:

```sh
git ls-files | entr npm test
```

The `-r` option will make it restart long-running tasks, which is perfect for webservers:

```sh
git ls-files | entr -r node server.js
```

There's also the `-c` option, which will clear the screen every time it reruns:

```sh
git ls-files | entr -c tsc --noEmit
```

-------------------------------------------------------------------------------- /esbuild/run-a-development-server-with-live-reload.md: --------------------------------------------------------------------------------

# Run a development server with live reload

[esbuild](https://github.com/evanw/esbuild) is a great batteries-included bundler for frontend web projects.

This is a one-line shell command that will run a development web server and reload the page every time a file changes (broken up into multiple lines here for readability). I tend to put it in the [`scripts` property](https://docs.npmjs.com/cli/v9/configuring-npm/package-json?v=true#scripts) of the project's `package.json` file.
```sh
esbuild src/index.html src/index.ts src/style.css \
  --bundle --watch \
  --outdir=build --servedir=build \
  --loader:.html=copy \
  --inject:src/livereload.js
```

It assumes the project structure looks like this:

```
src/
├── index.html
├── index.ts
├── style.css
└── livereload.js
```

Line by line, here's what the command does:

- `esbuild src/index.html src/index.ts src/style.css` runs esbuild using the three files listed as entrypoints.
- `--bundle` combines all the imports from the TypeScript entrypoint into a single `.js` file.
- `--watch` runs another build whenever an entrypoint or any of its dependencies change.
- `--outdir=build` places bundled and copied files into the `build` directory.
- `--servedir=build` serves static content from the `build` directory if a request path doesn't match any generated files. This is how `index.html` gets served.
- `--loader:.html=copy` tells esbuild to use the [copy loader](https://esbuild.github.io/content-types/#copy) for any files ending in `.html`, copying them unchanged to `outdir`. This also gets triggered when the HTML file changes.
- `--inject:src/livereload.js` adds the contents of `src/livereload.js` to the generated output. The reason to enable it this way is that the build command can simply omit this flag to not include the live reload code in the production bundle.
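In that spirit, a production build script might be the same command minus the dev-only flags. This is just a sketch, not part of the original setup; `--minify` is an extra flag added here as a typical production option:

```sh
esbuild src/index.html src/index.ts src/style.css \
  --bundle --minify \
  --outdir=build \
  --loader:.html=copy
```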
`src/livereload.js` is pulled straight from the [esbuild documentation on live reloading](https://esbuild.github.io/api/#live-reload):

```js
new EventSource("/esbuild").addEventListener("change", () => location.reload());
```

-------------------------------------------------------------------------------- /fzf/make-a-tui-for-switching-and-deleting-git-branches.md: --------------------------------------------------------------------------------

# Make a TUI for switching and deleting git branches

Is dealing with git branches annoying to anyone else? There's a lot of typing long branch names, especially if you're cleaning up a bunch of old branches at once.

Enter [fzf](https://github.com/junegunn/fzf), which provides a nice terminal UI for selecting things. You pipe in some output and it shows each line in a list that you can navigate by typing to search or using the arrow keys.

For example, here's a simple one-liner for making a selectable list of your git branches:

```sh
git branch | fzf | cut -c 3-
```

The item you select is the return value — in this case, the name of the branch.
(The `cut -c 3-` removes the leading whitespace and `*` from the branches.)

We can get a bit fancier, though. If you pass a `--preview` flag followed by a command with a placeholder, fzf will also display the output of that second command in a "preview" panel on the right. The placeholder is replaced by the currently highlighted option, so the preview changes as you navigate.

So to build a terminal UI for switching branches, you might use this:

```sh
git branch | fzf --preview 'git log -p main..{-1} --color=always {-1}' | cut -c 3- | xargs git switch
```

Deleting branches is pretty similar. `fzf` has a flag `-m` that lets you select multiple options.
Pipe that to `xargs git branch -d` and you're good:

```sh
git branch | fzf -m --preview 'git log -p main..{-1} --color=always {-1}' | cut -c 3- | xargs git branch -d
```

I have aliases for both of those commands in my fish config:

```fish
alias gs="git branch | fzf --preview 'git log -p main..{-1} --color=always {-1}' | cut -c 3- | xargs git switch"
alias gbd="git branch | fzf -m --preview 'git log -p main..{-1} --color=always {-1}' | cut -c 3- | xargs git branch -d"
```

-------------------------------------------------------------------------------- /git/add-a-git-hook-on-windows.md: --------------------------------------------------------------------------------

# Add a Git hook on Windows

[Git Hooks](https://git-scm.com/book/en/v2/Customizing-Git-Git-Hooks) are scripts that Git runs for you when certain actions occur.

At the root of every repo, there's a `.git` directory. Inside, there's a folder called `hooks` that contains a bunch of files named things like `pre-commit.sample`. Each of those files is an example of a Git Hook. To create one for _real_, create a file of the same name without the extension, and Git will automatically run the script for you at the appropriate time.

At least, that's what I thought would happen when I tried adding this pre-commit hook on Windows:

```console
npm run typecheck
```

When I tried to commit it, though, I kept getting this error:

```
'pre-commit' is not recognized as an internal or external command,
operable program or batch file.
```

I tried running `attrib +x .git\hooks\pre-commit`, changing the extension to `.bat` and a bunch of other things that didn't work before stumbling upon this StackOverflow answer that held the key: [Git for Windows has an embedded bash shell](https://stackoverflow.com/a/51005120)!
The fix turned out to be just adding a hashbang to the top of the script and writing it as though it were a Unix shell script:

```sh
#!/bin/sh
npm run typecheck
```

-------------------------------------------------------------------------------- /git/add-a-global-gitignore.md: --------------------------------------------------------------------------------

# Add a global `.gitignore`

If you use git on a Mac, chances are you've accidentally committed a `.DS_Store` to a repo. I used to reflexively add `.DS_Store` to all my `.gitignore` files to avoid ever repeating that mistake.

But there's a better way! You can add a global `.gitignore` file to your global config with this command:

```bash
git config --global core.excludesFile '~/.gitignore'
```

The single quotes around `~/.gitignore` aren't strictly necessary; if you don't use them, the `~` will just get expanded into the absolute path to your home directory.

Under the hood, that command just adds this to your `~/.gitconfig`:

```
[core]
	excludesFile = ~/.gitignore
```

**Update:** You can now do this without modifying your `~/.gitconfig`! Not sure when this was added, but the [git documentation](https://git-scm.com/docs/gitignore) now has this bullet point:

> Patterns which a user wants Git to ignore in all situations (e.g., backup or temporary files generated by the user's editor of choice) generally go into a file specified by `core.excludesFile` in the user's `~/.gitconfig`. Its default value is $XDG_CONFIG_HOME/git/ignore. If $XDG_CONFIG_HOME is either not set or empty, $HOME/.config/git/ignore is used instead.

Not sure what `$XDG_CONFIG_HOME` is, but it hasn't been set on any computers I've used. On macOS and other Unix-like systems, the default is `$HOME/.config/git/ignore`, aka `~/.config/git/ignore`.
On Windows, that maps to `%USERPROFILE%\.config\git\ignore`.

-------------------------------------------------------------------------------- /git/list-all-files-tracked-by-git.md: --------------------------------------------------------------------------------

# List all files tracked by git

Sometimes it's convenient to list all the files tracked in a git repo (for example, when using [`entr`](http://eradman.com/entrproject/) to watch files for changes). Rather than trying to remember which arguments to pass to `find` to get the right list of files, [git provides a nice concise command](https://git-scm.com/docs/git-ls-files):

```sh
git ls-files
```

-------------------------------------------------------------------------------- /git/run-a-command-if-there-are-unstaged-changes.md: --------------------------------------------------------------------------------

# Run a command if there are unstaged changes

A quick one-liner to run a command only if there are unstaged changes: the `--quiet` flag of [`git diff`](https://git-scm.com/docs/git-diff). The flag does two things:

1. disables all output of the command, and
2. exits with 1 if there are differences, and 0 if there are no differences

That means you can combine it with boolean operators to only run another command if files have (or have not) changed:

```sh
# run `somecommand` if there are unstaged changes
git diff --quiet || somecommand

# run `somecommand` if there are NO unstaged changes
git diff --quiet && somecommand
```

Note that this only works for unstaged changes in files that git already tracks; new files and changes staged with `git add` are not counted.
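You can verify that caveat in a scratch repo. A quick sketch (assuming `git` is on your `PATH`); the last command uses `git diff --quiet HEAD`, which is not from the original note but _does_ catch staged changes:

```shell
# create a throwaway repo with one committed file
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git config user.email "you@example.com"
git config user.name "you"
echo hello > file.txt
git add file.txt
git commit -qm init

# no changes: --quiet exits 0, so && runs
git diff --quiet && echo "clean"

# unstaged change: --quiet exits 1, so || runs
echo change >> file.txt
git diff --quiet || echo "unstaged change detected"

# stage it: plain `git diff` no longer counts it...
git add file.txt
git diff --quiet && echo "staged changes are not counted"

# ...but diffing against HEAD still does
git diff --quiet HEAD || echo "diffing against HEAD sees staged changes"
```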
19 | 20 | Note too that because an exit code of 0 indicates success and anything else indicates failure, the boolean operators' usual behavior is flipped: `&&` only evaluates the right side if the left is `0`, and `||` only evaluates the right side if the left is not `0`. 21 | -------------------------------------------------------------------------------- /git/update-all-submodules-to-latest-commit-on-origin.md: -------------------------------------------------------------------------------- 1 | # Update all submodules to latest commit on origin 2 | 3 | I use [Git submodules](https://git-scm.com/book/en/v2/Git-Tools-Submodules) in a bunch of projects, and one common task is checking out the latest commit on the origin. 4 | 5 | Since Git 1.8.2 this is a one-liner, but I'm always forgetting it, so here it is: 6 | 7 | ```sh 8 | git submodule update --remote --rebase # or `--merge` if you swing that way 9 | ``` 10 | 11 | The answers to [this StackOverflow question](https://stackoverflow.com/q/5828324) have some more detail (including a command that works on earlier versions of Git). 12 | -------------------------------------------------------------------------------- /github/run-github-actions-locally.md: -------------------------------------------------------------------------------- 1 | # Run GitHub Actions locally 2 | 3 | A common pain point with GitHub Actions is that the feedback loop is so long: make a change, push, wait for it to run, find an error, try to debug, repeat. Which is why I was so happy to discover [`act`](https://github.com/nektos/act), a tool for running GitHub Actions locally! The only prerequisite is Docker, which `act` uses to pull the appropriate images to run your actions. 
4 | 5 | By default, `act` will run the action for the `push` event, although you can configure it to run specific events or jobs: 6 | 7 | ```bash 8 | # run the `push` event: 9 | act 10 | 11 | # run a specific event: 12 | act pull_request 13 | 14 | # run a specific job: 15 | act -j test 16 | ``` 17 | 18 | If your action needs a GitHub token (for example, if you're checking out your code with [`actions/checkout`](https://github.com/actions/checkout)) you can supply it with the `-s` flag (for "secrets") and the `GITHUB_TOKEN` environment variable. This is easiest if you have the [GitHub CLI](https://cli.github.com/) installed: 19 | 20 | ```bash 21 | act -s GITHUB_TOKEN="$(gh auth token)" 22 | ``` 23 | 24 | The [official docs](https://github.com/nektos/act#github_token) warn that supplying your token this way can leak it to your shell history. 25 | -------------------------------------------------------------------------------- /github/trigger-a-workflow-run-in-another-repo.md: -------------------------------------------------------------------------------- 1 | # Trigger a workflow run in another repo 2 | 3 | This is kind of meta: I made a separate repo [`todayilearned`](https://github.com/jakelazaroff/todayilearned) with the website for the content in this repo, and I wanted to trigger a new deploy whenever I pushed a new TIL. 4 | 5 | The TL;DR is that there needs to be a GitHub Actions workflow in each repo: one in the "source" to send a [`repository_dispatch` event](https://docs.github.com/en/actions/using-workflows/events-that-trigger-workflows#repository_dispatch) to the "destination", and one in the destination to do the actual building/deploying/what have you. 6 | 7 | I'm sure this has been written in a zillion different places, but it took me a bit of searching to figure out so here goes: 8 | 9 | 1.
Modify the destination repo's workflow YAML file to run the workflow on `repository_dispatch` events: 10 | 11 | ```yaml 12 | on: 13 | push: 14 | branches: [main] 15 | repository_dispatch: # this line tells the workflow to run on `repository_dispatch` events 16 | types: [deploy] # this specifies which event types should trigger a workflow run; if it's omitted, the workflow will just run on every event 17 | ``` 18 | 19 | 2. Create a [Personal Access Token](https://github.com/settings/tokens?type=beta). 20 | 21 | - The new fine-grained tokens require an expiration date with a max of two years, at which point I'll definitely have forgotten about all this. But luckily I'll be able to refer back to this TIL to figure it out again! 22 | - The only repository access needed is the destination (i.e. the one receiving the event). 23 | - The only required permissions are read and write access for `content` (which covers the dispatch event) and `metadata` (which I think is mandatory for all personal access tokens). 24 | 25 | 3. At this point, you can test things out manually with a cURL command (replace `PERSONAL_ACCESS_TOKEN`, `USER` and `REPO` with the appropriate values): 26 | 27 | ```bash 28 | curl --request POST \ 29 | -H "Accept: application/vnd.github.everest-preview+json" \ 30 | -H "Authorization: token PERSONAL_ACCESS_TOKEN" \ 31 | --data '{ "event_type": "deploy" }' https://api.github.com/repos/USER/REPO/dispatches 32 | ``` 33 | 34 | If things are set up correctly, this should trigger a workflow run. 35 | 36 | 4. Add the personal access token in the Actions secrets settings of the source repo. 37 | 5.
Add a workflow YAML file to the source repo that runs the cURL command above: 38 | 39 | ```yaml 40 | name: Deploy 41 | on: 42 | push: 43 | branches: 44 | - main 45 | 46 | jobs: 47 | deploy: 48 | runs-on: ubuntu-latest 49 | steps: 50 | - name: Dispatch to workflows 51 | run: | 52 | curl -H "Accept: application/vnd.github.everest-preview+json" \ 53 | -H "Authorization: token ${{ secrets.PERSONAL_ACCESS_TOKEN }}" \ 54 | --request POST \ 55 | --data '{ "event_type": "deploy" }' https://api.github.com/repos/USER/REPO/dispatches 56 | ``` 57 | -------------------------------------------------------------------------------- /github/write-an-inline-script-in-a-github-actions-workflow.md: -------------------------------------------------------------------------------- 1 | # Write an inline script in a GitHub Actions workflow 2 | 3 | This is kind of meta because I'm mostly writing this TIL to test the workflow script of this repo, but anyway: the [`github-script`](https://github.com/actions/github-script) action allows you to write inline JavaScript within a GitHub Actions workflow. The string provided to the `script` property will be used as the body of an asynchronous function call. 4 | 5 | ```yaml 6 | - uses: actions/github-script@v6 7 | id: script 8 | with: 9 | script: return "Hello!" 10 | result-encoding: string 11 | - name: Get result 12 | run: echo "${{steps.script.outputs.result}}" 13 | ``` 14 | 15 | That example is from the official documentation; whatever you return is available under the `result` key of the step's outputs, JSON-encoded by default or as a raw string if you set `result-encoding: string`.
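For contrast, here's a sketch of the same step returning an object instead (the `greeting` payload is my own example, not from the docs); without `result-encoding: string`, the action serializes the return value with `JSON.stringify`:

```yaml
- uses: actions/github-script@v6
  id: script
  with:
    script: return { greeting: "Hello!" }
- name: Get result
  run: echo '${{ steps.script.outputs.result }}' # a JSON string along the lines of {"greeting":"Hello!"}
```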
16 | 17 | Some other things you can do: 18 | 19 | - `require` NPM packages 20 | - Find files with glob patterns via [`@actions/glob`](https://github.com/actions/toolkit/tree/main/packages/glob) (passed to your script as `glob`) 21 | - Make GitHub API calls via a pre-authenticated [`octokit/rest.js`](https://octokit.github.io/rest.js) (passed to your script as `github`) 22 | -------------------------------------------------------------------------------- /gltf/extract-data-from-a-mesh-primitive.md: -------------------------------------------------------------------------------- 1 | # Extract data from a mesh primitive 2 | 3 | The first step of displaying a 3D model is interpreting the geometry. The [glTF format](https://registry.khronos.org/glTF/specs/2.0/glTF-2.0.html) stores that in something called a [mesh primitive](https://registry.khronos.org/glTF/specs/2.0/glTF-2.0.html#reference-mesh-primitive): a nested object inside of each [mesh](https://registry.khronos.org/glTF/specs/2.0/glTF-2.0.html#reference-mesh) that contains information about how to extract geometry from the binary data. 4 | 5 | A mesh primitive looks something like this: 6 | 7 | ```json 8 | { 9 | "attributes": { 10 | "POSITION": 0, 11 | "NORMAL": 1, 12 | "TEXCOORD_0": 2 13 | }, 14 | "indices": 3 15 | } 16 | ``` 17 | 18 | Each of those properties indicates where to find a specific type of data about the geometry. In this case, `attributes.POSITION` points to the vertex positions, `attributes.NORMAL` points to the vertex normals and `attributes.TEXCOORD_0` points to [ST texture coordinates](https://computergraphics.stackexchange.com/a/4539). If `indices` is defined, it points to an [index array](https://webglfundamentals.org/webgl/lessons/webgl-indexed-vertices.html) — a single vertex may be shared by multiple triangles, and each element of the index array indicates the position of a vertex in the attribute data. 19 | 20 | The actual data is stored in binary format.
There are three levels of indirection between the mesh primitive and the data itself: 21 | 22 | - [Buffers](https://registry.khronos.org/glTF/specs/2.0/glTF-2.0.html#reference-buffer) point to the binary data itself. In a GLB file, there’s usually a single buffer that refers to the chunk of data after the JSON ends. 23 | - [Buffer views](https://registry.khronos.org/glTF/specs/2.0/glTF-2.0.html#reference-bufferview) point to a subset of data within a specific buffer, marked by the number of bytes in the buffer before the buffer view starts and the length in bytes of the buffer view. 24 | - [Accessors](https://registry.khronos.org/glTF/specs/2.0/glTF-2.0.html#reference-accessor) point to a specific buffer view and indicate how to interpret its data. 25 | 26 | The values of the keys in the mesh primitive all point to accessors — the number corresponds to an index in the `accessors` array. By following the path from the mesh primitive all the way to the buffer, we can figure out where the actual data is and how to upload it to the GPU. 27 | 28 | Here’s an example accessor for vertex position data: 29 | 30 | ```json 31 | { 32 | "bufferView": 0, 33 | "componentType": 5126, 34 | "type": "VEC3" 35 | } 36 | ``` 37 | 38 | - `bufferView` corresponds to the index of the buffer view pointing to the vertex data in the `bufferViews` array. 39 | - `componentType` indicates the data type of the binary data. 5126 means the data is a series of 32-bit floating point numbers; [other possible `componentType` values are outlined in the spec](https://registry.khronos.org/glTF/specs/2.0/glTF-2.0.html#_accessor_componenttype). 40 | - `type` indicates how many numbers in the data make up a single “item”. `VEC3` means that every three numbers in the data should be treated as one; [other possible `type` values are outlined in the spec](https://registry.khronos.org/glTF/specs/2.0/glTF-2.0.html#_accessor_type). 
In this case, `VEC3` makes sense because points in 3D space are sets of values for each of the X, Y and Z axes. 41 | -------------------------------------------------------------------------------- /html/define-a-custom-element.md: -------------------------------------------------------------------------------- 1 | # Define a custom element 2 | 3 | I've been using web components for a bit, and I've accrued a bit of code I generally use as a base for my custom elements. "Boilerplate" has a negative connotation, so let's just call it a "web component starter kit". 4 | 5 | Let's start with the finished product (as of today, at least) and then we'll talk about how I got there: 6 | 7 | ```js 8 | class MyElement extends HTMLElement { 9 | static tag = "my-element"; 10 | 11 | static define(tag = this.tag) { 12 | this.tag = tag; 13 | 14 | const name = customElements.getName(this); 15 | if (name) return console.warn(`${this.name} already defined as <${name}>!`); 16 | 17 | const ce = customElements.get(tag); 18 | if (Boolean(ce) && ce !== this) return console.warn(`<${tag}> already defined as ${ce.name}!`); 19 | 20 | customElements.define(tag, this); 21 | } 22 | 23 | static { 24 | const tag = new URL(import.meta.url).searchParams.get("define") || this.tag; 25 | if (tag !== "false") this.define(tag); 26 | } 27 | } 28 | ``` 29 | 30 | The first parts are heavily inspired by Mayank's post on [defining custom elements](https://mayank.co/blog/defining-custom-elements/). 
31 | For a long time after reading that, my boilerplate looked like this: 32 | 33 | ```js 34 | class MyElement extends HTMLElement { 35 | static tag = "my-element"; 36 | 37 | static define(tag = this.tag) { 38 | this.tag = tag; 39 | 40 | const name = customElements.getName(this); 41 | if (name) return console.warn(`${this.name} already defined as <${name}>!`); 42 | 43 | const ce = customElements.get(tag); 44 | if (ce && ce !== this) return console.warn(`<${tag}> already defined as ${ce.name}!`); 45 | 46 | customElements.define(tag, this); 47 | } 48 | } 49 | ``` 50 | 51 | It's the same as what Mayank ended up with, plus a couple of checks to guard against accidentally defining a custom element class or tag name twice. 52 | 53 | This gives a lot of flexibility to the person using the component: 54 | 55 | - They can define it using the tag name I've chosen just by calling `MyElement.define()`. 56 | - They can override the tag name by passing a different one, as in `MyElement.define("other-element")`. 57 | - They can also override the tag name by changing the static `tag` property. This is particularly useful when using it as a base class: the subclasses can change the tag name and the registration code will stay the same. 58 | ```js 59 | class MyOtherElement extends MyElement { 60 | static tag = "other-element"; 61 | } 62 | 63 | MyOtherElement.define(); 64 | ``` 65 | 66 | However, there's one nagging drawback: the person using the component _needs_ to define it somehow. 67 | They can't just include a script tag like this: 68 | 69 | ```html 70 | <script type="module" src="my-element.js"></script> 71 | ``` 72 | 73 | Rather, they need to call that `define` method. That means actually using the component will look something like this: 74 | 75 | ```html 76 | <script type="module"> 77 | import { MyElement } from "./my-element.js"; 78 | MyElement.define(); 79 | </script> 80 | ``` 81 | 82 | If they're using a bundler, there won't be a `<script>` tag at all; they'll import the module in their own code and call `define` there. 83 | 84 | What I wanted was for the component to be able to define itself as soon as the file loads, while still giving the user a way to opt out. The trick is to check `import.meta.url` for a `define` query string parameter, and only register the element if it's present: 86 | 87 | ```js 88 | if (new URL(import.meta.url).searchParams.has("define")) MyElement.define(); 89 | ``` 90 | 91 | Now the component can define itself from a plain script tag: 92 | 93 | ```html 94 | <script type="module" src="my-element.js?define"></script> 98 | ``` 99 | 100 | To _not_ define it, just omit the `?define` query string. 101 | 102 | A few people had good ideas in a follow-up discussion on Mastodon.
103 | Cesare Pautasso suggested [using the query string value to customize the tag name](https://scholar.social/@pautasso/113271442461273534), while Remy Sharp pointed out that [defining it should be the default behavior](https://front-end.social/@rem/113271106167064142). 104 | 105 | Here's what I came up with: 106 | 107 | ```js 108 | const tag = new URL(import.meta.url).searchParams.get('define') || undefined; 109 | if (tag !== 'false') MyElement.define(tag); 110 | ``` 111 | 112 | There's a lot going on in a few lines, so let's go through it: 113 | 114 | 1. First, we set `tag` to the _value_ of the query string parameter `define`, falling back to `undefined` instead of an empty string. 115 | 2. If `tag` is not equal to `"false"` (meaning the query string parameter was either set to something other than `"false"` or omitted entirely) we call `MyElement.define(tag)`. That will either use the given tag name or fall back to the default argument, which is the value of `MyElement.tag`. 116 | 3. If `tag` is set to `"false"`, we don't do anything. We know this won't conflict with any custom element tag name the user might want to set, because custom element tag names must include a hyphen. 117 | 118 | So to import the component and define it with the default tag name, you'd import `my-element.js` and it should Just Work. To import it with a custom tag name, you'd import `my-element.js?define=other-tag-name`. And to import it without defining it at all, you'd import `my-element.js?define=false`. 119 | 120 | We can make the code feel a bit cleaner by putting it in a static initialization block: 121 | 122 | ```js 123 | class MyElement extends HTMLElement { 124 | // ...
125 | 126 | static { 127 | const tag = new URL(import.meta.url).searchParams.get("define") || this.tag; 128 | if (tag !== "false") this.define(tag); 129 | } 130 | } 131 | ``` 132 | 133 | A couple caveats here: 134 | 135 | - Westbrook Johnson pointed out that [importing a file twice with different query strings might create different modules in the module graph](https://mastodon.social/@westbrook/113271725600420967). This is probably an edge case (since, unlike with framework components, you should only ever need to import web components once) but it's worth keeping in mind. 136 | - Óscar Otero suggested that [it won't work if you use a bundler](https://mastodon.gal/@misteroom/113271013172508655). I've used it successfully in a React component in an Astro app (meaning with Vite) but I haven't tested it outside of that. 137 | - Static initialization blocks don't get inherited like normal static variables. That means if you write something like this, it won't automatically define itself: 138 | ```js 139 | class MyOtherElement extends MyElement { 140 | static tag = "other-element"; 141 | } 142 | ``` 143 | 144 | Regardless, this has been working pretty well for me! 145 | -------------------------------------------------------------------------------- /htmx/attach-attributes-to-dynamically-added-elements.md: -------------------------------------------------------------------------------- 1 | # Attach attributes to dynamically added elements 2 | 3 | It's barely mentioned within the htmx documentation, but by default htmx attributes only work on elements that were in the DOM when htmx was loaded, or that htmx itself added to the DOM. This means that if you add an element by some other means — say, AlpineJS — htmx won't know about any attributes on it or its descendants. 
4 | 5 | In the below example with AlpineJS and htmx, the toggle button can be clicked to show and hide the form — but the [`hx-boost` attribute](https://htmx.org/attributes/hx-boost/) won't be detected, which means that it will be a normal form submission: 6 | 7 | ```html 8 |
9 | <div x-data="{ open: false }"> 10 |   <button @click="open = !open">Toggle form</button> 11 | 12 |   <form x-show="open" hx-boost="true" method="post" action="/submit"> 13 |     <input type="text" name="name" /> 14 |     <button type="submit">Submit</button> 15 |   </form> 16 | </div> 17 |
18 | ``` 19 | 20 | htmx provides a function [`htmx.process`](https://htmx.org/api/#process) that checks a given element and its descendants for htmx attributes. Call it like so: 21 | 22 | ```js 23 | htmx.process(document.body); 24 | ``` 25 | 26 | The above code snippet can be fixed with the [AlpineJS `x-effect` directive](https://alpinejs.dev/directives/effect), which runs an expression when the tag is added to the DOM: 27 | 28 | ```html 29 |
30 | <div x-data="{ open: false }"> 31 |   <button @click="open = !open">Toggle form</button> 32 | 33 |   <form x-show="open" x-effect="htmx.process($el)" hx-boost="true" method="post" action="/submit"> 34 |     <input type="text" name="name" /> 35 |     <button type="submit">Submit</button> 36 |   </form> 37 | </div> 38 |
39 | ``` 40 | -------------------------------------------------------------------------------- /htmx/load-modal-content-when-shoelace-dialog-opens.md: -------------------------------------------------------------------------------- 1 | # Load modal content when a Shoelace dialog opens 2 | 3 | This is pretty idiosyncratic to [htmx](https://htmx.org) and [Shoelace](https://shoelace.style), but it's a neat pattern so I'm documenting it here. 4 | 5 | htmx lets you make an HTTP request in response to an event and insert it elsewhere into the DOM. Shoelace's [Dialog](https://shoelace.style/components/dialog) component fires an `sl-show` event when the dialog opens. These can be combined to automatically load modal content when the modal opens: 6 | 7 | ```html 8 | <sl-dialog label="My Modal" hx-get="/modal" hx-trigger="sl-show">Loading…</sl-dialog> 9 | ``` 10 | 11 | If parts of the modal don't need to be loaded via HTTP — for example, the title — `hx-target` can be used to replace only the modal content: 12 | 13 | ```html 14 | <sl-dialog hx-get="/modal" hx-trigger="sl-show" hx-target="find .content"> 15 |   <span slot="label">My Modal</span> 16 |   <div class="content">Loading…</div> 17 | </sl-dialog>
18 | ``` 19 | -------------------------------------------------------------------------------- /javascript/access-css-variables-from-javascript.md: -------------------------------------------------------------------------------- 1 | # Access CSS variables from JavaScript 2 | 3 | In building [fxplayground](https://fxplayground.pages.dev/), I wanted to use colors from the site's theme in a canvas visualization. The theme colors were all stored in CSS custom properties (variables), but the visualization drawing code was all JavaScript. I needed a way to read the CSS color variables from within JavaScript code. 4 | 5 | Luckily, there's a function called [`getComputedStyle`](https://developer.mozilla.org/en-US/docs/Web/API/Window/getComputedStyle) that can help with that: 6 | 7 | ```js 8 | const color = getComputedStyle(document.documentElement).getPropertyValue("--color-primary"); 9 | ``` 10 | 11 | `getComputedStyle` takes a DOM node and returns a live `CSSStyleDeclaration`, which contains all the styles applied to that element. Calling `getPropertyValue` returns the value for a given property, which includes CSS variable declarations. So if there's a variable defined on the `:root` selector, you can get the value by calling `getPropertyValue("--variable-name")`! 12 | -------------------------------------------------------------------------------- /javascript/load-a-user-created-javascript-file-in-the-browser.md: -------------------------------------------------------------------------------- 1 | # Load a user-created JavaScript file in the browser 2 | 3 | I ran into this when building a [live coding audio playground](https://jakelazaroff.com/words/building-a-live-coding-audio-playground/), and presumably it's useful for other similar apps. 
The issue is that APIs like [`AudioWorklet`](https://developer.mozilla.org/en-US/docs/Web/API/AudioWorklet) expect to be given a separate JavaScript file to run as a worker or worklet, but for apps in which the user writes code themselves there's no easy way to serve that file (without running a webserver). 4 | 5 | The trick is to use a combination of a [`File`](https://developer.mozilla.org/en-US/docs/Web/API/File) (which represents raw data, plus some file-specific things like a name) and [`URL.createObjectURL`](https://developer.mozilla.org/en-US/docs/Web/API/URL/createObjectURL_static), which lets you create "stub" URLs for `File` objects. 6 | 7 | Here's how to use the trick to create a Web Worker: 8 | 9 | ```js 10 | // source code goes here 11 | const src = ""; 12 | 13 | // create a fake JS file from the source code 14 | const file = new File([src], "file.js"); 15 | 16 | // create a URL for the fake JS file 17 | const url = URL.createObjectURL(file.slice(0, file.size, "application/javascript")); 18 | 19 | // add the fake JS file as a Web Worker 20 | const worker = new Worker(url); 21 | 22 | // revoke the URL so as to not leak memory 23 | URL.revokeObjectURL(url); 24 | ``` 25 | 26 | The call to `file.slice` is there to fix a Safari bug where it can't infer the MIME type. 27 | -------------------------------------------------------------------------------- /javascript/programmatically-create-svg-elements.md: -------------------------------------------------------------------------------- 1 | # Programmatically create SVG elements 2 | 3 | This is simple, but it tripped me up for a bit. TL;DR you can't do this: 4 | 5 | ```js 6 | const svg = document.createElement("svg"); 7 | const line = document.createElement("line"); 8 | svg.append(line); 9 | ``` 10 | 11 | I mean, you can, but it won't work.
These assertions will both fail: 12 | 13 | ```js 14 | console.assert(svg instanceof SVGSVGElement, "%o is an SVG element", svg); 15 | console.assert(line instanceof SVGLineElement, "%o is an SVG element", line); 16 | ``` 17 | 18 | What you need instead is [`document.createElementNS`](https://developer.mozilla.org/en-US/docs/Web/API/Document/createElementNS): 19 | 20 | ```js 21 | const svg = document.createElementNS("http://www.w3.org/2000/svg", "svg"); 22 | const line = document.createElementNS("http://www.w3.org/2000/svg", "line"); 23 | svg.append(line); 24 | ``` 25 | -------------------------------------------------------------------------------- /logic/send-stereo-output-through-specific-output-channels.md: -------------------------------------------------------------------------------- 1 | # Send stereo output through specific output channels 2 | 3 | This probably works for many audio interfaces, though in this case I'm using a [Quad Cortex](https://neuraldsp.com/quad-cortex). When I have my interface hooked up to Logic Pro, I want to route stereo output to the specific output channels that are connected to my speakers. 4 | 5 | This works as of Logic Pro 10.8.1. 6 | 7 | First, ensure the interface is selected as the output device: 8 | 9 | 1. Open Settings, either by going through the Logic Pro menu or hitting `⌘,`. 10 | 2. Switch to the Audio tab. 11 | 3. Select the interface in the Output Device dropdown. 12 | 13 | Then, change the output channels: 14 | 15 | 1. In the Audio tab, select the nested I/O Assignments tab. 16 | 2. Within the third level (!) of nested subtabs, select Output. 17 | 3. Set the Output dropdown in the Stereo section to the desired outputs.
18 | -------------------------------------------------------------------------------- /make/build-all-files-with-a-given-extension.md: -------------------------------------------------------------------------------- 1 | # Build all files with a given extension 2 | 3 | The most basic use of Make (other than as a simple task runner) is reading some files, compiling them somehow and outputting other files. 4 | 5 | Here's a minimal example that takes an existing file with extension `.in` (`some/dir/anything.in`) and writes "Created from filename.in" to a sibling file with the extension `.out`: 6 | 7 | ```makefile 8 | %.out: %.in 9 | echo "Created from $<" > $@ 10 | ``` 11 | 12 | `$@` and `$<` are two special sigils that are replaced with the actual filenames of the target and the first dependency, respectively. So if make were invoked with `make some/dir/anything.out`, the expanded command would read `echo "Created from some/dir/anything.in" > some/dir/anything.out`. 13 | 14 | Note that the target needs to correspond to a specific input file that exists on the filesystem; if it doesn't, make will say something like `` make: *** No rule to make target `nonexistentfile.out'. Stop.`` In order to build **all** files with a given extension, we need to use wildcards. [Makefile tutorial](https://makefiletutorial.com/#-wildcard-1) has a short summary of the different types, and it's not entirely clear to me what the difference is, but here's my best understanding of how to use them: 15 | 16 | ```make 17 | INPUTS := $(wildcard *.in) 18 | OUTPUTS := $(patsubst %.in,%.out,$(INPUTS)) 19 | 20 | %.out: %.in 21 | echo "Created from $<" > $@ 22 | 23 | build: $(OUTPUTS) 24 | echo "Built!" 25 | ``` 26 | 27 | This creates two variables, `INPUTS` and `OUTPUTS`. For `INPUTS`, `$(wildcard *.in)` will be expanded to all files with the extension `.in`. The list can be filtered further: `$(wildcard src/**/*.in)` will only show files in descendent directories of `src`.
For `OUTPUTS`, `$(patsubst %.in,%.out,$(INPUTS))` changes the extension of every `.in` file name in `INPUTS` to `.out`. 28 | 29 | The task `build: $(OUTPUTS)` will build all the input files. Let's say there are two files adjacent to the makefile, `one.in` and `two.in`. It's easiest to think of this in a few steps: 30 | 31 | 1. Run `make build` from the command line. 32 | 2. Make sees that `build` has a dependency on `$(OUTPUTS)`. 33 | 3. `OUTPUTS` expands to `one.out two.out` (since it's all the files in `INPUTS` with their extension replaced). 34 | 4. Make runs the dependent tasks `make one.out` and `make two.out`, which both correspond to the `%.out: %.in` task. You can think of this as running that task multiple times with different arguments. 35 | 5. The `%.out: %.in` task builds `one.out` and `two.out`. 36 | 6. Now that the dependencies have been satisfied, make runs the `build` task. 37 | 38 | The benefit of using make is that if we run through the steps again, it will only re-run the `%.out: %.in` task for `.out` files that are older than the corresponding `.in` files. 39 | -------------------------------------------------------------------------------- /make/list-all-commands-in-a-makefile.md: -------------------------------------------------------------------------------- 1 | # List all commands in a Makefile 2 | 3 | It's pretty inscrutable, but Suvash Thapaliya has [a snippet of code that will add a `help` command to any Makefile](https://www.thapaliya.com/en/writings/well-documented-makefiles/).
4 | 5 | Just add a `help` target with this long `awk` string: 6 | 7 | ```make 8 | help: ## Display this help 9 | @awk 'BEGIN {FS = ":.*##"; printf "\nUsage:\n make \033[36m<target>\033[0m\n"} /^[.a-zA-Z_-]+:.*?##/ { printf " \033[36m%-15s\033[0m %s\n", $$1, $$2 } /^##@/ { printf "\n\033[1m%s\033[0m\n", substr($$0, 5) } ' $(MAKEFILE_LIST) 10 | ``` 11 | 12 | An example Makefile for a TypeScript project looks something like this (long `help` target omitted): 13 | 14 | ```make 15 | ##@ Development 16 | .PHONY: install typecheck 17 | 18 | install: ## Install dependencies 19 | @pnpm install 20 | 21 | typecheck: ## Check static types 22 | @pnpm tsc --noEmit 23 | 24 | ##@ Deployment 25 | .PHONY: build 26 | 27 | build: ## Build for production 28 | @rm -rf build 29 | @pnpm tsc 30 | ``` 31 | 32 | Running `make help` will show this: 33 | 34 | ``` 35 | $ make help 36 | 37 | Usage: 38 | make <target> 39 | 40 | Development 41 | install Install dependencies 42 | typecheck Check static types 43 | 44 | Deployment 45 | build Build for production 46 | ``` 47 | 48 | The snippet makes it so that `##` after a target becomes the help text for that target, and `##@` before a bunch of targets turns that into a group. (In this example, `##@ Helpers` would have gone before `help`.) 49 | 50 | To display the help text when just typing `make` alone, add `.DEFAULT_GOAL = help` at the top of the Makefile.
51 | -------------------------------------------------------------------------------- /math/3d-coordinate-systems.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/jakelazaroff/til/b4eb3794d318685b90898ea848ddddc6e5296be9/math/3d-coordinate-systems.png -------------------------------------------------------------------------------- /math/check-whether-an-angle-is-between-two-other-angles.md: -------------------------------------------------------------------------------- 1 | # Check whether an angle is between two other angles 2 | 3 | Weirdly, I've had to do this a lot recently: given an angle, determine whether it's bounded by two other angles. Technically, this is _always_ true — just go the other direction around the circle — but usually I care about going in a _particular_ direction. 4 | 5 | Step one is to normalize all the angles so that they fit into the range `[0, 2π]` (or `[0, 360]` if you prefer degrees). We'll use TypeScript to demonstrate: 6 | 7 | ```ts 8 | const TWO_PI = Math.PI * 2; 9 | export function normalize(angle: number) { 10 | const clamped = angle % TWO_PI, // normalize into range [-2π, 2π] 11 | positive = clamped + TWO_PI, // translate into range [0, 4π] 12 | result = positive % TWO_PI; // normalize into range [0, 2π] 13 | 14 | return result; 15 | } 16 | ``` 17 | 18 | Three steps here: 19 | 20 | 1. Modulo the angle by `2π`. If it's positive, this compresses it into the range `[0, 2π]`; if it's negative, this compresses it into the range `[-2π, 0]`. 21 | 2. Add `2π`. This increases the range to `[0, 4π]`, ensuring it's positive. 22 | 3. Modulo the angle by `2π` again, compressing it into the range `[0, 2π]`. 
23 | 24 | Once all the angles are within the range `[0, 2π]`, they can be checked: 25 | 26 | ```ts 27 | export function between(min: number, theta: number, max: number) { 28 | if (min < max) return min <= theta && theta < max; 29 | else return min <= theta || theta < max; 30 | } 31 | ``` 32 | 33 | 1. If `min` is less than `max`, we can just ensure that `theta` is between them. 34 | 2. Otherwise, it means the angles wrap around `0` (for example, `min` might be `7π / 4` and `max` might be `π / 4`). In that case, we need to check whether `theta` is **greater** than or equal to `min` (and implicitly less than `2π`, which is the top of the range) _or_ **less** than `max` (and implicitly greater than `0`, which is the bottom of the range). 35 | 36 | This function assumes that `min` is inclusive and `max` is exclusive. 37 | -------------------------------------------------------------------------------- /math/convert-between-3d-coordinate-systems.md: -------------------------------------------------------------------------------- 1 | # Convert between 3D coordinate systems 2 | 3 | 3D coordinates are represented with X, Y and Z axes. Technically, there are 48 different ways to orient the axes with respect to each other, but [most 3D software has converged on four](https://twitter.com/FreyaHolmer/status/1325556229410861056): Y-up vs. Z-up, and right handed vs. left handed (pick one option from each). 4 | 5 | Here's roughly how it works: 6 | 7 | - In all systems, the X axis is horizontal and points to the right, meaning X coordinates get larger the further right they are. 8 | - "Y-up" and "Z-up" refer to the vertical axis, which points up. If you're standing at 0 on the vertical axis, coordinates above you are positive and coordinates below you are negative; those coordinates will be labeled Y in a Y-up system and Z in a Z-up system. 9 | - "Right handed" and "left handed" determine which direction the Y and Z axes point with respect to each other.
This is easiest to visualize by orienting yourself such that the X axis points right and the vertical axis points up. If the vertical axis is Y (Y-up), a right handed system means the Z axis is pointing toward you and a left handed system means the Z axis is pointing away from you. If the vertical axis is Z (Z-up), this is flipped: a right handed system means the Y axis is pointing away from you, and a left handed system means the Y axis is pointing toward you. 10 | 11 | Wikipedia has [an article on the right-hand rule](https://en.wikipedia.org/wiki/Right-hand_rule), but I find it confusing. It's easiest for me to visualize by assigning an axis to each of my first three fingers: thumb is X, index finger is Y and middle finger is Z. I make an L with my index finger and thumb, and extend my middle finger perpendicular to my palm. Visualized this way, here are the four coordinate systems (remember, the X axis is always pointing right and the vertical axis is always pointing up): 12 | 13 | ![A diagram of Y-up, Z-up, left handed and right handed coordinate systems](./3d-coordinate-systems.png) 14 | 15 | - Y-up, right handed: raise your right hand in front of your face, palm to you. Your thumb (X axis) should be extending right, your index finger (Y axis) should be pointing up and your middle finger (Z axis) should be pointing toward you. 16 | - Y-up, left handed: raise your left hand in front of your face, palm away from you. Your thumb (X axis) should still be extending right and your index finger (Y axis) should still be pointing up, but now your middle finger (Z axis) should be pointing away from you. 17 | - Z-up, right handed: extend your right arm in front of you such that your palm is facing upward. Your thumb (X axis) should be extending right, your middle finger (Z axis) should be pointing up and your index finger (Y axis) should be pointing away from you. 
18 | - Z-up, left handed: a little tricky to do, but contort your left arm such that your wrist is away from your body and your palm is facing upward. Your thumb (X axis) should still be extending right and your middle finger (Z axis) should still be pointing up, but now your index finger (Y axis) should be pointing toward you. 19 | 20 | Converting between these systems involves some combination of swapping the Y and Z axes and/or inverting the forward axis: 21 | 22 | - To switch from right handed to left handed (and vice versa), invert the forward axis. `(0, 0, 1)` in Y-up right handed is equal to `(0, 0, -1)` in Y-up left handed; `(0, 1, 0)` in Z-up right handed is equal to `(0, -1, 0)` in Z-up left handed. 23 | 24 | **Y-up:** 25 | 26 | ``` 27 | x' = x 28 | y' = y 29 | z' = -z 30 | ``` 31 | 32 | **Z-up:** 33 | 34 | ``` 35 | x' = x 36 | y' = -y 37 | z' = z 38 | ``` 39 | 40 | - To switch from Y-up to Z-up (and vice versa) with the same handedness, swap the Y and Z axes (this is the same as rotating a quarter turn about the X axis). Note that this flips the forward axis — if you were standing at `(0, 0, 0)` looking at the initial point `(0, 1, 2)` in Y-up coordinates, the point `(0, 2, 1)` in Z-up coordinates would be behind you. 41 | 42 | ``` 43 | x' = x 44 | y' = z 45 | z' = y 46 | ``` 47 | 48 | - Therefore, to switch from Y-up to Z-up (and vice versa) without also flipping the forward axis, you need to also flip the handedness — swap the Y and Z axes and then invert the new forward axis. `(0, 0, 1)` in Y-up right handed is equal to `(0, 1, 0)` in Z-up left handed; `(0, 1, 0)` in Z-up right handed is equal to `(0, 0, 1)` in Y-up left handed.
49 | 50 | **Y-up to Z-up:** 51 | 52 | ``` 53 | x' = x 54 | y' = z 55 | z' = -y 56 | ``` 57 | 58 | **Z-up to Y-up:** 59 | 60 | ``` 61 | x' = x 62 | y' = -z 63 | z' = y 64 | ``` 65 | -------------------------------------------------------------------------------- /math/find-a-point-on-a-sphere.md: -------------------------------------------------------------------------------- 1 | # Find a point on a sphere 2 | 3 | Given the radius (`r`) of a sphere, an azimuth angle (angle around the vertical axis; `s`) and a polar angle (angle down from the vertical axis; `t`), the equation to find the coordinates of the corresponding point is as follows: 4 | 5 | ``` 6 | x = r * cos(s) * sin(t) 7 | y = r * cos(t) 8 | z = r * sin(s) * sin(t) 9 | ``` 10 | 11 | This assumes the axes are in a Y-up right handed orientation. 12 | -------------------------------------------------------------------------------- /math/rotate-a-point-around-a-circle.md: -------------------------------------------------------------------------------- 1 | # Rotate a point around a circle 2 | 3 | Given a point `(x, y)` and an angle `θ` (theta) measured in radians, the coordinates of the point `(x', y')` found by rotating the original point counterclockwise around a circle centered on `(0, 0)` are: 4 | 5 | ``` 6 | x' = x * cos(θ) - y * sin(θ) 7 | y' = x * sin(θ) + y * cos(θ) 8 | ``` 9 | 10 | To rotate clockwise, flip the sign of `θ` — if it's positive then make it negative, and vice versa. 11 | 12 | If the circle isn't centered on `(0, 0)`, find the delta between the center of the circle and the origin, subtract the delta from the point before rotating, then add that delta back to the new point. 13 | 14 | Keep in mind that the initial coordinates must be used for both calculations.
In particular, avoid this issue where one of the coordinates is modified, and then that modified coordinate is used to calculate the other coordinate: 15 | 16 | ```js 17 | const point = { x: 10, y: 10 }; 18 | const theta = Math.PI / 8; 19 | 20 | // ❌ incorrect 21 | point.x = point.x * Math.cos(theta) - point.y * Math.sin(theta); 22 | point.y = point.x * Math.sin(theta) + point.y * Math.cos(theta); // point.x is different now! 23 | ``` 24 | 25 | Instead, either assign the result to a new variable, or — if the original object must be mutated — store the previous `x` and `y` and use them when calculating the result. 26 | 27 | ```js 28 | const point = { x: 10, y: 10 }; 29 | const theta = Math.PI / 8; 30 | 31 | // ✅ correct if not mutating original object 32 | const result = { 33 | x: point.x * Math.cos(theta) - point.y * Math.sin(theta), 34 | y: point.x * Math.sin(theta) + point.y * Math.cos(theta), 35 | }; 36 | 37 | // ✅ correct if mutating original object 38 | const { x, y } = point; 39 | point.x = x * Math.cos(theta) - y * Math.sin(theta); 40 | point.y = x * Math.sin(theta) + y * Math.cos(theta); 41 | ``` 42 | -------------------------------------------------------------------------------- /nextjs/dont-server-side-render-a-client-component.md: -------------------------------------------------------------------------------- 1 | # Don't server-side render a client component 2 | 3 | React Server Components splits React components into two kinds: server components and client components. 4 | 5 | Server components run only on the server. Client components, by contrast, can run on both the client _and_ the server (although most hooks like `useEffect` won't run on the server). That way, there isn't just a big hole in the HTML before the JavaScript bundle loads. 6 | 7 | Sometimes, though, you _don't want_ a component to render on the server. Say it gets its content from `localStorage`, which only exists in the browser. 
If the component falls back to an empty list, React will throw a hydration error, because the server-side rendered HTML won't match what the client renders. 8 | 9 | Marking the component with `"use client"` is insufficient: that just tells React that it _can_ run on the client, not that it can _only_ run on the client. 10 | 11 | One way to solve this is to (ab)use the `dynamic` function, which is used for [lazy loading components](https://nextjs.org/docs/pages/building-your-application/optimizing/lazy-loading). 12 | 13 | Normally, you'd pass a function that imports the other component asynchronously: 14 | 15 | ```js 16 | import dynamic from "next/dynamic"; 17 | 18 | const SomeDynamicComponent = dynamic(() => import("./SomeComponent")); 19 | ``` 20 | 21 | That's not a requirement, though; all you need to pass `dynamic` is a function which returns a promise that resolves with a React component. Additionally, it accepts an options parameter that lets you control whether it should be server-side rendered. 22 | 23 | The full fix looks something like this: 24 | 25 | ```js 26 | import dynamic from "next/dynamic"; 27 | import SomeComponent from "./SomeComponent"; 28 | 29 | const SomeClientOnlyComponent = dynamic(() => Promise.resolve(SomeComponent), { 30 | ssr: false, 31 | }); 32 | ``` 33 | 34 | Boom: a client component that's included in your bundle but not rendered on the server. 35 | -------------------------------------------------------------------------------- /nodejs/execute-typescript-files-in-nodejs.md: -------------------------------------------------------------------------------- 1 | # Execute TypeScript files in Node.js 2 | 3 | One annoying thing about using TypeScript with Node.js is that you mostly can't — you either need to transpile all your files before running `node`, or use a "wrapper" interpreter like [ts-node](https://www.npmjs.com/package/ts-node) or [tsx](https://www.npmjs.com/package/tsx).
It looks like that's changing with [an experimental feature called loaders that lets you hook into the module loading process](https://nodejs.org/api/esm.html#loaders). Node.js still expects that you give it JavaScript, but the loaders let you convert files into JavaScript as they're imported, before Node.js gets its hands on them. 4 | 5 | (Note that as of February 2023, the Node.js Loaders API is marked as unstable, meaning that this code may stop working if it changes). 6 | 7 | Raphael Medaer has [a minimal example with a CSS loader](https://raphael.medaer.me/2022/01/28/fun-with-node-experimental-modules-and-loaders.html). Loading TypeScript is a bit more involved (mostly because [TypeScript module resolution is complicated](https://www.typescriptlang.org/docs/handbook/module-resolution.html#how-typescript-resolves-modules), since you can do things like omit the extension) but here's my best attempt at it. 8 | 9 | The code will all go into a file called `loader.mjs`, which gets used like this: 10 | 11 | ```sh 12 | node --experimental-loader ./loader.mjs filetorun.ts 13 | ``` 14 | 15 | There are two functions here, hooking into two steps: 16 | 17 | - [resolve](https://nodejs.org/api/esm.html#resolvespecifier-context-nextresolve) receives a module specifier (the `./foo` of `import "./foo"`) and parent URL (the full path to the importing file, such as `file:///Users/jake/filetorun.ts`). The job of this function is to return the URL of the imported file (in this example, `file:///Users/jake/foo.ts`). 18 | - [load](https://nodejs.org/api/esm.html#loadurl-context-nextload) receives the URL returned from the `resolve` function; its job is to return the file contents as JavaScript. 
19 | 20 | ```js 21 | import * as fs from "node:fs/promises"; 22 | import * as path from "node:path"; 23 | import { pathToFileURL, URL } from "node:url"; 24 | 25 | import ts from "typescript"; 26 | 27 | export async function resolve(specifier, context, next) { 28 | // if there's no parent, use node's normal module resolution 29 | if (!context.parentURL) return next(specifier, context, next); 30 | 31 | // if the specifier isn't a file path, use node's normal module resolution 32 | if (!/^(file:\/{3}|\.{0,2}\/{1,2})/.test(specifier)) return next(specifier, context, next); 33 | 34 | // handle an edge case in which the `URL` constructor interprets a leading `//` as an HTTP protocol 35 | const fixed = specifier.replace(/^\/{2}/, "/"); 36 | 37 | // get the full import specifier, including the parent URL 38 | const importpath = new URL(fixed, context.parentURL).pathname; 39 | 40 | // remove the extension from the import path 41 | const base = importpath.replace(/\.[^/.]+$/, ""); 42 | 43 | // create a list of possible paths 44 | const paths = [ 45 | base + ".ts", 46 | base + ".tsx", 47 | path.join(base, "index.ts"), 48 | path.join(base, "index.tsx") 49 | ]; 50 | 51 | // iterate through each path 52 | for (const file of paths) { 53 | // check whether the file exists 54 | const exists = await fs 55 | .stat(file) 56 | .then(() => true) 57 | .catch(() => false); 58 | 59 | // if a typescript file exists, continue module resolution with that file's url 60 | if (exists) return next(pathToFileURL(file).href, context, next); 61 | } 62 | 63 | // if no typescript file is found, just continue with node's normal module resolution 64 | return next(specifier, context, next); 65 | } 66 | 67 | export async function load(url, context, next) { 68 | // if the file extension isn't .ts or .tsx, use node's normal module loading 69 | if (!/\.tsx?$/.test(url)) return next(url, context, next); 70 | 71 | // read the file from disk 72 | const source = await fs.readFile(new URL(url)); 73 | 74 | // 
transpile the file to javascript 75 | const { outputText: result } = ts.transpileModule(source.toString(), { 76 | compilerOptions: { target: "esnext", module: "esnext" } 77 | }); 78 | 79 | // return the transpiled source code 80 | return { shortCircuit: true, format: "module", source: result }; 81 | } 82 | ``` 83 | 84 | Some notes on the module resolution: 85 | 86 | - If `context.parentURL` is undefined in `resolve`, it means the file was passed directly to `node` as a command line argument. If that's the case, assume it's a real file and skip any further custom resolution. 87 | - `resolve` only attempts to find files whose specifiers begin with `/`, `//`, `./`, `../` or `file:///`. More information about module resolution can be found in the [Node.js documentation on ECMAScript modules](https://nodejs.org/api/esm.html#terminology). 88 | - Technically, ECMAScript modules require the import specifier to include an extension (`import "./foo.js"` is valid; `import "./foo"` is not). In practice, most bundlers allow you to omit the extension. This `resolve` function should work whether or not the extension is present. 89 | 90 | If a faster transpiler such as [esbuild](https://esbuild.github.io) is already one of the project dependencies, it probably makes sense to use that instead of the default TypeScript one: 91 | 92 | ```js 93 | // transpile the file to javascript 94 | const { code: result } = await esbuild.transform(source, { loader: "ts" }); 95 | ``` 96 | -------------------------------------------------------------------------------- /nodejs/get-dirname-in-esm.md: -------------------------------------------------------------------------------- 1 | # Get `__dirname` in ESM 2 | 3 | When using ES modules with Node.js, `__dirname` (the path to the file's directory) isn't available. 
Here's a [short and sweet replacement](https://stackoverflow.com/a/50052194): 4 | 5 | ```js 6 | import { dirname } from "node:path"; 7 | import { fileURLToPath } from "node:url"; 8 | 9 | const __dirname = dirname(fileURLToPath(import.meta.url)); 10 | ``` 11 | 12 | **Update:** As of [Node 21.2.0](https://nodejs.org/en/blog/release/v21.2.0), `__filename` and `__dirname` are natively available as `import.meta.filename` and `import.meta.dirname`, respectively. 13 | -------------------------------------------------------------------------------- /pnpm/patch-a-node_modules-dependency.md: -------------------------------------------------------------------------------- 1 | # Patch a `node_modules` dependency 2 | 3 | If a library has a bug, there's an easier solution than forking the repo: patch the dependency in your `node_modules` folder! If you're using npm, you can [install a package to do it](https://www.npmjs.com/package/patch-package), but [pnpm has it built in](https://pnpm.io/cli/patch): 4 | 5 | ```sh 6 | pnpm patch <pkg> 7 | ``` 8 | 9 | When you run this, pnpm will generate a temporary folder (on macOS, it looks something like `/private/var/folders/mg/longstringofchars/T/otherlongstringofchars`) which contains the exact contents of the package as installed in `node_modules`. 10 | 11 | You can change anything you want. I added `"sideEffects": false` to the `package.json` of [lil-gui](https://github.com/georgealways/lil-gui) so esbuild could tree shake it out of production bundles, but changing code or type definitions will work just as well. 12 | 13 | Once you've changed the file (or files!), run `pnpm patch-commit <path>` (where `<path>` is that temporary folder) and pnpm will generate a patch with your changes at `patches/package@version.patch` and modify your `package.json` and `pnpm-lock.yaml`. After those files are committed, pnpm will apply your changes every time you install your project's dependencies.
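End to end, the workflow looks something like this (`left-pad@1.3.0` is a hypothetical package here — substitute the real package, and the actual folder path that pnpm prints):

```sh
# open an editable copy of the package; pnpm prints the temporary folder path
pnpm patch left-pad@1.3.0

# edit the files in that folder, then turn the edits into a patch
pnpm patch-commit /private/var/folders/mg/longstringofchars/T/otherlongstringofchars
```

From then on, `pnpm install` reapplies the patch automatically.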
14 | -------------------------------------------------------------------------------- /poetry/create-venv-folders-within-projects.md: -------------------------------------------------------------------------------- 1 | # Create `.venv` folders within projects 2 | 3 | By default, [Poetry](https://python-poetry.org/) creates virtual environment directories in a global location (`/Users/username/Library/Caches/pypoetry/virtualenvs` on macOS). Since tools like VS Code need to find a virtual environment to do things like autocompletion of dependencies, it's convenient to colocate that directory with a project's source code. 4 | 5 | The command to do that is short and sweet: 6 | 7 | ```bash 8 | poetry config virtualenvs.in-project true 9 | ``` 10 | -------------------------------------------------------------------------------- /prosemirror/prevent-extra-whitespace-in-nodeviews.md: -------------------------------------------------------------------------------- 1 | # Prevent extra whitespace in NodeViews 2 | 3 | While trying to build a [ProseMirror `NodeView`](https://prosemirror.net/docs/ref/#view.NodeView) using a [Shoelace dropdown](https://shoelace.style/components/dropdown), I kept getting a ton of extra vertical space around the dropdown trigger. 4 | 5 | The fix turned out to be really simple. ProseMirror sets some styles on its root element. The culprit was [`white-space: break-spaces`](https://developer.mozilla.org/en-US/docs/Web/CSS/white-space#break-spaces) — an inherited style which preserves newline characters. 6 | 7 | I didn't want to override that style for the whole ProseMirror element, but setting `white-space: normal` on the `NodeView` element fixed it easily.
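In code, the fix is a single declaration scoped to the NodeView's root element (the class name here is hypothetical — use whatever selector matches your NodeView's `dom` element):

```css
/* restore normal whitespace handling inside the NodeView only */
.my-node-view {
  white-space: normal;
}
```

ProseMirror's `white-space: break-spaces` still applies to the rest of the editor, so regular text nodes keep their newline-preserving behavior.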
8 | -------------------------------------------------------------------------------- /prosemirror/use-a-svelte-component-as-a-nodeview.md: -------------------------------------------------------------------------------- 1 | # Use a Svelte component as a NodeView 2 | 3 | The ProseMirror rich text editor library has a concept called `NodeView` for [rendering custom widgets within a document](https://prosemirror.net/docs/guide/#view.node_views). 4 | 5 | To use a `NodeView`, you write a "constructor" function that takes in the ProseMirror `Node`, `EditorView` and a function to get the node's position. That function is expected to set up the DOM for the custom widget and return an object implementing the [`NodeView` interface](https://prosemirror.net/docs/ref/#view.NodeView). 6 | 7 | The examples set up the DOM using vanilla JS, but it's pretty simple to instead create a DOM element and render a framework component inside of it. This general approach should work for any framework — Svelte, React, Solid, etc. — as well as with web components. 8 | 9 | TL;DR, this `SvelteNodeView` class creates an element with the given tag (by default `
div`) inside of which it mounts a Svelte component. It spreads the node's attributes into the component's props, adding an additional `onchange` prop that can be called from within the component to set that attribute. 10 | 11 | Right now, it doesn't update the component if the node's attributes are changed from outside. 12 | 13 | The static `create` method creates a constructor function given a Svelte component class and an optional tag name. 14 | 15 | ```ts 16 | import { type Node } from "prosemirror-model"; 17 | import { type EditorView, type NodeViewConstructor } from "prosemirror-view"; 18 | import { type SvelteComponent, type ComponentType, mount } from "svelte"; 19 | 20 | type Component = ComponentType< 21 | SvelteComponent< 22 | Record<string, any> & { 23 | onchange(name: string, value: any): void; 24 | } 25 | > 26 | >; 27 | 28 | class SvelteNodeView { 29 | node: Node; 30 | view: EditorView; 31 | getPos: () => number | undefined; 32 | dom: HTMLElement; 33 | 34 | constructor( 35 | comp: Component, 36 | tag: string, 37 | node: Node, 38 | view: EditorView, 39 | getPos: () => number | undefined 40 | ) { 41 | this.node = node; 42 | this.view = view; 43 | this.getPos = getPos; 44 | 45 | const target = (this.dom = window.document.createElement(tag)); 46 | target.style.display = "contents"; 47 | 48 | mount(comp, { 49 | target, 50 | props: { 51 | ...node.attrs, 52 | onchange(name, value) { 53 | const pos = getPos(); 54 | if (pos === undefined) return; 55 | 56 | const tr = view.state.tr.setNodeAttribute(pos, name, value); 57 | view.dispatch(tr); 58 | } 59 | } 60 | }); 61 | } 62 | 63 | static create(comp: Component, tag: string = "div"): NodeViewConstructor { 64 | return (node, view, getPos) => new SvelteNodeView(comp, tag, node, view, getPos); 65 | } 66 | } 67 | ``` 68 | -------------------------------------------------------------------------------- /redbean/bundle-files-into-a-redbean-zip-archive.md: -------------------------------------------------------------------------------- 1 | # Bundle files
into a redbean zip archive 2 | 3 | [redbean](https://redbean.dev/) is a zip file that contains a webserver that runs on macOS, Windows, Linux and a few other operating systems. You download an executable that's also a zip archive, and it'll serve any files you add to the archive. 4 | 5 | There's extensive documentation on the redbean API, but less on how to actually do things with a zip archive. Here's a brief guide to getting up and running on macOS: 6 | 7 | First, download the latest redbean release and make it executable: 8 | 9 | ```sh 10 | curl https://redbean.dev/redbean-latest.com >redbean.com 11 | chmod +x redbean.com 12 | ``` 13 | 14 | Running the binary with `./redbean.com` starts a web server on port 8080. If you visit http://localhost:8080 in your browser, you'll see a directory listing of all the files in the zip archive. 15 | 16 | From the command line, you can see that same list of files with the `unzip` command. The `Z` flag tells it to print info instead of extracting the archive, and the `1` flag tells it to only print filenames. 17 | 18 | ```sh 19 | unzip -Z1 redbean.com 20 | ``` 21 | 22 | There are a bunch of files in `/usr/share` related to time zones and SSL. If you want to ignore these, you can add `-x "usr/*"` to your command: 23 | 24 | ```sh 25 | unzip -Z1 redbean.com -x "usr/*" 26 | ``` 27 | 28 | redbean comes with a bunch of standard files (such as `help.txt`, which contains the documentation) that you might not want in your actual server. You can remove files with `zip -d`: 29 | 30 | ```sh 31 | zip -d redbean.com help.txt 32 | ``` 33 | 34 | If you want to make things a little easier, you can use [`fzf`](https://github.com/junegunn/fzf) to make a nice interactive terminal app. 
Here's a command that will list all the files in the redbean archive, let you select one or more with your arrow keys and then delete them: 35 | 36 | ```sh 37 | unzip -Z1 redbean.com -x "usr/*" | fzf -m | xargs -I{} zip -d redbean.com {} 38 | ``` 39 | 40 | There are a bunch of examples in the redbean documentation about adding files to the archive. The most useful is probably adding the contents of an entire directory: 41 | 42 | ```sh 43 | zip -r -j redbean.com src/. 44 | ``` 45 | 46 | Context switching to add files and restart redbean after every change is annoying. The [`entr`](https://eradman.com/entrproject/) utility can automatically re-run a command every time it detects files change. This loop will zip a directory into redbean, run the server and re-zip and restart whenever you add or change a file: 47 | 48 | ```sh 49 | while true; do 50 | find src -print | entr -ddr sh -c "zip -r -j redbean.com src/. && ./redbean.com" 51 | [[ $? -eq 0 ]] && break 52 | done 53 | ``` 54 | -------------------------------------------------------------------------------- /rsbuild/migrate-from-create-react-app.md: -------------------------------------------------------------------------------- 1 | # Migrate from Create React App 2 | 3 | I just tried to get [SongRender](https://songrender.com/) running again. I started the React codebase in 2018 with [Create React App](https://create-react-app.dev/) — a choice that, six years later, has aged poorly. 4 | 5 | I spent a couple hours trying to get it up and running again, broke a production deploy, kept trying for a bit and then decided to follow [this recommendation from Augustinus Nathaniel](https://twitter.com/sozonome/status/1744738854677131461) to migrate to Rsbuild. 6 | 7 | What's Rsbuild? An ["Rspack-based build tool"](https://www.rsbuild.dev). What's Rspack? A ["fast Rust-based web bundler"](https://www.rspack.dev), one of the many native bundlers that have been popping up to speed up the JavaScript ecosystem. 
I'm not sure why there are two layers here, but I'm sure they have their reasons. 8 | 9 | Anyway. Particularly notable about Rspack is their stated goal of [maximizing webpack compatibility](https://www.rspack.dev/guide/migrate-from-webpack.html). While I don't necessarily think that would be the best priority for a greenfield tool, it does make sense for migrating from Create React App, which uses webpack under the hood. 10 | 11 | Rsbuild has [a guide for migrating from Create React App](https://rsbuild.dev/guide/migration/cra). Here are the parts I needed to do: 12 | 13 | 1. Replace Create React App with Rsbuild. SongRender uses Yarn (another poor decision, but a story for another day) which is why I'm using it here: 14 | 15 | ```bash 16 | yarn remove react-scripts 17 | yarn add @rsbuild/core @rsbuild/plugin-react -D 18 | ``` 19 | 20 | 2. Update the `start` and `build` scripts in `package.json`: 21 | 22 | ```json 23 | { 24 | "scripts": { 25 | "start": "rsbuild dev", 26 | "build": "rsbuild build" 27 | } 28 | } 29 | ``` 30 | 31 | 3. Create an `rsbuild.config.ts` configuration file. Here's what I ended up with: 32 | 33 | ```ts 34 | import { defineConfig } from "@rsbuild/core"; 35 | import { pluginReact } from "@rsbuild/plugin-react"; 36 | import { pluginSvgr } from "@rsbuild/plugin-svgr"; 37 | 38 | export default defineConfig({ 39 | plugins: [pluginReact(), pluginSvgr()], 40 | html: { template: "./public/index.html" }, 41 | output: { distPath: { root: "build" } }, 42 | server: { 43 | proxy: { 44 | "/api": { target: "http://localhost:3001", pathRewrite: { "^/api": "" } }, 45 | "/downloads": { 46 | target: "https://songrender-local.nyc3.digitaloceanspaces.com", 47 | pathRewrite: { "^/downloads": "" } 48 | } 49 | } 50 | } 51 | }); 52 | ``` 53 | 54 | - The [React plugin](https://rsbuild.dev/plugins/list/plugin-react), um, provides support for React.
55 | - The [SVGR plugin](https://rsbuild.dev/plugins/list/plugin-svgr) supports Create React App's feature that lets you [import SVGs as React components](https://create-react-app.dev/docs/adding-images-fonts-and-files/#adding-svgs). That's a bad idea — SVGs should be in a separate sprite sheet! — but it was the decision I made in 2018 and now was not the time to change it. 56 | - The `html` object defines the path to the [HTML index file](https://rsbuild.dev/guide/basic/html-template) that bootstraps the single page app. 57 | - The `output` object changes the build directory (`build` in Create React App, `dist` in Rsbuild). 58 | - The [proxy server](https://rsbuild.dev/config/server/proxy) proxies API requests in development, replicating [a similar Create React App feature](https://create-react-app.dev/docs/proxying-api-requests-in-development/). 59 | 60 | 4. Rsbuild injects environment variables starting with `PUBLIC_` rather than `REACT_APP_`. It has [an option to replicate Create React App's behavior](https://rsbuild.dev/guide/migration/cra#environment-variables), but I just went through and changed the names. 61 | 62 | 5. I had used the [`workerize-loader`](https://github.com/developit/workerize-loader) library to circumvent Create React App's lack of support for Web Workers. That broke — but luckily, [Rspack supports web workers out of the box](https://rspack.org/guide/web-workers), using syntax much closer to the actual Web Workers syntax. Switching over was fairly seamless. 63 | 64 | Finished! Not sure that Rsbuild will be my first choice bundler going forward, but it did save me from Create React App and unmaintained dependency hell. 
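One footnote on step 5: the replacement for `workerize-loader` is the webpack 5-style worker syntax, which Rspack also understands (the worker file name here is hypothetical):

```js
// Rspack sees `new Worker(new URL(...))` and splits the worker into its own bundle
const worker = new Worker(new URL("./render.worker.js", import.meta.url));

worker.postMessage({ frame: 1 });
worker.addEventListener("message", (event) => console.log(event.data));
```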
65 | -------------------------------------------------------------------------------- /rust/link-against-a-cpp-file.md: -------------------------------------------------------------------------------- 1 | # Link against a C++ file 2 | 3 | While Rust natively supports linking against C, it needs an extra binding layer in order to link against C++. Although a library called [bindgen](https://rust-lang.github.io/rust-bindgen/introduction.html) can generate those bindings automatically, I wanted to see how to do it myself. 4 | 5 | Rust has extensive documentation on writing a [foreign function interface](https://doc.rust-lang.org/nomicon/ffi.html); here's a minimal example of how to do it. 6 | 7 | Let's say we want to call this function in `lib/inc.cpp` from Rust: 8 | 9 | ```cpp 10 | #include "inc.h" 11 | 12 | int twice(int x) { return x * 2; } 13 | ``` 14 | 15 | In the header file `lib/inc.h`, `extern "C"` [makes the function name linkable from C code](https://stackoverflow.com/a/1041880): 16 | 17 | ```cpp 18 | extern "C" { 19 | int twice(int x); 20 | } 21 | ``` 22 | 23 | We can compile that function into a shared library `lib/libinc.so`: 24 | 25 | ```bash 26 | g++ -shared -fPIC lib/inc.cpp -o lib/libinc.so 27 | ``` 28 | 29 | (Note that in this case, we're defining a C-linkable interface in the C++ file itself rather than adding an extra layer between C++ and Rust. If we couldn't change the C++ source code, we could add another file in C or C++ that calls the C++ library, and then expose the C-linkable interface from _that_ file.) 30 | 31 | In Rust, we can define a list of foreign functions within an `extern` block, and then call into them from Rust code.
Rust treats all foreign functions as unsafe, so we need to call them from within an `unsafe` block: 32 | 33 | ```rs 34 | extern "C" { 35 | fn twice(x: i32) -> i32; 36 | } 37 | 38 | fn main() { 39 | unsafe { 40 | println!("{}", twice(2)); 41 | } 42 | } 43 | ``` 44 | 45 | Trying to build or run this will result in a linker error about undefined symbols. We need to tell Rust how to find the library, which we can do by placing a [`build.rs` file](https://doc.rust-lang.org/cargo/reference/build-scripts.html#rustc-link-lib) in the package root. TL;DR from the docs: 46 | 47 | > Build scripts communicate with Cargo by printing to stdout. Cargo will interpret each line that starts with cargo: as an instruction that will influence compilation of the package. All other lines are ignored. 48 | 49 | Here's our `build.rs`: 50 | 51 | ```rs 52 | fn main() { 53 | println!("cargo:rustc-link-search=lib"); 54 | println!("cargo:rustc-link-lib=inc"); 55 | } 56 | ``` 57 | 58 | - `cargo:rustc-link-search=lib` tells Cargo to look in the directory `lib` for C libraries 59 | - `cargo:rustc-link-lib=inc` tells Cargo to link against `libinc` (the `lib` prefix is implied) 60 | 61 | Boom! Running `cargo run` should now correctly print `4` to the console. 62 | -------------------------------------------------------------------------------- /svelte/bail-out-of-a-reactive-block.md: -------------------------------------------------------------------------------- 1 | # Bail out of a reactive block 2 | 3 | In Svelte, you can [prefix a block with `$:` to mark it as reactive](https://svelte.dev/docs/svelte-components#script-3-$-marks-a-statement-as-reactive), which means it will re-run whenever any outside variables referenced within the block change. It performs the same function as [`useEffect` in React](https://react.dev/reference/react/useEffect) or [`createEffect` in Solid](https://www.solidjs.com/tutorial/introduction_effects).
4 | 5 | The difference is that with React and Solid, the "effect" is performed by a function, which means it's possible to "bail out" by returning early. Here's a contrived React example: 6 | 7 | ```jsx 8 | function ExampleComponent({ shouldRun }) { 9 | useEffect(() => { 10 | if (!shouldRun) return; 11 | 12 | // effect code goes here 13 | }, [shouldRun]); 14 | 15 | // ... 16 | } 17 | ``` 18 | 19 | With Svelte, however, reactive blocks execute at the top level of the module, [where `return` statements aren't allowed](http://es5.github.io/#x12.9): 20 | 21 | > An ECMAScript program is considered syntactically incorrect if it contains a `return` statement that is not within a `FunctionBody`. 22 | 23 | It doesn't seem to be documented anywhere, but Svelte lets you add an `if` statement before the reactive block in order to execute it conditionally. Variables referenced within the condition will also be tracked reactively, even if they're not referenced within the block itself. 24 | 25 | ```html 26 | <script> 27 | export let shouldRun; 28 | 29 | $: if (shouldRun) { 30 | // effect code goes here 31 | } 32 | </script> 33 | ``` 34 | 35 | If you have multiple conditions, or need to run some logic within the reactive block before bailing out, you can take advantage of the fact that `$:` uses [JS label syntax](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/label), and use the `break` statement to bail out of the block: 36 | 37 | ```html 38 | <script> 39 | export let a; 40 | export let b; 41 | 42 | $: { 43 | if (!a) break $; 44 | 45 | const sum = a + b; 46 | if (sum < 0) break $; 47 | 48 | // effect code goes here 49 | } 50 | </script> 51 | ``` 52 | -------------------------------------------------------------------------------- /svelte/force-reactive-state-to-reevaluate.md: -------------------------------------------------------------------------------- 1 | # Force reactive state to reevaluate 2 | 3 | In Svelte 5, you declare a reactive variable using the [`$state` rune](https://svelte.dev/blog/runes).
4 | 5 | ```js 6 | let something = $state(1); 7 | ``` 8 | 9 | Svelte will track anything that depends on that variable — such as a derived variable, or an effect — and update it whenever the value changes: 10 | 11 | ```js 12 | let something = $state(1); 13 | let doubled = $derived(something * 2); 14 | 15 | $effect(() => console.log(something + doubled)); 16 | 17 | something = 2; 18 | ``` 19 | 20 | In this case, setting `something` to 2 will update `doubled` to 4 and run the effect, logging 6. 21 | 22 | Previous versions of Svelte would use self-assignments (e.g. `something = something`) to update dependents. In Svelte 5, that no longer works — which presents a problem if you're trying to use a class that mutates itself, such as a [Yjs](https://yjs.dev) document: 23 | 24 | ```js 25 | import * as Y from "yjs"; 26 | 27 | let doc = $state(new Y.Doc()); 28 | doc.on("change", () => { 29 | doc = doc; 30 | }); 31 | 32 | let array = $derived(doc.getArray("my-array")); 33 | $effect(() => console.log(array)); // will never run 34 | 35 | array.insert(0, ["val"]); 36 | ``` 37 | 38 | However, it **does** work if you first assign `undefined` to the state: 39 | 40 | ```js 41 | import * as Y from "yjs"; 42 | 43 | let doc = $state(new Y.Doc()); 44 | doc.on("change", () => { 45 | const tmp = doc; 46 | doc = undefined; 47 | doc = tmp; 48 | }); 49 | 50 | let array = $derived(doc.getArray("my-array")); 51 | $effect(() => console.log(array)); // runs whenever the array is changed!
52 | 53 | array.insert(0, ["val"]); 54 | ``` 55 | -------------------------------------------------------------------------------- /svg/create-an-svg-sprite-sheet.md: -------------------------------------------------------------------------------- 1 | # Create an SVG sprite sheet 2 | 3 | It's [bad practice to embed SVGs within your JavaScript bundle](https://web.archive.org/web/20220615153016/https://twitter.com/_developit/status/1382838799420514317), since it increases the bundle size and therefore download and parsing time. 4 | 5 | Jacob Groß has [a good post on all the alternatives and their tradeoffs](https://kurtextrem.de/posts/svg-in-js), but the TL;DR is that the best way in most cases is to create SVG sprite sheets and include them in the HTML. 6 | 7 | SVG sprite sheets look something like the following: 8 | 9 | ```html 10 | <svg xmlns="http://www.w3.org/2000/svg" style="display: none"> 11 | <symbol id="icon-one" viewBox="0 0 24 24"> 12 | <!-- paths for the first icon --> 13 | </symbol> 14 | <symbol id="icon-two" viewBox="0 0 24 24"> 15 | <!-- paths for the second icon --> 16 | </symbol> 17 | <symbol id="icon-three" viewBox="0 0 24 24"> 18 | <!-- paths for the third icon --> 19 | </symbol> 20 | </svg> 21 | ``` 22 | 23 | The contents of each individual SVG go inside a `<symbol>` tag identified by a unique ID. That `<symbol>` tag is also where attributes on the individual SVG's `<svg>` tag (like `viewBox`, etc.) should go. Those all go within an outer `<svg>` tag. 24 | 25 | To use a particular symbol from the sprite sheet, there's the `<use>` tag. It's used like this: 26 | 27 | ```html 28 | <svg> 29 | <use href="#icon-one" /> 30 | </svg> 31 | ``` 32 | 33 | To actually maintain the sprite sheet, there's a tool called (appropriately enough) [`svg-sprite`](https://www.npmjs.com/package/svg-sprite).
A command to take a directory of SVGs and combine them into a single sprite sheet might look like this: 33 | 34 | ```bash 35 | svg-sprite --symbol \ 36 | --symbol-dest src \ 37 | --symbol-sprite icons.svg \ 38 | icons/*.svg 39 | ``` 40 | 41 | Here's what it does: 42 | 43 | - `--symbol` enables [symbol mode](https://github.com/svg-sprite/svg-sprite/blob/HEAD/docs/configuration.md#defs--symbol-mode) 44 | - `--symbol-dest src` sets the output directory to `src` 45 | - `--symbol-sprite icons.svg` sets the output filename to `icons.svg` 46 | - `icons/*.svg` uses all files within the directory `icons` ending in `.svg` as input 47 | 48 | The `id` on each `<symbol>` tag will be the name of the input file (minus the extension). 49 | -------------------------------------------------------------------------------- /systemd/set-up-a-basic-service.md: -------------------------------------------------------------------------------- 1 | # Set up a basic service 2 | 3 | I use [systemd](https://systemd.io) pretty infrequently, so whenever I need to stand up some persistent process on a server I spend a bunch of time looking things up again. Hence: this TIL! 4 | 5 | systemd does a lot of things, but the core thing we care about here is keeping some persistent process up and running. 6 | 7 | We tell systemd how to do that using a unit file, which is an [INI file](https://en.wikipedia.org/wiki/INI_file) with the extension `.service`. There are [a bunch of places in which systemd will look for unit files](https://unix.stackexchange.com/a/367237); on my Ubuntu VPS I chose `/lib/systemd/system`. 8 | 9 | Unit files are sets of key/value pairs broken up into sections. Most of the important stuff is in the `Service` section: 10 | 11 | - `EnvironmentFile` tells systemd to load the key/value pairs in the corresponding file as environment variables before starting the process. 12 | - `WorkingDirectory` tells systemd the directory from which the process will run.
13 | - `ExecStart` is the command that systemd will use to start the process. 14 | - `Restart` tells systemd when it should restart the process. Setting it to "always" will have it restart no matter what the exit code is. 15 | - `StandardOutput` and `StandardError` tell systemd where to send stdout and stderr. I used to set this to `syslog` but it recently started yelling at me that I should use `journal`, so I switched. 16 | 17 | Let's pretend we're making a service called `api`. We'd store the unit file at `/lib/systemd/system/api.service`, with corresponding file and directory names as necessary: 18 | 19 | ```ini 20 | [Unit] 21 | Description=api 22 | After=network.target 23 | 24 | [Service] 25 | EnvironmentFile=/etc/sysconfig/api 26 | WorkingDirectory=/var/www/api 27 | ExecStart=/usr/bin/node src/index.mjs 28 | Restart=always 29 | StandardOutput=journal 30 | StandardError=journal 31 | SyslogIdentifier=api 32 | 33 | [Install] 34 | WantedBy=multi-user.target 35 | ``` 36 | 37 | There are also a couple commands to remember: 38 | 39 | - `systemctl start api` starts the service if it's stopped. 40 | - `systemctl restart api` restarts the service. 41 | - `systemctl status api` checks the service's status. 42 | - `journalctl -u api` returns all the service's logs. A `-f` flag on the end will turn it into a live tail. 43 | -------------------------------------------------------------------------------- /tailwind/style-conditionally-based-on-data-and-aria-attributes.md: -------------------------------------------------------------------------------- 1 | # Style conditionally based on data and ARIA attributes 2 | 3 | The unstyled UI library [Radix](https://www.radix-ui.com) exposes its inner state using [`data-*` attributes](https://developer.mozilla.org/en-US/docs/Web/HTML/Global_attributes/data-*). 
When it comes to styling, it's better to make use of these attributes rather than rely on pseudo-classes like `:hover` (which isn't accessible by keyboard users) or conditionally applying classes based on props or state (which is error-prone). 4 | 5 | [Tailwind supports styling `data-*` attributes](https://tailwindcss.com/docs/hover-focus-and-other-states#data-attributes): prefix the class name with `data-[attribute]` (for styling based on the presence or absence of an attribute) or `data-[attribute=value]` (for styling based on a specific value). 6 | 7 | For example, here's how a [Radix context menu item](https://www.radix-ui.com/docs/primitives/components/context-menu#item) might be styled: 8 | 9 | ```jsx 10 | function ContextMenuItem({ text, disabled, onSelect }) { 11 | return ( 12 | <ContextMenu.Item 13 | disabled={disabled} 14 | onSelect={onSelect} 15 | className="data-[highlighted]:bg-blue-100 data-[disabled]:text-gray-400" 16 | > 17 | {text} 18 | </ContextMenu.Item> 19 | ); 20 | } 21 | ``` 22 | 23 | Radix conforms to accessibility standards using [ARIA attributes](https://developer.mozilla.org/en-US/docs/Web/Accessibility/ARIA). The ARIA attributes themselves aren't in the documentation, so it's probably not as safe to use them for styling, but for what it's worth [Tailwind also supports styling based on ARIA attributes](https://tailwindcss.com/docs/hover-focus-and-other-states#aria-states), providing semantic sugar for common ARIA attributes like `aria-expanded` and `aria-disabled` and supporting less common attributes via the "arbitrary variants" feature.
24 | 25 | For example, styling a [Radix dropdown menu trigger](https://www.radix-ui.com/docs/primitives/components/dropdown-menu#trigger) based on its `aria-expanded` attribute might look like this: 26 | 27 | ```jsx 28 | function DropdownMenuTrigger({ text }) { 29 | return ( 30 | <DropdownMenu.Trigger className="aria-expanded:bg-gray-200"> 31 | {text} 32 | </DropdownMenu.Trigger> 33 | ); 34 | } 35 | ``` 36 | -------------------------------------------------------------------------------- /tailwind/style-shadow-trees-from-the-light-dom.md: -------------------------------------------------------------------------------- 1 | # Style shadow trees from the light DOM 2 | 3 | I've been really enjoying the web component library [Shoelace](https://shoelace.style) as a replacement for framework-specific UI libraries like [Radix](https://www.radix-ui.com). 4 | 5 | Shoelace uses the shadow DOM to encapsulate its markup from the light DOM (aka the rest of the page), which means that class selectors can't reach it. This presents a problem for CSS utility frameworks like Tailwind, which use classes for everything. 6 | 7 | Web components can use the [`part` attribute](https://developer.mozilla.org/en-US/docs/Web/HTML/Global_attributes/part) to let outside stylesheets select elements in shadow DOM.
For example, here's the shadow tree of [Shoelace's Switch component](https://shoelace.style/components/switch) (simplified): 8 | 9 | ```html 10 | <label part="base" class="switch"> 11 | <input class="switch__input" type="checkbox" role="switch" /> 12 | <span part="control" class="switch__control"> 13 | <span part="thumb" class="switch__thumb"></span> 14 | </span> 15 | <slot part="label" class="switch__label"></slot> 16 | </label> 17 | ``` 18 | 19 | CSS in the light DOM can't select these elements using classes like `switch__control` — it needs to select these elements using the [`::part()` pseudo-element](https://developer.mozilla.org/en-US/docs/Web/CSS/::part): 20 | 21 | ```css 22 | sl-switch::part(control) { 23 | background-color: red; 24 | } 25 | 26 | sl-switch::part(thumb) { 27 | background-color: white; 28 | } 29 | ``` 30 | 31 | Although Tailwind can't target these parts out of the box, you can write a plugin to do it using [dynamic variants](https://tailwindcss.com/docs/plugins#dynamic-variants) (Tailwind classes that accept arbitrary values): 32 | 33 | ```js 34 | const plugin = require("tailwindcss/plugin"); 35 | 36 | module.exports = { 37 | plugins: [ 38 | plugin(function ({ matchVariant }) { 39 | matchVariant("part", value => `&::part(${value})`); 40 | }) 41 | ] 42 | }; 43 | ``` 44 | 45 | This adds a `part` modifier that you can use like this: 46 | 47 | ```html 48 | <sl-switch class="part-[control]:bg-red-500 part-[thumb]:bg-white"> 49 | label 50 | </sl-switch> 51 | ``` 52 | -------------------------------------------------------------------------------- /threejs/display-an-objects-bounding-box.md: -------------------------------------------------------------------------------- 1 | # Display an object's bounding box 2 | 3 | Three.js includes a handy [`BoxHelper`](https://threejs.org/docs/#api/en/helpers/BoxHelper) to display an object's bounding box. The constructor takes an object and a color. Here's the code example from the Three.js docs: 4 | 5 | ```js 6 | const box = new THREE.BoxHelper(object, 0xffff00); 7 | scene.add(box); 8 | ``` 9 | 10 | To use with [React Three Fiber](https://docs.pmnd.rs/react-three-fiber/), the [drei package](https://drei.pmnd.rs) has a [`useHelper` hook](https://github.com/pmndrs/drei#usehelper) to make it easy to use helpers like the `BoxHelper`.
It automatically sets up the helper when the component mounts, and disposes of it when the component unmounts. Here's the code sample from the drei docs: 11 | 12 | ```jsx 13 | const mesh = useRef() 14 | useHelper(mesh, BoxHelper, 'cyan') 15 | useHelper(condition && mesh, BoxHelper, 'red') // you can pass false instead of the object ref to hide the helper 16 | 17 | <mesh ref={mesh} /> 18 | ``` 19 | -------------------------------------------------------------------------------- /threejs/pick-objects-with-the-mouse-cursor.md: -------------------------------------------------------------------------------- 1 | # Pick objects with the mouse cursor 2 | 3 | To make a 3D scene interactive, it's useful to know which objects the mouse cursor is over. 4 | 5 | This code snippet is mostly copied directly from the [three.js documentation for the Raycaster class](https://threejs.org/docs/index.html#api/en/core/Raycaster). It works with both [perspective cameras](https://threejs.org/docs/#api/en/cameras/PerspectiveCamera) and [orthographic cameras](https://threejs.org/docs/#api/en/cameras/OrthographicCamera).
6 | 7 | ```ts 8 | const raycaster = new THREE.Raycaster(); 9 | 10 | renderer.domElement.addEventListener("pointerdown", (e) => { 11 | const pointer = new THREE.Vector2(); 12 | 13 | // convert the mouse coordinates into clip space — x axis from -1 (left) to 1 (right) and the y axis from 1 (top) to -1 (bottom) 14 | pointer.x = (e.clientX / renderer.domElement.clientWidth) * 2 - 1; 15 | pointer.y = -(e.clientY / renderer.domElement.clientHeight) * 2 + 1; 16 | 17 | // project the clip space coordinates from the camera 18 | raycaster.setFromCamera(pointer, camera); 19 | const intersects = raycaster.intersectObjects(scene.children); 20 | for (const child of intersects) { 21 | // do something with the intersected child 22 | } 23 | }); 24 | ``` 25 | -------------------------------------------------------------------------------- /threejs/set-default-camera-in-react-three-fiber.md: -------------------------------------------------------------------------------- 1 | # Set default camera in React Three Fiber 2 | 3 | [React Three Fiber](https://docs.pmnd.rs/react-three-fiber) supports all Three.js camera types, but it does some hidden setup such that simply instantiating a camera isn't enough to render using it. 4 | 5 | Unfortunately, there doesn't seem to be a way to do this declaratively. React Three Fiber provides a hook [`useThree`](https://docs.pmnd.rs/react-three-fiber/api/hooks#usethree) for querying and manipulating the internal state, such as the current camera. 6 | 7 | The solution is to use React's [`useLayoutEffect`](https://reactjs.org/docs/hooks-reference.html#uselayouteffect) hook to set the active React Three Fiber camera when the camera component mounts, keeping a reference to the previous camera in the closure so it can be restored when it unmounts. 
8 | 9 | ```tsx 10 | import * as THREE from "three"; 11 | import { useLayoutEffect, useRef } from "react"; 12 | import { useThree } from "@react-three/fiber"; 13 | 14 | function OrthographicCamera() { 15 | const camera = useRef<THREE.OrthographicCamera>(null); 16 | 17 | const set = useThree((three) => three.set); 18 | const prevCamera = useThree((three) => three.camera); 19 | 20 | useLayoutEffect(() => { 21 | // if there's no current camera ref, exit early 22 | const current = camera.current; 23 | if (!current) return; 24 | 25 | // store the previous camera to restore it when the effect cleans up 26 | const prev = prevCamera; 27 | 28 | // set the react three fiber camera to the current camera ref 29 | set(() => ({ camera: current })); 30 | 31 | // restore the previous camera when the effect cleans up 32 | return () => set(() => ({ camera: prev })); 33 | 34 | // don't include `prevCamera` in the dependency array so the effect keeps a reference to the default 35 | }, [camera, set]); 36 | 37 | return <orthographicCamera ref={camera} />; 38 | } 39 | ``` 40 | 41 | The [drei](https://github.com/pmndrs/drei) library (a bunch of utilities for React Three Fiber) includes something like this in the [source code for their `OrthographicCamera` component](https://github.com/pmndrs/drei/blob/master/src/core/OrthographicCamera.tsx). 42 | -------------------------------------------------------------------------------- /typescript/add-custom-element-to-jsx-intrinsic-elements.md: -------------------------------------------------------------------------------- 1 | # Add custom element to `JSX.IntrinsicElements` 2 | 3 | As of version 19, [React supports custom elements](https://custom-elements-everywhere.com/libraries/react/results/results.html)! 4 | 5 | That's great, but I still ran into this error trying to use a custom element in a React component: 6 | 7 | ``` 8 | ts: Property 'my-element' does not exist on type 'JSX.IntrinsicElements'. 9 | ``` 10 | 11 | Okay, so that's a TypeScript issue. But the `JSX.IntrinsicElements` type definition is inside the `@types/react` module, so who's the real culprit?
12 | 13 | Anyway, I found [this blog post](https://medium.com/@joelmalone/get-jsx-to-recognise-your-custom-element-in-react-or-preact-bf08d7522208) that got me most of the way there: 14 | 15 | ```ts 16 | import type { HTMLAttributes } from "react"; 17 | 18 | declare module "react/jsx-runtime" { 19 | namespace JSX { 20 | interface IntrinsicElements { 21 | "my-element": HTMLAttributes<MyElement>; 22 | } 23 | } 24 | } 25 | ``` 26 | 27 | That makes TypeScript recognize the `my-element` tag name. But it's not a complete solution: if you try to pass a `ref` prop, TypeScript will complain that `property 'ref' does not exist`. 28 | 29 | The solution is to use React's `DetailedHTMLProps` type: 30 | 31 | ```ts 32 | import type { DetailedHTMLProps, HTMLAttributes } from "react"; 33 | 34 | declare module "react/jsx-runtime" { 35 | namespace JSX { 36 | interface IntrinsicElements { 37 | "my-element": DetailedHTMLProps<HTMLAttributes<MyElement>, MyElement>; 38 | } 39 | } 40 | } 41 | ``` 42 | 43 | That adds support for React's props like `key` and `ref`. There's still a missing piece, though: if your custom element supports any custom attributes, TypeScript will tell you _those_ attributes don't exist. 44 | 45 | The full solution: make an interface with your component's attributes, and intersect it with your custom element class type before passing it to `HTMLAttributes`: 46 | 47 | ```ts 48 | import type { DetailedHTMLProps, HTMLAttributes } from "react"; 49 | 50 | interface MyElementAttributes { 51 | attr: string; 52 | } 53 | 54 | declare module "react/jsx-runtime" { 55 | namespace JSX { 56 | interface IntrinsicElements { 57 | "my-element": DetailedHTMLProps<HTMLAttributes<MyElement & MyElementAttributes>, MyElement>; 58 | } 59 | } 60 | } 61 | ``` 62 | 63 | I haven't tested this with Preact or Solid — much less other frameworks that use JSX, like Hono. But in theory it should work if you change `react/jsx-runtime` to `preact/jsx-runtime` (or whatever the corresponding module name is) and swap out React's types for the other framework's.
64 | 65 | Addendum: after publishing, Hawk Ticehurst [pointed out to me](https://bsky.app/profile/hawkticehurst.com/post/3lfeish53dc25) that Justin Fagnani [figured out how to get good typings for the custom elements without resorting to the React types](https://github.com/shoelace-style/shoelace/discussions/770#discussioncomment-2852125). 66 | -------------------------------------------------------------------------------- /typescript/assert-that-a-variable-is-not-null-or-undefined.md: -------------------------------------------------------------------------------- 1 | # Assert that a variable is not `null` or `undefined` 2 | 3 | TypeScript uses `!` as a [non-null assertion operator](https://www.typescriptlang.org/docs/handbook/release-notes/typescript-2-0.html#non-null-assertion-operator): 4 | 5 | ```ts 6 | const foo = document.querySelector("#foo"); // Element | null; 7 | const bar = document.querySelector("#bar")!; // Element; 8 | ``` 9 | 10 | I avoid it because it's the same as JavaScript's unary not operator, which makes it difficult to grep. Instead, I generally prefer to assert types with `as TypeName`. 11 | 12 | ```ts 13 | const foo = document.querySelector("#foo") as Element; 14 | ``` 15 | 16 | That has a couple drawbacks. Other than being more verbose, it's possible to cast the type to something incorrect: 17 | 18 | ```ts 19 | const foo = document.querySelector("svg") as HTMLElement; 20 | ``` 21 | 22 | But! You can chain the `!` non-null assertion operator as many times as you want: 23 | 24 | ```ts 25 | const foo = document.querySelector("#foo")!!!; // Element 26 | ``` 27 | 28 | I prefer that to `as` assertions. It's greppable (since using three consecutive unary nots is unlikely), it's less verbose, there's no chance of accidentally changing the type, and a bunch of repeated exclamation marks definitely calls out that something unusual is happening with that code.
29 | -------------------------------------------------------------------------------- /typescript/tsconfig-flags-to-prevent-common-errors.md: -------------------------------------------------------------------------------- 1 | # tsconfig flags to prevent common errors 2 | 3 | Even though it’s ostensibly just a compiler, TypeScript also includes a bunch of flags that help prevent common code errors. I end up reusing these in most TypeScript projects I work on, so I figured I’d just document them here for future reference. 4 | 5 | ```json 6 | { 7 | "strict": true, 8 | "noFallthroughCasesInSwitch": true, 9 | "noImplicitOverride": true, 10 | "noImplicitReturns": true, 11 | "noUncheckedIndexedAccess": true, 12 | "noUnusedLocals": true, 13 | "noUnusedParameters": true, 14 | "forceConsistentCasingInFileNames": true, 15 | "verbatimModuleSyntax": true 16 | } 17 | ``` 18 | 19 | Here’s a short explanation of what each flag does: 20 | 21 | - [`strict`](https://www.typescriptlang.org/tsconfig#strict) enables a bunch of other flags; enabling this is really the bare minimum. 22 | - `alwaysStrict` parses files in [strict mode](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Strict_mode) and emits `"use strict"` in each file. 23 | - `noImplicitAny` raises an error if lack of type annotations would lead it to infer `any`. 24 | - `noImplicitThis` raises an error if a reference to `this` has an inferred `any` type. 25 | - `strictBindCallApply` correctly types the `bind`, `call` and `apply` methods on functions; otherwise, they accept and return `any`. 26 | - `strictFunctionTypes` treats function arguments as [contravariant rather than bivariant](https://web.archive.org/web/20220823104433/https://www.stephanboyer.com/post/132/what-are-covariance-and-contravariance). 27 | - `strictNullChecks` treats `undefined` and `null` as their own distinct types, forbidding e.g. accessing a property of a type that might be undefined.
28 | - `strictPropertyInitialization` raises an error when a class property that isn’t optional is not set by default or in the constructor. 29 | - `useUnknownInCatchVariables` uses `unknown` rather than `any` as the type of the exception in a catch clause. 30 | - [`noFallthroughCasesInSwitch`](https://www.typescriptlang.org/tsconfig#noFallthroughCasesInSwitch) forces non-empty cases in switch statements to end with a `break` or `return`. 31 | - [`noImplicitOverride`](https://www.typescriptlang.org/tsconfig#noImplicitOverride) raises an error if a subclass overrides a method in a superclass without being annotated with `override`. 32 | - [`noImplicitReturns`](https://www.typescriptlang.org/tsconfig#noImplicitReturns) raises an error if only some code paths in a function return a value. 33 | - [`noUncheckedIndexedAccess`](https://www.typescriptlang.org/tsconfig#noUncheckedIndexedAccess) uses a union with `undefined` when accessing values of objects with unknown keys, such as objects that act as indexes or arrays of unknown length. This can be somewhat annoying, since you now need to check for `undefined` every time, but it’s safer. 34 | - [`noUnusedLocals`](https://www.typescriptlang.org/tsconfig#noUnusedLocals) raises an error when a local variable is declared but never used; the error can be suppressed by prefixing the variable name with an underscore. 35 | - [`noUnusedParameters`](https://www.typescriptlang.org/tsconfig#noUnusedParameters) raises an error when a function parameter is declared but never used; the error can be suppressed by prefixing the variable name with an underscore. 36 | - [`forceConsistentCasingInFileNames`](https://www.typescriptlang.org/tsconfig#forceConsistentCasingInFileNames) raises an error if the casing of an import specifier (`"foo.js"`) differs from the file on disk (`Foo.js`).
37 | - [`verbatimModuleSyntax`](https://www.typescriptlang.org/tsconfig#verbatimModuleSyntax) ensures that type-only imports are explicitly annotated as such (since they’re erased at runtime, they won’t behave the same as value imports if the module causes side effects). Replaces [`importsNotUsedAsValues`](https://www.typescriptlang.org/tsconfig#importsNotUsedAsValues). 38 | -------------------------------------------------------------------------------- /typescript/type-concrete-subclasses-of-an-abstract-class.md: -------------------------------------------------------------------------------- 1 | # Type concrete subclasses of an abstract class 2 | 3 | Let's say there's an inheritance hierarchy consisting of an abstract base class and two concrete subclasses: 4 | 5 | ```ts 6 | abstract class Base { 7 | constructor(exampleParam: number) {} 8 | abstract foo(): void; 9 | } 10 | 11 | class A extends Base { 12 | foo() { 13 | console.log("A"); 14 | } 15 | } 16 | 17 | class B extends Base { 18 | foo() { 19 | console.log("B"); 20 | } 21 | } 22 | ``` 23 | 24 | In an inheritance hierarchy, it's common to reference a group of subclasses by their base class (the [Liskov substitution principle](https://en.wikipedia.org/wiki/Liskov_substitution_principle)): 25 | 26 | ```ts 27 | function doSomething(obj: Base) { 28 | obj.foo(); 29 | } 30 | ``` 31 | 32 | One situation in which this gets thorny for abstract classes is where a subclass that won't be known until runtime must be instantiated. In this case, referring to the base class directly will result in errors about how TypeScript cannot create an instance of an abstract class: 33 | 34 | ```ts 35 | function instantiate(Class: typeof Base) { 36 | return new Class(10); // Cannot create an instance of an abstract class. 37 | } 38 | ``` 39 | 40 | Instead of using the abstract base class's type directly, create an object type in which a constructor (a `new` method) returns an instance of the abstract class.
If the constructor takes any parameters, they can be extracted using the [`ConstructorParameters`](https://www.typescriptlang.org/docs/handbook/utility-types.html#constructorparameterstype) utility type: 41 | 42 | ```ts 43 | function instantiate(Class: { new (...params: ConstructorParameters<typeof Base>): Base }) { 44 | return new Class(10); 45 | } 46 | ``` 47 | -------------------------------------------------------------------------------- /typescript/types-and-variables-can-share-names.md: -------------------------------------------------------------------------------- 1 | # Types and variables can share names 2 | 3 | The [Superstruct documentation on type inference](https://docs.superstructjs.org/guides/06-using-typescript#inferring-types) includes this neat trick: 4 | 5 | ```ts 6 | import { type Infer, number, object, string } from "superstruct"; 7 | 8 | const User = object({ 9 | id: number(), 10 | email: email(), // `email()` is a custom struct defined in the Superstruct docs 11 | name: string(), 12 | }); 13 | 14 | type User = Infer<typeof User>; 15 | ``` 16 | 17 | In the context of Superstruct, this creates both a variable `User` that checks whether an object has a particular shape and a type `User` of that shape. 18 | 19 | The cool thing here is that the type and the variable both have the same name. So you can import them both at the same time: 20 | 21 | ```ts 22 | import { assert } from "superstruct"; 23 | 24 | // import both the variable and the type 25 | import { User } from "./user"; 26 | 27 | // this line uses the type 28 | async function getUser(): Promise<User> { 29 | const res = await fetch("https://example.com/api/fake-user-endpoint"); 30 | const json = await res.json(); 31 | 32 | // this line uses the variable 33 | assert(json, User); 34 | 35 | return json; 36 | } 37 | ``` 38 | 39 | Note that this doesn't work for all variables.
Classes, for example, are already available as types, so if you write something like this TypeScript will tell you that `Test` is a duplicate identifier: 40 | 41 | ```ts 42 | class Test {} 43 | 44 | type Test = string; 45 | ``` 46 | -------------------------------------------------------------------------------- /valibot/parse-a-date-or-string-into-a-date.md: -------------------------------------------------------------------------------- 1 | # Parse a `Date` or `string` into a `Date` 2 | 3 | One common pattern with TypeScript projects is using a validation library like [Valibot](https://valibot.dev/) to validate incoming data at points of ingress to your application. (Intentionally or unintentionally, this is the "parse, don't validate" pattern, coined by Alexis King in her [excellent blog post of the same name](https://lexi-lambda.github.io/blog/2019/11/05/parse-don-t-validate/).) 4 | 5 | The pattern is a little trickier when writing code that spans network boundaries. 6 | 7 | For example, in SvelteKit, it's common to load entities from a database (on a server) and then load those same entities from API endpoints (on a client). The [node-postgres](https://node-postgres.com/) database driver will turn timestamp columns into JavaScript `Date` objects, but those objects will get converted to JSON before being sent over the network. 8 | 9 | So the issue is that there are two separate forms that the same entity can take. One way to solve this is by using a union validator: 10 | 11 | ```ts 12 | import { date as vdate, string, union } from "valibot"; 13 | 14 | const date = () => union([string(), vdate()]); 15 | ``` 16 | 17 | That's inconvenient, though, because then the resulting type is `string | Date`. And it's not just a type system thing, either: it will _actually_ be a `string` or `Date`, and doing anything with it will require either type assertions or branching. The fix is to add a `transform` step that normalizes both representations into a `Date`:
18 | 19 | ```ts 20 | import { date as vdate, pipe, string, transform, union } from "valibot"; 21 | 22 | const date = () => 23 | pipe( 24 | union([string(), vdate()]), 25 | transform(input => typeof input === "string" ? new Date(input) : input) 26 | ); 27 | ``` 28 | 29 | Basically: 30 | 31 | 1. Ensure the input type is either a `string` or a `Date`. 32 | 2. If a `string`, convert it to a `Date`. 33 | 34 | Boom: two different "input" representations, parsed into one consistent "output" type. -------------------------------------------------------------------------------- /volta/install-volta-with-homebrew.md: -------------------------------------------------------------------------------- 1 | # Install Volta with Homebrew 2 | 3 | It seems so easy: just `brew install volta` and you have a modern Node version manager ready to go! 4 | 5 | Unfortunately, when I tried to install Node, I ran into this error: 6 | 7 | ``` 8 | note: cannot find command node. Please ensure that /Users/jake/.volta/bin is available on your PATH. 9 | ``` 10 | 11 | I suppose I could have just added that directory to my `PATH` and been done with it, but it seemed odd that the installation wouldn't have automatically set that up? 12 | I googled around and found [this GitHub issue](https://github.com/volta-cli/volta/issues/927) about Homebrew requiring you to run [`volta setup`](https://docs.volta.sh/reference/setup) manually. 13 | 14 | Turns out `volta setup` is a command that automatically modifies your shell config file to set the appropriate environment variables! 
15 | 16 | So, installing Volta with Homebrew actually requires two commands: 17 | 18 | ```sh 19 | brew install volta 20 | volta setup 21 | ``` 22 | -------------------------------------------------------------------------------- /windows/create-a-file-without-an-extension.md: -------------------------------------------------------------------------------- 1 | # Create a file without an extension 2 | 3 | I was trying to save a file without an extension from Notepad. For some reason, even if I removed the extension from the filename and selected "All files", it still saved as a `.txt`. 4 | 5 | The trick turned out to be putting the file name in quotes. So for example, rather than saving the file as `filename` (which results in a file named `filename.txt`) I needed to save it as `"filename"` (which results in `filename` — removing the quotes but not adding the extension). 6 | --------------------------------------------------------------------------------