├── LICENSE ├── README.md ├── bulk-git-ops.sh ├── clj-projects.sh ├── log-analysis.sh ├── logging.sh ├── machine-setup.sh ├── patterns-heredocs-strings.sh ├── scripting-utils.sh └── usage-trap.sh /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2019 Aditya Athalye 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 22 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | [Purpose](#purpose) 2 | [Caveat emptor](#caveat-emptor) 3 | [Contributing](#contributing) 4 | [Credits](#credits) 5 | [HOPETODOs](#hopetodos) 6 | 7 | # Purpose 8 | 9 | Maybe this will become my ever-growing, ever-improving, Swiss Army 10 | Toolkit of functions-as-cmd-line-tools and useful-to-me patterns. 11 | 12 | It is unlikely to be useful as-is to you. 
However, you might find 13 | some of the ideas or techniques interesting (almost none of them 14 | are original). 15 | 16 | In any case, this stuff has emerged from my personal motivations to: 17 | 18 | - reinvent the wheel for fun and profit 19 | - avoid rewriting all of these all over the place, as I end up 20 | doing ever so often 21 | - get better at text processing. That stuff is everywhere! 22 | - benefit from stable, fast tools, with well-known warts, and 23 | Because The Masters Already Solved It Better Decades Ago 24 | - improve personal/team productivity 25 | - preserve tool-building autonomy 26 | - learn about good and successful design 27 | - learn about bad _and_ successful design 28 | - go down the winding rabbit pipes of computer history. 29 | 30 | See also: 31 | - [shite](https://github.com/adityaathalye/shite), my little static site generator that makes [my website](https://evalapply.org) (and it also hot-builds and hot-refreshes). 32 | - [oxo](https://github.com/adityaathalye/oxo), a game of Noughts and Crosses, written in Bash (and it also speaks in computer voice). 33 | 34 | # Usage 35 | 36 | I happen to use these tools on Ubuntu 18+ LTS, with Bash 4.4+. Maybe 37 | they will also work just fine on other Debian-like distros, with Bash 4+. 38 | 39 | Each "utility" file would have its own usage guide. 40 | 41 | # Caveat emptor 42 | 43 | These utilities are NOT designed to be portable across shells or older 44 | Bash versions or operating systems. Nor should they be presumed secure. 45 | If I haven't tried, they certainly aren't. And even if I did, they very 46 | well may not be. 47 | 48 | I'm still learning from my mistakes, after all. Ergo, by perusing these, 49 | you accept all the horrible ramifications of your life choice. 50 | 51 | # Contributing 52 | 53 | I'll be happy to discuss bugs, design ideas etc. via email or github's 54 | issues feature. 
If a discussion leads to a thing I'd like to add 55 | to this repo, I'll work with you to bring it in with due credit. 56 | 57 | However, I do not intend this for use by others, and will not heed 58 | unsolicited pull requests. 59 | 60 | # Credits 61 | 62 | Too many to name, but some important ones are: 63 | 64 | - Several friends and colleagues (especially those who failed to 65 | caution me about the perils of Shell scripting) 66 | - Classic Shell Scripting (Robbins and Beebe) 67 | - bash manpage authors, tool manpage authors, especially the ones 68 | who document EXAMPLES 69 | - mtxia.com, jpnc.info, catb.org, wiki.bash-hackers.org, tldp.org 70 | - Various StackOverflows, githubs, subreddits, HackerNewses, 71 | ${RANDOM_BLOGGERS}... erm... "The Internets"? 72 | 73 | # HOPETODOs 74 | 75 | - Read "The Unix Programming Environment". 76 | - Peruse [google's shell style guide](https://google.github.io/styleguide/shell.xml). They seem to know a bit or two. 77 | - Understand plan9. Migrate to plan9. Upgrade to plan10. 78 | Wait, no, that's not macOS. 79 | - Many, many other things. 80 | 81 | # Copyright and License 82 | 83 | Copyright © 2018-2019 [Aditya Athalye](https://adityaathalye.com). 84 | 85 | Distributed under the [MIT license](https://github.com/inclojure-org/clojure-by-example/blob/master/LICENSE). 86 | -------------------------------------------------------------------------------- /bulk-git-ops.sh: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env bash 2 | 3 | source ./logging.sh 4 | 5 | logdir="$HOME/.bulk_git_ops" 6 | 7 | # SUMMARY 8 | # 9 | # If remote has new changes, git fetch origin would download objects 10 | # and should cause mtime of .git to change. We can use this fact 11 | # to filter active repos for update. 12 | # 13 | # Examples assume that repos you contribute to are spread across 14 | # remote hosts, and (hopefully) namespaced sanely. I organise my 15 | # sources as follows.
16 | # 17 | # ~/src/{github,gitlab,bitbucket}/{usernames..}/{reponames..} 18 | 19 | # 20 | # USAGE 21 | # 22 | # Source the file, and use the functions as command line utilities as follows. 23 | # 24 | # source ~/src/path/to/this/script/bulk-git-ops.sh 25 | # 26 | # QUERY: Count repos that are stale: 27 | # 28 | # ls_git_projects ~/src/ | take_stale | count_repos_by_remote 29 | # 30 | # QUERY: Count repos that are active (within 12 hours by default): 31 | # 32 | # ls_git_projects ~/src/ | take_active | count_repos_by_remote 33 | # 34 | # 35 | # EXECUTE! Use 'xgit' to apply simple git commands to the given repos, with logs to STDERR/stty 36 | # 37 | # ls_git_projects ~/src/bitbucket | xgit fetch # bitbucket-hosted repos 38 | # 39 | # EXECUTE! Use 'proc_repos' to apply custom functions to, with logs to STDERR/stty 40 | # 41 | # ls_git_projects ~/src/bitbucket | proc_repos git_fetch # all repos 42 | # ls_git_projects ~/src/bitbucket | take_stale | proc_repos git_fetch # only stale repos 43 | # ls_git_projects ~/src/bitbucket | take_active | proc_repos git_fetch # only active repos 44 | # 45 | # EXECUTE! What's the current branch? Logs to STDERR/stty 46 | # 47 | # ls_git_projects ~/src/bitbucket | proc_repos git_branch_current # all repos 48 | # 49 | # EXECUTE! With logs redirected to hidden dir for logging (you must create it by hand first) 50 | # 51 | # mkdir -p "${logdir}" 52 | # ls_git_projects ~/src/bitbucket | proc_repos git_branch_current 2>> "${logdir}/bulkops.log" 53 | # tail "${logdir}/bulkops.log" 54 | # 55 | 56 | ls_git_projects() { 57 | # Spit out list of root dirs of .git-controlled projects 58 | local basedir="${1:-$PWD}" 59 | 60 | if [[ -d ${basedir} ]] 61 | then find ${basedir} -type d -name '.git' | xargs dirname | xargs realpath 62 | else >&2 printf "%s\n" "ERROR ${basedir} is not a valid base directory." 
63 | return 1 64 | fi 65 | } 66 | 67 | identity() { 68 | printf "%s\n" "$@" 69 | } 70 | 71 | proc_repos() { 72 | # Apply any given operation on the given repos. Use in a pipeline. 73 | local func=${1:-identity} 74 | local repo_dir 75 | while read repo_dir 76 | do $func "${repo_dir}" 77 | log_info "Applied ${func} to ${repo_dir}" 78 | done 79 | } 80 | 81 | xgit() { 82 | # Apply a git operation on the given repos. Use in a pipeline. 83 | local git_cmd="${@}" 84 | local repo_dir 85 | while read repo_dir 86 | do git --git-dir="${repo_dir}/.git" $git_cmd 87 | log_info "Applied git ${git_cmd} to ${repo_dir}." 88 | done 89 | } 90 | 91 | # git_fetch_statuses() { 92 | # local basedir=${1:-"$PWD"} 93 | # find ${basedir} \ 94 | # -type d -name '.git' \ 95 | # -prune -print \ 96 | # -execdir git fetch -q \; 97 | # } 98 | 99 | __with_git_dir() { 100 | local repo_dir="${1}" 101 | shift 102 | git --git-dir="${repo_dir}/.git" "$@" 103 | } 104 | 105 | git_fetch() { 106 | local repo_dir="${1}" 107 | __with_git_dir "${repo_dir}" fetch -q 108 | } 109 | 110 | git_status() { 111 | __with_git_dir "${1}" status 112 | } 113 | 114 | git_branch_current() { 115 | local repo_dir="${1}" 116 | __with_git_dir "${repo_dir}" rev-parse --abbrev-ref=strict HEAD 117 | } 118 | 119 | __is_repo_active() { 120 | # A directory's mtime increments only if files/sub-dirs are 121 | # created and/or deleted. 122 | # 123 | # We can use this plus the fact that the .git directory's 124 | # contents would change IFF new objects are created in it, 125 | # or old ones garbage collected. 126 | # 127 | # Fetch operations are idempotent, and no-ops if local has not 128 | # diverged from remote. 
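The mtime arithmetic described in this comment block can be sketched as a standalone helper, for trying at a prompt. The function name here is hypothetical (the in-file predicate is `__is_repo_active`), and GNU `stat`/`date` are assumed, as they are throughout this file:

```shell
# Hypothetical standalone helper mirroring the staleness check below:
# whole hours since a repo's .git directory last changed.
repo_hours_since_git_update() {
    local repo_dir="${1:?repo dir required}"
    local now mtime
    now="$(date +%s)"                       # seconds since epoch, now
    mtime="$(stat -c %Y "${repo_dir}/.git")" # seconds since epoch, last .git change
    printf "%s\n" "$(( (now - mtime) / 3600 ))"
}
```

For example, `repo_hours_since_git_update ~/src/github/me/repo` prints an integer hour count, which the predicate below compares against the staleness threshold (12 hours by default).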
129 | # 130 | # Events we care about, to classify as "active": 131 | # - Fetch from remote updates .git 132 | # - Active development updates .git (stage/commit) 133 | # 134 | # Events that don't matter usually (infrequent), but are worth 135 | # interpreting as "active": 136 | # 137 | # - Objects collected / compacted automatically by git locally 138 | # 139 | local repo_dir="${1}" 140 | local stale_hrs="${2:-12}" 141 | local hrs_ago=$(( ($(date +%s) 142 | - $(stat -c %Y "${repo_dir}/.git")) 143 | / 3600 )) 144 | [[ $hrs_ago -le $stale_hrs ]] 145 | } 146 | 147 | take_active() { 148 | local repo_dir 149 | while read repo_dir 150 | do __is_repo_active "${repo_dir}" && printf "%s\n" "${repo_dir}" 151 | done 152 | } 153 | 154 | take_stale() { 155 | local repo_dir 156 | while read repo_dir 157 | do __is_repo_active "${repo_dir}" || printf "%s\n" "${repo_dir}" 158 | done 159 | } 160 | 161 | count_repos_by_remote() { 162 | grep -E -o "github|gitlab|bitbucket" | sort | uniq -c 163 | } 164 | -------------------------------------------------------------------------------- /clj-projects.sh: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env bash 2 | 3 | # clj-kondo setup 4 | 5 | clj_kondo_install() { 6 | # Adapted from: 7 | # https://raw.githubusercontent.com/borkdude/clj-kondo/master/script/install-clj-kondo 8 | case "$(uname -s)" in 9 | Linux*) local platform="linux";; 10 | Darwin*) local platform="macos";; 11 | esac 12 | local latest_release="$(curl -s https://raw.githubusercontent.com/borkdude/clj-kondo/master/resources/CLJ_KONDO_RELEASED_VERSION)" 13 | local dl_archive_name="clj-kondo-${latest_release}-${platform}-amd64" 14 | local dl_archive_url="https://github.com/borkdude/clj-kondo/releases/download/v${latest_release}/${dl_archive_name}.zip" 15 | local dl_to_path="${HOME}/Downloads/${dl_archive_name}.zip" 16 | 17 | if which clj-kondo > /dev/null 18 | then local executable_bin_path="$(which clj-kondo)" 19 | local 
executable_version="$(clj-kondo --version | sed -e 's/clj-kondo v//g')" 20 | else local executable_bin_path="${HOME}/bin/clj-kondo" 21 | fi 22 | 23 | # Install IFF local version is stale. We rely on this check working: 24 | # [[ "2020.05.04" > "2020.05.03" ]] && echo higher 25 | # [[ "2020.05.04" > "" ]] && echo higher 26 | if [[ ${latest_release} > ${executable_version} ]]; then 27 | if ! [[ -f "${dl_to_path}" ]] 28 | then printf "%s\n" "Downloading ${dl_archive_url} to ${dl_to_path}" 29 | curl -o "${dl_to_path}" -sL "${dl_archive_url}" 30 | fi 31 | 32 | mkdir -p "/tmp/${dl_archive_name}" 33 | unzip -o "${dl_to_path}" -d "/tmp/${dl_archive_name}" 34 | 35 | if [[ -f ${executable_bin_path} ]] 36 | then printf "%s\n" "Backing up ${executable_bin_path}" 37 | cp "${executable_bin_path}" "${executable_bin_path}.old" 38 | fi 39 | 40 | printf "%s\n" "Replacing ${executable_bin_path}" 41 | mv -f "/tmp/${dl_archive_name}/clj-kondo" "${executable_bin_path}" 42 | chmod 0700 "${executable_bin_path}" 43 | 44 | printf "%s\n" "Cleanup tmp and old files" 45 | rm -rv "/tmp/${dl_archive_name}" 46 | 47 | if [[ -f "${executable_bin_path}.old" ]] 48 | then printf "%s\n" "Old clj-kondo version was $(${executable_bin_path}.old --version)" 49 | rm -v "${executable_bin_path}.old" 50 | fi 51 | printf "%s\n" "New clj-kondo version is $(clj-kondo --version)" 52 | else 53 | printf "%s\n" "clj-kondo is already at the latest version: ${executable_version}" 54 | fi 55 | } 56 | 57 | clj_project_qmark() { 58 | # USAGE: 59 | # 60 | # ls_git_projects /path/to/repos/ | clj_project_qmark 61 | # 62 | 63 | while read _wd 64 | do 65 | if [[ -f "${_wd}/project.clj" || -f "${_wd}/deps.edn" || -f "${_wd}/build.boot" ]] 66 | then printf "%s\n" "${_wd}" 67 | fi 68 | done 69 | } 70 | 71 | clj_project_classpath() { 72 | local _wd="${1:-$(pwd)}" 73 | 74 | __err() { 75 | (1>&2 printf "%s\n" "Could not get classpath with ${1}") 76 | return 1 77 | } 78 | 79 | ( 80 | cd ${_wd} 81 | if which lein > /dev/null 82 | then 
lein classpath 2> /dev/null || __err "lein" 83 | elif which boot > /dev/null 84 | then boot with-cp -w -f - 2> /dev/null || __err "boot" 85 | elif which clojure > /dev/null 86 | then clojure -Spath 2> /dev/null || __err "clojure cli" 87 | else __err "any standard tool (lein, boot, clojure cli)." 88 | fi 89 | ) 90 | } 91 | 92 | clj_kondo_init() { 93 | # USAGE: 94 | # 95 | # ls_git_projects /path/to/repos/ | clj_project_qmark | clj_kondo_init 96 | # 97 | # Init the given dirs, assuming they are clj / cljs projects. 98 | # Expects stream of fully qualified paths as input. 99 | # ref: https://github.com/borkdude/clj-kondo/blob/master/README.md#project-setup 100 | set -o pipefail 101 | 102 | while read _wd 103 | do local _classpath="$(clj_project_classpath ${_wd})" 104 | if [[ -n "${_classpath}" ]] 105 | then (1>&2 printf "%s\n" "About to init clj-kondo for ${_wd}") 106 | mkdir -p "${_wd}/.clj-kondo" 107 | (cd ${_wd}; clj-kondo --lint "${_classpath}") | grep "linting took" 108 | fi 109 | done 110 | } 111 | -------------------------------------------------------------------------------- /log-analysis.sh: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env bash 2 | 3 | source "${HOME}/bin/scripting-utils.sh" 4 | source "${HOME}/bin/logging.sh" 5 | 6 | # #################### 7 | # DEPENDENCIES: 8 | # - Warn about missing dependencies, at the time of sourcing the file. 9 | # #################### 10 | 11 | ensure_min_bash_version "4.4" 12 | ensure_deps "gawk" "jq" 13 | 14 | 15 | # #################### 16 | # PURE FUNCTIONS: STRUCTURE-AGNOSTIC 17 | # - Assume a record per line regardless of whether CSV, JSON, or plain old text 18 | # - these must consume stdin and write to stdout/err, and/or 19 | # - never cause side-effects (explicit file I/O etc.) 20 | # #################### 21 | 22 | deduplicate() { 23 | # Ref. 
technique as seen here: https://stackoverflow.com/a/20639730 24 | # Or use awk '!x[$0]++', as seen here https://stackoverflow.com/a/11532197 25 | 26 | # print stdin stream with line-number prefixed 27 | cat -n | 28 | # sort uniquely by second key (the incoming lines) 29 | sort -uk2 | 30 | # sort numeric by first key (line number) 31 | sort -nk1 | 32 | # select second field to the end 33 | cut -f2- 34 | } 35 | 36 | frequencies() { 37 | # Given appropriately-sorted finite set of records via STDIN, 38 | # produce a frequency distribution of the records. 39 | # (Ref. Classic Shell Scripting) 40 | 41 | sort | uniq -c | sort -bnr 42 | } 43 | 44 | drop_first_n() { 45 | local lines="${1:-0}" 46 | local offset="$(( lines + 1 ))" 47 | 48 | tail -n +"${offset}" 49 | } 50 | 51 | drop_last_n() { 52 | local lines="${1:-0}" 53 | head -n -"${lines}" 54 | } 55 | 56 | drop_header_footer() { 57 | local header_lines="${1:?$(log_error "Header line count required, 1-indexed.")}" 58 | local footer_lines="${2:?$(log_error "Footer line count required, 1-indexed.")}" 59 | 60 | drop_first_n "${header_lines}" | 61 | drop_last_n "${footer_lines}" 62 | } 63 | 64 | window_from_to_lines() { 65 | local from_line="${1:?$(log_error "FROM line number required, 1-indexed.")}" 66 | local to_line="${2:?$(log_error "TO line number required, 1-indexed.")}" 67 | 68 | drop_first_n "${from_line}" | 69 | head -n "${to_line}" 70 | } 71 | 72 | 73 | # #################### 74 | # PURE FUNCTIONS: SINGLE and MULTI-LINE STRUCTURED RECORDS 75 | # - filter, format, transform single or multi-line log streams 76 | # - these must consume stdin and write to stdout/err, and/or 77 | # - never cause side-effects (explicit file I/O etc.) 
78 | # #################### 79 | 80 | logs_extract_records() { 81 | # Usage: cat ./logfile.log | extract_log_records | wc -l 82 | # or replace by awk if the pipeline gets too deep 83 | grep -E -v "REJECT_LINES_PATTERN" \ 84 | | grep -E "CHOOSE_LINES_PATTERN" 85 | } 86 | 87 | logs_multiline_as_paras() { 88 | # Applications often print logs in multiple lines, and we face 89 | # walls of text. 90 | # 91 | # Given a wall of text of structured records, break the wall into 92 | # newline-separated paragraphs, for visual and structural separation. 93 | # Identify the beginning line, print a newline above it. Print 94 | # subsequent lines as-is, till the next paragraph begins. 95 | 96 | sed -n -E \ 97 | -e 's;^(FIXME_PARA_START_PATTERN).*;\n\0;p' \ 98 | -e 's;^([[:alnum:]]+.*);\0;p' 99 | } 100 | 101 | logs_paras_to_oneline() { 102 | # Given any paragraph-style multi-line record set, transform each 103 | # paragraph into single-line records. 104 | # 105 | # Ensure round trip from collapse -> expand -> collapse by using a 106 | # unique marker (like "^Z") to annotate the _beginning_ of each 107 | # line of a paragraph. (Ref. Classic Shell Scripting). 108 | 109 | awk 'BEGIN { RS = "" } { gsub("\n","^Z"); print; }' 110 | } 111 | 112 | logs_oneline_to_paras() { 113 | # Given a collapsed one-line record, expand it back to multi-line form. 114 | # BUT preserve the "paragraph separation", to help re-processing. 
115 | 116 | awk 'BEGIN { ORS="\n\n"; } { gsub("\\^Z", "\n"); print; }' 117 | } 118 | 119 | logs_group_by_YYYY() { 120 | # Given a list of records of this structure: 121 | # INFO YYYYMMDDHHSS "Foobar important information" 122 | 123 | sort -b -k2.1,2.4 # Ignore leading blanks, for correct character indexing 124 | } 125 | 126 | logs_group_by_MM() { 127 | # Given a list of records of this structure: 128 | # INFO YYYYMMDDHHSS "Foobar important information" 129 | 130 | sort -b -k2.5,2.6 131 | } 132 | 133 | logs_group_by_MM_then_YYYY() { 134 | # Given a list of records of this structure: 135 | # INFO YYYYMMDDHHSS "Foobar important information" 136 | 137 | sort -b -k2.5,2.6 -k2.1,2.4 138 | } 139 | 140 | # #################### 141 | # PURE FUNCTIONS: CSV RECORDS 142 | # - make and select CSV records, one-per line, which 143 | # may or may not have header and footer 144 | # - these must consume stdin and write to stdout/err, and/or 145 | # - never cause side-effects (explicit file I/O etc.) 146 | # #################### 147 | 148 | csv_from_http_error_logs() { 149 | # Example for generating CSV data stream, assuming log files are 150 | # space-separated records. 
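The analysis pipelines sketched in this section lean on the structure-agnostic helpers defined near the top of this file. As a self-contained illustration (the two helpers are re-declared here verbatim so the snippet runs on its own):

```shell
# Same definitions as earlier in log-analysis.sh, repeated so this
# demo is self-contained.
deduplicate() { cat -n | sort -uk2 | sort -nk1 | cut -f2-; }
frequencies() { sort | uniq -c | sort -bnr; }

# deduplicate keeps the first occurrence of each record, in input order:
printf "%s\n" 500 404 500 200 404 500 | deduplicate    # 500, 404, 200

# frequencies counts records, most frequent first:
printf "%s\n" 500 404 500 200 404 500 | frequencies    # 3 500 / 2 404 / 1 200
```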
151 | # 152 | # Usage: cat ./logfile.log | csv_from_http_error_logs > outfile.csv 153 | # 154 | # Suppose our log has the following format: 155 | # 156 | # Field Name : Position in log line 157 | # timestamp : 1 158 | # http method : 2 159 | # http status : 3 160 | # aws trace id : 5 161 | # customer ID : 6 162 | # uri : 7 163 | # 164 | # Once generated, the CSV (outfile.csv) may be further analyzed as follows: 165 | # 166 | # - Make frequency distribution of HTTP_status, found in column 2 of outfile 167 | # 168 | # $ cat outfile.csv \ 169 | # | awk 'BEGIN { FS="," } { print $2 }' \ 170 | # | drop_csv_header \ 171 | # | frequencies 172 | # 173 | # 174 | 175 | awk 'BEGIN { FS = "[[:space:]]+"; OFS = ","; print "Timestamp,HTTP_status,HTTP_method,URI,Customer_ID,AWS_Trace_ID"} 176 | /(GET|POST|PUT)[[:space:]]+(4|5)[[:digit:]][[:digit:]][[:space:]].*/ { sub(/^(\/common\/uri\/prefix\/)/, "", $7); 177 | print $1,$3,$2,$7,$6,$5; 178 | records_matched+=1; }' 179 | } 180 | 181 | csv_get_col() { 182 | local idx="${1}" 183 | cut -d , -f "${idx}" 184 | } 185 | 186 | csv_prepend_colnames() { 187 | local colnames="${@}" 188 | cat <(printf "%s\n" ${colnames} | paste -s -d ',') - 189 | } 190 | 191 | 192 | # #################### 193 | # PURE FUNCTIONS: JSON RECORDS 194 | # - these must consume stdin and write to stdout/err, and/or 195 | # - never cause side-effects (explicit file I/O etc.) 196 | # #################### 197 | 198 | 199 | jq_with_defaults() { 200 | # Wraps jq with defaults for the purpose of this program. 201 | # Expects to be passed a well-formed jq query as argument.
202 | # 203 | # Defaults provided: 204 | # 205 | # - Output with no colour, to avoid breaking tools that can't process colours 206 | # - Output as a single line, to preserve compatibility with unix tools 207 | jq -M -c "$@" 208 | } 209 | 210 | 211 | json_drop_uris() { 212 | # Given a uri prefix path, and a csv list of routes under that path, drop 213 | # any JSON log line whose 'uri' field matches the path. Also drop any 214 | # JSON having a null 'uri' value. 215 | 216 | local uri_prefix="${1:?$(log_error 'Path of URI to drop required')}" 217 | local uri_routes="${2:?$(log_error 'List of routes to drop (as nested under URI) required')}" 218 | local uri_routes_jq_pattern="$(printf "%s" "${uri_routes}" | tr ',' '|')" 219 | 220 | log_info "Dropping irrelevant or empty uris." 221 | jq_with_defaults --arg ROUTES_REGEX "^/${uri_prefix}/(${uri_routes_jq_pattern}).*" ' 222 | select((."uri" | strings) | test($ROUTES_REGEX) | not)' 223 | } 224 | -------------------------------------------------------------------------------- /logging.sh: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env bash 2 | 3 | # USAGE 4 | # 5 | # Source at the cmd-line or in your file and use the functions. 6 | # 7 | # log_info "Hello, this is a log line." 8 | # log_warn "You are in dangerous territory." 9 | # log_error "You screwed up" 10 | # log_fatal_die "It's so bad I can't continue." 11 | # 12 | # Intended for userland scripts / "application-level" logging, where 13 | # writing to system logs is overkill. Prefer the 'logger' coreutil if 14 | # your scripting demands "production-grade" logging.
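A compressed, self-contained sketch of the pattern the functions below implement — timestamped, levelled lines on stderr, leaving stdout clean for pipelines. The name `log_line` is hypothetical, and GNU `date` is assumed:

```shell
# Mirrors __to_stderr plus log_info/log_warn/log_error below:
# ISO-8601 timestamp, level tag, script name, then the message, on stderr.
log_line() {
    local level="${1}"; shift
    1>&2 printf "%s %s %s %s\n" "$(date --iso-8601=seconds)" "${level}" "$0" "$*"
}

log_line INFO "Hello, this is a log line."   # goes to stderr; stdout stays clean
```

Keeping logs on stderr is what lets `ls_git_projects … | proc_repos git_fetch` (in bulk-git-ops.sh) log progress without polluting the data flowing down the pipe.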
15 | # 16 | 17 | __to_stderr() { 18 | # Ref: I/O Redirection: http://tldp.org/LDP/abs/html/io-redirection.html 19 | 1>&2 printf "%s\n" "$(date --iso-8601=seconds) $@" 20 | } 21 | 22 | log_info() { 23 | __to_stderr "$(echo "INFO $0 $@")" 24 | } 25 | 26 | log_warn() { 27 | __to_stderr "$(echo "WARN $0 $@")" 28 | } 29 | 30 | log_error() { 31 | __to_stderr "$(echo "ERROR $0 $@")" 32 | return 1 33 | } 34 | 35 | log_fatal_die() { 36 | echo "FATAL $0 cannot proceed. Please fix errors and retry. $@" 37 | exit 1 38 | } 39 | -------------------------------------------------------------------------------- /machine-setup.sh: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env bash 2 | 3 | # 4 | # APT. 5 | # sudo you speak it? 6 | # 7 | 8 | apt_install_standard_packages() { 9 | declare -A installed_name_to_apt_package_array=( 10 | [chromium]="chromium-browser" 11 | [curl]="curl" 12 | [emacs26]="emacs26" 13 | [ffmpeg]="ffmpeg" 14 | [flatpak]="flatpak" 15 | [gawk]="gawk" 16 | [git]="git" 17 | [gnome-tweaks]="gnome-tweak-tool" 18 | [gparted]="gparted" 19 | [jq]="jq" 20 | [kazam]="kazam" 21 | [java]="openjdk-8-jdk openjdk-11-jdk openjdk-14-jdk" 22 | [psql]="postgresql" 23 | [python3]="python3" 24 | [pip3]="python3-pip" 25 | [rlwrap]="rlwrap" 26 | [ag]="silversearcher-ag" 27 | [tmux]="tmux" 28 | [vim]="vim" 29 | ) 30 | 31 | sudo add-apt-repository universe 32 | sudo add-apt-repository multiverse 33 | sudo add-apt-repository -y ppa:kelleyk/emacs 34 | sudo apt-get update 35 | 36 | for package in "${!installed_name_to_apt_package_array[@]}" 37 | do if which ${package} > /dev/null 38 | then printf "Skipping %s \n" ${package} 39 | else sudo apt-get install -y "${installed_name_to_apt_package_array[${package}]}" 40 | fi 41 | done 42 | } 43 | 44 | __apt_install_docker() { 45 | # Installation instructions copied from: 46 | # cf. https://docs.docker.com/engine/install/ubuntu/ 47 | if ! 
which docker docker-engine docker.io containerd runc > /dev/null 48 | then # Add Docker's official GPG key: 49 | sudo apt-get update 50 | sudo apt-get install ca-certificates curl 51 | sudo install -m 0755 -d /etc/apt/keyrings 52 | sudo curl -fsSL https://download.docker.com/linux/ubuntu/gpg -o /etc/apt/keyrings/docker.asc 53 | sudo chmod a+r /etc/apt/keyrings/docker.asc 54 | 55 | # Add the repository to Apt sources: 56 | echo \ 57 | "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.asc] https://download.docker.com/linux/ubuntu \ 58 | $(. /etc/os-release && echo "$VERSION_CODENAME") stable" | \ 59 | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null 60 | sudo apt-get update 61 | 62 | # Install docker 63 | sudo apt-get install docker-ce docker-ce-cli containerd.io docker-buildx-plugin docker-compose-plugin 64 | 65 | # Check install 66 | docker run hello-world 67 | fi 68 | } 69 | 70 | __apt_install_custom_nextdns() { 71 | # https://github.com/nextdns/nextdns/wiki/Debian-Based-Distribution 72 | if ! which nextdns > /dev/null 73 | then sh -c "$(curl -sL https://nextdns.io/install)" 74 | fi 75 | } 76 | 77 | __apt_install_custom_jami_p2p_videoconf() { 78 | # https://jami.net/download-jami-linux/ 79 | sudo apt install gnupg dirmngr ca-certificates curl --no-install-recommends 80 | curl -s https://dl.jami.net/public-key.gpg | sudo tee /usr/share/keyrings/jami-archive-keyring.gpg > /dev/null 81 | sudo sh -c "echo 'deb [signed-by=/usr/share/keyrings/jami-archive-keyring.gpg] https://dl.jami.net/nightly/ubuntu_20.04/ ring main' > /etc/apt/sources.list.d/jami.list" 82 | sudo apt-get update && sudo apt-get install jami 83 | } 84 | 85 | __apt_install_custom_syncthing() { 86 | # https://apt.syncthing.net/ 87 | 88 | if ! 
which syncthing > /dev/null 89 | then 90 | # Add the release PGP keys: 91 | sudo curl -s -o /usr/share/keyrings/syncthing-archive-keyring.gpg https://syncthing.net/release-key.gpg 92 | 93 | # Add the "stable" channel to your APT sources: 94 | echo "deb [signed-by=/usr/share/keyrings/syncthing-archive-keyring.gpg] https://apt.syncthing.net/ syncthing stable" | 95 | sudo tee /etc/apt/sources.list.d/syncthing.list 96 | 97 | # Add the "candidate" channel to your APT sources: 98 | echo "deb [signed-by=/usr/share/keyrings/syncthing-archive-keyring.gpg] https://apt.syncthing.net/ syncthing candidate" | 99 | sudo tee /etc/apt/sources.list.d/syncthing.list 100 | 101 | # Increase preference of Syncthing's packages ("pinning") 102 | printf "Package: *\nPin: origin apt.syncthing.net\nPin-Priority: 990\n" | sudo tee /etc/apt/preferences.d/syncthing 103 | 104 | # Update and install syncthing: 105 | sudo apt-get update 106 | sudo apt-get install syncthing 107 | fi 108 | } 109 | 110 | __dpkg_install_chrome() { 111 | if ! 
which google-chrome > /dev/null 112 | then curl https://dl.google.com/linux/direct/google-chrome-stable_current_amd64.deb \ 113 | -o "$HOME/Downloads/google-chrome-stable_current_amd64.deb" 114 | sudo dpkg -i "$HOME/Downloads/google-chrome-stable_current_amd64.deb" 115 | sudo apt-get -f install 116 | fi 117 | } 118 | 119 | __snap_install_VSCode() { 120 | sudo snap install code --classic 121 | } 122 | 123 | sudo_install_from_walled_gardens() { 124 | __apt_install_custom_nextdns 125 | __apt_install_custom_syncthing 126 | __dpkg_install_chrome 127 | __snap_install_VSCode 128 | # TODO: install docker 129 | # TODO: install nvm (node version manager) https://github.com/nvm-sh/nvm#installing-and-updating 130 | # TODO: install node with nvm https://github.com/nvm-sh/nvm#usage 131 | # TODO: install yarn (package manager for JS) _WITHOUT_ node https://classic.yarnpkg.com/en/docs/install#debian-stable 132 | # TODO: install shadow-cljs if needed https://docs.cider.mx/cider/cljs/shadow-cljs.html 133 | sudo apt autoremove 134 | sudo apt autoclean 135 | } 136 | 137 | # 138 | # FLATPACK. 139 | # OH: "flatpack is the Glorious Future" 140 | # 141 | 142 | flatpak_install_packages_repo_with_restart() { 143 | # BEFORE ANYTHING ELSE 144 | # - Ensure flatpak remote repos configured. 145 | # - Trigger restart iff first-time configuration. 
146 | local remote_name="flathub" 147 | if flatpak remotes --columns=name | grep -q "${remote_name}" 148 | then printf "INFO: About to install packages from %s.\n" "${remote_name}" 149 | else printf "INFO: About to add remote with name %s.\n" "${remote_name}" 150 | flatpak remote-add --if-not-exists \ 151 | "${remote_name}" \ 152 | "https://flathub.org/repo/flathub.flatpakrepo" 153 | fi 154 | } 155 | 156 | flatpak_install_packages() { 157 | local flatpak_aliases_file="$HOME/.bash_aliases_flatpak" 158 | 159 | declare -A app_alias_to_app_ID_array=( 160 | [bitwarden]="com.bitwarden.desktop" 161 | [bookworm]="com.github.babluboy.bookworm" 162 | [dropbox]="com.dropbox.Client" 163 | [gimp]="org.gimp.GIMP" 164 | [inkscape]="org.inkscape.Inkscape" 165 | [keepassx]="org.keepassxc.KeePassXC" 166 | [postman]="com.getpostman.Postman" 167 | [scribus]="net.scribus.Scribus" 168 | [skype]="com.skype.Client" 169 | [slack]="com.slack.Slack" 170 | [vlc]="org.videolan.VLC" 171 | [xournal]="net.sourceforge.xournal" 172 | [zeal]="org.zealdocs.Zeal" 173 | [zoom]="us.zoom.Zoom" 174 | ) 175 | 176 | __ensure_distinct() { tr -s '\n' | sort | uniq ; } 177 | 178 | # set up to update bash aliases file, for flatpak apps 179 | touch -m "${flatpak_aliases_file}" 180 | cat /dev/null > "${flatpak_aliases_file}.tmp" 181 | 182 | cat "${flatpak_aliases_file}" | 183 | __ensure_distinct | 184 | tee "${flatpak_aliases_file}.tmp" > /dev/null 185 | 186 | # check app alias, and flatpak install if not already apt installed 187 | for app_alias in "${!app_alias_to_app_ID_array[@]}" 188 | do if which "${app_alias}" > /dev/null || 189 | grep -q -E "${app_alias}" "${flatpak_aliases_file}.tmp" 190 | then printf "Skipping flatpak install of ${app_alias}.\n" 191 | else printf "About to install ${app_alias_to_app_ID_array[${app_alias}]}.\n" 192 | flatpak install -y flathub "${app_alias_to_app_ID_array[${app_alias}]}" 193 | cat >> "${flatpak_aliases_file}.tmp" < /dev/null 203 | 204 | # cleanup tmp file 205 | rm 
"${flatpak_aliases_file}.tmp" 206 | } 207 | 208 | # 209 | # HOME bin HOME. 210 | # Keep it in the userspace. 211 | # 212 | 213 | __install_to_HOME_bin_youtube_dl() { 214 | if ! which youtube-dl > /dev/null 215 | then curl -L https://yt-dl.org/downloads/latest/youtube-dl -o "${HOME}/bin/youtube-dl" 216 | chmod u+x "${HOME}/bin/youtube-dl" 217 | fi 218 | } 219 | 220 | __install_to_HOME_bin_clojure_things() { 221 | # leiningen 222 | if ! which lein > /dev/null 223 | then curl https://raw.githubusercontent.com/technomancy/leiningen/stable/bin/lein -o "$HOME/bin/lein" 224 | chmod u+x "$HOME/bin/lein" 225 | "$HOME/bin/lein" 226 | fi 227 | # clj-kondo 228 | ( source ./clj-projects.sh 229 | clj_kondo_install ) 230 | } 231 | 232 | __install_to_HOME_bin_and_local_python_things() { 233 | # get pipenv, virtualenv, and virtualenvwrapper 234 | # https://docs.python-guide.org/dev/virtualenvs/# 235 | which pipenv || pip3 install --user pipenv 236 | which virtualenv || pip3 install virtualenv 237 | which virtualenvwrapper.sh || pip3 install virtualenvwrapper 238 | } 239 | 240 | __install_to_HOME_bin_ngrok() { 241 | if ! which ngrok > /dev/null 242 | then cd "$HOME/Downloads" 243 | curl https://bin.equinox.io/c/4VmDzA7iaHb/ngrok-stable-linux-amd64.zip -o "./ngrok-stable-linux-amd64.zip" 244 | unzip "./ngrok-stable-linux-amd64.zip" 245 | mv "./ngrok" "$HOME/bin/ngrok" 246 | fi 247 | } 248 | 249 | __install_to_HOME_bin_babashka() { 250 | # ref: https://raw.githubusercontent.com/babashka/babashka/master/install 251 | if ! 
which bb > /dev/null
252 | then
253 | local version="$(curl -sL https://raw.githubusercontent.com/babashka/babashka/master/resources/BABASHKA_RELEASED_VERSION)"
254 | local tar_file="babashka-${version}-linux-amd64-static.tar.gz"
255 | local sha_file="${tar_file}.sha256"
256 | local uri_path="https://github.com/babashka/babashka/releases/download/v${version}"
257 | cd "$HOME/Downloads"
258 | curl -o "./${tar_file}" -sL "${uri_path}/${tar_file}"
259 | curl -o "./${sha_file}" -sL "${uri_path}/${sha_file}"
260 |
261 | printf "%s  %s" "$(cat "${sha_file}")" "${tar_file}" |
262 | sha256sum --check --status &&
263 | tar -zxf "./${tar_file}" &&
264 | mv -f "./bb" "$HOME/bin/bb"
265 | fi
266 | }
267 |
268 | install_to_HOME_bin() {
269 | mkdir -p "$HOME/bin"
270 | __install_to_HOME_bin_youtube_dl
271 | __install_to_HOME_bin_clojure_things
272 | __install_to_HOME_bin_and_local_python_things
273 | __install_to_HOME_bin_ngrok
274 | __install_to_HOME_bin_babashka
275 | }
276 |
277 | #
278 | # CONFIGURATIONS.
279 | # Are we there yet?
280 | #
281 |
282 | __configure_java_default_as_openjdk_11() {
283 | if ! java --version | grep -q "openjdk 11"
284 | then sudo update-java-alternatives \
285 | --set "$(update-java-alternatives --list | awk '/java-1.11.*-amd64/ { print $1 }')"
286 | fi
287 | }
288 |
289 | configure_things() {
290 | __configure_java_default_as_openjdk_11
291 | }
292 |
293 | #
294 | # EXECUTE!
295 | # Theirs not to make reply,
296 | # Theirs not to reason why,
297 | # Theirs but to do and die.
298 | # Into the valley of Death
299 | # Rode the six hundred.
300 | #
301 |
302 | apt_install_standard_packages
303 |
304 | sudo_install_from_walled_gardens
305 |
306 | install_to_HOME_bin
307 |
308 | flatpak_install_packages_repo_with_restart
309 | flatpak_install_packages
310 |
311 | configure_things
312 |
--------------------------------------------------------------------------------
/patterns-heredocs-strings.sh:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env bash
2 |
3 | demo_nothing_interpolates() {
4 | local interpolate_me="42"
5 |
6 | cat <<- 'EOF'
7 | Quoting the delimiter ('EOF') suppresses all interpolation; the hyphenated begin block `<<-` merely strips leading tabs.
8 |
9 | So ${interpolate_me} will print literally.
10 |
11 | EOF
12 | }
13 |
14 | demo_something_interpolates() {
15 | local interpolate_me="42"
16 |
17 | cat << EOF
18 | The \$interpolate_me variable expands to $interpolate_me
19 | EOF
20 | }
21 |
--------------------------------------------------------------------------------
/scripting-utils.sh:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env bash
2 |
3 | #
4 | # PATH MANAGEMENT
5 | #
6 |
7 | make_clean_PATH() {
8 | local my_path="${*:-${PATH}}"
9 |
10 | printf "%s\n" "${my_path}" |
11 | # un-form PATH-like strings
12 | tr ':' '\n' |
13 | # De-duplicate. # ref: https://stackoverflow.com/a/20639730
14 | # Or use awk '!x[$0]++', as explained here https://stackoverflow.com/a/11532197
15 | cat -n | sort -uk2 | sort -nk1 | cut -f2- |
16 | # re-form lines into single-colon-separated PATH-like string
17 | tr '\n' ':' | tr -s ':' |
18 | # ensure no trailing colons
19 | sed 's/:$//g'
20 | }
21 |
22 |
23 | # DEPENDENCY CHECKS
24 | #
25 |
26 | ensure_deps() {
27 | # Given a list of dependencies, emit unmet dependencies to stderr,
28 | # and return 1. Otherwise silently return 0.
29 | local required_deps="${*}"
30 | local err_code=0
31 | for prog in ${required_deps}
32 | do if !
which "$prog" > /dev/null
33 | then printf "%s\n" "$prog" >&2
34 | err_code=1
35 | fi
36 | done
37 | return ${err_code}
38 | }
39 |
40 | ensure_min_bash_version() {
41 | # Given a 'Major.Minor.Patch' SemVer number, return 1 if the system's
42 | # bash version is older than the given version. Default to 4.0.0.
43 | # Hat tip: https://unix.stackexchange.com/a/285928
44 | local semver="${1:-4.0.0}"
45 | local bashver="${BASH_VERSINFO[0]}.${BASH_VERSINFO[1]}.${BASH_VERSINFO[2]}"
46 |
47 | [[ $(printf "%s\n" "${semver}" "${bashver}" | sort -V | head -1) == "${semver}" ]]
48 | }
49 |
50 | #
51 | # FUNCTION QUERIES and TRANSFORMS
52 | #
53 |
54 | fn_to_sh_syntax() {
55 | # Given a zsh-style or bash-style function definition, emit
56 | # an sh-style companion. I prefer to keep the opening brace
57 | # on the same line as the fn name and the body+closing braces
58 | # on the following lines, for cleaner regex-matching of fn
59 | # definitions. e.g.
60 | #
61 | # 'function foo3_bar() {' --> 'foo3_bar() {'
62 | # 'function foo3_bar {' --> 'foo3_bar() {'
63 | #
64 | sed -E 's;(function)\s+(\w+)(\(\))?\s+\{;\2() \{;' ;
65 | }
66 |
67 | fn_ls() {
68 | # List out names of functions found in the given files,
69 | # assuming well-formed function definitions, written as:
70 | #
71 | # 'foo3_bar() {'
72 | #
73 | grep -E -o "^(\w+)\(\)\s+\{" "$@" | tr -d '(){'
74 | }
75 |
--------------------------------------------------------------------------------
/usage-trap.sh:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env bash
2 |
3 | set -euo pipefail
4 |
5 | # Include all common utils first, such as logging
6 | source "./logging.sh"
7 | source "./scripting-utils.sh"
8 |
9 | # ####################
10 | # GLOBAL CONSTANTS
11 | # ####################
12 |
13 | # Allowed commands must exactly match the wrapped tool's commands
14 | declare -r allowed_commands=$(cat <<- 'ALLOWEDCMDS'
15 | foo
16 | bar
17 | baz
18 | quxx
19 | ALLOWEDCMDS
20 | )
21 |
22 |
23 | # ####################
24 | #
SCRIPT HELP
25 | # ####################
26 |
27 | usage() {
28 | cat <<- EOH
29 |
30 | NAME
31 |
32 | ${0} - automate commands over the given target + environment combination.
33 |
34 | SYNOPSIS
35 |
36 | \$ ${0} [command] [target] [environment]
37 |
38 | DESCRIPTION
39 |
40 | Something meaningful. State prerequisites, pre-checks.
41 |
42 | Command is one of: $(echo $allowed_commands | sed 's/\s/, /g')
43 |
44 | EXAMPLES
45 |
46 | \$ ${0} -h|--help
47 | For help
48 |
49 | \$ ${0} foo targetName sandbox
50 | Execute "foo" command for all targets in the "sandbox" environment.
51 |
52 | EOH
53 | }
54 |
55 |
56 | # ####################
57 | # SCRIPT INPUTS (MANDATORY)
58 | # ####################
59 |
60 | trap usage EXIT SIGINT SIGTERM # ensure usage auto-prints if required args are missing
61 |
62 | if [[ -z ${1:-} || ${1} == "-h" || ${1} == "--help" ]]
63 | then exit
64 | fi
65 |
66 | declare -r command="${1:?$(log_error Required argument missing: 'command'.)}"
67 | declare -r target="${2:?$(log_error Required argument missing: 'target'.)}"
68 | declare -r environment="${3:?$(log_error Required argument missing: 'environment'.)}"
69 |
70 | trap - EXIT SIGINT SIGTERM # unset trap
71 |
72 |
73 | # ####################
74 | # CONSTRUCTED GLOBAL CONSTANTS
75 | # ####################
76 |
77 | # Constructed global constants
78 | declare -r target_dir="path/to/${target}"
79 | declare -r target_file="$target_dir/example.txt"
80 | declare -r ROOT_DIR="$PWD"
81 |
82 |
83 | # ####################
84 | # PRE-RUN CHECKS and FAILSAFES
85 | # ####################
86 |
87 | trap usage EXIT # ensure usage auto-prints if we preemptively fail the script
88 |
89 | # FAIL if unmet dependencies
90 | ensure_deps # comes from ./scripting-utils.sh; pass it the names of required programs
91 |
92 | # FAIL if bad/disallowed command is provided
93 | if ! printf "%s\n" "${allowed_commands}" | grep -q -x -F "${command}"
94 | then log_fatal_die "Unrecognised command \"$command\". See usage."
95 | fi
96 |
97 | # FAIL if conventional infra list file is absent
98 | if ! [[ -f "./$target_file" ]]
99 | then log_fatal_die "Cannot find ./${target_file} for \"$target\" target."
100 | fi
101 |
102 | # FAIL destructive command invocations, if environment is production-like
103 | if [[ ${command} == @(baz|quxx) ]] && [[ ${environment} == @(prod*|sandbox) ]]
104 | then log_fatal_die "You are attempting to run a destructive command \"${command}\" against the production-like environment \"${environment}\"."
105 | fi
106 |
107 | trap - EXIT # release usage-printing trap we took previously on EXIT
108 |
109 | # ####################
110 | # LOGIC
111 | # ####################
112 |
113 | func1() {
114 | : # do something
115 | }
116 |
117 | func2() {
118 | : # do something else
119 | }
120 |
121 | func3() {
122 | : # do something even better
123 | }
124 |
125 | # ####################
126 | # EXECUTE!
127 | # ####################
128 |
129 | # Scary stateful stuff goes here
130 |
--------------------------------------------------------------------------------
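As a quick usage sketch of the PATH de-duplication pipeline from `make_clean_PATH` in `scripting-utils.sh`: number the entries, sort-unique on the entry text (which keeps the first-seen copy), then restore input order by line number. The function is reproduced from the file; the demo directories are invented for illustration.

```shell
# Reproduced from scripting-utils.sh: de-duplicate a PATH-like string
# while preserving the first-seen order of its entries.
make_clean_PATH() {
  local my_path="${*:-${PATH}}"

  printf "%s\n" "${my_path}" |
    # un-form PATH-like strings
    tr ':' '\n' |
    # de-duplicate, keeping first occurrence; ref: https://stackoverflow.com/a/20639730
    cat -n | sort -uk2 | sort -nk1 | cut -f2- |
    # re-form lines into a single-colon-separated PATH-like string
    tr '\n' ':' | tr -s ':' |
    # ensure no trailing colons
    sed 's/:$//g'
}

# Demo with a synthetic PATH (hypothetical directories):
make_clean_PATH "/usr/bin:/usr/local/bin:/usr/bin:/opt/tools/bin:/usr/local/bin"
# -> /usr/bin:/usr/local/bin:/opt/tools/bin
```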
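The `sort -V` trick inside `ensure_min_bash_version` in `scripting-utils.sh` generalises to comparing any two dotted version strings: print both, version-sort them, and check whether the required version sorts first (or ties). A small sketch — `min_version_ok` is a hypothetical helper name coined for this demo, not a function in the files above:

```shell
# Return 0 if the installed version is at least the required version,
# using GNU sort's -V (version sort) to order the two version strings.
min_version_ok() {
  local required="${1}"
  local installed="${2}"
  # If the required version is the smaller (or equal) of the two,
  # it sorts first, so the installed version meets the minimum.
  [[ $(printf "%s\n" "${required}" "${installed}" | sort -V | head -1) == "${required}" ]]
}

min_version_ok "4.0.0" "5.1.16" && echo "new enough"
min_version_ok "4.4.0" "4.3.48" || echo "too old"
```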