├── .env.dev
├── .env.prod
├── .gitignore
├── backend
├── Dockerfile
├── Dockerfile.prod
├── devsupervisord.conf
├── manage.py
├── netdis
│   ├── .DS_Store
│   ├── __init__.py
│   ├── __pycache__
│   │   ├── __init__.cpython-312.pyc
│   │   ├── settings.cpython-312.pyc
│   │   ├── urls.cpython-312.pyc
│   │   └── wsgi.cpython-312.pyc
│   ├── asgi.py
│   ├── authentication
│   │   ├── __init__.py
│   │   ├── admin.py
│   │   ├── apps.py
│   │   ├── migrations
│   │   │   └── __init__.py
│   │   ├── models.py
│   │   ├── tests.py
│   │   ├── urls.py
│   │   └── views.py
│   ├── celery.py
│   ├── core
│   │   ├── __init__.py
│   │   ├── __pycache__
│   │   │   ├── __init__.cpython-312.pyc
│   │   │   ├── admin.cpython-312.pyc
│   │   │   ├── apps.cpython-312.pyc
│   │   │   ├── models.cpython-312.pyc
│   │   │   ├── urls.cpython-312.pyc
│   │   │   └── views.cpython-312.pyc
│   │   ├── admin.py
│   │   ├── apps.py
│   │   ├── ghidra
│   │   │   ├── __init__.py
│   │   │   └── analysis.py
│   │   ├── migrations
│   │   │   └── __init__.py
│   │   ├── models.py
│   │   ├── serializers.py
│   │   ├── tasks.py
│   │   ├── tests.py
│   │   ├── urls.py
│   │   ├── utils.py
│   │   └── views.py
│   ├── proctor
│   │   ├── __init__.py
│   │   ├── admin.py
│   │   ├── apps.py
│   │   ├── migrations
│   │   │   └── __init__.py
│   │   ├── models.py
│   │   ├── tests.py
│   │   └── views.py
│   ├── settings.py
│   ├── urls.py
│   └── wsgi.py
├── prodsupervisord.conf
└── requirements.txt
├── docker-compose-dev.yml
├── docker-compose-maintenance.yml
├── docker-compose-prod.yml
├── docker-compose.yml
├── frontend
├── .env.vite.dev
├── .env.vite.prod
├── .eslintrc.cjs
├── .gitignore
├── Dockerfile
├── Dockerfile.prod
├── README.md
├── entrypoint.dev.sh
├── entrypoint.prod.sh
├── index.html
├── package-lock.json
├── package.json
├── postcss.config.js
├── public
│   ├── django_logo.png
│   ├── favicon.ico
│   ├── ghidra_logo.png
│   ├── graph.png
│   ├── graph_skew_dark.png
│   └── vite.svg
├── src
│   ├── assets
│   │   └── react.svg
│   ├── components
│   │   ├── CodeNode.css
│   │   ├── CodeNode.jsx
│   │   ├── Decompilation.jsx
│   │   ├── FunctionList.jsx
│   │   ├── Graph.jsx
│   │   ├── Listing.jsx
│   │   ├── MenuBarItem.jsx
│   │   ├── Menubar.jsx
│   │   ├── NavBar.css
│   │   ├── NavBar.jsx
│   │   ├── RawHex.jsx
│   │   ├── Strings.jsx
│   │   └── Upload.jsx
│   ├── context
│   │   ├── AnalysisContext.js
│   │   ├── MenuContext.jsx
│   │   └── NodeSizeContext.jsx
│   ├── index.css
│   ├── main.jsx
│   └── pages
│   │   ├── AnalysisPage.css
│   │   ├── AnalysisPage.jsx
│   │   ├── ErrorPage.jsx
│   │   ├── HomePage.jsx
│   │   └── InfoPage.jsx
├── tailwind.config.js
└── vite.config.js
├── maintenance
├── index.html
└── lain.gif
├── netdis.png
├── nginxprod.conf
├── production.checklist
└── readme.MD

/.env.dev:
--------------------------------------------------------------------------------
DEBUG=1
SECRET_KEY=foo
DJANGO_ALLOWED_HOSTS=localhost 127.0.0.1 [::1]
CORS_ALLOWED_ORIGINS=http://localhost:5173

SQL_ENGINE=django.db.backends.postgresql
SQL_DATABASE=netdis
SQL_USER=postgres
SQL_PASSWORD=postgres
SQL_HOST=db
SQL_PORT=5432

# 2 * 1024 * 1024 * 1024 = 2gb
MAX_STORAGE=2147483648
# 2 * 1024 * 1024 = 2mb
MAX_FILE_SIZE=2097152
--------------------------------------------------------------------------------
/.env.prod:
--------------------------------------------------------------------------------
DEBUG=0
SECRET_KEY=foo
# ALLOWED_HOSTS entries are bare hostnames; Django rejects values with a scheme
DJANGO_ALLOWED_HOSTS=www.netdis.org netdis.org
# CORS origins must include the scheme
CORS_ALLOWED_ORIGINS=https://www.netdis.org https://netdis.org

SQL_ENGINE=django.db.backends.postgresql
SQL_DATABASE=netdis
SQL_USER=postgres
SQL_PASSWORD=postgres
SQL_HOST=db
SQL_PORT=5432

# 2 * 1024 * 1024 * 1024 = 2gb
MAX_STORAGE=2147483648
# 2 * 1024 * 1024 = 2mb
MAX_FILE_SIZE=2097152
--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------
backend/db.sqlite3
venv/
.DS_Store
backend/media
db.sqlite3
*.log
*.pot
*.pyc
node_modules/
dump.rdb
.env
--------------------------------------------------------------------------------
/backend/Dockerfile:
--------------------------------------------------------------------------------
FROM python:3.11

WORKDIR /app

COPY ./requirements.txt .
RUN apt-get update && apt-get install -y gcc g++ supervisor unzip java-common openjdk-17-jdk
RUN pip3 install --upgrade pip
RUN pip3 install -r requirements.txt
RUN wget https://github.com/NationalSecurityAgency/ghidra/releases/download/Ghidra_11.1.2_build/ghidra_11.1.2_PUBLIC_20240709.zip
RUN unzip ./ghidra_11.1.2_PUBLIC_20240709.zip
ENV GHIDRA_INSTALL_DIR=/app/ghidra_11.1.2_PUBLIC

# 4 hours of my life spent on these two RUNs
RUN curl -fsSL https://services.gradle.org/distributions/gradle-7.6-bin.zip -o gradle-7.6-bin.zip && \
    unzip gradle-7.6-bin.zip -d /opt && \
    ln -s /opt/gradle-7.6/bin/gradle /usr/bin/gradle && \
    rm gradle-7.6-bin.zip

RUN ./ghidra_11.1.2_PUBLIC/support/buildNatives

EXPOSE 8000

COPY devsupervisord.conf /etc/supervisor/conf.d/supervisord.conf

COPY . .

CMD ["/usr/bin/supervisord"]
--------------------------------------------------------------------------------
/backend/Dockerfile.prod:
--------------------------------------------------------------------------------
FROM python:3.11

WORKDIR /app

COPY ./requirements.txt .
RUN apt-get update && apt-get install -y gcc g++ supervisor unzip java-common openjdk-17-jdk
# `. venv/bin/activate` does not persist across RUN layers;
# putting the venv's bin directory on PATH activates it for every later instruction.
RUN python3 -m venv /app/venv
ENV PATH="/app/venv/bin:$PATH"
RUN pip3 install --upgrade pip
RUN pip3 install -r requirements.txt
RUN wget https://github.com/NationalSecurityAgency/ghidra/releases/download/Ghidra_11.1.2_build/ghidra_11.1.2_PUBLIC_20240709.zip
RUN unzip ./ghidra_11.1.2_PUBLIC_20240709.zip
ENV GHIDRA_INSTALL_DIR=/app/ghidra_11.1.2_PUBLIC

# 4 hours of my life spent on these two RUNs
RUN curl -fsSL https://services.gradle.org/distributions/gradle-7.6-bin.zip -o gradle-7.6-bin.zip && \
    unzip gradle-7.6-bin.zip -d /opt && \
    ln -s /opt/gradle-7.6/bin/gradle /usr/bin/gradle && \
    rm gradle-7.6-bin.zip

RUN ./ghidra_11.1.2_PUBLIC/support/buildNatives

RUN pip install gunicorn

COPY . .

RUN rm -f /etc/supervisor/conf.d/supervisord.conf
COPY prodsupervisord.conf /etc/supervisor/conf.d/supervisord.conf

EXPOSE 8000

CMD ["/usr/bin/supervisord"]
--------------------------------------------------------------------------------
/backend/devsupervisord.conf:
--------------------------------------------------------------------------------
[supervisord]
nodaemon=true

[program:django]
environment = PYTHONUNBUFFERED=1
command=sh -c "python manage.py flush --noinput && python manage.py makemigrations && python manage.py migrate && python manage.py runserver 0.0.0.0:8000"
autorestart=true
stdout_logfile=/dev/stdout
stderr_logfile=/dev/stderr
stdout_logfile_maxbytes = 0
stderr_logfile_maxbytes = 0

[program:celery]
command=celery -A netdis worker -l info --concurrency=2
autorestart=true
stdout_logfile=/dev/stdout
stderr_logfile=/dev/stderr
stdout_logfile_maxbytes = 0
stderr_logfile_maxbytes = 0

[program:celery_beat]
command=celery -A netdis beat -l INFO
autorestart=true
stdout_logfile=/dev/stdout
stderr_logfile=/dev/stderr
stdout_logfile_maxbytes = 0
stderr_logfile_maxbytes = 0
--------------------------------------------------------------------------------
/backend/manage.py:
--------------------------------------------------------------------------------
#!/usr/bin/env python
"""Django's command-line utility for administrative tasks."""
import os
import sys


def main():
    """Run administrative tasks."""
    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'netdis.settings')
    try:
        from django.core.management import execute_from_command_line
    except ImportError as exc:
        raise ImportError(
            "Couldn't import Django. Are you sure it's installed and "
            "available on your PYTHONPATH environment variable? Did you "
            "forget to activate a virtual environment?"
        ) from exc
    execute_from_command_line(sys.argv)


if __name__ == '__main__':
    main()
--------------------------------------------------------------------------------
/backend/netdis/.DS_Store:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/anthonyshibitov/netdis/345ea660e0504e4cae44877f10549aabd4eec1dc/backend/netdis/.DS_Store
--------------------------------------------------------------------------------
/backend/netdis/__init__.py:
--------------------------------------------------------------------------------
from .celery import app as celery_app

__all__ = ("celery_app",)
--------------------------------------------------------------------------------
/backend/netdis/__pycache__/__init__.cpython-312.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/anthonyshibitov/netdis/345ea660e0504e4cae44877f10549aabd4eec1dc/backend/netdis/__pycache__/__init__.cpython-312.pyc
--------------------------------------------------------------------------------
/backend/netdis/__pycache__/settings.cpython-312.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/anthonyshibitov/netdis/345ea660e0504e4cae44877f10549aabd4eec1dc/backend/netdis/__pycache__/settings.cpython-312.pyc
--------------------------------------------------------------------------------
/backend/netdis/__pycache__/urls.cpython-312.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/anthonyshibitov/netdis/345ea660e0504e4cae44877f10549aabd4eec1dc/backend/netdis/__pycache__/urls.cpython-312.pyc
--------------------------------------------------------------------------------
/backend/netdis/__pycache__/wsgi.cpython-312.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/anthonyshibitov/netdis/345ea660e0504e4cae44877f10549aabd4eec1dc/backend/netdis/__pycache__/wsgi.cpython-312.pyc
--------------------------------------------------------------------------------
/backend/netdis/asgi.py:
--------------------------------------------------------------------------------
"""
ASGI config for netdis project.

It exposes the ASGI callable as a module-level variable named ``application``.

For more information on this file, see
https://docs.djangoproject.com/en/4.2/howto/deployment/asgi/
"""

import os

from django.core.asgi import get_asgi_application

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'netdis.settings')

application = get_asgi_application()
--------------------------------------------------------------------------------
/backend/netdis/authentication/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/anthonyshibitov/netdis/345ea660e0504e4cae44877f10549aabd4eec1dc/backend/netdis/authentication/__init__.py
--------------------------------------------------------------------------------
/backend/netdis/authentication/admin.py:
--------------------------------------------------------------------------------
from django.contrib import admin

# Register your models here.
--------------------------------------------------------------------------------
/backend/netdis/authentication/apps.py:
--------------------------------------------------------------------------------
from django.apps import AppConfig


class AuthenticationConfig(AppConfig):
    default_auto_field = 'django.db.models.BigAutoField'
    name = 'netdis.authentication'
--------------------------------------------------------------------------------
/backend/netdis/authentication/migrations/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/anthonyshibitov/netdis/345ea660e0504e4cae44877f10549aabd4eec1dc/backend/netdis/authentication/migrations/__init__.py
--------------------------------------------------------------------------------
/backend/netdis/authentication/models.py:
--------------------------------------------------------------------------------
from django.db import models

# Create your models here.
--------------------------------------------------------------------------------
/backend/netdis/authentication/tests.py:
--------------------------------------------------------------------------------
from django.test import TestCase

# Create your tests here.
--------------------------------------------------------------------------------
/backend/netdis/authentication/urls.py:
--------------------------------------------------------------------------------
from django.urls import path
from .views import test_view

from rest_framework_simplejwt.views import (
    TokenObtainPairView,
    TokenRefreshView
)

urlpatterns = [
    path('token/', TokenObtainPairView.as_view(), name='token_obtain_pair'),
    path('token/refresh', TokenRefreshView.as_view(), name='token_refresh'),
    path('test', test_view, name='test_view')
]
--------------------------------------------------------------------------------
/backend/netdis/authentication/views.py:
--------------------------------------------------------------------------------
from django.shortcuts import render
from rest_framework.decorators import api_view, permission_classes
from rest_framework.permissions import IsAuthenticated
from rest_framework.response import Response
# Create your views here.

@api_view(['GET'])
@permission_classes([IsAuthenticated])
def test_view(request):
    return Response("ur logged in bro!")
--------------------------------------------------------------------------------
/backend/netdis/celery.py:
--------------------------------------------------------------------------------
import os
from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "netdis.settings")
app = Celery("netdis")
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()
--------------------------------------------------------------------------------
/backend/netdis/core/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/anthonyshibitov/netdis/345ea660e0504e4cae44877f10549aabd4eec1dc/backend/netdis/core/__init__.py
--------------------------------------------------------------------------------
/backend/netdis/core/__pycache__/__init__.cpython-312.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/anthonyshibitov/netdis/345ea660e0504e4cae44877f10549aabd4eec1dc/backend/netdis/core/__pycache__/__init__.cpython-312.pyc
--------------------------------------------------------------------------------
/backend/netdis/core/__pycache__/admin.cpython-312.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/anthonyshibitov/netdis/345ea660e0504e4cae44877f10549aabd4eec1dc/backend/netdis/core/__pycache__/admin.cpython-312.pyc
--------------------------------------------------------------------------------
/backend/netdis/core/__pycache__/apps.cpython-312.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/anthonyshibitov/netdis/345ea660e0504e4cae44877f10549aabd4eec1dc/backend/netdis/core/__pycache__/apps.cpython-312.pyc
--------------------------------------------------------------------------------
/backend/netdis/core/__pycache__/models.cpython-312.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/anthonyshibitov/netdis/345ea660e0504e4cae44877f10549aabd4eec1dc/backend/netdis/core/__pycache__/models.cpython-312.pyc
--------------------------------------------------------------------------------
/backend/netdis/core/__pycache__/urls.cpython-312.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/anthonyshibitov/netdis/345ea660e0504e4cae44877f10549aabd4eec1dc/backend/netdis/core/__pycache__/urls.cpython-312.pyc
--------------------------------------------------------------------------------
/backend/netdis/core/__pycache__/views.cpython-312.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/anthonyshibitov/netdis/345ea660e0504e4cae44877f10549aabd4eec1dc/backend/netdis/core/__pycache__/views.cpython-312.pyc
--------------------------------------------------------------------------------
/backend/netdis/core/admin.py:
--------------------------------------------------------------------------------
from django.contrib import admin

# Register your models here.
--------------------------------------------------------------------------------
/backend/netdis/core/apps.py:
--------------------------------------------------------------------------------
from django.apps import AppConfig


class CoreConfig(AppConfig):
    default_auto_field = 'django.db.models.BigAutoField'
    name = 'netdis.core'
--------------------------------------------------------------------------------
/backend/netdis/core/ghidra/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/anthonyshibitov/netdis/345ea660e0504e4cae44877f10549aabd4eec1dc/backend/netdis/core/ghidra/__init__.py
--------------------------------------------------------------------------------
/backend/netdis/core/ghidra/analysis.py:
--------------------------------------------------------------------------------
import pyhidra
from ..models import Task, UploadedFile, Project, Function, Block, Disasm, CFGAnalysisResult, FileUploadResult
from django.contrib.contenttypes.models import ContentType
from celery import shared_task
import base64
import jpype
import logging

logger = logging.getLogger(__name__)

# This is for local dev
# os.environ['GHIDRA_INSTALL_DIR'] = "/Users/sasha/Desktop/ghidra_10.3.2_PUBLIC/"

@shared_task()
def ghidra_get_loaders(program):
    print("THIS SHOULDNT HAVE A LOAD SPEC ISSUE...")
    try:
        with pyhidra.open_program(program, analyze=False, language="x86:LE:64:default", loader="ghidra.app.util.opinion.BinaryLoader") as flat_api:
            from ghidra.app.util.opinion import LoaderService
            from ghidra.app.util.bin import FileByteProvider
            from java.nio.file import AccessMode
            from java.io import File
            byte_provider = FileByteProvider(File(program), None, AccessMode.READ)
            load_specs = LoaderService.getAllSupportedLoadSpecs(byte_provider)
            loaders = {}
            for loader in load_specs:
                loaders[loader.getName()] = loader.toString().split('@')[0]
                logger.debug(f"LOADER CLASS: {loader.toString()} LOADER NAME: {loader.getName()}")

            from ghidra.program.util import DefaultLanguageService
            language_service = DefaultLanguageService.getLanguageService()
            language_descs = language_service.getLanguageDescriptions(False)
            langs = {}
            for lang in language_descs:
                langs[lang.getLanguageID().getIdAsString()] = lang.getDescription()
            return [loaders, langs]
    except Exception as e:
        # str(e) handles Python exceptions too, which have no toString()
        return {"error": str(e)}

@shared_task()
def ghidra_get_strings(program):
    try:
        with pyhidra.open_program(program) as flat_api:
            currentProgram = flat_api.getCurrentProgram()
            memory = currentProgram.getMemory()
            strings = flat_api.findStrings(None, 4, 1, True, True)
            json_strings = {}
            for string in strings:
                current_string = string.getString(memory)
                json_strings[string.getAddress().toString()] = repr(current_string)
            return json_strings
    except Exception as e:
        return {"error": str(e)}

@shared_task()
def ghidra_get_rawhex(program, address, length):
    print(f"PROGRAM: {program} ADDRESS: {address} LENGTH: {length}")
    try:
        with pyhidra.open_program(program) as flat_api:
            from ghidra.program.model.address import Address
            currentProgram = flat_api.getCurrentProgram()

            address_factory = currentProgram.getAddressFactory()
            address = address_factory.getAddress(str(address))
            memory = currentProgram.getMemory()
            valid_lengths = [1, 2, 4, 8, 16, 32, 64, 128, 512]
            if length not in valid_lengths:
                return {"error": "Invalid size"}
            if memory.contains(address):
                byte_array = {}
                for byte in range(length):
                    logger.debug(f"At address: {str(address)}")
                    try:
                        byte_array[str(address)] = format(memory.getByte(address) & 0xFF, '02x')
                    except Exception:
                        byte_array[str(address)] = "??"
                    address = address.add(1)
                return byte_array
            else:
                return {"error": "Invalid address"}
    except Exception as e:
        return {"error": str(e)}

@shared_task()
def ghidra_decompile_func(program, func_id):
    try:
        with pyhidra.open_program(program) as flat_api:
            from ghidra.app.decompiler import DecompInterface
            from ghidra.util.task import ConsoleTaskMonitor
            currentProgram = flat_api.getCurrentProgram()
            function_obj = Function.objects.get(id=func_id)
            function_address = currentProgram.getAddressFactory().getAddress(function_obj.addr)
            ghidra_function = currentProgram.getFunctionManager().getFunctionAt(function_address)
            decompiler = DecompInterface()
            decompiler.openProgram(currentProgram)
            task_monitor = ConsoleTaskMonitor()
            decomp_result = decompiler.decompileFunction(ghidra_function, 30, task_monitor)
            if decomp_result.decompileCompleted():
                decompiled_code = decomp_result.getDecompiledFunction().getC()
                return decompiled_code
            else:
                error_message = decomp_result.getErrorMessage()
                return {"error": error_message}
    except Exception as e:
        return {"error": str(e)}

@shared_task()
def ghidra_function_cfg(program, func_id):
    try:
        with pyhidra.open_program(program) as flat_api:
            from ghidra.util.task import TaskMonitor
            import ghidra.program.model.block as blockmodel
            monitor = TaskMonitor.DUMMY
            currentProgram = flat_api.getCurrentProgram()
            if Function.objects.get(id=func_id):
                function_obj = Function.objects.get(id=func_id)
                func_address = currentProgram.getAddressFactory().getAddress(function_obj.addr)
                ghidra_function = currentProgram.getFunctionManager().getFunctionAt(func_address)
                code_block_model = blockmodel.BasicBlockModel(currentProgram)
                blocks = code_block_model.getCodeBlocksContaining(ghidra_function.body, monitor)
                edges = []
                for block in blocks:
                    if Block.objects.filter(function=function_obj, addr=block.minAddress).exists():
                        block_obj = Block.objects.get(function=function_obj, addr=block.minAddress)
                    else:
                        block_obj = Block(function=function_obj, addr=block.minAddress)
                        block_obj.save()
                    srcs = code_block_model.getSources(block, monitor)
                    dsts = code_block_model.getDestinations(block, monitor)
                    while srcs.hasNext():
                        src = srcs.next().getSourceBlock()
                        if Block.objects.filter(function=function_obj, addr=src.minAddress).exists():
                            src_obj = Block.objects.get(function=function_obj, addr=src.minAddress)
                            block_obj.src.add(src_obj)
                        else:
                            src_obj = Block(function=function_obj, addr=src.minAddress)
                            src_obj.save()
                            block_obj.src.add(src_obj)
                    while dsts.hasNext():
                        dst_edge = dsts.next()
                        dst = dst_edge.getDestinationBlock()
                        if Block.objects.filter(function=function_obj, addr=dst.minAddress).exists():
                            dst_obj = Block.objects.get(function=function_obj, addr=dst.minAddress)
                            block_obj.dst.add(dst_obj)
                        else:
                            dst_obj = Block(function=function_obj, addr=dst.minAddress)
                            dst_obj.save()
                            block_obj.dst.add(dst_obj)
                        edge_type = ""
                        if dst_edge.getFlowType().isConditional():
                            edge_type = "conditional"
                        if dst_edge.getFlowType().isUnConditional():
                            edge_type = "unconditional"
                        edges.append({"src": block_obj.id, "dst": dst_obj.id, "type": edge_type})

                # Now return the json object!
                blocks = Block.objects.filter(function=function_obj).all()
                nodes = [{"id": block.id} for block in blocks]

                cfg_result = {"nodes": nodes, "edges": edges}
    except Exception as e:
        return {"error": str(e)}
    return cfg_result

@shared_task()
def ghidra_full_disassembly(task_id, program, file_id, loader, language):
    logger.debug(f"DOING LANGUAGE: '{language}'")
    task = Task.objects.get(pk=task_id)
    task.status = 'PROCESSING'
    task.save()

    kwargs = {}
    try:
        if language is not None and language.upper() != "NONE":
            kwargs['language'] = language
        if loader is not None and loader.upper() != "NONE":
            kwargs['loader'] = loader
    except Exception as e:
        logger.debug(f"Error setting up kwargs: {str(e)}")

    logger.debug(f"Using kwargs: {kwargs}")

    try:
        with pyhidra.open_program(program, **kwargs) as flat_api:
            from ghidra.util.task import TaskMonitor
            import ghidra.program.model.block as blockmodel
            monitor = TaskMonitor.DUMMY
            currentProgram = flat_api.getCurrentProgram()
            ghidra_functions = currentProgram.getFunctionManager().getFunctions(True)
            image_base = currentProgram.getImageBase().toString()
            logger.debug(f"Starting full disasm. IMAGE BASE: {image_base}")
            file = UploadedFile.objects.get(pk=file_id)
            file.image_base = image_base
            file.save()
            for f in ghidra_functions:
                print(f)
                file = UploadedFile.objects.get(pk=file_id)
                function_obj = Function(file=file, name=f.getName(), addr=f.getEntryPoint())
                function_obj.save()
                code_block_model = blockmodel.BasicBlockModel(currentProgram)
                blocks = code_block_model.getCodeBlocksContaining(f.body, monitor)
                for block in blocks:
                    block_obj = Block(function=function_obj, addr=block.minAddress)
                    block_obj.save()
                    instruction = currentProgram.getListing().getInstructionAt(block.minAddress)
                    while instruction and instruction.getMinAddress() <= block.maxAddress:
                        operands = ''
                        for i in range(instruction.getNumOperands()):
                            operands += instruction.getDefaultOperandRepresentation(i)
                            operands += ', '
                        operands = operands[:-2]
                        disasm_obj = Disasm(block=block_obj, op=instruction.getMnemonicString(), data=operands, addr=instruction.getMinAddress())
                        disasm_obj.save()
                        instruction = instruction.getNext()
    except Exception as e:
        error_message = str(e)
        logger.error(f"Error in ghidra_full_disassembly: {error_message}")
        return {"error": error_message}
--------------------------------------------------------------------------------
/backend/netdis/core/migrations/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/anthonyshibitov/netdis/345ea660e0504e4cae44877f10549aabd4eec1dc/backend/netdis/core/migrations/__init__.py
--------------------------------------------------------------------------------
/backend/netdis/core/models.py:
--------------------------------------------------------------------------------
from django.db import models
from django.contrib.contenttypes.fields import GenericForeignKey
from django.contrib.contenttypes.models import ContentType
from netdis import settings
import os

class UploadedFile(models.Model):
    file = models.FileField(upload_to="uploads/", max_length=300)
    hash = models.CharField(max_length=256)
    uploaded_at = models.DateTimeField(auto_now_add=True)
    evict_at = models.DateTimeField(null=True, blank=True)
    file_size = models.IntegerField()
    image_base = models.CharField(max_length=64, null=True, blank=True)


class Project(models.Model):
    file = models.ForeignKey(UploadedFile, on_delete=models.CASCADE)

class Function(models.Model):
    file = models.ForeignKey(UploadedFile, on_delete=models.CASCADE)
    addr = models.CharField(max_length=64)
    name = models.CharField(max_length=256)
    cfg = models.BooleanField(default=False)
    def __str__(self):
        return self.name

class Block(models.Model):
    function = models.ForeignKey(Function, on_delete=models.CASCADE)
    src = models.ManyToManyField('self', related_name="src_blocks", symmetrical=False, blank=True)
    dst = models.ManyToManyField('self', related_name="dst_blocks", symmetrical=False, blank=True)
    addr = models.CharField(max_length=64)

class Disasm(models.Model):
    block = models.ForeignKey(Block, on_delete=models.CASCADE)
    addr = models.CharField(max_length=64)
    op = models.CharField(max_length=64)
    data = models.CharField(max_length=64)

class Task(models.Model):
    task_type = models.CharField(max_length=64, default="default")
    status = models.CharField(max_length=16)

    content_type = models.ForeignKey(ContentType, on_delete=models.CASCADE, null=True, blank=True)
    object_id = models.PositiveIntegerField(null=True, blank=True)
    result = GenericForeignKey('content_type', 'object_id')

class FileUploadResult(models.Model):
    file = models.ForeignKey(UploadedFile, on_delete=models.CASCADE, null=True, blank=True)
class CFGAnalysisResult(models.Model):
    json_result = models.JSONField()

class DecompAnalysisResult(models.Model):
    decomp_result = models.TextField(null=True, blank=True)

class ErrorResult(models.Model):
    error_message = models.TextField(null=True, blank=True)

class RawHexResult(models.Model):
    raw_hex = models.TextField(null=True, blank=True)

class StringsResult(models.Model):
    strings = models.JSONField()

class LoadersResult(models.Model):
    loaders = models.JSONField()
--------------------------------------------------------------------------------
/backend/netdis/core/serializers.py:
--------------------------------------------------------------------------------
from rest_framework import serializers
from .models import Function, UploadedFile, Project, Task, Block, Disasm

class FunctionSerializer(serializers.ModelSerializer):
    class Meta:
        model = Function
        fields = '__all__'

class BlockSerializer(serializers.ModelSerializer):
    class Meta:
        model = Block
        fields = '__all__'

class DisasmSerializer(serializers.ModelSerializer):
    class Meta:
        model = Disasm
        fields = '__all__'

class TaskSerializer(serializers.ModelSerializer):
    class Meta:
        model = Task
        fields = '__all__'
--------------------------------------------------------------------------------
/backend/netdis/core/tasks.py:
--------------------------------------------------------------------------------
# Celery tasks
from .models import Task, UploadedFile, Project, Function, Block, Disasm, FileUploadResult, CFGAnalysisResult, DecompAnalysisResult, ErrorResult, RawHexResult, StringsResult, LoadersResult
from celery import shared_task
from .utils import timer
import subprocess
import os
from .ghidra.analysis import ghidra_function_cfg, ghidra_full_disassembly, ghidra_decompile_func, ghidra_get_rawhex, ghidra_get_strings, ghidra_get_loaders
from django.contrib.contenttypes.models import ContentType
import logging

logger = logging.getLogger(__name__)

def get_loaders_task(file_id, task_id):
    file = UploadedFile.objects.get(pk=file_id)
    print(f"Getting task... ID {task_id}")
    task = Task.objects.get(pk=task_id)
    task.status = "ACTIVE"
    task.save()
    file_path = "./media/" + file.file.name
    ghidra_get_loaders.apply_async(args=(file_path,), link=get_loaders_task_callback.s(task.id, file_id))

@shared_task()
def get_loaders_task_callback(loader_result, task_id, file_id):
    file = UploadedFile.objects.get(pk=file_id)
    file.delete()
    task = Task.objects.get(pk=task_id)
    if 'error' in loader_result:
        task.task_type = "error"
        result = ErrorResult.objects.create(error_message=loader_result)
    else:
        task.task_type = "loaders"
        result = LoadersResult.objects.create(loaders=loader_result)
    task.status = "DONE"
    task.content_type = ContentType.objects.get_for_model(result)
    task.object_id = result.id
    task.result = result
    task.save()

def primary_analysis(file_id, task_id, loader, lang):
    file = UploadedFile.objects.get(pk=file_id)
    print(f"Getting task... ID {task_id}")
    task = Task.objects.get(pk=task_id)
    task.status = "ACTIVE"
    task.save()
    # Check if Project object already exists with this hash
    if Project.objects.filter(file=file).exists():
        proj_obj = Project.objects.get(file=file)
        print("Project already exists. Returning project ID...")
        return proj_obj.id

    print("Project doesn't exist. Let's make one and analyze the file. Added debug below")
    logger.debug("Project doesn't exist. Let's make one and analyze the file.")
    print(file.file.name)
    file_path = "./media/" + file.file.name

    # proj_obj = Project(file = file)
    # proj_obj.save()
    # print(f"Project ID created: {proj_obj.id}")
    # print(f"Project obj id {proj_obj.id}")
    ghidra_full_disassembly.apply_async(args=(task_id, file_path, file_id, loader, lang), link=primary_analysis_callback.s(task.id, file_id))

@shared_task()
def primary_analysis_callback(error, task_id, file_id):
    print(f"Finished task id {task_id}")
    # project = Project.objects.get(pk=project_id)
    file = UploadedFile.objects.get(pk=file_id)
    if error and 'error' in error:
        # Do some house cleaning..
        result = ErrorResult.objects.create(error_message=error)
        # project.delete()
        # file = UploadedFile.objects.get(pk=file_id)
        print(f"error {error}")
        file.delete()
    else:
        result = FileUploadResult.objects.create(file=file)
        result.save()
    task = Task.objects.get(pk=task_id)
    if error and 'error' in error:
        task.task_type = "error"
    task.status = "DONE"
    task.content_type = ContentType.objects.get_for_model(result)
    task.object_id = result.id
    task.result = result
    task.save()


def cfg_analysis(file_id, func_id, task_id):
    task = Task.objects.get(pk=task_id)
    task.status = "ACTIVE"
    task.save()
    file = UploadedFile.objects.get(pk=file_id)
    file_path = "./media/" + file.file.name
    ghidra_function_cfg.apply_async(args=(file_path, func_id), link=cfg_analysis_callback.s(task.id))


@shared_task()
def cfg_analysis_callback(cfg_result, task_id):
    if 'error' in cfg_result:
        result = ErrorResult.objects.create(error_message=cfg_result)
    else:
        result = CFGAnalysisResult.objects.create(json_result=cfg_result)
    result.save()
    task = Task.objects.get(id=task_id)
    if 'error' in cfg_result:
        task.task_type = "error"
| task.status = "DONE" 107 | task.content_type = ContentType.objects.get_for_model(result) 108 | task.object_id = result.id 109 | task.result = result 110 | task.save() 111 | 112 | def get_rawhex(file_id, task_id, address, length): 113 | task = Task.objects.get(pk=task_id) 114 | task.status = "ACTIVE" 115 | task.save() 116 | print("Looking for file id", file_id) 117 | file = UploadedFile.objects.get(pk=file_id) 118 | file_path = "./media/" + file.file.name 119 | ghidra_get_rawhex.apply_async(args=(file_path, address, int(length)), link=get_rawhex_callback.s(task.id)) 120 | 121 | @shared_task() 122 | def get_rawhex_callback(rawhex_result, task_id): 123 | if 'error' in rawhex_result: 124 | result = ErrorResult.objects.create(error_message = rawhex_result) 125 | else: 126 | result = RawHexResult.objects.create(raw_hex = rawhex_result) 127 | result.save() 128 | task = Task.objects.get(id=task_id) 129 | if 'error' in rawhex_result: 130 | task.task_type = "error" 131 | task.status = "DONE" 132 | task.content_type = ContentType.objects.get_for_model(result) 133 | task.object_id = result.id 134 | task.result = result 135 | task.save() 136 | 137 | 138 | def decompile_function(file_id, func_id, task_id): 139 | task = Task.objects.get(pk=task_id) 140 | task.status = "ACTIVE" 141 | task.save() 142 | file = UploadedFile.objects.get(pk=file_id) 143 | file_path = "./media/" + file.file.name 144 | ghidra_decompile_func.apply_async(args=(file_path, func_id), link=decompile_function_callback.s(task.id)) 145 | 146 | @shared_task() 147 | def decompile_function_callback(cfg_result, task_id): 148 | if 'error' in cfg_result: 149 | result = ErrorResult.objects.create(error_message = cfg_result) 150 | else: 151 | result = DecompAnalysisResult.objects.create(decomp_result = cfg_result) 152 | result.save() 153 | task = Task.objects.get(id=task_id) 154 | if 'error' in cfg_result: 155 | task.task_type = "error" 156 | task.status = "DONE" 157 | task.content_type = 
ContentType.objects.get_for_model(result) 158 | task.object_id = result.id 159 | task.result = result 160 | task.save() 161 | 162 | def get_strings(file_id, task_id): 163 | task = Task.objects.get(pk=task_id) 164 | task.status = "ACTIVE" 165 | task.save() 166 | file = UploadedFile.objects.get(pk=file_id) 167 | file_path = "./media/" + file.file.name 168 | ghidra_get_strings.apply_async(args=(file_path,), link=get_strings_callback.s(task.id)) 169 | 170 | @shared_task() 171 | def get_strings_callback(strings_result, task_id): 172 | if 'error' in strings_result: 173 | result = ErrorResult.objects.create(error_message = strings_result) 174 | else: 175 | result = StringsResult.objects.create(strings = strings_result) 176 | result.save() 177 | task = Task.objects.get(id=task_id) 178 | if 'error' in strings_result: 179 | task.task_type = "error" 180 | task.status = "DONE" 181 | task.content_type = ContentType.objects.get_for_model(result) 182 | task.object_id = result.id 183 | task.result = result 184 | task.save() 185 | 186 | from netdis.core.models import ErrorResult 187 | # from celery.utils.log import get_task_logger 188 | 189 | # logger = get_task_logger(__name__) 190 | 191 | @shared_task 192 | def test(): 193 | err = ErrorResult.objects.create(error_message="CRONN") 194 | err.save() 195 | logger.info("TEST CRON") 196 | 197 | -------------------------------------------------------------------------------- /backend/netdis/core/tests.py: -------------------------------------------------------------------------------- 1 | from django.test import TestCase 2 | 3 | # Create your tests here. 
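core/tests.py is still a stub. As a sketch of where tests could start: the oldest-first eviction policy in core/utils.query_storage can be unit-tested without a database if the size accounting is factored into a pure helper. The `evict_oldest` helper below is illustrative only, not code that exists in the repo; it mirrors the loop in query_storage (files ordered by `uploaded_at`, dropped from the front until the total fits under MAX_STORAGE).

```python
import unittest

def evict_oldest(sizes, max_storage):
    # Pure-function mirror of utils.query_storage: `sizes` lists file sizes
    # ordered oldest-first; drop files from the front until the total fits.
    total = sum(sizes)
    while sizes and total > max_storage:
        total -= sizes.pop(0)
    return sizes

class EvictionPolicyTest(unittest.TestCase):
    def test_oldest_uploads_evicted_first(self):
        # With a ~2 MB cap, the oldest upload must be dropped first.
        kept = evict_oldest([1_500_000, 1_000_000, 900_000], 2_097_152)
        self.assertEqual(kept, [1_000_000, 900_000])

    def test_no_eviction_under_cap(self):
        self.assertEqual(evict_oldest([500], 2_097_152), [500])
```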
4 | -------------------------------------------------------------------------------- /backend/netdis/core/urls.py: -------------------------------------------------------------------------------- 1 | from django.urls import path 2 | from .views import * 3 | 4 | urlpatterns = [ 5 | path('test/', test_view, name='test'), 6 | path('binary_ingest/', binary_ingest, name='binary_ingest'), 7 | path('get_loaders/', get_loaders, name='get_loaders'), 8 | 9 | path('funcs/', funcs, name='funcs'), 10 | path('blocks/', blocks, name='blocks'), 11 | path('disasms/', disasms, name='disasms'), 12 | 13 | path('task/<int:id>/', task, name='task'), 14 | 15 | path('func_graph/', func_graph, name='func_graph'), 16 | path('decomp_func/', decomp_func, name='decomp_func'), 17 | path('rawhex/', rawhex, name='rawhex'), 18 | path('strings/', strings, name='strings'), 19 | # Placeholder routes, currently wired to test_view 20 | path('proj/', test_view, name='proj'), 21 | path('func/', test_view, name='func'), 22 | path('block/', test_view, name='block'), 23 | path('disasm/', test_view, name='disasm'), 24 | ] 25 | -------------------------------------------------------------------------------- /backend/netdis/core/utils.py: -------------------------------------------------------------------------------- 1 | from .models import UploadedFile, Project, Function, Block, Disasm 2 | from functools import wraps 3 | from .serializers import FunctionSerializer, BlockSerializer, DisasmSerializer 4 | import time 5 | import os 6 | import shutil 7 | 8 | def query_storage(): 9 | """ 10 | Check if total storage has been exceeded. If so, delete files until we are below the limit.
11 | """ 12 | max_size = int(os.environ["MAX_STORAGE"]) 13 | files = UploadedFile.objects.all().order_by('uploaded_at') 14 | total_size = sum(file.file.size for file in files) 15 | while total_size > max_size: 16 | oldest = files.first() 17 | if os.path.exists(oldest.file.path): 18 | os.remove(oldest.file.path) 19 | shutil.rmtree(oldest.file.path + "_ghidra") 20 | total_size -= oldest.size 21 | oldest.delete() 22 | 23 | def get_project_from_hash(hash): 24 | if UploadedFile.objects.filter(hash = hash).exists(): 25 | uploadedfile = UploadedFile.objects.get(hash = hash) 26 | try: 27 | project_id = Project.objects.get(file=uploadedfile) 28 | return project_id 29 | except: 30 | return None 31 | return None 32 | 33 | def get_functions_from_file(file_id): 34 | function_list = Function.objects.filter(file_id=int(file_id)) 35 | serializer = FunctionSerializer(function_list, many=True) 36 | return serializer.data 37 | 38 | 39 | def get_blocks_from_function(function_id): 40 | block_list = Block.objects.filter(function_id=int(function_id)) 41 | serializer = BlockSerializer(block_list, many=True) 42 | return serializer.data 43 | 44 | def get_disasm_from_block(block_id): 45 | disasm_list = Disasm.objects.filter(block_id=int(block_id)) 46 | serializer = DisasmSerializer(disasm_list, many=True) 47 | return serializer.data 48 | 49 | def timer(func): 50 | """helper function to estimate view execution time""" 51 | 52 | @wraps(func) # used for copying func metadata 53 | def wrapper(*args, **kwargs): 54 | # record start time 55 | start = time.time() 56 | 57 | # func execution 58 | result = func(*args, **kwargs) 59 | 60 | duration = (time.time() - start) * 1000 61 | # output execution time to console 62 | print('view {} takes {:.2f} ms'.format( 63 | func.__name__, 64 | duration 65 | )) 66 | return result 67 | return wrapper -------------------------------------------------------------------------------- /backend/netdis/core/views.py: 
-------------------------------------------------------------------------------- 1 | from rest_framework.decorators import api_view, parser_classes, permission_classes 2 | from rest_framework.response import Response 3 | from rest_framework.parsers import MultiPartParser, FormParser 4 | from rest_framework import status 5 | from django.http import Http404, HttpResponseBadRequest 6 | from .models import Task, UploadedFile, Project, Function, Block, Disasm, FileUploadResult, CFGAnalysisResult, DecompAnalysisResult, ErrorResult, RawHexResult, StringsResult, LoadersResult 7 | import hashlib 8 | import json 9 | from .utils import get_functions_from_file, get_blocks_from_function, get_disasm_from_block, query_storage 10 | from .utils import timer 11 | from .tasks import primary_analysis, cfg_analysis, decompile_function, get_rawhex, get_strings, get_loaders_task 12 | from .serializers import TaskSerializer 13 | import datetime 14 | import os 15 | import logging 16 | 17 | logger = logging.getLogger(__name__) 18 | 19 | @api_view(['GET']) 20 | def test_view(request): 21 | data = {"result": "test"} 22 | return Response(data) 23 | 24 | @api_view(['POST']) 25 | @parser_classes([MultiPartParser]) 26 | def get_loaders(request): 27 | if(request.method == 'POST' and request.FILES.get('file')): 28 | file_obj = request.FILES['file'] 29 | file_size = file_obj.size 30 | contents = file_obj.read() 31 | hash = hashlib.sha256(contents).hexdigest() 32 | file_obj.name = hash 33 | max_file_size = int(os.environ.get("MAX_FILE_SIZE")) 34 | if(file_size > max_file_size): 35 | # Reject files larger than MAX_FILE_SIZE (2 MB per the .env defaults) 36 | return Response({"error": "File too large", "error_info": file_size}, status=status.HTTP_400_BAD_REQUEST) 37 | query_storage() 38 | uploaded_file = UploadedFile(file=file_obj, hash=hash, file_size=file_size) 39 | uploaded_file.save() 40 | uploaded_file.evict_at = uploaded_file.uploaded_at + datetime.timedelta(days=2) 41 | uploaded_file.save() 42 | 43 | logger.debug("Queueing worker...")
44 | task = Task(status = "QUEUED", task_type='get_loaders') 45 | task.save() 46 | logger.debug(f"Task id {task.id}") 47 | get_loaders_task(uploaded_file.id, task.id) 48 | serializer = TaskSerializer(task) 49 | return Response(serializer.data) 50 | 51 | return Response("Bad request!", status=status.HTTP_400_BAD_REQUEST) 52 | 53 | @timer 54 | @api_view(['POST']) 55 | @parser_classes([MultiPartParser]) 56 | def binary_ingest(request): 57 | if(request.method == 'POST' and request.FILES.get('file')): 58 | if request.data.get('loader'): 59 | loader = request.data.get('loader') 60 | else: 61 | loader = None 62 | if request.data.get('lang'): 63 | lang = request.data.get('lang') 64 | else: 65 | lang = None 66 | file_obj = request.FILES['file'] 67 | file_size = file_obj.size 68 | contents = file_obj.read() 69 | hash = hashlib.sha256(contents).hexdigest() 70 | file_obj.name = hash 71 | max_file_size = int(os.environ.get("MAX_FILE_SIZE")) 72 | if(file_size > max_file_size): 73 | # Reject files larger than MAX_FILE_SIZE (2 MB per the .env defaults) 74 | return Response({"error": "File too large", "error_info": file_size}, status=status.HTTP_400_BAD_REQUEST) 75 | 76 | if UploadedFile.objects.filter(hash = hash).exists(): 77 | # Uploaded file, and analysis already exists 78 | uploaded_file = UploadedFile.objects.get(hash = hash) 79 | # project = Project.objects.get(file = uploaded_file) 80 | print("Loaded file") 81 | print(uploaded_file) 82 | return Response({ "file_id": uploaded_file.id, "image_base": uploaded_file.image_base }) 83 | else: 84 | # Uploaded file does not exist. Save it, queue analysis, and schedule eviction.
85 | query_storage() 86 | uploaded_file = UploadedFile(file=file_obj, hash=hash, file_size=file_size) 87 | uploaded_file.save() 88 | uploaded_file.evict_at = uploaded_file.uploaded_at + datetime.timedelta(days=2) 89 | uploaded_file.save() 90 | 91 | print("Queueing worker...") 92 | task = Task(status = "QUEUED", task_type='file_upload') 93 | task.save() 94 | print(f"File id {uploaded_file.id}") 95 | print(f"Task id {task.id}") 96 | print(f"USING LOADER: {loader}") 97 | print(f"USING LANG: {lang}") 98 | primary_analysis(uploaded_file.id, task.id, loader, lang) 99 | serializer = TaskSerializer(task) 100 | return Response(serializer.data) 101 | 102 | return Response("Bad request!", status=status.HTTP_400_BAD_REQUEST) 103 | 104 | @api_view(['POST']) 105 | def funcs(request): 106 | if(request.body): 107 | data_dict = json.loads(request.body.decode("utf-8")) 108 | try: 109 | file_id = data_dict['file_id'] 110 | except Exception as error: 111 | return Response(f"ERROR: {error}", status=status.HTTP_400_BAD_REQUEST) 112 | print("Loading funcs for file_id", file_id) 113 | functions = get_functions_from_file(file_id) 114 | return Response(functions) 115 | return Response('Bad request!', status=status.HTTP_400_BAD_REQUEST) 116 | 117 | @api_view(['POST']) 118 | def blocks(request): 119 | if(request.body): 120 | try: 121 | data_dict = json.loads(request.body.decode("utf-8")) 122 | except Exception as error: 123 | return Response(str(error), status=status.HTTP_400_BAD_REQUEST) 124 | function_id = data_dict['function_id'] 125 | return Response(get_blocks_from_function(function_id)) 126 | return Response('Bad request!', status=status.HTTP_400_BAD_REQUEST) 127 | 128 | @api_view(['POST']) 129 | def disasms(request): 130 | if(request.body): 131 | try: 132 | data_dict = json.loads(request.body.decode("utf-8")) 133 | except Exception as error: 134 | return Response(str(error), status=status.HTTP_400_BAD_REQUEST) 135 | block_id = data_dict['block_id'] 136 | return Response(get_disasm_from_block(block_id)) 137 | return Response('Bad request!', 
status=status.HTTP_400_BAD_REQUEST) 138 | 139 | @api_view(['POST']) 140 | def raw(request): 141 | if(request.body): 142 | try: 143 | data_dict = json.loads(request.body.decode("utf-8")) 144 | except Exception as error: 145 | return Response(str(error), status=status.HTTP_400_BAD_REQUEST) 146 | address = data_dict['address'] 147 | print(f"Looking for address {address}") 148 | return Response(status=status.HTTP_200_OK) 149 | return Response('Bad request!', status=status.HTTP_400_BAD_REQUEST) 150 | 151 | @api_view(['GET']) 152 | def task(request, id): 153 | try: 154 | task = Task.objects.get(pk=id) 155 | serializer = TaskSerializer(task) 156 | response = serializer.data 157 | if task.status == "DONE": 158 | logger.debug(f"task type: {task.task_type}") 159 | match task.task_type: 160 | case 'file_upload': 161 | result = FileUploadResult.objects.get(id=task.object_id) 162 | logger.debug(f"Task is asking for file id {result.file_id}") 163 | image_base = UploadedFile.objects.get(pk=result.file_id).image_base 164 | response["result"] = {"file_id": result.file_id, "image_base": image_base} 165 | case 'cfg_analysis': 166 | result = CFGAnalysisResult.objects.get(id=task.object_id) 167 | response["result"] = {"json_result": result.json_result} 168 | case 'decomp_func': 169 | result = DecompAnalysisResult.objects.get(id=task.object_id) 170 | response['result'] = {"decomp_result": result.decomp_result} 171 | case 'raw_request': 172 | result = RawHexResult.objects.get(id=task.object_id) 173 | response['result'] = {"rawhex": result.raw_hex} 174 | case 'strings': 175 | result = StringsResult.objects.get(id=task.object_id) 176 | response['result'] = {"strings": result.strings} 177 | case 'error': 178 | result = ErrorResult.objects.get(id=task.object_id) 179 | response['result'] = {"error": result.error_message} 180 | case 'loaders': 181 | result = LoadersResult.objects.get(id=task.object_id) 182 | response['result'] = {"loaders": result.loaders} 183 | task.delete() 184 | return Response(response) 185 | except Task.DoesNotExist: 186 | 
return Response('Task does not exist', status=status.HTTP_404_NOT_FOUND) 187 | except Exception as e: 188 | return Response(str(e), status=status.HTTP_400_BAD_REQUEST) 189 | 190 | @api_view(['POST']) 191 | def func_graph(request): 192 | if(request.body): 193 | data_dict = json.loads(request.body.decode("utf-8")) 194 | func_id = data_dict['function_id'] 195 | file_id = data_dict['file_id'] 196 | task = Task.objects.create(task_type='cfg_analysis', status='QUEUED') 197 | task.save() 198 | cfg_analysis(file_id, func_id, task.id) 199 | return Response({"task_id": task.id, "status": task.status}) 200 | return Response('Bad request!', status=status.HTTP_400_BAD_REQUEST) 201 | 202 | @api_view(['POST']) 203 | def decomp_func(request): 204 | if(request.body): 205 | data_dict = json.loads(request.body.decode("utf-8")) 206 | func_id = data_dict['function_id'] 207 | file_id = data_dict['file_id'] 208 | task = Task.objects.create(task_type='decomp_func', status='QUEUED') 209 | task.save() 210 | decompile_function(file_id, func_id, task.id) 211 | return Response({"task_id": task.id, "status": task.status}) 212 | return Response('Bad request!', status=status.HTTP_400_BAD_REQUEST) 213 | 214 | @api_view(['POST']) 215 | def rawhex(request): 216 | if(request.body): 217 | data_dict = json.loads(request.body.decode("utf-8")) 218 | file_id = data_dict['file_id'] 219 | address = data_dict['address'] 220 | length = data_dict['length'] 221 | task = Task.objects.create(task_type='raw_request', status='QUEUED') 222 | task.save() 223 | logger.debug(f"Address: {address}, length: {length}, file id {file_id}") 224 | get_rawhex(file_id, task.id, address, length) 225 | return Response({"task_id": task.id, "status": task.status}) 226 | return Response('Bad request!', status=status.HTTP_400_BAD_REQUEST) 227 | 228 | @api_view(['POST']) 229 | def strings(request): 230 | if(request.body): 231 | data_dict = json.loads(request.body.decode("utf-8")) 232 | file_id = data_dict['file_id'] 233 | task = Task.objects.create(task_type='strings', status='QUEUED') 234 | 
task.save() 235 | get_strings(file_id, task.id) 236 | return Response({"task_id": task.id, "status": task.status}) 237 | return Response('Bad request!', status=status.HTTP_400_BAD_REQUEST) -------------------------------------------------------------------------------- /backend/netdis/proctor/__init__.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anthonyshibitov/netdis/345ea660e0504e4cae44877f10549aabd4eec1dc/backend/netdis/proctor/__init__.py -------------------------------------------------------------------------------- /backend/netdis/proctor/admin.py: -------------------------------------------------------------------------------- 1 | from django.contrib import admin 2 | 3 | # Register your models here. 4 | -------------------------------------------------------------------------------- /backend/netdis/proctor/apps.py: -------------------------------------------------------------------------------- 1 | from django.apps import AppConfig 2 | 3 | 4 | class ProctorConfig(AppConfig): 5 | default_auto_field = 'django.db.models.BigAutoField' 6 | name = 'netdis.proctor' 7 | -------------------------------------------------------------------------------- /backend/netdis/proctor/migrations/__init__.py: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anthonyshibitov/netdis/345ea660e0504e4cae44877f10549aabd4eec1dc/backend/netdis/proctor/migrations/__init__.py -------------------------------------------------------------------------------- /backend/netdis/proctor/models.py: -------------------------------------------------------------------------------- 1 | from django.db import models 2 | 3 | # Create your models here. 
4 | -------------------------------------------------------------------------------- /backend/netdis/proctor/tests.py: -------------------------------------------------------------------------------- 1 | from django.test import TestCase 2 | 3 | # Create your tests here. 4 | -------------------------------------------------------------------------------- /backend/netdis/proctor/views.py: -------------------------------------------------------------------------------- 1 | from django.shortcuts import render 2 | 3 | # Create your views here. 4 | -------------------------------------------------------------------------------- /backend/netdis/settings.py: -------------------------------------------------------------------------------- 1 | """ 2 | Django settings for netdis project. 3 | 4 | Generated by 'django-admin startproject' using Django 4.2.7. 5 | 6 | For more information on this file, see 7 | https://docs.djangoproject.com/en/4.2/topics/settings/ 8 | 9 | For the full list of settings and their values, see 10 | https://docs.djangoproject.com/en/4.2/ref/settings/ 11 | """ 12 | 13 | import os 14 | from pathlib import Path 15 | from celery.schedules import crontab 16 | # import netdis.core.cron 17 | # import netdis.cron 18 | # import netdis.core.tasks 19 | 20 | # Build paths inside the project like this: BASE_DIR / 'subdir'. 21 | BASE_DIR = Path(__file__).resolve().parent.parent 22 | 23 | MEDIA_URL = '/media/' 24 | MEDIA_ROOT = os.path.join(BASE_DIR, 'media') 25 | 26 | 27 | # Quick-start development settings - unsuitable for production 28 | # See https://docs.djangoproject.com/en/4.2/howto/deployment/checklist/ 29 | 30 | # SECURITY WARNING: keep the secret key used in production secret! 31 | # SECRET_KEY = 'django-insecure-)-e458(5nsay^+rlz@c6v8*7zimmq2bl%l&&)n7lbn8_(p3^m$' 32 | SECRET_KEY = os.environ.get("SECRET_KEY") 33 | 34 | 35 | # SECURITY WARNING: don't run with debug turned on in production! 
36 | DEBUG = int(os.environ.get('DEBUG', 0)) 37 | 38 | ALLOWED_HOSTS = os.environ.get("DJANGO_ALLOWED_HOSTS").split(" ") 39 | 40 | # Application definition 41 | 42 | INSTALLED_APPS = [ 43 | 'django.contrib.admin', 44 | 'django.contrib.auth', 45 | 'django.contrib.contenttypes', 46 | 'django.contrib.sessions', 47 | 'django.contrib.messages', 48 | 'django.contrib.staticfiles', 'corsheaders', 49 | 'rest_framework', 50 | 'netdis.core', 51 | 'netdis.proctor', 52 | 'netdis.authentication', 53 | 'rest_framework_simplejwt.token_blacklist', 54 | ] 55 | 56 | MIDDLEWARE = [ 57 | 'corsheaders.middleware.CorsMiddleware', # must come before CommonMiddleware so CORS headers reach every response 58 | 'django.middleware.security.SecurityMiddleware', 59 | 'django.contrib.sessions.middleware.SessionMiddleware', 60 | 'django.middleware.common.CommonMiddleware', 61 | 'django.middleware.csrf.CsrfViewMiddleware', 62 | 'django.contrib.auth.middleware.AuthenticationMiddleware', 63 | 'django.contrib.messages.middleware.MessageMiddleware', 64 | 'django.middleware.clickjacking.XFrameOptionsMiddleware', 65 | ] 66 | 67 | ROOT_URLCONF = 'netdis.urls' 68 | 69 | # CORS_ALLOWED_ORIGINS = [ 70 | # 'http://localhost' 71 | # ] 72 | CORS_ALLOWED_ORIGINS = os.environ.get("CORS_ALLOWED_ORIGINS").split(" ") 73 | 74 | CELERY_BROKER_URL = "redis://redis:6379" 75 | CELERY_RESULT_BACKEND = "redis://redis:6379" 76 | 77 | CELERY_BEAT_SCHEDULE = { 78 | 'test': { 79 | 'task': 'netdis.core.tasks.test', 80 | 'schedule': 3600, 81 | } 82 | } 83 | 84 | TEMPLATES = [ 85 | { 86 | 'BACKEND': 'django.template.backends.django.DjangoTemplates', 87 | 'DIRS': [], 88 | 'APP_DIRS': True, 89 | 'OPTIONS': { 90 | 'context_processors': [ 91 | 'django.template.context_processors.debug', 92 | 'django.template.context_processors.request', 93 | 'django.contrib.auth.context_processors.auth', 94 | 'django.contrib.messages.context_processors.messages', 95 | ], 96 | }, 97 | }, 98 | ] 99 | 100 | WSGI_APPLICATION = 'netdis.wsgi.application' 101 | 102 | 103 | # Database 104 | # 
https://docs.djangoproject.com/en/4.2/ref/settings/#databases 105 | 106 | DATABASES = { 107 | "default": { 108 | "ENGINE": os.environ.get("SQL_ENGINE", "django.db.backends.sqlite3"), 109 | "NAME": os.environ.get("SQL_DATABASE", BASE_DIR / "db.sqlite3"), 110 | "USER": os.environ.get("SQL_USER", "user"), 111 | "PASSWORD": os.environ.get("SQL_PASSWORD", "password"), 112 | "HOST": os.environ.get("SQL_HOST", "localhost"), 113 | "PORT": os.environ.get("SQL_PORT", "5432"), 114 | } 115 | } 116 | 117 | REST_FRAMEWORK = { 118 | 'DEFAULT_AUTHENTICATION_CLASSES': ( 119 | 'rest_framework_simplejwt.authentication.JWTAuthentication', 120 | ), 121 | 'DEFAULT_RENDERER_CLASSES': ( 122 | 'rest_framework.renderers.JSONRenderer', 123 | ) 124 | } 125 | 126 | from datetime import timedelta 127 | 128 | SIMPLE_JWT = { 129 | "ACCESS_TOKEN_LIFETIME": timedelta(minutes=15), 130 | "REFRESH_TOKEN_LIFETIME": timedelta(days=90), 131 | "ROTATE_REFRESH_TOKENS": True, 132 | "BLACKLIST_AFTER_ROTATION": True, 133 | "UPDATE_LAST_LOGIN": False, 134 | 135 | "ALGORITHM": "HS256", 136 | "SIGNING_KEY": SECRET_KEY, # use the configured SECRET_KEY rather than a hardcoded placeholder 137 | "VERIFYING_KEY": "", 138 | "AUDIENCE": None, 139 | "ISSUER": None, 140 | "JSON_ENCODER": None, 141 | "JWK_URL": None, 142 | "LEEWAY": 0, 143 | 144 | "AUTH_HEADER_TYPES": ("Bearer",), 145 | "AUTH_HEADER_NAME": "HTTP_AUTHORIZATION", 146 | "USER_ID_FIELD": "id", 147 | "USER_ID_CLAIM": "user_id", 148 | "USER_AUTHENTICATION_RULE": "rest_framework_simplejwt.authentication.default_user_authentication_rule", 149 | 150 | "AUTH_TOKEN_CLASSES": ("rest_framework_simplejwt.tokens.AccessToken",), 151 | "TOKEN_TYPE_CLAIM": "token_type", 152 | "TOKEN_USER_CLASS": "rest_framework_simplejwt.models.TokenUser", 153 | 154 | "JTI_CLAIM": "jti", 155 | 156 | "SLIDING_TOKEN_REFRESH_EXP_CLAIM": "refresh_exp", 157 | "SLIDING_TOKEN_LIFETIME": timedelta(minutes=5), 158 | "SLIDING_TOKEN_REFRESH_LIFETIME": timedelta(days=1), 159 | 160 | "TOKEN_OBTAIN_SERIALIZER": 
"rest_framework_simplejwt.serializers.TokenObtainPairSerializer", 161 | "TOKEN_REFRESH_SERIALIZER": "rest_framework_simplejwt.serializers.TokenRefreshSerializer", 162 | "TOKEN_VERIFY_SERIALIZER": "rest_framework_simplejwt.serializers.TokenVerifySerializer", 163 | "TOKEN_BLACKLIST_SERIALIZER": "rest_framework_simplejwt.serializers.TokenBlacklistSerializer", 164 | "SLIDING_TOKEN_OBTAIN_SERIALIZER": "rest_framework_simplejwt.serializers.TokenObtainSlidingSerializer", 165 | "SLIDING_TOKEN_REFRESH_SERIALIZER": "rest_framework_simplejwt.serializers.TokenRefreshSlidingSerializer", 166 | } 167 | 168 | 169 | # Password validation 170 | # https://docs.djangoproject.com/en/4.2/ref/settings/#auth-password-validators 171 | 172 | AUTH_PASSWORD_VALIDATORS = [ 173 | { 174 | 'NAME': 'django.contrib.auth.password_validation.UserAttributeSimilarityValidator', 175 | }, 176 | { 177 | 'NAME': 'django.contrib.auth.password_validation.MinimumLengthValidator', 178 | }, 179 | { 180 | 'NAME': 'django.contrib.auth.password_validation.CommonPasswordValidator', 181 | }, 182 | { 183 | 'NAME': 'django.contrib.auth.password_validation.NumericPasswordValidator', 184 | }, 185 | ] 186 | 187 | 188 | # Internationalization 189 | # https://docs.djangoproject.com/en/4.2/topics/i18n/ 190 | 191 | LANGUAGE_CODE = 'en-us' 192 | 193 | TIME_ZONE = 'UTC' 194 | 195 | USE_I18N = True 196 | 197 | USE_TZ = True 198 | 199 | 200 | # Static files (CSS, JavaScript, Images) 201 | # https://docs.djangoproject.com/en/4.2/howto/static-files/ 202 | 203 | STATIC_URL = 'static/' 204 | 205 | # Default primary key field type 206 | # https://docs.djangoproject.com/en/4.2/ref/settings/#default-auto-field 207 | 208 | DEFAULT_AUTO_FIELD = 'django.db.models.BigAutoField' 209 | 210 | LOGGING = { 211 | 'version': 1, 212 | 'disable_existing_loggers': False, 213 | 'handlers': { 214 | 'console': { 215 | 'level': 'DEBUG', 216 | 'class': 'logging.StreamHandler', 217 | }, 218 | }, 219 | 'loggers': { 220 | # 'django': { 221 | # 'handlers': 
['console'], 222 | # 'level': 'WARNING', 223 | # 'propagate': True, 224 | # }, 225 | 'netdis.core': { 226 | 'handlers': ['console'], 227 | 'level': 'DEBUG', 228 | 'propagate': True, 229 | }, 230 | }, 231 | } -------------------------------------------------------------------------------- /backend/netdis/urls.py: -------------------------------------------------------------------------------- 1 | """ 2 | URL configuration for netdis project. 3 | 4 | The `urlpatterns` list routes URLs to views. For more information please see: 5 | https://docs.djangoproject.com/en/4.2/topics/http/urls/ 6 | Examples: 7 | Function views 8 | 1. Add an import: from my_app import views 9 | 2. Add a URL to urlpatterns: path('', views.home, name='home') 10 | Class-based views 11 | 1. Add an import: from other_app.views import Home 12 | 2. Add a URL to urlpatterns: path('', Home.as_view(), name='home') 13 | Including another URLconf 14 | 1. Import the include() function: from django.urls import include, path 15 | 2. Add a URL to urlpatterns: path('blog/', include('blog.urls')) 16 | """ 17 | from django.contrib import admin 18 | from django.urls import path, include 19 | 20 | urlpatterns = [ 21 | path('admin/', admin.site.urls), 22 | path('api/', include('netdis.core.urls')), 23 | path('auth/', include('netdis.authentication.urls')) 24 | ] 25 | -------------------------------------------------------------------------------- /backend/netdis/wsgi.py: -------------------------------------------------------------------------------- 1 | """ 2 | WSGI config for netdis project. 3 | 4 | It exposes the WSGI callable as a module-level variable named ``application``. 
5 | 6 | For more information on this file, see 7 | https://docs.djangoproject.com/en/4.2/howto/deployment/wsgi/ 8 | """ 9 | 10 | import os 11 | 12 | from django.core.wsgi import get_wsgi_application 13 | 14 | os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'netdis.settings') 15 | 16 | application = get_wsgi_application() 17 | -------------------------------------------------------------------------------- /backend/prodsupervisord.conf: -------------------------------------------------------------------------------- 1 | [supervisord] 2 | nodaemon=true 3 | 4 | [program:django] 5 | environment = PYTHONUNBUFFERED=1 6 | command=sh -c "python manage.py flush --noinput && python manage.py makemigrations && python manage.py migrate && gunicorn -b 0.0.0.0:8000 netdis.wsgi" 7 | autorestart=true 8 | stdout_logfile=/dev/stdout 9 | stderr_logfile=/dev/stderr 10 | stdout_logfile_maxbytes = 0 11 | stderr_logfile_maxbytes = 0 12 | 13 | [program:celery] 14 | command=celery -A netdis worker -l info --concurrency=3 15 | autorestart=true 16 | stdout_logfile=/dev/stdout 17 | stderr_logfile=/dev/stderr 18 | stdout_logfile_maxbytes = 0 19 | stderr_logfile_maxbytes = 0 20 | 21 | [program:celery_beat] 22 | command=celery -A netdis beat -l INFO 23 | autorestart=true 24 | stdout_logfile=/dev/stdout 25 | stderr_logfile=/dev/stderr 26 | stdout_logfile_maxbytes = 0 27 | stderr_logfile_maxbytes = 0 -------------------------------------------------------------------------------- /backend/requirements.txt: -------------------------------------------------------------------------------- 1 | amqp==5.2.0 2 | asgiref==3.8.1 3 | billiard==4.2.0 4 | celery==5.4.0 5 | click==8.1.7 6 | click-didyoumean==0.3.1 7 | click-plugins==1.1.1 8 | click-repl==0.3.0 9 | Django==5.0.7 10 | django-cors-headers==4.4.0 11 | django-crontab==0.7.1 12 | djangorestframework==3.15.2 13 | djangorestframework-simplejwt==5.3.1 14 | gunicorn==23.0.0 15 | JPype1==1.5.0 16 | kombu==5.3.7 17 | packaging==24.1 18 | 
prompt_toolkit==3.0.47 19 | psycopg2-binary==2.9.9 20 | pyhidra==1.2.0 21 | PyJWT==2.8.0 22 | python-dateutil==2.9.0.post0 23 | redis==5.0.7 24 | six==1.16.0 25 | sqlparse==0.5.1 26 | tzdata==2024.1 27 | vine==5.1.0 28 | wcwidth==0.2.13 29 | -------------------------------------------------------------------------------- /docker-compose-dev.yml: -------------------------------------------------------------------------------- 1 | services: 2 | frontend: 3 | volumes: 4 | - ./frontend:/app 5 | - /app/node_modules 6 | command: sh -c "npm install && npm run dev -- --host" 7 | backend: 8 | env_file: 9 | - ./.env.dev 10 | ports: 11 | - "8000:8000" -------------------------------------------------------------------------------- /docker-compose-maintenance.yml: -------------------------------------------------------------------------------- 1 | services: 2 | nginx: 3 | image: nginx:latest 4 | ports: 5 | - "80:80" 6 | volumes: 7 | - ./maintenance/:/usr/share/nginx/html:ro -------------------------------------------------------------------------------- /docker-compose-prod.yml: -------------------------------------------------------------------------------- 1 | services: 2 | frontend: 3 | build: 4 | context: ./frontend 5 | dockerfile: Dockerfile.prod 6 | command: sh -c "npm install && npm run build" 7 | volumes: 8 | - ./frontend:/app 9 | networks: 10 | - ng 11 | backend: 12 | build: 13 | context: ./backend 14 | dockerfile: Dockerfile.prod 15 | env_file: 16 | - ./.env.prod 17 | expose: 18 | - "8000" 19 | networks: 20 | - ng 21 | 22 | redis: 23 | volumes: 24 | - redis-data:/data 25 | networks: 26 | - ng 27 | 28 | nginx: 29 | image: nginx:latest 30 | ports: 31 | - "80:80" 32 | volumes: 33 | - ./nginxprod.conf:/etc/nginx/conf.d/default.conf 34 | - ./frontend/dist:/usr/share/nginx/html:ro 35 | networks: 36 | - ng 37 | depends_on: 38 | - frontend 39 | 40 | db: 41 | volumes: 42 | - postgres_data:/var/lib/postgresql/data/ 43 | networks: 44 | - ng 45 | 46 | volumes: 47 | redis-data: 
48 | postgres_data: 49 | 50 | networks: 51 | ng: 52 | driver: bridge 53 | -------------------------------------------------------------------------------- /docker-compose.yml: -------------------------------------------------------------------------------- 1 | services: 2 | frontend: 3 | build: 4 | context: ./frontend 5 | ports: 6 | - "5173:5173" 7 | 8 | backend: 9 | build: 10 | context: ./backend 11 | depends_on: 12 | - redis 13 | 14 | redis: 15 | image: "redis:alpine" 16 | command: sh -c "redis-server" 17 | ports: 18 | - "6379:6379" 19 | 20 | db: 21 | image: postgres:15 22 | environment: 23 | - POSTGRES_USER=postgres 24 | - POSTGRES_PASSWORD=postgres 25 | - POSTGRES_DB=netdis -------------------------------------------------------------------------------- /frontend/.env.vite.dev: -------------------------------------------------------------------------------- 1 | VITE_BACKEND=http://localhost:8000/ -------------------------------------------------------------------------------- /frontend/.env.vite.prod: -------------------------------------------------------------------------------- 1 | VITE_BACKEND=https://netdis.org/ -------------------------------------------------------------------------------- /frontend/.eslintrc.cjs: -------------------------------------------------------------------------------- 1 | module.exports = { 2 | root: true, 3 | env: { browser: true, es2020: true }, 4 | extends: [ 5 | 'eslint:recommended', 6 | 'plugin:react/recommended', 7 | 'plugin:react/jsx-runtime', 8 | 'plugin:react-hooks/recommended', 9 | ], 10 | ignorePatterns: ['dist', '.eslintrc.cjs'], 11 | parserOptions: { ecmaVersion: 'latest', sourceType: 'module' }, 12 | settings: { react: { version: '18.2' } }, 13 | plugins: ['react-refresh'], 14 | rules: { 15 | 'react/jsx-no-target-blank': 'off', 16 | 'react-refresh/only-export-components': [ 17 | 'warn', 18 | { allowConstantExport: true }, 19 | ], 20 | }, 21 | } 22 | 
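The compose files above split the stack into a shared base (`docker-compose.yml`) plus dev, prod, and maintenance variants. A plausible way to combine them, assuming these filenames and relying on Compose's standard multi-file merge (later `-f` files override earlier ones), would be:

```shell
# Development: base services merged with the dev overlay
# (bind mounts, .env.dev, backend exposed on port 8000)
docker compose -f docker-compose.yml -f docker-compose-dev.yml up --build

# Production: base services merged with the prod overlay
# (Dockerfile.prod builds, nginx in front, named volumes, .env.prod)
docker compose -f docker-compose.yml -f docker-compose-prod.yml up -d --build

# Maintenance page only: a standalone nginx serving ./maintenance on port 80
docker compose -f docker-compose-maintenance.yml up -d
```

These invocations are a sketch, not taken from the repository's own docs; the maintenance file is self-contained, so it is run without the base file.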
-------------------------------------------------------------------------------- /frontend/.gitignore: -------------------------------------------------------------------------------- 1 | # Logs 2 | logs 3 | *.log 4 | npm-debug.log* 5 | yarn-debug.log* 6 | yarn-error.log* 7 | pnpm-debug.log* 8 | lerna-debug.log* 9 | 10 | node_modules 11 | dist 12 | dist-ssr 13 | *.local 14 | 15 | # Editor directories and files 16 | .vscode/* 17 | !.vscode/extensions.json 18 | .idea 19 | .DS_Store 20 | *.suo 21 | *.ntvs* 22 | *.njsproj 23 | *.sln 24 | *.sw? 25 | -------------------------------------------------------------------------------- /frontend/Dockerfile: -------------------------------------------------------------------------------- 1 | FROM node:20-alpine 2 | 3 | WORKDIR /app 4 | # ENV VITE_BACKEND=http://localhost:8000/ 5 | COPY . . 6 | COPY entrypoint.dev.sh /entrypoint.sh 7 | RUN chmod +x /entrypoint.sh 8 | 9 | ENTRYPOINT ["/entrypoint.sh"] -------------------------------------------------------------------------------- /frontend/Dockerfile.prod: -------------------------------------------------------------------------------- 1 | FROM node:20-alpine 2 | 3 | WORKDIR /app 4 | # ENV VITE_BACKEND=http://localhost:8000/ 5 | COPY . . 6 | COPY entrypoint.prod.sh /entrypoint.sh 7 | RUN chmod +x /entrypoint.sh 8 | 9 | ENTRYPOINT ["/entrypoint.sh"] -------------------------------------------------------------------------------- /frontend/README.md: -------------------------------------------------------------------------------- 1 | # React + Vite 2 | 3 | This template provides a minimal setup to get React working in Vite with HMR and some ESLint rules. 
4 | 5 | Currently, two official plugins are available: 6 | 7 | - [@vitejs/plugin-react](https://github.com/vitejs/vite-plugin-react/blob/main/packages/plugin-react/README.md) uses [Babel](https://babeljs.io/) for Fast Refresh 8 | - [@vitejs/plugin-react-swc](https://github.com/vitejs/vite-plugin-react-swc) uses [SWC](https://swc.rs/) for Fast Refresh 9 | -------------------------------------------------------------------------------- /frontend/entrypoint.dev.sh: -------------------------------------------------------------------------------- 1 | #!/bin/sh 2 | 3 | cp /app/.env.vite.dev /app/.env 4 | exec "$@" -------------------------------------------------------------------------------- /frontend/entrypoint.prod.sh: -------------------------------------------------------------------------------- 1 | #!/bin/sh 2 | 3 | cp /app/.env.vite.prod /app/.env 4 | exec "$@" -------------------------------------------------------------------------------- /frontend/index.html: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | 7 | netdis 8 | 9 | 10 |
11 | 12 | 13 | 14 | -------------------------------------------------------------------------------- /frontend/package.json: -------------------------------------------------------------------------------- 1 | { 2 | "name": "frontend", 3 | "private": true, 4 | "version": "0.0.0", 5 | "type": "module", 6 | "scripts": { 7 | "dev": "vite", 8 | "build": "vite build", 9 | "lint": "eslint . --ext js,jsx --report-unused-disable-directives --max-warnings 0", 10 | "preview": "vite preview" 11 | }, 12 | "dependencies": { 13 | "@dagrejs/dagre": "^1.1.3", 14 | "@xyflow/react": "^12.0.3", 15 | "axios": "^1.7.2", 16 | "elkjs": "^0.9.3", 17 | "react": "^18.3.1", 18 | "react-dom": "^18.3.1", 19 | "react-grid-layout": "^1.4.4", 20 | "react-resizable": "^3.0.5", 21 | "react-router-dom": "^6.25.1", 22 | "react-syntax-highlighter": "^15.5.0" 23 | }, 24 | "devDependencies": { 25 | "@types/react": "^18.3.3", 26 | "@types/react-dom": "^18.3.0", 27 | "@vitejs/plugin-react": "^4.3.1", 28 | "autoprefixer": "^10.4.19", 29 | "eslint": "^8.57.0", 30 | "eslint-plugin-react": "^7.34.3", 31 | "eslint-plugin-react-hooks": "^4.6.2", 32 | "eslint-plugin-react-refresh": "^0.4.7", 33 | "postcss": "^8.4.40", 34 | "tailwindcss": "^3.4.7", 35 | "vite": "^5.4.0" 36 | } 37 | } 38 | -------------------------------------------------------------------------------- /frontend/postcss.config.js: -------------------------------------------------------------------------------- 1 | export default { 2 | plugins: { 3 | tailwindcss: {}, 4 | autoprefixer: {}, 5 | }, 6 | } 7 | -------------------------------------------------------------------------------- /frontend/public/django_logo.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anthonyshibitov/netdis/345ea660e0504e4cae44877f10549aabd4eec1dc/frontend/public/django_logo.png -------------------------------------------------------------------------------- /frontend/public/favicon.ico: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/anthonyshibitov/netdis/345ea660e0504e4cae44877f10549aabd4eec1dc/frontend/public/favicon.ico -------------------------------------------------------------------------------- /frontend/public/ghidra_logo.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anthonyshibitov/netdis/345ea660e0504e4cae44877f10549aabd4eec1dc/frontend/public/ghidra_logo.png -------------------------------------------------------------------------------- /frontend/public/graph.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anthonyshibitov/netdis/345ea660e0504e4cae44877f10549aabd4eec1dc/frontend/public/graph.png -------------------------------------------------------------------------------- /frontend/public/graph_skew_dark.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anthonyshibitov/netdis/345ea660e0504e4cae44877f10549aabd4eec1dc/frontend/public/graph_skew_dark.png -------------------------------------------------------------------------------- /frontend/public/vite.svg: -------------------------------------------------------------------------------- 1 | -------------------------------------------------------------------------------- /frontend/src/assets/react.svg: -------------------------------------------------------------------------------- 1 | -------------------------------------------------------------------------------- /frontend/src/components/CodeNode.css: -------------------------------------------------------------------------------- 1 | .code-node { 2 | padding: 0.5rem; 3 | font-family: monospace; 4 | background: white; 5 | border: 2px solid black; 6 | } 7 | 8 | .codenode-line { 9 | display: flex; 10 | white-space: nowrap; 11 | } 12 | 13 | .codenode-addr 
{ 14 | color: darkblue; 15 | width: 100px; /* Adjust width as needed */ 16 | } 17 | 18 | .codenode-op { 19 | color: blue; 20 | width: 50px; /* Adjust width as needed */ 21 | text-align: left; 22 | } 23 | 24 | .codenode-data { 25 | color: darkred; 26 | cursor: pointer; 27 | flex-grow: 1; 28 | } 29 | -------------------------------------------------------------------------------- /frontend/src/components/CodeNode.jsx: -------------------------------------------------------------------------------- 1 | import React, { useRef, useEffect, useState, useContext } from 'react'; 2 | import { Handle, Position } from '@xyflow/react'; 3 | import './CodeNode.css'; 4 | import { NodeSizeContext } from '../context/NodeSizeContext.jsx'; 5 | 6 | const CodeNode = ({ data }) => { 7 | const nodeRef = useRef(null); 8 | const { updateNodeSize } = useContext(NodeSizeContext); 9 | 10 | useEffect(() => { 11 | if(nodeRef.current){ 12 | const { offsetWidth, offsetHeight } = nodeRef.current; 13 | updateNodeSize(data.label, { width: offsetWidth, height: offsetHeight }) 14 | } 15 | }, [data]) 16 | 17 | return ( 18 |
19 |
{data.label}
20 |
{data.text.map((line, index) => { 21 | return ( 22 |
23 | {line.addr}: 24 | {line.op} 25 |
{line.data}
26 |
27 | )})} 28 |
29 | 30 | 31 |
32 | ); 33 | }; 34 | 35 | export default CodeNode; -------------------------------------------------------------------------------- /frontend/src/components/Decompilation.jsx: -------------------------------------------------------------------------------- 1 | import { useState, useContext, useEffect, useRef } from "react"; 2 | import axios from "axios"; 3 | import { AnalysisContext } from "../context/AnalysisContext"; 4 | import SyntaxHighlighter from 'react-syntax-highlighter'; 5 | import { a11yLight } from 'react-syntax-highlighter/dist/esm/styles/hljs'; 6 | 7 | export default function Decompilation() { 8 | const [analysisContext, setAnalysisContext] = useContext(AnalysisContext); 9 | 10 | if(analysisContext.decomp == ''){ 11 | return ( 12 |
No function selected
13 | ) 14 | } 15 | 16 | // useEffect(() => { 17 | // console.log(analysisContext.decomp) 18 | // }, [analysisContext.decomp]) 19 | 20 | return ( 21 |
22 |
23 | 24 | {analysisContext.decomp} 25 | 26 |
27 |
28 | 29 | ) 30 | } -------------------------------------------------------------------------------- /frontend/src/components/FunctionList.jsx: -------------------------------------------------------------------------------- 1 | import { useState, useContext, useEffect, useRef } from "react"; 2 | import axios from "axios"; 3 | import { AnalysisContext } from "../context/AnalysisContext"; 4 | 5 | 6 | export default function FunctionList(props) { 7 | const funcs = props.functionListProps.funcs; 8 | const file_id = props.functionListProps.file_id; 9 | const [dis, setDis] = useState([]); 10 | const [analysisContext, setAnalysisContext] = useContext(AnalysisContext); 11 | 12 | useEffect(() => { 13 | setAnalysisContext({...analysisContext, allFunctions: props.funcs, funcHistory: []}) 14 | }, []); 15 | 16 | useEffect(() => { 17 | console.log(analysisContext.funcHistory) 18 | }, [analysisContext.funcHistory]); 19 | 20 | useEffect(() => { 21 | if (analysisContext.selectedFunction !== null) { 22 | 23 | const index = funcs.findIndex(f => f.id === analysisContext.selectedFunction); 24 | if (index !== -1 && scrollRefs.current[index]) { 25 | scrollRefs.current[index].scrollIntoView({ behavior: 'smooth'}); 26 | } 27 | //setAnalysisContext({...analysisContext, funcBanner: `${funcs[index].name}`}) 28 | } 29 | }, [analysisContext.selectedFunction]); 30 | 31 | function onBackClick(){ 32 | if(analysisContext.funcHistory.length > 1){ 33 | const funcHistory = analysisContext.funcHistory.slice(0, -1); 34 | setAnalysisContext({...analysisContext, selectedFunction: funcHistory[funcHistory.length - 1], funcHistory: funcHistory}) 35 | } 36 | } 37 | 38 | const scrollRefs = useRef([]); 39 | 40 | async function onFunctionClick(id){ 41 | if(analysisContext.funcHistory){ 42 | const newFuncHistory = [...analysisContext.funcHistory, id] 43 | setAnalysisContext({...analysisContext, funcHistory: newFuncHistory, selectedFunction: id}) 44 | } else { 45 | setAnalysisContext({...analysisContext, 
funcHistory: [id], selectedFunction: id}) 46 | } 47 | } 48 | 49 | function cfg_req(func_id){ 50 | const url = import.meta.env.VITE_BACKEND + 'api/func_graph/'; 51 | axios.post(url, { "file_id": file_id, "function_id": func_id }) 52 | .then(response => { 53 | const task_id = response.data.task_id; 54 | const timer = setInterval(() => { 55 | const url = import.meta.env.VITE_BACKEND + "api/task/" + task_id; 56 | const resp = axios.get(url).then((response => { 57 | console.log(response); 58 | if(response.data.status == "DONE" && response.data.task_type == "cfg_analysis"){ 59 | clearInterval(timer); 60 | const graph = response.data.result.json_result; 61 | setAnalysisContext({...analysisContext, graph: graph, graphSet: true}) 62 | } 63 | if(response.data.status == "DONE" && response.data.task_type == "error"){ 64 | clearInterval(timer); 65 | let result = response.data.result.error; 66 | result = result.replace(/'/g, '"'); 67 | result = JSON.parse(result) 68 | console.log(`ERROR: ${result.error}`) 69 | } 70 | })) 71 | }, 1000); 72 | }) 73 | } 74 | 75 | function decomp_req(func_id){ 76 | const url = import.meta.env.VITE_BACKEND + 'api/decomp_func/'; 77 | axios.post(url, { "file_id": file_id, "function_id": func_id }) 78 | .then(response => { 79 | const task_id = response.data.task_id; 80 | const timer = setInterval(() => { 81 | const url = import.meta.env.VITE_BACKEND + "api/task/" + task_id; 82 | const resp = axios.get(url).then((response => { 83 | console.log(response); 84 | if(response.data.status == "DONE" && response.data.task_type == "decomp_func"){ 85 | clearInterval(timer); 86 | const decomp_result = response.data.result.decomp_result; 87 | setAnalysisContext({...analysisContext, decomp: decomp_result}) 88 | } 89 | })) 90 | }, 1000); 91 | }) 92 | } 93 | 94 | return ( 95 |
96 |
97 |
98 | {funcs.map((f,index) => { 99 | return ( 100 |
scrollRefs.current[index] = el} className={"function-item " + (analysisContext.selectedFunction == f.id ? 'function-highlight' : '')} onClick={() => onFunctionClick(f.id)}> 101 | {f.addr}:{f.name} 102 |   103 |   104 |
105 | )})} 106 |
107 |
108 | ) 109 | } -------------------------------------------------------------------------------- /frontend/src/components/Graph.jsx: -------------------------------------------------------------------------------- 1 | import { useState, useContext, useEffect } from "react"; 2 | import { AnalysisContext } from "../context/AnalysisContext"; 3 | import { 4 | ReactFlow, 5 | ReactFlowProvider, 6 | Controls, 7 | Background, 8 | useNodesState, 9 | useEdgesState, 10 | MarkerType 11 | } from '@xyflow/react'; 12 | import '@xyflow/react/dist/style.css'; 13 | import CodeNode from "./CodeNode"; 14 | import axios from "axios"; 15 | import ELK from 'elkjs/lib/elk.bundled.js'; 16 | import { NodeSizeContext, NodeSizeProvider } from "../context/NodeSizeContext.jsx"; 17 | 18 | const elk = new ELK(); 19 | 20 | const nodeWidth = 400; 21 | const nodeHeight = 600; 22 | 23 | const nodeTypes = { 24 | codeNode: CodeNode 25 | } 26 | 27 | const layoutGraph = async (nodes, edges, nodeSizes) => { 28 | console.log(" THIS SHOULD HAPPEN LAST! 
layoutGraph node sizes") 29 | console.log("HERES NODES") 30 | console.log(nodes) 31 | console.log("HERES EDGES") 32 | console.log(edges) 33 | 34 | const nodeIds = new Set(nodes.map(node => node.id)); 35 | 36 | // Remove edges where the destination is a blank node 37 | const validEdges = edges.filter(edge => nodeIds.has(edge.source) && nodeIds.has(edge.target)); 38 | 39 | const graph = { 40 | id: 'root', 41 | layoutOptions: { 42 | 'elk.algorithm': 'layered', 43 | 'elk.direction': 'DOWN', 44 | 'elk.topdownLayout': true, 45 | 'elk.topdown.nodeType': 'ROOT_NODE', 46 | }, 47 | children: nodes.map(node => ({ 48 | id: node.id, 49 | width: nodeSizes[node.id]?.width + 30 || nodeWidth, 50 | height: nodeSizes[node.id]?.height + 30 || nodeHeight 51 | })), 52 | edges: validEdges.map(edge => ({ 53 | id: edge.id, 54 | sources: [edge.source], 55 | targets: [edge.target] 56 | })) 57 | }; 58 | const layout = await elk.layout(graph); 59 | const positionedNodes = layout.children.map(node => ({ 60 | ...nodes.find(n => n.id === node.id), 61 | position: { 62 | x: node.x, 63 | y: node.y, 64 | } 65 | })); 66 | 67 | return { nodes: positionedNodes, edges }; 68 | }; 69 | 70 | const convertToReactFlowFormat = async (graph, nodeSizes) => { 71 | const nodes = graph.nodes.map(node => ({ 72 | id: node.id.toString(), 73 | position: {x: 0, y: 0}, 74 | data: { label: `Node ${node.id}`, text: '' }, 75 | type: 'codeNode', 76 | draggable: true, 77 | nodesConnectable: false, 78 | })); 79 | 80 | const edges = graph.edges.map((edge, index) => { 81 | //Find edge where source is the same, but target is different 82 | let color = 'black'; 83 | if(edge.type == "conditional"){ 84 | color = 'green'; 85 | for(let i = 0; i < graph.edges.length; i++){ 86 | if(graph.edges[i].src == edge.src && graph.edges[i].dst != edge.dst){ 87 | graph.edges[i].type = "fallthrough" 88 | } 89 | } 90 | } 91 | if(edge.type == "fallthrough"){ 92 | color = 'red'; 93 | } 94 | return { 95 | id: `edge-${index}`, 96 | source: 
edge.src.toString(), 97 | target: edge.dst.toString(), 98 | type: 'smoothstep', 99 | markerEnd: { 100 | type: MarkerType.ArrowClosed, 101 | color: color, 102 | }, 103 | style: { stroke: color, strokeWidth: 2 } 104 | } 105 | }); 106 | 107 | //return await layoutGraph(nodes, edges, nodeSizes); 108 | return {nodes, edges}; 109 | }; 110 | 111 | 112 | export default function Graph() { 113 | const [analysisContext] = useContext(AnalysisContext); 114 | const {nodeSizes} = useContext(NodeSizeContext); 115 | const [graph, setGraph] = useState(null); 116 | const [nodes, setNodes, onNodesChange] = useNodesState([]); 117 | const [edges, setEdges, onEdgesChange] = useEdgesState([]); 118 | const [error, setError] = useState(); 119 | 120 | useEffect(() => { 121 | if (analysisContext.graphSet) { 122 | convertToReactFlowFormat(analysisContext.graph, nodeSizes).then(convertedGraph => { 123 | setGraph(convertedGraph); 124 | setEdges(convertedGraph.edges); 125 | 126 | const disasmRequests = convertedGraph.nodes.map(node => 127 | axios.post(`${import.meta.env.VITE_BACKEND}api/disasms/`, { "block_id": node.id }) 128 | ); 129 | 130 | Promise.all(disasmRequests) 131 | .then(responses => { 132 | let updatedNodes = convertedGraph.nodes.map((node, index) => { 133 | const disasmData = responses[index].data; 134 | let text = []; 135 | for (let j = 0; j < disasmData.length; j++) { 136 | text.push({ "addr": disasmData[j].addr, "op": disasmData[j].op, "data": disasmData[j].data }); 137 | } 138 | return { 139 | ...node, 140 | data: { label: node.id, text } 141 | }; 142 | }); 143 | updatedNodes = updatedNodes.filter(node => node.data.text.length > 0); 144 | setNodes(updatedNodes); 145 | setError(); 146 | }).then(() => { 147 | // console.log("AFTER FIRST UPDATE") 148 | }) 149 | .catch(error => { 150 | console.error("Error fetching disassemblies:", error); 151 | setError(`Error fetching disassemblies: ${error}`) 152 | }); 153 | }); 154 | } 155 | }, [analysisContext.graph]); 156 | 157 | useEffect(() 
=> { 158 | async function run() { 159 | if(Object.keys(nodeSizes).length != 0){ 160 | const result = await layoutGraph(nodes, edges, nodeSizes); 161 | setNodes(result.nodes); 162 | setEdges(result.edges); 163 | } 164 | } 165 | run() 166 | }, [nodeSizes]) 167 | 168 | if (!graph) { 169 | return ( 170 |
171 |
172 |
No function selected
173 |
174 |
175 | ); 176 | } 177 | 178 | return ( 179 |
180 | {error && ({error})} 181 | 182 | 192 | 193 | 194 | 195 | 196 |
197 | ); 198 | } 199 | -------------------------------------------------------------------------------- /frontend/src/components/Listing.jsx: -------------------------------------------------------------------------------- 1 | import { useContext, useState, useEffect } from "react"; 2 | import { AnalysisContext } from "../context/AnalysisContext"; 3 | import axios from "axios"; 4 | // import './Listing.css' 5 | 6 | export default function Listing() { 7 | const [analysisContext, setAnalysisContext] = useContext(AnalysisContext); 8 | const [blocks, setBlocks] = useState([]); 9 | const [error, setError] = useState([]); 10 | 11 | function addressClick(address){ 12 | const ref = internalFuncRef(address); 13 | if(ref){ 14 | const i = ref.index; 15 | if(analysisContext.funcHistory){ 16 | const newFuncHistory = [...analysisContext.funcHistory, analysisContext.allFunctions[i].id] 17 | setAnalysisContext({...analysisContext, funcHistory: newFuncHistory, selectedFunction: analysisContext.allFunctions[i].id}) 18 | } else { 19 | setAnalysisContext({...analysisContext, funcHistory: [analysisContext.allFunctions[i].id], selectedFunction: analysisContext.allFunctions[i].id}) 20 | } 21 | } 22 | } 23 | 24 | function internalFuncRef(address){ 25 | const regex = /^[0-9A-Fa-f]+h$/; 26 | if(address.match(regex)){ 27 | const numFunctions = analysisContext.allFunctions.length; 28 | const trimmedAddress = address.replace(/h$/, ''); 29 | for(let i = 0; i < numFunctions; i++ ){ 30 | let currentFuncAddr = analysisContext.allFunctions[i].addr; 31 | currentFuncAddr = currentFuncAddr.replace(/^0x/, ''); 32 | if(currentFuncAddr == trimmedAddress){ 33 | return {index: i, name: analysisContext.allFunctions[i].name}; 34 | } 35 | } 36 | } 37 | return false; 38 | } 39 | 40 | useEffect(() => { 41 | if (analysisContext.selectedFunction != null) { 42 | const url = import.meta.env.VITE_BACKEND + 'api/blocks/'; 43 | axios.post(url, { "function_id": analysisContext.selectedFunction }).then(response => { 44 | const fetchedBlocks =
response.data; 45 | 46 | const disasmPromises = fetchedBlocks.map(block => { 47 | const disasmUrl = import.meta.env.VITE_BACKEND + 'api/disasms/'; 48 | return axios.post(disasmUrl, { "block_id": block.id }).then(disasmResponse => { 49 | return { 50 | blockId: block.id, 51 | disassembly: disasmResponse.data 52 | }; 53 | }).catch(error => { 54 | console.log("Error fetching disasms:", error); 55 | return { 56 | blockId: block.id, 57 | disassembly: [] 58 | }; 59 | }); 60 | }); 61 | 62 | Promise.all(disasmPromises).then(disasmResults => { 63 | const blocksWithDisasm = fetchedBlocks.map(block => { 64 | const disasmForBlock = disasmResults.find(disasm => disasm.blockId === block.id); 65 | return { 66 | ...block, 67 | disassembly: disasmForBlock ? disasmForBlock.disassembly : [] 68 | }; 69 | }); 70 | setBlocks(blocksWithDisasm); 71 | setError(); 72 | }); 73 | }).catch(error => { 74 | setError(`Error fetching blocks: ${error}`); 75 | }); 76 | } 77 | }, [analysisContext.selectedFunction]); 78 | 79 | return ( 80 |
81 |
82 | {error && (error)} 83 | {blocks.map((block, key) => ( 84 |
85 |
LABEL {block.addr}
86 | {block.disassembly.map((d, dkey) => ( 87 |
88 | 89 | {d.addr}:  90 | 91 | 92 | {d.op}  93 | 94 | addressClick(d.data)} className="text-red-800"> 95 | {d.data} 96 | 97 | {internalFuncRef(d.data) ? `[XREF=>${internalFuncRef(d.data).name}]` : ''} 98 |
99 | ))} 100 |
101 | ))} 102 |
103 |
104 | ); 105 | } -------------------------------------------------------------------------------- /frontend/src/components/MenuBarItem.jsx: -------------------------------------------------------------------------------- 1 | import { useEffect, useRef, useState } from "react"; 2 | 3 | export default function MenuBarItem(props) { 4 | const name = props.name; 5 | const subMenu = props.subMenu; 6 | const menuRef = useRef(); 7 | const [isOpen, setIsOpen] = useState(false); 8 | 9 | function clickHandler(){ 10 | setIsOpen(true); 11 | } 12 | 13 | useEffect(() => { 14 | const handleClickOutside = (event) => { 15 | if (menuRef.current && !menuRef.current.contains(event.target)) { 16 | setIsOpen(false); 17 | } 18 | }; 19 | 20 | document.addEventListener('mousedown', handleClickOutside); 21 | 22 | return () => { 23 | document.removeEventListener('mousedown', handleClickOutside); 24 | }; 25 | }, []) 26 | 27 | return ( 28 |
29 |
{name}
30 | {isOpen && ( 31 |
32 | {Object.entries(subMenu).map(([name, func]) => { 33 | return ( 34 |
func()} key={name}>{name}
35 | ) 36 | })} 37 |
38 | )} 39 |
40 | ) 41 | } -------------------------------------------------------------------------------- /frontend/src/components/Menubar.jsx: -------------------------------------------------------------------------------- 1 | import { useContext, useState } from "react"; 2 | import MenuBarItem from "./MenuBarItem"; 3 | import { info } from "autoprefixer"; 4 | import { MenuContext } from "../context/MenuContext"; 5 | import Upload from "./Upload" 6 | import { useNavigate } from "react-router-dom"; 7 | 8 | export default function MenuBar() { 9 | const navigate = useNavigate(); 10 | const [showModal, setShowModal] = useState(false); 11 | const [menuContext, setMenuContext] = useContext(MenuContext); 12 | 13 | function toggleModal(){ 14 | const currentValue = showModal; 15 | setShowModal(!currentValue); 16 | } 17 | 18 | const fileSubMenu = { 19 | "Open": function(){ 20 | toggleModal(); 21 | }, 22 | "Quit": function(){ 23 | navigate('/'); 24 | }, 25 | } 26 | 27 | // const optionsSubMenu = { 28 | // "Auto-analyze": "a", 29 | // "Highlight": "b", 30 | // } 31 | 32 | const windowsSubmenu = { 33 | "Disassembly": function(){ 34 | if(menuContext.disasmView){ 35 | setMenuContext({...menuContext, disasmView: false}) 36 | } else { 37 | setMenuContext({...menuContext, disasmView: true}) 38 | } 39 | }, 40 | "Decompilation": function(){ 41 | if(menuContext.decompView){ 42 | setMenuContext({...menuContext, decompView: false}) 43 | } else { 44 | setMenuContext({...menuContext, decompView: true}) 45 | } 46 | }, 47 | "Control Flow Graph": function(){ 48 | if(menuContext.cfgView){ 49 | setMenuContext({...menuContext, cfgView: false}) 50 | } else { 51 | setMenuContext({...menuContext, cfgView: true}) 52 | } 53 | }, 54 | "Function List": function(){ 55 | if(menuContext.functionView){ 56 | setMenuContext({...menuContext, functionView: false}) 57 | } else { 58 | setMenuContext({...menuContext, functionView: true}) 59 | } 60 | }, 61 | "Raw Hex Viewer": function(){ 62 | if(menuContext.rawView){ 63 | 
setMenuContext({...menuContext, rawView: false}) 64 | } else { 65 | setMenuContext({...menuContext, rawView: true}) 66 | } 67 | }, 68 | "Strings": function(){ 69 | if(menuContext.stringsView){ 70 | setMenuContext({...menuContext, stringsView: false}) 71 | } else { 72 | setMenuContext({...menuContext, stringsView: true}) 73 | } 74 | } 75 | } 76 | 77 | const helpSubmenu = { 78 | "About": function(){ 79 | const win = window.open('/info', '_blank'); 80 | win.focus(); 81 | }, 82 | "Github Repo": function(){ 83 | const win = window.open('https://github.com/anthonyshibitov/netdis', '_blank'); 84 | win.focus(); 85 | }, 86 | } 87 | 88 | return ( 89 |
90 |
netdis
91 | 94 | {/* */} 97 | 100 | 103 | {showModal && ( 104 |
105 |
106 | 107 | 113 |
114 |
115 | )} 116 |
117 | ) 118 | } -------------------------------------------------------------------------------- /frontend/src/components/NavBar.css: -------------------------------------------------------------------------------- 1 | .navbar-container { 2 | display: flex; 3 | flex-direction: row; 4 | padding: 2rem; 5 | justify-content: space-between; 6 | } -------------------------------------------------------------------------------- /frontend/src/components/NavBar.jsx: -------------------------------------------------------------------------------- 1 | import './NavBar.css'; 2 | import { Link } from "react-router-dom"; 3 | 4 | export default function NavBar({ onUploadClick }) { 5 | return ( 6 |
7 | 8 | netdis 9 | 10 | 18 |
19 | ) 20 | } -------------------------------------------------------------------------------- /frontend/src/components/RawHex.jsx: -------------------------------------------------------------------------------- 1 | import { useState, useEffect, useRef, useCallback } from "react"; 2 | import axios from "axios"; 3 | 4 | export default function RawHex(props) { 5 | const file_id = props.rawhexProps.file_id; 6 | const image_base = props.rawhexProps.image_base; 7 | const [address, setAddress] = useState(image_base); 8 | const [allData, setAllData] = useState({}); 9 | const [error, setError] = useState(); 10 | const [isLoading, setIsLoading] = useState(false); 11 | const [hasMore, setHasMore] = useState(true); 12 | const containerRef = useRef(null); 13 | const observerRef = useRef(null); 14 | 15 | const FETCH_SIZE = 512; // Set fetch size to 512 bytes 16 | 17 | const lastElementRef = useCallback(node => { 18 | if (isLoading) return; 19 | if (observerRef.current) observerRef.current.disconnect(); 20 | observerRef.current = new IntersectionObserver(entries => { 21 | if (entries[0].isIntersecting && hasMore) { 22 | const lastAddress = Object.keys(allData).pop(); 23 | if (lastAddress) { 24 | fetchData(lastAddress); 25 | } 26 | } 27 | }); 28 | if (node) observerRef.current.observe(node); 29 | }, [isLoading, hasMore, allData]); 30 | 31 | useEffect(() => { 32 | fetchData(image_base); 33 | }, []); 34 | 35 | const sendAddressRequest = (e) => { 36 | e.preventDefault(); 37 | const newAddress = parseInt(address, 16); 38 | if (isNaN(newAddress) || newAddress < image_base) { 39 | setError("Invalid address. 
Please enter a valid hexadecimal address at or above the image base."); 40 | 41 | return; 42 | } 43 | 44 | setHasMore(true); 45 | setError(); 46 | setAllData({}); // Clear existing data 47 | fetchData(newAddress); 48 | }; 49 | 50 | const fetchData = (startAddress) => { 51 | if (isLoading || !hasMore) { 52 | 53 | return; 54 | } 55 | setIsLoading(true); 56 | const url = import.meta.env.VITE_BACKEND + 'api/rawhex/'; 57 | axios.post(url, { 58 | "file_id": file_id, 59 | "address": startAddress.toString(16), 60 | "length": FETCH_SIZE.toString() 61 | }) 62 | .then(response => { 63 | const task_id = response.data.task_id; 64 | const timer = setInterval(() => { 65 | const url = import.meta.env.VITE_BACKEND + "api/task/" + task_id; 66 | axios.get(url).then((response) => { 67 | if(response.data.status == "DONE" && response.data.task_type == "raw_request"){ 68 | let result = JSON.parse(response.data.result.rawhex.replace(/'/g, '"')); 69 | setAllData(prevData => ({...prevData, ...result})); 70 | setError(null); 71 | setIsLoading(false); 72 | setHasMore(Object.keys(result).length === FETCH_SIZE); 73 | clearInterval(timer); 74 | } 75 | if(response.data.status == "DONE" && response.data.task_type == "error"){ 76 | let result = JSON.parse(response.data.result.error.replace(/'/g, '"')); 77 | console.log(result.error) 78 | setError(result.error); 79 | setIsLoading(false); 80 | setHasMore(false); 81 | clearInterval(timer); 82 | } 83 | }); 84 | }, 1000); 85 | }) 86 | .catch(error => { 87 | console.error("Error fetching data:", error); 88 | setError("Failed to fetch data. Please try again."); 89 | setIsLoading(false); 90 | }); 91 | }; 92 | 93 | function handleChange(event){ 94 | const newAddress = event.target.value; 95 | setAddress(newAddress); 96 | } 97 | 98 | return ( 99 |
100 |
101 | Addr 102 | setAddress(e.target.value)} 107 | onFocus={e => e.target.select()} 108 | /> 109 | 115 |
116 |
117 | {Object.entries(allData).map(([address, value], i, arr) => ( 118 | 119 | {i % 8 === 0 && {address}: } 120 | {value}  121 | 122 | ))} 123 | {isLoading &&
Loading...
} 124 | {error &&
{error}
} 125 |
126 |
127 | ) 128 | } -------------------------------------------------------------------------------- /frontend/src/components/Strings.jsx: -------------------------------------------------------------------------------- 1 | import { useEffect, useState } from "react" 2 | import axios from "axios"; 3 | import { jsx } from "react/jsx-runtime"; 4 | 5 | export default function Strings(props) { 6 | const file_id = props.stringsProps.file_id; 7 | const [error, setError] = useState(); 8 | const [data, setData] = useState(); 9 | 10 | useEffect(() => { 11 | const url = import.meta.env.VITE_BACKEND + 'api/strings/'; 12 | axios.post(url, { "file_id": file_id }) 13 | .then(response => { 14 | const task_id = response.data.task_id; 15 | const timer = setInterval(() => { 16 | const url = import.meta.env.VITE_BACKEND + "api/task/" + task_id; 17 | const resp = axios.get(url).then((response => { 18 | console.log(response.data) 19 | if(response.data.status == "DONE" && response.data.task_type == "strings"){ 20 | const result = response.data.result.strings 21 | const jsxResult = Object.entries(result).map(([key, index]) => { 22 | return ( 23 |
{key}: {result[key]}
24 | ) 25 | }) 26 | setData(jsxResult); 27 | setError(); 28 | clearInterval(timer); 29 | } 30 | if(response.data.status == "DONE" && response.data.task_type == "error"){ 31 | const result = response.data.result.error 32 | setError(result.error); 33 | setData(); 34 | clearInterval(timer); 35 | } 36 | })) 37 | }, 1000); 38 | }) 39 | }, []) 40 | 41 | return ( 42 |
43 |
{data}
44 |
{error}
45 |
46 | ) 47 | } -------------------------------------------------------------------------------- /frontend/src/components/Upload.jsx: -------------------------------------------------------------------------------- 1 | import React, { useState } from "react"; 2 | import { useNavigate } from "react-router-dom"; 3 | import axios from "axios"; 4 | import { useEffect } from "react"; 5 | 6 | export default function UploadPage(props) { 7 | const [status, setStatus] = useState(""); 8 | const navigate = useNavigate(); 9 | const callbackFunction = props.callback; 10 | const [loaders, setLoaders] = useState(); 11 | const [useLoaders, setUseLoaders] = useState(false); 12 | const [showLoaderCheckbox, setShowLoaderCheckbox] = useState(true); 13 | const [usingLoader, setUsingLoader] = useState(); 14 | const [selectedLoader, setSelectedLoader] = useState("NONE"); 15 | const [selectedLang, setSelectedLang] = useState("NONE"); 16 | const [selectedFile, setSelectedFile] = useState(); 17 | const [uploadText, setUploadText] = useState("Analyze") 18 | 19 | const handleFileChange = (e) => { 20 | setSelectedFile(e.target.files[0]) 21 | } 22 | 23 | const handleLoaderCheckBox = () => { 24 | setUseLoaders(!useLoaders); 25 | if(!useLoaders){ 26 | setUploadText("Detect available specs"); 27 | } 28 | if(useLoaders){ 29 | setUploadText("Analyze"); 30 | } 31 | } 32 | 33 | useEffect(() => { 34 | if (loaders) { 35 | const firstLoader = Object.entries(loaders[0])[0][1]; // Get the first loader value 36 | const firstLang = Object.entries(loaders[1])[0][0]; // Get the first language name 37 | setSelectedLoader(firstLoader); 38 | setSelectedLang(firstLang); 39 | } 40 | }, [loaders]); 41 | 42 | const handleSelectedLoader = (e) => { 43 | setSelectedLoader(e); 44 | console.log(e); 45 | console.log("selected loader", selectedLoader); 46 | } 47 | 48 | const handleSelectedLang = (e) => { 49 | setSelectedLang(e); 50 | console.log(e); 51 | console.log("selected lang", selectedLang); 52 | } 53 | 54 | function 
uploadFile(event){ 55 | // const file = event.target.files[0]; 56 | // event.preventDefault(); 57 | const formData = new FormData(); 58 | formData.append('file', selectedFile); 59 | formData.append('fileName', selectedFile.name); 60 | formData.append('loader', selectedLoader); 61 | formData.append('lang', selectedLang) 62 | let timeProcessing = 0; 63 | const config = { 64 | headers: { 65 | 'content-type': 'multipart/form-data' 66 | } 67 | } 68 | if(!useLoaders){ 69 | const url = import.meta.env.VITE_BACKEND + "api/binary_ingest/"; 70 | let state; 71 | setStatus("Uploading..."); 72 | axios.post(url, formData, config).then((response => { 73 | /* If project_id is null, it is still processing/queued! */ 74 | if(response.data.file_id != null){ 75 | console.log("RECEIVED FROM UPLOAD PROCESSING...") 76 | console.log(response.data) 77 | const file_id = response.data.file_id; 78 | const url = `${import.meta.env.VITE_BACKEND}api/funcs/`; 79 | const image_base = response.data.image_base; 80 | axios.post(url, {"file_id": response.data.file_id}).then(response => { 81 | callbackFunction ? 
callbackFunction(): ''; 82 | navigate("/analysis", {state: {funcs: response.data, file_id: file_id, image_base: image_base}, replace: true}); 83 | }) 84 | } else { 85 | setStatus("File queued..."); 86 | const task_id = response.data.id; 87 | const timer = setInterval(() => { 88 | const url = import.meta.env.VITE_BACKEND + "api/task/" + task_id; 89 | const resp = axios.get(url).then((response => { 90 | console.log(response); 91 | if(response.data.status == "DONE" && response.data.task_type == "file_upload"){ 92 | clearInterval(timer); 93 | console.log("RECEIVED FROM UPLOAD PROCESSING...") 94 | console.log(response.data) 95 | const file_id = response.data.result.file_id; 96 | const image_base = response.data.result.image_base; 97 | const url = `${import.meta.env.VITE_BACKEND}api/funcs/`; 98 | axios.post(url, {"file_id": response.data.result.file_id}).then(response => { 99 | console.log("funcs response") 100 | console.log(response.data) 101 | callbackFunction ? callbackFunction() : ''; 102 | navigate("/analysis", {state: {funcs: response.data, file_id: file_id, image_base: image_base}, replace: true}); 103 | }) 104 | } 105 | if(response.data.status == "PROCESSING"){ 106 | setStatus(`Processing. Time elapsed: ${timeProcessing}s`); 107 | timeProcessing += 1; 108 | } 109 | if(response.data.status == "DONE" && response.data.task_type == "error"){ 110 | let result = response.data.result.error; 111 | // Pesky single quotes! 
112 | result = result.replace(/'/g, '"'); 113 | result = JSON.parse(result) 114 | clearInterval(timer) 115 | setStatus(`ERROR: ${result.error}`) 116 | } 117 | })).catch(error => { 118 | console.log(`new error ${error}`); 119 | }) 120 | }, 1000); 121 | } 122 | })).catch(error => { 123 | console.log(error); 124 | // if(error.response.data.error){ 125 | // setStatus(`ERROR: ${error.response.data.error} - ${error.response.data.error_info}`) 126 | // return; 127 | // } 128 | setStatus(`ERROR: ${error}`); 129 | }) 130 | } else { 131 | const url = import.meta.env.VITE_BACKEND + "api/get_loaders/"; 132 | let state; 133 | setStatus("Uploading..."); 134 | axios.post(url, formData, config).then((response => { 135 | setStatus("Loading analyzers..."); 136 | const task_id = response.data.id; 137 | const timer = setInterval(() => { 138 | const url = import.meta.env.VITE_BACKEND + "api/task/" + task_id; 139 | const resp = axios.get(url).then((response => { 140 | console.log(response); 141 | if(response.data.status == "DONE" && response.data.task_type == "loaders"){ 142 | clearInterval(timer); 143 | console.log("GOT LOADERS") 144 | console.log(response); 145 | setStatus("Loaded analyzers.") 146 | let loaders = response.data.result.loaders[1]; 147 | 148 | // Convert the object into an array of [key, value] pairs 149 | let sortedLoadersArray = Object.entries(loaders).sort(([keyA], [keyB]) => keyA.localeCompare(keyB)); 150 | 151 | // Convert the sorted array back into an object 152 | response.data.result.loaders[1] = Object.fromEntries(sortedLoadersArray); 153 | setLoaders(response.data.result.loaders); 154 | // To upload file normally.. 155 | // setUseLoaders(false); 156 | // Hacky... 157 | handleLoaderCheckBox(); 158 | setShowLoaderCheckbox(false); 159 | } 160 | if(response.data.status == "PROCESSING"){ 161 | setStatus(`Processing. 
Time elapsed: ${timeProcessing}s`); 162 | timeProcessing += 1; 163 | } 164 | if(response.data.status == "DONE" && response.data.task_type == "error"){ 165 | let result = response.data.result.error; 166 | // Pesky single quotes! 167 | result = result.replace(/'/g, '"'); 168 | result = JSON.parse(result) 169 | clearInterval(timer) 170 | setStatus(`ERROR: ${result.error}`) 171 | } 172 | })) 173 | }, 1000); 174 | })).catch(error => { 175 | if(error.response.data.error){ 176 | setStatus(`ERROR: ${error.response.data.error} - ${error.response.data.error_info}`) 177 | return; 178 | } 179 | }) 180 | } 181 | } 182 | 183 | return ( 184 |
185 | 188 | 189 |
190 | {status} 191 |
192 | 193 |
(2 MB file upload limit)
194 |
195 | {showLoaderCheckbox && 196 |
197 |
Advanced options
198 |
199 | 200 | 201 |
202 |
203 | } 204 | {loaders && 205 | <> 206 |
Select an appropriate loader and language
207 | 214 | 221 | 222 | } 223 |
224 |
225 | ) 226 | } -------------------------------------------------------------------------------- /frontend/src/context/AnalysisContext.js: -------------------------------------------------------------------------------- 1 | import { createContext } from "react"; 2 | 3 | export const AnalysisContext = createContext({ 4 | selectedFunction: null, 5 | allFunctions: [], 6 | funcHistory: [], 7 | funcBanner: '', 8 | graph: {}, 9 | graphSet: false, 10 | decomp: '', 11 | }); -------------------------------------------------------------------------------- /frontend/src/context/MenuContext.jsx: -------------------------------------------------------------------------------- 1 | import { createContext } from "react"; 2 | 3 | export const MenuContext = createContext({ 4 | disasmView: true, 5 | decompView: true, 6 | functionView: true, 7 | cfgView: true, 8 | uploadModal: false, 9 | }); -------------------------------------------------------------------------------- /frontend/src/context/NodeSizeContext.jsx: -------------------------------------------------------------------------------- 1 | import React, { createContext, useState } from 'react'; 2 | 3 | export const NodeSizeContext = createContext(); 4 | 5 | export const NodeSizeProvider = ({ children }) => { 6 | const [nodeSizes, setNodeSizes] = useState({}); 7 | 8 | const updateNodeSize = (nodeId, size) => { 9 | setNodeSizes(prevSizes => ({ ...prevSizes, [nodeId]: size })); 10 | }; 11 | 12 | return ( 13 | 14 | {children} 15 | 16 | ); 17 | }; 18 | -------------------------------------------------------------------------------- /frontend/src/index.css: -------------------------------------------------------------------------------- 1 | @tailwind base; 2 | @tailwind components; 3 | @tailwind utilities; 4 | 5 | /* Remove circular handle elements */ 6 | .react-flow__handle { 7 | opacity: 0; 8 | } 9 | 10 | ::-webkit-scrollbar { 11 | -webkit-appearance: none; 12 | width: 7px; 13 | height: 7px; 14 | } 15 | ::-webkit-scrollbar-thumb { 
16 | border-radius: 4px; 17 | background-color: rgba(0, 0, 0, .5); 18 | -webkit-box-shadow: 0 0 1px rgba(255, 255, 255, .5); 19 | } 20 | 21 | pre { 22 | padding: 0px; 23 | } -------------------------------------------------------------------------------- /frontend/src/main.jsx: -------------------------------------------------------------------------------- 1 | import React from 'react' 2 | import ReactDOM from 'react-dom/client' 3 | import HomePage from './pages/HomePage.jsx' 4 | import ErrorPage from './pages/ErrorPage.jsx' 5 | import AnalysisPage from './pages/AnalysisPage.jsx' 6 | import InfoPage from './pages/InfoPage.jsx' 7 | import './index.css' 8 | 9 | import { 10 | RouterProvider, 11 | createBrowserRouter, 12 | } from 'react-router-dom'; 13 | 14 | const router = createBrowserRouter([ 15 | { 16 | path: '/', 17 | element: , 18 | errorElement: , 19 | }, 20 | { 21 | path: '/info', 22 | element: 23 | }, 24 | { 25 | path: '/analysis', 26 | element: 27 | } 28 | ]) 29 | 30 | ReactDOM.createRoot(document.getElementById('root')).render( 31 | 32 | ) 33 | -------------------------------------------------------------------------------- /frontend/src/pages/AnalysisPage.css: -------------------------------------------------------------------------------- 1 | .component-wrapper { 2 | border: 1px solid #5b92e5; 3 | } 4 | 5 | .component-title { 6 | font-weight: bold; 7 | margin: 0.3rem; 8 | border-bottom: 1px solid #5b92e5; 9 | display: flex; 10 | /* gap: 1rem; */ 11 | /* position: fixed; */ 12 | top: 0; 13 | /* position: sticky; */ 14 | background-color: white; 15 | } 16 | 17 | .component-body { 18 | margin: 0.5rem; 19 | /* overflow-y: scroll; */ 20 | /* height: calc(100% - 3.2rem); */ 21 | overflow-x: auto; 22 | } 23 | 24 | .react-grid-item { 25 | overflow: hidden; 26 | display: flex; 27 | flex-direction: column; 28 | justify-content: stretch; 29 | } 30 | 31 | .react-grid-item > * { 32 | flex-grow: 1; 33 | flex-shrink: 1; 34 | overflow: auto; 35 | /* padding: 10px; */ 36 
| } 37 | 38 | .draggable-handle { 39 | cursor: move; 40 | background-color: #ccc; 41 | /* text-align: right; */ 42 | display: flex; 43 | flex-direction: row; 44 | border-bottom: 1px solid #aaa; 45 | padding: 2px; 46 | padding-left: 5px; 47 | height: 20px; 48 | width: 100%; 49 | flex-grow: 0; 50 | flex-shrink: 0; 51 | overflow-y: hidden; 52 | user-select: none; 53 | font-family: monospace; 54 | font-size: 0.75rem; 55 | font-weight: bolder; 56 | } -------------------------------------------------------------------------------- /frontend/src/pages/AnalysisPage.jsx: -------------------------------------------------------------------------------- 1 | import React, { useState, useEffect } from "react"; 2 | import { useLocation } from "react-router-dom"; 3 | import FunctionList from "../components/FunctionList.jsx"; 4 | import Listing from "../components/Listing.jsx"; 5 | import { AnalysisContext } from "../context/AnalysisContext.js"; 6 | import { MenuContext } from "../context/MenuContext.jsx"; 7 | import './AnalysisPage.css'; 8 | import Graph from "../components/Graph.jsx"; 9 | import GridLayout from "react-grid-layout"; 10 | import "react-grid-layout/css/styles.css"; 11 | import "react-resizable/css/styles.css"; 12 | import { NodeSizeProvider } from "../context/NodeSizeContext.jsx"; 13 | import Decompilation from "../components/Decompilation.jsx"; 14 | import MenuBar from "../components/Menubar.jsx"; 15 | import RawHex from "../components/RawHex.jsx"; 16 | import Strings from "../components/Strings.jsx"; 17 | 18 | const AnalysisPage = () => { 19 | const { state } = useLocation(); 20 | console.log(state); 21 | const [analysisContext, setAnalysisContext] = useState({ "selectedFunction": null }); 22 | const [menuContext, setMenuContext] = useState({ 23 | disasmView: true, 24 | decompView: true, 25 | cfgView: true, 26 | functionView: true, 27 | rawView: true, 28 | stringsView: false, 29 | uploadModal: false, 30 | }); 31 | 32 | const [dimensions, setDimensions] = 
useState({ 33 | width: window.innerWidth, 34 | height: window.innerHeight 35 | }); 36 | 37 | useEffect(() => { 38 | const handleResize = () => { 39 | setDimensions({ width: window.innerWidth, height: window.innerHeight }); 40 | }; 41 | 42 | window.addEventListener('resize', handleResize); 43 | // window.location.reload(); 44 | return () => window.removeEventListener('resize', handleResize); 45 | }, []); 46 | 47 | const layout = [ 48 | { i: "funcs", x: 10, y: 0, w: 6, h: 8, minW: 3, minH: 3 }, 49 | { i: "disasm", x: 10, y: 9, w: 10, h: 8, minW: 3, minH: 3 }, 50 | { i: "cfg", x: 0, y: 0, w: 10, h: 8, minW: 3, minH: 3 }, 51 | { i: "decomp", x: 0, y: 9, w: 10, h: 8, minW: 3, minH: 3 }, 52 | { i: "hex", x: 16, y: 0, w: 4, h: 8, minW: 4, minH: 3 } 53 | ]; 54 | 55 | return ( 56 | 57 | 58 | 59 | 69 | {menuContext.functionView && 70 |
71 |
Functions
72 | 73 |
74 | } 75 | {menuContext.disasmView && 76 |
77 |
Disassembly
78 | 79 |
80 | } 81 | {menuContext.cfgView && 82 |
83 |
Control Flow Graph
84 | 85 | 86 | 87 |
88 | } 89 | {menuContext.decompView && 90 |
91 |
Decompilation
92 | 93 |
94 | } 95 | {menuContext.rawView && 96 |
97 |
Raw Hex
98 | 99 |
100 | } 101 | {menuContext.stringsView && 102 |
103 |
Strings
104 | 105 |
106 | } 107 |
108 |
109 |
110 | ); 111 | }; 112 | 113 | export default AnalysisPage; 114 | -------------------------------------------------------------------------------- /frontend/src/pages/ErrorPage.jsx: -------------------------------------------------------------------------------- 1 | import { useRouteError} from 'react-router-dom'; 2 | 3 | export default function ErrorPage() { 4 | const error = useRouteError(); 5 | console.error(error) 6 | return ( 7 | <> 8 |

Whoops

9 |

Error: {error.statusText || error.message}

10 | 11 | ) 12 | } -------------------------------------------------------------------------------- /frontend/src/pages/HomePage.jsx: -------------------------------------------------------------------------------- 1 | import React, { useState } from 'react'; 2 | import NavBar from '../components/NavBar'; 3 | import Upload from '../components/Upload'; 4 | 5 | export default function HomePage() { 6 | console.log(import.meta.env.VITE_BACKEND); 7 | const [showModal, setShowModal] = useState(false); 8 | 9 | function toggleModal(){ 10 | const current = showModal; 11 | setShowModal(!current); 12 | } 13 | 14 | return ( 15 |
16 | 17 |
18 |
19 |
20 |

netdis web disassembler

21 |
22 | {/*

Powered by:

*/} 23 | 24 | 25 |
26 |
27 |
28 |
29 | netdis is an online and open-source binary analysis platform based on the Ghidra suite of tools. simply upload a file and analyze it - all within your browser. 30 |
31 |
32 |

netdis supports full binary disassembly, decompilation, and function control flow graph recovery. many features upcoming, stay tuned 🕵️‍♂️

33 |
34 |
35 |
36 | 37 |
38 |
39 |

Like what you see? Have suggestions? Is my code unbearably bad? Fork this repo!

40 | 41 |
42 | Buy Me a Coffee at ko-fi.com 43 |
44 |
45 | 46 | {showModal && ( 47 |
48 |
49 | 50 | 56 |
57 |
58 | )} 59 |
60 | ) 61 | } -------------------------------------------------------------------------------- /frontend/src/pages/InfoPage.jsx: -------------------------------------------------------------------------------- 1 | import React, { useState } from 'react'; 2 | import NavBar from '../components/NavBar'; 3 | import Upload from '../components/Upload'; 4 | 5 | export default function InfoPage() { 6 | console.log(import.meta.env.VITE_BACKEND); 7 | const [showModal, setShowModal] = useState(false); 8 | 9 | function toggleModal(){ 10 | const current = showModal; 11 | setShowModal(!current); 12 | } 13 | 14 | return ( 15 |
16 | 17 |
18 |
19 |
20 |

netdis web disassembler

21 |
22 |
23 | About 24 |
25 |
26 |

27 | Netdis is an online disassembler which uses Ghidra. It currently offers disassembly, decompilation, and function control flow graphing, with more features planned. If you'd like to contribute to the project, please visit the github repo link! Any and all contributions are welcome :) 28 |

29 |
30 |
31 | Donations 32 |
33 |
34 | I'm currently running this on a server out of pocket. There are no paid features, advertising, or sponsorships supporting this project. If you'd like to toss me a couple bucks, consider donating via my Ko-fi. Your support is greatly appreciated, and goes directly toward server costs and possible upgrades to increase analysis speed! 35 |
36 |
37 | Buy Me a Coffee at ko-fi.com 38 |
39 |
40 |
41 |
42 | 43 | {showModal && ( 44 |
45 |
46 | 47 | 53 |
54 |
55 | )} 56 |
57 | ) 58 | } -------------------------------------------------------------------------------- /frontend/tailwind.config.js: -------------------------------------------------------------------------------- 1 | /** @type {import('tailwindcss').Config} */ 2 | export default { 3 | content: [ 4 | "./index.html", 5 | "./src/**/*.{js,ts,jsx,tsx}", 6 | ], 7 | theme: { 8 | extend: { 9 | colors : { 10 | 'ndblue': '#5b92e5', 11 | 'ndgrey': '#2a2b2a', 12 | 'ccc': '#ccc', 13 | } 14 | } 15 | }, 16 | plugins: [], 17 | } 18 | -------------------------------------------------------------------------------- /frontend/vite.config.js: -------------------------------------------------------------------------------- 1 | import { defineConfig, loadEnv } from 'vite' 2 | import react from '@vitejs/plugin-react' 3 | 4 | // https://vitejs.dev/config/ 5 | export default defineConfig({ 6 | plugins: [react()], 7 | }) 8 | -------------------------------------------------------------------------------- /maintenance/index.html: -------------------------------------------------------------------------------- 1 |
2 |

Netdis.org is currently under maintenance. Please check back later.

3 | lain trying to fix a webserver 4 |
-------------------------------------------------------------------------------- /maintenance/lain.gif: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anthonyshibitov/netdis/345ea660e0504e4cae44877f10549aabd4eec1dc/maintenance/lain.gif -------------------------------------------------------------------------------- /netdis.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/anthonyshibitov/netdis/345ea660e0504e4cae44877f10549aabd4eec1dc/netdis.png -------------------------------------------------------------------------------- /nginxprod.conf: -------------------------------------------------------------------------------- 1 | limit_req_zone $binary_remote_addr zone=mylimit:10m rate=20r/s; 2 | 3 | server { 4 | listen 80; 5 | server_name www.netdis.org; 6 | return 301 http://netdis.org$request_uri; 7 | } 8 | 9 | server { 10 | listen 80; 11 | server_name netdis.org; 12 | client_max_body_size 5M; 13 | 14 | location / { 15 | root /usr/share/nginx/html; 16 | try_files $uri /index.html; 17 | } 18 | 19 | location /api/ { 20 | limit_req zone=mylimit burst=300; 21 | 22 | proxy_pass http://backend:8000; 23 | proxy_set_header Host $host; 24 | proxy_set_header X-Real-IP $remote_addr; 25 | proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; 26 | proxy_set_header X-Forwarded-Proto $scheme; 27 | 28 | # add_header Access-Control-Allow-Origin *; 29 | # add_header Access-Control-Allow-Methods 'GET, POST, OPTIONS'; 30 | # add_header Access-Control-Allow-Headers 'Origin, Content-Type, X-Auth-Token, Authorization'; 31 | } 32 | } -------------------------------------------------------------------------------- /production.checklist: -------------------------------------------------------------------------------- 1 | FRONT END 2 | 3 | 1. /frontend/.env 4 | 1. Contains the VITE_BACKEND env, which points to the backend server 5 | 2. 
Build step for /dist/ files 6 | 7 | BACK END 8 | 9 | 1. /backend/settings.py 10 | 1. ALLOWED_HOSTS must allow the frontend 11 | 2. CORS_ALLOWED_ORIGINS must allow the frontend 12 | 3. Configure PostgreSQL database 13 | 2. Configure gunicorn to serve Django app -------------------------------------------------------------------------------- /readme.MD: -------------------------------------------------------------------------------- 1 | # Netdis 2 | ## Web-based binary analysis 3 | 4 | ![Netdis user interface](netdis.png) 5 | 6 | Netdis is an open-source binary analysis tool powered by Ghidra. Upload files for disassembly, decompilation, control flow graphs and more, all in your browser. 7 | 8 | ## [https://netdis.org/](https://netdis.org/) 9 | 10 | ## Deployment configuration 11 | 12 | The address of the backend needs to be specified in the respective `frontend/.env.vite.dev` and `frontend/.env.vite.prod` files, depending on the type of deployment. For development, this can typically be left at `http://localhost:8000/`. When deploying to production, this must point to the address of the backend. Additionally, this value is baked into the Vite build, so changes may not take effect until the build has completed and the built files are available in the `frontend/dist/` directory. 13 | 14 | The remaining environment variables are defined in the `.env.dev` and `.env.prod` files. When deploying to production, `DJANGO_ALLOWED_HOSTS` and `CORS_ALLOWED_ORIGINS` must contain the address of the frontend, which is typically just the host, as it is served via nginx on port 80.
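As a sketch, a production deployment on a hypothetical domain could set the following values (the hostnames below are placeholders for illustration, not the project's real configuration — substitute your own, and generate a real secret key):

```shell
# .env.prod (sketch — example.org is a placeholder domain)
DEBUG=0
SECRET_KEY=change-me-to-a-long-random-string
DJANGO_ALLOWED_HOSTS=example.org www.example.org
CORS_ALLOWED_ORIGINS=https://example.org https://www.example.org

# frontend/.env.vite.prod (baked into the frontend at build time)
# Keep the trailing slash: the frontend appends paths such as api/funcs/ to it.
VITE_BACKEND=https://example.org/
```

With this layout the frontend requests `https://example.org/api/...`, which nginx proxies through to the backend container per `nginxprod.conf`.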
15 | 16 | ### Deployment commands 17 | 18 | Deploy for development: 19 | 20 | `docker-compose -f docker-compose.yml -f docker-compose-dev.yml up --build` 21 | 22 | Deploy for production: 23 | 24 | `docker-compose -f docker-compose.yml -f docker-compose-prod.yml up --build` 25 | 26 | ### Database Migrations 27 | 28 | After any change to the Django models, the old PostgreSQL volume needs to be deleted. Otherwise, there will be errors referring to non-existent fields. 29 | 30 | `docker volume rm netdis_postgres_data` --------------------------------------------------------------------------------
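The database reset described in the readme can be sketched as a single sequence (assuming the development compose files and the `netdis_postgres_data` volume name shown above; for production, substitute `docker-compose-prod.yml` for the dev override file):

```shell
# Sketch: reset the database after changing Django models.
# Stop the stack, drop the Postgres volume, then rebuild.
docker-compose -f docker-compose.yml -f docker-compose-dev.yml down
docker volume rm netdis_postgres_data
docker-compose -f docker-compose.yml -f docker-compose-dev.yml up --build
```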