├── README.md
└── lostfuzzer.sh


/README.md:
--------------------------------------------------------------------------------
# Automated URL Recon & DAST Scanning

## Overview
This script automates the process of extracting, filtering, and testing passive URLs using the **gau** tool. It checks for live URLs and performs **DAST (Dynamic Application Security Testing)** using **nuclei**.

## 🚀 Why This Tool?

ParamSpider can create **imbalanced URLs** like:
```
http://testphp.vulnweb.com/listproducts.php?artist=FUZZ&cat=FUZZ
```
This breaks **Nuclei DAST** scans because every query needs a valid parameter value: with multiple FUZZ placeholders, Nuclei cannot properly process and test each parameter, since valid query structures are required for effective scanning. I also avoided active crawler tools, because they take a long time to collect live URLs from targets.

That’s why I built this custom tool to extract only valid URLs with full query parameters, ensuring they are correctly formatted for security testing.

### 🛠️ What This Tool Does:
✅ **Extracts valid URLs** with real query parameters
✅ **Removes imbalanced/fuzzed queries**
✅ **Checks live URLs** before scanning
✅ **Runs Nuclei DAST properly** for accurate results

This makes **bug hunting faster, cleaner, and more effective!** 🚀

## Prerequisites
Ensure the following tools are installed before running the script:

- [`gau`](https://github.com/lc/gau)
- [`uro`](https://github.com/s0md3v/uro)
- [`nuclei`](https://github.com/projectdiscovery/nuclei)
- [`httpx-toolkit`](https://github.com/projectdiscovery/httpx)

## Installation
Clone the repository and navigate into it:
```bash
git clone https://github.com/coffinxp/lostfuzzer.git
cd lostfuzzer
```
Make the script executable:
```bash
chmod +x lostfuzzer.sh
```

## Usage
Run the script and follow the prompts:
```bash
./lostfuzzer.sh
```
You'll be asked to provide:
- A **target domain** or a **file** containing a list of subdomains

The script will:
1. Fetch passive URLs with the **gau** tool, in parallel if multiple subdomains are provided
2. Filter URLs containing query parameters
3. Check which URLs are live using **httpx-toolkit**
4. Run **nuclei** for **DAST scanning**
5. Save results for manual testing

(A manual equivalent of this pipeline is shown at the end of this README.)

## Output Files
- `filtered_urls.txt`: Filtered URLs with query parameters for further manual testing
- `nuclei_results.txt`: Results of the DAST scan

## Example Output
![Screenshot (1207)](https://github.com/user-attachments/assets/d663b424-2a89-4439-b54e-ba54e7397e21)

## Disclaimer
This tool is intended for **educational and legal security testing purposes only**. The author is not responsible for any misuse of this script.
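
## Manual Pipeline (Reference)
The five steps above chain standard tools, so the flow can also be reproduced by hand. The snippet below is a rough single-domain equivalent of what `lostfuzzer.sh` automates; `target.com` is a placeholder domain and the flags simply mirror the script's defaults:

```bash
# Rough manual equivalent of lostfuzzer.sh for a single domain.
# target.com is a placeholder; the flags mirror the script's defaults.
# 1) fetch passive URLs, 2) keep only URLs with query parameters,
# 3) normalize/dedupe with uro, 4) keep live URLs via httpx-toolkit,
# 5) save them for manual testing and feed them to nuclei's DAST templates.
gau target.com \
  | grep -E '\?[^=]+=.+$' \
  | uro \
  | sort -u \
  | httpx-toolkit -silent -t 300 -rl 200 \
  | tee filtered_urls.txt \
  | nuclei -dast -retries 2 -silent -o nuclei_results.txt
```

The script itself additionally handles multiple subdomains in parallel, strips `http(s)://` prefixes from the input, and verifies that all required tools are installed before running.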
--------------------------------------------------------------------------------
/lostfuzzer.sh:
--------------------------------------------------------------------------------
#!/bin/bash

# ANSI color codes
RED='\033[91m'
GREEN='\033[92m'
RESET='\033[0m'

# ASCII art banner
echo -e "${RED}"
cat << "EOF"
______ _____________
___ /______________ /___ __/___ _________________________
__ /_ __ \_ ___/ __/_ /_ _ / / /__ /__ /_ _ \_ ___/
_ / / /_/ /(__ )/ /_ _ __/ / /_/ /__ /__ /_/ __/ /
_/ \____//____/ \__/ /_/ \__,_/ _____/____/\___//_/

                                      by ~/.coffinxp@lostsec
EOF
echo -e "${RESET}"

# Ensure required tools are installed
REQUIRED_TOOLS=("gau" "uro" "httpx-toolkit" "nuclei")
for tool in "${REQUIRED_TOOLS[@]}"; do
    if ! command -v "$tool" &>/dev/null; then
        echo -e "${RED}[ERROR] $tool is not installed. Please install it and try again.${RESET}"
        exit 1
    fi
done

# Ask the user for the domain or subdomains list file
read -p "Enter the target domain or subdomains list file: " INPUT
if [ -z "$INPUT" ]; then
    echo -e "${RED}[ERROR] Input cannot be empty.${RESET}"
    exit 1
fi

# Determine if input is a file or a single domain
if [ -f "$INPUT" ]; then
    TARGETS=$(cat "$INPUT")
else
    TARGETS="$INPUT"
fi

# Remove protocols (http/https) if present
TARGETS=$(echo "$TARGETS" | sed 's|https\?://||g')

# Create temporary and output files; remove the temp file on exit
GAU_FILE=$(mktemp)
trap 'rm -f "$GAU_FILE"' EXIT
FILTERED_URLS_FILE="filtered_urls.txt"
NUCLEI_RESULTS="nuclei_results.txt"

# Step 1: Fetch URLs in parallel using xargs
# (each target is passed as a positional parameter rather than substituted
# into the command string, so unusual characters are handled safely)
echo -e "${GREEN}[INFO] Fetching URLs using gau in parallel...${RESET}"
echo "$TARGETS" | xargs -P10 -I{} sh -c 'gau "$1" >> "$2"' _ {} "$GAU_FILE"

# Step 2: Filter URLs with query parameters
echo -e "${GREEN}[INFO] Filtering URLs with query parameters...${RESET}"
grep -E '\?[^=]+=.+$' "$GAU_FILE" | uro | sort -u > "$FILTERED_URLS_FILE"

# Step 3: Check live URLs using httpx
echo -e "${GREEN}[INFO] Checking for live URLs using httpx-toolkit...${RESET}"
httpx-toolkit -silent -t 300 -rl 200 < "$FILTERED_URLS_FILE" > "$FILTERED_URLS_FILE.tmp"
mv "$FILTERED_URLS_FILE.tmp" "$FILTERED_URLS_FILE"

# Step 4: Run nuclei for DAST scanning
echo -e "${GREEN}[INFO] Running nuclei for DAST scanning...${RESET}"
nuclei -dast -retries 2 -silent -o "$NUCLEI_RESULTS" < "$FILTERED_URLS_FILE"

# Step 5: Show saved results
echo -e "${GREEN}[INFO] Nuclei results saved to $NUCLEI_RESULTS${RESET}"
echo -e "${GREEN}[INFO] Filtered URLs saved to $FILTERED_URLS_FILE for manual testing.${RESET}"
echo -e "${GREEN}[INFO] Automation completed successfully!${RESET}"

# Check whether nuclei found any vulnerabilities
if [ ! -s "$NUCLEI_RESULTS" ]; then
    echo -e "${GREEN}[INFO] No vulnerable URLs found.${RESET}"
else
    echo -e "${GREEN}[INFO] Vulnerabilities were detected. Check $NUCLEI_RESULTS for details.${RESET}"
fi
--------------------------------------------------------------------------------