├── .config └── .aws ├── .gitbook └── assets │ ├── ChatGPT acquistion.png │ ├── DNS bruteforcing.png │ ├── Excel sheet screenshot.png │ ├── Github favicon.png │ ├── Gotator.png │ ├── Recursive Enumeration.png │ ├── Subfinder With.png │ ├── TLS.png │ ├── TLscert_yahoo.png │ ├── Vhost bruteforcing.png │ ├── asnip.png │ ├── axiomxreconftw.png │ ├── carbon-2-.png │ ├── carbon-6-.png │ ├── cero.png │ ├── ceroo(1).png │ ├── circle-cropped.png │ ├── copy-of-copy-of-copy-of-webscraping_meme.png │ ├── crt.png │ ├── crunchbase (1).png │ ├── crunchbase.png │ ├── csp.png │ ├── ctfr.png │ ├── dnscewl.png │ ├── dnsvalidator1.png │ ├── dnvalidator.png │ ├── download.jpg │ ├── enumeration-2-.png │ ├── excelsheet.png │ ├── favUp(1).png │ ├── favicon.png │ ├── favicontool.png │ ├── googlenalytics.png │ ├── gospider.png │ ├── hosthunter.png │ ├── httpx.png │ ├── hurricane.png │ ├── index.png │ ├── jhaddixtweet.png │ ├── memesss.png │ ├── permutations.png │ ├── ptr.png │ ├── purednsb.png │ ├── reconftw_logo.png │ ├── resolve.png │ ├── shodanfavicon.png │ ├── subdomains.png │ ├── subfinder without(1).png │ ├── subfinder without.png │ ├── subfinderconfig.png │ ├── subfinderwithout.png │ ├── tls probing.png │ ├── twitter-logo.png │ ├── untitled-design-1-.png │ ├── whoiss.png │ ├── whoisxml.png │ └── whoxyrm.png ├── CONTRIBUTING.md ├── README.md ├── SUMMARY.md ├── active-enumeration ├── dns-bruteforcing.md ├── google-analytics.md ├── js-file-scraping.md ├── other-methods.md ├── permutation-alterations.md ├── tls-csp-cname.md └── vhost-probing.md ├── automation.md ├── introduction ├── prequisites.md └── whats-the-need.md ├── passive-enumeration ├── certificate-logs.md ├── passive-sources.md └── recursive-enumeration.md ├── types ├── horizontal-enumeration.md └── vertical-enumeration.md └── web-probing.md /.config/.aws: -------------------------------------------------------------------------------- 1 | Config { 2 | AWSAccessKeyId=AKIA4VFHDSG67Y5QTMQS 3 | AWSSecretKey=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 4 | } 5 | -------------------------------------------------------------------------------- /.gitbook/assets/ChatGPT acquistion.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/ChatGPT acquistion.png -------------------------------------------------------------------------------- /.gitbook/assets/DNS bruteforcing.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/DNS bruteforcing.png -------------------------------------------------------------------------------- /.gitbook/assets/Excel sheet screenshot.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/Excel sheet screenshot.png -------------------------------------------------------------------------------- /.gitbook/assets/Github favicon.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/Github favicon.png -------------------------------------------------------------------------------- /.gitbook/assets/Gotator.png: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/Gotator.png -------------------------------------------------------------------------------- /.gitbook/assets/Recursive Enumeration.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/Recursive Enumeration.png -------------------------------------------------------------------------------- /.gitbook/assets/Subfinder With.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/Subfinder With.png -------------------------------------------------------------------------------- /.gitbook/assets/TLS.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/TLS.png -------------------------------------------------------------------------------- /.gitbook/assets/TLscert_yahoo.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/TLscert_yahoo.png -------------------------------------------------------------------------------- /.gitbook/assets/Vhost bruteforcing.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/Vhost bruteforcing.png -------------------------------------------------------------------------------- /.gitbook/assets/asnip.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/asnip.png -------------------------------------------------------------------------------- /.gitbook/assets/axiomxreconftw.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/axiomxreconftw.png -------------------------------------------------------------------------------- /.gitbook/assets/carbon-2-.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/carbon-2-.png -------------------------------------------------------------------------------- /.gitbook/assets/carbon-6-.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/carbon-6-.png -------------------------------------------------------------------------------- /.gitbook/assets/cero.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/cero.png -------------------------------------------------------------------------------- /.gitbook/assets/ceroo(1).png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/ceroo(1).png -------------------------------------------------------------------------------- /.gitbook/assets/circle-cropped.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/circle-cropped.png -------------------------------------------------------------------------------- /.gitbook/assets/copy-of-copy-of-copy-of-webscraping_meme.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/copy-of-copy-of-copy-of-webscraping_meme.png -------------------------------------------------------------------------------- /.gitbook/assets/crt.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/crt.png -------------------------------------------------------------------------------- /.gitbook/assets/crunchbase (1).png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/crunchbase (1).png -------------------------------------------------------------------------------- /.gitbook/assets/crunchbase.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/crunchbase.png -------------------------------------------------------------------------------- /.gitbook/assets/csp.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/csp.png -------------------------------------------------------------------------------- /.gitbook/assets/ctfr.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/ctfr.png -------------------------------------------------------------------------------- /.gitbook/assets/dnscewl.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/dnscewl.png -------------------------------------------------------------------------------- /.gitbook/assets/dnsvalidator1.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/dnsvalidator1.png -------------------------------------------------------------------------------- /.gitbook/assets/dnvalidator.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/dnvalidator.png -------------------------------------------------------------------------------- /.gitbook/assets/download.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/download.jpg -------------------------------------------------------------------------------- /.gitbook/assets/enumeration-2-.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/enumeration-2-.png -------------------------------------------------------------------------------- /.gitbook/assets/excelsheet.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/excelsheet.png -------------------------------------------------------------------------------- /.gitbook/assets/favUp(1).png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/favUp(1).png -------------------------------------------------------------------------------- /.gitbook/assets/favicon.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/favicon.png -------------------------------------------------------------------------------- /.gitbook/assets/favicontool.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/favicontool.png -------------------------------------------------------------------------------- /.gitbook/assets/googlenalytics.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/googlenalytics.png -------------------------------------------------------------------------------- /.gitbook/assets/gospider.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/gospider.png -------------------------------------------------------------------------------- /.gitbook/assets/hosthunter.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/hosthunter.png 
-------------------------------------------------------------------------------- /.gitbook/assets/httpx.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/httpx.png -------------------------------------------------------------------------------- /.gitbook/assets/hurricane.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/hurricane.png -------------------------------------------------------------------------------- /.gitbook/assets/index.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/index.png -------------------------------------------------------------------------------- /.gitbook/assets/jhaddixtweet.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/jhaddixtweet.png -------------------------------------------------------------------------------- /.gitbook/assets/memesss.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/memesss.png -------------------------------------------------------------------------------- /.gitbook/assets/permutations.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/permutations.png -------------------------------------------------------------------------------- /.gitbook/assets/ptr.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/ptr.png -------------------------------------------------------------------------------- /.gitbook/assets/purednsb.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/purednsb.png -------------------------------------------------------------------------------- /.gitbook/assets/reconftw_logo.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/reconftw_logo.png -------------------------------------------------------------------------------- /.gitbook/assets/resolve.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/resolve.png -------------------------------------------------------------------------------- /.gitbook/assets/shodanfavicon.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/shodanfavicon.png -------------------------------------------------------------------------------- /.gitbook/assets/subdomains.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/subdomains.png -------------------------------------------------------------------------------- /.gitbook/assets/subfinder without(1).png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/subfinder without(1).png -------------------------------------------------------------------------------- /.gitbook/assets/subfinder without.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/subfinder without.png -------------------------------------------------------------------------------- /.gitbook/assets/subfinderconfig.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/subfinderconfig.png -------------------------------------------------------------------------------- /.gitbook/assets/subfinderwithout.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/subfinderwithout.png -------------------------------------------------------------------------------- /.gitbook/assets/tls probing.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/tls probing.png -------------------------------------------------------------------------------- /.gitbook/assets/twitter-logo.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/twitter-logo.png -------------------------------------------------------------------------------- /.gitbook/assets/untitled-design-1-.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/untitled-design-1-.png -------------------------------------------------------------------------------- /.gitbook/assets/whoiss.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/whoiss.png -------------------------------------------------------------------------------- /.gitbook/assets/whoisxml.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/whoisxml.png -------------------------------------------------------------------------------- /.gitbook/assets/whoxyrm.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/whoxyrm.png -------------------------------------------------------------------------------- /CONTRIBUTING.md: -------------------------------------------------------------------------------- 1 | Feel free make Pull-Requests to this repo 2 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | --- 2 | description: Comprehensive Subdomain Enumeration Guide 3 | --- 4 | 5 | # Home 🏠 6 | 7 | ## Subdomain Enumeration Guide 2023 📖 8 | 9 | This guide contains all the needed knowledge for performing a good subdomain enumeration. I have tried to cover each technique and explained it from a beginner's perspective. Each of the techniques used has a detailed explanation about why this technique was used and how to perform them. I have tried to link various gists, charts, statistics for a better understanding of the concept. 10 | 11 | There exists various tools over the internet which perform the same tasks. So, I have tried to include only those tools that yield the best results compared to other tools from the same category. 12 | 13 | I encourage y'all to go through this guide and try to build your **own reconnaissance methodology**. I believe each one should have their own methodology and keep trying out new things and find out which fits best for them. 14 | 15 | I'm too a beginner in this field and have tried my best to explain the right concepts. If you think any of the content is wrongly explained, I am always open to listening to you. 16 | 17 | Last but not the least, I would like to thank [**six2dez**](https://twitter.com/Six2dez1) for supporting and helping me during my learning phase, who's outcome you can see in this guide. 
18 | 19 | 20 | 21 | **Twitter**: [@sidxparab](https://twitter.com/sidxparab) 22 | 23 | **LinkedIn**: [@sidxparab](https://www.linkedin.com/in/sidxparab/) 24 | 25 | **Medium**: [@sidxparab](https://medium.com/@sidxparab) 26 | 27 | ## Support 🙏 :heart: 28 | 29 | ### **You can support my work by buying me a coffee** ☕ 30 | 31 | {% embed url="https://buymeacoffee.com/siddheshparab" %} 32 | 33 | **Doitdoitdoitdoitdoitdoit....** 😉 34 | 35 | 36 | 37 | -------------------------------------------------------------------------------- /SUMMARY.md: -------------------------------------------------------------------------------- 1 | # Table of contents 2 | 3 | * [Home 🏠](README.md) 4 | 5 | ## Introduction 6 | 7 | * [What's the need ?🤔](introduction/whats-the-need.md) 8 | * [Prerequisites](introduction/prequisites.md) 9 | 10 | ## Types 11 | 12 | * [Horizontal Enumeration](types/horizontal-enumeration.md) 13 | * [Vertical Enumeration](types/vertical-enumeration.md) 14 | 15 | ## Passive Techniques 16 | 17 | * [Passive Sources](passive-enumeration/passive-sources.md) 18 | * [Certificate Logs](passive-enumeration/certificate-logs.md) 19 | * [Recursive Enumeration](passive-enumeration/recursive-enumeration.md) 20 | 21 | ## Active Techniques 22 | 23 | * [DNS Bruteforcing](active-enumeration/dns-bruteforcing.md) 24 | * [Permutation/Alterations](active-enumeration/permutation-alterations.md) 25 | * [Scraping(JS/Source code)](active-enumeration/js-file-scraping.md) 26 | * [Google analytics](active-enumeration/google-analytics.md) 27 | * [TLS, CSP, CNAME Probing](active-enumeration/tls-csp-cname.md) 28 | * [VHOST probing](active-enumeration/vhost-probing.md) 29 | 30 | *** 31 | 32 | * [Web probing](web-probing.md) 33 | * [Automation 🤖](automation.md) 34 | -------------------------------------------------------------------------------- /active-enumeration/dns-bruteforcing.md: -------------------------------------------------------------------------------- 1 | # DNS Bruteforcing 2 | 3 | ## What is DNS bruteforcing? 4 | 5 | In simple terms, DNS bruteforcing is a technique where we prepend a long list of common subdomain names to our target domain and try to DNS-resolve the resulting list in the hope of finding valid subdomains of our target. 6 | 7 | This is what happens during DNS bruteforcing: 8 | 9 | * **admin** ----> **admin**.example.com 10 | * **internal.dev** ----> **internal.dev**.example.com 11 | * **secret** ----> **secret**.example.com 12 | * **backup01** ----> **backup01**.example.com 13 | 14 | Once we have a list of probable domain names that could exist, we perform DNS resolution on this list, which yields the live subdomains. If any of them turn out to be valid, it's a win-win situation for us. 15 | 16 |
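To make the idea concrete, here is a deliberately naive sketch of the same process in plain bash (assuming `dig` is available, and using `example.com` and `wordlist.txt` as placeholders). It is far too slow for real-world use; the rest of this page relies on puredns/massdns for exactly that reason.

```bash
# Toy illustration of DNS bruteforcing: one sequential query per word.
# Real enumeration uses massdns/puredns (covered below) for speed and wildcard handling.
while read -r word; do
  if [ -n "$(dig +short "${word}.example.com")" ]; then
    echo "${word}.example.com"
  fi
done < wordlist.txt
```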
17 | 18 | ### Why do we perform subdomain bruteforcing? 19 | 20 | At times passive DNS data doesn't give all the hosts/subdomains associated with our target. Also, there may be some newer subdomains that haven't yet been crawled by internet crawlers. In such cases, subdomain bruteforcing proves beneficial. 21 | 22 | Earlier, DNS zone transfer vulnerabilities were the key to obtaining the whole DNS zone data of an organization. But nowadays DNS servers are better secured, and zone transfers are found only rarely. 23 | 24 | 25 | 26 | 27 | 28 | ## 🔧Tool: 29 | 30 | ### Puredns 31 | 32 | * **Author:** [d3mondev](https://github.com/d3mondev) 33 | * **Language**: Go 34 | * **Features**: DNS Bruteforcing & Resolution 35 | 36 | [**Puredns** ](https://github.com/d3mondev/puredns)excels at bruteforcing & resolving millions of domains at once. There exist various open-source tools for this, but puredns is the best in terms of speed & accuracy of the results produced. 37 | 38 | ### ⚙️How Puredns works: 39 | 40 | **1) Sanitize the input wordlist** 41 | 42 | The input wordlist is first sanitized to include only valid characters (`[a-z0-9.-]`), and each line is set to lowercase. 43 | 44 | **2) Mass resolve using the public resolvers** 45 | 46 | To perform mass DNS resolution of millions of domains at high speed, puredns uses [**Massdns**](https://github.com/blechschmidt/massdns) as its base tool. Massdns resolves all the domains in the list against the set of DNS resolvers provided and returns only the live subdomains. This is generally performed at an unlimited rate and generates a huge amount of traffic. 47 | 48 | **3) Wildcard detection** 49 | 50 | Effective detection of wildcards is key to accurate results. If wildcards aren't detected, the tool outputs a lot of false positives. Puredns has strong wildcard detection, so it rarely outputs false positives. 51 | 52 | **4) Validating results with trusted resolvers** 53 | 54 | Once all the live subdomains are found, puredns runs the DNS resolution process over the obtained list again in order to filter out false positives. The catch here is that this second pass uses "[**Trusted DNS resolvers**](https://raw.githubusercontent.com/six2dez/resolvers\_reconftw/main/resolvers\_trusted.txt)" to verify the results one final time. This double resolution helps discard false-positive results. The main advantage of using trusted DNS resolvers like Google DNS (`8.8.8.8`, `8.8.4.4`) and Cloudflare (`1.1.1.1`) is avoiding DNS-poisoned responses or other discrepancies that ordinary resolvers can cause. 55 | 56 | ### Installing Puredns: 57 | 58 | Since this tool is written in Go, your Go environment should be configured properly. 59 | 60 |
# Prerequisites: build and install massdns (puredns uses it under the hood)
 61 | git clone https://github.com/blechschmidt/massdns.git
 62 | cd massdns
 63 | make
 64 | sudo make install
 65 | 
 66 | # Install the puredns tool itself
 67 | go install github.com/d3mondev/puredns/v2@latest
 68 | 
69 | 70 | ### Running Puredns: 71 | 72 | Before we start using puredns for bruteforcing we need to generate our public DNS resolvers list. For this, we will use a tool called [dnsvalidator](https://github.com/vortexau/dnsvalidator). Check [my previous page](https://app.gitbook.com/@sidxparab/s/subdomain-enumeration-guide/introduction/prequisites#2-100-accurate-public-dns-resolvers) to know more about public DNS resolvers and why they are important. 73 | 74 | ```bash 75 | git clone https://github.com/vortexau/dnsvalidator.git 76 | cd dnsvalidator/ 77 | pip3 install -r requirements.txt 78 | pip3 install setuptools==58.2.0 79 | python3 setup.py install 80 | ``` 81 | 82 | **Generating list of open public DNS resolvers** 83 | 84 | It's very important to note that even if one of your public resolver is failing/not working you have a greater chance of missing an important subdomain. Hence, it's always advised that you generate a fresh public DNS resolvers list before execution. 85 | 86 | ```bash 87 | dnsvalidator -tL https://public-dns.info/nameservers.txt -threads 100 -o resolvers.txt 88 | ``` 89 | 90 | ![](../.gitbook/assets/dnsvalidator1.png) 91 | 92 | #### Downloading a pre-populated list of valid DNS resolvers: 93 | 94 | Various open source contributors like [proabiral](https://github.com/proabiral/Fresh-Resolvers), [cxosmo](https://github.com/cxosmo/dns-resolvers), [janmasarik ](https://github.com/janmasarik/resolvers)have set up their GitHub-Actions or VPS in order to generate valid public DNS resolvers periodically(every 24hrs). We can make use of these DNS resolvers rather than generating our own resolvers using dnsvalidator which consumes alot of time. To aggregate all of these efforts [Trickest](https://github.com/trickest) have come up with their own repository called [**resolvers**](https://github.com/trickest/resolvers). It is a merged list of all the DNS resolvers, which they validate every 24 hours. 95 | 96 | ```bash 97 | wget https://raw.githubusercontent.com/trickest/resolvers/main/resolvers.txt 98 | ``` 99 | 100 | 101 | 102 | Now that we have generated our public DNS resolver we are good to move ahead and perform subdomain bruteforcing using puredns. 103 | 104 | ```bash 105 | puredns bruteforce wordlist.txt example.com -r resolvers.txt -w output.txt 106 | ``` 107 | 108 | **Flags:** 109 | 110 | * **bruteforce** - use the bruteforcing mode 111 | * **r** - Specify your public resolvers 112 | * **w** - Output filename 113 | 114 | ![](../.gitbook/assets/purednsb.png) 115 | 116 | {% hint style="success" %} 117 | While performing DNS queries sometimes we receive **SERVFAIL** error. Puredns by default retries on SERVFAIL while most tools don't. 118 | {% endhint %} 119 | 120 | ### Which wordlist :page\_facing\_up: to use? 121 | 122 | The whole effort of DNS bruteforcing is a waste if you don't use a good subdomain bruteforcing wordlist. Selection of the wordlist is the most important aspect of bruteforcing. Let's have a look at some great wordlists:-\ 123 | \ 124 | **1) Assetnote** [**best-dns-wordlist.txt**](https://wordlists-cdn.assetnote.io/data/manual/best-dns-wordlist.txt) (**9 Million**) ⭐\ 125 | [Assetnote](https://wordlists.assetnote.io/) wordlists are the best. No doubt this is the best subdomain bruteforcing wordlist. But highly recommended that you run this in your VPS. Running on a home system will take hours also the results wouldn't be accurate. This wordlist will definitely give you those hidden subdomains. 
126 | 127 | **2) n0kovo** [**n0kovo\_subdomains\_huge.txt**](https://github.com/n0kovo/n0kovo\_subdomains/blob/main/n0kovo\_subdomains\_huge.txt) (**3 Million**)\ 128 | [N0kovo ](https://github.com/n0kovo)created this wordlist by scanning the whole IPv4 address space and collecting all the subdomain names from TLS certificates. You can check out [this blog](https://n0kovo.github.io/posts/subdomain-enumeration-creating-a-highly-efficient-wordlist-by-scanning-the-entire-internet/#benchmarking-) to see how well this bruteforcing wordlist performs compared to other big wordlists. So, if your target contains a lot of wildcards, this would be the best wordlist for bruteforcing _(considering the computation bottleneck of wildcard filtering)._ 129 | 130 | **3) Smaller** [**wordlist**](https://gist.github.com/six2dez/a307a04a222fab5a57466c51e1569acf/raw) (**102k** )\ 131 | Created by [six2dez](https://github.com/six2dez), this one is suitable if you are running the bruteforce on your personal computer over your home Wi-Fi connection.\ 132 | 133 | 134 | 135 | 136 | ### 🙁Problems faced during subdomain bruteforcing 137 | 138 | #### 1) Wildcard filtering 139 | 140 | A wildcard DNS record is a record that matches requests for non-existent domain names. Wildcards are denoted by specifying a **`*`** as the left part of a domain name, such as **\*.example.com.** That means even if a subdomain doesn't exist, it will still return a valid response. See the example below:- 141 | 142 | **doesntexists.example.com** ----> **valid** 143 | 144 | **Strange, right?** In short, if a domain is a wildcard domain we will get all valid responses (false positives) while bruteforcing and won't be able to differentiate which are valid and which aren't. To avoid this, various wildcard filtering techniques are used by subdomain bruteforcing tools. 145 | 146 | **2) Open Public resolvers** 147 | 148 | While bruteforcing we tend to use a long wordlist of common subdomain names to find those hidden domains, so the number of domains to be resolved is also large. Such large-scale resolution cannot be handled by your system's DNS resolver alone, hence we depend on freely available public resolvers. Also, using many public resolvers reduces the chances of hitting DNS rate limits. 149 | 150 | We can get the list of open public DNS resolvers from here [https://public-dns.info/nameservers.txt](https://public-dns.info/nameservers.txt) 151 | 152 | {% hint style="info" %} 153 | :book: Read [**this** ](https://app.gitbook.com/@sidxparab/s/subdomain-enumeration-guide/introduction/prequisites#2-100-accurate-public-dns-resolvers)article on how to create a public resolvers list and why it is important 154 | {% endhint %} 155 | 156 | **3) Bandwidth** 157 | 158 | While performing subdomain bruteforcing, [massdns](https://github.com/blechschmidt/massdns) is used as the base tool for DNS querying at very high concurrency. For this, the underlying system should also have sufficient bandwidth. 159 | 160 | 161 | 162 | ## :punch:Issues faced and how to overcome them: 163 | 164 | #### 1) Crashes on low specs (1 CPU/1 GB VPS) 165 | 166 | Usually, if you provide a very large bruteforce wordlist and your target domain contains significant wildcards, puredns can sometimes crash due to insufficient memory while filtering out wildcards. To overcome this issue you can use the **`--wildcard-batch 1000000`** flag. By default, puredns puts all the domains in a single batch to save on the number of DNS queries and execution time.
With this flag, puredns instead takes batches of 1 million subdomains at a time for wildcard filtering, moving on to the next batch once the current one completes. 167 | 168 | **2) Puredns kills my home router** 169 | 170 | Massdns is the one to blame here. Massdns tries to perform DNS resolution using public resolvers at an unlimited rate. This generates a lot of traffic, consuming the whole bandwidth and making other applications laggy/unresponsive. To overcome this you can use the **`-l`** flag, which rate-limits the number of DNS queries sent to public resolvers. It's advisable to set the value anywhere between `2000-10000` 171 | 172 | 173 | 174 | -------------------------------------------------------------------------------- /active-enumeration/google-analytics.md: -------------------------------------------------------------------------------- 1 | # Google analytics 2 | 3 | Most organizations use [Google Analytics](https://analytics.google.com/analytics/web/) to track website visitors and gather statistics. Generally, they have the same Google Analytics ID across all subdomains of a root domain. This means we can perform a reverse search and find all the subdomains having the same ID. Hence, it helps us in the enumeration process. 4 | 5 | Most people might be familiar with a browser extension called [**BuiltWith**](https://builtwith.com/toolbar)**.** But using this extension or its website is a manual process. We need some sort of command-line utility. That's when **AnalyticsRelationships** comes to the rescue. 6 | 7 | ## Tool: 8 | 9 | ### [AnalyticsRelationships](https://github.com/Josue87/AnalyticsRelationships) 10 | 11 | * **Author**: [Josué Encinar](https://github.com/Josue87) 12 | * **Language**: Go/Python 13 | 14 | **AnalyticsRelationships** is a tool to enumerate subdomains via the Google Analytics ID. It does not require any login and has the capability to bypass the [BuiltWith ](https://builtwith.com/)& [HackerTarget ](https://hackertarget.com/)captchas. This tool is available in 2 languages, Python & Go, but the Go one is faster than the Python one. 15 | 16 | ### Installation: 17 | 18 | ```bash 19 | git clone https://github.com/Josue87/AnalyticsRelationships.git 20 | cd AnalyticsRelationships/GO 21 | go build -ldflags "-s -w" 22 | ``` 23 | 24 | ### Running: 25 | 26 | * The output may contain false positives. 27 | * Also, you need to further DNS resolve the results in order to get the valid ones. 28 | 29 | ```bash 30 | ./analyticsrelationships --url https://www.bugcrowd.com 31 | ``` 32 | 33 | ![](../.gitbook/assets/googlenalytics.png) 34 | 35 | 36 | 37 | -------------------------------------------------------------------------------- /active-enumeration/js-file-scraping.md: -------------------------------------------------------------------------------- 1 | # Scraping\(JS/Source code\) 2 | 3 | ## Source Code Recon 4 | 5 | JavaScript files are used by modern web applications to provide dynamic content, and they contain various functions & events. Almost every website includes JS files, and they are a great resource for finding internal subdomains used by the organization. 6 | 7 | ## Tools: 🛠 8 | 9 | ### 1\) [Gospider](https://github.com/jaeles-project/gospider) 10 | 11 | * **Author**: [Jaeles](https://github.com/jaeles-project) 12 | * **Language**: Go 13 | 14 | [**Gospider**](https://github.com/jaeles-project/gospider) is a fast web spidering tool capable of crawling a whole website in a short amount of time.
This means gospider will visit/scrape each and every URL mentioned in the JS files and source code. Since source code & JS files make up a website, they may contain links to other subdomains too. 15 | 16 | ### Installation: 17 | 18 | ```bash 19 | go get -u github.com/jaeles-project/gospider 20 | ``` 21 | 22 | **This is a long process, so brace yourself !!!** 💪 23 | 24 | ### Running: 25 | 26 | This process is divided into 3⃣ steps: 27 | 28 | ### 1\) Web probing subdomains 29 | 30 | * Since we are crawling websites, gospider expects us to provide URLs, i.e. in the form `http://` or `https://` 31 | * So first, we need to web probe all the subdomains we have gathered till now. For this purpose, we will use [**httpx**](https://github.com/projectdiscovery/httpx) . 32 | * So, let's first web probe the subdomains: 33 | 34 | ```bash 35 | cat subdomains.txt | httpx -random-agent -retries 2 -no-color -o probed_tmp_scrap.txt 36 | ``` 37 | 38 | * Now that we have web-probed URLs, we can send them to gospider for crawling. 39 | 40 | ```bash 41 | gospider -S probed_tmp_scrap.txt --js -t 50 -d 3 --sitemap --robots -w -r > gospider.txt 42 | ``` 43 | 44 | {% hint style="danger" %} 45 | **Caution**: This generates huge traffic on your target 46 | {% endhint %} 47 | 48 | #### Flags: 49 | 50 | * **S** - Input file 51 | * **js** - Find links in JavaScript files 52 | * **t** - Number of threads \(Run sites in parallel\) \(default 1\) 53 | * **d** - Depth \(a depth of 3 means scraping links from second-level JS files\) 54 | * **sitemap** - Try to crawl sitemap.xml 55 | * **robots** - Try to crawl robots.txt 56 | 57 | ![](../.gitbook/assets/gospider.png) 58 | 59 | ### 2\) Cleaning the output 60 | 61 | > The path portion of a URL shouldn't have more than 2048 characters, so we first drop any overly long lines from the gospider output: 62 | > 63 | > ```bash 64 | > sed -i '/^.\{2048\}./d' gospider.txt 65 | > ``` 66 | 67 | The point to note here is that so far we have URLs from JS files & source code, but we are only concerned with subdomains. Hence we must extract just the subdomains from the gospider output. 68 | 69 | This can be done using Tomnomnom's [**unfurl**](https://github.com/tomnomnom/unfurl) tool. It takes a list of URLs as input and extracts the subdomain/domain part from them. 70 | You can install **Unfurl** using this command `go get -u github.com/tomnomnom/unfurl` 71 | 72 | ```bash 73 | cat gospider.txt | grep -Eo 'https?://[^ ]+' | sed 's/]$//' | unfurl -u domains | grep ".example.com$" | sort -u > scrap_subs.txt 74 | ``` 75 | 76 | **Break down of the command:** 77 | **a\)** grep - Extract the links that start with http/https 78 | **b\)** sed - Remove " \] " at the end of line 79 | **c\)** unfurl - Extract domain/subdomain from the urls 80 | **d\)** grep - Only select subdomains of our target 81 | **e\)** sort - Avoid duplicates and write the result to scrap_subs.txt 82 | 83 | ### 3\) Resolving our target subdomains 84 | 85 | * Now that we have all the subdomains of our target, it's time to DNS resolve them and check for valid subdomains. 86 | 87 | \(hoping you have seen the previous techniques and know how to run puredns\) 88 | 89 | ```bash 90 | puredns resolve scrap_subs.txt -w scrap_subs_resolved.txt -r resolvers.txt 91 | ``` 92 | 93 | I love this technique, as it also finds hidden Amazon S3 buckets used by the organization. If such buckets are open and expose sensitive data, it's a WIN-WIN situation for us. 94 | Also, the output of this step can be sent to a tool like SecretFinder, which can find hidden secrets, exposed API tokens, etc.
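Since the crawl output often references cloud storage, a rough follow-up (the grep pattern and the `s3_buckets.txt` filename are only suggestions, not part of the original workflow) is to pull S3 bucket references straight out of `gospider.txt`:

```bash
# Extract anything that looks like an S3 bucket reference from the crawl output.
# The pattern is illustrative, not exhaustive (it misses some S3 URL styles).
grep -Eoi '[a-z0-9.-]+\.s3[a-z0-9.-]*\.amazonaws\.com|s3\.amazonaws\.com/[a-z0-9._-]+' gospider.txt | sort -u > s3_buckets.txt
```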
95 | 96 | ![](../.gitbook/assets/copy-of-copy-of-copy-of-webscraping_meme.png) 97 | 98 | -------------------------------------------------------------------------------- /active-enumeration/other-methods.md: -------------------------------------------------------------------------------- 1 | # Other methods 2 | 3 | -------------------------------------------------------------------------------- /active-enumeration/permutation-alterations.md: -------------------------------------------------------------------------------- 1 | # Permutation/Alterations 2 | 3 | It is almost similar to the previous DNS wordlist bruteforcing but instead of simply performing a dictionary attack we generate combinations/permutations of the already known subdomains. 4 | 5 | One more thing to be noted here is, we also need a small wordlist with us in this method, which would contain common words like `mail` , `internal`, `dev`, `demo`, `accounts`, `ftp`, `admin`(similar to DNS bruteforcing but smaller) 6 | 7 | For instance, let's consider a subdomain **`dev.example.com`** . Now we will generate different variations/permutations of this domain. 8 | 9 | ![](../.gitbook/assets/permutations.png) 10 | 11 | Isn't it good that we can generate such great combinations? This is the power of permutation bruteforcing. Now that we have generated these combinations, we further need to DNS resolve them and check if we get any valid subdomains. If so it would be a WIN ! WIN ! 🏁 situation for us. 12 | 13 | ## Tools: 14 | 15 | ### [Gotator](https://github.com/Josue87/gotator) 16 | 17 | * **Author:** [Josué Encinar](https://github.com/Josue87) 18 | * **Language**: Go 19 | 20 | Gotator is DNS wordlist generator tool. It is used to generate various combinations or permutations of a root domain with the user-supplied wordlist. Capable of generating 1M combinations in almost 2 secs. 21 | 22 | ### Features: 23 | 24 | * Permute numbers up and down (**dev2** --> `dev0`, `dev1`, `dev2`, `dev3`,`dev4`) 25 | * 3 levels of depth (**`dev`.`demo`.`admin`.**example.com) 26 | * Controls generation of duplicate permutations 27 | * Option to add external permutation list 28 | * Option to permute amongst the subdomains 29 | 30 | ### Installation: 31 | 32 | ```bash 33 | go install -v github.com/Josue87/gotator@latest 34 | ``` 35 | 36 | ### Running: 37 | 38 | * First, we need to make a combined list of all the subdomains(valid/invalid) we collected from all the above steps whose permutations we will create. 39 | * To generate combinations you need to provide a small wordlist that contains common domain names like admin, demo, backup, api, ftp, email, etc. 40 | * [This](https://gist.githubusercontent.com/six2dez/ffc2b14d283e8f8eff6ac83e20a3c4b4/raw) is a good wordlist of 1K permutation words that we will need. 41 | * The below command generates a huge list of non-resolved subdomains. 42 | 43 | ``` 44 | gotator -sub subdomains.txt -perm permutations_list.txt -depth 1 -numbers 10 -mindup -adv -md > gotator1.txt 45 | ``` 46 | 47 | #### Flags: 48 | 49 | * **sub** - Specify subdomain list 50 | * **perm** - Specify permutation/append list 51 | * **depth** - Configure the depth 52 | * **numbers** - Configure the number of iterations to the numbers found in the permutations (up and down) 53 | * **mindup** - Set this flag to minimize duplicates. (For heavy workloads, it is recommended to activate this flag). 54 | * **md** - Extract 'previous' domains and subdomains from subdomains found in the list 'sub'. 55 | * **adv** - Advanced option. 
Generate permutations words with subdomains and words with -. And joins permutation word in the back 56 | 57 | ![](../.gitbook/assets/Gotator.png) 58 | 59 | ### Resolution: 60 | 61 | * Now that we have made a huge list of all the possible subdomains that could exist, now it's time to DNS resolve them and check for valid ones. 62 | * For this, we will again use [Puredns](https://github.com/d3mondev/puredns). 63 | * It's always better to generate fresh public DNS resolvers every time we use them. 64 | 65 | ```bash 66 | puredns resolve permutations.txt -r resolvers.txt 67 | ``` 68 | 69 | **Flags:** 70 | 71 | * **resolve** - Use resolution mode 72 | * **r** - public DNS resolvers list 73 | 74 | In such a way, we have got those strange name subdomains and increased our attack surface. 75 | 76 | ### 77 | -------------------------------------------------------------------------------- /active-enumeration/tls-csp-cname.md: -------------------------------------------------------------------------------- 1 | # TLS, CSP, CNAME Probing 2 | 3 | ## 1) TLS Probing 4 | 5 | Nowadays generally all websites communicate over HTTPS(HyperText Transfer Protocol Secure). In order to use HTTPS, the website owner needs to issue an SSL(Secure Socket Layer) certificate. 6 | 7 | This SSL/TLS(Transport Layer Security) certificate contains hostname belonging to the same organization. 8 | 9 | Clicking on the "Lock🔒" button in the address bar, you can view the TLS/SSL certificate of any website. 10 | 11 | ![Hackerone.com contain these subdomains in its TLS certificate](../.gitbook/assets/TLscert\_yahoo.png) 12 | 13 | 14 | 15 | For this purpose, we will be using a tool called [**Cero**](https://github.com/glebarez/cero) 16 | 17 | #### Installation: 18 | 19 | ```bash 20 | go install github.com/glebarez/cero@latest 21 | ``` 22 | 23 | #### Running: 24 | 25 | ```bash 26 | cero in.search.yahoo.com | sed 's/^*.//' | grep -e "\." | sort -u 27 | ``` 28 | 29 | ![](../.gitbook/assets/ceroo\(1\).png) 30 | 31 | ## 2) CSP Probing 32 | 33 | In order to defend from the XSS attacks as well as keeping in mind to allow cross-domain resource sharing in websites CSP(Content Security Policies) are used. These CSP headers sometimes contain domains/subdomains from where the content is usually imported. 34 | 35 | Hence, these subdomains can be helpful for us. In the below image we can see I extracted domains/subdomains from the CSP header of [twitter.com](https://twitter.com) 36 | 37 | ``` 38 | cat subdomains.txt | httpx -csp-probe -status-code -retries 2 -no-color | anew csp_probed.txt | cut -d ' ' -f1 | unfurl -u domains | anew -q csp_subdomains.txt 39 | ``` 40 | 41 | ![](../.gitbook/assets/csp.png) 42 | 43 | ## 3) CNAME Probing 44 | 45 | I personally came across 2-3 cases where visiting the CNAME of the website showed me the same website without a firewall. (I personally don't know why this happened) 46 | 47 | Since then I probe the CNAME's of the subdomains found. 48 | 49 | ``` 50 | dnsx -retry 3 -cname -l subdomains.txt 51 | ``` 52 | -------------------------------------------------------------------------------- /active-enumeration/vhost-probing.md: -------------------------------------------------------------------------------- 1 | # VHOST probing 2 | 3 | ## What is Virtual Host? 4 | 5 | VHOST(Virtual Host) refers to the practice of running more than one website (such as `company1.example.com` and `company2.example.com`) on a single machine. 6 | 7 | There are mainly 2 types of Virtual hosts: 8 | 9 | 1. 
**IP-based Virtual Host:** 10 | 11 | In IP-based Virtual Host, we have different IP addresses for every website. 12 | 2. **Name-based Virtual Host:**✔️ 13 | 14 | In named-based Virtual Host, several websites are hosted on the same IP. Mostly this type is widely and preferred in order to preserve IP space. 15 | 16 | But when talking about VHOST we are generally talking about **Named-based Virtual hosts.** 17 | 18 | ### How does this actually work? 19 | 20 | Now, you would be confused about how will the webserver differentiate to which website it has to send my requests since many websites are being hosted on the same server with the same IP. 21 | 22 | It's through the "**Host header**". The web server identifies which content to serve up once it receives the Host header from the client. 23 | 24 | 25 | 26 | ![](<../.gitbook/assets/Vhost bruteforcing.png>) 27 | 28 | ### How to identity VHOST on a single IP? 29 | 30 | For this purpose, we can use a tool called[ HostHunter](https://github.com/SpiderLabs/HostHunter). 31 | 32 | ### [HostHunter](https://github.com/SpiderLabs/HostHunter) 33 | 34 | * **Author**: [SpiderLabs](https://github.com/SpiderLabs) 35 | * **Language**: Python 36 | 37 | #### Installation: 38 | 39 | ``` 40 | git clone https://github.com/SpiderLabs/HostHunter.git 41 | pip3 install -r requirements.txt 42 | ``` 43 | 44 | #### Running: 45 | 46 | ``` 47 | python3 hosthunter.py ip_addresses.txt 48 | ``` 49 | 50 | ![A total of 336 websites are hosted on the same IP](../.gitbook/assets/hosthunter.png) 51 | 52 | 53 | 54 | ## VHOST bruteforcing 55 | 56 | > Sorry to say, I couldn't find appropriate content around the internet related to this topic. Me myself don't use this technique, but yes this is also an technique to discover VHOSTS 57 | 58 | ``` 59 | gobuster vhost -u https://example.com -t 50 -w subdomains.txt 60 | ``` 61 | -------------------------------------------------------------------------------- /automation.md: -------------------------------------------------------------------------------- 1 | # Automation 🤖 2 | 3 | It would be difficult for a person to perform all the above-mentioned techniques. Hence, we need to rely on some kind of tool to automate such intensive steps. Wouldn't it be good if we just had to give our target name **BOOM** !!💥 the tool performs subdomain enumeration via all these techniques? 4 | 5 | ## [ReconFTW](https://github.com/six2dez/reconftw) 6 | 7 | ![](.gitbook/assets/reconftw_logo.png) 8 | 9 | * Author: [six2dez](https://github.com/six2dez) 10 | * Language: Bash 11 | 12 | Yess this tool outperforms the work of subdomain enumeration via **6 unique techniques**. Currently if configured well, gives the most number of subdomains compared to any other open-source tool 🚀 . Let's take a look at the enumeration techniques it performs:- 13 | 14 | 1. **Passive Enumeration \(** [subfinder](https://github.com/projectdiscovery/subfinder), [assetfinder](https://github.com/tomnomnom/assetfinder), [amass](https://github.com/OWASP/Amass), [findomain](https://github.com/Findomain/Findomain), [crobat](https://github.com/cgboal/sonarsearch), [waybackurls](https://github.com/tomnomnom/waybackurls), [github-subdomains](https://github.com/gwen001/github-subdomains), [Anubis](https://jldc.me/), [gauplus](https://github.com/bp0lr/gauplus) and [mildew](https://github.com/daehee/mildew)\) 15 | 2. 
**Certificate transparency** \([ctfr](https://github.com/UnaPibaGeek/ctfr), [tls.bufferover](https://github.com/six2dez/reconftw/blob/main/tls.bufferover.run) and [dns.bufferover](https://github.com/six2dez/reconftw/blob/main/dns.bufferover.run)\) 16 | 3. **Bruteforce** \([puredns](https://github.com/d3mondev/puredns)\) 17 | 4. **Permutations** \([DNScewl](https://github.com/codingo/DNSCewl)\) 18 | 5. **JS files & Source Code Scraping** \([gospider](https://github.com/jaeles-project/gospider), [analyticsRelationship](https://github.com/Josue87/analyticsRelationship)\) 19 | 6. **DNS Records** \([dnsx](https://github.com/projectdiscovery/dnsx)\) 🤖 20 | 21 | ### Installation: 22 | 23 | * The [installer](https://github.com/six2dez/reconftw/blob/main/install.sh) script installs all the required dependencies and tools required. 24 | 25 | ```bash 26 | git clone https://github.com/six2dez/reconftw 27 | cd reconftw/ 28 | ./install.sh 29 | ``` 30 | 31 | ### Running ReconFTW: 32 | 33 | * ReconFTW has a `-s` flag that performs subdomain enumeration & web probing. 34 | * Out of all the 6 techniques if we want to skip any step we can do it through its [config](https://github.com/six2dez/reconftw/blob/main/reconftw.cfg#L49) file. Just set the value of a particular function to `false` 35 | * Also, you can provide your own wordlist for bruteforcing by specifying them in the reconftw config file. 36 | * Highly recommended that you run this tool in a VPS. 37 | 38 | ```bash 39 | ./reconftw.sh -d example.com -s 40 | ``` 41 | 42 | **Flags:** 43 | 44 | * **d** - target domain 45 | * **s** - Perform subdomain enumeration 46 | 47 | {% hint style="success" %} 48 | **Tip**: 🧙♂ Using `--deep` mode will run more time taking steps but return more subdomains 49 | {% endhint %} 50 | 51 | 52 | 53 | 54 | ## Taking Subdomain Enumeration to next level 🚀 🚀 55 | 56 | The biggest fear while performing subdomain enumeration is that the public DNS resolvers we are using should give us a ban/timeout as we are querying them at a high rate for a prolonged period of time. Since we would be querying the public resolvers using our single VPS IP address they might give us a ban. But what we perform the same task by distributing the workload amongst several VPS instances? The chances of a ban would be less right? Also, the execution time would be considerably less right? 57 | 58 | That's when **Axiom** comes to the rescue. 59 | 60 | ### [Axiom](https://github.com/pry0cc/axiom) 🤍 61 | 62 | * Author: [pyrocc](https://github.com/pry0cc) 63 | * Language: Bash 64 | * Supports: Digital Ocean, Linode, Azure, GCP, IBM 65 | 66 | Axiom is a dynamic infrastructure that helps to distribute the workload of a single task equally among several cloud instances. A perfect tool while performing mass recon. You will first need to install Axiom on your VPS/system from where you will be able to spin up/down the cloud instances. 67 | 68 | ### How does axiom work? 69 | 70 | Let's consider want to perform DNS bruteforcing. For this first, you will need to initialize a fleet of instances. This can be any number of instances you want/authorize to make. Within a matter of 4-5 minutes that many instances would be initialized and ready to accept your commands. 71 | 72 | #### Steps: 73 | 74 | 1. Divide the bruteforce wordlist into equal number\(total number of instances\) of parts 75 | 2. Transfer each part to the respective instances 76 | 3. Perform standalone execution in separate instances 77 | 4. Merge the output results from all instances 78 | 5. 
Create a single output 79 | 80 | ### Installation: 81 | 82 | * Axiom has an interactive installer that will first ask for your cloud provider, API key, which provision to install, which region to choose, default instance size, etc. 83 | 84 | ```bash 85 | git clone https://github.com/pry0cc/axiom ~/.axiom/ 86 | $HOME/.axiom/interact/axiom-configure 87 | ``` 88 | 89 | #### 90 | 91 | #### Some stats: 📊 92 | 93 | | **Task** | **Axiom** \(15 instances\) ✅ | **Single VPS** \(4cpu/8gb\) | 94 | | :--- | :--- | :--- | 95 | | DNS bruteforcing \(11M wordlist\) | 1m 16s | 10m 28s | 96 | | Web probing \(50k subdomains\) | 1m 40s | 21m 22s | 97 | 98 | 99 | 100 | ![](.gitbook/assets/axiomxreconftw.png) 101 | 102 | Yes, it's possible to integrate **Axiom** in **ReconFTW**. Isn't that great? Do try this out !!😍 103 | 104 | ### Usage: 105 | 106 | * It's necessary to first install ReconFTW first on your controller/main system and then install/setup axiom. 107 | * Before running ReconFTW over Axiom it's recommended that you first initialize your fleet. 108 | * The thing to note here is to run ReconFTW over Axiom you have to use another script called `reconftw_axiom.sh` 109 | 110 | ```bash 111 | axiom-fleet testy -i=30 112 | axiom-select 'testy*' 113 | ./reconftw_axiom.sh -d example.com -s 114 | ``` 115 | 116 | 117 | 118 | 119 | 120 | #### Liked my work? Don't hesitate to buy me a coffee XDD 121 | 122 | #### ❤💙💚 [https://www.buymeacoffee.com/siddheshparab](https://www.buymeacoffee.com/siddheshparab) 💚 💙 ❤ 123 | 124 | -------------------------------------------------------------------------------- /introduction/prequisites.md: -------------------------------------------------------------------------------- 1 | # Prerequisites 2 | 3 | ### What things do we need before performing a great enumeration? 4 | 5 | 1. **API keys of Passive DNS sources** 6 | 2. **100% accurate open public DNS resolvers** 7 | 3. **A VPS(Virtual Private Server)** 8 | 9 | 10 | 11 | ## 1) API keys for Passive DNS data :key: 12 | 13 | ### What is Passive DNS data? 14 | 15 | Whenever a domain is alive on the internet, to access it, a DNS query needs to be made to the DNS resolver. With special probes activated on the DNS resolver, it is possible to record these queries and store them into a database. This doesn't record which client made the request but, just the fact that at some point a domain has been associated with a specific DNS record. 16 | 17 | Hence, we can know, what subdomains of a particular root domain existed with the help of these DNS record database. These subdomains in the present time may be alive or dead. (we need further find out which are the valid ones). There exists various services/companies that are doing this work for past several years. Along with this, various companies have their internet crawlers which continuously keep on crawling the whole internet and discover new domains. 18 | 19 | There are a number of services/sources([SecurityTrails](https://securitytrails.com/), [Censys](https://censys.io/), [Shodan](https://www.shodan.io/), [VirusTotal](https://www.virustotal.com/), [WhoisXMLAPI](https://www.whoisxmlapi.com/), [DNSDB](https://www.farsightsecurity.com/tools/dnsdb-scout/)) that provide such historical DNS data & crawled data. These services provide their API keys so that we can query and retrieve subdomains of our choice. 
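To get a feel for what this passive DNS data looks like, you can query one of these sources directly. Below is a minimal sketch using SecurityTrails — the endpoint path, `APIKEY` header and `subdomains` response field are based on their public API docs as I recall them, so treat the details as assumptions and verify them against the current documentation:

```bash
# Pull passive-DNS subdomain records for example.com from SecurityTrails
# (assumes $SECURITYTRAILS_KEY holds your API key)
curl -s "https://api.securitytrails.com/v1/domain/example.com/subdomains" \
     -H "APIKEY: $SECURITYTRAILS_KEY" \
  | jq -r '.subdomains[] + ".example.com"' \
  | sort -u > securitytrails_subs.txt
```

In practice you rarely query sources by hand like this — tools such as amass and subfinder (covered later) query every configured source for you — but it helps to understand what those tools are doing under the hood.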
20 | 21 | ### Configuring API keys: 22 | 23 | **There are 2 types of passive DNS services:-** 24 | 25 | **1) Allow querying their Datasets freely(partially)**\ 26 | A number of sources allow users to freely query their DNS datasets. Check out which sources allow to freely query their dataset here. (we don't need to care about these sources as our subdomain enumeration tools like [**amass**](https://github.com/OWASP/Amass), [**subfinder**](https://github.com/projectdiscovery/subfinder), [**assetfinder**](https://github.com/tomnomnom/assetfinder) will query them and do the work for us:yum: ) 27 | 28 | **2) Need to generate API keys to query Datasets**\ 29 | Also, a number of sources require you to signup on to their platform and generate a unique API key for yourself so that you are authorized to query and retrieve results from their historical DNS datasets. 30 | 31 | ### Problems with obtaining free API keys of good passive sources: 32 | 33 | * Good passive sources provide API keys for a limited period. (7 days/20 days) 34 | * They provide a limited amount of API query quota. (50 per day/1000 per month ) 35 | * Limited query results (2 pages of data) 36 | 37 | ### Is it worth making API keys? 38 | 39 | * Yes, absolutely, given below is the comparison between running [**Subfinder** ](https://github.com/projectdiscovery/subfinder)with API keys configured and without. 40 | * You can clearly see the difference that using API keys gave me a whopping **198,669 more**💥 subdomains. 41 | * Further, this passive data would be used to generate permutation/alterations which eventually would give us more subdomains. 42 | 43 | {% tabs %} 44 | {% tab title="Without API keys " %} 45 |
46 | {% endtab %} 47 | 48 | {% tab title="With API keys" %} 49 |
50 | {% endtab %} 51 | {% endtabs %} 52 | 53 | ### How much time does it take to sign up and obtain API keys? 54 | 55 | * There are in total 19 services on which you can sign up and obtain API keys. 56 | * I have created a detailed excel sheet covering which sources to sign up for, the validity of each API key, their API query quota, rate limits, etc. 57 | * Depending on your consumption of API queries and the validity of the API keys, you need to keep creating new accounts at regular intervals in order to get the maximum results. 58 | 59 |
60 | 61 | :man\_tipping\_hand: **Check out the excel sheet** :point\_right: [**here** ](https://docs.google.com/spreadsheets/d/19lns4DUmCts1VXIhmC6x-HaWgNT7vWLH0N68srxS7bI/edit?usp=sharing):point\_left: 62 | 63 | {% hint style="success" %} 64 | **More the time** you invest in signing up with passive sources, **More the subdomains** you get ✨ 65 | {% endhint %} 66 | 67 | ## 68 | 69 | ## 2) 100% accurate open public DNS resolvers 70 | 71 | ### What is a DNS resolver? 72 | 73 | A DNS(Domain Name System) resolver is a service that manages "name to IP address" translations. The process of DNS resolution involves converting a hostname (such as www.example.com) into a computer-friendly IP address (such as 192.168.1.1). In short, if we need to know whether a domain/host is alive or not; we would need to perform a DNS query. 74 | 75 | ### Why do we need a public DNS resolvers list? 76 | 77 | During various subdomain enumeration techniques like DNS bruteforcing or DNS resolution of a large number of domains, we use a base tool called [MassDNS](https://github.com/blechschmidt/massdns). MassDNS is a simple high-performance tool that is used to check whether a given domain is valid or not. For this purpose, MassDNS needs to be provided with a list of public DNS resolvers. These public resolvers perform DNS queries on our behalf and returns the result. Hence, more the number of public resolvers we provided, the more concurrent DNS queries can be made and thus quicker the output.\ 78 | 79 | 80 | ### How can we create a list of valid public resolvers? 81 | 82 | [**Dnsvalidator**](https://github.com/vortexau/dnsvalidator) is a tool that helps us to verify/generate a valid list of open public DNS resolvers. The [https://public-dns.info](https://public-dns.info/) is a website that includes a list of around **28.7K** open public DNS resolvers. But some of these resolvers wouldn't be working for us. So, dnsvalidator helps us to verify and return only the valid open public DNS resolvers. Dnsvalidator takes this list of 28.7K public resolvers and queries for its resolution using trusted resolvers like Google DNS(8.8.8.8), Cloudflare DNS(1.1.1.1), Quad9(9.9.9.9). 83 | 84 | ```bash 85 | git clone https://github.com/vortexau/dnsvalidator.git 86 | cd dnsvalidator/ 87 | python3 setup.py install 88 | dnsvalidator -tL https://public-dns.info/nameservers.txt -threads 100 -o resolvers.txt 89 | ``` 90 | 91 |

This process takes around 2 hours and yields roughly 13k valid resolvers.

92 | 93 | ### \[Alternative]💁‍♂️ 94 | 95 | The above method of using dnsvalidator to populate a list of valid public DNS resolver is too much time and resource consuming. Hence, we can depend upon open-source contributions by other researcher for populating the list of DNS resolvers. Various security researchers/companies have created automations that run dnsvalidator periodically(every 24hrs). One can benefit from such contributions. Below is the periodically verified list of DNS resolvers by [Trickest](https://trickest.com/). 96 | 97 | ``` 98 | wget https://raw.githubusercontent.com/trickest/resolvers/main/resolvers.txt 99 | mv resolvers.txt public_resolvers.txt 100 | ``` 101 | 102 | ## 103 | 104 | ## 3) A VPS (_Most Preferable)_ 105 | 106 | ### What is a VPS? 107 | 108 | VPS(Virtual Private Server) can be called as your own dedicated virtual machine in the cloud. 109 | 110 | ### Benefits of a VPS? 111 | 112 | VPS tends to have higher bandwidth and better DNS resolution capabilities as compared to your local home router system which uses ISP's DNS resolver(slow). 113 | 114 | A VPS helps while performing various bandwidth-intensive tasks such as DNS resolution & brute-forcing. Alongside when performing such tasks on a local system generally blows up your wifi making it unusable for other users connected on the same network. 115 | 116 | Also, with VPS you can perform tasks 24/7; 365 days, unlike your local system. 117 | 118 | ### How to get one? 119 | 120 | There are various cloud providers that provide **free 200$ credits** like [Digital Ocean](https://www.digitalocean.com/), [Linode](https://www.linode.com/), [Vultr](https://www.vultr.com/) when you signup for the first time. (**C**redit**C**ard with international transac enabled required)\ 121 | **Referral Links:** 122 | 123 | [![DigitalOcean Referral Badge](https://web-platforms.sfo2.digitaloceanspaces.com/WWW/Badge%203.svg)](https://www.digitalocean.com/?refcode=9961f826b4d3\&utm\_campaign=Referral\_Invite\&utm\_medium=Referral\_Program\&utm\_source=badge) 124 | 125 | * [**Linode**](https://www.linode.com/?r=3e95d563ede9af9901189e9476917c9338b7108f) 126 | * [**Vultr**](https://www.vultr.com/?ref=8905902) 127 | 128 | \ 129 | 130 | 131 | 132 | 133 | -------------------------------------------------------------------------------- /introduction/whats-the-need.md: -------------------------------------------------------------------------------- 1 | # What's the need ?🤔 2 | 3 | ## **What is subdomain enumeration?** 4 | 5 | It is one of the most crucial parts of the reconnaissance phase while performing a security assessment. **Subdomain Enumeration** is a process of finding sub-domains associated to the root domain. According to [RFC 1034](https://tools.ietf.org/html/rfc1034), "_a domain is a subdomain of another domain if it is contained within that domain_". 6 | 7 | ![](../.gitbook/assets/subdomains.png) 8 | 9 | ## What's the need? 10 | 11 | * Performing subdomain enumeration via various intensive techniques can help enlarge your attack surface, as you get more assets to find vulnerabilities on. 12 | * A good subdomain enumeration will help you find those hidden/untouched subdomains, resulting lesser people finding bugs on that particular domain. Hence lesser **duplicates**. 13 | * Finding applications running on hidden, forgotten(by the organization) sub-domains may lead to uncovering critical vulnerabilities. 14 | * Discovering such strangely named subdomains is a critical skill, each bug hunter should possess in today's time. 
* For large organizations, it helps to find out what services they have exposed to the internet while performing an internal pentest. 15 | 16 | 17 | 18 | 19 | {% hint style="success" %} 20 | **More the subdomains = More assets to look for vulnerabilities**:lady\_beetle: 21 | {% endhint %} 22 | 23 | ## :warning: Common Misconception about "subdomain" 24 | 25 | A Fully Qualified Domain Name (**FQDN**) is the complete domain name for a specific computer, or host, on the internet. 26 | 27 | An FQDN looks like this:- 28 | 29 | `myhost.example.com.` **---->** Fully Qualified Domain Name 30 | 31 | `myhost` **---->** is the host located within the domain example.com (subdomain) 32 | 33 | 34 | 35 | **Hence;**\ 36 | [**https://**example.com](https://example.com)\ 37 | [**http://**myhost.example.com](http://myhost.example.com)\ 38 | [**https://**internal.accounts.example.com ](https://internal.accounts.example.com)\ 39 | [**http://**internal.accounts.dashboard.example.com](https://internal.accounts.dashboard.example.com) 40 | 41 | The above-mentioned **cannot** be called subdomains. They are hyperlinks to web applications hosted on the respective hosts. Most people have a misconception that these are subdomains of a particular target. 42 | 43 | Let's consider an example: **`admin.example.com`** is a subdomain on which no web service is hosted. This means that when we send web probes to `admin.example.com` using [httpx](https://github.com/projectdiscovery/httpx)/[httprobe ](https://github.com/tomnomnom/httprobe)**(**tools that check whether any web service is running on that host), it will not return any output. 44 | 45 | This doesn't mean that `admin.example.com` is not a valid subdomain of the root domain `example.com`. There may exist other services like SSH, SMTP, SMB, WinRM (non-web) hosted on that subdomain that cannot be accessed through your web browser. Surprisingly, these services may be vulnerable and their exploits may be publicly available. \ 46 | \ 47 | So in such a case, it's always better that you **DNS resolve** the subdomains gathered from passive enumeration to get the valid ones. Later you can send the valid/alive subdomains for **web probing** and find out the web applications hosted on them. 48 | 49 | ### **Moral of the story:** 50 | 51 | The methodology of collecting subdomains from tools like amass, subfinder, findomain and directly sending them to httpx/httprobe is **absolutely wrong**:x:. Instead, you should first DNS resolve them using tools like [puredns ](https://github.com/d3mondev/puredns)or [shuffledns](https://github.com/projectdiscovery/shuffledns). 52 | 53 | \ 54 | 55 | 56 | \ 57 | 58 | 59 | 60 | 61 | -------------------------------------------------------------------------------- /passive-enumeration/certificate-logs.md: -------------------------------------------------------------------------------- 1 | # Certificate Logs 2 | 3 | ## What are SSL/TLS certificates? 4 | 5 | SSL/TLS certificates are obtained to help a website move from "HTTP" to "HTTPS", which is more secure. The certificate is trusted by both the domain presenting it and the clients that use it to encrypt their communications with the domain’s services. To obtain such a certificate, we need to request it from a CA (Certificate Authority). 6 | 7 | 8 | 9 | ## 1) Certificate Transparency(CT) Logs 10 | 11 | #### What was the need to implement Certificate Transparency Logs? 12 | 13 | Before 2013, CAs (Certificate Authorities) faced various breaches.
Due to such breaches, anyone could maliciously create a forged certificate of the domain owner and gain the trust of the end-user. Also, CA didn't perform proper verifications if the requester is an authorized person of the domain. Hence, there was a need to create a central repository to maintain transparency amongst all. 14 | 15 | #### What is Certificate Transparency Log? 16 | 17 | Google came up with a unique solution to this problem, by introducing [**Certificate Transparency logs**](https://certificate.transparency.dev/). This means that all the certificates issued by the CA would be appended to a common public list. Having a transparent logs of all issued certificates is a great solution for solving the problem of fraudulent certificate issuing, as legitimate domain owners have the ability to spot certificates issued without their consent. 18 | 19 | #### How we can abuse CT logs? 20 | 21 | Since every time an organization gets an SSL certificate it gets logged in these CT logs, they can be abused easily. As anyone can query them, thus can be utilized to enumerate the subdomains of a root domain that have an accompanying TLS certificate. 22 | 23 | We can find all SSL certificates belonging to a domain by issuing a GET request to [**https://crt.sh/?q=%25.dell.com**](https://crt.sh/?q=%25.dell.com) 24 | 25 | ![Screenshot from crt.sh](../.gitbook/assets/crt.png) 26 | 27 | As you can see we got a list of subdomains. 28 | 29 | 30 | 31 | ## 🔧Tool: 32 | 33 | ### CTFR 34 | 35 | * **Author**: [UnaPibaGeek](https://github.com/UnaPibaGeek) 36 | * **Language**: Python 37 | 38 | [**CTFR** ](https://github.com/UnaPibaGeek/ctfr) is a python based tool that helps to grabs all the subdomains for our target domains using Certificate Transparency(CT) logs. CTFR queries [**crt.sh**](https://crt.sh/) website and retrieves subdomains of our mentioned domain. 39 | 40 | **Installation:** 41 | 42 | ```bash 43 | git clone https://github.com/UnaPibaGeek/ctfr.git 44 | cd ctfr/ 45 | pip3 install -r requirements.txt 46 | ``` 47 | 48 | **Running CTFR:** 49 | 50 |
51 | 52 | ``` 53 | python3 ctfr.py -d target.com -o output.txt 54 | ``` 55 | 56 | **Flags:** 57 | 58 | * **d** - Target domain 59 | * **o** - Output file 60 | 61 | 62 | 63 | 64 | 65 | ## 2) tls.bufferover.run 66 | 67 | [**tls.bufferover.run**](https://tls.bufferover.run/) is a service that scans the whole IPv4 address space and grabs all the necessary data from the TLS certificates of those hosts. These TLS certificates include a field called "**Subject**" that holds useful information from our perspective. The Subject field contains a component called "**CommonName (CN)**" which indicates the Fully Qualified Domain Name (FQDN) of that host. So, we can leverage this to look for subdomains of our target. But in order to query this service, we must first create an API key. 68 | 69 | 70 | 71 | #### Creating API key: 72 | 73 | * In order to get the results, we need to create an API key for this service. 74 | * Visit [https://tls.bufferover.run/](https://tls.bufferover.run/) and enter your email ID for the "Free-Tier" plan. 75 | * You will instantly get the API key in your email inbox. 76 | 77 | #### Querying for results: 78 | 79 | * Replace the target and API key with yours. 80 | 81 | ```bash 82 | curl 'https://tls.bufferover.run/dns?q=.dell.com' -H 'x-api-key: TYvnAWCtsmJKTjcYs9bE91aNs8GZZMo5lCX3i06a' | jq -r .Results[] | cut -d ',' -f5 | grep -F ".dell.com" | sort -u -o output.txt 83 | ``` 84 | 85 | -------------------------------------------------------------------------------- /passive-enumeration/passive-sources.md: -------------------------------------------------------------------------------- 1 | # Passive Sources 2 | 3 | ### What is passive subdomain enumeration? 4 | 5 | Passive subdomain enumeration is a technique to query the passive DNS datasets provided by services like [SecurityTrails](https://securitytrails.com/), [Censys](https://censys.io/), [Shodan](https://www.shodan.io/), [BinaryEdge](https://www.binaryedge.io/), [VirusTotal](https://www.virustotal.com/gui/), [Whoisxmlapi](https://main.whoisxmlapi.com/), etc. to obtain the subdomains of a particular target. Here we don't send any active probes to our target; instead, we passively scrape information already available on the internet. 6 | 7 | There are in total around[ **90** **passive DNS sources/services**](https://gist.github.com/sidxparab/22c54fd0b64492b6ae3224db8c706228) that provide such datasets to query. It's difficult to query these third-party services manually, so to ease the process various tools have been developed that automate it. 8 | 9 | {% hint style="warning" %} 10 | It's highly recommended to read [**this**](https://app.gitbook.com/@sidxparab/s/subdomain-enumeration-guide/introduction/prequisites#what-is-passive-dns-data) section first, before proceeding further. 11 | {% endhint %} 12 | 13 | 1. **Passive DNS enumeration tools** 14 | * [Amass](https://github.com/OWASP/Amass) 15 | * [Subfinder](https://github.com/projectdiscovery/subfinder) 16 | * [Assetfinder](https://github.com/tomnomnom/assetfinder) 17 | * [Findomain](https://github.com/Findomain/Findomain) 18 | 2. **Internet Archive** 19 | * [gau](https://github.com/lc/gau) 20 | * [waybackurls](https://github.com/tomnomnom/waybackurls) 21 | 3. **Github Scraping** 22 | * [github-subdomains](https://github.com/gwen001/github-subdomains) 23 | 4.
**GitLab Scraping** 24 | * [gitlab-subdomains](https://github.com/gwen001/gitlab-subdomains) 25 | 26 | 27 | 28 | 29 | 30 | ## A) Passive DNS gathering tools 31 | 32 | ### 1) Amass 33 | 34 | * **Author:** [OWASP](https://github.com/OWASP) (mainly [caffix](https://github.com/caffix)). 35 | * **Language**: Go 36 | * **Total Passive Sources**: **82** 37 | 38 | [**Amass** ](https://github.com/owasp-amass/amass)is a Swiss army knife for subdomains enumeration that outperforms passive enumeration the best. Amass queries the most number of third-party services which results in more subdomains of a particular target. [**These**](https://gist.github.com/sidxparab/e625a264322e4c9db3c3f1844b4a00b6) are passive services that amass queries. 39 | 40 | ### :gear: Configuring amass: 41 | 42 | * Since amass written in Go, you need your Go environment properly set up([Steps](https://gist.github.com/sidxparab/e3856c5e27b8a9b27b5b4911eb9e4ae6) to setup Go environment) 43 | 44 | **Installation:** 45 | 46 | ```bash 47 | go install -v github.com/owasp-amass/amass/v3/...@master 48 | ``` 49 | 50 | **Setting up Amass config file:** 51 | 52 | * [**Link**](https://gist.github.com/sidxparab/b4ffb99c98136dc4a238cbb88a77f642) to my amass config file for reference. 53 | * To make it possible for Amass to query the passive DNS datasets, it necessary for us to setup the API keys of those services in the Amass configuration file. 54 | * By default, amass config file is located at `$HOME/.config/amass/config.ini` 55 | 56 | {% hint style="info" %} 57 | To get to know to create API keys, check out [**this article**](https://dhiyaneshgeek.github.io/bug/bounty/2020/02/06/recon-with-me/)**.** 58 | {% endhint %} 59 | 60 | * Now let's set up our API keys in the `config.ini`config file. 61 | * Open the config file in a text editor and then uncomment the required lines and add your API keys. 62 | * Refer to [my config file](https://gist.github.com/sidxparab/b4ffb99c98136dc4a238cbb88a77f642)(this is exactly how your amass config file should be) 63 | 64 |
# https://otx.alienvault.com (Free)
 65 | [data_sources.AlienVault]
 66 | [data_sources.AlienVault.Credentials]
 67 | apikey = dca0d4d692a6fd757107333d43d5f284f9a38f245d267b1cd72b4c5c6d5c31
 68 | 
 69 | 
 70 | # How to Add 2 API keys for a single service
 71 | # https://app.binaryedge.com (Free)
 72 | [data_sources.BinaryEdge]
 73 | ttl = 10080
 74 | [data_sources.BinaryEdge.account1]
 75 | apikey = d749e0d3-ff9e-gcd0-a913-b5e62f6f216a
 76 | [data_sources.BinaryEdge.account2]
 77 | apikey = afdb97ff-t65e-r47f-bba7-c51dc5d83347
 78 | 
79 | 80 | ### **Running Amass:** 81 | 82 | * After setting up API keys now we are good to run amass. 83 | 84 | ```bash 85 | amass enum -passive -d example.com -config config.ini -o output.txt 86 | ``` 87 | 88 | **Flags:-** 89 | 90 | * **enum** - Perform DNS enumeration 91 | * **passive** - passively collect information through the data sources mentioned in the config file. 92 | * **config** - Specify the location of your config file (default: `$HOME/.config/amass/config.ini` ) 93 | * **o** - Output filename 94 | 95 | {% hint style="success" %} 96 | :man\_mage:**Tip**: After configuring your config file in order to verify whether the API keys have been correctly set up or not you can use this command:\ 97 | _amass enum -list -config config.ini_ 98 | {% endhint %} 99 | 100 | ### 101 | 102 | ### 2) Subfinder 103 | 104 | * **Author**: [projectdiscovery](https://github.com/projectdiscovery) 105 | * **Language**: Go 106 | * **Total Passive Sources**: **38** 107 | 108 | [**Subfinder** ](https://github.com/projectdiscovery/subfinder)is yet another great tool that one should have in their pipeline. There are some unique sources that subfinder queries for, that amass doesn't. This tool is been developed by the famous ProjectDiscovery team, who's tools are used by every other bugbounty hunter. 109 | 110 | ### :gear:Configuring Subfinder: 111 | 112 | **Installation:** 113 | 114 | ```bash 115 | go install -v github.com/projectdiscovery/subfinder/v2/cmd/subfinder@latest 116 | ``` 117 | 118 | **Setting up Subfinder configuration file:** 119 | 120 | * Subfinder's default config file location is at _`$HOME/.config/subfinder/provider-config.yaml`_ 121 | * After your first installation, if you didn't find the configuration file populated by default run the following command again `subfinder` in order to get it generated. 122 | * The subfinder config file follows YAML(YAML Ain't Markup Language) syntax. So, you need to be careful that you don't break the syntax. It's better that you use a text editor and set up syntax highlighting. 123 | 124 | **Example config file:-** 125 | 126 | * [**Link**](https://gist.github.com/sidxparab/ba50e138e5c912c7c59532ce38399d1b) to my subfinder config file for reference. 127 | * Some passive sources like `Censys` , `PassiveTotal` use 2 keys in combination in order to authenticate a user. For such services, both values need to be mentioned with a colon(:) in between them. _(Check how have I mentioned the "Censys" source values- `APP-id`:`Secret` in the below example )_ 128 | * Subfinder automatically detects its config file only if at the default position. 
129 | 130 | ```yaml 131 | securitytrails: [] 132 | censys: 133 | - ac244e2f-b635-4581-878a-33f4e79a2c13:dd510d6e-1b6e-4655-83f6-f347b363def9 134 | shodan: 135 | - AAAAClP1bJJSRMEYJazgwhJKrggRwKA 136 | github: 137 | - d23a554bbc1aabb208c9acfbd2dd41ce7fc9db39 138 | - asdsd54bbc1aabb208c9acfbd2dd41ce7fc9db39 139 | passivetotal: 140 | - sample-email@user.com:password123 141 | ``` 142 | 143 | ### **Running Subfinder:** 144 | 145 | ```bash 146 | subfinder -d example.com -all -config config.yaml -o output.txt 147 | ``` 148 | 149 | **Flags:-** 150 | 151 | * **d** - Specify our target domain 152 | * **all** - Use all passive sources (slow enumeration but more results) 153 | * **config** - Config file location 154 | 155 | {% hint style="success" %} 156 | :man\_mage: **Tip:-** To view the sources that require API keys `subfinder -ls` command 157 | {% endhint %} 158 | 159 | ### 160 | 161 | ### **3) Assetfinder** 162 | 163 | * **Author**: [tomnomnom](https://github.com/tomnomnom) 164 | * **Language**: Go 165 | * **Total passive sources**: **9** 166 | 167 | Don't know why did I include this tool:joy:just because its build by the legend [Tomnomnom](https://twitter.com/TomNomNom) ? It doesn't give any unique subdomains compared to other tools but it's extremely fast. 168 | 169 | ```bash 170 | go install github.com/tomnomnom/assetfinder@latest 171 | ``` 172 | 173 | **Running:** 174 | 175 | ```bash 176 | assetfinder --subs-only example.com > output.txt 177 | ``` 178 | 179 | ### 180 | 181 | ### 4) Findomain 182 | 183 | * **Author**: [Edu4rdSHL](https://github.com/Edu4rdSHL) 184 | * **Language**: Rust 185 | * **Total Passive sources**: 21 186 | 187 | [**Findomain** ](https://github.com/Findomain/Findomain)is one of the standard subdomain finder tools in the industry. Another extremely fast enumeration tool. It also has a paid version that offers much more features like subdomain monitoring, resolution, less resource consumption. 188 | 189 | ### Configuring Findomain: :gear: 190 | 191 | **Installation:-** 192 | 193 | * Depending on your architecture download binary from [here](https://github.com/Findomain/Findomain/wiki/Installation#using-upstream-precompiled-binaries) 194 | 195 | ```bash 196 | wget -N -c https://github.com/Findomain/Findomain/releases/download/9.0.0/findomain-linux.zip 197 | unzip findomain-linux.zip 198 | mv findomain /usr/local/bin/findomain 199 | chmod 755 /usr/local/bin/findomain 200 | ``` 201 | 202 | **Configuration:-** 203 | 204 | * You need to define API keys in your `.bashrc` or `.zshrc` . 205 | * Findomain will pick up them automatically. 206 | 207 | ```bash 208 | export findomain_virustotal_token="API_KEY" 209 | export findomain_fb_token="API_KEY" 210 | ``` 211 | 212 | ### **Running Findomain:** 213 | 214 | ```bash 215 | findomain -t example.com -u output.txt 216 | ``` 217 | 218 | **Flags:-** 219 | 220 | * **t** - Target domain 221 | * **u** - Output file 222 | 223 | 224 | 225 | 226 | 227 | ## B) Internet Archives 228 | 229 | Internet Archives deploy their own web crawlers and indexing systems that crawl each website on the internet. Hence, they have historical data of all the websites that once existed. hence, Internet Archives can be a useful source to grab subdomains of a particular target that once existed and later perform permutations(more on this later) on them to get more valid subdomains. 230 | 231 | Internet Archive when queried gives back URLs. Since we are only concerned with the subdomains, we need to process those URLs to grab only unique FQDN subdomains from them. 
232 | 233 | For this, we use a tool called [unfurl](https://github.com/tomnomnom/unfurl). This tool helps extract the domain name from a list of URLs. 234 | 235 | ### 5) Gau 236 | 237 | * **Author**: [lc](https://github.com/lc) 238 | * **Language**: Go 239 | * **Sources**: 240 | * [web.archive.org](http://web.archive.org/) 241 | * [index.commoncrawl.org](http://index.commoncrawl.org/) 242 | * [otx.alienvault.com](https://otx.alienvault.com/) 243 | * [urlscan.io](https://urlscan.io/) 244 | 245 | [**Gau** ](https://github.com/lc/gau)works by querying all 4 of the above internet archive services and grabbing all the URLs that their internet-wide crawlers have ever crawled. Through this process we get tons of URLs belonging to our target that once existed. After collecting the URLs, we extract only the domain/subdomain part from them. 246 | 247 | #### Installation: 248 | 249 | ```bash 250 | go install github.com/lc/gau/v2/cmd/gau@latest 251 | ``` 252 | 253 | #### Running gau: 254 | 255 | ```bash 256 | gau --threads 5 --subs example.com | unfurl -u domains | sort -u -o output_unfurl.txt 257 | ``` 258 | 259 | **Flags:** 260 | 261 | * **threads** - How many workers to spawn 262 | * **subs** - Include subdomains of the target domain 263 | 264 | 265 | 266 | ### **6) Waybackurls** 267 | 268 | * **Author**: [tomnomnom](https://github.com/tomnomnom) 269 | * **Language**: Go 270 | * **Sources**: 271 | * [web.archive.org](http://web.archive.org/) 272 | * [index.commoncrawl.org](http://index.commoncrawl.org/) 273 | * [www.virustotal.com](https://www.virustotal.com) 274 | 275 | [**Waybackurls**](https://github.com/tomnomnom/waybackurls) works similarly to Gau, but I have found that it returns some unique data that Gau couldn't find. Hence, we need to include waybackurls in our arsenal as well. 276 | 277 | #### Installation: 278 | 279 | ```bash 280 | go install github.com/tomnomnom/waybackurls@latest 281 | ``` 282 | 283 | #### **Running Waybackurls:** 284 | 285 | ```bash 286 | waybackurls example.com | unfurl -u domains | sort -u -o output.txt 287 | ``` 288 | 289 | 290 | 291 | 292 | 293 | 294 | 295 | ## C) GitHub Scraping 296 | 297 | ### 7) Github-subdomains 298 | 299 | * **Author**: [gwen001](https://github.com/gwen001) 300 | * **Language**: Go 301 | 302 | Organizations sometimes host their source code on GitHub, and employees working at these organizations sometimes leak source code there. Additionally, I have come across instances where security researchers host their reconnaissance data in public repositories. The tool Github-subdomains can help you extract these exposed/leaked subdomains of your target from GitHub. 303 | 304 | **Installation:** 305 | 306 | ```bash 307 | go install github.com/gwen001/github-subdomains@latest 308 | ``` 309 | 310 | :gear:**Configuring github-subdomains:** 311 | 312 | * For github-subdomains to scrape subdomains from GitHub, you need to specify a list of GitHub access tokens. 313 | * [**Here**](https://docs.github.com/en/authentication/keeping-your-account-and-data-secure/creating-a-personal-access-token#creating-a-personal-access-token-classic) is an article on how you can generate your GitHub access tokens. 314 | * These access tokens are used by the tool to perform searches and find subdomains on your behalf. 315 | * I always prefer that you make at least 10 tokens from 3 different accounts (30 in total) to avoid rate limiting. 316 | * Specify 1 token per line (see the example below).
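For reference, the tokens file is just a plain-text list — a hypothetical `tokens.txt` would look like this (the values below are placeholders, not real tokens):

```
ghp_xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
ghp_yyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyy
ghp_zzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzz
```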
317 | 318 | **Running github-subdomains:** 319 | 320 | ```bash 321 | github-subdomains -d example.com -t tokens.txt -o output.txt 322 | ``` 323 | 324 | **Flags:** 325 | 326 | * **d -** target 327 | * **t** - file containing tokens 328 | * **o** - output file 329 | 330 | 331 | 332 | ## ~~**D)** Rapid7 Project Sonar dataset(depreciated)~~ 333 | 334 | [Project Sonar](https://opendata.rapid7.com/about/) is a security research project by Rapid7 that conducts internet-wide scans. Rapid7 has been generous and made this data freely available to the public. Project Sonar contains [8 different datasets](https://opendata.rapid7.com/) with a total size of over **66.6 TB** which are updated on a regular basis. You can read here how you can parse these datasets on your own using this [guide](https://0xpatrik.com/project-sonar-guide/). 335 | 336 | This internet-wide DNS dataset could be an excellent resource for us to grab our subdomains right? But querying such large datasets could take up significant time. That's when **Crobat** comes to the rescue. 337 | 338 | ### 8) [Crobat](https://github.com/Cgboal/SonarSearch) 339 | 340 | * **Author**: [Cgboal](https://github.com/Cgboal) 341 | * **Language**: Go 342 | 343 | [Cgboal ](https://twitter.com/CalumBoal)has done an excellent work of parsing and indexing the whole Rapid7 Sonar dataset into MongoDB and creating an API to query this database. This Crobat API is freely available at [https://sonar.omnisint.io/](https://sonar.omnisint.io/).More over he developed a command-line tool that uses this API and returns the results at a blazing fast speed. 344 | 345 | ### Installation: 346 | 347 | ```bash 348 | go get github.com/cgboal/sonarsearch/cmd/crobat 349 | ``` 350 | 351 | ### Running: 352 | 353 | ```bash 354 | crobat -s example.com > output.txt 355 | ``` 356 | 357 | **Flags:** 358 | 359 | * **s** - Target Name 360 | 361 | 362 | 363 | 364 | 365 | 366 | 367 | ## :checkered\_flag:**That's it !!! Done with passive things** :checkered\_flag: 368 | 369 | #### Liked my work? Don't hesitate to buy me a coffee XDD 370 | 371 | #### :heart::blue\_heart::green\_heart: [https://www.buymeacoffee.com/siddheshparab](https://www.buymeacoffee.com/siddheshparab) :green\_heart: :blue\_heart: :heart: 372 | 373 | -------------------------------------------------------------------------------- /passive-enumeration/recursive-enumeration.md: -------------------------------------------------------------------------------- 1 | # Recursive Enumeration 2 | 3 | Through various testing and trying out new things for subdomain enumeration, my friend [Six2dez ](https://twitter.com/Six2dez1)came across a technique where running the subdomain enumeration tools again on each of the subdomains found yields in getting more subdomains in total 4 | 5 | In easy words, we again run tools like Amass, Subfinder, Assetfinder again each of the subdomains that were found. 6 | 7 | {% hint style="danger" %} 8 | If you have set up API keys, this technique may consume your entire querying quota. 9 | {% endhint %} 10 | 11 | For a better understanding look at the image below: 12 | 13 | ![](<../.gitbook/assets/Recursive Enumeration.png>) 14 | 15 | ### Things to keep in mind: 16 | 17 | * This technique is only useful when your target has a large number of multi-level subdomains_(not effective for small & medium scope targets)._ 18 | * It is recommended to execute this technique as the final step, exclusively on a validated list of subdomains that you have collected through other Passive + Active techniques. 
* This technique may consume all of your passive DNS services' API key quota _(if they are configured)._ 19 | * This technique takes time to return the final results. 20 | 21 | ### Workflow: 22 | 23 | 1. Read the list of subdomains from the file "subdomains.txt". 24 | 2. Process the subdomains in two steps: 25 | 26 | **a)** Find the Top-10 most frequently occurring second-level domain names with the help of tools like cut, sort, rev, uniq, etc.\ 27 | **b)** Find the Top-10 most frequently occurring third-level domain names. 28 | 3. Now run passive subdomain enumeration on these 10 second-level domain names and 10 third-level domain names using tools like amass, subfinder, assetfinder, findomain. 29 | 4. Keep appending the results to the `passive_recursive.txt` file. 30 | 5. After building this list of domain names, run puredns to DNS resolve them and find the alive subdomains. 31 | 32 | 33 | 34 | 35 | _Replace `subdomains.txt` with the filename of your subdomains list._ 36 | 37 | ```bash 38 | #!/bin/bash 39 | 40 | go install -v github.com/tomnomnom/anew@latest 41 | subdomain_list="subdomains.txt" 42 | 43 | for sub in $( ( cat $subdomain_list | rev | cut -d '.' -f 3,2,1 | rev | sort | uniq -c | sort -nr | grep -v '1 ' | head -n 10 && cat subdomains.txt | rev | cut -d '.' -f 4,3,2,1 | rev | sort | uniq -c | sort -nr | grep -v '1 ' | head -n 10 ) | sed -e 's/^[[:space:]]*//' | cut -d ' ' -f 2);do 44 | subfinder -d $sub -silent -max-time 2 | anew -q passive_recursive.txt 45 | assetfinder --subs-only $sub | anew -q passive_recursive.txt 46 | amass enum -timeout 2 -passive -d $sub | anew -q passive_recursive.txt 47 | findomain --quiet -t $sub | anew -q passive_recursive.txt 48 | done 49 | ``` 50 | -------------------------------------------------------------------------------- /types/horizontal-enumeration.md: -------------------------------------------------------------------------------- 1 | # Horizontal Enumeration 2 | 3 | While performing a security assessment, our main goal is to map out all the root domains owned by a single entity. This means making an inventory of all the internet-facing assets of a particular organization. Finding related domains/acquisitions of a particular organization is a bit trickier, as this step involves some tedious methods and doesn't always guarantee accurate results. One has to verify the results through manual analysis. 4 | 5 | From the below image you can get an idea of what a **horizontal domain correlation** is: 6 | 7 | ![](../.gitbook/assets/enumeration-2-.png) 8 | 9 | \ 10 | Let's look at how to find these related horizontal domains. 11 | 12 | {% hint style="danger" %} 13 | These enumeration methods can go out of scope and backfire on you. Proceed with caution! 14 | {% endhint %} 15 | 16 | ## 1) Finding related domains/acquisitions 17 | 18 | #### a) **WhoisXMLAPI** 19 | 20 | [**WhoisXMLAPI** ](https://www.whoisxmlapi.com/)is an excellent source that provides a good amount of related domains & acquisitions based on WHOIS records. Signing up on their platform will assign you **500 free credits**, which renew every month.\ 21 | Visit [https://tools.whoisxmlapi.com/reverse-whois-search](https://tools.whoisxmlapi.com/reverse-whois-search) . Searching with a root domain name like **dell.com** will give you a list of all the associated domains.
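If you prefer the terminal over the web UI, the same reverse-WHOIS data can be pulled via WhoisXMLAPI's API. The sketch below is assumption-heavy — the endpoint, JSON field names (`searchType`, `mode`, `basicSearchTerms`) and the `domainsList` response key are written from memory of their reverse-WHOIS v2 docs, so double-check them (and your credit balance) before relying on it:

```bash
# Hypothetical reverse-WHOIS query: domains whose WHOIS records mention "Dell"
curl -s -X POST "https://reverse-whois.whoisxmlapi.com/api/v2" \
  -H "Content-Type: application/json" \
  -d '{
        "apiKey": "YOUR_WHOISXMLAPI_KEY",
        "searchType": "current",
        "mode": "purchase",
        "basicSearchTerms": { "include": ["Dell"] }
      }' \
  | jq -r '.domainsList[]?' | sort -u > whoisxml_domains.txt
```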
22 | 23 | ![](../.gitbook/assets/whoisxml.png) 24 | 25 | {% hint style="warning" %} 26 | These are not 100% accurate results, as they contain false positives 27 | {% endhint %} 28 | 29 | #### b) **Whoxy** :moneybag: 30 | 31 | [**Whoxy**](https://www.whoxy.com/) is yet another great source to perform reverse WHOIS on parameters like Company Name, Registrant Email address, Owner Name, Domain keyword. Whoxy has an enormous database of around **455M WHOIS records**. But sadly this is a paid service :( 32 | 33 | To effectively use Whoxy API there's a command-line tool called [**whoxyrm**](https://github.com/MilindPurswani/whoxyrm)**.** 34 | 35 | ``` 36 | go get -u github.com/milindpurswani/whoxyrm 37 | export WHOXY_API_KEY="89acb0f4557df3237l1" 38 | 39 | whoxyrm -company-name "Red Bull GmBH" 40 | ``` 41 | 42 | ![](../.gitbook/assets/whoxyrm.png) 43 | 44 | #### c) Crunchbase:moneybag: 45 | 46 | [**Crunchbase**](https://www.crunchbase.com/) is another great alternative for finding acquisitions but requires a paid subscription to view all the acquisitions. The trial version allows viewing some of the acquisitions. 47 | 48 |
49 | 50 | #### d) ChatGPT 51 | 52 | You can leverage OpenAI's [**ChatGPT**](https://chat.openai.com/) to get a list of acquisitions owned by a particular organization. Below is an example of getting Tesla's acquisitions. 53 | 54 |
55 | 56 | ## 57 | 58 | ## 2) Discovering the IP space 59 | 60 | An **ASN** (Autonomous System Number) is a unique identifier for a set of IP ranges that an organization owns. Very large organizations such as Apple, GitHub, and Tesla have their own significant IP space. To find the ASN of a particular organization, [https://bgp.he.net](https://bgp.he.net/) is a useful website to query.\ 61 | Let's find the ASN for **Apple Inc.** 62 | 63 | ![](../.gitbook/assets/hurricane.png) 64 | 65 | Now that we have found the organization's ASN, the next step is to find out the IP ranges that reside inside that ASN. For this, we will use a tool called **whois.** 66 | 67 |
apt-get install whois
 68 | whois -h whois.radb.net  -- '-i origin AS714' | grep -Eo "([0-9.]+){4}/[0-9]+" | uniq -u
 69 | 
70 | 71 |
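As a cross-check on the ASN-to-assets step, amass's `intel` module can also take an ASN and report domains observed on its netblocks. A quick sketch (flag names per amass v3, which this guide installs earlier — adjust if your version differs):

```bash
# List root domains that amass has seen on Apple's ASN 714
amass intel -asn 714 -o asn714_domains.txt
```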
72 | 73 | 74 | 75 | ## 3) PTR records (Reverse DNS) 76 | 77 | Now that we know an organization's IP address ranges from its ASN, we can perform reverse DNS (PTR) queries on those IP addresses and check for valid hosts.\ 78 | \ 79 | **What is reverse DNS?**\ 80 | When a user attempts to open a domain/website in their browser, a DNS lookup occurs. This maps the domain name (example.com) to its associated IP address (192.168.0.1). A reverse DNS lookup is the opposite of this process; it is a query that starts with the IP address and looks up the domain name associated with it. 81 | 82 | This means that, since we already know the IP space of an organization, we can reverse-query the IP addresses and find the valid domains. Sounds cool? 83 | 84 | **But how?**\ 85 | DNS PTR records (pointer records) help us achieve this. Using the [**dnsx**](https://github.com/projectdiscovery/dnsx) tool, we can query the PTR record of an IP address and find the associated hostname/domain name. 86 | 87 | **Apple Inc.** :apple: owns **ASN714** which represents the IP range **17.0.0.0/8.** So now, let's perform reverse DNS queries to find out the domain names**.** 88 | 89 | ### Running: 90 | 91 | We will first need to install 2 tools: 92 | 93 | * [**Mapcidr**](https://github.com/projectdiscovery/mapcidr) : 94 | 95 | ``` 96 | go install -v github.com/projectdiscovery/mapcidr/cmd/mapcidr@latest 97 | ``` 98 | * [**Dnsx** ](https://github.com/projectdiscovery/dnsx): 99 | 100 | ``` 101 | go install -v github.com/projectdiscovery/dnsx/cmd/dnsx@latest 102 | ``` 103 | 104 | **One liner:** 105 | 106 | ```bash 107 | echo 17.0.0.0/16 | mapcidr -silent | dnsx -ptr -resp-only -o output.txt 108 | ``` 109 | 110 | #### Breakdown: 111 | 112 | * When an IP range is given to **mapcidr** through stdin (standard input), it expands the CIDR range, spitting out each IP address from the range onto a new line. 113 | * When **dnsx** receives each IP address from stdin, it performs a reverse DNS lookup and checks for a PTR record. If found, it gives us back the hostname/domain name. 114 | 115 | ![](../.gitbook/assets/ptr.png) 116 | 117 | ## 118 | 119 | ## 4) Favicon Search 120 | 121 | #### What is a favicon? 122 | 123 | The image/icon shown on the left-hand side of a browser tab is called the **favicon.ico**. This icon is usually a picture that can be hosted on a different endpoint, host or CDN. We can check whether the URL for the favicon is present in the web page's source code or not. 124 | 125 |
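A quick way to perform that check from the terminal is to fetch the page and grep for the icon link — a small sketch (the pattern only approximates how favicon `<link>` tags usually appear, so it may miss unusual markup):

```bash
# Fetch the homepage and pull out any <link ... rel="...icon..."> tags
curl -s https://github.com/ | grep -oiE '<link[^>]+rel="[^"]*icon[^"]*"[^>]*>'
```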
126 | 127 | #### How to find the favicon.ico link? 128 | 129 | * Visit any website which already posses a favicon ([https://github.com/](https://github.com/)) 130 | * Now, view the source code and find the keyword "**favicon**" in the source code. 131 | * You will find the link where the favicon is hosted ([https://github.githubassets.com/favicons/favicon.png](https://github.githubassets.com/favicons/favicon.png)) 132 | 133 | #### How can we leverage this to find different root domains? 134 | 135 | * Usually the web assets owned by a particular company will have the same logo/favicon image across various domains. 136 | * Hence, we can make a Internet wide search using Shodan to get all the domains/IP addresses of such web assets that have a common favicon. 137 | 138 | ### [Fav-UP](https://github.com/pielco11/fav-up): 139 | 140 | * **Author**: [Francesco Poldi](https://github.com/pielco11) 141 | * **Language**: Python 142 | 143 | Fav-Up is a great tool that can help us in the process to automate the steps of performing a favicon hash search. The FavUp.py is python based tool that performs execution in the following steps: 144 | 145 | 1. First will visit the page source of the mentioned website and try to find the URL on which the favicon is hosted. 146 | 2. After fetching the favicon.ico, now the tool generates [**MurmurHash** ](https://en.wikipedia.org/wiki/MurmurHash)of that favicon which is unique to every favicon. 147 | 3. Now it performs a **Shodan** search to find all the IP addresses that have the same favicon in their title(`http.favicon.hash:`) 148 | 149 |
# Installation
150 | git clone https://github.com/pielco11/fav-up.git
151 | cd fav-up/
152 | pip3 install -r requirements.txt
153 | apt-get install jq
154 | 
# Initializing Shodan API key
156 | shodan init A5TCTEH78E6Zhjdva6X2fls6Oob9F2hL
157 | 
# Running the tool
159 | python3 favUp.py -w www.github.com -sc -o output.json
160 | 
# Parsing the output
162 | cat output.json | jq -r 'try .found_ips' | sed "s/|/\n/g"
163 | 
164 | 
165 | 166 |
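Once you have a target's favicon hash, you can also query Shodan directly from its CLI instead of (or in addition to) Fav-Up — a sketch, with a placeholder hash value:

```bash
# Search Shodan for hosts serving a favicon with this MurmurHash
# (-601665621 is a placeholder — substitute the hash computed for your target)
shodan search 'http.favicon.hash:-601665621' --fields ip_str,port,hostnames
```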
167 | 168 | **You know this is a powerful technique when the Recon king**:crown: **tweets about it.** 169 | 170 | ![](../.gitbook/assets/jhaddixtweet.png) 171 | 172 | \ 173 | \ 174 | 175 | 176 | ## :checkered\_flag:**That's it !!! Done with Horizontal Enumeration**:checkered\_flag: 177 | 178 | #### Liked my work? Don't hesitate to buy me a coffee XDD 179 | 180 | #### :heart::blue\_heart::green\_heart: [https://www.buymeacoffee.com/siddheshparab](https://www.buymeacoffee.com/siddheshparab) :green\_heart: :blue\_heart: :heart: 181 | 182 | 183 | 184 | 185 | 186 | 187 | 188 | 189 | 190 | -------------------------------------------------------------------------------- /types/vertical-enumeration.md: -------------------------------------------------------------------------------- 1 | # Vertical Enumeration 2 | 3 | Vertical Enumeration or Vertical domain correlation is a process of finding out hosts located on the same root domain. This type of enumeration contains various techniques and can be automated too. 4 | 5 | Here, we find different levels of subdomains of a particular root/main domain. 6 | 7 | ![](../.gitbook/assets/enumeration-2-.png) 8 | 9 | Vertical Enumeration can be performed with the help of below mentioned techniques: 10 | 11 | 1. **Passive Techniques** 12 | * Passive Sources 13 | * Certificates Logs 14 | 2. **Active Techniques** 15 | * DNS bruteforcing 16 | * Permutations/Alterations 17 | * JS/Source Code Scraping 18 | * VHOST discovery 19 | * Google Analytics 20 | * Recursive Enumeration 21 | * TLS, CSP, CNAME probing 22 | * Regex Permutations 23 | 3. **Web probing** 24 | * Default Ports 25 | * Common Ports 26 | 27 | Each and every technique is explained in detail, so read out the whole guide. :blush: 28 | -------------------------------------------------------------------------------- /web-probing.md: -------------------------------------------------------------------------------- 1 | # Web probing 2 | 3 | Another important aspect of subdomain enumeration is identifying web applications hosted on those subdomains. Most people perform pentesting on web applications only hence their accurate identification/discovery is essential. 4 | 5 | Port **80 & 443** are the default ports on which web applications are hosted. But one must also check for web applications on other common web ports. Most times something hosted on other common ports is very juicy or paid less attention by organizations. 6 | 7 | ## Tools 8 | 9 | ### [HTTPX](https://github.com/projectdiscovery/httpx) 10 | 11 | * **Author**: [projectdiscovery](https://github.com/projectdiscovery) 12 | * **Language**: Go 13 | 14 | **Httpx** is a fast multi-purpose toolkit that allows running multiple HTTP probers and find for web applications on a particular port. (find hosts ?)\ 15 | Httpx is a highly configurable tool, which means it provides a ton of flags. So, users can get a highly customizable output as per their needs. 
16 | 17 | ### Installation: 18 | 19 | ```bash 20 | go install -v github.com/projectdiscovery/httpx/cmd/httpx@latest 21 | ``` 22 | 23 | ### Running Httpx 24 | 25 | ```bash 26 | cat hosts.txt | httpx -follow-redirects -status-code -random-agent -o output.txt 27 | ``` 28 | 29 | ### Flags: 30 | 31 | * **follow-redirects -** Follows redirects (can go out-of-scope) 32 | * **follow-host-redirects -** Follows redirects if on the same host (helps to stay in-scope) 33 | * **random-agent -** Uses a random user-agent for each request 34 | * **status-code -** Shows the status code 35 | * **retries** - Number of times to retry if a response is not received 36 | * **no-color** - Don't use colorized output (to avoid color Unicode issues in the output file) 37 | * **o** - Output file 38 | 39 | ![](.gitbook/assets/httpx.png) 40 | 41 | ## Probing on default ports: 42 | 43 | By default, [**httpx** ](https://github.com/projectdiscovery/httpx)probes on ports **80** (HTTP) & **443** (HTTPS). Organizations host their web applications on these ports. After subdomain enumeration, the first task is identifying web applications, where vulnerabilities are found in abundance. 44 | 45 | ```bash 46 | cat subdomains.txt | httpx -random-agent -retries 2 -no-color -o output.txt 47 | ``` 48 | 49 | ## Probing on common ports: 50 | 51 | Most people check for web applications only on the default ports, but they fail to check whether an application is hosted on any other port. 52 | 53 | Generally, there are around **88 common ports** on which web applications may be hosted. So, it's our duty to check them. 👉 [**Here**](https://gist.github.com/sidxparab/459fa5e733b5fd3dd6c3aac05008c21c)👈 is the list of those common ports. Mostly anything hosted on these ports is very juicy and tends to yield more vulnerabilities. 54 | 55 | ### Method: 56 | 57 | ### 1) Using [httpx](https://github.com/projectdiscovery/httpx) 58 | 59 | ```bash 60 | cat subdomains.txt | httpx -random-agent -retries 2 -threads 150 -no-color -ports 81,300,591,593,832,981,1010,1311,1099,2082,2095,2096,2480,3000,3128,3333,4243,4567,4711,4712,4993,5000,5104,5108,5280,5281,5601,5800,6543,7000,7001,7396,7474,8000,8001,8008,8014,8042,8060,8069,8080,8081,8083,8088,8090,8091,8095,8118,8123,8172,8181,8222,8243,8280,8281,8333,8337,8443,8500,8834,8880,8888,8983,9000,9001,9043,9060,9080,9090,9091,9200,9443,9502,9800,9981,10000,10250,11371,12443,15672,16080,17778,18091,18092,20720,32000,55440,55672 -o output.txt 61 | ``` 62 | 63 | * Using **httpx** for common ports generally takes a lot of time, as it needs to probe a relatively high number of ports (88 in total). Hence, this method is not very feasible. 64 | 65 | ### 2) Using [Unimap](https://github.com/Edu4rdSHL/unimap) 66 | 67 | **Unimap** is a port scanner that uses [**Nmap**](https://github.com/nmap/nmap) as its base. Using Unimap, we quickly scan whether any of those 88 common ports are open on the subdomains (this happens at a blazing fast speed). Once we know that a particular port is open on a subdomain, we can then send HTTP probes using **httpx** and check whether a web application is available on that open port or not. This method is far quicker than just using httpx. 68 | 69 | **What's so special about Unimap?** 70 | 71 | * You might be wondering why we didn't use Nmap/Naabu for port scanning, right? 72 | * The answer lies in the way Unimap performs open port scanning. 73 | * Sometimes many subdomains point to the same IP address.
Hence, scanning the same IP again & again would lead us to an IP ban or greater execution time. 74 | * Unimap uses its own technology to initially resolve the IP addresses of all subdomains, once this process is finished, it creates a vector with the unique IP addresses and launches a parallel scan with Nmap. 75 | 76 | ### Installation: 77 | 78 | ```bash 79 | wget -N -c https://github.com/Edu4rdSHL/unimap/releases/download/0.5.1/unimap-linux 80 | sudo mv unimap-linux /usr/local/bin/unimap 81 | chmod 755 /usr/local/bin/unimap 82 | strip -s /usr/local/bin/unimap 83 | ``` 84 | 85 | ### Steps: 86 | 87 | **1)** First let's initialize all the common ports into a variable called `COMMON_PORTS_WEB` 88 | 89 | ```bash 90 | COMMON_PORTS_WEB="81,300,591,593,832,981,1010,1311,1099,2082,2095,2096,2480,3000,3128,3333,4243,4567,4711,4712,4993,5000,5104,5108,5280,5281,5601,5800,6543,7000,7001,7396,7474,8000,8001,8008,8014,8042,8060,8069,8080,8081,8083,8088,8090,8091,8095,8118,8123,8172,8181,8222,8243,8280,8281,8333,8337,8443,8500,8834,8880,8888,8983,9000,9001,9043,9060,9080,9090,9091,9200,9443,9502,9800,9981,10000,10250,11371,12443,15672,16080,17778,18091,18092,20720,32000,55440,55672" 91 | ``` 92 | 93 | **2)** Now we will run a port scan to check all the open ports 94 | 95 | ```bash 96 | sudo unimap --fast-scan -f subdomains.txt --ports $COMMON_PORTS_WEB -q -k --url-output > unimap_commonweb.txt 97 | ``` 98 | 99 | **3)** Now that we have a list of open ports, we will check for web applications running on them using **httpx**. 100 | 101 | ```bash 102 | cat unimap_commonweb.txt | httpx -random-agent -status-code -silent -retries 2 -no-color | cut -d ' ' -f1 | tee probed_common_ports.txt 103 | ``` 104 | 105 | * That's it, we have got those hidden web applications running on common ports. Go ahead! and hunt on them. 🐞 106 | 107 | 📊 **Some stats:** 🤓 108 | 109 | | **Method** | **Execution Time (150 subdomains)** | 110 | | ---------- | ----------------------------------- | 111 | | 1st Method | 42min 51secs | 112 | | 2nd Method | **55 secs** ⚡ | 113 | --------------------------------------------------------------------------------