├── .config
│   └── .aws
├── .gitbook
│   └── assets
│       ├── ChatGPT acquistion.png
│       ├── DNS bruteforcing.png
│       ├── Excel sheet screenshot.png
│       ├── Github favicon.png
│       ├── Gotator.png
│       ├── Recursive Enumeration.png
│       ├── Subfinder With.png
│       ├── TLS.png
│       ├── TLscert_yahoo.png
│       ├── Vhost bruteforcing.png
│       ├── asnip.png
│       ├── axiomxreconftw.png
│       ├── carbon-2-.png
│       ├── carbon-6-.png
│       ├── cero.png
│       ├── ceroo(1).png
│       ├── circle-cropped.png
│       ├── copy-of-copy-of-copy-of-webscraping_meme.png
│       ├── crt.png
│       ├── crunchbase (1).png
│       ├── crunchbase.png
│       ├── csp.png
│       ├── ctfr.png
│       ├── dnscewl.png
│       ├── dnsvalidator1.png
│       ├── dnvalidator.png
│       ├── download.jpg
│       ├── enumeration-2-.png
│       ├── excelsheet.png
│       ├── favUp(1).png
│       ├── favicon.png
│       ├── favicontool.png
│       ├── googlenalytics.png
│       ├── gospider.png
│       ├── hosthunter.png
│       ├── httpx.png
│       ├── hurricane.png
│       ├── index.png
│       ├── jhaddixtweet.png
│       ├── memesss.png
│       ├── permutations.png
│       ├── ptr.png
│       ├── purednsb.png
│       ├── reconftw_logo.png
│       ├── resolve.png
│       ├── shodanfavicon.png
│       ├── subdomains.png
│       ├── subfinder without(1).png
│       ├── subfinder without.png
│       ├── subfinderconfig.png
│       ├── subfinderwithout.png
│       ├── tls probing.png
│       ├── twitter-logo.png
│       ├── untitled-design-1-.png
│       ├── whoiss.png
│       ├── whoisxml.png
│       └── whoxyrm.png
├── CONTRIBUTING.md
├── README.md
├── SUMMARY.md
├── active-enumeration
│   ├── dns-bruteforcing.md
│   ├── google-analytics.md
│   ├── js-file-scraping.md
│   ├── other-methods.md
│   ├── permutation-alterations.md
│   ├── tls-csp-cname.md
│   └── vhost-probing.md
├── automation.md
├── introduction
│   ├── prequisites.md
│   └── whats-the-need.md
├── passive-enumeration
│   ├── certificate-logs.md
│   ├── passive-sources.md
│   └── recursive-enumeration.md
├── types
│   ├── horizontal-enumeration.md
│   └── vertical-enumeration.md
└── web-probing.md
/.config/.aws:
--------------------------------------------------------------------------------
1 | Config {
2 | AWSAccessKeyId=AKIA4VFHDSG67Y5QTMQS
3 | AWSSecretKey=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
4 | }
5 |
--------------------------------------------------------------------------------
/.gitbook/assets/ChatGPT acquistion.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/ChatGPT acquistion.png
--------------------------------------------------------------------------------
/.gitbook/assets/DNS bruteforcing.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/DNS bruteforcing.png
--------------------------------------------------------------------------------
/.gitbook/assets/Excel sheet screenshot.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/Excel sheet screenshot.png
--------------------------------------------------------------------------------
/.gitbook/assets/Github favicon.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/Github favicon.png
--------------------------------------------------------------------------------
/.gitbook/assets/Gotator.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/Gotator.png
--------------------------------------------------------------------------------
/.gitbook/assets/Recursive Enumeration.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/Recursive Enumeration.png
--------------------------------------------------------------------------------
/.gitbook/assets/Subfinder With.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/Subfinder With.png
--------------------------------------------------------------------------------
/.gitbook/assets/TLS.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/TLS.png
--------------------------------------------------------------------------------
/.gitbook/assets/TLscert_yahoo.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/TLscert_yahoo.png
--------------------------------------------------------------------------------
/.gitbook/assets/Vhost bruteforcing.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/Vhost bruteforcing.png
--------------------------------------------------------------------------------
/.gitbook/assets/asnip.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/asnip.png
--------------------------------------------------------------------------------
/.gitbook/assets/axiomxreconftw.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/axiomxreconftw.png
--------------------------------------------------------------------------------
/.gitbook/assets/carbon-2-.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/carbon-2-.png
--------------------------------------------------------------------------------
/.gitbook/assets/carbon-6-.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/carbon-6-.png
--------------------------------------------------------------------------------
/.gitbook/assets/cero.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/cero.png
--------------------------------------------------------------------------------
/.gitbook/assets/ceroo(1).png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/ceroo(1).png
--------------------------------------------------------------------------------
/.gitbook/assets/circle-cropped.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/circle-cropped.png
--------------------------------------------------------------------------------
/.gitbook/assets/copy-of-copy-of-copy-of-webscraping_meme.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/copy-of-copy-of-copy-of-webscraping_meme.png
--------------------------------------------------------------------------------
/.gitbook/assets/crt.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/crt.png
--------------------------------------------------------------------------------
/.gitbook/assets/crunchbase (1).png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/crunchbase (1).png
--------------------------------------------------------------------------------
/.gitbook/assets/crunchbase.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/crunchbase.png
--------------------------------------------------------------------------------
/.gitbook/assets/csp.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/csp.png
--------------------------------------------------------------------------------
/.gitbook/assets/ctfr.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/ctfr.png
--------------------------------------------------------------------------------
/.gitbook/assets/dnscewl.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/dnscewl.png
--------------------------------------------------------------------------------
/.gitbook/assets/dnsvalidator1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/dnsvalidator1.png
--------------------------------------------------------------------------------
/.gitbook/assets/dnvalidator.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/dnvalidator.png
--------------------------------------------------------------------------------
/.gitbook/assets/download.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/download.jpg
--------------------------------------------------------------------------------
/.gitbook/assets/enumeration-2-.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/enumeration-2-.png
--------------------------------------------------------------------------------
/.gitbook/assets/excelsheet.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/excelsheet.png
--------------------------------------------------------------------------------
/.gitbook/assets/favUp(1).png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/favUp(1).png
--------------------------------------------------------------------------------
/.gitbook/assets/favicon.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/favicon.png
--------------------------------------------------------------------------------
/.gitbook/assets/favicontool.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/favicontool.png
--------------------------------------------------------------------------------
/.gitbook/assets/googlenalytics.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/googlenalytics.png
--------------------------------------------------------------------------------
/.gitbook/assets/gospider.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/gospider.png
--------------------------------------------------------------------------------
/.gitbook/assets/hosthunter.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/hosthunter.png
--------------------------------------------------------------------------------
/.gitbook/assets/httpx.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/httpx.png
--------------------------------------------------------------------------------
/.gitbook/assets/hurricane.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/hurricane.png
--------------------------------------------------------------------------------
/.gitbook/assets/index.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/index.png
--------------------------------------------------------------------------------
/.gitbook/assets/jhaddixtweet.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/jhaddixtweet.png
--------------------------------------------------------------------------------
/.gitbook/assets/memesss.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/memesss.png
--------------------------------------------------------------------------------
/.gitbook/assets/permutations.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/permutations.png
--------------------------------------------------------------------------------
/.gitbook/assets/ptr.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/ptr.png
--------------------------------------------------------------------------------
/.gitbook/assets/purednsb.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/purednsb.png
--------------------------------------------------------------------------------
/.gitbook/assets/reconftw_logo.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/reconftw_logo.png
--------------------------------------------------------------------------------
/.gitbook/assets/resolve.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/resolve.png
--------------------------------------------------------------------------------
/.gitbook/assets/shodanfavicon.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/shodanfavicon.png
--------------------------------------------------------------------------------
/.gitbook/assets/subdomains.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/subdomains.png
--------------------------------------------------------------------------------
/.gitbook/assets/subfinder without(1).png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/subfinder without(1).png
--------------------------------------------------------------------------------
/.gitbook/assets/subfinder without.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/subfinder without.png
--------------------------------------------------------------------------------
/.gitbook/assets/subfinderconfig.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/subfinderconfig.png
--------------------------------------------------------------------------------
/.gitbook/assets/subfinderwithout.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/subfinderwithout.png
--------------------------------------------------------------------------------
/.gitbook/assets/tls probing.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/tls probing.png
--------------------------------------------------------------------------------
/.gitbook/assets/twitter-logo.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/twitter-logo.png
--------------------------------------------------------------------------------
/.gitbook/assets/untitled-design-1-.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/untitled-design-1-.png
--------------------------------------------------------------------------------
/.gitbook/assets/whoiss.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/whoiss.png
--------------------------------------------------------------------------------
/.gitbook/assets/whoisxml.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/whoisxml.png
--------------------------------------------------------------------------------
/.gitbook/assets/whoxyrm.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sidxparab/Subdomain-Enumeration-Guide/a7f485fecda85472492ad41f0415fd76d506b518/.gitbook/assets/whoxyrm.png
--------------------------------------------------------------------------------
/CONTRIBUTING.md:
--------------------------------------------------------------------------------
1 | Feel free to make pull requests to this repo
2 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | ---
2 | description: Comprehensive Subdomain Enumeration Guide
3 | ---
4 |
5 | # Home 🏠
6 |
7 | ## Subdomain Enumeration Guide 2023 📖
8 |
9 | This guide contains all the knowledge needed to perform good subdomain enumeration. I have tried to cover each technique and explain it from a beginner's perspective, with a detailed explanation of why each technique is used and how to perform it. I have also linked various gists, charts, and statistics for a better understanding of the concepts.
10 |
11 | There exist various tools on the internet that perform the same tasks, so I have tried to include only those tools that yield the best results in their category.
12 |
13 | I encourage y'all to go through this guide and try to build your **own reconnaissance methodology**. I believe everyone should have their own methodology, keep trying out new things, and find out what fits them best.
14 |
15 | I'm a beginner in this field too and have tried my best to explain the concepts correctly. If you think any of the content is explained wrongly, I am always open to hearing from you.
16 |
17 | Last but not least, I would like to thank [**six2dez**](https://twitter.com/Six2dez1) for supporting and helping me during my learning phase, the outcome of which you can see in this guide.
18 |
19 |
20 |
21 | **Twitter**: [@sidxparab](https://twitter.com/sidxparab)
22 |
23 | **LinkedIn**: [@sidxparab](https://www.linkedin.com/in/sidxparab/)
24 |
25 | **Medium**: [@sidxparab](https://medium.com/@sidxparab)
26 |
27 | ## Support 🙏 :heart:
28 |
29 | ### **You can support my work by buying me a coffee** ☕
30 |
31 | {% embed url="https://buymeacoffee.com/siddheshparab" %}
32 |
33 | **Doitdoitdoitdoitdoitdoit....** 😉
34 |
35 |
36 |
37 |
--------------------------------------------------------------------------------
/SUMMARY.md:
--------------------------------------------------------------------------------
1 | # Table of contents
2 |
3 | * [Home 🏠](README.md)
4 |
5 | ## Introduction
6 |
7 | * [What's the need ?🤔](introduction/whats-the-need.md)
8 | * [Prerequisites](introduction/prequisites.md)
9 |
10 | ## Types
11 |
12 | * [Horizontal Enumeration](types/horizontal-enumeration.md)
13 | * [Vertical Enumeration](types/vertical-enumeration.md)
14 |
15 | ## Passive Techniques
16 |
17 | * [Passive Sources](passive-enumeration/passive-sources.md)
18 | * [Certificate Logs](passive-enumeration/certificate-logs.md)
19 | * [Recursive Enumeration](passive-enumeration/recursive-enumeration.md)
20 |
21 | ## Active Techniques
22 |
23 | * [DNS Bruteforcing](active-enumeration/dns-bruteforcing.md)
24 | * [Permutation/Alterations](active-enumeration/permutation-alterations.md)
25 | * [Scraping(JS/Source code)](active-enumeration/js-file-scraping.md)
26 | * [Google analytics](active-enumeration/google-analytics.md)
27 | * [TLS, CSP, CNAME Probing](active-enumeration/tls-csp-cname.md)
28 | * [VHOST probing](active-enumeration/vhost-probing.md)
29 |
30 | ***
31 |
32 | * [Web probing](web-probing.md)
33 | * [Automation 🤖](automation.md)
34 |
--------------------------------------------------------------------------------
/active-enumeration/dns-bruteforcing.md:
--------------------------------------------------------------------------------
1 | # DNS Bruteforcing
2 |
3 | ## What is DNS bruteforcing?
4 |
5 | In simple terms, DNS bruteforcing is a technique where we prepend a long list of common subdomain names to our target domain and try to DNS-resolve the resulting list, in the hope of finding valid subdomains of our target domain.
6 |
7 | This is what happens during DNS bruteforcing:
8 |
9 | * **admin** ----> **admin**.example.com
10 | * **internal.dev** ----> **internal.dev**.example.com
11 | * **secret** ----> **secret**.example.com
12 | * **backup01** ----> **backup01**.example.com
13 |
14 | Now that we have a list of probable domain names that could exist, we can perform DNS resolution on this list, which yields the live subdomains. If any of these candidates turns out to be valid, it's a win-win situation for us.
15 |
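Conceptually, the candidate-generation step looks like this (a minimal sketch; `wordlist.txt` and `example.com` are placeholders):

```bash
# Prepend every word from the common-subdomains wordlist to the target domain
while read -r word; do
    echo "${word}.example.com"
done < wordlist.txt > candidates.txt
```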
16 |
60 | #Prerequisites
61 | git clone https://github.com/blechschmidt/massdns.git
62 | cd massdns
63 | make
64 | sudo make install
65 |
66 | #Installing the tool
67 | go install github.com/d3mondev/puredns/v2@latest
68 |
69 |
70 | ### Running Puredns:
71 |
72 | Before we start using puredns for bruteforcing, we need to generate our list of public DNS resolvers. For this, we will use a tool called [dnsvalidator](https://github.com/vortexau/dnsvalidator). Check [my previous page](https://app.gitbook.com/@sidxparab/s/subdomain-enumeration-guide/introduction/prequisites#2-100-accurate-public-dns-resolvers) to learn more about public DNS resolvers and why they are important.
73 |
74 | ```bash
75 | git clone https://github.com/vortexau/dnsvalidator.git
76 | cd dnsvalidator/
77 | pip3 install -r requirements.txt
78 | pip3 install setuptools==58.2.0
79 | python3 setup.py install
80 | ```
81 |
82 | **Generating list of open public DNS resolvers**
83 |
84 | It's very important to note that if even one of your public resolvers is failing or not working, you have a greater chance of missing an important subdomain. Hence, it's always advised that you generate a fresh list of public DNS resolvers before execution.
85 |
86 | ```bash
87 | dnsvalidator -tL https://public-dns.info/nameservers.txt -threads 100 -o resolvers.txt
88 | ```
89 |
90 | 
91 |
92 | #### Downloading a pre-populated list of valid DNS resolvers:
93 |
94 | Various open-source contributors like [proabiral](https://github.com/proabiral/Fresh-Resolvers), [cxosmo](https://github.com/cxosmo/dns-resolvers), [janmasarik](https://github.com/janmasarik/resolvers) have set up GitHub Actions or a VPS to generate valid public DNS resolvers periodically (every 24 hrs). We can make use of these DNS resolvers rather than generating our own with dnsvalidator, which consumes a lot of time. To aggregate all of these efforts, [Trickest](https://github.com/trickest) has come up with its own repository called [**resolvers**](https://github.com/trickest/resolvers). It is a merged list of all the DNS resolvers, which they validate every 24 hours.
95 |
96 | ```bash
97 | wget https://raw.githubusercontent.com/trickest/resolvers/main/resolvers.txt
98 | ```
99 |
100 |
101 |
102 | Now that we have generated our public DNS resolvers list, we are good to move ahead and perform subdomain bruteforcing using puredns.
103 |
104 | ```bash
105 | puredns bruteforce wordlist.txt example.com -r resolvers.txt -w output.txt
106 | ```
107 |
108 | **Flags:**
109 |
110 | * **bruteforce** - use the bruteforcing mode
111 | * **r** - Specify your public resolvers
112 | * **w** - Output filename
113 |
114 | 
115 |
116 | {% hint style="success" %}
117 | While performing DNS queries sometimes we receive **SERVFAIL** error. Puredns by default retries on SERVFAIL while most tools don't.
118 | {% endhint %}
119 |
120 | ### Which wordlist :page\_facing\_up: to use?
121 |
122 | The whole effort of DNS bruteforcing is a waste if you don't use a good subdomain bruteforcing wordlist. Selection of the wordlist is the most important aspect of bruteforcing. Let's have a look at some great wordlists:-\
123 | \
124 | **1) Assetnote** [**best-dns-wordlist.txt**](https://wordlists-cdn.assetnote.io/data/manual/best-dns-wordlist.txt) (**9 Million**) ⭐\
125 | [Assetnote](https://wordlists.assetnote.io/) wordlists are the best; no doubt this is the best subdomain bruteforcing wordlist. But it's highly recommended that you run it on a VPS; running it on a home system will take hours, and the results won't be accurate either. This wordlist will definitely give you those hidden subdomains.
126 |
127 | **2) n0kovo** [**n0kovo\_subdomains\_huge.txt**](https://github.com/n0kovo/n0kovo\_subdomains/blob/main/n0kovo\_subdomains\_huge.txt) (**3 Million**)\
128 | [N0kovo](https://github.com/n0kovo) created this wordlist by scanning the whole IPv4 space and collecting all the subdomain names from TLS certificates. You can check out [this blog](https://n0kovo.github.io/posts/subdomain-enumeration-creating-a-highly-efficient-wordlist-by-scanning-the-entire-internet/#benchmarking-) to see how well this bruteforcing wordlist performs compared to other big wordlists. So, if your target contains a lot of wildcards, this would be the best wordlist for bruteforcing _(considering the computation bottleneck of wildcard filtering)_.
129 |
130 | **3) Smaller** [**wordlist**](https://gist.github.com/six2dez/a307a04a222fab5a57466c51e1569acf/raw) (**102k** )\
131 | Created by [six2dez](https://github.com/six2dez), this one is suitable if you are running on your personal computer over your home Wi-Fi connection.\
132 |
133 |
134 |
135 |
136 | ### 🙁Problems faced during subdomain bruteforcing
137 |
138 | #### 1) Wildcard filtering
139 |
140 | A wildcard DNS record is a record that matches requests for non-existent domain names. Wildcards are denoted by a **`*`** as the leftmost part of a domain name, such as **\*.example.com.** That means even a subdomain that doesn't exist will return a valid response. See the example below:-
141 |
142 | **doesntexists.example.com** ----> **valid**
143 |
144 | **Strange, right?** So in short, if a domain is a wildcard domain, we will get valid responses (false positives) for everything while bruteforcing and won't be able to differentiate which subdomains are real and which aren't. To avoid this, subdomain bruteforcing tools use various wildcard filtering techniques.
145 |
146 | **2) Open Public resolvers**
147 |
148 | While bruteforcing, we use a long wordlist of common subdomain names to get those hidden domains, so the number of names to resolve is large. Such large-scale resolution cannot be handled by your system's DNS resolver alone; hence we depend on freely available public resolvers. Also, spreading queries across many public resolvers reduces the chances of hitting DNS rate limits.
149 |
150 | We can get the list of open public DNS resolvers from here [https://public-dns.info/nameservers.txt](https://public-dns.info/nameservers.txt)
151 |
152 | {% hint style="info" %}
153 | :book: Read [**this**](https://app.gitbook.com/@sidxparab/s/subdomain-enumeration-guide/introduction/prequisites#2-100-accurate-public-dns-resolvers) article on how to create a public resolvers list and why it is important
154 | {% endhint %}
155 |
156 | **3) Bandwidth**
157 |
158 | While performing subdomain bruteforcing, [massdns](https://github.com/blechschmidt/massdns) is used as the base tool for DNS querying at very high concurrent rates. For this, the underlying system should also have high bandwidth.
159 |
160 |
161 |
162 | ## :punch:Issues faced and how to overcome them:
163 |
164 | #### 1) Crashes on low specs (1 CPU/1 GB VPS)
165 |
166 | Usually, if you provide a very large bruteforce wordlist and your target domain contains significant wildcards, puredns can crash due to insufficient memory while filtering out wildcards. To overcome this issue you can use the **`--wildcard-batch 1000000`** flag. By default, puredns puts all the domains in a single batch to save on the number of DNS queries and execution time. With this flag, it takes in a batch of only 1 million subdomains at a time for wildcard filtering and, after completing that batch, takes in the next one.
167 |
168 | **2) Puredns kills my home router**
169 |
170 | Massdns is the one to blame here. Massdns tries to perform DNS resolution through the public resolvers at an unlimited rate. This generates very heavy traffic that consumes your whole bandwidth, making other applications laggy/unresponsive. To overcome this, you can use the **`-l`** flag, which rate-limits the number of DNS queries sent to the public resolvers. It's advisable that you set the value anywhere between `2000-10000`.
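Putting both remedies together (a sketch; tune the numbers to your hardware and bandwidth):

```bash
# Rate-limit queries to the public resolvers and filter wildcards in 1M batches
puredns bruteforce wordlist.txt example.com -r resolvers.txt \
    -l 3000 --wildcard-batch 1000000 -w output.txt
```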
171 |
172 |
173 |
174 |
--------------------------------------------------------------------------------
/active-enumeration/google-analytics.md:
--------------------------------------------------------------------------------
1 | # Google analytics
2 |
3 | Most organizations use [Google Analytics](https://analytics.google.com/analytics/web/) to track website visitors and gather statistics. Generally, they reuse the same Google Analytics ID across all subdomains of a root domain. This means we can perform a reverse search and find all the subdomains sharing the same ID. Hence, it helps us in the enumeration process.
4 |
5 | Most people might be familiar with a browser extension called [**BuiltWith**](https://builtwith.com/toolbar)**.** But using this extension or its website is a manual process; we need some sort of command-line utility. That's where **AnalyticsRelationships** comes to the rescue.
6 |
7 | ## Tool:
8 |
9 | ### [AnalyticsRelationships](https://github.com/Josue87/AnalyticsRelationships)
10 |
11 | * **Author**: [Josué Encinar](https://github.com/Josue87)
12 | * **Language**: Go/Python
13 |
14 | **AnalyticsRelationships** is a tool to enumerate subdomains via their Google Analytics ID. It does not require any login and is capable of bypassing the [BuiltWith](https://builtwith.com/) & [HackerTarget](https://hackertarget.com/) captchas. The tool is available in 2 languages, Python & Go, but the Go version is faster than the Python one.
15 |
16 | ### Installation:
17 |
18 | ```bash
19 | git clone https://github.com/Josue87/AnalyticsRelationships.git
20 | cd AnalyticsRelationships/GO
21 | go build -ldflags "-s -w"
22 | ```
23 |
24 | ### Running:
25 |
26 | * The output may contain false positives.
27 | * Also, you need to further DNS resolve them in order to get the valid ones.
28 |
29 | ```bash
30 | ./analyticsrelationships --url https://www.bugcrowd.com
31 | ```
32 |
33 | 
34 |
35 |
36 |
37 |
--------------------------------------------------------------------------------
/active-enumeration/js-file-scraping.md:
--------------------------------------------------------------------------------
1 | # Scraping\(JS/Source code\)
2 |
3 | ## Source Code Recon
4 |
5 | Modern web applications use JavaScript files to provide dynamic content, and these files contain various functions & events. Every website includes JS files, and they are a great resource for finding those internal subdomains used by the organization.
6 |
7 | ## Tools: 🛠
8 |
9 | ### 1\) [Gospider](https://github.com/jaeles-project/gospider)
10 |
11 | * **Author**: [Jaeles](https://github.com/jaeles-project)
12 | * **Language**: Go
13 |
14 | [**Gospider**](https://github.com/jaeles-project/gospider) is a fast web spidering tool capable of crawling a whole website in a short amount of time. This means gospider will visit/scrape each and every URL mentioned in the JS files and source code. Since source code & JS files make up a website, they may contain links to other subdomains too.
15 |
16 | ### Installation:
17 |
18 | ```bash
19 | go install github.com/jaeles-project/gospider@latest
20 | ```
21 |
22 | **This is a long process so Brace yourself !!!** 💪
23 |
24 | ### Running:
25 |
26 | This process is divided into 3⃣ steps:
27 |
28 | ### 1\) Web probing subdomains
29 |
30 | * Since we are crawling websites, gospider expects us to provide URLs, i.e., in the form of `http://` / `https://`
31 | * So first, we need to web probe all the subdomains we have gathered till now. For this purpose, we will use [**httpx**](https://github.com/projectdiscovery/httpx) .
32 | * So, let's first web probe the subdomains:
33 |
34 | ```bash
35 | cat subdomains.txt | httpx -random-agent -retries 2 -no-color -o probed_tmp_scrap.txt
36 | ```
37 |
38 | * Now that we have the web-probed URLs, we can send them to gospider for crawling.
39 |
40 | ```bash
41 | gospider -S probed_tmp_scrap.txt --js -t 50 -d 3 --sitemap --robots -w -r > gospider.txt
42 | ```
43 |
44 | {% hint style="danger" %}
45 | **Caution**: This generates huge traffic on your target
46 | {% endhint %}
47 |
48 | #### Flags:
49 |
50 | * **S** - Input file
51 | * **js** - Find links in JavaScript files
52 | * **t** - Number of threads \(Run sites in parallel\) \(default 1\)
53 | * **d** - depth \(depth 3 means also scraping links from second-level JS files\)
54 | * **sitemap** - Try to crawl sitemap.xml
55 | * **robots** - Try to crawl robots.txt
56 |
57 | 
58 |
59 | ### 2\) Cleaning the output
60 |
61 | > The path portion of a URL shouldn't be more than 2048 characters. Since the gospider output can contain such overly long junk URLs, we first remove those lines:
62 | >
63 | > ```bash
64 | > sed -i '/^.\{2048\}./d' gospider.txt
65 | > ```
66 |
67 | The point to note here is that so far we have URLs from JS files & source code, but we are only concerned with subdomains. Hence, we must extract just the subdomains from the gospider output.
68 |
69 | This can be done using Tomnomnom's [**unfurl**](https://github.com/tomnomnom/unfurl) tool. It takes a list of URLs as input and extracts the subdomain/domain part from them.
70 | You can install **Unfurl** using this command: `go install github.com/tomnomnom/unfurl@latest`
71 |
72 | ```bash
73 | cat gospider.txt | grep -Eo 'https?://[^ ]+' | sed 's/]$//' | unfurl -u domains | grep ".example.com$" | sort -u > scrap_subs.txt
74 | ```
75 |
76 | **Break down of the command:**
77 | **a\)** grep - Extract the links that start with http/https
78 | **b\)** sed - Remove " \] " at the end of line
79 | **c\)** unfurl - Extract domain/subdomain from the urls
80 | **d\)** grep - Only select subdomains of our target
81 | **e\)** sort - Avoid duplicates
82 |
83 | ### 3\) Resolving our target subdomains
84 |
85 | * Now that we have all the subdomains of our target, it's time to DNS resolve and check for valid subdomains.
86 |
87 | \(hoping you have seen the previous techniques and know how to run puredns\)
88 |
89 | ```bash
90 | puredns resolve scrap_subs.txt -w scrap_subs_resolved.txt -r resolvers.txt
91 | ```
92 |
93 | I love this technique as it also finds hidden Amazon S3 buckets used by the organization. If such buckets are open and expose sensitive data, then it's a WIN-WIN situation for us.
94 | Also, the output of this can be sent to a tool like SecretFinder, which can find hidden secrets, exposed API tokens, etc.
95 |
96 | 
97 |
98 |
--------------------------------------------------------------------------------
/active-enumeration/other-methods.md:
--------------------------------------------------------------------------------
1 | # Other methods
2 |
3 |
--------------------------------------------------------------------------------
/active-enumeration/permutation-alterations.md:
--------------------------------------------------------------------------------
1 | # Permutation/Alterations
2 |
3 | This is almost similar to the previous DNS wordlist bruteforcing, but instead of simply performing a dictionary attack, we generate combinations/permutations of the already known subdomains.
4 |
5 | One more thing to note here: this method also needs a small wordlist, containing common words like `mail`, `internal`, `dev`, `demo`, `accounts`, `ftp`, `admin` (similar to DNS bruteforcing, but smaller).
6 |
7 | For instance, let's consider a subdomain **`dev.example.com`** . Now we will generate different variations/permutations of this domain.
8 |
9 | 
10 |
11 | Isn't it good that we can generate such great combinations? This is the power of permutation bruteforcing. Now that we have generated these combinations, we further need to DNS-resolve them and check if we get any valid subdomains. If so, it's a WIN! WIN! 🏁 situation for us.
12 |
13 | ## Tools:
14 |
15 | ### [Gotator](https://github.com/Josue87/gotator)
16 |
17 | * **Author:** [Josué Encinar](https://github.com/Josue87)
18 | * **Language**: Go
19 |
20 | Gotator is a DNS wordlist generator tool. It is used to generate various combinations/permutations of known subdomains with a user-supplied wordlist, and it is capable of generating 1M combinations in about 2 seconds.
21 |
22 | ### Features:
23 |
24 | * Permute numbers up and down (**dev2** --> `dev0`, `dev1`, `dev2`, `dev3`,`dev4`)
25 | * 3 levels of depth (**`dev`.`demo`.`admin`.**example.com)
26 | * Controls generation of duplicate permutations
27 | * Option to add external permutation list
28 | * Option to permute amongst the subdomains
29 |
30 | ### Installation:
31 |
32 | ```bash
33 | go install -v github.com/Josue87/gotator@latest
34 | ```
35 |
36 | ### Running:
37 |
38 | * First, we need to make a combined list of all the subdomains (valid/invalid) we collected from all the above steps, whose permutations we will create.
39 | * To generate combinations you need to provide a small wordlist that contains common domain names like admin, demo, backup, api, ftp, email, etc.
40 | * [This](https://gist.githubusercontent.com/six2dez/ffc2b14d283e8f8eff6ac83e20a3c4b4/raw) is a good wordlist of 1K permutation words that we will need.
41 | * The below command generates a huge list of non-resolved subdomains.
42 |
43 | ```
44 | gotator -sub subdomains.txt -perm permutations_list.txt -depth 1 -numbers 10 -mindup -adv -md > gotator1.txt
45 | ```
46 |
47 | #### Flags:
48 |
49 | * **sub** - Specify subdomain list
50 | * **perm** - Specify permutation/append list
51 | * **depth** - Configure the depth
52 | * **numbers** - Configure the number of iterations to the numbers found in the permutations (up and down)
53 | * **mindup** - Set this flag to minimize duplicates. (For heavy workloads, it is recommended to activate this flag).
54 | * **md** - Extract 'previous' domains and subdomains from subdomains found in the list 'sub'.
55 | * **adv** - Advanced option. Generate permutations words with subdomains and words with -. And joins permutation word in the back
56 |
57 | 
58 |
59 | ### Resolution:
60 |
61 | * Now that we have made a huge list of all the possible subdomains that could exist, it's time to DNS resolve them and check for valid ones.
62 | * For this, we will again use [Puredns](https://github.com/d3mondev/puredns).
63 | * It's always better to generate fresh public DNS resolvers every time we use them.
64 |
65 | ```bash
66 | puredns resolve permutations.txt -r resolvers.txt
67 | ```
68 |
69 | **Flags:**
70 |
71 | * **resolve** - Use resolution mode
72 | * **r** - public DNS resolvers list
73 |
74 | In this way, we get those strangely named subdomains and increase our attack surface.
75 |
76 | ###
77 |
--------------------------------------------------------------------------------
/active-enumeration/tls-csp-cname.md:
--------------------------------------------------------------------------------
1 | # TLS, CSP, CNAME Probing
2 |
3 | ## 1) TLS Probing
4 |
5 | Nowadays, virtually all websites communicate over HTTPS (HyperText Transfer Protocol Secure). In order to use HTTPS, the website owner needs to obtain an SSL (Secure Sockets Layer) certificate.
6 |
7 | This SSL/TLS (Transport Layer Security) certificate often contains other hostnames belonging to the same organization.
8 |
9 | By clicking the "Lock 🔒" button in the address bar, you can view the TLS/SSL certificate of any website.
10 |
11 | 
12 |
13 |
14 |
15 | For this purpose, we will be using a tool called [**Cero**](https://github.com/glebarez/cero)
16 |
17 | #### Installation:
18 |
19 | ```bash
20 | go install github.com/glebarez/cero@latest
21 | ```
22 |
23 | #### Running:
24 |
25 | ```bash
26 | cero in.search.yahoo.com | sed 's/^*.//' | grep -e "\." | sort -u
27 | ```
28 |
29 | ![](.gitbook/assets/ceroo(1).png)
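Cero can also be pointed at raw IPs, CIDR ranges, and custom ports (the `-p` flag and CIDR input are assumptions based on cero's docs; the range below is a placeholder):

```bash
cero -p 443,8443 203.0.113.0/24 | sed 's/^*.//' | grep -e "\." | sort -u
```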
30 |
31 | ## 2) CSP Probing
32 |
33 | To defend against XSS attacks while still allowing cross-domain resource sharing, websites use CSP (Content Security Policy) headers. These headers sometimes contain domains/subdomains from which content is usually imported.
34 |
35 | Hence, these subdomains can be helpful for us. In the image below, you can see that I extracted domains/subdomains from the CSP header of [twitter.com](https://twitter.com):
36 |
37 | ```
38 | cat subdomains.txt | httpx -csp-probe -status-code -retries 2 -no-color | anew csp_probed.txt | cut -d ' ' -f1 | unfurl -u domains | anew -q csp_subdomains.txt
39 | ```
40 |
41 | 
42 |
43 | ## 3) CNAME Probing
44 |
45 | I personally came across 2-3 cases where visiting the CNAME of a website showed me the same website without its firewall (I don't know exactly why this happened).
46 |
47 | Since then, I also probe the CNAMEs of the subdomains I find.
48 |
49 | ```
50 | dnsx -retry 3 -cname -l subdomains.txt
51 | ```
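To keep only the CNAME targets themselves for further inspection, dnsx's `-resp-only` flag can be added:

```bash
dnsx -retry 3 -cname -resp-only -l subdomains.txt | sort -u
```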
52 |
--------------------------------------------------------------------------------
/active-enumeration/vhost-probing.md:
--------------------------------------------------------------------------------
1 | # VHOST probing
2 |
3 | ## What is Virtual Host?
4 |
5 | VHOST (Virtual Host) refers to the practice of running more than one website (such as `company1.example.com` and `company2.example.com`) on a single machine.
6 |
7 | There are mainly 2 types of Virtual hosts:
8 |
9 | 1. **IP-based Virtual Host:**
10 |
11 | In IP-based virtual hosting, every website has its own IP address.
12 | 2. **Name-based Virtual Host:**✔️
13 |
14 | In name-based virtual hosting, several websites are hosted on the same IP. This type is the most widely used and preferred, as it preserves IP address space.
15 |
16 | But when talking about VHOSTs, we are generally talking about **name-based virtual hosts**.
17 |
18 | ### How does this actually work?
19 |
20 | Now, you might be wondering how the web server differentiates which website it should send your request to, since many websites are hosted on the same server with the same IP.
21 |
22 | It's through the "**Host header**". The web server identifies which content to serve once it receives the Host header from the client.
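You can see this in action by sending different Host headers to the same IP (placeholder values):

```bash
# Same IP, different Host header, different website served
curl -s -H "Host: company1.example.com" http://203.0.113.10/
curl -s -H "Host: company2.example.com" http://203.0.113.10/
```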
23 |
24 |
25 |
26 | 
27 |
28 | ### How to identify VHOSTs on a single IP?
29 |
30 | For this purpose, we can use a tool called [HostHunter](https://github.com/SpiderLabs/HostHunter).
31 |
32 | ### [HostHunter](https://github.com/SpiderLabs/HostHunter)
33 |
34 | * **Author**: [SpiderLabs](https://github.com/SpiderLabs)
35 | * **Language**: Python
36 |
37 | #### Installation:
38 |
39 | ```
40 | git clone https://github.com/SpiderLabs/HostHunter.git
41 | cd HostHunter/ && pip3 install -r requirements.txt
42 | ```
43 |
44 | #### Running:
45 |
46 | ```
47 | python3 hosthunter.py ip_addresses.txt
48 | ```
49 |
50 | 
51 |
52 |
53 |
54 | ## VHOST bruteforcing
55 |
56 | > Sorry to say, I couldn't find appropriate content around the internet related to this topic. I don't use this technique myself, but yes, it is also a technique for discovering VHOSTs.
57 |
58 | ```
59 | gobuster vhost -u https://example.com -t 50 -w subdomains.txt
60 | ```
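Note: on recent gobuster versions (v3.1+), wordlist entries are no longer appended to the domain by default, so you may also need the `--append-domain` flag:

```bash
gobuster vhost -u https://example.com -t 50 -w subdomains.txt --append-domain
```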
61 |
--------------------------------------------------------------------------------
/automation.md:
--------------------------------------------------------------------------------
1 | # Automation 🤖
2 |
3 | It would be difficult for a person to perform all the above-mentioned techniques by hand. Hence, we need to rely on some kind of tool to automate such intensive steps. Wouldn't it be good if we just had to give our target name and **BOOM**!! 💥 the tool performs subdomain enumeration via all these techniques?
4 |
5 | ## [ReconFTW](https://github.com/six2dez/reconftw)
6 |
7 | 
8 |
9 | * Author: [six2dez](https://github.com/six2dez)
10 | * Language: Bash
11 |
12 | Yes, this tool performs subdomain enumeration via **6 unique techniques**. Currently, if configured well, it gives the most subdomains compared to any other open-source tool 🚀 . Let's take a look at the enumeration techniques it performs:-
13 |
14 | 1. **Passive Enumeration \(** [subfinder](https://github.com/projectdiscovery/subfinder), [assetfinder](https://github.com/tomnomnom/assetfinder), [amass](https://github.com/OWASP/Amass), [findomain](https://github.com/Findomain/Findomain), [crobat](https://github.com/cgboal/sonarsearch), [waybackurls](https://github.com/tomnomnom/waybackurls), [github-subdomains](https://github.com/gwen001/github-subdomains), [Anubis](https://jldc.me/), [gauplus](https://github.com/bp0lr/gauplus) and [mildew](https://github.com/daehee/mildew)\)
15 | 2. **Certificate transparency** \([ctfr](https://github.com/UnaPibaGeek/ctfr), [tls.bufferover](https://github.com/six2dez/reconftw/blob/main/tls.bufferover.run) and [dns.bufferover](https://github.com/six2dez/reconftw/blob/main/dns.bufferover.run)\)
16 | 3. **Bruteforce** \([puredns](https://github.com/d3mondev/puredns)\)
17 | 4. **Permutations** \([DNScewl](https://github.com/codingo/DNSCewl)\)
18 | 5. **JS files & Source Code Scraping** \([gospider](https://github.com/jaeles-project/gospider), [analyticsRelationship](https://github.com/Josue87/analyticsRelationship)\)
19 | 6. **DNS Records** \([dnsx](https://github.com/projectdiscovery/dnsx)\) 🤖
20 |
21 | ### Installation:
22 |
23 | * The [installer](https://github.com/six2dez/reconftw/blob/main/install.sh) script installs all the required dependencies and tools required.
24 |
25 | ```bash
26 | git clone https://github.com/six2dez/reconftw
27 | cd reconftw/
28 | ./install.sh
29 | ```
30 |
31 | ### Running ReconFTW:
32 |
33 | * ReconFTW has a `-s` flag that performs subdomain enumeration & web probing.
34 | * Out of the 6 techniques, if we want to skip any step, we can do so through its [config](https://github.com/six2dez/reconftw/blob/main/reconftw.cfg#L49) file. Just set the value of that particular function to `false`
35 | * Also, you can provide your own wordlist for bruteforcing by specifying them in the reconftw config file.
36 | * It's highly recommended that you run this tool on a VPS.
37 |
38 | ```bash
39 | ./reconftw.sh -d example.com -s
40 | ```
41 |
42 | **Flags:**
43 |
44 | * **d** - target domain
45 | * **s** - Perform subdomain enumeration
46 |
47 | {% hint style="success" %}
48 | **Tip**: 🧙♂ Using `--deep` mode will run more time-consuming steps but return more subdomains
49 | {% endhint %}
50 |
51 |
52 |
53 |
54 | ## Taking Subdomain Enumeration to next level 🚀 🚀
55 |
56 | The biggest fear while performing subdomain enumeration is that the public DNS resolvers we use might ban or time us out, since we query them at a high rate for a prolonged period from a single VPS IP address. But what if we performed the same task by distributing the workload among several VPS instances? The chances of a ban would be lower, and the execution time considerably shorter, right?
57 |
58 | That's when **Axiom** comes to the rescue.
59 |
60 | ### [Axiom](https://github.com/pry0cc/axiom) 🤍
61 |
62 | * Author: [pry0cc](https://github.com/pry0cc)
63 | * Language: Bash
64 | * Supports: Digital Ocean, Linode, Azure, GCP, IBM
65 |
66 | Axiom is a dynamic infrastructure framework that helps distribute the workload of a single task equally among several cloud instances. It's a perfect tool for performing mass recon. You first need to install Axiom on the VPS/system from which you will spin the cloud instances up and down.
67 |
68 | ### How does axiom work?
69 |
70 | Let's say we want to perform DNS bruteforcing. First, you will need to initialize a fleet of instances; this can be any number of instances you want \(and are authorized\) to create. Within a matter of 4-5 minutes, that many instances are initialized and ready to accept your commands. The steps are outlined below, followed by a command-level sketch.
71 |
72 | #### Steps:
73 |
74 | 1. Divide the bruteforce wordlist into an equal number of parts \(one per instance\)
75 | 2. Transfer each part to the respective instances
76 | 3. Perform standalone execution in separate instances
77 | 4. Merge the output results from all instances
78 | 5. Create a single output
79 |
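A command-level sketch of those steps from the controller \(the module name `puredns-resolve` is an assumption; check which modules your axiom install ships\):

```bash
axiom-fleet brute -i=10      # spin up 10 instances named brute01..brute10
axiom-select 'brute*'        # select them for the next scan
# axiom-scan splits the input, runs the module on every instance, merges the results
axiom-scan candidates.txt -m puredns-resolve -o resolved.txt
```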
80 | ### Installation:
81 |
82 | * Axiom has an interactive installer that will first ask for your cloud provider, API key, which provisioner to use, which region to choose, default instance size, etc.
83 |
84 | ```bash
85 | git clone https://github.com/pry0cc/axiom ~/.axiom/
86 | $HOME/.axiom/interact/axiom-configure
87 | ```
88 |
89 | ####
90 |
91 | #### Some stats: 📊
92 |
93 | | **Task** | **Axiom** \(15 instances\) ✅ | **Single VPS** \(4cpu/8gb\) |
94 | | :--- | :--- | :--- |
95 | | DNS bruteforcing \(11M wordlist\) | 1m 16s | 10m 28s |
96 | | Web probing \(50k subdomains\) | 1m 40s | 21m 22s |
97 |
98 |
99 |
100 | 
101 |
102 | Yes, it's possible to integrate **Axiom** in **ReconFTW**. Isn't that great? Do try this out !!😍
103 |
104 | ### Usage:
105 |
106 | * It's necessary to first install ReconFTW on your controller/main system and then install/set up Axiom.
107 | * Before running ReconFTW over Axiom it's recommended that you first initialize your fleet.
108 | * The thing to note here is that to run ReconFTW over Axiom, you have to use another script called `reconftw_axiom.sh`
109 |
110 | ```bash
111 | axiom-fleet testy -i=30
112 | axiom-select 'testy*'
113 | ./reconftw_axiom.sh -d example.com -s
114 | ```
115 |
116 |
117 |
118 |
119 |
120 | #### Liked my work? Don't hesitate to buy me a coffee XDD
121 |
122 | #### ❤💙💚 [https://www.buymeacoffee.com/siddheshparab](https://www.buymeacoffee.com/siddheshparab) 💚 💙 ❤
123 |
124 |
--------------------------------------------------------------------------------
/introduction/prequisites.md:
--------------------------------------------------------------------------------
1 | # Prerequisites
2 |
3 | ### What things do we need before performing a great enumeration?
4 |
5 | 1. **API keys of Passive DNS sources**
6 | 2. **100% accurate open public DNS resolvers**
7 | 3. **A VPS(Virtual Private Server)**
8 |
9 |
10 |
11 | ## 1) API keys for Passive DNS data :key:
12 |
13 | ### What is Passive DNS data?
14 |
15 | Whenever a domain is alive on the internet, a DNS query needs to be made to a DNS resolver in order to access it. With special probes activated on the DNS resolver, it is possible to record these queries and store them in a database. This doesn't record which client made the request, just the fact that at some point a domain was associated with a specific DNS record.
16 |
17 | Hence, with the help of these DNS record databases, we can learn which subdomains of a particular root domain have existed. These subdomains may currently be alive or dead (we further need to find out which are the valid ones). Various services/companies have been doing this work for the past several years. Along with this, various companies run internet crawlers that continuously crawl the whole internet and discover new domains.
18 |
19 | There are a number of services/sources ([SecurityTrails](https://securitytrails.com/), [Censys](https://censys.io/), [Shodan](https://www.shodan.io/), [VirusTotal](https://www.virustotal.com/), [WhoisXMLAPI](https://www.whoisxmlapi.com/), [DNSDB](https://www.farsightsecurity.com/tools/dnsdb-scout/)) that provide such historical DNS data & crawled data. These services provide API keys so that we can query and retrieve subdomains of our choice.
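To get a feel for what such a source returns, you can query a free passive DNS endpoint by hand (a sketch using AlienVault OTX's public passive DNS API):

```bash
curl -s "https://otx.alienvault.com/api/v1/indicators/domain/example.com/passive_dns" \
    | grep -oE '"hostname": ?"[^"]+"' | sort -u
```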
20 |
21 | ### Configuring API keys:
22 |
23 | **There are 2 types of passive DNS services:-**
24 |
25 | **1) Sources that allow querying their datasets freely (partially)**\
26 | A number of sources allow users to freely query their DNS datasets. Check out which sources allow free querying of their datasets here. (We don't need to worry about these sources, as our subdomain enumeration tools like [**amass**](https://github.com/OWASP/Amass), [**subfinder**](https://github.com/projectdiscovery/subfinder), [**assetfinder**](https://github.com/tomnomnom/assetfinder) will query them and do the work for us:yum: )
27 |
28 | **2) Sources that require API keys to query their datasets**\
29 | Other sources require you to sign up on their platform and generate a unique API key so that you are authorized to query and retrieve results from their historical DNS datasets.
30 |
31 | ### Problems with obtaining free API keys of good passive sources:
32 |
33 | * Good passive sources provide API keys that are valid only for a limited period. (7 days/20 days)
34 | * They provide a limited API query quota. (50 per day/1000 per month)
35 | * Limited query results (2 pages of data)
36 |
37 | ### Is it worth making API keys?
38 |
39 | * Yes, absolutely. Given below is a comparison between running [**Subfinder** ](https://github.com/projectdiscovery/subfinder)with API keys configured and without.
40 | * You can clearly see the difference: using API keys gave me a whopping **198,669 more**💥 subdomains.
41 | * Further, this passive data will be used to generate permutations/alterations, which will eventually give us even more subdomains.
42 |
43 | {% tabs %}
44 | {% tab title="Without API keys " %}
45 | This process takes around 2hrs and gives 13k valid resolvers.
# https://otx.alienvault.com (Free)
65 | [data_sources.AlienVault]
66 | [data_sources.AlienVault.Credentials]
67 | apikey = dca0d4d692a6fd757107333d43d5f284f9a38f245d267b1cd72b4c5c6d5c31
68 |
69 |
70 | # How to add 2 API keys for a single service
71 | # https://app.binaryedge.com (Free)
72 | [data_sources.BinaryEdge]
73 | ttl = 10080
74 | [data_sources.BinaryEdge.account1]
75 | apikey = d749e0d3-ff9e-gcd0-a913-b5e62f6f216a
76 | [data_sources.BinaryEdge.account2]
77 | apikey = afdb97ff-t65e-r47f-bba7-c51dc5d83347
78 |
79 |
80 | ### **Running Amass:**
81 |
82 | * After setting up the API keys, we are good to run amass.
83 |
84 | ```bash
85 | amass enum -passive -d example.com -config config.ini -o output.txt
86 | ```
87 |
88 | **Flags:-**
89 |
90 | * **enum** - Perform DNS enumeration
91 | * **passive** - passively collect information through the data sources mentioned in the config file.
92 | * **config** - Specify the location of your config file (default: `$HOME/.config/amass/config.ini` )
93 | * **o** - Output filename
94 |
95 | {% hint style="success" %}
96 | :man\_mage:**Tip**: After configuring your config file, you can verify whether the API keys have been set up correctly with this command:\
97 | _amass enum -list -config config.ini_
98 | {% endhint %}
99 |
100 |
101 |
102 | ### 2) Subfinder
103 |
104 | * **Author**: [projectdiscovery](https://github.com/projectdiscovery)
105 | * **Language**: Go
106 | * **Total Passive Sources**: **38**
107 |
108 | [**Subfinder** ](https://github.com/projectdiscovery/subfinder)is yet another great tool that one should have in their pipeline. There are some unique sources that subfinder queries which amass doesn't. This tool has been developed by the famous ProjectDiscovery team, whose tools are used by every other bugbounty hunter.
109 |
110 | ### :gear:Configuring Subfinder:
111 |
112 | **Installation:**
113 |
114 | ```bash
115 | go install -v github.com/projectdiscovery/subfinder/v2/cmd/subfinder@latest
116 | ```
117 |
118 | **Setting up Subfinder configuration file:**
119 |
120 | * Subfinder's default config file location is _`$HOME/.config/subfinder/provider-config.yaml`_
121 | * After your first installation, if the configuration file wasn't generated, run the `subfinder` command once more to get it created.
122 | * The subfinder config file follows YAML (YAML Ain't Markup Language) syntax, so be careful not to break the syntax. It's best to use a text editor with syntax highlighting set up.
123 |
124 | **Example config file:-**
125 |
126 | * [**Link**](https://gist.github.com/sidxparab/ba50e138e5c912c7c59532ce38399d1b) to my subfinder config file for reference.
127 | * Some passive sources like `Censys` and `PassiveTotal` use 2 keys in combination to authenticate a user. For such services, both values need to be specified with a colon(:) between them. _(Check how the "Censys" source values are specified as `APP-id`:`Secret` in the example below.)_
128 | * Subfinder automatically detects its config file only if it is at the default location.
129 |
130 | ```yaml
131 | securitytrails: []
132 | censys:
133 | - ac244e2f-b635-4581-878a-33f4e79a2c13:dd510d6e-1b6e-4655-83f6-f347b363def9
134 | shodan:
135 | - AAAAClP1bJJSRMEYJazgwhJKrggRwKA
136 | github:
137 | - d23a554bbc1aabb208c9acfbd2dd41ce7fc9db39
138 | - asdsd54bbc1aabb208c9acfbd2dd41ce7fc9db39
139 | passivetotal:
140 | - sample-email@user.com:password123
141 | ```
142 |
143 | ### **Running Subfinder:**
144 |
145 | ```bash
146 | subfinder -d example.com -all -config config.yaml -o output.txt
147 | ```
148 |
149 | **Flags:-**
150 |
151 | * **d** - Specify our target domain
152 | * **all** - Use all passive sources (slow enumeration but more results)
153 | * **config** - Config file location
154 |
155 | {% hint style="success" %}
156 | :man\_mage: **Tip:-** To view the sources that require API keys, use the `subfinder -ls` command
157 | {% endhint %}
158 |
159 |
160 |
161 | ### **3) Assetfinder**
162 |
163 | * **Author**: [tomnomnom](https://github.com/tomnomnom)
164 | * **Language**: Go
165 | * **Total passive sources**: **9**
166 |
167 | Don't know why I included this tool:joy: just because it's built by the legend [Tomnomnom](https://twitter.com/TomNomNom)? It doesn't give any unique subdomains compared to other tools, but it's extremely fast.
168 |
169 | ```bash
170 | go install github.com/tomnomnom/assetfinder@latest
171 | ```
172 |
173 | **Running:**
174 |
175 | ```bash
176 | assetfinder --subs-only example.com > output.txt
177 | ```
178 |
179 |
180 |
181 | ### 4) Findomain
182 |
183 | * **Author**: [Edu4rdSHL](https://github.com/Edu4rdSHL)
184 | * **Language**: Rust
185 | * **Total Passive sources**: 21
186 |
187 | [**Findomain** ](https://github.com/Findomain/Findomain)is one of the standard subdomain finder tools in the industry, and another extremely fast enumeration tool. It also has a paid version that offers many more features like subdomain monitoring, resolution, and lower resource consumption.
188 |
189 | ### Configuring Findomain: :gear:
190 |
191 | **Installation:-**
192 |
193 | * Download the binary for your architecture from [here](https://github.com/Findomain/Findomain/wiki/Installation#using-upstream-precompiled-binaries)
194 |
195 | ```bash
196 | wget -N -c https://github.com/Findomain/Findomain/releases/download/9.0.0/findomain-linux.zip
197 | unzip findomain-linux.zip
198 | mv findomain /usr/local/bin/findomain
199 | chmod 755 /usr/local/bin/findomain
200 | ```
201 |
202 | **Configuration:-**
203 |
204 | * You need to define the API keys in your `.bashrc` or `.zshrc`.
205 | * Findomain will pick them up automatically.
206 |
207 | ```bash
208 | export findomain_virustotal_token="API_KEY"
209 | export findomain_fb_token="API_KEY"
210 | ```
211 |
212 | ### **Running Findomain:**
213 |
214 | ```bash
215 | findomain -t example.com -u output.txt
216 | ```
217 |
218 | **Flags:-**
219 |
220 | * **t** - Target domain
221 | * **u** - Output file
222 |
223 |
224 |
225 |
226 |
227 | ## B) Internet Archives
228 |
229 | Internet Archives deploy their own web crawlers and indexing systems that crawl every website on the internet. Hence, they hold historical data on all the websites that once existed. This makes Internet Archives a useful source for grabbing subdomains of a particular target that once existed, on which we can later perform permutations (more on this later) to get more valid subdomains.
230 |
231 | When queried, Internet Archives give back URLs. Since we are only concerned with the subdomains, we need to process those URLs to grab only the unique FQDN subdomains from them.
232 |
233 | For this, we use a tool called [unfurl](https://github.com/tomnomnom/unfurl). This tool helps to extract the domain name from a list of URLs.
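
To get a feel for the raw data, here's a sketch that queries the Wayback Machine's CDX API directly and pipes the archived URLs through unfurl (the CDX query parameters follow the public API, but treat the exact invocation as an assumption):

```bash
# Pull every archived URL for *.example.com from the Wayback CDX API,
# then reduce the list to unique domain names with unfurl.
curl -s "http://web.archive.org/cdx/search/cdx?url=*.example.com/*&output=text&fl=original&collapse=urlkey" \
  | unfurl -u domains | sort -u
```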
234 |
235 | ### 5) Gau
236 |
237 | * **Author**: [lc](https://github.com/lc)
238 | * **Language**: Go
239 | * **Sources**:
240 | * [web.archive.org](http://web.archive.org/)
241 | * [index.commoncrawl.org](http://index.commoncrawl.org/)
242 | * [otx.alienvault.com](https://otx.alienvault.com/)
243 | * [urlscan.io](https://urlscan.io/)
244 |
245 | [**Gau** ](https://github.com/lc/gau)works by querying all 4 of the above internet archive services and grabs all the URLs that their internet-wide crawlers have ever crawled. Through this process we get tons of URLs belonging to our target that once existed. After collecting the URLs, we extract only the domain/subdomain part from them.
246 |
247 | #### Installation:
248 |
249 | ```bash
250 | go install github.com/lc/gau/v2/cmd/gau@latest
251 | ```
252 |
253 | #### Running gau:
254 |
255 | ```bash
256 | gau --threads 5 --subs example.com | unfurl -u domains | sort -u -o output_unfurl.txt
257 | ```
258 |
259 | **Flags:**
260 |
261 | * **threads** - How many workers to spawn
262 | * **subs** - Include subdomains of the target domain
263 |
264 |
265 |
266 | ### **6) Waybackurls**
267 |
268 | * **Author**: [tomnomnom](https://github.com/tomnomnom)
269 | * **Language**: Go
270 | * **Sources**:
271 | * [web.archive.org](http://web.archive.org/)
272 | * [index.commoncrawl.org](http://index.commoncrawl.org/)
273 | * [www.virustotal.com](https://www.virustotal.com)
274 |
275 | [**Waybackurls**](https://github.com/tomnomnom/waybackurls) works similarly to Gau, but I have found that it returns some unique data that Gau couldn't find. Hence, we need to include waybackurls in our arsenal.
276 |
277 | #### Installation:
278 |
279 | ```bash
280 | go install github.com/tomnomnom/waybackurls@latest
281 | ```
282 |
283 | #### **Running Waybackurls:**
284 |
285 | ```bash
286 | waybackurls example.com | unfurl -u domains | sort -u -o output.txt
287 | ```
288 |
289 |
290 |
291 |
292 |
293 |
294 |
295 | ## C) GitHub Scraping
296 |
297 | ### 7) Github-subdomains
298 |
299 | * **Author**: [gwen001](https://github.com/gwen001)
300 | * **Language**: Go
301 |
302 | Organizations sometimes host their source code on GitHub, and employees working at these organizations sometimes leak source code there too. Additionally, I have come across instances where security researchers host their reconnaissance data in public repositories. The tool Github-subdomains can help you extract these exposed/leaked subdomains of your target from GitHub.
303 |
304 | **Installation:**
305 |
306 | ```bash
307 | go install github.com/gwen001/github-subdomains@latest
308 | ```
309 |
310 | :gear:**Configuring github-subdomains:**
311 |
312 | * For github-subdomains to scrape domains from GitHub, you need to specify a list of GitHub access tokens.
313 | * [**Here**](https://docs.github.com/en/authentication/keeping-your-account-and-data-secure/creating-a-personal-access-token#creating-a-personal-access-token-classic) is an article on how you can generate your GitHub access tokens.
314 | * These access tokens are used by the tool to perform searches and find subdomains on your behalf.
315 | * I recommend that you make at least 10 tokens from 3 different accounts (30 in total) to avoid rate limiting.
316 | * Specify 1 token per line (see the sample `tokens.txt` below).
317 |
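
A hypothetical `tokens.txt` would look like this (the `ghp_` values below are made-up placeholders, not real tokens):

```
ghp_AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
ghp_BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
ghp_CCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCC
```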
318 | **Running github-subdomains:**
319 |
320 | ```bash
321 | github-subdomains -d example.com -t tokens.txt -o output.txt
322 | ```
323 |
324 | **Flags:**
325 |
326 | * **d -** target
327 | * **t** - file containing tokens
328 | * **o** - output file
329 |
330 |
331 |
332 | ## ~~**D)** Rapid7 Project Sonar dataset (deprecated)~~
333 |
334 | [Project Sonar](https://opendata.rapid7.com/about/) is a security research project by Rapid7 that conducts internet-wide scans. Rapid7 has been generous enough to make this data freely available to the public. Project Sonar contains [8 different datasets](https://opendata.rapid7.com/) with a total size of over **66.6 TB**, updated on a regular basis. You can read how to parse these datasets on your own in this [guide](https://0xpatrik.com/project-sonar-guide/).
335 |
336 | This internet-wide DNS dataset could be an excellent resource to grab our subdomains from, right? But querying such large datasets could take significant time. That's where **Crobat** comes to the rescue.
337 |
338 | ### 8) [Crobat](https://github.com/Cgboal/SonarSearch)
339 |
340 | * **Author**: [Cgboal](https://github.com/Cgboal)
341 | * **Language**: Go
342 |
343 | [Cgboal ](https://twitter.com/CalumBoal)has done an excellent job of parsing and indexing the whole Rapid7 Sonar dataset into MongoDB and creating an API to query this database. This Crobat API is freely available at [https://sonar.omnisint.io/](https://sonar.omnisint.io/). Moreover, he developed a command-line tool that uses this API and returns results at blazing fast speed.
344 |
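The API can also be queried directly. A sketch (endpoint as advertised in the SonarSearch README; since the underlying dataset is deprecated, the service may no longer respond):

```bash
# Ask the Crobat API for all known subdomains of example.com
curl -s "https://sonar.omnisint.io/subdomains/example.com" | jq -r '.[]'
```
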
345 | ### Installation:
346 |
347 | ```bash
348 | go install github.com/cgboal/sonarsearch/cmd/crobat@latest
349 | ```
350 |
351 | ### Running:
352 |
353 | ```bash
354 | crobat -s example.com > output.txt
355 | ```
356 |
357 | **Flags:**
358 |
359 | * **s** - Target Name
360 |
361 |
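Before wrapping up the passive phase, it's worth merging everything collected so far into one deduplicated master list (the input filenames below are placeholders for whatever outputs you saved from the tools above):

```bash
# Merge every passive source's output into a single deduplicated list
cat amass.txt subfinder.txt assetfinder.txt findomain.txt \
    gau_unfurl.txt waybackurls.txt github.txt crobat.txt \
    | sort -u > passive_all.txt
wc -l < passive_all.txt   # quick count of unique candidates
```
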
362 |
363 |
364 |
365 |
366 |
367 | ## :checkered\_flag:**That's it !!! Done with passive things** :checkered\_flag:
368 |
369 | #### Liked my work? Don't hesitate to buy me a coffee XDD
370 |
371 | #### :heart::blue\_heart::green\_heart: [https://www.buymeacoffee.com/siddheshparab](https://www.buymeacoffee.com/siddheshparab) :green\_heart: :blue\_heart: :heart:
372 |
373 |
--------------------------------------------------------------------------------
/passive-enumeration/recursive-enumeration.md:
--------------------------------------------------------------------------------
1 | # Recursive Enumeration
2 |
3 | Through various testing and trying out new things for subdomain enumeration, my friend [Six2dez ](https://twitter.com/Six2dez1)came across a technique where running the subdomain enumeration tools again on each of the subdomains found yields more subdomains in total.
4 |
5 | In simple words, we run tools like Amass, Subfinder, and Assetfinder again on each of the subdomains that were found.
6 |
7 | {% hint style="danger" %}
8 | If you have set up API keys, this technique may consume your entire querying quota.
9 | {% endhint %}
10 |
11 | For a better understanding look at the image below:
12 |
13 | 
14 |
15 | ### Things to keep in mind:
16 |
17 | * This technique is only useful when your target has a large number of multi-level subdomains _(not effective for small & medium scope targets)._
18 | * It is recommended to execute this technique as the final step, exclusively on a validated list of subdomains that you have collected through other Passive + Active techniques.
19 | * This technique may consume all of your passive DNS services' API query quota _(if keys are configured)._
20 | * This technique takes time to return the final results.
21 |
22 | ### Workflow:
23 |
24 | 1. Read the list of subdomains from the file "subdomains.txt".
25 | 2. Process the subdomains in two steps:
26 |
27 | **a)** Find the Top-10 most frequently occurring Second-Level Domain names with the help of tools like cut, sort, rev, uniq, etc.\
28 | **b)** Find the Top-10 most frequently occurring Third-Level domain names.
29 | 3. Now run passive subdomain enumeration on these 10 Second-level domain names and 10 Third-level domain names using tools like amass, subfinder, assetfinder, findomain.
30 | 4. Keep appending the results to `passive_recursive.txt` file.
31 | 5. After finding this list of domain names, run puredns to DNS-resolve them and find the alive subdomains.
32 |
33 |
34 |
35 | _Replace `subdomains.txt` with the filename of your subdomains list._
36 |
37 | ```bash
38 | #!/bin/bash
39 |
40 | go install -v github.com/tomnomnom/anew@latest
41 | subdomain_list="subdomains.txt"
42 |
43 | for sub in $( ( cat $subdomain_list | rev | cut -d '.' -f 3,2,1 | rev | sort | uniq -c | sort -nr | grep -v '1 ' | head -n 10 && cat $subdomain_list | rev | cut -d '.' -f 4,3,2,1 | rev | sort | uniq -c | sort -nr | grep -v '1 ' | head -n 10 ) | sed -e 's/^[[:space:]]*//' | cut -d ' ' -f 2);do
44 | subfinder -d $sub -silent -max-time 2 | anew -q passive_recursive.txt
45 | assetfinder --subs-only $sub | anew -q passive_recursive.txt
46 | amass enum -timeout 2 -passive -d $sub | anew -q passive_recursive.txt
47 | findomain --quiet -t $sub | anew -q passive_recursive.txt
48 | done
49 | ```
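
The resolution in step 5 isn't part of the script itself; here's a sketch of that final step with puredns, assuming you already have a validated `resolvers.txt`:

```bash
# DNS-resolve the recursive findings and keep only the alive subdomains
puredns resolve passive_recursive.txt -r resolvers.txt -w recursive_alive.txt
```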
50 |
--------------------------------------------------------------------------------
/types/horizontal-enumeration.md:
--------------------------------------------------------------------------------
1 | # Horizontal Enumeration
2 |
3 | While performing a security assessment, our main goal is to map out all the root domains owned by a single entity, i.e. to make an inventory of all the internet-facing assets of a particular organization. It is a bit trickier to find related domains/acquisitions of a particular organization, as this step involves some tedious methods and doesn't always guarantee accurate results. One has to perform manual analysis to verify the results.
4 |
5 | From the below image you can get an idea of what a **horizontal domain correlation** is:
6 |
7 | 
8 |
9 | \
10 | Let's look at how to find these related horizontal domains.
11 |
12 | {% hint style="danger" %}
13 | These enumeration methods can go out of scope and backfire on you. Proceed with caution!
14 | {% endhint %}
15 |
16 | ## 1) Finding related domains/acquisitions
17 |
18 | #### a) **WhoisXMLAPI**
19 |
20 | [**WhoisXMLAPI** ](https://www.whoisxmlapi.com/)is an excellent source that provides a good number of related domains & acquisitions based on WHOIS records. Signing up on their platform will assign you **500 free credits**, which renew every month.\
21 | Visit [https://tools.whoisxmlapi.com/reverse-whois-search](https://tools.whoisxmlapi.com/reverse-whois-search). Searching with a root domain name like **dell.com** will give you a list of all the associated domains.
22 |
23 | 
24 |
25 | {% hint style="warning" %}
26 | These are not 100% accurate results, as they contain false positives
27 | {% endhint %}
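
If you prefer the command line, amass's intel module can perform a similar reverse-WHOIS lookup (a sketch; result quality depends on the WHOIS data amass can reach):

```bash
# Reverse WHOIS: find root domains related to dell.com via WHOIS records
amass intel -whois -d dell.com
```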
28 |
29 | #### b) **Whoxy** :moneybag:
30 |
31 | [**Whoxy**](https://www.whoxy.com/) is yet another great source for performing reverse WHOIS on parameters like Company Name, Registrant Email address, Owner Name, and Domain keyword. Whoxy has an enormous database of around **455M WHOIS records**. But sadly this is a paid service :(
32 |
33 | To effectively use the Whoxy API, there's a command-line tool called [**whoxyrm**](https://github.com/MilindPurswani/whoxyrm)**.**
34 |
35 | ```
36 | go install github.com/milindpurswani/whoxyrm@latest
37 | export WHOXY_API_KEY="89acb0f4557df3237l1"
38 |
39 | whoxyrm -company-name "Red Bull GmBH"
40 | ```
41 |
42 | 
43 |
44 | #### c) Crunchbase:moneybag:
45 |
46 | [**Crunchbase**](https://www.crunchbase.com/) is another great alternative for finding acquisitions but requires a paid subscription to view all the acquisitions. The trial version allows viewing some of the acquisitions.
47 |
48 | apt-get install whois
68 | whois -h whois.radb.net -- '-i origin AS714' | grep -Eo "([0-9.]+){4}/[0-9]+" | uniq -u
69 |
70 |
71 | #Installation
150 | git clone https://github.com/pielco11/fav-up.git
151 | cd fav-up/
152 | pip3 install -r requirements.txt
153 | apt-get install jq
154 |
155 | #Initializing Shodan API key
156 | shodan init A5TCTEH78E6Zhjdva6X2fls6Oob9F2hL
157 |
158 | #Running the tool
159 | python3 favUp.py -w www.github.com -sc -o output.json
160 |
161 | #Parsing the output
162 | cat output.json | jq -r 'try .found_ips' | sed "s/|/\n/g"
163 |
164 |
165 |
166 |