├── Information Gathering [ Reloaded ] 1d11633015374f1797b56a2447ced9a4
│   ├── Untitled.png
│   ├── Untitled 1.png
│   ├── Untitled 2.png
│   ├── Untitled 3.png
│   ├── Untitled 4.png
│   ├── Untitled 5.png
│   ├── Untitled 6.png
│   ├── Untitled 7.png
│   ├── Untitled 8.png
│   ├── Untitled 9.png
│   ├── Untitled 10.png
│   ├── Untitled 11.png
│   ├── Untitled 12.png
│   ├── Untitled 13.png
│   ├── Untitled 14.png
│   ├── Untitled 15.png
│   ├── Untitled 16.png
│   ├── Untitled 17.png
│   ├── Untitled 18.png
│   ├── Untitled 19.png
│   ├── Untitled 20.png
│   ├── Untitled 21.png
│   ├── Untitled 22.png
│   ├── Untitled 23.png
│   ├── Untitled 24.png
│   ├── Untitled 25.png
│   ├── Untitled 26.png
│   ├── Untitled 27.png
│   ├── Untitled 28.png
│   ├── Untitled 29.png
│   ├── Untitled 30.png
│   ├── Untitled 31.png
│   ├── Untitled 32.png
│   ├── Untitled 33.png
│   ├── Untitled 34.png
│   ├── Untitled 35.png
│   ├── Untitled 36.png
│   ├── Untitled 37.png
│   ├── Untitled 38.png
│   ├── Untitled 39.png
│   ├── Untitled 40.png
│   ├── Untitled 41.png
│   ├── Untitled 42.png
│   ├── Untitled 43.png
│   ├── Untitled 44.png
│   ├── Untitled 45.png
│   ├── Untitled 46.png
│   ├── Untitled 47.png
│   ├── Untitled 48.png
│   ├── Untitled 49.png
│   ├── Untitled 50.png
│   └── Untitled 51.png
└── Readme.md
/Readme.md:
--------------------------------------------------------------------------------
# Information Gathering [ Reloaded ]

## Information Gathering & Scanning for sensitive information

- **Whois Lookup**

To check other websites registered by the registrant of the target site (a reverse check on the registrant, email address, and telephone number), followed by an in-depth investigation of the sites found.

```bash
whois target.tld
```

- **Website IP**

To collect server-side information we sometimes need the IP of the website, but many websites use a CDN to hide it; below is the conventional way to bypass the CDN. If no CDN is in front of the target, just use ping to find the website's IP.

```bash
ping uber.com
```

**If a CDN is in place**

```bash
http://crimeflare.org:82/cfs.html # to find the real IP behind Cloudflare
or
https://github.com/gwen001/pentest-tools/blob/master/cloudflare-origin-ip.py
or
https://censys.io/
```

- **Asset Discovery**
- **Horizontal domain correlation**

Most of the time we focus on subdomains, but that skips the other half of asset discovery, a.k.a. horizontal domain correlation.

What is horizontal domain correlation?

Horizontal domain correlation is the process of finding other domain names that have a different second-level domain name but are related to the same entity.

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled.png)

Firstly, let's think about it. We cannot rely on a syntactic match as we did in the previous step. Potentially, [abcabcabc.com](http://abcabcabc.com/) and [cbacbacba.com](http://cbacbacba.com/) can be owned by the same entity; however, they don't match syntactically. For this purpose, we can use WHOIS data. There are some reverse WHOIS services which allow you to search based on a common value from the WHOIS database.
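Before you can query those services you need a pivot value from the WHOIS record itself; a minimal sketch for pulling candidate contact emails out of a plain `whois` response (field names vary by registrar and TLD, so treat the grep pattern as an example):

```bash
# Grab candidate registrant/admin emails from the WHOIS record so they can
# be fed into a reverse-WHOIS search (output format differs per registrar).
whois sony.com | grep -iE 'registrant email|admin email'
```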
Let's run whois for sony.com.

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%201.png)

Now let's do a reverse WHOIS lookup with the registrant email; we can do that using [https://viewdns.info/reversewhois/](https://viewdns.info/reversewhois/).

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%202.png)

More reverse WHOIS sites:

```
https://domaineye.com/reverse-whois
https://www.reversewhois.io/
```

We can also do this from the CLI:

[https://gist.github.com/JoyGhoshs/80543553f7442271fbc1092a9de08385](https://gist.github.com/JoyGhoshs/80543553f7442271fbc1092a9de08385)

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%203.png)

- **Subdomain Enumeration / Vertical domain correlation**
- Passive Enumeration

There are many tools available on the internet that gather subdomains from many different sources, but we can also use those passive subdomain sources to find subdomains manually.

- **Google Dorking**

```
site:*.target.tld
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%204.png)

It's hard to go page by page and copy those subdomains, so let's make it CLI based:

```bash
#!/usr/bin/bash
domain=$1
agent="Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.85 Safari/537.36"
curl -s -A "$agent" "https://www.google.com/search?q=site%3A*.$domain&start=10" | grep -Po '((http|https):\/\/)?(([\w.-]*)\.([\w]*)\.([A-z]))\w+' | grep "$domain" | sort -u
curl -s -A "$agent" "https://www.google.com/search?q=site%3A*.$domain&start=20" | grep -Po '((http|https):\/\/)?(([\w.-]*)\.([\w]*)\.([A-z]))\w+' | grep "$domain" | sort -u
curl -s -A "$agent" "https://www.google.com/search?q=site%3A*.$domain&start=30" | grep -Po '((http|https):\/\/)?(([\w.-]*)\.([\w]*)\.([A-z]))\w+' | grep "$domain" | sort -u
curl -s -A "$agent" "https://www.google.com/search?q=site%3A*.$domain&start=40" | grep -Po '((http|https):\/\/)?(([\w.-]*)\.([\w]*)\.([A-z]))\w+' | grep "$domain" | sort -u
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%205.png)

You can simplify the script by looping over the start parameter: 10 means page one, 20 means page two, and so on.

- **Bing Dorking**

```
site:uber.com
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%206.png)

[**Shodan.io**](http://shodan.io)

We can enumerate subdomains from Shodan using the search web interface or the Python-based CLI client.
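The CLI client ships with Shodan's official Python package; assuming pip3 is available, installing it is a one-liner:

```bash
pip3 install shodan   # provides the `shodan` command used below
```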
**Web-Client Dork**

> hostname:"target.tld"

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%207.png)

**CLI Client**

```bash
shodan init your_api_key # set your API key on the client
shodan domain domain.tld
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%208.png)

- **Hackertarget.com**

```
https://hackertarget.com/find-dns-host-records/
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%209.png)

Hackertarget also has an API, which we can use from the CLI without any auth token or key:

```bash
curl -s https://api.hackertarget.com/hostsearch/?q=uber.com | grep -Po '((http|https):\/\/)?(([\w.-]*)\.([\w]*)\.([A-z]))\w+'
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2010.png)

- **Crt.sh**

To find subdomains from certificate transparency logs.

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2011.png)

Let's make a one-liner for this so we can grab it from the CLI:

```bash
curl -s "https://crt.sh/?q=%25.target.tld&output=json" | jq -r '.[].name_value' | sed 's/\*\.//g' | sort -u
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2012.png)

It gets boring to go through this many screenshots, so here are one-liners for other sources to get the target domain's subdomains from the CLI.

- [riddler.io](http://riddler.io)

```bash
curl -s "https://riddler.io/search/exportcsv?q=pld:domain.com" | grep -Po "(([\w.-]*)\.([\w]*)\.([A-z]))\w+" | sort -u
```

- [subbuster.cyberxplore.com](http://subbuster.cyberxplore.com)

```bash
curl "https://subbuster.cyberxplore.com/api/find?domain=domain.tld" -s | grep -Po "(([\w.-]*)\.([\w]*)\.([A-z]))\w+"
```

- certspotter

```bash
curl -s "https://certspotter.com/api/v1/issuances?domain=domain.com&include_subdomains=true&expand=dns_names" | jq .[].dns_names | tr -d '[]"\n ' | tr ',' '\n'
```

**SAN [ Subject Alternative Name ] domain extraction**

These are a small sample of the sources for gathering subdomains; now let's look at SAN-based subdomain enumeration. SAN stands for Subject Alternative Name: the Subject Alternative Name is an extension to the X.509 specification that allows additional host names to be specified for a single SSL certificate.

Let's write a bash script that extracts domains from an SSL certificate using openssl.
```bash
sed -ne 's/^\( *\)Subject:/\1/p;/X509v3 Subject Alternative Name/{
N;s/^.*\n//;:a;s/^\( *\)\(.*\), /\1\2\n\1/;ta;p;q; }' < <(
openssl x509 -noout -text -in <(
openssl s_client -ign_eof 2>/dev/null <<<$'HEAD / HTTP/1.0\r\n\r' \
-connect sony.com:443 ) )
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2013.png)

Let's filter only the domains out of this result using grep:

```bash
sed -ne 's/^\( *\)Subject:/\1/p;/X509v3 Subject Alternative Name/{
N;s/^.*\n//;:a;s/^\( *\)\(.*\), /\1\2\n\1/;ta;p;q; }' < <(
openssl x509 -noout -text -in <(
openssl s_client -ign_eof 2>/dev/null <<<$'HEAD / HTTP/1.0\r\n\r' \
-connect sony.com:443 ) ) | grep -Po '((http|https):\/\/)?(([\w.-]*)\.([\w]*)\.([A-z]))\w+'
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2014.png)

**DNS enumeration using Cloudflare**

This is a slightly more involved process because you need a Cloudflare account to use this Python 3 script; the script uses Cloudflare to extract subdomains.

```bash
wget https://raw.githubusercontent.com/appsecco/bugcrowd-levelup-subdomain-enumeration/master/cloudflare_enum.py
# Log into Cloudflare: https://www.cloudflare.com/login
# "Add site" to your account: https://www.cloudflare.com/a/add-site
# Provide the target domain as the site you want to add
# Wait for Cloudflare to dig through DNS data and display the results
python cloudflare_enum.py your@email.com target.tld
```

**Using tools to enumerate subdomains**

- assetfinder

assetfinder is a passive subdomain enumeration tool from tomnomnom; it gets subdomains from different sources and combines them.

```bash
go get -u github.com/tomnomnom/assetfinder # download assetfinder

assetfinder --subs-only domain.tld # enumerate the subdomains
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2015.png)

It's fast and mostly accurate for passively collecting subdomains.

- findomain

findomain is best known for its speed and accurate results. Most of the time these passive subdomain enumerators (subfinder, findomain, assetfinder) use the same process and the same APIs; the advantage of using all of them is that no passively gathered subdomain gets missed.

```bash
download from [ https://github.com/Findomain/Findomain/releases/tag/5.0.0 ]

findomain -t target.tld -q
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2016.png)

- Subfinder

Subfinder is a subdomain discovery tool that discovers valid subdomains for websites by using passive online sources.
```bash
download https://github.com/projectdiscovery/subfinder/releases/tag/v2.4.8
subfinder -d domain.tld --silent
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2017.png)

- Active Enumeration

In this phase we are going to enumerate subdomains actively. Online passive subdomain databases sometimes miss newly added subdomains; with active enumeration we can find live subdomains and new, unique ones. We are going to brute-force for subdomains. There are many tools available for subdomain brute-forcing; we will use a selected few.

**Nmap**

Nmap has a script for brute-forcing DNS, and we are going to use it to brute-force subdomains.

```bash
nmap --script dns-brute --script-args dns-brute.domain=uber.com,dns-brute.threads=6
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2018.png)

### **Goaltdns with massdns**

Goaltdns is a permutation generation tool and massdns is a DNS resolver. We are going to generate permutations with goaltdns and resolve them with massdns.

**Wordlists you can use:**

[jhaddix/all.txt](https://www.notion.so/86a06c5dc309d08580a018c66354a056)

[https://github.com/rbsec/dnscan/blob/master/subdomains-10000.txt](https://github.com/rbsec/dnscan/blob/master/subdomains-10000.txt)

```bash
[Download-Goaltdns] https://github.com/subfinder/goaltdns
[Download-Massdns] https://github.com/blechschmidt/massdns
```

**Permutation generation [ goaltdns ]**

We will use two inputs here: we will generate permutations for the passively gathered subdomains, and for the target host itself.

```bash
goaltdns -l passive-subs.txt -w all.txt -o p.sub # permutations for passively collected domains
goaltdns -h uber.com -w all.txt -o p2.sub # permutations for the target host
cat p.sub p2.sub | tee -a all-sub.txt ; rm p.sub p2.sub # combine the two results
```

**Resolving the generated domains with massdns**

After generating the permutations, let's resolve those results with massdns. We are using this [resolver](https://raw.githubusercontent.com/janmasarik/resolvers/master/resolvers.txt) list.

```bash
massdns -r resolvers.txt -t AAAA -w result.txt all-sub.txt
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2019.png)

**If you want to save time, you can do permutation generation and resolving at the same time with shuffledns.**

```bash
[Download-shuffledns] go get -v github.com/projectdiscovery/shuffledns/cmd/shuffledns

shuffledns -d target.tld -list all.txt -r resolvers.txt
```

There is also a website that does the brute-forcing for us to find live subdomains:
[https://phpinfo.me/domain/](https://phpinfo.me/domain/)

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2020.png)

**Puredns**

A fast DNS brute-forcer.

```
https://github.com/d3mondev/puredns [Puredns-download]
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2021.png)

Combining the two enumeration techniques is useful for getting more unique subdomains: we can gather passive and active domains for the target at the same time.

**http/https probing**

After combining all the results you need to probe every domain/subdomain to detect whether it is serving HTTP or HTTPS. You can do that with a tool such as httprobe or httpx.

```bash
cat all-subs.txt | httprobe | tee subdomains.txt
cat all-subs.txt | httpx -silent | tee subdomains.txt
```

- **ASN lookup**

There are many ways to find a company's ASN number; the ASN number will help us retrieve the target's internet assets.

We can find a company's ASN number using dig and whois, but most of the time this will give you a hosting provider's ASN number.

Example:

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2022.png)

But you can find a cloud company's ASN with this technique, because they host on their own servers.

Example: google.com

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2023.png)

Now let's skip the small talk: we can extract the ASN/IP data of a target company using a free API from asnlookup.com.

```
http://asnlookup.com/api/lookup?org=tesla
```

It will give you all the CIDR ranges for Tesla Inc.

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2024.png)

Now select any of these IPs and do a whois search to get the ASN number.

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2025.png)

Let's make the API CLI based so we can call it from the command line using Python 3:

```python
import requests
import json
from colorama import Fore

def asn_lookup(company):
    headers = {
        'User-Agent': 'ASNLookup PY/Client'
    }
    asn_db = requests.get(f'http://asnlookup.com/api/lookup?org={company}', headers=headers).text
    print(f'{Fore.GREEN}[+] {Fore.WHITE}ASN Lookup Result For {company}')
    print('')
    asndb_load = json.loads(asn_db)
    for iprange in asndb_load:
        print(iprange)

asn_lookup('company_name')
```

So where can we use this ASN number?

We can use the ASN number on hacker search engines like Shodan to extract more information about the target company's internal network.

Shodan dorks:

```
asn:AS394161
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2026.png)

Or we can use Censys to find more information about the target company.
Censys dorks:

```
autonomous_system.asn:394161
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2027.png)

Or we can find the ASN number from the WhatIsMyIP database.

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2028.png)

- **Target Visualize/Web-Screenshot**

After enumerating subdomains/domains we need to visualize those targets to see what the user interface looks like and, above all, whether a subdomain is leaking any important information or a database.

Sometimes domain/subdomain enumeration yields 2k-10k subdomains; visiting all of them by hand is practically impossible since it would take 30-40 hours or more. There are many tools available to screenshot those subdomains from a subdomain list.

**Gowitness**

It's quite fast and doesn't require any external dependency.

```bash
[download-gowitness] https://github.com/sensepost/gowitness
gowitness file -f subdomains
gowitness single https://uber.com # for a single domain
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2029.png)

**EyeWitness**

```bash
[download-eyewitness] https://github.com/FortyNorthSecurity/EyeWitness
./EyeWitness -f subdomains.txt --web
```

**Webscreenshot**

```bash
[download-webscreenshot] pip3 install webscreenshot
webscreenshot -i subdomains.txt
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2030.png)

- **Scanning for directories with possible sensitive information**

Sometimes a directory on a domain or subdomain contains sensitive information such as a site backup, a site database backup, a private API interface backup, or other sensitive stuff stored under the site's webroot and exposed to the whole internet. Search engines often miss these because they are rarely visited pages, so we will use a directory fuzzer / directory brute-forcer to find those sensitive files.

For wordlists you can try SecLists:

[SecLists/Discovery at master · danielmiessler/SecLists](https://github.com/danielmiessler/SecLists/tree/master/Discovery)

**Dirsearch**

dirsearch is one of the fastest and most feature-rich directory brute-forcers.

```bash
[download-dirsearch] https://github.com/maurosoria/dirsearch
python3 dirsearch.py -u https://target.tld -e php # single target with the default wordlist
python3 dirsearch.py -e php -u https://target.tld -w /path/to/wordlist # with a custom wordlist
python3 dirsearch.py -l subdomains.txt -e php # brute-force a list of targets
[-e] for extension
[-w] for wordlist path
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2031.png)

**Wfuzz**

A traditional fuzzer for web applications; it's useful when we do API testing and for fuzzing endpoints.
```bash
[install-wfuzz] pip3 install wfuzz / apt install wfuzz
wfuzz -w wordlist_path https://target.com/FUZZ # mark the brute-force position with FUZZ
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2032.png)

You can read more detailed information about fuzzing at [https://book.hacktricks.xyz/pentesting-web/web-tool-wfuzz](https://book.hacktricks.xyz/pentesting-web/web-tool-wfuzz).

- **Parameter discovery**

Web applications use parameters (or queries) to accept user input. We can test parameters for vulnerabilities such as XSS, SQLi, LFI, RCE, etc.

There are many tools available for parameter discovery.

**Arjun**

```
[download-arjun] pip3 install arjun
arjun -i subdomains.txt -m GET -oT param.txt # for multiple targets
arjun -u target.com -m GET -oT param.txt # for a single target

[-m ] parameter method
[-oT] text format output # you can see more options with arjun -h
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2033.png)

**ParamSpider**

```
$ git clone https://github.com/devanshbatham/ParamSpider
$ cd ParamSpider
$ pip3 install -r requirements.txt
$ python3 paramspider.py --domain hackerone.com
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2034.png)

Or we can use the brute-force method for parameter discovery using parameth.

**parameth**

```
[download-parameth] https://github.com/maK-/parameth
```

- **Subdomain CNAME extraction**

Extracting the CNAME of a subdomain is useful for seeing whether any subdomain points to another hosting/cloud service, so that later we can test for subdomain takeover.

We can do that using dig:

```bash
dig CNAME 1.github.com +short
```

Since we have multiple subdomains, we can use xargs to automate this with multiple parallel processes:

```bash
cat subdomains.txt | xargs -P10 -n1 dig CNAME +short
```

-P10 defines the number of parallel processes; after that you can simply tee all the CNAMEs to a text file.
```bash
cat subdomains.txt | xargs -P10 -n1 dig CNAME +short | tee -a cnames
```

In this cnames file we are going to filter for the following CNAME patterns:

```
"\.cloudfront.net"
"\.s3-website"
"\.s3.amazonaws.com"
"w.amazonaws.com"
"1.amazonaws.com"
"2.amazonaws.com"
"s3-external"
"s3-accelerate.amazonaws.com"
"\.herokuapp.com"
"\.herokudns.com"
"\.wordpress.com"
"\.pantheonsite.io"
"domains.tumblr.com"
"\.zendesk.com"
"\.github.io"
"\.global.fastly.net"
"\.helpjuice.com"
"\.helpscoutdocs.com"
"\.ghost.io"
"cargocollective.com"
"redirect.feedpress.me"
"\.myshopify.com"
"\.statuspage.io"
"\.uservoice.com"
"\.surge.sh"
"\.bitbucket.io"
"custom.intercom.help"
"proxy.webflow.com"
"landing.subscribepage.com"
"endpoint.mykajabi.com"
"\.teamwork.com"
"\.thinkific.com"
"clientaccess.tave.com"
"wishpond.com"
"\.aftership.com"
"ideas.aha.io"
"domains.tictail.com"
"cname.mendix.net"
"\.bcvp0rtal.com"
"\.brightcovegallery.com"
"\.gallery.video"
"\.bigcartel.com"
"\.activehosted.com"
"\.createsend.com"
"\.acquia-test.co"
"\.proposify.biz"
"simplebooklet.com"
"\.gr8.com"
"\.vendecommerce.com"
"\.azurewebsites.net"
"\.cloudapp.net"
"\.trafficmanager.net"
"\.blob.core.windows.net"
```

- **Crawling & Collecting Pagelinks**

A URL or pagelink contains a lot of information, and there are many ways to extract pagelinks from a target domain.

**Waybackmachine [** [https://web.archive.org/](https://web.archive.org/) **]**

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2035.png)

Using the Wayback Machine we can see the old history of a website and its pagelinks. There is an automated tool for that, written by tomnomnom.

```bash
go get github.com/tomnomnom/waybackurls # download the tool
waybackurls target.tld
cat domains.txt | waybackurls # for multiple domains/subdomains
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2036.png)

You can also use gau for collecting all the pagelinks; gau stands for "getallurls".

**GAU**

```
[Download-gau] https://github.com/lc/gau
gau target.tld
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2037.png)

gau and waybackurls collect URLs from archive sources, and those sources sometimes contain outdated or dead URLs, which are useless to us. There are a few tools available to crawl the live website and gather all its pagelinks.

**Gospider**

```bash
[download-gospider] https://github.com/jaeles-project/gospider
# for a single target
gospider -s "https://uber.com/" -o output -c 10 -d 1 --other-source --include-subs -q
# with a list
gospider -S sites.txt -o output -c 10 -d 1 -t 20 -q
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2038.png)

```
You can use https://github.com/hakluke/hakrawler for the same thing.
```
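Whichever crawler or archive source you use, it helps to merge everything, de-duplicate, and keep only parameterised URLs before further testing; a small sketch (the input file names are just placeholders for the outputs saved above):

```bash
# Merge crawler/archive output, drop duplicates, and keep URLs that carry parameters
cat waybackurls.txt gau.txt gospider.txt 2>/dev/null | sort -u | grep '=' > urls-with-params.txt
wc -l urls-with-params.txt
```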
- **Javascript Files Crawling & finding sensitive information in JS files**

Sometimes JavaScript files contain sensitive information such as API keys, auth tokens, or other sensitive stuff, so collecting JavaScript files is useful for digging out sensitive information.

There are many tools available for scraping JavaScript files from a page:

```
https://github.com/003random/getJS
https://github.com/Threezh1/JSFinder
```

**getjs**

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2039.png)

I have also written a Python 3 script that will help you filter JS files from a webpage:

[https://gist.github.com/JoyGhoshs/1131a230d7ea1a33d1d744174d49180a](https://gist.github.com/JoyGhoshs/1131a230d7ea1a33d1d744174d49180a)

Or you can use waybackurls or gau to collect JavaScript files from the target domain:

```bash
gau target.tld | grep "\\.js" | uniq | sort -u
waybackurls targets.tld | grep "\\.js" | uniq | sort
```

After collecting those JavaScript files we should scan them for any sensitive information. Sometimes those JavaScript files contain endpoints, and those endpoints are useful for further scanning, so let's filter out the relative endpoints; there is a tool available for that.

```bash
[download-relative-url-extractor] https://github.com/jobertabma/relative-url-extractor
cat file.js | ./extract.rb
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2040.png)

Now it's time to scan all the collected JS files for API keys; we can do that using grep and regex.

```bash
[see-the-list] https://github.com/System00-Security/API-Key-regex
cat file.js | grep API_REGEX
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2041.png)

- **Domain/Subdomain version and technology detection**

It's important to scan the domain/subdomain for versions and technologies so that we can build a model for vulnerability detection and decide how to approach the target site.

**Wappalyzer**

A popular technology and version detection tool; there is also a Chrome extension so we can see the technologies of the website we are visiting.

```bash
[install-wappalyzer] npm i -g wappalyzer
wappalyzer https://uber.com # single domain
cat subdomain.txt | xargs -P1 -n1 wappalyzer | tee -a result
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2042.png)

**Whatweb**

A great scanner that has been around for ages; it's valued for its availability and ease of use.
```bash
[install-whatweb] sudo apt install whatweb
whatweb uber.com # single domain
whatweb -i subdomains.txt | tee -a result # list of targets
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2043.png)

**WAD [Web Application Detector]**

```bash
[install-wad] pip3 install wad
wad -u https://uber.com # single domain
wad -u subdomains.txt
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2044.png)

**Nuclei**

nuclei is a community-powered scanner that scans web applications with templates. There is a template folder in nuclei called "technologies" which contains detection patterns for various technologies, so we can use it as a technology detection tool.

```bash
[nuclei-download] https://github.com/projectdiscovery/nuclei
[nuclei-technology-detection-pattern] https://github.com/projectdiscovery/nuclei-templates/tree/master/technologies
nuclei -u https://uber.com -v -t nuclei-templates/technologies -o detect
nuclei -l subdomain.txt -v -t nuclei-templates/technologies -o detect

[! You can use -v for verbose mode so that you can see what is going on and how the requests are being sent]
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2045.png)

Most of the time we need to detect the CMS to map our attack process. There are more than 100 tools available for that; some of them are online web applications, some are CLI based.

```
https://github.com/Tuhinshubhra/CMSeeK [CMSeeK]
https://github.com/oways/cms-checker [CMS-Checker]
https://github.com/n4xh4ck5/CMSsc4n [CMSsc4n]
```

- **Sensitive information discovery**

Sometimes search engines like Shodan, ZoomEye and others contain sensitive information about the target, or those search engines somehow disclose sensitive information about a subdomain's databases or other assets.

- **Google Dorking**

**Google.com**

We all know the well-known, fast and popular search engine google.com. This search engine collects and indexes almost every website available on the surface web, so we can use it to find sensitive information about a domain. We will use Google advanced search, also known as dorking.
*Publicly exposed documents*

```bash
site:target.tld ext:doc | ext:docx | ext:odt | ext:rtf | ext:sxw | ext:psw | ext:ppt | ext:pptx | ext:pps | ext:csv
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2046.png)

*Directory listing*

```
site:target.tld intitle:index.of
```

*Exposed configuration files*

```
site:target.tld ext:xml | ext:conf | ext:cnf | ext:reg | ext:inf | ext:rdp | ext:cfg | ext:txt | ext:ora | ext:ini | ext:env
```

*Database file exposure*

```
site:target.tld ext:sql | ext:dbf | ext:mdb
```

*Log file exposure*

```
site:target.tld ext:log
```

*Backup file exposure*

```
site:target.tld ext:bkf | ext:bkp | ext:bak | ext:old | ext:backup
```

*Login pages*

```
site:target.tld inurl:login | inurl:signin | intitle:Login | intitle:"sign in" | inurl:auth
```

*SQL errors in the Google index*

```
site:target.tld intext:"sql syntax near" | intext:"syntax error has occurred" | intext:"incorrect syntax near" | intext:"unexpected end of SQL command" | intext:"Warning: mysql_connect()" | intext:"Warning: mysql_query()" | intext:"Warning: pg_connect()"
```

*PHP errors in the Google index*

```
site:target.tld "PHP Parse error" | "PHP Warning" | "PHP Error"
```

*phpinfo exposure in the Google index*

```
site:target.tld ext:php intitle:phpinfo "published by the PHP Group"
```

*Leakage on pastebin/pasting sites*

```
site:pastebin.com | site:paste2.org | site:pastehtml.com | site:slexy.org | site:snipplr.com | site:snipt.net | site:textsnip.com | site:bitpaste.app | site:justpaste.it | site:heypasteit.com | site:hastebin.com | site:dpaste.org | site:dpaste.com | site:codepad.org | site:jsitor.com | site:codepen.io | site:jsfiddle.net | site:dotnetfiddle.net | site:phpfiddle.org | site:ide.geeksforgeeks.org | site:repl.it | site:ideone.com | site:paste.debian.net | site:paste.org | site:paste.org.ru | site:codebeautify.org | site:codeshare.io | site:trello.com "target.tld"
```

*GitHub/GitLab pages from Google*

```
site:github.com | site:gitlab.com "target.tld"
```

*Search for issues on Stack Overflow*

```
site:stackoverflow.com "target.tld"
```

Here is a list of sensitive-information dorks; just add site:"target.com" to find sensitive information about the target:

[https://gist.github.com/JoyGhoshs/f8033c17e27e773c9be1ee60901a35f1](https://gist.github.com/JoyGhoshs/f8033c17e27e773c9be1ee60901a35f1)

Automate the full process: [https://github.com/m3n0sd0n4ld/uDork](https://github.com/m3n0sd0n4ld/uDork)

- **Github Recon**

Many targets have GitHub repositories, and some of them are open-source projects. Sometimes those GitHub projects leak private API keys for various services, or the source code discloses something sensitive. That's why GitHub is not only a code vault, it's also a PII vault for hackers.

You can do recon on GitHub in two ways: manually, or automatically using GitHub dorking tools.
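For the manual route, a few hand-written code-search queries go a long way before reaching for a tool; the qualifiers below are only illustrative (GitHub's code-search syntax has changed over time, and target.tld / target-org are placeholders):

```
"target.tld" password
"target.tld" filename:.env
org:target-org "api_key"
```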
[Dork-list](https://www.notion.so/5d9b3fd4127c4564aa9d4b57e750dea2)

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2047.png)

**GitDorker**

```
[download-gitdorker] https://github.com/obheda12/GitDorker
python3 GitDorker.py -tf TOKENSFILE -q tesla.com -d Dorks/DORKFILE -o tesla
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2048.png)

GitHub recon helps you find PII more easily.

- **Shodan Recon**

Shodan is the most useful search engine for a hacker: you can find a lot of sensitive and important information about the target from Shodan. Like Google and GitHub, Shodan also has advanced search filters which will help us find exact information about an exact target.

```
shodan.io
[filter-reference] https://beta.shodan.io/search/filters
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2049.png)

**Shodan search filters**

```
hostname:target.com              | find all assets for target.com on Shodan
http.title:"title"               | find servers/hosts with a similar title
http.html:"/file"                | find servers/hosts with a similar path
html:"context"                   | find servers/hosts with a similar string
server:"apache 2.2.3"            | find servers/hosts with the same server header
port:80                          | find servers/hosts with the same port
os:"windows"                     | find servers/hosts with the same OS
asn:AS3214                       | find hosts/servers with a matching ASN
http.status:200                  | find servers/hosts with a 200 HTTP response code
http.favicon.hash:"hash"         | find servers/hosts with the same favicon hash
ssl.cert.subject.cn:"uber.com"   | find all servers/hosts with the same common name
product:"service_name"           | find all servers/hosts running the same service
```

We can use these filters to craft a precise query to search for vulnerabilities or sensitive information on Shodan.

Example:

```
hostname:uber.com html:"db_uname:" port:"80" http.status:200 # finds uber.com assets containing db_uname: with a 200 status response code
http.html:/dana-na/ ssl.cert.subject.cn:"uber.com" # finds Pulse VPN hosts with possible CVE-2019-11510
html:"horde_login" ssl.cert.subject.cn:"uber.com" # finds Horde Webmail hosts with possible CVE-2018-19518
# We can repeat the last two with the product filter as well, e.g.:
product:"Pulse Secure" ssl.cert.subject.cn:"uber.com"
http.html:"* The wp-config.php creation script uses this file" hostname:uber.com # finds exposed wp-config.php files with possibly sensitive credentials
```

Suppose we know an active exploit for Apache 2.1. Manually checking which of our target subdomains runs Apache 2.1 would cost time and effort, so we can create a Shodan dork to help, for example: **server: "apache 2.1" hostname:"[target.com](http://target.com)"**. We can replace the hostname filter with ssl.cert.subject.cn:"target.com" to get more accurate results for target.com; this checks whether the target host/server has target.com in its SSL certificate.
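The same kind of dork can also be run from the Shodan CLI once an API key is configured with `shodan init`; a minimal sketch, with the query and field selection shown only as examples:

```bash
# Print IP, port and hostnames for hosts matching a dork built from the filters above
shodan search --fields ip_str,port,hostnames 'ssl.cert.subject.cn:"uber.com" product:"Pulse Secure"'
```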
- **Leakix Scan**

LeakIX is the most underrated search engine for leaks and misconfigurations; it can find leaks such as exposed .git directories, .env files, phpinfo() pages and many others. You can use it directly from the browser or use its client.

```
https://leakix.net
```

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2050.png)

CLI client:

```
[download] https://github.com/LeakIX/LeakIXClient
```

Or you can use my Python 3 based client, which grabs the results directly from the web:

[https://gist.github.com/JoyGhoshs/c20865579d347aef4180ab6a30d3d8e1](https://gist.github.com/JoyGhoshs/c20865579d347aef4180ab6a30d3d8e1)

![Untitled](Information%20Gathering%20%5B%20Reloaded%20%5D%201d11633015374f1797b56a2447ced9a4/Untitled%2051.png)
--------------------------------------------------------------------------------