├── .gitignore ├── Dockerfile ├── README.md ├── _config.yml ├── docs │   ├── _config.yml │   └── index.md ├── leaklooker.py └── requirements.txt /.gitignore: -------------------------------------------------------------------------------- 1 | leaklooker.py 2 | -------------------------------------------------------------------------------- /Dockerfile: -------------------------------------------------------------------------------- 1 | FROM python:3-alpine 2 | 3 | RUN apk add --update --no-cache git bash 4 | WORKDIR /leaklooker 5 | COPY . . 6 | RUN pip3 install -r requirements.txt 7 | ENTRYPOINT [ "python", "leaklooker.py" ] 8 | CMD ["-h"] 9 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # LeakLooker - Powered by Binaryedge.io 2 | Find open databases/services 3 | 4 | New version supports: 5 | - Elasticsearch 6 | - CouchDB 7 | - MongoDB 8 | - Gitlab 9 | - Rsync 10 | - Jenkins 11 | - Sonarqube 12 | - Kibana 13 | - CassandraDB 14 | - RethinkDB 15 | - Directory listing 16 | - Amazon S3 17 | 18 | and custom query. 
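The custom `--query` value is appended to each service query using BinaryEdge's `AND` syntax, with filter values quoted. A minimal sketch of the intended encoding (the helper name is hypothetical, not part of the tool):

```python
def build_query_suffix(user_query: str) -> str:
    """Turn a --query value into a URL-encoded ' AND ...' suffix.

    A filter like country:UY becomes %20AND%20country:"UY";
    a bare keyword like medical becomes %20AND%20medical.
    """
    if not user_query:
        return ""
    if ":" in user_query:
        # Split only on the first colon so values containing ':' survive
        key, value = user_query.split(":", 1)
        return '%20AND%20' + key + ':"' + value + '"'
    return "%20AND%20" + user_query

# Examples from the README:
print(build_query_suffix("country:UY"))  # %20AND%20country:"UY"
print(build_query_suffix("medical"))     # %20AND%20medical
```

The suffix is then concatenated onto a base query such as `type:%22mongodb%22` before the request to the BinaryEdge search endpoint.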
19 | 20 | Queries: 21 | 22 | https://docs.binaryedge.io/api-v2/ 23 | 24 | Background: 25 | 26 | v1: https://medium.com/@woj_ciech/leaklooker-find-open-databases-in-a-second-9da4249c8472 27 | 28 | v2: https://medium.com/hackernoon/leaklooker-v2-find-more-open-servers-and-source-code-leaks-25e671700e41 29 | 30 | v3: https://medium.com/@woj_ciech/leaklooker-part-3-dna-samples-internal-files-and-more-967e794fa031 31 | 32 | ## Requirements: 33 | Python 3 & 34 | a BinaryEdge API key 35 | 36 | ***Pass your BinaryEdge API key with the `--key` argument*** 37 | ``` 38 | pip3 install colorama 39 | pip3 install hurry.filesize 40 | pip3 install beautifulsoup4 41 | pip3 install pybinaryedge 42 | ``` 43 | 44 | ``` 45 | pip3 install -r requirements.txt 46 | ``` 47 | 48 | ## Usage 49 | ``` 50 | (venv) root@kali:~/PycharmProjects/LeakLooker# python leaklooker.py -h 51 | 52 | , 53 | )\ 54 | / \ 55 | ' # ' 56 | ', ,' 57 | `' 58 | 59 | , 60 | )\ 61 | / \ 62 | ' ~ ' 63 | ', ,' 64 | `' 65 | LeakLooker - Find open databases - Powered by Binaryedge.io 66 | https://medium.com/@woj_ciech https://github.com/woj-ciech/ 67 | Example: python leaklooker.py --mongodb --couchdb --kibana --elastic --first 21 --last 37 68 | usage: leaklooker.py [-h] [--elastic] [--couchdb] [--mongodb] [--gitlab] 69 | [--rsync] [--jenkins] [--sonarqube] [--query QUERY] 70 | [--cassandra] [--rethink] [--listing] [--kibana] 71 | [--s3asia] [--s3usa] [--s3europe] [--first FIRST] 72 | [--last LAST] 73 | 74 | optional arguments: 75 | -h, --help show this help message and exit 76 | --elastic Elastic search (default: False) 77 | --couchdb CouchDB (default: False) 78 | --mongodb MongoDB (default: False) 79 | --gitlab Gitlab (default: False) 80 | --rsync Rsync (default: False) 81 | --jenkins Jenkins (default: False) 82 | --sonarqube SonarQube (default: False) 83 | --query QUERY Additional query or filter for BinaryEdge (default: ) 84 | --cassandra Cassandra DB (default: False) 85 | --rethink Rethink DB (default: False) 86 | --listing Listing 
directory (default: False) 87 | --kibana Kibana (default: False) 88 | --s3asia Amazon S3, Asia region (s3.ap-southeast-1) (default: False) 89 | --s3usa Amazon S3, US region (default: False) 90 | --s3europe Amazon S3, Europe region (default: False) 91 | 92 | Pages: 93 | --first FIRST First page (default: None) 94 | --last LAST Last page (default: None) 95 | 96 | ``` 97 | 98 | ***You need to specify first and last page*** 99 | 100 | ## Example 101 | 102 | ### Search for RethinkDB and directory listings on pages 21 to 37 103 | ``` 104 | root@kali:~/PycharmProjects/LeakLooker# python leaklooker.py --rethink --listing --first 21 --last 37 105 | ----------------------------------Listing directory - Page 21-------------------------------- 106 | https://[REDACTED]:6666 107 | Product: Apache httpd 108 | Hostname: localhost 109 | [REDACTED]/ 110 | [REDACTED]/ 111 | [REDACTED]/ 112 | [REDACTED]/ 113 | [REDACTED]/ 114 | ----------------------------- 115 | https://[REDACTED]:6666 116 | Product: MiniServ 117 | ----------------------------- 118 | https://[REDACTED]:6666 119 | Product: Apache httpd 120 | [REDACTED]/ 121 | [REDACTED]/ 122 | [REDACTED].html 123 | [REDACTED]/ 124 | [REDACTED].css 125 | [REDACTED]/ 126 | [REDACTED]/ 127 | [REDACTED]/ 128 | favicon.ico 129 | ----------------------------- 130 | https://[REDACTED]:6666 131 | Product: Apache httpd 132 | [REDACTED]/ 133 | [REDACTED]/ 134 | [REDACTED]/ 135 | [REDACTED]..> 136 | [REDACTED]/ 137 | [REDACTED]..> 138 | [REDACTED]/ 139 | ----------------------------------Rethink DB - Page 21-------------------------------- 140 | ReQL: [REDACTED]:28015 141 | HTTP Admin: http://[REDACTED]:8080 142 | Hostname: [REDACTED] 143 | Version: rethinkdb 2.3.6~0trusty (GCC 4.8.2) 144 | Name: [REDACTED] 145 | Database: [REDACTED] 146 | Tables: 147 | Database: rethinkdb 148 | Tables: 149 | cluster_config 150 | current_issues 151 | db_config 152 | jobs 153 | logs 154 | permissions 155 | server_config 156 | server_status 157 | stats 158 | 
table_config 159 | table_status 160 | users 161 | Database: [REDACTED] 162 | Tables: 163 | ----------------------------- 164 | ReQL: [REDACTED]:28015 165 | HTTP Admin: http://[REDACTED]:8080 166 | Hostname: [REDACTED] 167 | Version: rethinkdb 2.3.6~0jessie (GCC 4.9.2) 168 | Name: [REDACTED] 169 | Database: [REDACTED] 170 | Tables: 171 | Database: rethinkdb 172 | Tables: 173 | cluster_config 174 | current_issues 175 | db_config 176 | jobs 177 | logs 178 | permissions 179 | server_config 180 | server_status 181 | stats 182 | table_config 183 | table_status 184 | users 185 | Database: settings 186 | Tables: 187 | ----------------------------- 188 | 189 | ``` 190 | 191 | ### Search for Jenkins, Gitlab in Uruguay (Country code is UY) on pages from 1 to 2 192 | ``` 193 | root@kali:~/PycharmProjects/LeakLooker# python leaklooker.py --jenkins --gitlab --first 1 --last 2 --query "country:UY" 194 | ----------------------------------GitLab - Page 1-------------------------------- 195 | Total results: 13 196 | https://[REDACTED]:443 197 | GitLab Community Edition 198 | Registration is open 199 | ----------------------- 200 | https://[REDACTED]:443 201 | Registration is closed. Check public repositories. https://164.73.232.10:443/explore 202 | ----------------------- 203 | https://[REDACTED]:443 204 | Registration is closed. Check public repositories. https://190.64.138.5:443/explore 205 | ----------------------- 206 | https://[REDACTED]:443 207 | GitLab Community Edition 208 | Registration is open 209 | [...] 
210 | ----------------------------------Jenkins - Page 1-------------------------------- 211 | Total results: 6501 212 | http://[REDACTED]:443 213 | Executors 214 | Windows 215 | (master) 216 | Jobs 217 | ----------------------------- 218 | http://[REDACTED]:443 219 | Executors 220 | Jobs 221 | ----------------------------- 222 | http://[REDACTED]:443 223 | Executors 224 | Jobs 225 | [REDACTED] 226 | [REDACTED] 227 | ``` 228 | ### Search for MongoDB and Elasticsearch with the keyword "medical" on the first page only 229 | ``` 230 | root@kali:~/PycharmProjects/LeakLooker# python leaklooker.py --mongodb --elastic --first 1 --last 1 --query "medical" 231 | ``` 232 | ## Additional 233 | This tool has been made for educational purposes only. I am not responsible for any damage caused. Don't be evil. 234 | -------------------------------------------------------------------------------- /_config.yml: -------------------------------------------------------------------------------- 1 | theme: jekyll-theme-hacker -------------------------------------------------------------------------------- /docs/_config.yml: -------------------------------------------------------------------------------- 1 | theme: jekyll-theme-hacker -------------------------------------------------------------------------------- /docs/index.md: -------------------------------------------------------------------------------- 1 | # LeakLooker - Powered by Binaryedge.io 2 | Find open databases/services 3 | 4 | New version supports: 5 | - Elasticsearch 6 | - CouchDB 7 | - MongoDB 8 | - Gitlab 9 | - Rsync 10 | - Jenkins 11 | - Sonarqube 12 | - Kibana 13 | - CassandraDB 14 | - RethinkDB 15 | - Directory listing 16 | - Amazon S3 17 | 18 | and custom query. 
19 | 20 | Queries: 21 | 22 | https://docs.binaryedge.io/api-v2/ 23 | 24 | Background: 25 | 26 | v1: https://medium.com/@woj_ciech/leaklooker-find-open-databases-in-a-second-9da4249c8472 27 | 28 | v2: https://medium.com/hackernoon/leaklooker-v2-find-more-open-servers-and-source-code-leaks-25e671700e41 29 | 30 | v3: https://medium.com/@woj_ciech/leaklooker-part-3-dna-samples-internal-files-and-more-967e794fa031 31 | 32 | ## Requirements: 33 | Python 3 & 34 | a BinaryEdge API key 35 | 36 | ***Pass your BinaryEdge API key with the `--key` argument*** 37 | ``` 38 | pip3 install colorama 39 | pip3 install hurry.filesize 40 | pip3 install beautifulsoup4 41 | pip3 install pybinaryedge 42 | ``` 43 | 44 | ``` 45 | pip3 install -r requirements.txt 46 | ``` 47 | 48 | ## Usage 49 | ``` 50 | (venv) root@kali:~/PycharmProjects/LeakLooker# python leaklooker.py -h 51 | 52 | , 53 | )\ 54 | / \ 55 | ' # ' 56 | ', ,' 57 | `' 58 | 59 | , 60 | )\ 61 | / \ 62 | ' ~ ' 63 | ', ,' 64 | `' 65 | LeakLooker - Find open databases - Powered by Binaryedge.io 66 | https://medium.com/@woj_ciech https://github.com/woj-ciech/ 67 | Example: python leaklooker.py --mongodb --couchdb --kibana --elastic --first 21 --last 37 68 | usage: leaklooker.py [-h] [--elastic] [--couchdb] [--mongodb] [--gitlab] 69 | [--rsync] [--jenkins] [--sonarqube] [--query QUERY] 70 | [--cassandra] [--rethink] [--listing] [--kibana] 71 | [--s3asia] [--s3usa] [--s3europe] [--first FIRST] 72 | [--last LAST] 73 | 74 | optional arguments: 75 | -h, --help show this help message and exit 76 | --elastic Elastic search (default: False) 77 | --couchdb CouchDB (default: False) 78 | --mongodb MongoDB (default: False) 79 | --gitlab Gitlab (default: False) 80 | --rsync Rsync (default: False) 81 | --jenkins Jenkins (default: False) 82 | --sonarqube SonarQube (default: False) 83 | --query QUERY Additional query or filter for BinaryEdge (default: ) 84 | --cassandra Cassandra DB (default: False) 85 | --rethink Rethink DB (default: False) 86 | --listing Listing 
directory (default: False) 87 | --kibana Kibana (default: False) 88 | --s3asia Amazon S3, Asia region (s3.ap-southeast-1) (default: False) 89 | --s3usa Amazon S3, US region (default: False) 90 | --s3europe Amazon S3, Europe region (default: False) 91 | 92 | Pages: 93 | --first FIRST First page (default: None) 94 | --last LAST Last page (default: None) 95 | 96 | ``` 97 | 98 | ***You need to specify first and last page*** 99 | 100 | ## Example 101 | 102 | ### Search for RethinkDB and directory listings on pages 21 to 37 103 | ``` 104 | root@kali:~/PycharmProjects/LeakLooker# python leaklooker.py --rethink --listing --first 21 --last 37 105 | ----------------------------------Listing directory - Page 21-------------------------------- 106 | https://[REDACTED]:6666 107 | Product: Apache httpd 108 | Hostname: localhost 109 | [REDACTED]/ 110 | [REDACTED]/ 111 | [REDACTED]/ 112 | [REDACTED]/ 113 | [REDACTED]/ 114 | ----------------------------- 115 | https://[REDACTED]:6666 116 | Product: MiniServ 117 | ----------------------------- 118 | https://[REDACTED]:6666 119 | Product: Apache httpd 120 | [REDACTED]/ 121 | [REDACTED]/ 122 | [REDACTED].html 123 | [REDACTED]/ 124 | [REDACTED].css 125 | [REDACTED]/ 126 | [REDACTED]/ 127 | [REDACTED]/ 128 | favicon.ico 129 | ----------------------------- 130 | https://[REDACTED]:6666 131 | Product: Apache httpd 132 | [REDACTED]/ 133 | [REDACTED]/ 134 | [REDACTED]/ 135 | [REDACTED]..> 136 | [REDACTED]/ 137 | [REDACTED]..> 138 | [REDACTED]/ 139 | ----------------------------------Rethink DB - Page 21-------------------------------- 140 | ReQL: [REDACTED]:28015 141 | HTTP Admin: http://[REDACTED]:8080 142 | Hostname: [REDACTED] 143 | Version: rethinkdb 2.3.6~0trusty (GCC 4.8.2) 144 | Name: [REDACTED] 145 | Database: [REDACTED] 146 | Tables: 147 | Database: rethinkdb 148 | Tables: 149 | cluster_config 150 | current_issues 151 | db_config 152 | jobs 153 | logs 154 | permissions 155 | server_config 156 | server_status 157 | stats 158 | 
table_config 159 | table_status 160 | users 161 | Database: [REDACTED] 162 | Tables: 163 | ----------------------------- 164 | ReQL: [REDACTED]:28015 165 | HTTP Admin: http://[REDACTED]:8080 166 | Hostname: [REDACTED] 167 | Version: rethinkdb 2.3.6~0jessie (GCC 4.9.2) 168 | Name: [REDACTED] 169 | Database: [REDACTED] 170 | Tables: 171 | Database: rethinkdb 172 | Tables: 173 | cluster_config 174 | current_issues 175 | db_config 176 | jobs 177 | logs 178 | permissions 179 | server_config 180 | server_status 181 | stats 182 | table_config 183 | table_status 184 | users 185 | Database: settings 186 | Tables: 187 | ----------------------------- 188 | 189 | ``` 190 | 191 | ### Search for Jenkins, Gitlab in Uruguay (Country code is UY) on pages from 1 to 2 192 | ``` 193 | root@kali:~/PycharmProjects/LeakLooker# python leaklooker.py --jenkins --gitlab --first 1 --last 2 --query "country:UY" 194 | ----------------------------------GitLab - Page 1-------------------------------- 195 | Total results: 13 196 | https://[REDACTED]:443 197 | GitLab Community Edition 198 | Registration is open 199 | ----------------------- 200 | https://[REDACTED]:443 201 | Registration is closed. Check public repositories. https://164.73.232.10:443/explore 202 | ----------------------- 203 | https://[REDACTED]:443 204 | Registration is closed. Check public repositories. https://190.64.138.5:443/explore 205 | ----------------------- 206 | https://[REDACTED]:443 207 | GitLab Community Edition 208 | Registration is open 209 | [...] 
210 | ----------------------------------Jenkins - Page 1-------------------------------- 211 | Total results: 6501 212 | http://[REDACTED]:443 213 | Executors 214 | Windows 215 | (master) 216 | Jobs 217 | ----------------------------- 218 | http://[REDACTED]:443 219 | Executors 220 | Jobs 221 | ----------------------------- 222 | http://[REDACTED]:443 223 | Executors 224 | Jobs 225 | [REDACTED] 226 | [REDACTED] 227 | ``` 228 | ### Search for MongoDB and Elasticsearch with the keyword "medical" on the first page only 229 | ``` 230 | root@kali:~/PycharmProjects/LeakLooker# python leaklooker.py --mongodb --elastic --first 1 --last 1 --query "medical" 231 | ``` 232 | ## Additional 233 | This tool has been made for educational purposes only. I am not responsible for any damage caused. Don't be evil. 234 | -------------------------------------------------------------------------------- /leaklooker.py: -------------------------------------------------------------------------------- 1 | from hurry.filesize import size 2 | from colorama import Fore 3 | import json 4 | import sys 5 | import argparse 6 | from bs4 import BeautifulSoup 7 | import requests 8 | from pybinaryedge import BinaryEdge 9 | from urllib.parse import urlparse 10 | 11 | 12 | 13 | 14 | description = Fore.BLUE + r""" 15 | , 16 | )\ 17 | / \ 18 | ' # ' 19 | ', ,' 20 | `' 21 | 22 | , 23 | )\ 24 | / \ 25 | ' ~ ' 26 | ', ,' 27 | `'""" + Fore.RESET + \ 28 | """ 29 | LeakLooker - Find open databases - Powered by Binaryedge.io 30 | https://medium.com/@woj_ciech https://github.com/woj-ciech/ 31 | Example: python leaklooker.py --mongodb --couchdb --kibana --elastic --first 21 --last 37""" 32 | 33 | print(description) 34 | 35 | 36 | parser = argparse.ArgumentParser( 37 | formatter_class=argparse.ArgumentDefaultsHelpFormatter # added to show default value 38 | ) 39 | 40 | group = parser.add_argument_group("Pages") 41 | 42 | parser.add_argument("--elastic", help="Elastic search", action='store_true') 43 | 
parser.add_argument("--couchdb", help="CouchDB", action='store_true') 44 | parser.add_argument("--mongodb", help="MongoDB", action='store_true') 45 | parser.add_argument("--gitlab", help="Gitlab", action='store_true') 46 | parser.add_argument("--rsync", help="Rsync", action='store_true') 47 | parser.add_argument("--jenkins", help="Jenkins", action='store_true') 48 | parser.add_argument("--sonarqube", help="SonarQube", action='store_true') 49 | parser.add_argument("--query", help="Additional query or filter for BinaryEdge", default="") 50 | parser.add_argument("--cassandra", help="Cassandra DB", action='store_true') 51 | parser.add_argument("--rethink", help="Rethink DB", action='store_true') 52 | parser.add_argument("--listing", help="Listing directory", action='store_true') 53 | parser.add_argument('--kibana', help='Kibana', action='store_true') 54 | parser.add_argument("--s3asia", help="Amazon S3, Asia region (s3.ap-southeast-1)", action="store_true") 55 | parser.add_argument("--s3usa", help="Amazon S3, US region", action="store_true") 56 | parser.add_argument("--s3europe", help="Amazon S3, Europe region", action="store_true") 57 | parser.add_argument("--proxy", help="In the form of socks5://127.0.0.1:9050", action="store") 58 | parser.add_argument("--minsize", help="Minimum size to consider parsing", action="store", default=25000000000, type=int) 59 | parser.add_argument("--key", help="BinaryEdge API Key", action="store") 60 | 61 | group.add_argument('--first', help='First page', default=None, type=int) 62 | group.add_argument('--last', help='Last page', default=None, type=int) 63 | 64 | args = parser.parse_args() 65 | 66 | gitlab = args.gitlab 67 | rsync = args.rsync 68 | jenkins = args.jenkins 69 | sonarqube = args.sonarqube 70 | query = args.query 71 | elastic = args.elastic 72 | couchdb = args.couchdb 73 | mongodb = args.mongodb 74 | kibana = args.kibana 75 | first = args.first 76 | last = args.last 77 | cassandra = args.cassandra 78 | listing = args.listing 79 | 
rethink = args.rethink 80 | s3asia = args.s3asia 81 | s3usa = args.s3usa 82 | s3europe = args.s3europe 83 | 84 | if args.proxy is not None: 85 | proxies = { 86 | 'http' : args.proxy, 87 | 'https' : args.proxy 88 | } 89 | else: 90 | proxies = None 91 | 92 | arr_query = [] 93 | second_part = "" 94 | 95 | if ":" in query: 96 | arr_query = query.split(":", 1) 97 | second_part = '"' + arr_query[1] + '"' 98 | query = "%20AND%20" + arr_query[0] + ":" + second_part 99 | elif query: 100 | # plain keywords (e.g. --query "medical") are appended too 101 | query = "%20AND%20" + query 102 | 103 | 104 | if first is None or last is None: 105 | print(description) 106 | print(Fore.RED + "Choose pages to search" + Fore.RESET) 107 | sys.exit() 108 | elif first > last: 109 | print(description) 110 | print(Fore.RED + "Correct pages" + Fore.RESET) 111 | sys.exit() 112 | else: 113 | # range() below is exclusive, so include the last page 114 | last = last + 1 115 | 116 | 117 | 118 | 119 | 120 | 121 | 122 | 123 | elastic_query = "type:%22elasticsearch%22" 124 | mongodb_query = "type:%22mongodb%22" 125 | couchdb_query = "product:%22couchdb%22" 126 | rsync_query = "rsync port:%22873%22" 127 | sonarqube_query = "%22Title: SonarQube%22" 128 | jenkins_query = "%22Dashboard [Jenkins]%22" 129 | gitlab_query = "%22Sign in GitLab%22" 130 | kibana_query = "product:%22kibana%22" 131 | listing_query = '%22Index of /%22' 132 | cassandra_query = "type:%22cassandra%22" 133 | rethink_query = "type:%22rethinkdb%22" 134 | 135 | BINARYEDGE_API_KEY = args.key 136 | be = BinaryEdge(BINARYEDGE_API_KEY) 137 | 138 | buckets = set() 139 | def parse_bucket(bucket): 140 | parsed = urlparse(bucket) 141 | path = parsed.path.split('/') 142 | 143 | 144 | 145 | try: 146 | if parsed.netloc.startswith("s3"): 147 | if len(path) > 1: 148 | if path[1] not in buckets: 149 | print("https://" + parsed.netloc + "/" + path[1]) 150 | amazon_req = 
requests.get("https://" + parsed.netloc + "/" + path[1], timeout=10, proxies=proxies) 151 | if amazon_req.status_code == 200: 152 | print("Status: " + Fore.GREEN + str(amazon_req.status_code) + Fore.RESET) 153 | elif amazon_req.status_code == 404: 154 | print("Status: " + Fore.GREEN + str(amazon_req.status_code) + Fore.RESET) 155 | else: 156 | print("Status: " + Fore.RED + str(amazon_req.status_code) + Fore.RESET) 157 | buckets.add(path[1]) 158 | elif parsed.netloc == "": 159 | parsed_netloc = parsed.path.split("/") 160 | if parsed_netloc[3] not in buckets: 161 | print("https://" + parsed_netloc[2] + "/" + parsed_netloc[3]) 162 | amazon_req = requests.get("https://" + parsed_netloc[2] + "/" + parsed_netloc[3], timeout=10, proxies=proxies) 163 | if amazon_req.status_code == 200: 164 | print("Status: " + Fore.GREEN + str(amazon_req.status_code) + Fore.RESET) 165 | elif amazon_req.status_code == 404: 166 | print("Status: " + Fore.GREEN + str(amazon_req.status_code) + Fore.RESET) 167 | else: 168 | print("Status: " + Fore.RED + str(amazon_req.status_code) + Fore.RESET) 169 | buckets.add(parsed_netloc[3]) 170 | else: 171 | if parsed.netloc not in buckets: 172 | print("https://" + parsed.netloc) 173 | amazon_req = requests.get("https://" + parsed.netloc, proxies=proxies) 174 | if amazon_req.status_code == 200: 175 | print("Status: " + Fore.GREEN + str(amazon_req.status_code) + Fore.RESET) 176 | elif amazon_req.status_code == 404: 177 | print("Status: " + Fore.GREEN + str(amazon_req.status_code) + Fore.RESET) 178 | else: 179 | print("Status: " + Fore.RED + str(amazon_req.status_code) + Fore.RESET) 180 | buckets.add(parsed.netloc) 181 | except: 182 | pass 183 | 184 | 185 | def check_amazons3(results): 186 | for ip in results['events']: 187 | print(Fore.MAGENTA + str(ip['target']['ip']) + Fore.RESET) 188 | print(Fore.BLUE + "https://app.binaryedge.io/services/query/"+str(ip['target']['ip']) + Fore.RESET) 189 | 190 | splitted = 
ip['result']['data']['service']['banner'].split("\\r\\n") 191 | 192 | for header in splitted: 193 | if 'amazon' in header: 194 | splitted_header = header.split(" ") 195 | for i in splitted_header: 196 | if "amazonaws.com" in i: 197 | parsed = urlparse(i) 198 | path = parsed.path.split('/') 199 | try: 200 | 201 | if parsed.netloc.startswith("s3"): 202 | if len(path) > 1: 203 | if path[1] not in buckets: 204 | print("https://" + parsed.netloc + "/" + path[1]) 205 | amazon_req = requests.get("https://" + parsed.netloc + "/" + path[1], 206 | timeout=10, proxies=proxies) 207 | print(amazon_req.status_code) 208 | if amazon_req.status_code == 200: 209 | print("Status: " + Fore.GREEN + str(amazon_req.status_code) + Fore.RESET) 210 | elif amazon_req.status_code == 404: 211 | print("Status: " + Fore.GREEN + str(amazon_req.status_code) + Fore.RESET) 212 | else: 213 | print("Status: " + Fore.RED + str(amazon_req.status_code) + Fore.RESET) 214 | buckets.add(path[1]) 215 | else: 216 | if parsed.netloc not in buckets: 217 | print("https://" + parsed.netloc) 218 | amazon_req = requests.get("https://" + parsed.netloc, proxies=proxies) 219 | print(amazon_req.status_code) 220 | if amazon_req.status_code == 200: 221 | print("Status: " + Fore.GREEN + str(amazon_req.status_code) + Fore.RESET) 222 | elif amazon_req.status_code == 404: 223 | print("Status: " + Fore.GREEN + str(amazon_req.status_code) + Fore.RESET) 224 | else: 225 | print("Status: " + Fore.RED + str(amazon_req.status_code) + Fore.RESET) 226 | buckets.add(parsed.netloc) 227 | except: 228 | pass 229 | if header == "" or "\\n" in header: 230 | break 231 | 232 | soup = BeautifulSoup(ip['result']['data']['service']['banner'],"html.parser") 233 | 234 | for a in soup.find_all(href=True): 235 | if "amazonaws.com" in a['href']: 236 | parse_bucket(a['href']) 237 | 238 | for a in soup.find_all("script", {"src":True}): 239 | if "amazonaws.com" in a['src']: 240 | parse_bucket(a['src']) 241 | 242 | for a in soup.find_all("img", 
{"src":True}): 243 | if "amazonaws.com" in a['src']: 244 | parse_bucket(a['src']) 245 | 246 | title = soup.find("meta", property="og:image") 247 | if title: 248 | if "amazonaws.com" in title['content']: 249 | parse_bucket(title['content']) 250 | 251 | print("----------------------------------") 252 | 253 | 254 | def binaryedge_query(query,page): 255 | headers = {'X-Key': BINARYEDGE_API_KEY} 256 | end = 'https://api.binaryedge.io/v2/query/search?query='+query+'&page='+str(page) 257 | req = requests.get(end,headers=headers) 258 | req_json = json.loads(req.content) 259 | 260 | try: 261 | print("Total results: " + Fore.GREEN + str(req_json['total']) + Fore.RESET) 262 | except: 263 | print("Error with your query") 264 | sys.exit() 265 | 266 | return req_json['events'] 267 | 268 | def check_jenkins(results): 269 | if results: 270 | for service in results: 271 | print('http://'+service['target']['ip'] +":"+str(service['target']['port'])) 272 | executors = set() 273 | jobs = set() 274 | 275 | try: 276 | html_code = service['result']['data']['response']['body'] 277 | soup = BeautifulSoup(html_code, features="html.parser") 278 | for project in soup.find_all("a", {"class": "model-link inside"}): 279 | if project['href'].startswith("/computer"): 280 | splitted = project['href'].split("/") 281 | executors.add(splitted[2]) 282 | 283 | elif project['href'].startswith("job"): 284 | splitted = project['href'].split("/") 285 | jobs.add(splitted[1]) 286 | 287 | print(Fore.BLUE + "Executors" + Fore.RESET) 288 | for executor in executors: 289 | print(Fore.CYAN + executor + Fore.RESET) 290 | 291 | print(Fore.BLUE + "Jobs" + Fore.RESET) 292 | for job in jobs: 293 | print(Fore.CYAN + job + Fore.RESET) 294 | except: 295 | print(Fore.RED + "No information"+ Fore.RESET) 296 | print("-----------------------------") 297 | 298 | def check_sonarqube(results): 299 | if results: 300 | for service in results: 301 | found = False 302 | print('http://'+service['target']['ip'] 
+":"+str(service['target']['port'])) 303 | try: 304 | html_code = service['result']['data']['response']['body'] 305 | if "Welcome to SonarQube Dashboard" in html_code: 306 | soup = BeautifulSoup(html_code, features="html.parser") 307 | for project in soup.find_all("a", href=True): 308 | if "/dashboard/index" in project.attrs['href']: 309 | print(Fore.GREEN + project.contents[0] + Fore.RESET) 310 | found = True 311 | 312 | if not found: 313 | print("Open with no projects") 314 | 315 | elif service['result']['data']['response']['redirects']: 316 | print(Fore.RED + "Authentication" + Fore.RESET) 317 | print("---------------------") 318 | except: 319 | print(Fore.RED + "Can't retrieve details" + Fore.RESET) 320 | print("Server status: " + service['result']['data']['state']['state']) 321 | print("---------------------") 322 | 323 | def check_rsync(results): 324 | if results: 325 | for service in results: 326 | print('rsync://'+service['target']['ip']) 327 | print("Server status: " + service['result']['data']['state']['state']) 328 | try: 329 | print(Fore.GREEN + service['result']['data']['service']['banner'] + Fore.RESET,) 330 | except: 331 | print(Fore.RED + 'No information' + Fore.RESET) 332 | print("------------------------------") 333 | 334 | def check_gitlab(results): 335 | if results: 336 | for service in results: 337 | print('https://' + service['target']['ip'] + ":" + str(service['target']['port'])) 338 | html_code = service['result']['data']['response']['body'] 339 | if "register" in html_code: 340 | soup = BeautifulSoup(html_code, features="html.parser") 341 | for project in soup.find_all("meta", {'property':"twitter:description"}): 342 | print(Fore.GREEN + project.attrs['content'] + Fore.RESET) 343 | print(Fore.GREEN + "Registration is open" + Fore.RESET) 344 | else: 345 | print(Fore.RED + "Registration is closed. " + Fore.RESET + "Check public repositories. 
https://" + service['target']['ip'] + ":" + str(service['target']['port']) + "/explore") 346 | 347 | print("-----------------------") 348 | 349 | def check_kibana(results): 350 | if results: 351 | for service in results: 352 | print('http://' + service['target']['ip'] + ":" + str(service['target']['port'])+"/app/kibana#/discover?_g=()") 353 | print("Server status: " + service['result']['data']['state']['state']) 354 | print("-----------------------") 355 | 356 | def check_couchdb(results): 357 | if results: 358 | for service in results: 359 | print('https://' + service['target']['ip'] + ":" + str(service['target']['port']) +"/_utils") 360 | try: 361 | couch_json = json.loads(service['result']['data']['response']['body']) 362 | print("Status code: " + str(service['result']['data']['response']['statusCode'])) 363 | print("Vendor: " + Fore.CYAN + couch_json['vendor']['name'] + Fore.RESET) 364 | print('Features:') 365 | for i in couch_json['features']: 366 | print(Fore.GREEN + i + Fore.RESET) 367 | except Exception as e: 368 | if 'state' in service['result']['data']: 369 | print("Server status: " + service['result']['data']['state']['state']) 370 | else: 371 | print(Fore.RED + "Cannot retrieve information" + Fore.RESET) 372 | 373 | print("-----------------------------") 374 | 375 | def check_mongodb(results): 376 | if results: 377 | for service in results: 378 | print('IP: ' + service['target']['ip'] + ":" + str(service['target']['port'])) 379 | if not service['result']['error']: 380 | try: 381 | if service['result']['data']['listDatabases']['totalSize'] > args.minsize: 382 | print("Size: " + Fore.LIGHTBLUE_EX + size( 383 | service['result']['data']['listDatabases']['totalSize']) + Fore.RESET) 384 | 385 | for database in service['result']['data']['listDatabases']['databases']: 386 | if database['empty'] != 'true': 387 | print("Database name: " + Fore.BLUE + database['name'] + Fore.RESET) 388 | print("Size: " + Fore.BLUE + size(database['sizeOnDisk']) + Fore.RESET) 
389 | print('Collections: ') 390 | for collection in database['collections']: 391 | print(Fore.LIGHTBLUE_EX + collection['name'] + Fore.RESET) 392 | print("-----------------------------") 393 | else: 394 | print("Total size is only " + Fore.RED + str(service['result']['data']['listDatabases']['totalSize']) + Fore.RESET + " which is below the minimum size of " + str(args.minsize)) 395 | print("-----------------------------") 396 | except: 397 | pass 398 | else: 399 | # print("Error: " + Fore.RED + service['result']['error'][0]['errmsg'] + Fore.RESET) 400 | print("-----------------------------") 401 | 402 | def check_elastic(results): 403 | if results: 404 | for service in results: 405 | print('http://' + service['target']['ip'] + ":" + str(service['target']['port']) + "/_cat/indices") 406 | print("Cluster name: "+ Fore.LIGHTMAGENTA_EX + service['result']['data']['cluster_name'] + Fore.RESET) 407 | print("Indices:") 408 | 409 | try: 410 | for indice in service['result']['data']['indices']: 411 | if indice['size_in_bytes'] > 10000000000: 412 | print("Name: " + Fore.GREEN + indice['index_name'] + Fore.RESET) 413 | print("No. 
of documents: " +Fore.BLUE + str(indice['docs']) + Fore.RESET) 414 | print("Size: " + Fore.LIGHTCYAN_EX + str(size(indice['size_in_bytes'])) + Fore.RESET) 415 | except: 416 | print("No indices") 417 | print("-----------------------------") 418 | 419 | def check_listing(results): 420 | if results: 421 | for service in results: 422 | dir = False 423 | print('https://' + service['target']['ip'] + ":" + str(service['target']['port'])) 424 | 425 | try: 426 | print("Product: " + Fore.MAGENTA + service['result']['data']['service']['product'] + Fore.RESET) 427 | 428 | if 'hostname' in service['result']['data']['service']: 429 | print("Hostname: " + Fore.YELLOW + service['result']['data']['service']['hostname'] + Fore.RESET) 430 | html_code = service['result']['data']['service']['banner'] 431 | except KeyError: 432 | if 'response' in service['result']['data']: 433 | print("Status code: " + str(service['result']['data']['response']['statusCode'])) 434 | html_code = service['result']['data']['response']['body'] 435 | else: 436 | html_code = "" 437 | 438 | 439 | soup = BeautifulSoup(html_code, features="html.parser") 440 | for project in soup.find_all("a", href=True): 441 | try: 442 | if project.contents[0] == "Name" or project.contents[0] == "Last modified" or project.contents[0] == "Size" or project.contents[0] == "Description" or project.contents[0] == "../": 443 | dir = True 444 | pass 445 | 446 | if dir == True: 447 | if project.contents[0] == "Name" or project.contents[0] == "Last modified" or project.contents[ 448 | 0] == "Size" or project.contents[0] == "Description" or project.contents[0] == "../": 449 | pass 450 | else: 451 | print(Fore.GREEN + str(project.contents[0]) + Fore.RESET) 452 | 453 | except: 454 | pass 455 | 456 | print("-----------------------------") 457 | 458 | def check_cassandra(results): 459 | if results: 460 | for service in results: 461 | print('IP: ' + service['target']['ip'] + ":" + str(service['target']['port'])) 462 | 463 | try: 464 | 
print("Cluster name: " + Fore.MAGENTA + service['result']['data']['info'][0]['cluster_name'] + Fore.RESET) 465 | print("Datacenter: " + Fore.YELLOW + service['result']['data']['info'][0]['data_center'] + Fore.RESET) 466 | 467 | for keyspace in service['result']['data']['keyspaces']: 468 | if keyspace in ('system', 'system_traces', 'system_schema', 'system_auth', 'system_distributed'): 469 | pass  # skip Cassandra's built-in keyspaces 470 | else: 471 | print("Keyspace: " + Fore.BLUE + keyspace + Fore.RESET) 472 | print("Tables: ") 473 | for table in service['result']['data']['keyspaces'][keyspace]['tables']: 474 | print(Fore.GREEN + table + Fore.RESET) 475 | 476 | print("-----------------------------") 477 | 478 | except Exception: 479 | print("-----------------------------") 480 | pass 481 | 482 | def check_rethinkdb(results): 483 | if results: 484 | for service in results: 485 | print('ReQL: ' + service['target']['ip'] + ":" + str(service['result']['data']['status'][0]['network']['reql_port'])) 486 | print('HTTP Admin: ' + "http://" + service['target']['ip'] + ":" + str(service['result']['data']['status'][0]['network']['http_admin_port'])) 487 | 488 | if 'hostname' in service['result']['data']['status'][0]['network']: 489 | print("Hostname: " + Fore.BLUE + service['result']['data']['status'][0]['network']['hostname'] + Fore.RESET) 490 | 491 | print("Version: " + Fore.YELLOW + service['result']['data']['status'][0]['process']['version'] + Fore.RESET) 492 | print("Name: " + Fore.MAGENTA + service['result']['data']['status'][0]['name'] + Fore.RESET) 493 | 494 | for database in service['result']['data']['databases']: 495 | 496 | print("Database: " + Fore.LIGHTCYAN_EX + database + Fore.RESET) 497 | print("Tables: ") 498 | for table in service['result']['data']['databases'][database]['tables']: 499 | print(Fore.GREEN + table + Fore.RESET) 500 | 501 | print("-----------------------------") 502 | 503 | if rsync: 504 | for page in range(first, last): 505 | 
print(Fore.RED + '----------------------------------Rsync - Page ' + str( 506 | page) + '--------------------------------' + Fore.RESET) 507 | rsync_results = binaryedge_query(rsync_query + " " + query, page) 508 | check_rsync(rsync_results) 509 | 510 | if gitlab: 511 | for page in range(first, last): 512 | print(Fore.RED + '----------------------------------GitLab - Page ' + str( 513 | page) + '--------------------------------' + Fore.RESET) 514 | gitlab_results = binaryedge_query(gitlab_query + " " + query, page) 515 | check_gitlab(gitlab_results) 516 | 517 | if sonarqube: 518 | for page in range(first, last): 519 | print(Fore.RED + '----------------------------------SonarQube - Page ' + str( 520 | page) + '--------------------------------' + Fore.RESET) 521 | sonarqube_results = binaryedge_query(sonarqube_query + " " + query, page) 522 | check_sonarqube(sonarqube_results) 523 | 524 | if jenkins: 525 | for page in range(first, last): 526 | print(Fore.RED + '----------------------------------Jenkins - Page ' + str( 527 | page) + '--------------------------------' + Fore.RESET) 528 | jenkins_results = binaryedge_query(jenkins_query + " " + query, page) 529 | check_jenkins(jenkins_results) 530 | 531 | if elastic: 532 | for page in range(first, last): 533 | print(Fore.RED + '----------------------------------Elastic - Page ' + str( 534 | page) + '--------------------------------' + Fore.RESET) 535 | elastic_results = binaryedge_query(elastic_query + " " + query, page) 536 | check_elastic(elastic_results) 537 | 538 | if couchdb: 539 | for page in range(first, last): 540 | print(Fore.RED + '----------------------------------CouchDB - Page ' + str( 541 | page) + '--------------------------------' + Fore.RESET) 542 | couchdb_results = binaryedge_query(couchdb_query + " " + query, page) 543 | check_couchdb(couchdb_results) 544 | 545 | if mongodb: 546 | for page in range(first, last): 547 | print(Fore.RED + '----------------------------------MongoDB - Page ' + str( 548 | page) + 
'--------------------------------' + Fore.RESET) 549 | mongodb_results = binaryedge_query(mongodb_query + " " + query, page) 550 | check_mongodb(mongodb_results) 551 | 552 | if kibana: 553 | for page in range(first, last): 554 | print(Fore.RED + '----------------------------------Kibana - Page ' + str( 555 | page) + '--------------------------------' + Fore.RESET) 556 | kibana_results = binaryedge_query(kibana_query + " " + query, page) 557 | check_kibana(kibana_results) 558 | 559 | if listing: 560 | for page in range(first, last): 561 | print(Fore.RED + '----------------------------------Listing directory - Page ' + str( 562 | page) + '--------------------------------' + Fore.RESET) 563 | listing_results = binaryedge_query(listing_query + " " + query, page) 564 | check_listing(listing_results) 565 | 566 | if cassandra: 567 | for page in range(first, last): 568 | print(Fore.RED + '----------------------------------Cassandra - Page ' + str( 569 | page) + '--------------------------------' + Fore.RESET) 570 | cassandra_results = binaryedge_query(cassandra_query + " " + query, page) 571 | check_cassandra(cassandra_results) 572 | 573 | if rethink: 574 | for page in range(first, last): 575 | print(Fore.RED + '----------------------------------RethinkDB - Page ' + str( 576 | page) + '--------------------------------' + Fore.RESET) 577 | rethink_results = binaryedge_query(rethink_query + " " + query, page) 578 | check_rethinkdb(rethink_results) 579 | 580 | if s3asia: 581 | search = '"s3.ap-southeast-1.amazonaws.com"' + " " + query + ' tag:"WEBSERVER"' 582 | for page in range(first, last): 583 | print(Fore.RED + '----------------------------------s3.ap-southeast-1.amazonaws.com - Page ' + str( 584 | page) + '--------------------------------' + Fore.RESET) 585 | results = be.host_search(search, page) 586 | check_amazons3(results) 587 | 588 | if s3usa: 589 | search = '"s3-us-west-2.amazonaws.com"' + " " + query + ' tag:"WEBSERVER"' 590 | for page in range(first, last): 591 | print(Fore.RED 
+ '----------------------------------s3-us-west-2.amazonaws.com - Page ' + str( 592 | page) + '--------------------------------' + Fore.RESET) 593 | results = be.host_search(search, page) 594 | check_amazons3(results) 595 | 596 | if s3europe: 597 | search = '"s3-eu-west-1.amazonaws.com"' + " " + query + ' tag:"WEBSERVER"' 598 | for page in range(first, last): 599 | print(Fore.RED + '----------------------------------s3-eu-west-1.amazonaws.com - Page ' + str( 600 | page) + '--------------------------------' + Fore.RESET) 601 | results = be.host_search(search, page) 602 | check_amazons3(results) 603 | 604 | -------------------------------------------------------------------------------- /requirements.txt: -------------------------------------------------------------------------------- 1 | beautifulsoup4==4.7.1 2 | colorama==0.4.1 3 | hurry.filesize==0.9 4 | requests==2.22.0 5 | urllib3==1.25.2 6 | pybinaryedge 7 | pysocks 8 | --------------------------------------------------------------------------------
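The directory-listing detection in `check_listing` boils down to: take the HTML banner or response body, collect the text of every anchor, and treat the page as an Apache-style index only if it contains the header links ("Name", "Last modified", "Size", "Description") or a "../" parent link; everything else is then printed as an exposed entry. A standalone sketch of that step, using only the standard library so it runs without BeautifulSoup (the `listing_entries` helper and the sample HTML are invented for illustration, not part of leaklooker.py):

```python
from html.parser import HTMLParser

# Header/parent links that mark an Apache-style directory index.
INDEX_MARKERS = {"Name", "Last modified", "Size", "Description", "../"}

class LinkTexts(HTMLParser):
    """Collect the visible text of every <a href=...> element."""
    def __init__(self):
        super().__init__()
        self._in_link = False
        self.texts = []

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self._in_link = True
            self.texts.append("")

    def handle_endtag(self, tag):
        if tag == "a":
            self._in_link = False

    def handle_data(self, data):
        if self._in_link:
            self.texts[-1] += data

def listing_entries(html):
    """Return listed entries if the page looks like a directory index,
    otherwise an empty list (mirrors the marker check in check_listing)."""
    parser = LinkTexts()
    parser.feed(html)
    # No index markers at all -> probably not a directory listing.
    if not INDEX_MARKERS.intersection(parser.texts):
        return []
    # Markers themselves are noise; only the real entries are interesting.
    return [t for t in parser.texts if t not in INDEX_MARKERS]

sample = ('<a href="?C=N;O=D">Name</a> <a href="../">../</a> '
          '<a href="backup.sql">backup.sql</a>')
print(listing_entries(sample))  # ['backup.sql']
```

The same marker set also explains why `check_listing` stays silent on ordinary pages: a page full of links but with none of the index headers never flips the flag, so nothing is printed.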