├── scrapy
│   ├── wooyun
│   │   ├── wooyun
│   │   │   ├── __init__.py
│   │   │   ├── spiders
│   │   │   │   ├── __init__.py
│   │   │   │   └── WooyunSpider.py
│   │   │   ├── items.py
│   │   │   ├── pipelines.py
│   │   │   └── settings.py
│   │   └── scrapy.cfg
│   └── wooyun_drops
│       ├── wooyun_drops
│       │   ├── __init__.py
│       │   ├── spiders
│       │   │   ├── __init__.py
│       │   │   └── WooyunSpider.py
│       │   ├── items.py
│       │   ├── settings.py
│       │   └── pipelines.py
│       └── scrapy.cfg
├── index.png
├── search.png
├── .gitignore
├── flask
│   ├── static
│   │   ├── bugs
│   │   │   ├── bootstrap
│   │   │   │   ├── fonts
│   │   │   │   │   ├── glyphicons-halflings-regular.eot
│   │   │   │   │   ├── glyphicons-halflings-regular.ttf
│   │   │   │   │   ├── glyphicons-halflings-regular.woff
│   │   │   │   │   └── glyphicons-halflings-regular.woff2
│   │   │   │   ├── js
│   │   │   │   │   ├── npm.js
│   │   │   │   │   └── bootstrap.min.js
│   │   │   │   └── css
│   │   │   │       ├── bootstrap-theme.min.css
│   │   │   │       └── bootstrap-theme.css
│   │   │   ├── js
│   │   │   │   └── jquery.twbsPagination.js
│   │   │   └── css
│   │   │       └── style.css
│   │   └── drops
│   │       └── js
│   │           └── bootstrap.min.js
│   ├── templates
│   │   ├── search_drops.html
│   │   ├── search_bugs.html
│   │   ├── index.html
│   │   └── base.html
│   └── app.py
├── update.sh
├── install.md
├── tornado
│   ├── templates
│   │   ├── search_drops.html
│   │   ├── search_bugs.html
│   │   ├── index.html
│   │   └── base.html
│   └── app.py
├── README.md
└── elasticsearch_install.md
/scrapy/wooyun/wooyun/__init__.py: -------------------------------------------------------------------------------- 1 | -------------------------------------------------------------------------------- /scrapy/wooyun_drops/wooyun_drops/__init__.py: -------------------------------------------------------------------------------- 1 | -------------------------------------------------------------------------------- /index.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/0oVicero0/wooyun_public/master/index.png -------------------------------------------------------------------------------- /search.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/0oVicero0/wooyun_public/master/search.png -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | 2 | flask/static/drops/运维安全-2547.html 3 | 4 | *.html 5 | 6 | *.jpg 7 | 8 | *.pyc 9 | -------------------------------------------------------------------------------- /flask/static/bugs/bootstrap/fonts/glyphicons-halflings-regular.eot: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/0oVicero0/wooyun_public/master/flask/static/bugs/bootstrap/fonts/glyphicons-halflings-regular.eot -------------------------------------------------------------------------------- /flask/static/bugs/bootstrap/fonts/glyphicons-halflings-regular.ttf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/0oVicero0/wooyun_public/master/flask/static/bugs/bootstrap/fonts/glyphicons-halflings-regular.ttf -------------------------------------------------------------------------------- /flask/static/bugs/bootstrap/fonts/glyphicons-halflings-regular.woff: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/0oVicero0/wooyun_public/master/flask/static/bugs/bootstrap/fonts/glyphicons-halflings-regular.woff -------------------------------------------------------------------------------- /flask/static/bugs/bootstrap/fonts/glyphicons-halflings-regular.woff2: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/0oVicero0/wooyun_public/master/flask/static/bugs/bootstrap/fonts/glyphicons-halflings-regular.woff2 -------------------------------------------------------------------------------- /scrapy/wooyun/wooyun/spiders/__init__.py: 
-------------------------------------------------------------------------------- 1 | # This package will contain the spiders of your Scrapy project 2 | # 3 | # Please refer to the documentation for information on how to create and manage 4 | # your spiders. 5 | -------------------------------------------------------------------------------- /scrapy/wooyun_drops/wooyun_drops/spiders/__init__.py: -------------------------------------------------------------------------------- 1 | # This package will contain the spiders of your Scrapy project 2 | # 3 | # Please refer to the documentation for information on how to create and manage 4 | # your spiders. 5 | -------------------------------------------------------------------------------- /update.sh: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | 3 | mongod --config /usr/local/etc/mongod.conf & 4 | 5 | cd scrapy/wooyun 6 | scrapy crawl wooyun -a page_max=100 7 | 8 | cd ../wooyun_drops 9 | scrapy crawl wooyun -a page_max=10 10 | 11 | cd ../../flask 12 | ./app.py 13 | -------------------------------------------------------------------------------- /scrapy/wooyun/scrapy.cfg: -------------------------------------------------------------------------------- 1 | # Automatically created by: scrapy startproject 2 | # 3 | # For more information about the [deploy] section see: 4 | # https://scrapyd.readthedocs.org/en/latest/deploy.html 5 | 6 | [settings] 7 | default = wooyun.settings 8 | 9 | [deploy] 10 | #url = http://localhost:6800/ 11 | project = wooyun 12 | -------------------------------------------------------------------------------- /scrapy/wooyun_drops/scrapy.cfg: -------------------------------------------------------------------------------- 1 | # Automatically created by: scrapy startproject 2 | # 3 | # For more information about the [deploy] section see: 4 | # https://scrapyd.readthedocs.org/en/latest/deploy.html 5 | 6 | [settings] 7 | default = wooyun_drops.settings 
8 | 9 | [deploy] 10 | #url = http://localhost:6800/ 11 | project = wooyun_drops 12 | -------------------------------------------------------------------------------- /flask/static/bugs/bootstrap/js/npm.js: -------------------------------------------------------------------------------- 1 | // This file is autogenerated via the `commonjs` Grunt task. You can require() this file in a CommonJS environment. 2 | require('../../js/transition.js') 3 | require('../../js/alert.js') 4 | require('../../js/button.js') 5 | require('../../js/carousel.js') 6 | require('../../js/collapse.js') 7 | require('../../js/dropdown.js') 8 | require('../../js/modal.js') 9 | require('../../js/tooltip.js') 10 | require('../../js/popover.js') 11 | require('../../js/scrollspy.js') 12 | require('../../js/tab.js') 13 | require('../../js/affix.js') -------------------------------------------------------------------------------- /scrapy/wooyun_drops/wooyun_drops/items.py: -------------------------------------------------------------------------------- 1 | # -*- coding: utf-8 -*- 2 | 3 | # Define here the models for your scraped items 4 | # 5 | # See documentation in: 6 | # http://doc.scrapy.org/en/latest/topics/items.html 7 | 8 | import scrapy 9 | 10 | 11 | class WooyunItem(scrapy.Item): 12 | # define the fields for your item here like: 13 | # name = scrapy.Field() 14 | title = scrapy.Field() 15 | author = scrapy.Field() 16 | datetime = scrapy.Field() 17 | category = scrapy.Field() 18 | url = scrapy.Field() 19 | html = scrapy.Field() 20 | 21 | image_urls = scrapy.Field() 22 | images = scrapy.Field() 23 | 24 | -------------------------------------------------------------------------------- /scrapy/wooyun/wooyun/items.py: -------------------------------------------------------------------------------- 1 | # -*- coding: utf-8 -*- 2 | 3 | # Define here the models for your scraped items 4 | # 5 | # See documentation in: 6 | # http://doc.scrapy.org/en/latest/topics/items.html 7 | 8 | import scrapy 9 | 10 
| 11 | class WooyunItem(scrapy.Item): 12 | # define the fields for your item here like: 13 | # name = scrapy.Field() 14 | datetime = scrapy.Field() 15 | datetime_open = scrapy.Field() 16 | title = scrapy.Field() 17 | wooyun_id = scrapy.Field() 18 | author = scrapy.Field() 19 | bug_type = scrapy.Field() 20 | html = scrapy.Field() 21 | # 22 | image_urls = scrapy.Field() 23 | images = scrapy.Field() 24 | -------------------------------------------------------------------------------- /install.md: -------------------------------------------------------------------------------- 1 | Installing wooyun_public on Ubuntu 2 | ============================= 3 | 4 | The following covers installation on Ubuntu 14.04 and 16.04. The required dependencies are: 5 | 6 | + Python 2.7 and pip 7 | + mongodb 8 | + scrapy 9 | + flask or tornado 10 | + pymongo 11 | 12 | Steps 13 | -------- 14 | 1. Install Python, pip, and MongoDB 15 | 16 | ```bash 17 | sudo apt-get install python python-pip mongodb 18 | ``` 19 | 2. Install Scrapy 20 | 21 | ```bash 22 | # If installing Scrapy fails, first install the dependency packages below with apt-get, then install lxml with pip; after that Scrapy installs normally 23 | sudo apt-get install libxml2-dev libxslt1-dev python-dev zlib1g-dev libevent-dev python-openssl 24 | 25 | sudo pip install lxml 26 | sudo pip install scrapy 27 | ``` 28 | 3. Install pymongo and Flask (or Tornado) 29 | 30 | ```bash 31 | sudo pip install flask pymongo 32 | (sudo pip install tornado) 33 | ``` 34 | 4. Download the source code from GitHub 35 | 36 | ```bash 37 | git clone https://github.com/hanc00l/wooyun_public 38 | ``` 39 | 40 | 41 | -------------------------------------------------------------------------------- /tornado/templates/search_drops.html: -------------------------------------------------------------------------------- 1 | {% extends "base.html" %} 2 | {% block content %} 3 |
<tr><th>发表时间</th><th>标题</th><th>文章类型</th><th>作者</th></tr>
<tr><td>{{row['datetime']}}</td><td>{{row['title']}}</td><td>{{row['category']}}</td><td>{{row['author']}}</td></tr>
<tr><th>发表时间</th><th>标题</th><th>文章类型</th><th>作者</th></tr>
<tr><td>{{row['datetime']}}</td><td>{{row['title']}}</td><td>{{row['category']}}</td><td>{{row['author']}}</td></tr>
<tr><th>提交时间</th><th>标题</th><th>漏洞类型</th><th>提交者</th></tr>
<tr><td>{{row['datetime']}}</td><td>{{row['title']}}</td><td>{{row['bug_type']}}</td><td>{{row['author']}}</td></tr>
<tr><th>提交时间</th><th>标题</th><th>漏洞类型</th><th>提交者</th></tr>
<tr><td>{{row['datetime']}}</td><td>{{row['title']}}</td><td>{{row['bug_type']}}</td><td>{{row['author']}}</td></tr>