├── LICENSE.md
├── README.md
└── XssPy.py

/LICENSE.md:
--------------------------------------------------------------------------------
The MIT License (MIT)

Copyright (c) 2016 Faizan Ahmad

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
# XssPy - Web Application XSS Scanner
A tool by Fsecurify

Author: Faizan Ahmad
https://pk.linkedin.com/in/faizan-ahmad-015964118

#Great News: XssPy was recently used by an engineer at Microsoft to find a bug in the Pentagon's bug bounty program.
http://holisticinfosec.blogspot.com/2016/06/toolsmith-tidbit-xsspy.html

#How to Use:
http://fsecurify.com/xsspy-web-application-xss-scanner/

#Installation:
Type the following in the terminal.

git clone https://github.com/faizann24/XssPy/ /opt/xsspy

The tool works on Python 2.7 and requires mechanize. If mechanize is not installed, run "pip install mechanize" in the terminal.

#Usage:
python XssPy.py -u website.com (Do not write www.website.com OR http://www.website.com)

#Payloads
If you have found an XSS vulnerability, you can try the following payloads.
http://pastebin.com/J1hCfL9J

#Description:
XssPy is a Python tool for finding Cross Site Scripting vulnerabilities in websites. This tool is the first of its kind. Instead of just checking one page, as most tools do, it first traverses the website and finds all the links and subdomains. After that, it scans each and every input on each and every page found during its traversal. It uses small yet effective payloads to search for XSS vulnerabilities.

The tool has been tested in parallel with paid vulnerability scanners, and most of those scanners failed to detect vulnerabilities that XssPy was able to find. Moreover, most paid tools scan only one site, whereas XssPy first finds a lot of subdomains and then scans all the links together.
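To make the approach concrete, here is a minimal hand-written sketch of the crawl-then-inject idea, using only the mechanize calls that XssPy itself relies on. The function names, the sample payload and the example domain are illustrative and are not part of the tool.

```python
# Minimal sketch of the crawl-then-inject idea (illustrative, not XssPy itself).
import mechanize

PAYLOAD = "<script>alert(1)</script>"   # example probe string

br = mechanize.Browser()
br.set_handle_robots(False)


def collect_links(start_url, scope):
    """Open the start page and keep every link that stays on the target site."""
    br.open(start_url)
    return list(set(str(link.absolute_url) for link in br.links()
                    if scope in str(link.absolute_url)))


def probe_first_form(page_url):
    """Fill the text inputs of the first form with the probe and look for it in the response."""
    br.open(page_url)
    forms = list(br.forms())
    if not forms:
        return False
    br.select_form(nr=0)
    for control in forms[0].controls:
        if 'TextControl' in str(control) and control.name:
            br.form[str(control.name)] = PAYLOAD
    br.submit()
    return PAYLOAD in br.response().read()


for page in collect_links("http://www.example.com", "example.com"):
    if probe_first_form(page):
        print "Possible reflected XSS at", page
```

The actual scanner (XssPy.py below) adds HTTPS/HTTP detection, an optional comprehensive traversal and a second payload on top of this basic loop.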
The tool comes with:

1) Short Scanning
2) Comprehensive Scanning
3) Finding subdomains
4) Checking every input on every page

With this tool, Cross Site Scripting vulnerabilities have been found in the websites of MIT, Stanford, Duke University, Informatica, FormAssembly, ActiveCampaign, Volcanicpixels, Oxford, Motorola, Berkeley and many more.


#NOTE:
Mail me if you encounter any errors (fsecurify@gmail.com). You can also post your problems on the website. I'll try my best to respond as soon as possible.

Best Regards
Faizan Ahmad
CEO of Fsecurify

--------------------------------------------------------------------------------
/XssPy.py:
--------------------------------------------------------------------------------
import mechanize
import sys
import httplib
import argparse
import logging
from urlparse import urlparse

br = mechanize.Browser()  # initiating the browser
br.addheaders = [('User-agent', 'Mozilla/5.0 (Windows; U; Windows NT 5.1; it; rv:1.8.1.11) Gecko/20071127 Firefox/2.0.0.11')]
br.set_handle_robots(False)
br.set_handle_refresh(False)


class color:
    RED = '\033[91m'
    GREEN = '\033[92m'
    YELLOW = '\033[93m'
    BOLD = '\033[1m'
    END = '\033[0m'

    @staticmethod
    def log(lvl, col, msg):
        logger.log(lvl, col + msg + color.END)

print color.BOLD + color.RED + """
XssPy - Finding XSS made easier
Author: Faizan Ahmad (Fsecurify)
Email: fsecurify@gmail.com
Usage: python XssPy.py -u website.com (Do not write www.website.com OR http://www.website.com)
Comprehensive Scan: python XssPy.py -u website.com -e

Description: XssPy is a python tool for finding Cross Site Scripting
vulnerabilities in websites. This tool is the first of its kind.
Instead of just checking one page as most of the tools do, this tool
traverses the website and finds all the links and subdomains first.
After that, it starts scanning each and every input on each and every
page that it found during its traversal. It uses small yet effective
payloads to search for XSS vulnerabilities. XSS in many high
profile websites and educational institutes has been found
by using this very tool.
""" + color.END

logger = logging.getLogger(__name__)
lh = logging.StreamHandler()  # Handler for the logger
logger.addHandler(lh)
formatter = logging.Formatter('[%(asctime)s] %(message)s', datefmt='%H:%M:%S')
lh.setFormatter(formatter)

parser = argparse.ArgumentParser()
parser.add_argument('-u', action='store', dest='url', help='The URL to analyze')
parser.add_argument('-e', action='store_true', dest='compOn', help='Enable comprehensive scan')
parser.add_argument('-v', action='store_true', dest='verbose', help='Enable verbose logging')
results = parser.parse_args()

logger.setLevel(logging.DEBUG if results.verbose else logging.INFO)


def initializeAndFind(firstDomains):
    dummy = 0  # dummy variable for doing nothing
    firstDomains = []  # list of domains
    if not results.url:  # check whether the url has been passed or not
        color.log(logging.INFO, color.GREEN, 'Url not provided correctly')
        return 0

    smallurl = results.url  # small url is the part of the url without http and www

    allURLS = []
    allURLS.append(smallurl)  # just one url at the moment
    largeNumberOfUrls = []  # in case one wants to do a comprehensive search

    color.log(logging.INFO, color.GREEN, 'Doing a short traversal.')  # doing a short traversal if no command line argument is being passed
    for url in allURLS:
        x = str(url)
        smallurl = x

        try:  # Test HTTPS/HTTP compatibility. Prefers HTTPS but defaults to HTTP if any errors are encountered
            test = httplib.HTTPSConnection(smallurl)
            test.request("GET", "/")
            response = test.getresponse()
            if (response.status == 200) or (response.status == 302):
                url = "https://www." + str(url)
            elif response.status == 301:
                loc = urlparse(response.getheader('Location'))  # follow the redirect target
                url = loc.scheme + '://' + loc.netloc
            else:
                url = "http://www." + str(url)
        except:
            url = "http://www." + str(url)

        try:
            br.open(url)
            color.log(logging.INFO, color.GREEN, 'Finding all the links of the website ' + str(url))
            try:
                for link in br.links():  # finding the links of the website
                    if smallurl in str(link.absolute_url):
                        firstDomains.append(str(link.absolute_url))
                        firstDomains = list(set(firstDomains))
            except:
                dummy = 0
        except:
            dummy = 0
    color.log(logging.INFO, color.GREEN, 'Number of links to test are: ' + str(len(firstDomains)))

    if results.compOn:
        color.log(logging.INFO, color.GREEN, 'Doing a comprehensive traversal. This could take a while')
        for link in firstDomains:
            try:
                br.open(link)
                try:
                    for newlink in br.links():  # going deeper into each link and finding its links
                        if smallurl in str(newlink.absolute_url):
                            largeNumberOfUrls.append(newlink.absolute_url)
                except:
                    dummy = 0
            except:
                dummy = 0

        firstDomains = list(set(firstDomains + largeNumberOfUrls))
        color.log(logging.INFO, color.GREEN, 'Total Number of links to test have become: ' + str(len(firstDomains)))  # all links have been found
    return firstDomains


def findxss(firstDomains):
    color.log(logging.INFO, color.GREEN, 'Started finding XSS')  # starting to find XSS
    xssLinks = []  # TOTAL CROSS SITE SCRIPTING FINDINGS
    count = 0  # to check for forms
    dummyVar = 0  # dummy variable for doing nothing
    if len(firstDomains) > 0:  # if there is at least one link
        for link in firstDomains:
            y = str(link)
            color.log(logging.DEBUG, color.YELLOW, str(link))
            if 'jpg' in y:  # just a small check
                color.log(logging.DEBUG, color.RED, '\tNot a good url to test')
            elif 'pdf' in y:
                color.log(logging.DEBUG, color.RED, '\tNot a good url to test')
            else:
                try:
                    br.open(str(link))  # open the link
                except:
                    dummyVar = 0
                try:
                    for form in br.forms():  # check its forms
                        count = count + 1
                except:
                    dummyVar = 0
                if count > 0:  # if a form exists, submit it
                    try:
                        params = list(br.forms())[0]  # our form
                    except:
                        dummyVar = 0
                    try:
                        br.select_form(nr=0)  # select the first form
                    except:
                        dummyVar = 0
                    for p in params.controls:
                        par = str(p)
                        if 'TextControl' in par:  # fill only those inputs which accept text
                            color.log(logging.DEBUG, color.YELLOW, '\tParam: ' + str(p.name))
                            try:
                                br.form[str(p.name)] = '<script>alert(1)</script>'  # our payload
                            except:
                                dummyVar = 0
                            try:
                                br.submit()
                            except:
                                dummyVar = 0
                            try:
                                if '<script>alert(1)</script>' in br.response().read():  # if the payload is found in the response, we have XSS
                                    color.log(logging.INFO, color.BOLD + color.GREEN, 'Xss found and the link is ' + str(link) + ' And the payload is <script>alert(1)</script>')
                                    xssLinks.append(link)
                                else:
                                    dummyVar = 0
                            except:
                                color.log(logging.INFO, color.RED, '\tcould not read the page')
                            try:
                                br.back()
                            except:
                                dummyVar = 0

                            # SECOND PAYLOAD

                            try:
                                br.form[str(p.name)] = 'javascript:alert(1)'  # second payload
                            except:
                                dummyVar = 0
                            try:
                                br.submit()
                            except:
                                dummyVar = 0
                            try:
                                if '