634 |
635 | This program is free software: you can redistribute it and/or modify
636 | it under the terms of the GNU Affero General Public License as published
637 | by the Free Software Foundation, either version 3 of the License, or
638 | (at your option) any later version.
639 |
640 | This program is distributed in the hope that it will be useful,
641 | but WITHOUT ANY WARRANTY; without even the implied warranty of
642 | MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
643 | GNU Affero General Public License for more details.
644 |
645 | You should have received a copy of the GNU Affero General Public License
646 | along with this program. If not, see <https://www.gnu.org/licenses/>.
647 |
648 | Also add information on how to contact you by electronic and paper mail.
649 |
650 | If your software can interact with users remotely through a computer
651 | network, you should also make sure that it provides a way for users to
652 | get its source. For example, if your program is a web application, its
653 | interface could display a "Source" link that leads users to an archive
654 | of the code. There are many ways you could offer source, and different
655 | solutions will be better for different programs; see section 13 for the
656 | specific requirements.
657 |
658 | You should also get your employer (if you work as a programmer) or school,
659 | if any, to sign a "copyright disclaimer" for the program, if necessary.
660 | For more information on this, and how to apply and follow the GNU AGPL, see
661 | <https://www.gnu.org/licenses/>.
662 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | ![StalkPhish logo](pics/stalkphish-logo.png)
2 |
3 | # StalkPhish project has been renamed StalkPhish-OSS
4 | You can find the new project page here: [https://github.com/t4d/StalkPhish-OSS](https://github.com/t4d/StalkPhish-OSS)
5 |
6 | ## Online StalkPhish SaaS
7 | You can find our online StalkPhish SaaS application at [https://www.Stalkphish.io](https://www.Stalkphish.io) and use its free REST API.
8 |
9 | ## Join us
10 | You can join us on Keybase, in the 'stalkphish' channel: [https://keybase.io/team/stalkphish](https://keybase.io/team/stalkphish)
11 |
12 | ## Follow us
13 | - [Our blog](https://www.Stalkphish.com)
14 | - [LinkedIn](https://www.linkedin.com/company/stalkphish)
15 | - [Twitter](https://twitter.com/Stalkphish_io)
16 | - [Youtube](https://www.youtube.com/channel/UC5hb1CaRdmbSWpN0wTz6SFw)
17 |
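The free REST API mentioned above can be queried from Python with the `requests` library (already listed in `requirements.txt`). The base URL, route, and auth header below are hypothetical placeholders for illustration only, not the documented stalkphish.io API; check the SaaS documentation for the real endpoints:

```python
# Hypothetical sketch only: the base URL, route, and auth scheme below are
# placeholders for illustration, not the documented stalkphish.io API.
API_BASE = "https://api.stalkphish.io"  # placeholder base URL


def build_search_request(token: str, term: str):
    """Return (url, headers) for a keyword search call (placeholder route)."""
    url = API_BASE + "/api/v1/search/" + term  # placeholder route
    headers = {"Authorization": "Token " + token}
    return url, headers


# usage: url, headers = build_search_request("my-token", "paypal")
#        then e.g. requests.get(url, headers=headers)
```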
--------------------------------------------------------------------------------
/docker/Dockerfile:
--------------------------------------------------------------------------------
1 | FROM alpine:3.18
2 |
3 | LABEL maintainer="contact@stalkphish.com"
4 |
5 | ENV INITSYSTEM=on
6 |
7 | # install packages
8 | RUN apk --no-cache add --update \
9 | py3-lxml \
10 | py3-pip \
11 | git \
12 | python3 \
13 | openrc \
14 | sqlite \
15 | supervisor \
16 | tor
17 |
18 | # clone StalkPhish from GitHub
19 | RUN git clone https://github.com/t4d/StalkPhish.git /opt/StalkPhish
20 |
21 | # upgrade pip
22 | RUN pip3 install --upgrade pip
23 |
24 | # install requirements' file
25 | RUN pip3 install -r /opt/StalkPhish/requirements.txt
26 |
27 | # create directories to share
28 | RUN mkdir /opt/StalkPhish/stalkphish/log
29 | RUN mkdir /opt/StalkPhish/stalkphish/dl
30 | RUN mkdir /opt/StalkPhish/stalkphish/db
31 |
32 | # Add custom supervisor config
33 | COPY supervisord.conf /etc/supervisor/conf.d/
34 | CMD ["/usr/bin/supervisord", "-c", "/etc/supervisor/conf.d/supervisord.conf"]
35 |
36 | # Clean up the apk cache
37 | RUN rm -rf /var/cache/apk/*
38 |
--------------------------------------------------------------------------------
/docker/docker-compose.yml:
--------------------------------------------------------------------------------
1 | version: "2"
2 |
3 | services:
4 | stalkphish:
5 | image: stalkphish
6 | container_name: stalkphish
7 | hostname: stalkphish
8 | build: ./
9 | volumes:
10 | - /tmp/log:/opt/StalkPhish/stalkphish/log:rw
11 | - /tmp/dl:/opt/StalkPhish/stalkphish/dl:rw
12 | - /tmp/db:/opt/StalkPhish/stalkphish/db:rw
13 | command: /usr/bin/supervisord -c /etc/supervisor/conf.d/supervisord.conf
14 |
--------------------------------------------------------------------------------
/docker/supervisord.conf:
--------------------------------------------------------------------------------
1 | [supervisord]
2 | nodaemon=true
3 |
4 | [program:tor]
5 | command=/usr/bin/tor
6 |
--------------------------------------------------------------------------------
/pics/stalkphish-logo.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/t4d/StalkPhish/8a7886e05ff319e8a95602cbb12260ff71ad7c89/pics/stalkphish-logo.png
--------------------------------------------------------------------------------
/requirements.txt:
--------------------------------------------------------------------------------
1 | BeautifulSoup4
2 | cfscrape
3 | requests
4 | pysocks
5 | ipwhois
6 | lxml
7 |
--------------------------------------------------------------------------------
/stalkphish/StalkPhish.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python3
2 | # -*- coding: utf-8 -*-
3 |
4 | # StalkPhish - The Phishing kits stalker
5 | # Copyright (C) 2018-2020 Thomas "tAd" Damonneville
6 | #
7 | # This program is free software: you can redistribute it and/or modify
8 | # it under the terms of the GNU Affero General Public License as
9 | # published by the Free Software Foundation, either version 3 of the
10 | # License, or (at your option) any later version.
11 | #
12 | # This program is distributed in the hope that it will be useful,
13 | # but WITHOUT ANY WARRANTY; without even the implied warranty of
14 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
15 | # GNU Affero General Public License for more details.
16 | #
17 | # You should have received a copy of the GNU Affero General Public License
18 | # along with this program. If not, see <https://www.gnu.org/licenses/>.
19 |
20 | import os
21 | import sys
22 | import glob
23 | import time
24 | import getopt
25 | import socket
26 | from tools.utils import VerifyPath
27 | from tools.utils import NetInfo
28 | from tools.sqlite import SqliteCmd
29 | from tools.addurl import AddUniqueURL
30 | from tools.logging import Logger
31 | from tools.confparser import ConfParser
32 | VERSION = "0.9.8-3"
33 |
34 |
35 | # Graceful banner :)
36 | def banner():
37 | banner = '''
38 | _____ _ _ _ _____ _ _ _
39 | / ____| | | | | | __ \| | (_) | |
40 | | (___ | |_ __ _| | | _| |__) | |__ _ ___| |__
41 | \___ \| __/ _` | | |/ / ___/| '_ \| / __| '_ \
42 | ____) | || (_| | | <| | | | | | \__ \ | | |
43 | |_____/ \__\__,_|_|_|\__\| |_| |_|_|___/_| |_|
44 | '''
45 | print(banner)
46 | print("-= StalkPhish - The Phishing Kit stalker - v" + VERSION + " =-\n")
47 |
48 |
49 | # Usage
50 | def usage():
51 | usage = """
52 | -h --help Prints this help
53 | -c --config Configuration file to use (mandatory)
54 | -G --get Try to download zip file containing phishing kit sources (long and noisy)
55 | -N --nosint Don't use OSINT databases
56 | -u --url Add only one URL
57 | -s --search Search for a specific string on OSINT modules
58 | """
59 | print(usage)
60 | sys.exit(0)
61 |
62 |
63 | # Tool options
64 | def args_parse():
65 | global ConfFile
66 | global DLPhishingKit
67 | global OSINTsources
68 | global UniqueURL
69 | global URLadd
70 | global SearchUString
71 | confound = "NO"
72 | DLPhishingKit = "NO"
73 | OSINTsources = "YES"
74 | UniqueURL = "NO"
75 | URLadd = ""
76 | SearchUString = ""
77 |
78 | if not len(sys.argv[1:]):
79 | usage()
80 | try:
81 | opts, args = getopt.getopt(sys.argv[1:], "hNGc:u:s:", ["help", "nosint", "get", "conf=", "url=", "search="])
82 | except getopt.GetoptError as err:
83 | print(err)
84 | usage()
85 | sys.exit(2)
86 |
87 | for o, a in opts:
88 | if o in ("-h", "--help"):
89 | usage()
90 | elif o in ("-c", "--config"):
91 | if os.path.isfile(a):
92 | ConfFile = a
93 | confound = "YES"
94 | else:
95 | print(" ERROR - Can't find configuration file.")
96 | usage()
97 | elif confound == "NO":
98 | print(" Error - Configuration file is mandatory.")
99 | usage()
100 |
101 | elif o in ("-N", "--nosint"):
102 | OSINTsources = "NO"
103 | elif o in ("-G", "--get"):
104 | DLPhishingKit = "YES"
105 | elif o in ("-u", "--url"):
106 | UniqueURL = "YES"
107 | URLadd = a
108 | elif o in ("-s", "--search"):
109 | SearchUString = a
110 | else:
111 | assert False, "Unhandled Option"
112 | return
113 |
114 |
115 | # Modules initialization
116 | def LaunchModules(SearchString):
117 | LOG.info("Proceeding to OSINT modules launch")
118 | try:
119 | # If more than one search word
120 | if ',' in SearchString:
121 | SearchString_list = [SearchString.strip(' ') for SearchString in SearchString.split(',')]
122 | else:
123 | SearchString_list = [SearchString]
124 | except:
125 | err = sys.exc_info()
126 | LOG.error("SearchString error " + str(err))
127 |
128 | ###################
129 | # URLScan module #
130 | ###################
131 | ModuleUrlscan = CONF.URLSCAN_active
132 | if ModuleUrlscan is True:
133 | from modules.urlscan import UrlscanOSINT, UrlscanExtractor
134 | ConfURLSCAN_url = CONF.URLSCAN_url
135 | ConfURLSCAN_apikey = CONF.URLSCAN_apikey
136 |
137 | for SearchString in SearchString_list:
138 | UrlscanOSINT(ConfURLSCAN_apikey, ConfURLSCAN_url, PROXY, SearchString, LOG)
139 | UrlscanExtractor(LOG, SQL, TABLEname, PROXY, UAFILE)
140 | else:
141 | pass
142 |
143 | ###################
144 | # URLQUERY module #
145 | ###################
146 | ModuleUrlquery = CONF.URLQUERY_active
147 | if ModuleUrlquery is True:
148 | from modules.urlquery import UrlqueryOSINT, UrlqueryExtractor
149 | ConfURLQUERY_url = CONF.URLQUERY_url
150 |
151 | for SearchString in SearchString_list:
152 | UrlqueryOSINT(ConfURLQUERY_url, PROXY, SearchString, LOG)
153 | UrlqueryExtractor(SearchString, LOG, SQL, TABLEname, PROXY, UAFILE)
154 | else:
155 | pass
156 |
157 | ####################
158 | # PHISHTANK module #
159 | ####################
160 | ModulePhishtank = CONF.PHISHTANK_active
161 | if ModulePhishtank is True:
162 | from modules.phishtank import PhishtankOSINT, PhishtankExtractor, DeletePhishtankFile
163 | ConfPHISHTANK_url = CONF.PHISHTANK_url
164 | ConfPHISHTANK_keep = CONF.PHISHTANK_keep
165 | ConfPHISHTANK_apikey = CONF.PHISHTANK_apikey
166 |
167 | try:
168 | if ConfPHISHTANK_apikey is not None:
169 | ConfPHISHTANK_url = "https://data.phishtank.com/data/{}/online-valid.json".format(ConfPHISHTANK_apikey)
170 | pass
171 | except:
172 | LOG.error("There's a problem with API key. Trying without...")
173 | pass
174 |
175 | try:
176 |             # Get the PHISHTANK free feed (if the cached copy is older than 2 hours)
177 | phishtank_file = ""
178 | filelist = glob.glob(SrcDir + "phishtank-feed-*.json")
179 | if filelist:
180 | last_phishtank_file = max(filelist, key=os.path.getctime)
181 | if os.stat(last_phishtank_file).st_mtime < time.time() - 7200:
182 | # file older than 2 hours, download a new one
183 | phishtank_file = SrcDir + "phishtank-feed-" + time.strftime("%Y%m%d-%H%M") + ".json"
184 | PhishtankOSINT(phishtank_file, ConfPHISHTANK_url, ConfPHISHTANK_keep, SrcDir, PROXY, LOG)
185 | else:
186 |                     LOG.info("Phishtank\'s file still exists (<2h). Proceeding to extraction...")
187 | phishtank_file = last_phishtank_file
188 | else:
189 | phishtank_file = SrcDir + "phishtank-feed-" + time.strftime("%Y%m%d-%H%M") + ".json"
190 | PhishtankOSINT(phishtank_file, ConfPHISHTANK_url, ConfPHISHTANK_keep, SrcDir, PROXY, LOG)
191 |
192 | for SearchString in SearchString_list:
193 | # Search into file
194 | LOG.info("Searching for \'" + SearchString + "\'...")
195 | PhishtankExtractor(phishtank_file, SearchString, LOG, SQL, TABLEname, PROXY, UAFILE)
196 |
197 |             # Delete the feed file if we don't want to keep it
198 | if ConfPHISHTANK_keep is not True:
199 | DeletePhishtankFile(phishtank_file, LOG)
200 | else:
201 | pass
202 | # if sys.exit() from Phishtank module
203 | except SystemExit:
204 | pass
205 | except:
206 | err = sys.exc_info()
207 | LOG.error("Phishtank module error: " + str(err))
208 |
209 | else:
210 | pass
211 |
212 | ####################
213 | # OPENPHISH module #
214 | ####################
215 | ModuleOpenPhish = CONF.OPENPHISH_active
216 | if ModuleOpenPhish is True:
217 | from modules.openphish import OpenphishOSINT, OpenphishExtractor, DeleteOpenphishFile
218 | ConfOPENPHISH_url = CONF.OPENPHISH_url
219 | ConfOPENPHISH_keep = CONF.OPENPHISH_keep
220 |
221 | try:
222 |             # Get the OPENPHISH free feed (if the cached copy is older than 2 hours)
223 | openphish_file = ""
224 | filelist = glob.glob(SrcDir + "openphish-feed-*.txt")
225 | if filelist:
226 | last_openphish_file = max(filelist, key=os.path.getctime)
227 | if os.stat(last_openphish_file).st_mtime < time.time() - 7200:
228 | # file older than 2 hours, download a new one
229 | openphish_file = SrcDir + "openphish-feed-" + time.strftime("%Y%m%d-%H%M") + ".txt"
230 | OpenphishOSINT(openphish_file, ConfOPENPHISH_url, ConfOPENPHISH_keep, SrcDir, PROXY, LOG)
231 | else:
232 |                     LOG.info("Openphish\'s file still exists (<2h). Proceeding to extraction...")
233 | openphish_file = last_openphish_file
234 | else:
235 | openphish_file = SrcDir + "openphish-feed-" + time.strftime("%Y%m%d-%H%M") + ".txt"
236 | OpenphishOSINT(openphish_file, ConfOPENPHISH_url, ConfOPENPHISH_keep, SrcDir, PROXY, LOG)
237 |
238 | for SearchString in SearchString_list:
239 | # Search into file
240 | LOG.info("Searching for \'" + SearchString + "\'...")
241 | OpenphishExtractor(openphish_file, SearchString, LOG, SQL, TABLEname, PROXY, UAFILE)
242 |
243 |             # Delete the feed file if we don't want to keep it
244 | if ConfOPENPHISH_keep is not True:
245 | DeleteOpenphishFile(openphish_file, LOG)
246 | else:
247 | pass
248 |
249 | except:
250 | err = sys.exc_info()
251 | LOG.error("Openphish module error: " + str(err))
252 | else:
253 | pass
254 |
255 | ####################
256 | # Phihstats module #
257 | ####################
258 | ModulePhishstats = CONF.PHISHSTATS_active
259 | if ModulePhishstats is True:
260 | from modules.phishstats import PhishstatsOSINT, PhishstatsExtractor, DeletePhishstatsFile
261 | ConfPHISHSTATS_url = CONF.PHISHSTATS_url
262 | ConfPHISHSTATS_keep = CONF.PHISHSTATS_keep
263 |
264 | try:
265 |             # Get the PHISHSTATS free feed (if the cached copy is older than 2 hours)
266 | phishstats_file = ""
267 | filelist = glob.glob(SrcDir + "phishstats-feed-*.json")
268 | if filelist:
269 | last_phishstats_file = max(filelist, key=os.path.getctime)
270 | if os.stat(last_phishstats_file).st_mtime < time.time() - 7200:
271 | # file older than 2 hours, download a new one
272 | phishstats_file = SrcDir + "phishstats-feed-" + time.strftime("%Y%m%d-%H%M") + ".json"
273 | PhishstatsOSINT(phishstats_file, ConfPHISHSTATS_url, ConfPHISHSTATS_keep, PROXY, SearchString, LOG)
274 | else:
275 |                     LOG.info("Phishstats\'s file still exists (<2h). Proceeding to extraction...")
276 | phishstats_file = last_phishstats_file
277 | else:
278 | phishstats_file = SrcDir + "phishstats-feed-" + time.strftime("%Y%m%d-%H%M") + ".json"
279 | PhishstatsOSINT(phishstats_file, ConfPHISHSTATS_url, ConfPHISHSTATS_keep, PROXY, SearchString, LOG)
280 |
281 | for SearchString in SearchString_list:
282 | # Search into file
283 | LOG.info("Searching for \'" + SearchString + "\'...")
284 | PhishstatsExtractor(phishstats_file, SearchString, LOG, SQL, TABLEname, PROXY, UAFILE)
285 |
286 |             # Delete the feed file if we don't want to keep it
287 | if ConfPHISHSTATS_keep is not True:
288 | DeletePhishstatsFile(phishstats_file, LOG)
289 | else:
290 | pass
291 |         # if sys.exit() from Phishstats module
292 | except SystemExit:
293 | pass
294 | except:
295 | err = sys.exc_info()
296 | LOG.error("Phishstats module error: " + str(err))
297 | else:
298 | pass
299 |
300 | ############################
301 | # Phishing.Database module #
302 | ############################
303 | ModulePhishingDB = CONF.PHISHINGDB_active
304 | if ModulePhishingDB is True:
305 | from modules.phishingdb import PhishingDBOSINT, PhishingDBExtractor, DeletePhishingDBFile
306 | ConfPHISHINGDB_url = CONF.PHISHINGDB_url
307 | ConfPHISHINGDB_keep = CONF.PHISHINGDB_keep
308 |
309 | try:
310 |             # Get the Phishing.Database free feed (if the cached copy is older than 2 hours)
311 | phishingdb_file = ""
312 | filelist = glob.glob(SrcDir + "phishingdb-feed-*.txt")
313 | if filelist:
314 | last_phishingdb_file = max(filelist, key=os.path.getctime)
315 | if os.stat(last_phishingdb_file).st_mtime < time.time() - 7200:
316 | # file older than 2 hours, download a new one
317 | phishingdb_file = SrcDir + "phishingdb-feed-" + time.strftime("%Y%m%d-%H%M") + ".txt"
318 | PhishingDBOSINT(phishingdb_file, ConfPHISHINGDB_url, ConfPHISHINGDB_keep, SrcDir, PROXY, LOG)
319 | else:
320 |                     LOG.info("Phishing.Database\'s file still exists (<2h). Proceeding to extraction...")
321 | phishingdb_file = last_phishingdb_file
322 | else:
323 | phishingdb_file = SrcDir + "phishingdb-feed-" + time.strftime("%Y%m%d-%H%M") + ".txt"
324 | PhishingDBOSINT(phishingdb_file, ConfPHISHINGDB_url, ConfPHISHINGDB_keep, SrcDir, PROXY, LOG)
325 |
326 | for SearchString in SearchString_list:
327 | # Search into file
328 | LOG.info("Searching for \'" + SearchString + "\'...")
329 | PhishingDBExtractor(phishingdb_file, SearchString, LOG, SQL, TABLEname, PROXY, UAFILE)
330 |
331 |             # Delete the feed file if we don't want to keep it
332 | if ConfPHISHINGDB_keep is not True:
333 | DeletePhishingDBFile(phishingdb_file, LOG)
334 | else:
335 | pass
336 |
337 | except:
338 | err = sys.exc_info()
339 |             LOG.error("Phishing.Database module error: " + str(err))
340 | else:
341 | pass
342 |
343 | # Try to download phishing kit sources
344 | def TryDLPK(TABLEname, InvTABLEname, DLDir, SQL, PROXY, LOG, UAFILE):
345 | from tools.download import TryPKDownload
346 | # Search in main Table for StillTryDownload column
347 | rows = SQL.SQLiteSearchNotDownloaded(TABLEname)
348 | try:
349 | for row in rows:
350 | siteDomain = row[1]
351 | IPaddress = row[2]
352 | if IPaddress:
353 | rASN = NetInfo()
354 | if rASN.GetASN(IPaddress):
355 | ASN = rASN.GetASN(IPaddress).strip('\"')
356 | else:
357 | ASN = None
358 | else:
359 | ASN = None
360 |             if row[0].startswith('https'):
361 |                 siteURL = row[0]
362 |             elif row[0].startswith('http'):
363 |                 siteURL = row[0]
364 |             else:
365 |                 siteURL = 'http://' + row[0]
366 | TryPKDownload(siteURL, siteDomain, IPaddress, TABLEname, InvTABLEname, DLDir, SQL, PROXY, LOG, UAFILE, ASN)
367 | except:
368 | err = sys.exc_info()
369 | LOG.error("TryDLPK module error: " + str(err))
370 |
371 |
372 | # Config file read
373 | def ConfAnalysis(ConfFile):
374 | global UA
375 | global UAFILE
376 | global CONF
377 | global DBfile
378 | global DBDir
379 | global SrcDir
380 | global DLDir
381 | global PROXY
382 | global TABLEname
383 | global InvTABLEname
384 | global SearchString
385 | global LogConf
386 | global LogDir
387 | global LogFile
388 | global LOG
389 |
390 | try:
391 | CONF = ConfParser(ConfFile)
392 | P = VerifyPath()
393 |
394 | # Database stuff
395 | DBfile = CONF.DBfile
396 | TABLEname = CONF.TABLEname
397 | InvTABLEname = CONF.InvestigTABLEname
398 |
399 | # Path stuff
400 | SrcDir = CONF.SrcDir
401 | P.VerifyOrCreate(SrcDir)
402 | DBDir = CONF.DatabaseDir
403 | P.VerifyOrCreate(DBDir)
404 | DLDir = CONF.DLDir
405 | P.VerifyOrCreate(DLDir)
406 |
407 | # Connection stuff
408 | PROXY = CONF.http_proxy
409 | UA = CONF.http_UA
410 | UAFILE = CONF.UAfile
411 |
412 | # Search stuff
413 | if SearchUString:
414 | SearchString = SearchUString
415 | else:
416 | SearchString = CONF.SearchString
417 |
418 | # Logging stuff
419 | LogConf = CONF.LogConf
420 | LogDir = CONF.LogDir
421 | P.VerifyOrCreate(LogDir)
422 | LogFile = CONF.LogFile
423 | llog = LogDir + LogFile
424 | LOG = Logger(llog)
425 |
426 | except:
427 | err = sys.exc_info()
428 | LOG.error("ConfAnalysis error " + str(err))
429 |
430 |
431 | # Main
432 | def main():
433 | global SQL
434 | try:
435 | # Config
436 | ConfAnalysis(ConfFile)
437 |
438 | # Output options
439 | P = VerifyPath()
440 | LOG.info("Configuration file to use: " + ConfFile)
441 | LOG.info("Database: " + DBfile)
442 | SQL = SqliteCmd(DBfile)
443 | LOG.info("Main table: " + TABLEname)
444 | SQL.SQLiteCreateTable(TABLEname)
445 | LOG.info("Investigation table: " + InvTABLEname)
446 | SQL.SQLiteInvestigCreateTable(InvTABLEname)
447 | LOG.info("Files directory: " + SrcDir)
448 | LOG.info("Download directory: " + DLDir)
449 | LOG.info("Declared Proxy: " + str(PROXY) + "\n")
450 |
451 | # Test proxy connection
452 | if PROXY:
453 | proxystring = PROXY.split('//')[1]
454 | proxyipadd = proxystring.split(':')[0]
455 | proxyport = proxystring.split(':')[1]
456 | s = socket.socket()
457 | try:
458 | s.connect((proxyipadd, int(proxyport)))
459 | except:
460 | LOG.error("Proxy connection error, exiting!")
461 | os._exit(1)
462 | else:
463 | pass
464 |
465 | # Only add URL into Database
466 | if UniqueURL == "YES":
467 | LOG.info("Add URL into database: {}".format(URLadd))
468 | AddUniqueURL(URLadd, LOG, SQL, TABLEname, PROXY, UAFILE)
469 | sys.stdout.flush()
470 | os._exit(0)
471 | else:
472 | pass
473 |
474 | # Modules launch
475 | if OSINTsources == "YES":
476 | LaunchModules(SearchString)
477 | else:
478 | pass
479 |
480 | # Phishing Kit download launch if activated
481 | if DLPhishingKit == "YES":
482 | LOG.info("Starting trying to download phishing kits sources...")
483 | TryDLPK(TABLEname, InvTABLEname, DLDir, SQL, PROXY, LOG, UAFILE)
484 | else:
485 | pass
486 |
487 | except KeyboardInterrupt:
488 | LOG.info("Shutdown requested...exiting")
489 | os._exit(0)
490 |
491 | except:
492 | err = sys.exc_info()
493 | LOG.error("Main error " + str(err))
494 |
495 |
496 | # Start
497 | if __name__ == '__main__':
498 | banner()
499 | args_parse()
500 | main()
501 |
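Each OSINT feed module above repeats the same caching pattern: reuse the newest `*-feed-*` file when it is younger than two hours, otherwise download a fresh, timestamped copy. A minimal standalone sketch of that pattern (the real modules additionally handle keep/delete options, proxies, and logging):

```python
import glob
import os
import time

MAX_AGE_SECONDS = 7200  # two hours, matching the modules above


def pick_feed_file(src_dir, prefix, ext, downloader):
    """Return a feed file path, calling downloader(path) only when the
    newest cached copy is missing or older than MAX_AGE_SECONDS."""
    candidates = glob.glob(os.path.join(src_dir, prefix + "-feed-*" + ext))
    if candidates:
        newest = max(candidates, key=os.path.getctime)
        if time.time() - os.stat(newest).st_mtime < MAX_AGE_SECONDS:
            return newest  # cache hit: reuse the recent feed file
    fresh = os.path.join(
        src_dir, prefix + "-feed-" + time.strftime("%Y%m%d-%H%M") + ext)
    downloader(fresh)  # e.g. OpenphishOSINT(...) in the real module
    return fresh
```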
--------------------------------------------------------------------------------
/stalkphish/conf/example.conf:
--------------------------------------------------------------------------------
1 | ###################################
2 | # StalkPhish's main configuration #
3 | ###################################
4 |
5 | [SEARCH]
6 | # Keywords to search for in external sources (comma-separated)
7 | search = webmail,secure,email
8 |
9 | [PATHS]
10 | # Logging
11 | log_conf = ./conf/logging.conf
12 | log_dir = ./log/
13 | log_file = stalkphish.log
14 |
15 | # Where you download Phishing kits
16 | Kits_download_Dir = ./dl/
17 |
18 | # Where you download external source files to parse
19 | Ext_src_Files = ./files/
20 |
21 | [DATABASE]
22 | # Where you store your Databases
23 | Databases_files = ./db
24 | sqliteDB_filename = %(Databases_files)s/StalkPhish.sqlite3
25 | sqliteDB_tablename = StalkPhish
26 | sqliteDB_Investig_tablename = StalkPhishInvestig
27 |
28 | [CONNECT]
29 | # http_proxy:
30 | # (optional) Declare an HTTP proxy to use for HTTP GET requests
31 | # (comment out the 'http_proxy' line if you don't want to use a proxy)
32 | # ex: http_proxy = http://127.0.0.1:8080 for an HTTP proxy server
33 | # ex: http_proxy = socks5://127.0.0.1:9050 for a SOCKS5 proxy server
34 | http_proxy = socks5://127.0.0.1:9050
35 |
36 | # StalkPhish's default user-agent (don't remove):
37 | http_UA = Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/61.0.3163.91 Safari/537.36
38 | # HTTP user-agent list file used for phishing kit HTTP GET requests
39 | UAfile = ./useragent_list.txt
40 |
41 | ########################
42 | # OSINT Search Modules #
43 | ########################
44 |
45 | [URLSCAN]
46 | # urlscan.io search API
47 | activate = yes
48 | API_url = https://urlscan.io/api/v1/search/
49 | API_key =
50 |
51 | [URLQUERY]
52 | # urlquery.net search web crawler
53 | activate = yes
54 | OSINT_url = https://urlquery.net/search
55 |
56 | [PHISHTANK]
57 | # Phishtank OSINT feed
58 | activate = yes
59 | OSINT_url = https://data.phishtank.com/data/online-valid.json
60 | keep_files = no
61 | API_key =
62 |
63 | [OPENPHISH]
64 | # Openphish OSINT feed
65 | activate = yes
66 | OSINT_url = https://www.openphish.com/feed.txt
67 | keep_files = no
68 |
69 | [PHISHSTATS]
70 | # Phishstats search API
71 | activate = yes
72 | OSINT_url = https://phishstats.info:2096/api/phishing?_where=
73 | keep_files = no
74 |
75 | [Phishing.Database]
76 | # Phishing.Database OSINT feed (raw mode)
77 | activate = yes
78 | OSINT_url = https://raw.githubusercontent.com/mitchellkrogza/Phishing.Database/master/phishing-links-NEW-today.txt
79 | keep_files = no
80 |
81 |
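A minimal sketch of how the configuration above can be read with the standard library `configparser` (an illustration, not the project's actual `ConfParser` class): the default `BasicInterpolation` resolves the `%(Databases_files)s` reference in `[DATABASE]`, and the `host:port` split mirrors the proxy check in `StalkPhish.py`:

```python
import configparser


def load_settings(path):
    """Read a StalkPhish-style config file into a plain dict (sketch only)."""
    conf = configparser.ConfigParser()  # BasicInterpolation expands %()s refs
    conf.read(path)
    settings = {
        "search": [s.strip() for s in conf["SEARCH"]["search"].split(",")],
        "db_file": conf["DATABASE"]["sqliteDB_filename"],
        "proxy": conf["CONNECT"].get("http_proxy"),  # None if commented out
    }
    if settings["proxy"]:
        # e.g. "socks5://127.0.0.1:9050" -> ("127.0.0.1", "9050")
        host_port = settings["proxy"].split("//")[1]
        host, _, port = host_port.partition(":")
        settings["proxy_host"], settings["proxy_port"] = host, port
    return settings
```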
--------------------------------------------------------------------------------
/stalkphish/mobiles_useragent_list.txt:
--------------------------------------------------------------------------------
1 | Mozilla/5.0 (iPhone; CPU iPhone OS 8_4_1 like Mac OS X) AppleWebKit/600.1.4 (KHTML, like Gecko) Version/8.0 Mobile/12H321 Safari/600.1.4
2 | Mozilla/5.0 (Linux; U; Android 4.0.3; en-us; KFTT Build/IML74K) AppleWebKit/537.36 (KHTML, like Gecko) Silk/3.68 like Chrome/39.0.2171.93 Safari/537.36
3 | Mozilla/5.0 (Linux; U; Android 4.4.3; en-us; KFTHWI Build/KTU84M) AppleWebKit/537.36 (KHTML, like Gecko) Silk/3.68 like Chrome/39.0.2171.93 Safari/537.36
4 | Mozilla/5.0 (iPhone; CPU iPhone OS 8_4 like Mac OS X) AppleWebKit/600.1.4 (KHTML, like Gecko) Version/8.0 Mobile/12H143 Safari/600.1.4
5 | Mozilla/5.0 (Linux; U; Android 4.4.3; en-us; KFASWI Build/KTU84M) AppleWebKit/537.36 (KHTML, like Gecko) Silk/3.68 like Chrome/39.0.2171.93 Safari/537.36
6 | Mozilla/5.0 (iPhone; CPU iPhone OS 8_3 like Mac OS X) AppleWebKit/600.1.4 (KHTML, like Gecko) Version/8.0 Mobile/12F70 Safari/600.1.4
7 | Mozilla/5.0 (Linux; U; Android 4.0.4; en-us; KFJWI Build/IMM76D) AppleWebKit/537.36 (KHTML, like Gecko) Silk/3.68 like Chrome/39.0.2171.93 Safari/537.36
8 | Mozilla/5.0 (Linux; U; Android 4.4.3; en-us; KFSOWI Build/KTU84M) AppleWebKit/537.36 (KHTML, like Gecko) Silk/3.68 like Chrome/39.0.2171.93 Safari/537.36
9 | Mozilla/5.0 (Linux; U; Android 4.4.3; en-us; KFAPWI Build/KTU84M) AppleWebKit/537.36 (KHTML, like Gecko) Silk/3.68 like Chrome/39.0.2171.93 Safari/537.36
10 | Mozilla/5.0 (Linux; U; Android 4.0.3; en-us; KFOT Build/IML74K) AppleWebKit/537.36 (KHTML, like Gecko) Silk/3.68 like Chrome/39.0.2171.93 Safari/537.36
11 | Mozilla/5.0 (Linux; U; Android 4.4.3; en-us; KFARWI Build/KTU84M) AppleWebKit/537.36 (KHTML, like Gecko) Silk/3.68 like Chrome/39.0.2171.93 Safari/537.36
12 | Mozilla/5.0 (iPhone; CPU iPhone OS 7_0 like Mac OS X) AppleWebKit/537.51.1 (KHTML, like Gecko) Version/7.0 Mobile/11A465 Safari/9537.53 BingPreview/1.0b
13 | Mozilla/5.0 (Android; Tablet; rv:40.0) Gecko/40.0 Firefox/40.0
14 | Mozilla/5.0 (iPhone; CPU iPhone OS 7_1_2 like Mac OS X) AppleWebKit/537.51.2 (KHTML, like Gecko) Version/7.0 Mobile/11D257 Safari/9537.53
15 | Mozilla/5.0 (Linux; U; Android 4.4.3; en-us; KFSAWI Build/KTU84M) AppleWebKit/537.36 (KHTML, like Gecko) Silk/3.68 like Chrome/39.0.2171.93 Safari/537.36
16 | Mozilla/5.0 (Linux; Android 4.4.2; SM-T230NU Build/KOT49H) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
17 | Mozilla/5.0 (Linux; U; Android 4.0.4; en-us; KFJWA Build/IMM76D) AppleWebKit/537.36 (KHTML, like Gecko) Silk/3.68 like Chrome/39.0.2171.93 Safari/537.36
18 | Mozilla/5.0 (Linux; Android 4.0.4; BNTV600 Build/IMM76L) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/42.0.2311.111 Safari/537.36
19 | Mozilla/5.0 (iPhone; CPU iPhone OS 8_1_2 like Mac OS X) AppleWebKit/600.1.4 (KHTML, like Gecko) Version/8.0 Mobile/12B440 Safari/600.1.4
20 | Mozilla/5.0 (Linux; Android 5.0.2; SM-T530NU Build/LRX22G) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
21 | Mozilla/5.0 (iPhone; CPU iPhone OS 8_1_3 like Mac OS X) AppleWebKit/600.1.4 (KHTML, like Gecko) Version/8.0 Mobile/12B466 Safari/600.1.4
22 | Mozilla/5.0 (iPhone; CPU iPhone OS 8_2 like Mac OS X) AppleWebKit/600.1.4 (KHTML, like Gecko) Version/8.0 Mobile/12D508 Safari/600.1.4
23 | Mozilla/5.0 (Linux; Android 5.0; SM-G900V Build/LRX21T) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Mobile Safari/537.36
24 | Mozilla/5.0 (Linux; Android 5.1.1; Nexus 7 Build/LMY48I) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
25 | Mozilla/5.0 (Linux; Android 5.0.2; SM-T800 Build/LRX22G) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
26 | Mozilla/5.0 (Linux; Android 5.0.1; SAMSUNG SCH-I545 4G Build/LRX22C) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/2.1 Chrome/34.0.1847.76 Mobile Safari/537.36
27 | Mozilla/5.0 (iPhone; CPU iPhone OS 8_4_1 like Mac OS X) AppleWebKit/600.1.4 (KHTML, like Gecko) CriOS/45.0.2454.68 Mobile/12H321 Safari/600.1.4
28 | Mozilla/5.0 (Linux; Android 5.0; SAMSUNG-SM-G900A Build/LRX21T) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/2.1 Chrome/34.0.1847.76 Mobile Safari/537.36
29 | Mozilla/5.0 (Linux; U; Android 4.0.3; en-gb; KFTT Build/IML74K) AppleWebKit/537.36 (KHTML, like Gecko) Silk/3.68 like Chrome/39.0.2171.93 Safari/537.36
30 | Mozilla/5.0 (Linux; Android 4.0.4; BNTV400 Build/IMM76L) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/42.0.2311.111 Safari/537.36
31 | Mozilla/5.0 (Linux; Android 5.0; SAMSUNG SM-G900P Build/LRX21T) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/2.1 Chrome/34.0.1847.76 Mobile Safari/537.36
32 | Mozilla/5.0 (iPhone; CPU iPhone OS 7_1_1 like Mac OS X) AppleWebKit/537.51.2 (KHTML, like Gecko) Version/7.0 Mobile/11D201 Safari/9537.53
33 | Mozilla/5.0 (Linux; U; Android 4.4.2; en-us; LG-V410/V41010d Build/KOT49I.V41010d) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/30.0.1599.103 Safari/537.36
34 | Mozilla/5.0 (iPhone; CPU iPhone OS 8_1 like Mac OS X) AppleWebKit/600.1.4 (KHTML, like Gecko) Version/8.0 Mobile/12B411 Safari/600.1.4
35 | Mozilla/5.0 (Linux; Android 4.4.2; SM-T320 Build/KOT49H) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
36 | Mozilla/5.0 (iPhone; CPU iPhone OS 8_4_1 like Mac OS X) AppleWebKit/600.1.4 (KHTML, like Gecko) GSA/7.0.55539 Mobile/12H321 Safari/600.1.4
37 | Mozilla/5.0 (Linux; Android 5.0.2; LG-V410/V41020c Build/LRX22G) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/34.0.1847.118 Safari/537.36
38 | Mozilla/5.0 (Linux; U; Android 4.4.3; en-us; KFTHWA Build/KTU84M) AppleWebKit/537.36 (KHTML, like Gecko) Silk/3.68 like Chrome/39.0.2171.93 Safari/537.36
39 | Mozilla/5.0 (Android; Mobile; rv:40.0) Gecko/40.0 Firefox/40.0
40 | Mozilla/5.0 (Linux; Android 4.4.2; SM-P600 Build/KOT49H) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
41 | Mozilla/5.0 (Linux; Android 5.0; SAMSUNG SM-N900V 4G Build/LRX21V) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/2.1 Chrome/34.0.1847.76 Mobile Safari/537.36
42 | Mozilla/5.0 (Linux; Android 4.4.3; KFTHWI Build/KTU84M) AppleWebKit/537.36 (KHTML, like Gecko) Silk/44.1.81 like Chrome/44.0.2403.128 Safari/537.36
43 | Mozilla/5.0 (Linux; Android 4.4.2; GT-P5210 Build/KOT49H) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
44 | Mozilla/5.0 (Linux; Android 4.4.2; QTAQZ3 Build/KOT49H) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
45 | Mozilla/5.0 (Linux; Android 4.4.2; QMV7B Build/KOT49H) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
46 | Mozilla/5.0 (iPhone; CPU iPhone OS 8_4_1 like Mac OS X) AppleWebKit/600.1.4 (KHTML, like Gecko) Mobile/12H321
47 | Mozilla/5.0 (Linux; U; Android 4.0.3; en-ca; KFTT Build/IML74K) AppleWebKit/537.36 (KHTML, like Gecko) Silk/3.68 like Chrome/39.0.2171.93 Safari/537.36
48 | Mozilla/5.0 (Linux; Android 5.0.1; SAMSUNG SM-N910V 4G Build/LRX22C) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/2.1 Chrome/34.0.1847.76 Mobile Safari/537.36
49 | Mozilla/5.0 (Linux; Android 5.0.2; SAMSUNG SM-T530NU Build/LRX22G) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/3.2 Chrome/38.0.2125.102 Safari/537.36
50 | Mozilla/5.0 (Linux; Android 5.0.2; SM-T700 Build/LRX22G) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
51 | Mozilla/5.0 (Linux; Android 5.0.1; SAMSUNG-SM-N910A Build/LRX22C) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/2.1 Chrome/34.0.1847.76 Mobile Safari/537.36
52 | Mozilla/5.0 (Linux; Android 5.0.2; VK810 4G Build/LRX22G) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
53 | Mozilla/5.0 (Linux; Android 5.1.1; Nexus 7 Build/LMY47V) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
54 | Mozilla/5.0 (Linux; Android 5.1.1; SM-G920V Build/LMY47X) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Mobile Safari/537.36
55 | Mozilla/5.0 (Linux; Android 4.4.2; SM-T520 Build/KOT49H) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
56 | Mozilla/5.0 (Linux; Android 4.4.2; SM-T900 Build/KOT49H) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
57 | Mozilla/5.0 (Linux; Android 4.1.2; GT-N8013 Build/JZO54K) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
58 | Mozilla/5.0 (Linux; U; Android 4.4.3; en-us; KFAPWA Build/KTU84M) AppleWebKit/537.36 (KHTML, like Gecko) Silk/3.68 like Chrome/39.0.2171.93 Safari/537.36
59 | Mozilla/5.0 (Linux; Android 5.0.1; SM-N910V Build/LRX22C) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Mobile Safari/537.36
60 | Mozilla/5.0 (iPhone; CPU iPhone OS 8_1_1 like Mac OS X) AppleWebKit/600.1.4 (KHTML, like Gecko) Version/8.0 Mobile/12B436 Safari/600.1.4
61 | Mozilla/5.0 (iPhone; CPU iPhone OS 8_0_2 like Mac OS X) AppleWebKit/600.1.4 (KHTML, like Gecko) Version/8.0 Mobile/12A405 Safari/600.1.4
62 | Mozilla/5.0 (Linux; Android 4.4.2; SM-T310 Build/KOT49H) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
63 | Mozilla/5.0 (Linux; Android 5.1.1; Nexus 10 Build/LMY48I) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
64 | Mozilla/5.0 (Linux; Android 4.4.2; QMV7A Build/KOT49H) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
65 | Mozilla/5.0 (iPhone; CPU iPhone OS 7_0_4 like Mac OS X) AppleWebKit/537.51.1 (KHTML, like Gecko) Version/7.0 Mobile/11B554a Safari/9537.53
66 | Mozilla/5.0 (Linux; Android 5.0; SAMSUNG-SM-N900A Build/LRX21V) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/2.1 Chrome/34.0.1847.76 Mobile Safari/537.36
67 | Mozilla/5.0 (Linux; Android 4.4.4; XT1080 Build/SU6-7.2) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Mobile Safari/537.36
68 | Mozilla/5.0 (Linux; Android 5.1.1; SAMSUNG SM-N910P Build/LMY47X) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/2.1 Chrome/34.0.1847.76 Mobile Safari/537.36
69 | Mozilla/5.0 (Linux; Android 5.0.1; LGLK430 Build/LRX21Y) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/38.0.2125.102 Safari/537.36
70 | Mozilla/5.0 (Linux; Android 4.4.2; SM-T217S Build/KOT49H) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
71 | Mozilla/5.0 (Linux; Android 5.1; XT1254 Build/SU3TL-39) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Mobile Safari/537.36
72 | Mozilla/5.0 (Linux; Android 5.0.1; SAMSUNG-SGH-I337 Build/LRX22C) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/2.1 Chrome/34.0.1847.76 Mobile Safari/537.36
73 | Mozilla/5.0 (Linux; Android 4.4.3; KFASWI Build/KTU84M) AppleWebKit/537.36 (KHTML, like Gecko) Silk/44.1.81 like Chrome/44.0.2403.128 Safari/537.36
74 | Mozilla/5.0 (Linux; Android 5.0.2; SAMSUNG SM-T800 Build/LRX22G) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/3.0 Chrome/38.0.2125.102 Safari/537.36
75 | Mozilla/5.0 (Linux; Android 5.0; SM-G900V Build/LRX21T) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/44.0.2403.133 Mobile Safari/537.36
76 | Mozilla/5.0 (iPhone; CPU iPhone OS 8_4_1 like Mac OS X) AppleWebKit/600.1.4 (KHTML, like Gecko) GSA/8.0.57838 Mobile/12H321 Safari/600.1.4
77 | Mozilla/5.0 (iPhone; CPU iPhone OS 7_1 like Mac OS X) AppleWebKit/537.51.2 (KHTML, like Gecko) Version/7.0 Mobile/11D167 Safari/9537.53
78 | Mozilla/5.0 (Linux; Android 4.4.2; GT-N5110 Build/KOT49H) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
79 | Mozilla/5.0 (iPhone; CPU iPhone OS 9_0 like Mac OS X) AppleWebKit/601.1.46 (KHTML, like Gecko) Version/9.0 Mobile/13A4325c Safari/601.1
80 | Mozilla/5.0 (Linux; Android 4.4.2; RCT6203W46 Build/KOT49H) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/30.0.0.0 Safari/537.36
81 | Mozilla/5.0 (Linux; Android 4.4.4; en-us; SAMSUNG SM-N910T Build/KTU84P) AppleWebKit/537.36 (KHTML, like Gecko) Version/2.0 Chrome/34.0.1847.76 Mobile Safari/537.36
82 | Mozilla/5.0 (Linux; Android 4.4.2; RCT6203W46 Build/KOT49H) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
83 | Mozilla/5.0 (Linux; U; Android 4.0.4; en-ca; KFJWI Build/IMM76D) AppleWebKit/537.36 (KHTML, like Gecko) Silk/3.68 like Chrome/39.0.2171.93 Safari/537.36
84 | Mozilla/5.0 (Linux; Android 4.4.2; RCT6773W22 Build/KOT49H) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
85 | Mozilla/5.0 (Linux; Android 5.0; SAMSUNG-SM-G870A Build/LRX21T) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/2.1 Chrome/34.0.1847.76 Mobile Safari/537.36
86 | Mozilla/5.0 (Linux; Android 4.4.3; KFSOWI Build/KTU84M) AppleWebKit/537.36 (KHTML, like Gecko) Silk/44.1.81 like Chrome/44.0.2403.128 Safari/537.36
87 | Mozilla/5.0 (Linux; Android 5.1.1; SAMSUNG SM-G920P Build/LMY47X) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/3.2 Chrome/38.0.2125.102 Mobile Safari/537.36
88 | Mozilla/5.0 (Linux; Android 5.0.2; SM-T550 Build/LRX22G) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
89 | Mozilla/5.0 (Linux; U; Android 4.0.3; en-gb; KFOT Build/IML74K) AppleWebKit/537.36 (KHTML, like Gecko) Silk/3.68 like Chrome/39.0.2171.93 Safari/537.36
90 | Mozilla/5.0 (Linux; Android 5.0.2; SM-P900 Build/LRX22G) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
91 | Mozilla/5.0 (Linux; Android 5.1.1; Nexus 9 Build/LMY48I) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
92 | Mozilla/5.0 (Linux; Android 4.4.2; SM-T530NU Build/KOT49H) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
93 | Mozilla/5.0 (Linux; Android 5.1.1; SM-T330NU Build/LMY47X) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
94 | Mozilla/5.0 (Android; Tablet; rv:34.0) Gecko/34.0 Firefox/34.0
95 | Mozilla/5.0 (Linux; Android 4.4.2; RCT6773W22 Build/KOT49H) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/30.0.0.0 Safari/537.36
96 | Mozilla/5.0 (Linux; Android 5.0; SAMSUNG-SM-G900A Build/LRX21T) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Mobile Safari/537.36
97 | Mozilla/5.0 (Linux; Android 4.4.2; SM-T210R Build/KOT49H) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
98 | Mozilla/5.0 (Linux; Android 5.0; SAMSUNG SM-N900P Build/LRX21V) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/2.1 Chrome/34.0.1847.76 Mobile Safari/537.36
99 | Mozilla/5.0 (Linux; Android 5.0.2; SM-T350 Build/LRX22G) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
100 | Mozilla/5.0 (Linux; Android 5.0.2; SM-T530NU Build/LRX22G) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/44.0.2403.133 Safari/537.36
101 | Mozilla/5.0 (Linux; Android 5.0.2; SAMSUNG-SM-G920A Build/LRX22G) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/3.0 Chrome/38.0.2125.102 Mobile Safari/537.36
102 | Mozilla/5.0 (Linux; Android 4.4.2; QTAQZ3 Build/KOT49H) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/36.0.1985.135 Safari/537.36
103 | Mozilla/5.0 (Linux; Android 5.0.1; SCH-I545 Build/LRX22C) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Mobile Safari/537.36
104 | Mozilla/5.0 (Linux; Android 5.0; SM-G900P Build/LRX21T) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Mobile Safari/537.36
105 | Mozilla/5.0 (iPhone; CPU iPhone OS 8_0 like Mac OS X) AppleWebKit/600.1.4 (KHTML, like Gecko) Version/8.0 Mobile/12A365 Safari/600.1.4
106 | Mozilla/5.0 (Linux; Android 4.4.3; KFAPWI Build/KTU84M) AppleWebKit/537.36 (KHTML, like Gecko) Silk/44.1.81 like Chrome/44.0.2403.128 Safari/537.36
107 | Mozilla/5.0 (Linux; Android 5.0.1; VS985 4G Build/LRX21Y) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Mobile Safari/537.36
108 | Mozilla/5.0 (iPhone; CPU iPhone OS 8_4 like Mac OS X) AppleWebKit/600.1.4 (KHTML, like Gecko) CriOS/45.0.2454.68 Mobile/12H143 Safari/600.1.4
109 | Mozilla/5.0 (Linux; Android 5.0.2; LG-V410/V41020b Build/LRX22G) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/34.0.1847.118 Safari/537.36
110 | Mozilla/5.0 (iPhone; CPU iPhone OS 8_1_1 like Mac OS X) AppleWebKit/600.1.4 (KHTML, like Gecko) Version/8.0 Mobile/12B435 Safari/600.1.4
111 | Mozilla/5.0 (Linux; Android 5.1.1; SAMSUNG SM-G920T Build/LMY47X) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/3.2 Chrome/38.0.2125.102 Mobile Safari/537.36
112 | Mozilla/5.0 (Linux; Android 4.4.3; KFTHWI Build/KTU84M) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/34.0.0.0 Safari/537.36
113 | Mozilla/5.0 (Linux; Android 4.4.3; KFSAWI Build/KTU84M) AppleWebKit/537.36 (KHTML, like Gecko) Silk/44.1.81 like Chrome/44.0.2403.128 Safari/537.36
114 | Mozilla/5.0 (Linux; Android 4.4.2; SM-T230NU Build/KOT49H) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/44.0.2403.133 Safari/537.36
115 | Mozilla/5.0 (Linux; Android 4.2.2; SM-T110 Build/JDQ39) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
116 | Mozilla/5.0 (Linux; Android 5.0.1; SAMSUNG SM-N910T Build/LRX22C) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/2.1 Chrome/34.0.1847.76 Mobile Safari/537.36
117 | Mozilla/5.0 (Linux; Android 4.4.2; SM-T330NU Build/KOT49H) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
118 | Mozilla/5.0 (Linux; Android 5.0.2; LG-V410 Build/LRX22G) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
119 | Mozilla/5.0 (Linux; Android 4.4.2; SM-T237P Build/KOT49H) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
120 | Mozilla/5.0 (Linux; Android 5.0.2; SM-T800 Build/LRX22G) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/44.0.2403.133 Safari/537.36
121 | Mozilla/5.0 (Linux; U; Android 4.4.2; en-us; LGMS323 Build/KOT49I.MS32310c) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/30.0.1599.103 Mobile Safari/537.36
122 | Mozilla/5.0 (Linux; U; Android 4.4.3; en-gb; KFTHWI Build/KTU84M) AppleWebKit/537.36 (KHTML, like Gecko) Silk/3.68 like Chrome/39.0.2171.93 Safari/537.36
123 | Mozilla/5.0 (Linux; Android 5.0.1; SAMSUNG SPH-L720 Build/LRX22C) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/2.1 Chrome/34.0.1847.76 Mobile Safari/537.36
124 | Mozilla/5.0 (Linux; U; Android 4.4.3; en-us; KFSAWA Build/KTU84M) AppleWebKit/537.36 (KHTML, like Gecko) Silk/3.68 like Chrome/39.0.2171.93 Safari/537.36
125 | Mozilla/5.0 (Linux; Android 4.4.4; Z970 Build/KTU84P) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/30.0.0.0 Mobile Safari/537.36
126 | Mozilla/5.0 (Linux; Android 5.1.1; Nexus 5 Build/LMY48I) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Mobile Safari/537.36
127 | Mozilla/5.0 (iPhone; CPU iPhone OS 6_1_3 like Mac OS X) AppleWebKit/536.26 (KHTML, like Gecko) Version/6.0 Mobile/10B329 Safari/8536.25
128 | Mozilla/5.0 (Linux; Android 5.1.1; SAMSUNG SM-G925T Build/LMY47X) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/3.2 Chrome/38.0.2125.102 Mobile Safari/537.36
129 | Mozilla/5.0 (Linux; Android 4.2.2; GT-P5113 Build/JDQ39) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
130 | Mozilla/5.0 (Linux; Android 4.4.3; KFARWI Build/KTU84M) AppleWebKit/537.36 (KHTML, like Gecko) Silk/44.1.81 like Chrome/44.0.2403.128 Safari/537.36
131 | Mozilla/5.0 (Linux; Android 4.4.2; LG-V410 Build/KOT49I.V41010d) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
132 | Mozilla/5.0 (iPod touch; CPU iPhone OS 8_4_1 like Mac OS X) AppleWebKit/600.1.4 (KHTML, like Gecko) Version/8.0 Mobile/12H321 Safari/600.1.4
133 | Mozilla/5.0 (Linux; U; Android 4.0.3; en-ca; KFOT Build/IML74K) AppleWebKit/537.36 (KHTML, like Gecko) Silk/3.68 like Chrome/39.0.2171.93 Safari/537.36
134 | Mozilla/5.0 (Linux; Android 4.2.2; Le Pan TC802A Build/JDQ39) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
135 | Mozilla/5.0 (iPhone; CPU iPhone OS 7_0_6 like Mac OS X) AppleWebKit/537.51.1 (KHTML, like Gecko) Version/7.0 Mobile/11B651 Safari/9537.53
136 | Mozilla/5.0 (iPad; CPU OS 8_4_1 like Mac OS X) AppleWebKit/600.1.4 (KHTML, like Gecko) Mobile/12H321 [FBAN/FBIOS;FBAV/38.0.0.6.79;FBBV/14316658;FBDV/iPad4,1;FBMD/iPad;FBSN/iPhone OS;FBSV/8.4.1;FBSS/2; FBCR/;FBID/tablet;FBLC/en_US;FBOP/1]
137 | Mozilla/5.0 (Linux; Android 4.4.4; Nexus 7 Build/KTU84P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.84 Safari/537.36
138 | Mozilla/5.0 (Linux; Android 4.2.2; QMV7B Build/JDQ39) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/34.0.1847.114 Safari/537.36
139 | Mozilla/5.0 (Linux; U; Android 4.0.3; en-us) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/31.0.1650.59 Mobile Safari/537.36
140 | Mozilla/5.0 (Linux; Android 5.0; SAMSUNG SM-N900T Build/LRX21V) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/2.1 Chrome/34.0.1847.76 Mobile Safari/537.36
141 | Mozilla/5.0 (iPhone; CPU iPhone OS 8_4 like Mac OS X) AppleWebKit/600.1.4 (KHTML, like Gecko) GSA/7.0.55539 Mobile/12H143 Safari/600.1.4
142 |
--------------------------------------------------------------------------------
/stalkphish/modules/openphish.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python3
2 | # -*- coding: utf-8 -*-
3 |
4 | # This file is part of StalkPhish - see https://github.com/t4d/StalkPhish
5 |
6 | import os
7 | import re
8 | import sys
9 | import socket
10 | import requests
11 | from os.path import dirname
12 | from urllib.parse import urlparse, quote
13 | from tools.utils import TimestampNow
14 | from tools.utils import UAgent
15 | from tools.utils import NetInfo
16 |
17 |
18 | # siteURL
19 | def SiteURLSQL(openphish_file, entry, LOG, SQL, TABLEname, PROXY, UAFILE, UAG):
20 | siteURL = quote(re.split("(?:[0-9a-fA-F]:?){32}", entry.rstrip())[0], ':/')
21 | dn = dirname(siteURL)
22 |
23 | # Test if entry already exists in DB
24 | if SQL.SQLiteVerifyEntry(TABLEname, dn) == 0:
25 | now = str(TimestampNow().Timestamp())
26 | siteDomain = urlparse(entry).netloc
27 | source_url = openphish_file
28 | try:
29 | IPaddress = socket.gethostbyname(siteDomain)
30 | if IPaddress:
31 | rASN = NetInfo()
32 | ASN = rASN.GetASN(IPaddress).strip('\"')
33 | else:
34 | pass
35 | # can't resolve
36 | except Exception:
37 | IPaddress = ""
38 | ASN = ""
39 |
40 | # HTTP connection
41 | try:
42 | proxies = {'http': PROXY, 'https': PROXY}
43 | UA = UAG.ChooseUA(UAFILE)
44 | user_agent = {'User-agent': UA}
45 | try:
46 | r = requests.get(siteURL, headers=user_agent, proxies=proxies, allow_redirects=True, timeout=(5, 12))
47 | # Follow redirect and add new URI to database
48 | if (len(r.history) > 1) and ("301" in str(r.history[-1])) and (siteURL != r.url) and (siteURL.split('/')[:-1] != r.url.split('/')[:-2]) and (siteURL + '/' != r.url):
49 | lastHTTPcode = str(r.status_code)
50 | SQL.SQLiteInsertPK(TABLEname, r.url, siteDomain, IPaddress, source_url, now, lastHTTPcode, ASN)
51 | else:
52 | pass
53 | lastHTTPcode = str(r.status_code)
54 | except ValueError:
55 | # No user-agent configured
56 | r = requests.get(siteURL, proxies=proxies, allow_redirects=True, timeout=(5, 12))
57 | lastHTTPcode = str(r.status_code)
58 | except requests.exceptions.Timeout:
59 | lastHTTPcode = "timeout"
60 | except requests.exceptions.ConnectionError:
61 | lastHTTPcode = "aborted"
62 | except Exception:
63 | lastHTTPcode = "---"
64 | err = sys.exc_info()
65 | LOG.error("HTTP error: " + str(err))
66 | pass
67 | except Exception as e:
68 | # Unknown status code
69 | LOG.error("Connection error: {}".format(e))
70 | pass
71 |
72 | # Add data into database
73 | LOG.info(siteURL)
74 | SQL.SQLiteInsertPK(TABLEname, siteURL, siteDomain, IPaddress, source_url, now, lastHTTPcode, ASN)
75 |
76 | else:
77 | LOG.debug("Entry already known: " + siteURL)
78 | pass
79 |
80 |
81 | # Openphish (Community)
82 | def OpenphishOSINT(openphish_file, ConfOPENPHISH_url, ConfOPENPHISH_keep, SrcDir, PROXY, LOG):
83 | # Get Openphish OSINT TXT file
84 | proxies = {'http': PROXY, 'https': PROXY}
85 | LOG.info("Retrieving OpenPhish's file (" + ConfOPENPHISH_url + ")... This may take several minutes...")
86 | resp = requests.get(url=ConfOPENPHISH_url, proxies=proxies, allow_redirects=True, timeout=(10, 20))
87 | with open(openphish_file, "wb") as file:
88 | file.write(resp.content)
89 | LOG.info("OpenPhish's file retrieved and saved as " + openphish_file)
90 |
91 |
92 | # Data extraction
93 | def OpenphishExtractor(openphish_file, SearchString, LOG, SQL, TABLEname, PROXY, UAFILE):
94 | UAG = UAgent()
95 | with open(openphish_file, "rt") as txt:
96 | for entry in txt:
97 | # Search
98 | if SearchString in entry:
99 | SiteURLSQL(openphish_file, entry, LOG, SQL, TABLEname, PROXY, UAFILE, UAG)
100 | else:
101 | pass
102 |
103 |
104 | # Delete OpenPhish downloaded file, or not
105 | def DeleteOpenphishFile(openphish_file, LOG):
106 | # Delete openphish_file
107 | try:
108 | os.remove(openphish_file)
109 | LOG.info("File " + openphish_file + " deleted.")
110 | except OSError:
111 | LOG.error("Can't delete " + openphish_file + " !!!")
112 | pass
113 |
--------------------------------------------------------------------------------
/stalkphish/modules/phishingdb.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python3
2 | # -*- coding: utf-8 -*-
3 |
4 | # This file is part of StalkPhish - see https://github.com/t4d/StalkPhish
5 |
6 | import os
7 | import re
8 | import sys
9 | import socket
10 | import requests
11 | from os.path import dirname
12 | from urllib.parse import urlparse, quote
13 | from tools.utils import TimestampNow
14 | from tools.utils import UAgent
15 | from tools.utils import NetInfo
16 |
17 |
18 | # siteURL
19 | def SiteURLSQL(phishingdb_file, entry, LOG, SQL, TABLEname, PROXY, UAFILE, UAG):
20 | siteURL = quote(re.split("(?:[0-9a-fA-F]:?){32}", entry.rstrip())[0], ':/')
21 | dn = dirname(siteURL)
22 |
23 | # Test if entry already exists in DB
24 | if SQL.SQLiteVerifyEntry(TABLEname, dn) == 0:
25 | now = str(TimestampNow().Timestamp())
26 | siteDomain = urlparse(entry).netloc
27 | source_url = phishingdb_file
28 | try:
29 | IPaddress = socket.gethostbyname(siteDomain)
30 | if IPaddress:
31 | rASN = NetInfo()
32 | ASN = rASN.GetASN(IPaddress).strip('\"')
33 | else:
34 | pass
35 | # can't resolve
36 | except Exception:
37 | IPaddress = ""
38 | ASN = ""
39 |
40 | # HTTP connection
41 | try:
42 | proxies = {'http': PROXY, 'https': PROXY}
43 | UA = UAG.ChooseUA(UAFILE)
44 | user_agent = {'User-agent': UA}
45 | try:
46 | r = requests.get(siteURL, headers=user_agent, proxies=proxies, allow_redirects=True, timeout=(5, 12))
47 | # Follow redirect and add new URI to database
48 | if (len(r.history) > 1) and ("301" in str(r.history[-1])) and (siteURL != r.url) and (siteURL.split('/')[:-1] != r.url.split('/')[:-2]) and (siteURL + '/' != r.url):
49 | lastHTTPcode = str(r.status_code)
50 | SQL.SQLiteInsertPK(TABLEname, r.url, siteDomain, IPaddress, source_url, now, lastHTTPcode, ASN)
51 | else:
52 | pass
53 | lastHTTPcode = str(r.status_code)
54 | except ValueError:
55 | # No user-agent configured
56 | r = requests.get(siteURL, proxies=proxies, allow_redirects=True, timeout=(5, 12))
57 | lastHTTPcode = str(r.status_code)
58 | except requests.exceptions.Timeout:
59 | lastHTTPcode = "timeout"
60 | except requests.exceptions.ConnectionError:
61 | lastHTTPcode = "aborted"
62 | except Exception:
63 | lastHTTPcode = "---"
64 | err = sys.exc_info()
65 | LOG.error("HTTP error: " + str(err))
66 | pass
67 | except Exception as e:
68 | # Unknown status code
69 | LOG.error("Connection error: {}".format(e))
70 | pass
71 |
72 | # Add data into database
73 | LOG.info(siteURL)
74 | SQL.SQLiteInsertPK(TABLEname, siteURL, siteDomain, IPaddress, source_url, now, lastHTTPcode, ASN)
75 |
76 | else:
77 | LOG.debug("Entry already known: " + siteURL)
78 | pass
79 |
80 |
81 | # PhishingDB
82 | def PhishingDBOSINT(phishingdb_file, ConfPHISHINGDB_url, ConfPHISHINGDB_keep, SrcDir, PROXY, LOG):
83 | # Get PhishingDB OSINT TXT file
84 | proxies = {'http': PROXY, 'https': PROXY}
85 | LOG.info("Retrieving Phishing.Database's file (" + ConfPHISHINGDB_url + ")... This may take several minutes...")
86 | resp = requests.get(url=ConfPHISHINGDB_url, proxies=proxies, allow_redirects=True, timeout=(10, 20))
87 | with open(phishingdb_file, "wb") as file:
88 | file.write(resp.content)
89 | LOG.info("Phishing.Database\'s file retrieved and saved as " + phishingdb_file)
90 |
91 |
92 | # Data extraction
93 | def PhishingDBExtractor(phishingdb_file, SearchString, LOG, SQL, TABLEname, PROXY, UAFILE):
94 | UAG = UAgent()
95 | with open(phishingdb_file, "rt") as txt:
96 | for entry in txt:
97 | # Search
98 | if SearchString in entry:
99 | SiteURLSQL(phishingdb_file, entry, LOG, SQL, TABLEname, PROXY, UAFILE, UAG)
100 | else:
101 | pass
102 |
103 |
104 | # Delete Phishing.Database downloaded file, or not
105 | def DeletePhishingDBFile(phishingdb_file, LOG):
106 | # Delete phishingdb_file
107 | try:
108 | os.remove(phishingdb_file)
109 | LOG.info("File " + phishingdb_file + " deleted.")
110 | except OSError:
111 | LOG.error("Can't delete " + phishingdb_file + " !!!")
112 | pass
113 |
--------------------------------------------------------------------------------
/stalkphish/modules/phishstats.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python3
2 | # -*- coding: utf-8 -*-
3 |
4 | # This file is part of StalkPhish - see https://github.com/t4d/StalkPhish
5 |
6 | import os
7 | import sys
8 | import requests
9 | import re
10 | import socket
11 | import json
12 | import cfscrape
13 | from os.path import dirname
14 | from urllib.parse import urlparse, quote
15 | from tools.utils import TimestampNow
16 | from tools.utils import UAgent
17 | from tools.utils import NetInfo
18 |
19 |
20 | # siteURL
21 | def SiteURLSQL(item, LOG, SQL, TABLEname, PROXY, UAFILE, UAG):
22 | # remove URL containing UID-style strings
23 | siteURL = quote(re.split("(?:[0-9a-fA-F]:?){32}", item['page']['url'])[0], ':/')
24 | dn = dirname(siteURL)
25 |
26 | # Test if entry already exists in DB
27 | if SQL.SQLiteVerifyEntry(TABLEname, dn) == 0:
28 | now = str(TimestampNow().Timestamp())
29 | siteDomain = urlparse(item['page']['url']).netloc
30 | source_url = item['result'].replace("/api/v1", "")
31 | try:
32 | IPaddress = socket.gethostbyname(siteDomain)
33 | if IPaddress:
34 | rASN = NetInfo()
35 | ASN = rASN.GetASN(IPaddress).strip('\"')
36 | else:
37 | pass
38 | # can't resolve
39 | except Exception:
40 | IPaddress = ""
41 | ASN = ""
42 |
43 | # HTTP connection
44 | try:
45 | proxies = {'http': PROXY, 'https': PROXY}
46 | UA = UAG.ChooseUA(UAFILE)
47 | user_agent = {'User-agent': UA}
48 | try:
49 | r = requests.get(siteURL, headers=user_agent, proxies=proxies, allow_redirects=True, timeout=(5, 12))
50 | lastHTTPcode = str(r.status_code)
51 | except ValueError:
52 | # No user-agent configured
53 | r = requests.get(siteURL, proxies=proxies, allow_redirects=True, timeout=(5, 12))
54 | lastHTTPcode = str(r.status_code)
55 | except requests.exceptions.Timeout:
56 | lastHTTPcode = "timeout"
57 | except requests.exceptions.ConnectionError:
58 | lastHTTPcode = "aborted"
59 | except Exception:
60 | lastHTTPcode = "---"
61 | pass
62 | except Exception as e:
63 | # Unknown status code
64 | LOG.error("Connection error: {}".format(e))
65 | pass
66 |
67 | LOG.info(siteURL + " " + siteDomain + " " + IPaddress + " " + source_url + " " + now + " " + lastHTTPcode)
68 | SQL.SQLiteInsertPK(TABLEname, siteURL, siteDomain, IPaddress, source_url, now, lastHTTPcode, ASN)
69 |
70 | else:
71 | LOG.debug("Entry already known: " + siteURL)
72 | pass
73 |
74 |
75 | # Phishstats API Search
76 | def PhishstatsOSINT(phishstats_file, ConfPHISHSTATS_url, ConfPHISHSTATS_keep, PROXY, SearchString, LOG):
77 | global HTMLText
78 | try:
79 | proxies = {'http': PROXY, 'https': PROXY}
80 | try:
81 | # If more than one search word
82 | if ',' in SearchString:
83 | SearchString_list = [SearchString.strip(' ') for SearchString in SearchString.split(',')]
84 | LOG.debug(SearchString_list)
85 | else:
86 | SearchString_list = [SearchString]
87 | except:
88 | err = sys.exc_info()
89 | LOG.error("SearchString error " + str(err))
90 |
91 | # Using CloudFlare Scraper
92 | scraper = cfscrape.create_scraper()
93 | r = scraper.get(ConfPHISHSTATS_url + "(title,like,~" + SearchString + "~)", timeout=(10, 20))
94 |
95 | # download Phishstats' JSON file
96 | with open(phishstats_file, "wb") as file:
97 | file.write(r.content)
98 | LOG.info("Phishstats\' file retrieved. Proceeding to extraction...")
99 |
100 | except requests.exceptions.ConnectTimeout as e:
101 | LOG.error("Error while connecting to Phishstats: {}".format(e))
102 | pass
103 | except Exception as e:
104 | LOG.error("Phishstats connection error: {}".format(e))
105 | sys.exit(0)
106 | pass
107 |
108 |
109 | # Parse Phishstats result
110 | def PhishstatsExtractor(phishstats_file, SearchString, LOG, SQL, TABLEname, PROXY, UAFILE):
111 | UAG = UAgent()
112 |
113 | try:
114 | file = json.loads(open(phishstats_file).read())
115 | # Search in Phishstats JSON file
116 | for entry in file:
117 | LOG.debug(entry['url'])
118 | # Pass the result entry itself: SiteURLSQL() takes 7 arguments (see its signature above)
119 | SiteURLSQL(entry, LOG, SQL, TABLEname, PROXY, UAFILE, UAG)
120 |
121 | except TypeError:
122 | pass
123 |
124 | except Exception as e:
125 | LOG.error("Phishstats JSON parser Error: {}".format(e))
126 |
127 |
128 | # Delete Phishstats downloaded file, or not
129 | def DeletePhishstatsFile(phishstats_file, LOG):
130 | # Delete phishstats_file
131 | try:
132 | os.remove(phishstats_file)
133 | LOG.info("File " + phishstats_file + " deleted.")
134 | except OSError:
135 | LOG.error("Can't delete " + phishstats_file + " !!!")
136 | pass
137 |
--------------------------------------------------------------------------------
/stalkphish/modules/phishtank.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python3
2 | # -*- coding: utf-8 -*-
3 |
4 | # This file is part of StalkPhish - see https://github.com/t4d/StalkPhish
5 |
6 | import re
7 | import os
8 | import sys
9 | import json
10 | import socket
11 | import requests
12 | import cfscrape
13 | from os.path import dirname
14 | from urllib.parse import urlparse, quote
15 | from tools.utils import TimestampNow
16 | from tools.utils import UAgent
17 | from tools.utils import NetInfo
18 |
19 |
20 | # siteURL
21 | def SiteURLSQL(phishtank_file, entry, LOG, SQL, TABLEname, PROXY, UAFILE, UAG):
22 | # remove URL containing UID-style strings
23 | siteURL = quote(re.split("(?:[0-9a-fA-F]:?){32}", entry['url'])[0], ':/')
24 | dn = dirname(siteURL)
25 |
26 | # Test if entry already exists in DB
27 | if SQL.SQLiteVerifyEntry(TABLEname, dn) == 0:
28 |
29 | IPaddress = entry['details'][0]['ip_address']
30 | source_url = entry['phish_detail_url']
31 | siteDomain = urlparse(entry['url']).netloc
32 | now = str(TimestampNow().Timestamp())
33 | try:
34 | IPaddress = socket.gethostbyname(siteDomain)
35 | if IPaddress:
36 | rASN = NetInfo()
37 | ASN = rASN.GetASN(IPaddress).strip('\"')
38 | else:
39 | pass
40 | # can't resolve
41 | except Exception:
42 | IPaddress = ""
43 | ASN = ""
44 |
45 | # HTTP connection
46 | try:
47 | proxies = {'http': PROXY, 'https': PROXY}
48 | UA = UAG.ChooseUA(UAFILE)
49 | user_agent = {'User-agent': UA}
50 | try:
51 | r = requests.get(siteURL, headers=user_agent, proxies=proxies, allow_redirects=True, timeout=(5, 12))
52 | # Follow redirect and add new URI to database
53 | if (len(r.history) > 1) and ("301" in str(r.history[-1])) and (siteURL != r.url) and (siteURL.split('/')[:-1] != r.url.split('/')[:-2]) and (siteURL + '/' != r.url):
54 | lastHTTPcode = str(r.status_code)
55 | SQL.SQLiteInsertPK(TABLEname, r.url, siteDomain, IPaddress, source_url, now, lastHTTPcode, ASN)
56 | else:
57 | pass
58 | lastHTTPcode = str(r.status_code)
59 | except ValueError:
60 | # No user-agent configured
61 | r = requests.get(siteURL, proxies=proxies, allow_redirects=True, timeout=(5, 12))
62 | lastHTTPcode = str(r.status_code)
63 | except requests.exceptions.Timeout:
64 | lastHTTPcode = "timeout"
65 | except requests.exceptions.ConnectionError:
66 | lastHTTPcode = "aborted"
67 | except Exception:
68 | lastHTTPcode = "---"
69 | err = sys.exc_info()
70 | LOG.error("HTTP error: " + str(err))
71 | pass
72 | except Exception:
73 | # Unknown status code
74 | err = sys.exc_info()
75 | LOG.error("Connection error: " + str(err))
76 | pass
77 |
78 | # Add data into database
79 | LOG.info(siteURL)
80 | SQL.SQLiteInsertPK(TABLEname, siteURL, siteDomain, IPaddress, source_url, now, lastHTTPcode, ASN)
81 |
82 | else:
83 | LOG.debug("Entry already known: " + siteURL)
84 | pass
85 |
86 |
87 | # PhishTank
88 | def PhishtankOSINT(phishtank_file, ConfPHISHTANK_url, ConfPHISHTANK_keep, SrcDir, PROXY, LOG):
89 | # Get phishtank OSINT JSON file
90 | proxies = {'http': PROXY, 'https': PROXY}
91 | LOG.info("Retrieving PhishTank's JSON file... This may take several minutes...")
92 | # Using CloudFlare Scraper
93 | scraper = cfscrape.create_scraper()
94 | # resp = scraper.get(ConfPHISHTANK_url, proxies=proxies, allow_redirects=True, timeout=(10, 20))
95 | resp = scraper.get(ConfPHISHTANK_url, allow_redirects=True, timeout=(10, 20))
96 |
97 | # download PhishTank JSON file
98 | if str(resp.status_code) == "403":
99 | LOG.error("PhishTank refused your connection (HTTP 403 code). Maybe Cloudflare asking for a captcha? Or there is an API key problem.")
100 | sys.exit(0)
101 |
102 | if str(resp.status_code) != "509":
103 | with open(phishtank_file, "wb") as file:
104 | file.write(resp.content)
105 | LOG.info("Phishtank\'s file retrieved. Proceeding to extraction...")
106 | # Error if download limit exceeded
107 | else:
108 | LOG.error("PhishTank download limit exceeded. Can't download JSON file. Maybe you should use an API key?")
109 | sys.exit(0)
110 |
111 |
112 | def PhishtankExtractor(phishtank_file, SearchString, LOG, SQL, TABLEname, PROXY, UAFILE):
113 | UAG = UAgent()
114 | # Search in Phishtank JSON file
115 | file = json.loads(open(phishtank_file).read())
116 | for entry in file:
117 | # Search
118 | if SearchString in entry['url']:
119 | SiteURLSQL(phishtank_file, entry, LOG, SQL, TABLEname, PROXY, UAFILE, UAG)
120 | else:
121 | pass
122 |
123 |
124 | # Delete PhishTank downloaded file, or not
125 | def DeletePhishtankFile(phishtank_file, LOG):
126 | # Delete phishtank_file
127 | try:
128 | os.remove(phishtank_file)
129 | LOG.info("File " + phishtank_file + " deleted.")
130 | except OSError:
131 | LOG.error("Can't delete " + phishtank_file + " !!!")
132 | pass
133 |
--------------------------------------------------------------------------------
/stalkphish/modules/urlquery.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python3
2 | # -*- coding: utf-8 -*-
3 |
4 | # This file is part of StalkPhish - see https://github.com/t4d/StalkPhish
5 |
6 | import requests
7 | import re
8 | import socket
9 | from os.path import dirname
10 | from urllib.parse import urlparse, quote
11 | from tools.utils import TimestampNow
12 | from tools.utils import UAgent
13 | from tools.utils import NetInfo
14 |
15 |
16 | # siteURL
17 | def SiteURLSQL(SearchString, line, LOG, SQL, TABLEname, PROXY, UAFILE, UAG):
18 | # remove URL containing UID-style strings
19 | siteURL = quote(re.split("(?:[0-9a-fA-F]:?){32}", line[0])[0], ':/')
20 | if siteURL.startswith('https:'):
21 | siteDomain = siteURL.split('/')[2]
22 | else:
23 | siteDomain = siteURL.split('/')[0]
24 | siteURL = "http://" + siteURL
25 | dn = dirname(siteURL)
26 |
27 | # Test if entry already exists in DB
28 | if SQL.SQLiteVerifyEntry(TABLEname, dn) == 0:
29 | # Proceed to information retrieval
30 | now = str(TimestampNow().Timestamp())
31 | source_url = "https://urlquery.net/" + line[1]
32 | try:
33 | IPaddress = socket.gethostbyname(siteDomain)
34 | if IPaddress:
35 | rASN = NetInfo()
36 | ASN = rASN.GetASN(IPaddress).strip('\"')
37 | else:
38 | pass
39 | # can't resolve
40 | except OSError:
41 | IPaddress = ""
42 | ASN = ""
43 |
44 | lastHTTPcode = "---"  # HTTP connection; default code if the request never completes
45 | try:
46 | proxies = {'http': PROXY, 'https': PROXY}
47 | UA = UAG.ChooseUA(UAFILE)
48 | user_agent = {'User-agent': UA}
49 | try:
50 | r = requests.get(siteURL, headers=user_agent, proxies=proxies, allow_redirects=True)
51 | # Follow redirect and add new URI to database
52 | if (len(r.history) > 1) and ("301" in str(r.history[-1])) and (siteURL != r.url) and (siteURL.split('/')[:-1] != r.url.split('/')[:-2]) and (siteURL + '/' != r.url):
53 | lastHTTPcode = str(r.status_code)
54 | SQL.SQLiteInsertPK(TABLEname, r.url, siteDomain, IPaddress, source_url, now, lastHTTPcode, ASN)
55 | else:
56 | pass
57 | lastHTTPcode = str(r.status_code)
58 | except ValueError:
59 | # No user-agent configured
60 | r = requests.get(siteURL, proxies=proxies, allow_redirects=True)
61 | lastHTTPcode = str(r.status_code)
62 | except requests.exceptions.Timeout:
63 | lastHTTPcode = "timeout"
64 | except requests.exceptions.ConnectionError:
65 | lastHTTPcode = "aborted"
66 | except:
67 | lastHTTPcode = "---"
68 | pass
69 | except Exception as e:
70 | # Unknown status code
71 | LOG.error("Connection error: {}".format(e))
72 | pass
73 |
74 | # Add data into database
75 | LOG.info(siteURL + " " + siteDomain + " " + IPaddress + " " + source_url + " " + now + " " + lastHTTPcode)
76 | SQL.SQLiteInsertPK(TABLEname, siteURL, siteDomain, IPaddress, source_url, now, lastHTTPcode, ASN)
77 |
78 | else:
79 | LOG.debug("Entry already known: " + siteURL)
80 | pass
81 |
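SiteURLSQL relies on `re.split` with a 32-hex-character pattern to cut URLs at UID-style tokens before percent-encoding. A small demonstration of that truncation (the URL is made up):

```python
import re
from urllib.parse import quote

# Same pattern as SiteURLSQL: 32 hex chars, each optionally followed by a colon.
UID_RE = r"(?:[0-9a-fA-F]:?){32}"

def truncate_at_uid(url):
    # Keep only the part before the UID-style run, then percent-encode
    # everything except ':' and '/' (mirrors quote(..., ':/')).
    return quote(re.split(UID_RE, url)[0], ':/')

truncate_at_uid("http://evil.example/login.php?id=d41d8cd98f00b204e9800998ecf8427e")
# -> 'http://evil.example/login.php%3Fid%3D'
```

URLs with no such token pass through unchanged, apart from the quoting.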
82 |
83 | # Urlquery Web Search
84 | # UrlQuery may drop your GET request when it comes from the Tor network
85 | def UrlqueryOSINT(ConfURLQUERY_url, PROXY, SearchString, LOG):
86 | global HTMLText
87 | try:
88 | proxies = {'http': PROXY, 'https': PROXY}
89 | payload = {'q': SearchString}
90 | r = requests.get(ConfURLQUERY_url, params=payload, allow_redirects=True, timeout=(10, 20))
91 | HTMLText = r.text
92 | LOG.info("Searching for \'" + SearchString + "\'...")
93 | except Exception as e:
94 | LOG.error("Error while GETting HTML page: {}".format(e))
95 |
96 |
97 | # Parse urlQuery HTML page
98 | def UrlqueryExtractor(SearchString, LOG, SQL, TABLEname, PROXY, UAFILE):
99 | UAG = UAgent()
100 | # Search in Urlquery HTML file
101 | try:
102 | m = re.findall(r"", HTMLText)
103 | for line in m:
104 | SiteURLSQL(SearchString, line, LOG, SQL, TABLEname, PROXY, UAFILE, UAG)
105 | except TypeError:
106 | pass
107 |
108 | except Exception as e:
109 | LOG.error("HTML parser Error: {}".format(e))
110 |
--------------------------------------------------------------------------------
/stalkphish/modules/urlscan.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python3
2 | # -*- coding: utf-8 -*-
3 |
4 | # This file is part of StalkPhish - see https://github.com/t4d/StalkPhish
5 |
6 | import requests
7 | import re
8 | import socket
9 | from os.path import dirname
10 | from urllib.parse import urlparse, quote
11 | from tools.utils import TimestampNow
12 | from tools.utils import UAgent
13 | from tools.utils import NetInfo
14 |
15 |
16 | # siteURl
17 | def SiteURLSQL(item, LOG, SQL, TABLEname, PROXY, UAFILE, UAG):
18 | # truncate the URL at UID-style strings (32 hex chars, e.g. MD5 hashes)
19 | siteURL = quote(re.split("(?:[0-9a-fA-F]:?){32}", item['page']['url'])[0], ':/')
20 | dn = dirname(siteURL)
21 |
22 | # Test if entry already exists in DB
23 | if SQL.SQLiteVerifyEntry(TABLEname, dn) == 0:
24 | now = str(TimestampNow().Timestamp())
25 | siteDomain = urlparse(item['page']['url']).netloc
26 | source_url = item['result'].replace("/api/v1", "")
27 | try:
28 | IPaddress = socket.gethostbyname(siteDomain)
29 | if IPaddress:
30 | rASN = NetInfo()
31 | ASN = rASN.GetASN(IPaddress).strip('\"')
32 | else:
33 | pass
34 | # can't resolve
35 | except OSError:
36 | IPaddress = ""
37 | ASN = ""
38 |
39 | lastHTTPcode = "---"  # HTTP connection; default code if the request never completes
40 | try:
41 | proxies = {'http': PROXY, 'https': PROXY}
42 | UA = UAG.ChooseUA(UAFILE)
43 | user_agent = {'User-agent': UA}
44 | try:
45 | r = requests.get(siteURL, headers=user_agent, proxies=proxies, allow_redirects=True)
46 | lastHTTPcode = str(r.status_code)
47 | except ValueError:
48 | # No user-agent configured
49 | r = requests.get(siteURL, proxies=proxies, allow_redirects=True)
50 | lastHTTPcode = str(r.status_code)
51 | except requests.exceptions.Timeout:
52 | lastHTTPcode = "timeout"
53 | except requests.exceptions.ConnectionError:
54 | lastHTTPcode = "aborted"
55 | except:
56 | lastHTTPcode = "---"
57 | pass
58 | except Exception as e:
59 | # Unknown status code
60 | LOG.error("Connection error: {}".format(e))
61 | pass
62 |
63 | LOG.info(siteURL + " " + siteDomain + " " + IPaddress + " " + source_url + " " + now + " " + lastHTTPcode)
64 | SQL.SQLiteInsertPK(TABLEname, siteURL, siteDomain, IPaddress, source_url, now, lastHTTPcode, ASN)
65 |
66 | else:
67 | LOG.debug("Entry already known: " + siteURL)
68 | pass
69 |
70 |
71 | # Urlscan Web Search
72 | def UrlscanOSINT(ConfURLSCAN_apikey, ConfURLSCAN_url, PROXY, SearchString, LOG):
73 | global HTMLText
74 | try:
75 | proxies = {'http': PROXY, 'https': PROXY}
76 | payload = {'q': SearchString}
77 | headers = {}
78 | if ConfURLSCAN_apikey:
79 | headers = {'API-Key': '{}'.format(ConfURLSCAN_apikey)}
80 |
81 | r = requests.get(url=ConfURLSCAN_url + "?q=page.url:" + SearchString + " OR page.domain:" + SearchString, headers=headers, proxies=proxies, allow_redirects=True, timeout=(10, 20))
82 | HTMLText = r.json()
83 | LOG.info("Searching for \'" + SearchString + "\'...")
84 |
85 | except requests.exceptions.ConnectTimeout as e:
86 | LOG.error("Error while connecting to urlscan.io: {}".format(e))
87 | pass
88 | except Exception as e:
89 | LOG.error("Urlscan connection error: {}".format(e))
90 | pass
91 |
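UrlscanOSINT builds its query by searching both `page.url` and `page.domain`. A sketch of the equivalent query-string construction using only the standard library (the endpoint constant mirrors the `API_url` expected in the config; treat it as an assumption here):

```python
from urllib.parse import urlencode

# Assumed search endpoint; in StalkPhish it comes from the URLSCAN API_url setting.
URLSCAN_SEARCH = "https://urlscan.io/api/v1/search/"

def build_search_url(search_string):
    # Mirror UrlscanOSINT: match the term in either page.url or page.domain.
    query = "page.url:{0} OR page.domain:{0}".format(search_string)
    return URLSCAN_SEARCH + "?" + urlencode({"q": query})

url = build_search_url("paypal")
# -> 'https://urlscan.io/api/v1/search/?q=page.url%3Apaypal+OR+page.domain%3Apaypal'
```

The real module sends this with an optional `API-Key` header and parses the JSON `results` array.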
92 |
93 | # Parse Urlscan HTML page
94 | def UrlscanExtractor(LOG, SQL, TABLEname, PROXY, UAFILE):
95 | UAG = UAgent()
96 | # Search in Urlscan HTML file
97 | try:
98 | for item in HTMLText['results']:
99 | SiteURLSQL(item, LOG, SQL, TABLEname, PROXY, UAFILE, UAG)
100 | except TypeError:
101 | pass
102 |
103 | except Exception as e:
104 | LOG.error("HTML parser Error: {}".format(e))
105 |
--------------------------------------------------------------------------------
/stalkphish/tools/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/t4d/StalkPhish/8a7886e05ff319e8a95602cbb12260ff71ad7c89/stalkphish/tools/__init__.py
--------------------------------------------------------------------------------
/stalkphish/tools/addurl.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python3
2 | # -*- coding: utf-8 -*-
3 |
4 | # This file is part of StalkPhish - see https://github.com/t4d/StalkPhish
5 |
6 | import re
7 | import sys
8 | import socket
9 | import requests
10 | from urllib.parse import urlparse
11 | from tools.utils import TimestampNow
12 | from tools.utils import UAgent
13 | from tools.utils import NetInfo
14 |
15 |
16 | # Data extraction
17 | def AddUniqueURL(URLadd, LOG, SQL, TABLEname, PROXY, UAFILE):
18 | UAG = UAgent()
19 | # add schema
20 | if URLadd.startswith("http://") or URLadd.startswith("https://"):
21 | pass
22 | else:
23 | URLadd = "http://{}".format(URLadd)
24 |
25 | # truncate the URL at UID-style strings (32 hex chars, e.g. MD5 hashes)
26 | siteURL = re.split("(?:[0-9a-fA-F]:?){32}", URLadd.rstrip())[0]
27 | # Test if entry already exists in DB
28 | if SQL.SQLiteVerifyEntry(TABLEname, siteURL) == 0:
29 | now = str(TimestampNow().Timestamp())
30 | siteDomain = urlparse(URLadd).netloc
31 | source_url = "Manual"
32 | try:
33 | IPaddress = socket.gethostbyname(siteDomain)
34 | rASN = NetInfo()
35 | ASN = rASN.GetASN(IPaddress).strip('\"')
36 | # can't resolve
37 | except OSError:
38 | IPaddress = ""
39 | ASN = ""
40 |
41 | lastHTTPcode = "---"  # HTTP connection; default code if the request never completes
42 | try:
43 | proxies = {'http': PROXY, 'https': PROXY}
44 | UA = UAG.ChooseUA(UAFILE)
45 | user_agent = {'User-agent': UA}
46 | try:
47 | r = requests.get(siteURL, headers=user_agent, proxies=proxies, allow_redirects=True, timeout=(5, 12))
48 | lastHTTPcode = str(r.status_code)
49 | except ValueError:
50 | # No user-agent configured
51 | r = requests.get(siteURL, proxies=proxies, allow_redirects=True, timeout=(5, 12))
52 | lastHTTPcode = str(r.status_code)
53 | except requests.exceptions.Timeout:
54 | lastHTTPcode = "timeout"
55 | except requests.exceptions.ConnectionError:
56 | lastHTTPcode = "aborted"
57 | except:
58 | lastHTTPcode = "---"
59 | err = sys.exc_info()
60 | LOG.error("HTTP error: " + str(err))
61 | pass
62 | except:
63 | # Unknown status code
64 | err = sys.exc_info()
65 | LOG.error("Connection error: " + str(err))
66 | pass
67 |
68 | # Add data into database
69 | LOG.info(siteURL)
70 | SQL.SQLiteInsertPK(TABLEname, siteURL, siteDomain, IPaddress, source_url, now, lastHTTPcode, ASN)
71 |
72 | else:
73 | LOG.info("Entry already known: " + siteURL)
74 | pass
75 |
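AddUniqueURL first normalizes the input by prepending a scheme when none is present. The same normalization in isolation:

```python
def normalize_url(url):
    # Mirror AddUniqueURL's scheme handling: default to http:// when missing.
    url = url.rstrip()
    if not url.startswith(("http://", "https://")):
        url = "http://" + url
    return url

normalize_url("phish.example/login")  # -> 'http://phish.example/login'
```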
--------------------------------------------------------------------------------
/stalkphish/tools/confparser.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python3
2 | # -*- coding: utf-8 -*-
3 |
4 | # This file is part of StalkPhish - see https://github.com/t4d/StalkPhish
5 |
6 | import sys
7 | import configparser
8 |
9 |
10 | class ConfParser:
11 | '''Configuration file parser'''
12 | def __init__(self, Confile=None):
13 | try:
14 | self.config = configparser.ConfigParser()
15 |
16 | with open(Confile, 'r', encoding='utf-8') as f:
17 | self.config.read_file(f)  # readfp() is deprecated and removed in Python 3.12
18 |
19 | # search string(s) (comma separated)
20 | self.SearchString = self.config['SEARCH']['search']
21 |
22 | # Databases
23 | self.DatabaseDir = self.config['DATABASE']['Databases_files']
24 | self.DBfile = self.config['DATABASE']['sqliteDB_filename']
25 | self.TABLEname = self.config['DATABASE']['sqliteDB_tablename']
26 | self.InvestigTABLEname = self.config['DATABASE']['sqliteDB_Investig_tablename']
27 |
28 | # Paths
29 | # Logging
30 | self.LogConf = self.config['PATHS']['log_conf']
31 | self.LogDir = self.config['PATHS']['log_dir']
32 | self.LogFile = self.config['PATHS']['log_file']
33 |
34 | self.DLDir = self.config['PATHS']['Kits_download_Dir']
35 | self.SrcDir = self.config['PATHS']['Ext_src_Files']
36 |
37 | # Proxy
38 | try:
39 | self.http_proxy = self.config['CONNECT']['http_proxy']
40 | except:
41 | self.http_proxy = None
42 |
43 | self.http_UA = self.config['CONNECT']['http_UA']
44 | self.UAfile = self.config['CONNECT']['UAfile']
45 |
46 | # Modules
47 | self.URLSCAN_active = self.config['URLSCAN'].getboolean('activate')
48 | self.URLSCAN_url = self.config['URLSCAN']['API_url']
49 | try:
50 | self.URLSCAN_apikey = self.config['URLSCAN']['API_key']
51 | except:
52 | self.URLSCAN_apikey = None
53 |
54 | self.URLQUERY_active = self.config['URLQUERY'].getboolean('activate')
55 | self.URLQUERY_url = self.config['URLQUERY']['OSINT_url']
56 |
57 | self.PHISHTANK_active = self.config['PHISHTANK'].getboolean('activate')
58 | self.PHISHTANK_url = self.config['PHISHTANK']['OSINT_url']
59 | self.PHISHTANK_keep = self.config['PHISHTANK'].getboolean('keep_files')
60 | try:
61 | self.PHISHTANK_apikey = self.config['PHISHTANK']['API_key']
62 | except:
63 | self.PHISHTANK_apikey = None
64 |
65 | self.OPENPHISH_active = self.config['OPENPHISH'].getboolean('activate')
66 | self.OPENPHISH_url = self.config['OPENPHISH']['OSINT_url']
67 | self.OPENPHISH_keep = self.config['OPENPHISH'].getboolean('keep_files')
68 |
69 | self.PHISHSTATS_active = self.config['PHISHSTATS'].getboolean('activate')
70 | self.PHISHSTATS_url = self.config['PHISHSTATS']['OSINT_url']
71 | self.PHISHSTATS_keep = self.config['PHISHSTATS'].getboolean('keep_files')
72 |
73 | self.PHISHINGDB_active = self.config['Phishing.Database'].getboolean('activate')
74 | self.PHISHINGDB_url = self.config['Phishing.Database']['OSINT_url']
75 | self.PHISHINGDB_keep = self.config['Phishing.Database'].getboolean('keep_files')
76 |
77 | except IOError:
78 | print("[!!!] Configuration file Error: " + Confile)
79 | except:
80 | err = sys.exc_info()
81 | print("[!!!] ConfParser Error: " + str(err))
82 |
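ConfParser expects an INI file with the sections read above. A minimal illustrative fragment (all values are placeholders; section and key names are taken from the parser) loaded with configparser, using `read_string` in place of a file:

```python
import configparser

# Illustrative fragment only: every value here is a placeholder.
SAMPLE = """
[SEARCH]
search = paypal,amazon

[CONNECT]
http_UA = Mozilla/5.0
UAfile = User-Agents.txt

[URLSCAN]
activate = yes
API_url = https://urlscan.io/api/v1/search/
"""

config = configparser.ConfigParser()
config.read_string(SAMPLE)  # modern replacement for the removed readfp()
search_terms = config['SEARCH']['search'].split(',')
urlscan_on = config['URLSCAN'].getboolean('activate')
```

`getboolean` accepts `yes`/`no`, `on`/`off`, `true`/`false`, and `1`/`0`, which is why the `activate` flags are read through it.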
--------------------------------------------------------------------------------
/stalkphish/tools/connect.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python3
2 | # -*- coding: utf-8 -*-
3 |
4 | # This file is part of StalkPhish - see https://github.com/t4d/StalkPhish
5 |
6 | import requests
7 |
8 |
9 | def HTTPCode(url, user_agent):
10 | proxies = None
11 | try:
12 | r = requests.get(url=url, proxies=proxies, headers=user_agent, allow_redirects=True, timeout=(5, 12))
13 | lastHTTPcode = str(r.status_code)
14 | print(lastHTTPcode)
15 |
16 | except ValueError:
17 | r = requests.get(url=url, proxies=proxies, allow_redirects=True, timeout=(5, 12), headers={'User-agent': 'Mozilla/5.0 (iPhone; CPU iPhone OS 10_3_3 like Mac OS X) AppleWebKit/603.3.8 (KHTML, like Gecko) Version/10.0 Mobile/14G60 Safari/602.1'})
18 | lastHTTPcode = str(r.status_code)
19 | print(lastHTTPcode)
20 |
21 | except requests.exceptions.ConnectionError:
22 | print('Connection Error.')
23 |
--------------------------------------------------------------------------------
/stalkphish/tools/download.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python3
2 | # -*- coding: utf-8 -*-
3 |
4 | # This file is part of StalkPhish - see https://github.com/t4d/StalkPhish
5 |
6 | import requests
7 | from bs4 import BeautifulSoup
8 | import re
9 | import os
10 | import io
11 | import zipfile
12 | import sys
13 | import hashlib
14 | import cfscrape
15 | from urllib.parse import urlparse
16 | from tools.utils import TimestampNow
17 | from tools.utils import SHA256
18 | from tools.utils import UAgent
19 | from tools.utils import ZipSearch
20 | from requests.packages.urllib3.exceptions import InsecureRequestWarning
21 | requests.packages.urllib3.disable_warnings(InsecureRequestWarning)
22 |
23 |
24 | def PKDownloadOpenDir(siteURL, siteDomain, IPaddress, TABLEname, InvTABLEname, DLDir, SQL, PROXY, LOG, UAFILE, ASN):
25 | global Ziplst
26 | proxies = {'http': PROXY, 'https': PROXY}
27 | UAG = UAgent()
28 | UA = UAG.ChooseUA(UAFILE)
29 | user_agent = {'User-agent': UA}
30 | now = str(TimestampNow().Timestamp())
31 | SHA = SHA256()
32 | Ziplst = []
33 |
34 | rhtml = requests.get(siteURL, headers=user_agent, proxies=proxies, allow_redirects=True, timeout=(5, 12), verify=False)
35 | thtml = BeautifulSoup(rhtml.text, 'html.parser')
36 | try:
37 | PageTitle = thtml.title.text.strip()
38 | except:
39 | PageTitle = None
40 | if PageTitle is not None:
41 | PageTitle = re.sub(r'\s+', ' ', PageTitle)
42 | SQL.SQLiteInvestigUpdateTitle(InvTABLEname, siteURL, PageTitle)
43 | else:
44 | pass
45 |
46 | thtmlatag = thtml.select('a')
47 | Ziplst += [siteURL + "/" + tag['href'] for tag in thtmlatag if '.zip' in tag.text]
48 |
49 | for f in Ziplst:
50 | try:
51 | r = requests.get(f, headers=user_agent, proxies=proxies, allow_redirects=True, timeout=(5, 12), verify=False)
52 | lastHTTPcode = str(r.status_code)
53 | # Reduce filename length
54 | if len(f) > 250:
55 | zzip = f.replace('/', '_').replace(':', '')[:250]
56 | else:
57 | zzip = f.replace('/', '_').replace(':', '')
58 | try:
59 | savefile = DLDir + zzip
60 | # Still collected file
61 | if os.path.exists(savefile):
62 | LOG.info("[DL ] Found already-collected archive: " + savefile)
63 | # New file to download
64 | else:
65 | if zipfile.is_zipfile(io.BytesIO(r.content)):
66 | LOG.info("[DL ] Found archive in an open dir, downloaded it as: " + savefile)
67 | with open(savefile, "wb") as code:
68 | code.write(r.content)
69 | pass
70 | ZipFileName = str(zzip + '.zip')
71 | ZipFileHash = SHA.hashFile(savefile)
72 | # Extract e-mails from downloaded file
73 | try:
74 | ZS = ZipSearch()
75 | extracted_emails = str(ZS.PKzipSearch(InvTABLEname, SQL, LOG, DLDir, savefile)).strip("[]").replace("'", "")
76 | LOG.info("[Emails] found: {}".format(extracted_emails))
77 | SQL.SQLiteInvestigInsertEmail(InvTABLEname, extracted_emails, ZipFileName)
78 | except Exception as e:
79 | LOG.info("Extracted emails exception: {}".format(e))
80 |
81 | SQL.SQLiteInvestigUpdatePK(InvTABLEname, siteURL, ZipFileName, ZipFileHash, now, lastHTTPcode)
82 | else:
83 | pass
84 | except Exception as e:
85 | LOG.error("Error downloading file: {}".format(e))
86 | except requests.exceptions.ContentDecodingError:
87 | LOG.error("[DL ] content-type error")
88 |
89 |
90 | # Connection tests and phishing kit downloading
91 | def TryPKDownload(siteURL, siteDomain, IPaddress, TABLEname, InvTABLEname, DLDir, SQL, PROXY, LOG, UAFILE, ASN):
92 | global ziplist
93 | global PageTitle
94 | proxies = {'http': PROXY, 'https': PROXY}
95 | UAG = UAgent()
96 | UA = UAG.ChooseUA(UAFILE)
97 | user_agent = {'User-agent': UA}
98 | now = str(TimestampNow().Timestamp())
99 | SHA = SHA256()
100 |
101 |
102 | # Let's try to find a phishing kit source archive
103 | try:
104 | r = requests.get(siteURL, headers=user_agent, proxies=proxies, allow_redirects=True, timeout=(5, 12), verify=False)
105 | # Generate page hash
106 | try:
107 | soup = BeautifulSoup(r.content, 'lxml')
108 | # Body hash only
109 | # body = soup.find('body')
110 | try:
111 | #page_body = body.findChildren()
112 | page_body = soup
113 | except:
114 | # print(r.content) ## print, frequently, JS in
115 | pass
116 | try:
117 | sha1 = hashlib.sha1()
118 | sha1.update(repr(page_body).encode("utf-8"))
119 | PageHash = sha1.hexdigest()
120 | SQL.SQLiteInsertPageHash(TABLEname, siteURL, PageHash)
121 | except:
122 | pass
123 | except Exception as e:
124 | print(e)
125 |
126 | # if (str(r.status_code) != "404"):
127 | LOG.info("[" + str(r.status_code) + "] " + r.url)
128 | SQL.SQLiteInsertStillTryDownload(TABLEname, siteURL)
129 | if SQL.SQLiteInvestigVerifyEntry(InvTABLEname, siteDomain, IPaddress) == 0:
130 | SQL.SQLiteInvestigInsert(InvTABLEname, siteURL, siteDomain, IPaddress, now, str(r.status_code))
131 | else:
132 | pass
133 | ziplist = []
134 | path = siteURL
135 | pathl = '/'.join(path.split("/")[:3])
136 | pathlist = path.split("/")[3:]
137 |
138 | # Make list
139 | current = 0
140 | newpath = ""
141 | while current < len(pathlist):
142 | if current == 0:
143 | newpath = pathlist[current]
144 | else:
145 | newpath = newpath + "/" + pathlist[current]
146 | current = current + 1
147 | pathD = pathl + "/" + newpath
148 | ziplist.append(pathD)
149 | rootpath = pathl + "/"
150 | if rootpath != pathD:
151 | ziplist.append(rootpath)
152 | else:
153 | pass
154 |
155 | # Get page title
156 | try:
157 | if len(ziplist) >= 1:
158 | rhtml = requests.get(siteURL, headers=user_agent, proxies=proxies, allow_redirects=True, timeout=(5, 12), verify=False)
159 | thtml = BeautifulSoup(rhtml.text, 'html.parser')
160 | try:
161 | PageTitle = thtml.title.text.strip()
162 | except:
163 | PageTitle = None
164 | if PageTitle is not None:
165 | PageTitle = re.sub(r'\s+', ' ', PageTitle)
166 | LOG.info(PageTitle)
167 | SQL.SQLiteInvestigUpdateTitle(InvTABLEname, siteURL, PageTitle)
168 | else:
169 | pass
170 | except AttributeError:
171 | pass
172 | except requests.exceptions.ReadTimeout:
173 | pass
174 | except requests.exceptions.ConnectTimeout:
175 | pass
176 | except:
177 | err = sys.exc_info()
178 | LOG.error("Get PageTitle Error: " + siteURL + str(err))
179 |
180 | # Try to find and download phishing kit archive (.zip)
181 | try:
182 | if len(ziplist) >= 1:
183 | for zip in ziplist:
184 | if not any(c in os.path.basename(os.path.normpath(zip)) for c in (' = ', '%', '?', '-', '@')):
185 | try:
186 | # if URL is not rootpath siteURL
187 | if int(len(zip.split("/")[3:][0])) > 0:
188 | LOG.info("trying " + zip + ".zip")
189 | # Try to use cfscraper if Cloudflare's check
190 | if PageTitle is not None and "Cloudflare" in PageTitle:
191 | scraper = cfscrape.create_scraper()
192 | rz = scraper.get(zip + ".zip", headers=user_agent, proxies=proxies, allow_redirects=True, timeout=(5, 12), verify=False)
193 | else:
194 | rz = requests.get(zip + ".zip", headers=user_agent, proxies=proxies, allow_redirects=True, timeout=(5, 12), verify=False)
195 | # if str(rz.status_code) != "404":
196 | lastHTTPcode = str(rz.status_code)
197 | # Reduce filename length
198 | if len(zip) > 250:
199 | zzip = zip.replace('/', '_').replace(':', '')[:250]
200 | else:
201 | zzip = zip.replace('/', '_').replace(':', '')
202 | try:
203 | savefile = DLDir + zzip + '.zip'
204 | # Still collected file
205 | if os.path.exists(savefile):
206 | LOG.info("[DL ] Found already-collected archive: " + savefile)
207 | return
208 | # New file to download
209 | else:
210 | if zipfile.is_zipfile(io.BytesIO(rz.content)):
211 | LOG.info("[DL ] Found archive, downloaded it as: " + savefile)
212 | with open(savefile, "wb") as code:
213 | code.write(rz.content)
214 | pass
215 | ZipFileName = str(zzip + '.zip')
216 | ZipFileHash = SHA.hashFile(savefile)
217 | SQL.SQLiteInvestigUpdatePK(InvTABLEname, siteURL, ZipFileName, ZipFileHash, now, lastHTTPcode)
218 | # Extract e-mails from downloaded file
219 | try:
220 | ZS = ZipSearch()
221 | extracted_emails = str(ZS.PKzipSearch(InvTABLEname, SQL, LOG, DLDir, savefile)).strip("[]").replace("'", "")
222 | LOG.info("[Email] Found: {}".format(extracted_emails))
223 | SQL.SQLiteInvestigInsertEmail(InvTABLEname, extracted_emails, ZipFileName)
224 | except Exception as e:
225 | LOG.info("Extracted emails exception: {}".format(e))
226 | return
227 | else:
228 | pass
229 | except requests.exceptions.ContentDecodingError:
230 | LOG.error("[DL ] content-type error")
231 | except:
232 | pass
233 |
234 | # rootpath of siteURL
235 | else:
236 | rr = requests.get(zip, headers=user_agent, proxies=proxies, allow_redirects=True, timeout=(5, 12), verify=False)
237 | thtml = BeautifulSoup(rr.text, 'html.parser')
238 | try:
239 | PageTitle = thtml.title.text.strip()
240 | except:
241 | PageTitle = None
242 | if PageTitle is not None:
243 | PageTitle = re.sub(r'\s+', ' ', PageTitle)
244 | else:
245 | pass
246 |
247 | except requests.exceptions.ReadTimeout:
248 | LOG.debug("Connection Timeout: " + siteURL)
249 | except requests.exceptions.ConnectTimeout:
250 | LOG.debug("Connection Timeout")
251 | except Exception as e:
252 | LOG.error("Error Downloading zip: {}".format(e))
253 | pass
254 |
255 | # Search for OpenDir
256 | try:
257 | if PageTitle is not None:
258 | # OpenDir's zip search
259 | if 'Index of' in PageTitle:
260 | PKDownloadOpenDir(zip, siteDomain, IPaddress, TABLEname, InvTABLEname, DLDir, SQL, PROXY, LOG, UAFILE, ASN)
261 | # 000webhostapp OpenDir-like zip search
262 | elif '.000webhostapp.com Free Website' in PageTitle:
263 | PKDownloadOpenDir(zip, siteDomain, IPaddress, TABLEname, InvTABLEname, DLDir, SQL, PROXY, LOG, UAFILE, ASN)
264 | else:
265 | pass
266 | else:
267 | pass
268 | except Exception as e:
269 | LOG.error("Potential OpenDir connection error: " + str(e))
270 | pass
271 |
272 | else:
273 | pass
274 | else:
275 | pass
276 | # Ziplist empty
277 | else:
278 | pass
279 | except Exception as e:
280 | LOG.error("DL Error: " + str(e))
281 |
282 |
283 |
284 | except requests.exceptions.ConnectionError:
285 | err = sys.exc_info()
286 | if '0x05: Connection refused' in str(err):
287 | SQL.SQLiteInvestigUpdateCode(InvTABLEname, siteURL, now, 'Conn. refused')
288 | elif '0x04: Host unreachable' in str(err):
289 | SQL.SQLiteInvestigUpdateCode(InvTABLEname, siteURL, now, 'Unreachable')
290 | else:
291 | SQL.SQLiteInvestigUpdateCode(InvTABLEname, siteURL, now, 'Conn. error')
292 | SQL.SQLiteInsertStillTryDownload(TABLEname, siteURL)
293 | LOG.debug("Connection error: " + siteURL)
294 |
295 | except requests.exceptions.ConnectTimeout:
296 | SQL.SQLiteInvestigUpdateCode(InvTABLEname, siteURL, now, 'Conn. timeout')
297 | SQL.SQLiteInsertStillTryDownload(TABLEname, siteURL)
298 | LOG.debug("Connection Timeout: " + siteURL)
299 |
300 | except requests.exceptions.ReadTimeout:
301 | SQL.SQLiteInvestigUpdateCode(InvTABLEname, siteURL, now, 'Conn. readtimeout')
302 | SQL.SQLiteInsertStillTryDownload(TABLEname, siteURL)
303 | LOG.debug("Connection Read Timeout: " + siteURL)
304 |
305 | except requests.exceptions.MissingSchema:
306 | SQL.SQLiteInvestigUpdateCode(InvTABLEname, siteURL, now, 'Malformed URL')
307 | SQL.SQLiteInsertStillTryDownload(TABLEname, siteURL)
308 | LOG.debug("Malformed URL, skipping: " + siteURL)
309 |
310 | except requests.exceptions.InvalidURL:
311 | SQL.SQLiteInvestigUpdateCode(InvTABLEname, siteURL, now, 'Malformed URL')
312 | SQL.SQLiteInsertStillTryDownload(TABLEname, siteURL)
313 | LOG.debug("Malformed URL, skipping: " + siteURL)
314 |
315 | except requests.exceptions.ChunkedEncodingError:
316 | SQL.SQLiteInvestigUpdateCode(InvTABLEname, siteURL, now, 'Can\'t read data')
317 | SQL.SQLiteInsertStillTryDownload(TABLEname, siteURL)
318 | LOG.debug("Can't read data, skipping: " + siteURL)
319 |
320 | except requests.exceptions.TooManyRedirects:
321 | SQL.SQLiteInvestigUpdateCode(InvTABLEname, siteURL, now, 'Too many redirects')
322 | SQL.SQLiteInsertStillTryDownload(TABLEname, siteURL)
323 | LOG.debug("Too many redirects, skipping: " + siteURL)
324 |
325 | except KeyboardInterrupt:
326 | LOG.info("Shutdown requested...exiting")
327 | os._exit(0)
328 |
329 | except Exception as e:
330 | LOG.error("Error trying to find kit: " + str(e))
331 |
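TryPKDownload derives its `ziplist` by walking the URL path and probing every cumulative prefix, plus the site root, for a matching `.zip`. The same construction in isolation (the example URL is invented):

```python
def candidate_zip_bases(site_url):
    """Build the URL prefixes that TryPKDownload probes for '<prefix>.zip'."""
    parts = site_url.split("/")
    root = "/".join(parts[:3])  # scheme://host
    candidates = []
    prefix = ""
    for segment in parts[3:]:
        prefix = segment if not prefix else prefix + "/" + segment
        candidates.append(root + "/" + prefix)
    rootpath = root + "/"
    if candidates and candidates[-1] != rootpath:
        candidates.append(rootpath)  # also probe the site root
    return candidates

candidate_zip_bases("http://evil.example/kits/fake/login.php")
```

Each returned prefix gets `.zip` appended and fetched; the root entry is instead scanned for an open directory.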
--------------------------------------------------------------------------------
/stalkphish/tools/logging.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python3
2 | # -*- coding: utf-8 -*-
3 |
4 | # This file is part of StalkPhish - see https://github.com/t4d/StalkPhish
5 |
6 | import sys
7 | import logging
8 | from logging.handlers import RotatingFileHandler
9 |
10 |
11 | def Logger(LogFile):
12 | try:
13 | # Create the Logger
14 | logger = logging.getLogger(__name__)
15 | logger.setLevel(logging.DEBUG)
16 |
17 | # stdout logging handler
18 | stdout_logger_handler = logging.StreamHandler()
19 | stdout_logger_handler.setLevel(logging.INFO)
20 |
21 | # Handler for file logging (10Mb rotating logs x10)
22 | file_logger_handler = RotatingFileHandler(LogFile, mode='a', maxBytes=10000000, backupCount=10)
23 | file_logger_handler.setLevel(logging.DEBUG)
24 |
25 | # Create a Formatter for formatting the log messages
26 | logger_formatter = logging.Formatter('%(asctime)s - %(filename)s - %(levelname)s - %(message)s')
27 |
28 | # Add the Formatter to the Handlers
29 | file_logger_handler.setFormatter(logger_formatter)
30 | stdout_logger_handler.setFormatter(logger_formatter)
31 |
32 | # Add handlers to the Logger
33 | logger.addHandler(file_logger_handler)
34 | logger.addHandler(stdout_logger_handler)
35 |
36 | return logger
37 |
38 | except:
39 | err = sys.exc_info()
40 | print("[!!!] Logging Error: " + str(err))
41 |
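Logger wires a DEBUG-level rotating file handler next to an INFO-level stdout handler. A trimmed usage sketch showing just the file side (the path is temporary and the logger name arbitrary):

```python
import logging
import os
import tempfile
from logging.handlers import RotatingFileHandler

log_file = os.path.join(tempfile.mkdtemp(), "stalkphish.log")

# Same shape as Logger(): DEBUG-level rotating file handler (10 MB x 10 files).
logger = logging.getLogger("stalkphish-demo")
logger.setLevel(logging.DEBUG)
handler = RotatingFileHandler(log_file, mode='a', maxBytes=10000000, backupCount=10)
handler.setFormatter(logging.Formatter('%(asctime)s - %(filename)s - %(levelname)s - %(message)s'))
logger.addHandler(handler)

logger.info("collector started")
handler.flush()
with open(log_file) as f:
    contents = f.read()
```

Every record at DEBUG or above lands in the file with a timestamp, source filename, and level prefix.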
--------------------------------------------------------------------------------
/stalkphish/tools/sqlite.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python3
2 | # -*- coding: utf-8 -*-
3 |
4 | # This file is part of StalkPhish - see https://github.com/t4d/StalkPhish
5 |
6 | import sqlite3
7 | import sys
8 |
9 |
10 | class SqliteCmd(object):
11 | '''Sqlite3 DB commands'''
12 | def __init__(self, DBfile):
13 | self.conn = sqlite3.connect(DBfile)
14 | self.cur = self.conn.cursor()
15 |
16 | # Main DB operations
17 | def SQLiteCreateTable(self, TABLEname):
18 | '''Creating main Table if not exist'''
19 | self.cur.execute('CREATE TABLE IF NOT EXISTS ' + TABLEname + ' (siteURL TEXT NOT NULL PRIMARY KEY, siteDomain TEXT, IPaddress TEXT, SRClink TEXT, time TEXT, lastHTTPcode TEXT, StillInvestig TEXT, StillTryDownload TEXT, page_hash TEXT, ASN TEXT)')
20 |
21 | def SQLiteInsertPK(self, TABLEname, siteURL, siteDomain, IPaddress, SRClink, now, lastHTTPcode, ASN):
22 | '''Insert new Phishing Kit infos'''
23 | self.cur.execute('INSERT OR IGNORE INTO ' + TABLEname + ' VALUES (?,?,?,?,?,?,?,?,?,?);', (siteURL, siteDomain, IPaddress, SRClink, now, lastHTTPcode, '', '', '', ASN))
24 | self.conn.commit()
25 |
26 | def SQLiteInsertStillInvestig(self, TABLEname, siteURL):
27 | '''Insert StillInvestig changes'''
28 | try:
29 | self.cur.execute('UPDATE ' + TABLEname + ' SET StillInvestig =\'Y\' WHERE siteURL=?;', (siteURL,))
30 | self.conn.commit()
31 | except:
32 | err = sys.exc_info()
33 | print("[!!!] SQLiteInsertStillInvestig Error: " + str(err))
34 | # print("[!!!] SQLiteInsertInvestigPK Error: " + str(err))
35 |
36 | def SQLiteInsertStillTryDownload(self, TABLEname, siteURL):
37 | '''Insert StillTryDownload changes'''
38 | try:
39 | self.cur.execute('UPDATE ' + TABLEname + ' SET StillTryDownload =\'Y\' WHERE siteURL LIKE ?;', (siteURL + '%',))
40 | self.conn.commit()
41 | except:
42 | err = sys.exc_info()
43 | print("[!!!] SQLiteInsertStillTryDownload Error: " + str(err))
44 |
45 | def SQLiteInsertPageHash(self, TABLEname, siteURL, PageHash):
46 | '''Update page_hash for a site URL'''
47 | try:
48 | self.cur.execute('UPDATE ' + TABLEname + ' SET page_hash=? WHERE siteURL LIKE ?;', (PageHash, siteURL + '%'))
49 | self.conn.commit()
50 | except:
51 | err = sys.exc_info()
52 | print("[!!!] SQLiteInsertPageHash Error: " + str(err))
53 |
54 | def SQLiteVerifyEntry(self, TABLEname, siteURL):
55 | '''Verify if an entry already exists'''
56 | res = self.cur.execute('SELECT EXISTS (SELECT 1 FROM ' + TABLEname + ' WHERE siteURL LIKE ? LIMIT 1);', (siteURL + '%',))
57 | fres = res.fetchone()[0]
58 | # return 1 if a matching entry exists, else 0
59 | if fres != 0:
60 | return 1
61 | else:
62 | return 0
63 |
64 | # Investigation DB operations
65 | def SQLiteInvestigCreateTable(self, InvTABLEname):
66 | '''Creating Investigation Table if not exist'''
67 | self.cur.execute('CREATE TABLE IF NOT EXISTS ' + InvTABLEname + ' (siteURL TEXT NOT NULL PRIMARY KEY, siteDomain TEXT, IPaddress TEXT, ZipFileName TEXT, ZipFileHash TEXT, FirstSeentime TEXT, FirstSeenCode TEXT, LastSeentime TEXT, LastSeenCode TEXT, PageTitle TEXT, extracted_emails TEXT)')
68 |
69 | def SQLiteInvestigInsert(self, InvTABLEname, siteURL, siteDomain, IPaddress, now, lastHTTPcode):
70 | '''Insert new URL info into Investigation table'''
71 | try:
72 | self.cur.execute('INSERT OR IGNORE INTO ' + InvTABLEname + '(siteURL, siteDomain, IPaddress, FirstSeentime, FirstSeenCode) VALUES (?,?,?,?,?);', (siteURL, siteDomain, IPaddress, now, lastHTTPcode))
73 | self.conn.commit()
74 | except:
75 | err = sys.exc_info()
76 | print("[!!!] SQLiteInvestigInsert Error: " + str(err))
77 |
78 | def SQLiteInvestigUpdatePK(self, InvTABLEname, siteURL, ZipFileName, ZipFileHash, now, lastHTTPcode):
79 | '''Update new Phishing Kit Investigation infos'''
80 | try:
81 | self.cur.execute('UPDATE ' + InvTABLEname + ' SET ZipFileName=?, ZipFileHash=?, LastSeentime=?, LastSeenCode=? where siteURL=?;', (ZipFileName, ZipFileHash, now, lastHTTPcode, siteURL))
82 | self.conn.commit()
83 | except:
84 | err = sys.exc_info()
85 | print("[!!!] SQLiteInvestigUpdatePK Error: " + str(err))
86 |
87 | def SQLiteInvestigUpdateCode(self, InvTABLEname, siteURL, now, lastHTTPcode):
88 | '''Update new HTTP code infos in Investigation table'''
89 | self.cur.execute('UPDATE ' + InvTABLEname + ' SET LastSeentime=?, LastSeenCode=? where siteURL=?;', (now, lastHTTPcode, siteURL))
90 | self.conn.commit()
91 |
92 | def SQLiteInvestigUpdateTitle(self, InvTABLEname, siteURL, PageTitle):
93 | '''Add Page title in Investigation table'''
94 | self.cur.execute('UPDATE ' + InvTABLEname + ' SET PageTitle=? where siteURL=?;', (PageTitle, siteURL))
95 | self.conn.commit()
96 |
97 | def SQLiteInvestigState(self, TABLEname, siteURL):
98 | '''Mark URL as investigated (StillInvestig column)'''
99 | self.cur.execute('UPDATE ' + TABLEname + ' SET StillInvestig=\'Y\' where siteURL=?;', (siteURL,))
100 | self.conn.commit()
101 |
102 | def SQLiteDownloadedState(self, TABLEname, siteURL):
103 | '''Mark URL as download-attempted (StillTryDownload column)'''
104 | self.cur.execute('UPDATE ' + TABLEname + ' SET StillTryDownload=\'Y\' where siteURL=?;', (siteURL,))
105 | self.conn.commit()
106 |
107 | def SQLiteInvestigInsertEmail(self, InvTABLEname, extracted_emails, ZipFileName):
108 | self.cur.execute('UPDATE ' + InvTABLEname + ' SET extracted_emails=? where ZipFileName=?;', (extracted_emails, ZipFileName))
109 | self.conn.commit()
110 |
111 | def SQLiteInvestigVerifyEntry(self, InvTABLEname, siteDomain, IPaddress):
112 | '''Verify if an entry already exists'''
113 | res = self.cur.execute('SELECT EXISTS (SELECT 1 FROM ' + InvTABLEname + ' WHERE (siteDomain=? AND IPaddress=?)) LIMIT 1;', (siteDomain, IPaddress))
114 | fres = res.fetchone()[0]
115 | # fres is 1 if a matching entry exists, 0 otherwise
116 | if fres != 0:
117 | return 1
118 | else:
119 | return 0
120 |
121 | # Select requests
122 | def SQLiteSearch200(self, TABLEname):
123 | '''Searching for live (HTTP 200) phishing kits'''
124 | self.cur.execute('SELECT siteURL, siteDomain, IPaddress FROM ' + TABLEname + ' WHERE lastHTTPcode IS 200;')
125 | return self.cur.fetchall()
126 |
127 | def SQLiteSearchNotInvestig(self, TABLEname):
128 | '''Searching for URLs not yet investigated'''
129 | self.cur.execute('SELECT siteURL, siteDomain, IPaddress FROM ' + TABLEname + ' WHERE StillInvestig IS NOT \'Y\';')
130 | return self.cur.fetchall()
131 |
132 | def SQLiteSearchNotDownloaded(self, TABLEname):
133 | '''Searching for phishing kits not yet downloaded'''
134 | self.cur.execute('SELECT siteURL, siteDomain, IPaddress FROM ' + TABLEname + ' WHERE StillTryDownload IS NOT \'Y\';')
135 | return self.cur.fetchall()
136 |
137 | def SQLiteSearchDateString(self, TABLEname, DateString):
138 | '''Searching for a date-string fragment'''
139 | self.cur.execute('SELECT siteURL, siteDomain, IPaddress FROM ' + TABLEname + ' WHERE time LIKE ?;', ('%' + DateString + '%',))
140 | return self.cur.fetchall()
141 |
142 | def SQLiteSearchInvestigSiteURL(self, InvTABLEname, ZipFileName):
143 | '''Searching for SiteURL of downloaded phishing kit'''
144 | self.cur.execute('SELECT siteURL FROM ' + InvTABLEname + ' WHERE ZipFileName IS ?;', (ZipFileName,))
145 | return self.cur.fetchall()
146 |
147 | def __del__(self):
148 | try:
149 | self.cur.close()
150 | self.conn.close()
151 | except Exception:
152 | pass
153 |
154 | def SQLiteClose(self):
155 | self.__del__()
156 |
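The methods above all follow the same pattern: the table name is interpolated into the SQL string (it comes from the tool's own configuration, not user input), while row values go through `?` placeholders. A minimal standalone sketch of that pattern against an in-memory database, with an abridged copy of the Investigation schema — table name, URL, and timestamps here are illustrative only:

```python
# Standalone sketch of the insert/update pattern used by the DB methods above.
# Uses an in-memory SQLite database; not part of StalkPhish itself.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Abridged version of the Investigation table schema
cur.execute('CREATE TABLE IF NOT EXISTS Investig '
            '(siteURL TEXT NOT NULL PRIMARY KEY, siteDomain TEXT, '
            'IPaddress TEXT, FirstSeentime TEXT, FirstSeenCode TEXT, '
            'LastSeentime TEXT, LastSeenCode TEXT)')

# INSERT OR IGNORE: re-inserting the same primary key is a silent no-op,
# so repeated sightings of the same URL never create duplicate rows
row = ('http://phish.example/login', 'phish.example', '192.0.2.1',
       'Mon Jan  1 00:00:00 2024', '200')
cur.execute('INSERT OR IGNORE INTO Investig '
            '(siteURL, siteDomain, IPaddress, FirstSeentime, FirstSeenCode) '
            'VALUES (?,?,?,?,?)', row)
cur.execute('INSERT OR IGNORE INTO Investig '
            '(siteURL, siteDomain, IPaddress, FirstSeentime, FirstSeenCode) '
            'VALUES (?,?,?,?,?)', row)

# UPDATE keyed on siteURL, as in SQLiteInvestigUpdateCode
cur.execute('UPDATE Investig SET LastSeentime=?, LastSeenCode=? WHERE siteURL=?',
            ('Tue Jan  2 00:00:00 2024', '404', row[0]))
conn.commit()

count = cur.execute('SELECT COUNT(*) FROM Investig').fetchone()[0]
last = cur.execute('SELECT LastSeenCode FROM Investig WHERE siteURL=?',
                   (row[0],)).fetchone()[0]
print(count, last)  # one row despite two inserts; LastSeenCode updated
conn.close()
```

Using `INSERT OR IGNORE` plus a `siteURL` primary key is what lets the crawler record a first-seen row once and then only touch the `LastSeen*` columns on later visits.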
--------------------------------------------------------------------------------
/stalkphish/tools/sqlitecreate.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python3
2 | # -*- coding: utf-8 -*-
3 |
4 | import sys
5 |
6 |
7 | class SqliteCreate:
8 | '''Sqlite3 DB creation'''
9 | def __init__(self, DBfile):
10 | try:
11 | open(DBfile, 'w+').close()  # 'w+' creates (and truncates) the DB file; close the handle
12 | except Exception:
13 | err = sys.exc_info()
14 | print("[!!!] SqliteCreate Error: " + str(err))
15 |
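Worth noting: `sqlite3.connect()` creates a missing database file on its own, so pre-creating an empty file mainly pins down the path (and `'w+'` truncates any existing database). A small sketch of that behavior, using a throwaway temp directory and a made-up file name:

```python
# sqlite3.connect() creates the database file if it does not exist,
# so an explicit open(DBfile, 'w+') is only needed to truncate an old DB.
import os
import sqlite3
import tempfile

dbfile = os.path.join(tempfile.mkdtemp(), "stalkphish-demo.sqlite3")
assert not os.path.exists(dbfile)   # nothing there yet
conn = sqlite3.connect(dbfile)      # file is created here
conn.close()
print(os.path.exists(dbfile))
```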
--------------------------------------------------------------------------------
/stalkphish/tools/utils.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python3
2 | # -*- coding: utf-8 -*-
3 |
4 | # This file is part of StalkPhish - see https://github.com/t4d/StalkPhish
5 |
6 | import os
7 | import re
8 | import sys
9 | import zipfile
10 | import datetime
11 | import hashlib
12 | import random
13 | from ipwhois.net import Net
14 | from ipwhois.asn import IPASN
15 | import json
16 |
17 |
18 | class TimestampNow:
19 | '''Generate Timestamp'''
20 | def Timestamp(self):
21 | now = datetime.datetime.now().strftime("%c")
22 | return now
23 |
24 |
25 | class VerifyPath:
26 | '''Verify a path, creating it if it does not exist'''
27 | def VerifyOrCreate(self, path):
28 | try:
29 | os.makedirs(path, mode=0o777, exist_ok=True)
30 | except FileExistsError:
31 | pass
32 | except Exception:
33 | err = sys.exc_info()
34 | print("[!!!] VerifyPath class Error: " + str(err))
35 |
36 |
37 | class SHA256:
38 | '''Generate sha256 hash of a file'''
39 | def hashFile(self, filename, block_size=65536):
40 | h = hashlib.sha256()
41 | try:
42 | with open(filename, 'rb') as f:
43 | buf = f.read(block_size)
44 | while len(buf) > 0:
45 | h.update(buf)
46 | buf = f.read(block_size)
47 | filehash = h.hexdigest()
48 | return filehash
49 | except Exception:
50 | err = sys.exc_info()
51 | print("[!!!] Error in hashFile Class: " + str(err))
52 |
53 |
54 | class UAgent:
55 | '''Choose a random user-agent from a file'''
56 | def ChooseUA(self, UAfile):
57 | try:
58 | with open(UAfile, 'rb') as f:
59 | UA = random.choice(list(f)).strip().decode("utf-8")
60 | return UA
61 | except Exception:
62 | err = sys.exc_info()
63 | print("[!!!] Problem with UserAgent Class: " + str(err))
64 |
65 |
66 | class NetInfo:
67 | '''Retrieve network information'''
68 | def GetASN(self, IPaddress):
69 | '''Retrieve AS Number of an IP address'''
70 | try:
71 | if IPaddress:
72 | net = Net(IPaddress)
73 | obj = IPASN(net)
74 | res = obj.lookup()
75 | IPasn = json.dumps(res["asn"])
76 | else:
77 | IPasn = None
78 | return IPasn
79 | except Exception:
80 | err = sys.exc_info()
81 | print("[!!!] Problem with NetInfo Class: " + str(err))
82 |
83 |
84 | class ZipSearch:
85 | '''Search for e-mail addresses into Zip file'''
86 | def PKzipSearch(self, InvTABLEname, SQL, LOG, DLDir, savefile):
87 | try:
88 | if zipfile.is_zipfile(savefile):
89 | with zipfile.ZipFile(savefile, "r") as pkzip:
90 | extracted_emails = []
91 | for name in pkzip.namelist():
92 | # only inspect source/config files (.php, .ini)
93 | if re.search(r'\.(php|ini)$', name):
94 | scam_emails = re.findall(r'[\w\.-]+@[\w\.-]+\.\w+', str(pkzip.read(name)))
95 | for mailadd in scam_emails:
96 | if mailadd not in extracted_emails:
97 | extracted_emails.append(mailadd)
98 | # Extracted scammers' emails
99 | if extracted_emails:
100 | return [extracted_emails]
101 | else:
102 | LOG.info("No emails in this kit")
103 | return None
104 | else:
105 | LOG.info("{} is not a zip file...".format(savefile))
106 | except Exception as e:
107 | print("[!!!] Problem with PKzipSearch Class: " + str(e))
108 |
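The `PKzipSearch` idea — walk a zip's members, keep only the `.php`/`.ini` files, and harvest e-mail addresses with a regex — can be exercised standalone. The archive below is built in memory, and its member names and contents are made up for illustration:

```python
# Standalone sketch of the PKzipSearch technique: build a small zip in
# memory, filter members by extension, extract and de-duplicate e-mails.
import io
import re
import zipfile

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("kit/send.php", '$to = "scammer@example.com";')
    zf.writestr("kit/readme.txt", "contact: ignored@example.com")

extracted_emails = []
with zipfile.ZipFile(buf, "r") as zf:
    for name in zf.namelist():
        if re.search(r'\.(php|ini)$', name):   # source/config files only
            for mail in re.findall(r'[\w\.-]+@[\w\.-]+\.\w+',
                                   zf.read(name).decode()):
                if mail not in extracted_emails:
                    extracted_emails.append(mail)

print(extracted_emails)  # the .txt member is skipped entirely
```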
--------------------------------------------------------------------------------
/stalkphish/useragent_list.txt:
--------------------------------------------------------------------------------
1 | Mozilla/5.0 (Linux; Android 5.1.1; SM-J320FN Build/LMY47V) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.107 Mobile Safari/537.36
2 | Mozilla/5.0 (Linux; Android 6.0; ALE-L21 Build/HuaweiALE-L21) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.116 Mobile Safari/537.36
3 | Mozilla/5.0 (Windows NT 10.0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.113 Safari/537.36
4 | Mozilla/5.0 (Windows NT 6.2; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.113 Safari/537.36
5 | Mozilla/5.0 (Windows NT 6.2; WOW64; rv:55.0) Gecko/20100101 Firefox/55.0
6 | Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0)
7 | Mozilla/5.0 (Linux; Android 5.1.1; SAMSUNG SM-G361F-ORANGE Build/LMY48B) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/3.3 Chrome/38.0.2125.102 Mobile Safari/537.36
8 | Mozilla/5.0 (Linux; Android 6.0; M50 Build/MRA58K) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/61.0.3163.98 Mobile Safari/537.36
9 | Mozilla/5.0 (Linux; Android 7.0; SAMSUNG SM-J530F Build/NRD90M) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/5.4 Chrome/51.0.2704.106 Mobile Safari/537.36
10 | Mozilla/5.0 (Linux; Android 7.0; SAMSUNG SM-T580 Build/NRD90M) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/5.2 Chrome/51.0.2704.106 Safari/537.36
11 | Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/61.0.3163.91 Safari/537.36
12 | Mozilla/5.0 (Windows NT 6.1; WOW64; rv:47.0) Gecko/20100101 Firefox/47.0
13 | Mozilla/5.0 (iPhone; CPU iPhone OS 11_0_2 like Mac OS X) AppleWebKit/604.1.38 (KHTML, like Gecko) Version/11.0 Mobile/15A421 Safari/604.1
14 | Mozilla/5.0 (Linux; Android 5.1.1; SAMSUNG SM-G531F Build/LMY48B) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/3.3 Chrome/38.0.2125.102 Mobile Safari/537.36
15 | Mozilla/5.0 (Linux; Android 7.0; SAMSUNG SM-G955F Build/NRD90M) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/5.4 Chrome/51.0.2704.106 Mobile Safari/537.36
16 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_6) AppleWebKit/601.7.7 (KHTML, like Gecko) Version/9.1.2 Safari/601.7.7
17 | Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; NP06; rv:11.0) like Gecko
18 | Mozilla/5.0 (Windows Phone 10.0; Android 6.0.1; Microsoft; Lumia 550) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/52.0.2743.116 Mobile Safari/537.36 Edge/15.15063
19 | Mozilla/5.0 (iPhone; CPU iPhone OS 10_1_1 like Mac OS X) AppleWebKit/602.2.14 (KHTML, like Gecko) Version/10.0 Mobile/14B100 Safari/602.1
20 | Mozilla/5.0 (Linux; Android 6.0.1; SAMSUNG SM-J510FN Build/MMB29M) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/5.4 Chrome/51.0.2704.106 Mobile Safari/537.36
21 | Mozilla/5.0 (Linux; Android 7.0; SAMSUNG SM-J330FN Build/NRD90M) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/5.4 Chrome/51.0.2704.106 Mobile Safari/537.36
22 | Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/52.0.2743.116 Safari/537.36
23 | Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; MATM; rv:11.0) like Gecko
24 | Mozilla/5.0 (Windows NT 6.3; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.90 Safari/537.36 OPR/47.0.2631.71
25 | Mozilla/5.0 (Windows NT 6.3; Win64; x64; Trident/7.0; rv:11.0) like Gecko
26 | Mozilla/5.0 (Linux; Android 5.1.1; SAMSUNG SM-J320F Build/LMY47V) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/5.4 Chrome/51.0.2704.106 Mobile Safari/537.36
27 | Mozilla/5.0 (Linux; Android 7.0; SM-G935F Build/NRD90M) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.116 Mobile Safari/537.36
28 | Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:55.0) Gecko/20100101 Firefox/55.0
29 | Mozilla/5.0 (Linux; Android 5.1.1; SAMSUNG SM-J320FN Build/LMY47V) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/5.4 Chrome/51.0.2704.106 Mobile Safari/537.36
30 | Mozilla/5.0 (Linux; Android 7.0; SAMSUNG SM-A520F Build/NRD90M) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/5.4 Chrome/51.0.2704.106 Mobile Safari/537.36
31 | Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/61.0.3163.100 Safari/537.36
32 | Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:54.0) Gecko/20100101 Firefox/54.0
33 | Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.90 Safari/537.36
34 | Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36
35 | Mozilla/4.0 (compatible; MSIE 4.01; Mac_PowerPC)
36 | Mozilla/5.0 (Linux; Android 6.0.1; SAMSUNG SM-A520F Build/MMB29K) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/5.4 Chrome/51.0.2704.106 Mobile Safari/537.36
37 | Mozilla/5.0 (Linux; Android 7.0; SM-G930F Build/NRD90M) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.116 Mobile Safari/537.36
38 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_5) AppleWebKit/603.3.8 (KHTML, like Gecko) Version/10.1.2 Safari/603.3.8
39 | Mozilla/5.0 (compatible, MSIE 11, Windows NT 6.3; Trident/7.0; rv:11.0) like Gecko
40 | Mozilla/5.0 (Linux; Android 7.0; SAMSUNG SM-A310F Build/NRD90M) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/5.4 Chrome/51.0.2704.106 Mobile Safari/537.36
41 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_6) AppleWebKit/603.3.8 (KHTML, like Gecko) Version/10.1.2 Safari/603.3.8
42 | Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/59.0.3071.115 Safari/537.36
43 | Mozilla/5.0 (Linux; Android 6.0.1; SM-A310F Build/MMB29K; wv) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/61.0.3163.98 Mobile Safari/537.36
44 | Mozilla/5.0 (Windows NT 6.1; Win64; x64)
45 | Mozilla/5.0 (Windows NT 6.1; rv:54.0) Gecko/20100101 Firefox/54.0
46 | Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.101 Safari/537.36
47 | Mozilla/5.0 (Linux; Android 5.1.1; SAMSUNG SM-G531F Build/LMY48B) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/5.4 Chrome/51.0.2704.106 Mobile Safari/537.36
48 | Mozilla/5.0 (Linux; U; Android 4.4.2; fr-fr; GT-P5210 Build/KOT49H) AppleWebKit/534.30 (KHTML, like Gecko) Version/4.0 Safari/534.30
49 | Mozilla/5.0 (iPhone; CPU iPhone OS 10_2 like Mac OS X) AppleWebKit/602.3.12 (KHTML, like Gecko) Version/10.0 Mobile/14C92 Safari/602.1
50 | Mozilla/5.0 (Linux; Android 6.0.1; SAMSUNG SM-T550 Build/MMB29M) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/4.0 Chrome/44.0.2403.133 Safari/537.36
51 | Mozilla/5.0 (Linux; Android 5.1.1; SAMSUNG SM-J320FN-ORANGE Build/LMY47V) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/3.5 Chrome/38.0.2125.102 Mobile Safari/537.36
52 | Mozilla/5.0 (Linux; Android 5.1.1; SM-G531F Build/LMY48B) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/61.0.3163.98 Mobile Safari/537.36
53 | Mozilla/5.0 (Windows NT 10.0; rv:55.0) Gecko/20100101 Firefox/55.0
54 | Mozilla/5.0 (Linux; Android 4.1.2; GT-S7390G Build/JZO54K) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.81 Mobile Safari/537.36
55 | Mozilla/5.0 (Linux; Android 6.0.1; SM-J510FN Build/MMB29M) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.116 Mobile Safari/537.36
56 | Mozilla/5.0 (X11; Ubuntu; Linux i686; rv:24.0) Gecko/20100101 Firefox/24.0
57 | Mozilla/5.0 (iPhone; CPU iPhone OS 11_0 like Mac OS X) AppleWebKit/604.1.38 (KHTML, like Gecko) Version/11.0 Mobile/15A372 Safari/604.1
58 | Mozilla/5.0 (Linux; Android 6.0.1; ASUS_X007D Build/MMB29M) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/61.0.3163.98 Mobile Safari/537.36
59 | Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.113 Safari/537.36
60 | Mozilla/5.0 (Linux; Android 6.0.1; SM-A500FU Build/MMB29M) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/61.0.3163.98 Mobile Safari/537.36
61 | Mozilla/5.0 (Linux; Android 6.0; F3311 Build/37.0.A.2.156) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/61.0.3163.98 Mobile Safari/537.36
62 | Mozilla/5.0 (Linux; Android 6.0; L-ITE 552 HD Build/MRA58K) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/61.0.3163.98 Mobile Safari/537.36
63 | Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.113 Safari/537.36
64 | Mozilla/5.0 (Windows NT 6.3; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.78 Safari/537.36 OPR/47.0.2631.55
65 | Mozilla/5.0 (iPhone; CPU iPhone OS 10_0_2 like Mac OS X) AppleWebKit/602.1.50 (KHTML, like Gecko) Version/10.0 Mobile/14A456 Safari/602.1
66 | Mozilla/5.0 (Linux; Android 7.0; SAMSUNG SM-G925F Build/NRD90M) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/5.4 Chrome/51.0.2704.106 Mobile Safari/537.36
67 | Mozilla/5.0 (compatible; MSIE 10.0; Windows NT 6.1; Trident/6.0)
68 | Mozilla/5.0 (Windows NT 6.3; WOW64; rv:54.0) Gecko/20100101 Firefox/54.0
69 | Mozilla/5.0 (Linux; Android 5.1.1; SM-J320FN Build/LMY47V) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/59.0.3071.125 Mobile Safari/537.36
70 | Mozilla/5.0 (Linux; Android 6.0.1; SAMSUNG SM-A320FL Build/MMB29K) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/5.4 Chrome/51.0.2704.106 Mobile Safari/537.36
71 | Mozilla/5.0 (Linux; Android 5.1.1; SAMSUNG SM-J320FN Build/LMY47V) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/3.5 Chrome/38.0.2125.102 Mobile Safari/537.36
72 | Mozilla/5.0 (Linux; Android 5.1.1; SM-G531F Build/LMY48B) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.116 Mobile Safari/537.36
73 | Mozilla/5.0 (Linux; Android 5.1.1; SM-J320FN Build/LMY47V) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.116 Mobile Safari/537.36
74 | Mozilla/5.0 (Linux; U; Android 4.3; fr-fr; GT-I9300 Build/JSS15J) AppleWebKit/534.30 (KHTML, like Gecko) Version/4.0 Mobile Safari/534.30
75 | Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:55.0) Gecko/20100101 Firefox/55.0
76 | Mozilla/5.0 (Windows NT 6.3; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.90 Safari/537.36
77 | Mozilla/5.0 (iPhone; CPU iPhone OS 10_3_1 like Mac OS X) AppleWebKit/603.1.30 (KHTML, like Gecko) Version/10.0 Mobile/14E304 Safari/602.1
78 | Mozilla/5.0 (Linux; Android 6.0.1; SAMSUNG SM-A310F Build/MMB29K) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/5.4 Chrome/51.0.2704.106 Mobile Safari/537.36
79 | Mozilla/5.0 (Linux; Android 7.0; SAMSUNG SM-G950F Build/NRD90M) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/5.4 Chrome/51.0.2704.106 Mobile Safari/537.36
80 | Mozilla/5.0 (Linux; Android 7.0; SAMSUNG SM-G920F Build/NRD90M) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/5.4 Chrome/51.0.2704.106 Mobile Safari/537.36
81 | Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.101 Safari/537.36
82 | Mozilla/5.0 (iPad; CPU OS 10_3_2 like Mac OS X) AppleWebKit/603.2.4 (KHTML, like Gecko) Version/10.0 Mobile/14F89 Safari/602.1
83 | Mozilla/5.0 (Linux; Android 7.0; SAMSUNG SM-G935F Build/NRD90M) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/5.4 Chrome/51.0.2704.106 Mobile Safari/537.36
84 | Mozilla/5.0 (iPhone; CPU iPhone OS 7_1_2 like Mac OS X) AppleWebKit/537.51.2 (KHTML, like Gecko) Version/7.0 Mobile/11D257 Safari/9537.53
85 | Mozilla/5.0 (Windows NT 6.1; rv:55.0) Gecko/20100101 Firefox/55.0
86 | Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.113 Safari/537.36
87 | Mozilla/5.0 (Windows NT 6.3; WOW64; Trident/7.0; rv:11.0) like Gecko
88 | Mozilla/5.0 (Windows NT 6.3; WOW64; rv:55.0) Gecko/20100101 Firefox/55.0
89 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_6) AppleWebKit/603.3.8 (KHTML, like Gecko) Version/10.1.2 Safari/603.3.8
90 | Mozilla/5.0 (Windows NT 6.0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/49.0.2623.112 Safari/537.36
91 | Mozilla/5.0 (Windows NT 6.0; rv:52.0) Gecko/20100101 Firefox/52.0
92 | Mozilla/5.0 (iPhone; CPU iPhone OS 10_2_1 like Mac OS X) AppleWebKit/602.4.6 (KHTML, like Gecko) Version/10.0 Mobile/14D27 Safari/602.1
93 | Mozilla/5.0 (Linux; Android 7.0; SAMSUNG SM-G930F Build/NRD90M) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/5.4 Chrome/51.0.2704.106 Mobile Safari/537.36
94 | Mozilla/5.0 (Windows NT 6.3; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.113 Safari/537.36
95 | Mozilla/5.0 (iPhone; CPU iPhone OS 9_3_5 like Mac OS X) AppleWebKit/601.1.46 (KHTML, like Gecko) Version/9.0 Mobile/13G36 Safari/601.1
96 | Mozilla/5.0 (Windows NT 5.1; rv:52.0) Gecko/20100101 Firefox/52.0
97 | Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.90 Safari/537.36
98 | Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.79 Safari/537.36 Edge/14.14393
99 | Mozilla/5.0 (Windows NT 6.1; WOW64; rv:54.0) Gecko/20100101 Firefox/54.0
100 | Mozilla/5.0 (Windows NT 5.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/49.0.2623.112 Safari/537.36
101 | Mozilla/5.0 (Windows NT 6.1; Trident/7.0; rv:11.0) like Gecko
102 | Mozilla/5.0 (Windows NT 10.0; WOW64; rv:54.0) Gecko/20100101 Firefox/54.0
103 | Mozilla/5.0 (iPhone; CPU iPhone OS 11_0_1 like Mac OS X) AppleWebKit/604.1.38 (KHTML, like Gecko) Version/11.0 Mobile/15A402 Safari/604.1
104 | Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.90 Safari/537.36
105 | Mozilla/5.0 (iPad; CPU OS 9_3_5 like Mac OS X) AppleWebKit/601.1.46 (KHTML, like Gecko) Version/9.0 Mobile/13G36 Safari/601.1
106 | Mozilla/5.0 (Windows NT 6.1; WOW64; rv:55.0) Gecko/20100101 Firefox/55.0
107 | Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_1) AppleWebKit/601.2.4 (KHTML, like Gecko) Version/9.0.1 Safari/601.2.4 facebookexternalhit/1.1 Facebot Twitterbot/1.0
108 | Mozilla/5.0 (iPhone; CPU iPhone OS 10_3_2 like Mac OS X) AppleWebKit/603.2.4 (KHTML, like Gecko) Version/10.0 Mobile/14F89 Safari/602.1
109 | Mozilla/5.0 (Windows NT 10.0; WOW64; Trident/7.0; rv:11.0) like Gecko
110 | Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.113 Safari/537.36
111 | Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/52.0.2743.116 Safari/537.36 Edge/15.15063
112 | Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; rv:11.0) like Gecko
113 | Mozilla/5.0 (iPad; CPU OS 10_3_3 like Mac OS X) AppleWebKit/603.3.8 (KHTML, like Gecko) Version/10.0 Mobile/14G60 Safari/602.1
114 | Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.113 Safari/537.36
115 | Mozilla/5.0 (Windows NT 10.0; WOW64; rv:55.0) Gecko/20100101 Firefox/55.0
116 | Mozilla/5.0 (iPhone; CPU iPhone OS 10_3_3 like Mac OS X) AppleWebKit/603.3.8 (KHTML, like Gecko) Version/10.0 Mobile/14G60 Safari/602.1
117 |
--------------------------------------------------------------------------------