├── README.md
├── clamd
├── clamd.md
└── pyclamd-test.py
├── dnsbl
└── dnsbl.py
├── docs
├── clamd-installation.md
├── install-notes.txt
├── notes.txt
├── pestudio.md
└── yara-installation.md
├── dumpbin
├── dumpbin.conf
├── dumpbin.md
└── dumpbin.py
├── hash-reputation
├── hash-services.py
└── hash-services.yaml
├── misc
└── extract-domains.py
├── pe-studio
├── pestudio.md
├── pestudio.py
└── test.xml
├── reputation-services
├── hash_services.py
├── main.py
├── services-settings.yaml
├── shodan_services.py
└── virustotal_services.py
├── site-reputation
├── alienvault-otx-check.py
├── bluecoat-site-reputation.py
└── shodan-check.py
├── static-analysis
├── malware-static-analysis.py
└── yara_checks.py
└── urlvoid-ipvoid-checks
├── ip_domain_check.py
└── settings.yaml
/README.md:
--------------------------------------------------------------------------------
1 | # malware-static-analyzer
2 |
3 | A malware analyzer written in Python 2.x for detecting malicious files.
4 |
5 | ### Features
6 | * Detects IP addresses and checks whether they are blacklisted using VirusTotal and IPVoid (IP reputation checks)
7 | * Detects domains and checks whether they are blacklisted in databases such as VirusTotal and URLVoid (domain reputation checks)
8 | * Searches for possible e-mail addresses (e-mail reputation checks using SpamAssassin)
9 | * Gets results from the VirusTotal database (VirusTotal integration using the VirusTotal Public API)
10 | * Checks whether the file is packed with a software packer
11 | * YARA rule-based checks for viruses, exploits, web shells, anti-debug functionality, etc.
12 | * Analyzes the PE file header and sections (number of sections, entropy of sections, suspicious section names, suspicious flags in the characteristics of the PE file, etc.)
13 | * Detection of anti-virtualization techniques
14 | * Detection of Windows API calls commonly used by malware
15 | * JSON-based report
16 | * Checks for viruses and spyware using clamd and pyclamd
17 | * PEStudio integration and extraction of malicious indicators from the PEStudio report
18 | * Checks for compiler security flags in EXE/DLL files; most reputable programs enable these flags:
19 | * Dynamic base (ASLR)
20 | * NX compatible (DEP)
21 | * Guard (CFG)
22 | * (Look for them in the optional header values section.)
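The three flags above are just bit masks in the optional header's `DllCharacteristics` word, so the check itself is tiny. A minimal sketch (the `pefile` access shown in the comment is an assumption about your setup, not part of this repo):

```python
# IMAGE_DLLCHARACTERISTICS_* bit masks from the PE/COFF specification.
SECURITY_FLAGS = {
    0x0040: "Dynamic base (ASLR)",
    0x0100: "NX compatible (DEP)",
    0x4000: "Guard (CFG)",
}

def decode_security_flags(dll_characteristics):
    """Return the security-relevant flag names set in a DllCharacteristics word."""
    return [name for mask, name in sorted(SECURITY_FLAGS.items())
            if dll_characteristics & mask]

# With pefile installed, the word would come from:
#   pe = pefile.PE(path)
#   decode_security_flags(pe.OPTIONAL_HEADER.DllCharacteristics)
print(decode_security_flags(0x4140))
```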
23 |
24 |
25 | ### Usage
26 |
27 | ### To do
28 | * Clamd integration
29 | * VirusTotal integration - file report (hash report), IP report, URL report, domain report
30 | * Analyze ELF files for Linux malware analysis using tools such as ldd, readelf, strings, etc.
31 | * Find strings in PE files using the Sysinternals 'strings' utility (https://docs.microsoft.com/en-us/sysinternals/downloads/strings)
32 | * Check if an IP address or domain is listed on DNSBL servers like Spamhaus, etc.
33 | * PE studio professional for initial malware assessment - purchase license - https://www.winitor.com/tools/pestudio/current/pestudio.zip
34 | * CISCO threat intelligence search - https://talosintelligence.com/reputation_center/lookup?search=igcar.gov.in
35 |
36 | ### Many thanks to the wonderful people behind the following projects:
37 | * https://github.com/secrary/SSMA
38 | * https://github.com/ClickSecurity/data_hacking/blob/master/pefile_classification/pe_features.py#L317
39 | * https://github.com/hiddenillusion/AnalyzePE/blob/master/AnalyzePE.py
40 | * https://github.com/Ice3man543/MalScan/blob/master/malscan.py
41 |
42 | ### Clamd and pyclamd installation
43 | * https://gist.github.com/AfroThundr3007730/91a3e2cbfc848088b70d731133ff3f2a
44 | * https://linux-audit.com/install-clamav-on-centos-7-using-freshclam/
45 | * https://geekdecoder.com/clamav-on-centos-6/
46 | * https://www.decalage.info/python/pyclamd
47 | * https://www.moshe-schmidt.de/linux/clamav-permission-denied-how-to-fix-it/
48 | * https://frankfu.click/web-develop/python/autoadmin-chapter4-python-and-security.html
49 |
--------------------------------------------------------------------------------
/clamd/clamd.md:
--------------------------------------------------------------------------------
1 | ## Clamd and pyclamd installation on CentOS 7.0 or later
2 | #### Enable EPEL repository:
3 | ```
4 | [root@joshi]# yum install epel-release
5 | [root@joshi]# yum install clamav clamd clamav-data
6 | [root@joshi]# yum install clamav-scanner
7 | ```
8 | Typical rpms you will find on the system:
9 | ```
10 | [root@joshi]# rpm -qa |grep clam
11 | clamav-server-0.99.4-1.el7.x86_64
12 | clamav-scanner-0.99.4-1.el7.noarch
13 | clamav-scanner-systemd-0.99.4-1.el7.noarch
14 | clamav-data-0.99.4-1.el7.noarch
15 | clamav-filesystem-0.99.4-1.el7.noarch
16 | clamav-0.99.4-1.el7.x86_64
17 | clamav-lib-0.99.4-1.el7.x86_64
18 | clamav-server-systemd-0.99.4-1.el7.noarch
19 | ```
20 | #### Write logs to a separate file and change its owner to clamscan user:
21 | ```
22 | [root@joshi]# touch /var/log/clamd.scan
23 | [root@joshi]# chown clamscan:clamscan /var/log/clamd.scan
24 | ```
25 | #### Create unix socket for communication
26 | ```
27 | [root@joshi]# touch /var/run/clamd.scan/clamd.sock
28 | [root@joshi]# chown clamscan:clamscan /var/run/clamd.scan/clamd.sock
29 | [root@joshi]# ls -l /var/run/clamd.scan/clamd.sock
30 | srw-rw-rw- 1 clamscan clamscan 0 May 14 13:05 /var/run/clamd.scan/clamd.sock
31 | ```
32 | Now, rename clamd services:
33 | ```
34 | [root@joshi]# ls -l /usr/lib/systemd/system/clam*
35 | -rw-r--r-- 1 root root 135 May 14 13:04 /usr/lib/systemd/system/clamd@scan.service
36 | -rw-r--r-- 1 root root 217 May 14 13:03 /usr/lib/systemd/system/clamd@.service
37 |
38 | # mv /usr/lib/systemd/system/clamd@scan.service /usr/lib/systemd/system/clamdscan.service
39 | # mv /usr/lib/systemd/system/clamd@.service /usr/lib/systemd/system/clamd.service
40 | ```
41 | Do not forget to change the service name inside clamdscan.service (remove the '@' from the service name).
42 | #### Clamdscan service
43 | ```
44 | [root@joshi]# cat /usr/lib/systemd/system/clamdscan.service
45 | .include /lib/systemd/system/clamd.service
46 |
47 | [Unit]
48 | Description = Generic clamav scanner daemon
49 |
50 | [Install]
51 | WantedBy = multi-user.target
52 | ```
53 | #### Clamd service
54 | ```
55 | [root@joshi]# cat /usr/lib/systemd/system/clamd.service
56 | [Unit]
57 | Description = clamd scanner daemon
58 | After = syslog.target nss-lookup.target network.target
59 |
60 | [Service]
61 | Type = forking
62 | ExecStart = /usr/sbin/clamd -c /etc/clamd.d/scan.conf
63 | Restart = on-failure
64 | PrivateTmp = true
65 | ```
66 | ### enable and start clamd services
67 | ```
68 | # systemctl enable clamdscan.service
69 | # systemctl start clamdscan.service
70 | ## systemctl enable clamd.service -- this step is not required
71 | # systemctl start clamd.service
72 | ```
73 | ### Typical Clamd configuration file:
74 | ```
75 | [root@joshi]# cat /etc/clamd.d/scan.conf |grep -v ^#|grep -v ^$
76 | LogFile /var/log/clamd.scan
77 | LogSyslog yes
78 | LocalSocket /var/run/clamd.scan/clamd.sock
79 | User clamscan
80 | AllowSupplementaryGroups yes
81 | ```
82 | Note: if the above file contains a line reading 'Example', comment it out with '#' or remove it.
83 |
84 | ### Now, scan home directory for any viruses:
85 | ```
86 | [root@joshi]# clamscan -r /home/joshi/
87 | ```
88 | ### Check logs by default under /var/log/messages:
89 | ```
90 | [root@joshi]# cat /var/log/messages
91 | ```
92 | Typical errors that you might encounter while installing clamd are:
93 | ### lstat() failed: Permission denied. ERROR
94 | ```
95 | [root@joshi]# clamdscan -c /etc/clamd.d/scan.conf clamd.conf
96 | /home/joshi/clamd.conf: lstat() failed: Permission denied. ERROR
97 |
98 | ----------- SCAN SUMMARY -----------
99 | Infected files: 0
100 | Total errors: 1
101 | Time: 0.000 sec (0 m 0 s)
102 |
103 | This happens when directory or file permissions are not OK. e.g.
104 |
105 | [root@joshi]# ls -l /home
106 | total 4
107 | drwx------ 9 joshi email 4096 May 14 12:44 joshi
108 | ```
109 | Remember that there are basically two commands: clamscan and clamdscan.
110 | * clamdscan talks to the clamd daemon, which runs as the clamscan user - that user's permissions come into play here!
111 | * clamscan runs as a normal program on the host machine.
112 |
113 | ### Install python client for clamd - pyclamd
114 | ```
115 | [root@joshi]# easy_install --index-url=http://osrepo.gov.in/pypi/simple pip
116 | [root@joshi]# pip3 install pyclamd --trusted-host=osrepo.gov.in
117 | ```
118 |
--------------------------------------------------------------------------------
/clamd/pyclamd-test.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | import pyclamd
3 | import logging
4 | import sys
5 | 
6 | # set up logging
7 | logging.basicConfig(level=logging.INFO)
8 | log = logging.getLogger(__name__)
9 | 
10 | clam_instance = None
11 | try:
12 |     clam_instance = pyclamd.ClamdUnixSocket('/var/run/clamd.scan/clamd.sock')
13 |     clam_instance.ping()
14 | except Exception as e:
15 |     log.error("Error while establishing connection with the clamd daemon: %s", e)
16 |     sys.exit(1)
17 | 
18 | r = clam_instance.scan_file('/home/joshi/bootstrap-4.0.0-dist.zip')
19 | # If no virus is found, the response is None; otherwise virus information is returned.
20 | print(r is None)
21 | 
22 | # EICAR test: write the EICAR test string to a file and scan it.
23 | with open('/home/joshi/eicar', 'wb') as fh:
24 |     fh.write(clam_instance.EICAR())
25 | 
26 | r = clam_instance.scan_file('/home/joshi/eicar')
27 | # {'/home/joshi/eicar': ('FOUND', 'Eicar-Test-Signature')}
28 | 
29 | print(r['/home/joshi/eicar'])
30 | # ('FOUND', 'Eicar-Test-Signature')
31 | 
32 | print(r['/home/joshi/eicar'][0])
33 | # 'FOUND'
34 |
35 |
--------------------------------------------------------------------------------
/dnsbl/dnsbl.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 |
3 | import gevent
4 | # gevent's cooperative socket; re-importing the stdlib socket here would shadow it and make lookups blocking
5 | from gevent import socket
6 |
7 | DNSBL_servers = [
8 | 'cbl.abuseat.org',
9 | 'zen.spamhaus.org',
10 | 'bogons.cymru.com',
11 | 'bl.spamcop.net',
12 | 'aspews.ext.sorbs.net',
13 | 'b.barracudacentral.org',
14 | # 'blacklist.woody.ch',
15 | # 'combined.abuse.ch',
16 | # 'dnsbl.ahbl.org',
17 | # 'dnsbl.inps.de',
18 | # 'dnsbl.njabl.org',
19 | # 'dnsbl.sorbs.net',
20 | # 'drone.abuse.ch',
21 | # 'duinv.aupads.org',
22 | # 'http.dnsbl.sorbs.net'
23 | # 'ips.backscatterer.org',
24 | # 'misc.dnsbl.sorbs.net',
25 | # 'orvedb.aupads.org',
26 | # 'pbl.spamhaus.org',
27 | # 'sbl.spamhaus.org',
28 | # 'short.rbl.jp',
29 | # 'smtp.dnsbl.sorbs.net',
30 | # 'socks.dnsbl.sorbs.net',
31 | # 'spam.abuse.ch',
32 | # 'spam.dnsbl.sorbs.net',
33 | # 'spamrbl.imp.ch',
34 | # 'web.dnsbl.sorbs.net',
35 | # 'wormrbl.imp.ch',
36 | # 'xbl.spamhaus.org',
37 | ]
38 |
39 | class DNSBL_check():
40 |     """A DNSBL class for checking the existence of an IP in DNSBL databases."""
41 |
42 |     def __init__(self, ip=None, timeout=3):
43 | self.ip = ip
44 | self.dnsbl_servers = DNSBL_servers
45 | self.timeout = timeout
46 |
47 | def form_query(self, dnsbl_server):
48 | reversed_ip = '.'.join(reversed(self.ip.split('.')))
49 | return '{reversed_ip}.{server}.'.format(reversed_ip=reversed_ip, server=dnsbl_server)
50 |
51 | def query(self, link):
52 | try:
53 | result = socket.gethostbyname(self.form_query(link))
54 | except Exception:
55 | result = False
56 | return link, result
57 |
58 | def check(self):
59 | results = []
60 | dnsbl_checks = [gevent.spawn(self.query, server_link) for server_link in self.dnsbl_servers]
61 | gevent.joinall(dnsbl_checks, self.timeout)
62 | for item in dnsbl_checks:
63 | if item.successful():
64 | results.append(item.value)
65 | else:
66 | results.append((item.args[0], None))
67 | return results
68 |
69 | ### for testing purposes
70 | # dnsbl_instance = DNSBL_check('59.185.236.31', 2)
71 | # print(dnsbl_instance.check())
72 |
--------------------------------------------------------------------------------
/docs/clamd-installation.md:
--------------------------------------------------------------------------------
1 | ## Clamd and pyclamd installation on CentOS 7.0 or later
2 | #### Enable EPEL repository:
3 | ```
4 | [root@joshi]# yum install epel-release
5 | [root@joshi]# yum install clamav clamd clamav-data
6 | [root@joshi]# yum install clamav-scanner
7 | ```
8 | Typical rpms you will find on the system:
9 | ```
10 | [root@joshi]# rpm -qa |grep clam
11 | clamav-server-0.99.4-1.el7.x86_64
12 | clamav-scanner-0.99.4-1.el7.noarch
13 | clamav-scanner-systemd-0.99.4-1.el7.noarch
14 | clamav-data-0.99.4-1.el7.noarch
15 | clamav-filesystem-0.99.4-1.el7.noarch
16 | clamav-0.99.4-1.el7.x86_64
17 | clamav-lib-0.99.4-1.el7.x86_64
18 | clamav-server-systemd-0.99.4-1.el7.noarch
19 | ```
20 | #### Write logs to a separate file and change its owner to clamscan user:
21 | ```
22 | [root@joshi]# touch /var/log/clamd.scan
23 | [root@joshi]# chown clamscan:clamscan /var/log/clamd.scan
24 | ```
25 | #### Create unix socket for communication
26 | ```
27 | [root@joshi]# touch /var/run/clamd.scan/clamd.sock
28 | [root@joshi]# chown clamscan:clamscan /var/run/clamd.scan/clamd.sock
29 | [root@joshi]# ls -l /var/run/clamd.scan/clamd.sock
30 | srw-rw-rw- 1 clamscan clamscan 0 May 14 13:05 /var/run/clamd.scan/clamd.sock
31 | ```
32 | Now, rename clamd services:
33 | ```
34 | [root@joshi]# ls -l /usr/lib/systemd/system/clam*
35 | -rw-r--r-- 1 root root 135 May 14 13:04 /usr/lib/systemd/system/clamd@scan.service
36 | -rw-r--r-- 1 root root 217 May 14 13:03 /usr/lib/systemd/system/clamd@.service
37 |
38 | # mv /usr/lib/systemd/system/clamd@scan.service /usr/lib/systemd/system/clamdscan.service
39 | # mv /usr/lib/systemd/system/clamd@.service /usr/lib/systemd/system/clamd.service
40 | ```
41 | Do not forget to change the service name inside clamdscan.service (remove the '@' from the service name).
42 | #### Clamdscan service
43 | ```
44 | [root@joshi]# cat /usr/lib/systemd/system/clamdscan.service
45 | .include /lib/systemd/system/clamd.service
46 |
47 | [Unit]
48 | Description = Generic clamav scanner daemon
49 |
50 | [Install]
51 | WantedBy = multi-user.target
52 | ```
53 | #### Clamd service
54 | ```
55 | [root@joshi]# cat /usr/lib/systemd/system/clamd.service
56 | [Unit]
57 | Description = clamd scanner daemon
58 | After = syslog.target nss-lookup.target network.target
59 |
60 | [Service]
61 | Type = forking
62 | ExecStart = /usr/sbin/clamd -c /etc/clamd.d/scan.conf
63 | Restart = on-failure
64 | PrivateTmp = true
65 | ```
66 | ### enable and start clamd services
67 | ```
68 | # systemctl enable clamdscan.service
69 | # systemctl start clamdscan.service
70 | ## systemctl enable clamd.service -- this step is not required
71 | # systemctl start clamd.service
72 | ```
73 | ### Typical Clamd configuration file:
74 | ```
75 | [root@joshi]# cat /etc/clamd.d/scan.conf |grep -v ^#|grep -v ^$
76 | LogFile /var/log/clamd.scan
77 | LogSyslog yes
78 | LocalSocket /var/run/clamd.scan/clamd.sock
79 | User clamscan
80 | AllowSupplementaryGroups yes
81 | ```
82 | Note: if the above file contains a line reading 'Example', comment it out with '#' or remove it.
83 |
84 | ### Now, scan home directory for any viruses:
85 | ```
86 | [root@joshi]# clamscan -r /home/joshi/
87 | ```
88 | ### Check logs by default under /var/log/messages:
89 | ```
90 | [root@joshi]# cat /var/log/messages
91 | ```
92 | Typical errors that you might encounter while installing clamd are:
93 | ### lstat() failed: Permission denied. ERROR
94 | ```
95 | [root@joshi]# clamdscan -c /etc/clamd.d/scan.conf clamd.conf
96 | /home/joshi/clamd.conf: lstat() failed: Permission denied. ERROR
97 |
98 | ----------- SCAN SUMMARY -----------
99 | Infected files: 0
100 | Total errors: 1
101 | Time: 0.000 sec (0 m 0 s)
102 |
103 | This happens when directory or file permissions are not OK. e.g.
104 |
105 | [root@joshi]# ls -l /home
106 | total 4
107 | drwx------ 9 joshi email 4096 May 14 12:44 joshi
108 | ```
109 | Remember that there are basically two commands: clamscan and clamdscan.
110 | * clamdscan talks to the clamd daemon, which runs as the clamscan user - that user's permissions come into play here!
111 | * clamscan runs as a normal program on the host machine.
112 |
113 | ### Install python client for clamd - pyclamd
114 | ```
115 | [root@joshi]# easy_install --index-url=http://osrepo.gov.in/pypi/simple pip
116 | [root@joshi]# pip3 install pyclamd --trusted-host=osrepo.gov.in
117 | ```
118 |
--------------------------------------------------------------------------------
/docs/install-notes.txt:
--------------------------------------------------------------------------------
1 | Determine file's MIME type:
2 | ----------------------------
3 |
4 | You can also install the python-magic package, available as part of Linux distributions:
5 | sudo apt install python-magic
6 |
7 | Source - https://github.com/PSJoshi/python-magic
8 | sudo pip install python-magic
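Note that python-magic identifies files by content (via libmagic); the standard-library mimetypes module only guesses from the filename extension, which can serve as a rough fallback sketch:

```python
import mimetypes

def guess_mime(filename):
    """Guess a MIME type from the filename extension only (no content inspection)."""
    mime, _encoding = mimetypes.guess_type(filename)
    return mime or "application/octet-stream"

print(guess_mime("sample.pdf"))  # extension-based guess
```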
9 |
10 | ssdeep installation
11 | --------------------
12 | sudo apt install ssdeep
13 | sudo apt install python-cffi
14 | sudo apt install python-all-dev
15 | sudo apt install libffi-dev python-dev
16 | sudo apt install libfuzzy-dev
17 | sudo pip install ssdeep
18 |
19 | Instructions on how to install on different platforms are available here:
20 | http://python-ssdeep.readthedocs.io/en/latest/installation.html
21 |
22 | sudo pip install pefile
23 |
24 | It is better to compile yara from source rather than use the yara package in the repository, which may not be the latest. However, if you wish to use the existing yara package from the Ubuntu distribution, then do this:
25 |
26 | sudo apt install libssl-dev
27 | sudo apt install yara
28 | sudo pip install yara-python
29 |
30 |
31 |
32 |
--------------------------------------------------------------------------------
/docs/notes.txt:
--------------------------------------------------------------------------------
1 | Static file analysis
2 |
3 | Densityscout
4 | The DensityScout tool calculates the density (similar to entropy) of files in a file-system path to detect potential malware on the system. Malware authors typically pack their code with a variety of packer tools or adopt obfuscation and encryption techniques to protect it. Most Windows files are neither encrypted nor packed, so this tool makes it possible to find potentially malicious files.
5 |
6 | This software can be downloaded from - https://www.cert.at/downloads/software/densityscout_en.html
7 | It is available on Windows as well as Linux platforms.
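The density/entropy idea can be sketched in a few lines of Python (plain Shannon entropy; DensityScout's own density metric differs in detail):

```python
import math
from collections import Counter

def shannon_entropy(data):
    """Shannon entropy of a byte string, in bits per byte (0.0 to 8.0)."""
    if not data:
        return 0.0
    total = len(data)
    return -sum((count / total) * math.log2(count / total)
                for count in Counter(data).values())

# Packed/encrypted payloads tend toward 8.0; plain text sits much lower.
print(shannon_entropy(bytes(range(256))))  # 8.0
```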
8 |
9 | Yara
10 | The file will be checked against YARA rules to find potential webshells, exploits, and anti-virtual-machine detection signatures.
11 |
12 | Virustotal
13 | The file will be checked against Virustotal database to detect possible infection(s).
14 |
15 | The file will be checked for address space layout randomization and other compiler flags.
16 |
17 | Signature check
18 | Check whether the file is signed or not.
19 |
20 | Clamd scan
21 | Check if the file contains virus using clamd scan
22 |
23 | PEStudio check
24 | Check the indicators reported by PEStudio and assess the malicious quotient.
25 |
26 | The aim is to build a repository of features that can be used in static analysis of files using machine learning.
27 |
28 |
29 |
30 | # ENCODED DATA
31 | pat_hex = Pattern_re('Hex blob', r'([A-F0-9][A-F0-9]|[a-f0-9][a-f0-9]){16,}', weight=1)
32 | pat_b64 = Pattern_re('Base64 blob', r'(?:[A-Za-z0-9+/]{4}){2,}(?:[A-Za-z0-9+/]{2}[AEIMQUYcgkosw048]=|[A-Za-z0-9+/][AQgw]==)', weight=1)
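The `Pattern_re` wrapper above comes from an external balbuzard-style scanner; stripped of that wrapper, the raw regexes work directly with the stdlib re module (the sample strings below are made up for illustration):

```python
import re

PAT_HEX = re.compile(r'([A-F0-9][A-F0-9]|[a-f0-9][a-f0-9]){16,}')
PAT_B64 = re.compile(r'(?:[A-Za-z0-9+/]{4}){2,}'
                     r'(?:[A-Za-z0-9+/]{2}[AEIMQUYcgkosw048]=|[A-Za-z0-9+/][AQgw]==)')

def find_encoded_blobs(text):
    """Return (kind, match) pairs for hex and base64 blobs found in text."""
    hits = [('hex', m.group(0)) for m in PAT_HEX.finditer(text)]
    hits += [('base64', m.group(0)) for m in PAT_B64.finditer(text)]
    return hits

sample = "payload=deadbeefdeadbeefdeadbeefdeadbeef key=aGVsbG8gd29ybGQ="
print(find_encoded_blobs(sample))
```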
33 |
34 | Analyze PE file header and sections (number of sections, entropy of sections/PE file, suspicious section names, suspicious flags in the characteristics of the PE file, etc.)
35 | Searches for possible domains, e-mail addresses, IP addresses in the strings of the file.
36 | Checks if domains are blacklisted based on abuse.ch’s Ransomware Domain Blocklist and malwaredomains.com’s blocklist.
37 | Find suspicious API calls
38 | Find anti-VM features
39 | Find anti-debugging features
40 | Find presence of IP/domains
41 | Find entropy
42 | Check file using YARA signatures
43 | Test digital signature using Sysinternals sigcheck
44 | Check compiler flags using densityscout
45 | Check file using virustotal
46 |
47 | Other interesting repositories:
48 | https://github.com/secrary/SSMA
49 | https://github.com/raden/pi-ngaji/blob/master/pi-ngaji.py
50 | https://github.com/naisofly/Static-Malware-Analysis
51 | https://github.com/Sp3ctr3/PyTriage
52 | https://github.com/ShilpeshTrivedi/MAUPS/blob/master/MAUPS.py
53 | https://github.com/foreni-packages/peframe
54 | https://github.com/hiddenillusion/AnalyzePE
55 |
56 | Sandbox evasion techniques - http://unprotect.tdgt.org/index.php/Sandbox_Evasion
57 |
58 | https://github.com/blackfist/malware-ml
59 | Malware analysis course - https://github.com/RPISEC/Malware
60 | https://github.com/ClickSecurity/data_hacking/blob/master/pefile_classification/pe_features.py
61 |
62 | Some interesting papers and packages:
63 | -------------------------------------
64 | oledump.py
65 | import pefile,peutils
66 |
67 | Open source malware static analysis
68 |
69 | PEFrame
70 | Pyew
71 | mastiff
72 |
73 | Find average values for features:
74 |
75 | #Symbols
76 | Major linker version
77 | Initialized data size
78 | Major image version
79 | DLL characteristics
80 | Characteristics field in the COFF file header
81 |
82 |
83 | Evaluation of automated static analysis tools for malware detection - https://www.researchgate.net/publication/319719981_Evaluation_of_automated_static_analysis_tools_for_malware_detection_in_Portable_Executable_files
84 |
85 | Automated static analysis using python - https://www.youtube.com/watch?v=tNxJzx754BI
86 |
87 | Static and dynamic analysis using malware hunter - https://github.com/abdesslem/malwareHunter
88 |
89 | Python malware analysis library - https://github.com/keithjjones/malgazer
90 |
91 | List of tools for Malware analysis - https://andreafortuna.org/cybersecurity/malware-analysis-my-own-list-of-tools-and-resources/
92 |
93 | Malware analysis with multiple features - https://www.researchgate.net/publication/224849898_Malware_Analysis_with_Multiple_Features
94 | File scanning frameworks - https://www.decalage.info/fr/scan_frameworks
95 |
96 | Feature selection and improving classification performance for malware detection -
97 | https://digitalcommons.kennesaw.edu/cgi/viewcontent.cgi?referer=https://www.google.co.in/&httpsredir=1&filename=0&article=1009&context=cs_etd&type=additional
98 |
99 | Malware detection using machine learning algorithms - https://arxiv.org/pdf/1205.3062.pdf
100 |
101 | Creating distributed malware analysis toolchain - https://www.ieeelcn.org/lcn42demos/1570387359.pdf
102 | Investigation of malicious PE files using supervised learning - http://dl.ifip.org/db/conf/im/im2017-ws1-annet/160.pdf
103 |
104 | Various tools for malware analysis - https://blog.because-security.com/t/malware-analysis-forensics-analyze-malicious-documents/190
105 | Malware detection using machine learning approach - https://github.com/prk54/malware-detection-machine-learning-approach
106 | Machine learning for malware analysis - http://on-demand.gputechconf.com/gtc/2017/presentation/s7739-andrew-davis-machine-learning-for-malware-analysis.pdf
107 | Automating static analysis using Laika-Boss - https://www.sans.org/reading-room/whitepapers/malicious/automating-static-file-analysis-metadata-collection-laika-boss-38295
108 |
109 | ENISA notes on static analysis - https://www.enisa.europa.eu/topics/trainings-for-cybersecurity-specialists/online-training-material/technical-operational
110 |
111 | A Learning model to detect maliciousness of PE file - https://www.sciencedirect.com/science/article/pii/S1319157817300149
112 |
113 | PE analysis tools - https://blog.malwarebytes.com/threat-analysis/2014/05/five-pe-analysis-tools-worth-looking-at/
114 |
115 | Dataset for training static PE malware machine learning model - https://arxiv.org/abs/1804.04637
116 |
117 | Presentations:
118 | https://conference.hitb.org/hitbsecconf2011kul/materials/D1%20SIGINT%20-%20Muhammad%20Najmi%20Ahmad%20Zabidi%20-%20Compiling%20Features%20for%20Malcious%20Binaries.pdf
119 | https://www.nsc.liu.se/joint-sec-training-media/forensics.pdf
120 | https://www.blackhat.com/presentations/bh-dc-07/Kendall_McMillan/Presentation/bh-dc-07-Kendall_McMillan.pdf
121 | https://www.slideshare.net/j0b1n/introduction-to-malware-analysis-42732148
122 | http://pelp.sourceforge.net/StaticMalwareDetection.pdf
123 | https://www.first.org/resources/papers/conf2016/FIRST-2016-38.pdf
124 |
125 | Paper on static analysis - https://www.blackhat.com/docs/eu-15/materials/eu-15-KA-Automating-Linux-Malware-Analysis-Using-Limon-Sandbox-wp.pdf
126 |
127 | Real time classification of malicious executables - http://ro.ecu.edu.au/cgi/viewcontent.cgi?article=1137&context=ism
128 |
129 | System call based anomaly detection using python - https://github.com/shayanb/Stide-ADS
130 | Behaviour based malware detection system for Android - https://github.com/kapilkchaurasia/Behavior-Based-Malware-Detection-System-for-Android
131 | Automatic analysis of malware behaviour using machine learning - https://github.com/bienkma/DetectionMalwareBehavior
132 | Repository of live malwares - https://github.com/ytisf/theZoo
133 |
134 | Extract file from another file(pe-carve) - https://github.com/MalwareLu/tools/blob/master/pe-carv.py
135 | pe-check - wrapper around pe module - http://blog.didierstevens.com/2014/11/18/update-pecheck-py-version-0-4-0/
136 |
137 | Static malware analysis - https://github.com/devwerks/Static-Malware-Analyses
138 |
139 | Java based PE file malware analysis tool - https://github.com/katjahahn/PortEx
140 | Master thesis on PE file malware static analysis - https://raw.githubusercontent.com/katjahahn/PortEx/master/masterthesis/masterthesis.pdf
141 |
142 | Ember white paper from Endgame - https://arxiv.org/pdf/1804.04637.pdf
143 |
144 | Linux malware analysis system - https://github.com/Tencent/HaboMalHunter
145 |
146 | elf tool analysis examples using pyelftools - https://github.com/eliben/pyelftools/tree/master/examples
147 |
148 | Anomaly types to watch for:
149 | Thread local storage (TLS) callbacks
150 | Low or high number of sections
151 | Suspicious section names
152 | Small section length
153 | Zero section length
154 | Zero timestamp
155 | Timestamp too old (before 2000)
156 | Future timestamp
157 | High file entropy (> 7.0)
158 | Suspicious MS-DOS stub
159 | Suspicious image base
160 | Fake entry point
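The timestamp checks from the list above are trivial to sketch on the PE header's TimeDateStamp value (the cut-off is an assumption, not part of any standard):

```python
import time

SECONDS_2000 = 946684800  # 2000-01-01 00:00:00 UTC as a Unix timestamp

def timestamp_anomalies(timedatestamp, now=None):
    """Flag suspicious PE header TimeDateStamp values (seconds since the Unix epoch)."""
    now = time.time() if now is None else now
    issues = []
    if timedatestamp == 0:
        issues.append("zero timestamp")
    elif timedatestamp < SECONDS_2000:
        issues.append("timestamp too old (before 2000)")
    elif timedatestamp > now:
        issues.append("future timestamp")
    return issues

print(timestamp_anomalies(0))
```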
161 |
--------------------------------------------------------------------------------
/docs/pestudio.md:
--------------------------------------------------------------------------------
1 | ## PEStudio (https://www.winitor.com/index.html)
2 | For malware analysis, PEStudio is a great tool for novice as well as experienced security analysts. This static analysis tool scans a file and creates a nicely organized list of file-header information, alerting the user to any anomalies in the file headers. Typical output contains:
3 | * Hashes - MD5, SHA-1, and SHA-256 hashes of the file.
4 | * VirusTotal - PEStudio automatically submits the file hash to VirusTotal and lists the results.
5 | * DOS-STUB - Displays the DOS stub, the section between the MZ and PE headers that is responsible for the famous message shown when a user tries to run the program on an old DOS system or a non-DOS system.
6 | * File-header - General information from the file header: CPU architecture, 32-bit/64-bit, size of the optional header, compiler options, etc.
7 | * Directories - Relative virtual address locations and the size of each directory.
8 | * Sections - Sections in the file. Malicious files may have strange section names, and PEStudio displays them in a different color.
9 | * Libraries - DLL files that the analyzed program uses or references.
10 | * Imports - List of all OS/Win32 API calls the program uses. This gives an idea of the program's capabilities and possible use cases; e.g. if there are many calls such as socket, connect, and send, it is highly likely that the program makes heavy use of network communication.
11 | * Exports - Functions that the PE file exports for other PE files to use. Often there is only one export, but a DLL used by many other programs may export many functions.
12 | * Resources - Lists the resources, such as bitmaps and icons, used by the program.
13 | * Strings - Parses each string present in the file into a nice, sortable list. It also checks the list against blacklisted strings and raises an alert when a suspicious string is found.
14 | * Debug, version, certificate, overlay, etc. - The program also checks whether any debugging options are enabled, and checks the version, certificate authority, etc. Certificate checks are useful since an alert can be raised when a Microsoft Windows file (usually owned by Microsoft) is signed by a third or unknown party.
15 |
16 | The features available in PEStudio are described here:
17 | https://www.winitor.com/features.html
18 |
19 | Some other useful tools that are commonly used for static analysis are listed here - https://toddcullumresearch.com/2017/07/01/todds-giant-intro-of-windows-malware-analysis-tools/
20 |
--------------------------------------------------------------------------------
/docs/yara-installation.md:
--------------------------------------------------------------------------------
1 | ### Installing Yara from source on CentOS
2 |
3 | * Enable EPEL repository and install EPEL rpm
4 | ```
5 | # yum install epel-release
6 | ```
7 | * Now, install all required packages.
8 | ```
9 | [root@joshi]# yum groupinstall "Development tools"
10 | [root@joshi yara-3.7.1]# yum install jansson-devel jansson
11 | [root@joshi yara-3.7.1]# yum install file-devel
12 | [root@joshi yara-3.7.1]# yum install openssl-devel
13 | [root@joshi yara-3.7.1]# yum install python36 python36-devel
14 | ```
15 | * Download yara from https://github.com/VirusTotal/yara/releases and extract the tar file. Then, follow the usual compilation steps - configure, make, and make install.
16 |
17 | ```
18 | [root@joshi yara-3.7.1]# tar -zxvf yara-3.7.1.tar.gz
19 | [root@joshi joshi]# cd yara-3.7.1/
20 |
21 | [root@joshi yara-3.7.1]# ./bootstrap.sh
22 | [root@joshi yara-3.7.1]# ./configure --enable-magic
23 | [root@joshi yara-3.7.1]# make
24 | [root@joshi yara-3.7.1]# make install
25 | [root@joshi yara-3.7.1]# make check
26 | ```
27 | You will get a result like this:
28 | ```
28 | ============================================================================
29 | Testsuite summary for yara 3.7.1
30 | ============================================================================
31 | # TOTAL: 7
32 | # PASS: 7
33 | # SKIP: 0
34 | # XFAIL: 0
35 | # FAIL: 0
36 | # XPASS: 0
37 | # ERROR: 0
38 | ============================================================================
39 | ```
40 |
41 | That's all ... Now, you can play with yara and malware!!!
42 |
43 | ```
44 | [root@joshi yara-3.7.1]# /usr/local/bin/yara --help
45 | YARA 3.7.1, the pattern matching swiss army knife.
46 | Usage: yara [OPTION]... [NAMESPACE:]RULES_FILE... FILE | DIR | PID
47 |
48 | Mandatory arguments to long options are mandatory for short options too.
49 |
50 | -t, --tag=TAG print only rules tagged as TAG
51 | -i, --identifier=IDENTIFIER print only rules named IDENTIFIER
52 | -c, --count print only number of matches
53 | -n, --negate print only not satisfied rules (negate)
54 | -D, --print-module-data print module data
55 | -g, --print-tags print tags
56 | -m, --print-meta print metadata
57 | -s, --print-strings print matching strings
58 | -L, --print-string-length print length of matched strings
59 | -e, --print-namespace print rules' namespace
60 | -p, --threads=NUMBER use the specified NUMBER of threads to scan a directory
61 | -l, --max-rules=NUMBER abort scanning after matching a NUMBER of rules
62 | -d VAR=VALUE define external variable
63 | -x MODULE=FILE pass FILE's content as extra data to MODULE
64 | -a, --timeout=SECONDS abort scanning after the given number of SECONDS
65 | -k, --stack-size=SLOTS set maximum stack size (default=16384)
66 | --max-strings-per-rule=NUMBER set maximum number of strings per rule (default=10000)
67 | -r, --recursive recursively search directories
68 | -f, --fast-scan fast matching mode
69 | -w, --no-warnings disable warnings
70 | --fail-on-warnings fail on warnings
71 | -v, --version show version information
72 | -h, --help show this help and exit
73 |
74 | Send bug reports and suggestions to: vmalvarez@virustotal.com.
75 | ```
76 |
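As a quick smoke test, you can save a minimal rule and run it against any file containing the matched string (the rule name, file names and paths below are just examples):

```
rule hello_world
{
    strings:
        $hello = "hello"
    condition:
        $hello
}
```

```
[root@joshi yara-3.7.1]# echo "hello yara" > /tmp/sample.txt
[root@joshi yara-3.7.1]# /usr/local/bin/yara hello.yar /tmp/sample.txt
hello_world /tmp/sample.txt
```

yara prints one `rule_name file_name` line per match, so an empty output means the rule did not fire.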
77 | Note:
78 | If you enable the EPEL repository and are happy with the yara version shipped with the distribution,
79 | installation is much simpler:
80 | ```
81 | [root@joshi] yum search yara
82 | yara-devel.x86_64 : Development files for yara
83 | yara-doc.noarch : Documentation for yara
84 | python-pyarabic.noarch : Arabic text tools for Python
85 | yara.x86_64 : Pattern matching Swiss knife for malware researchers
86 |
87 | [root@joshi] yum install yara yara-devel yara-doc
88 | ```
89 |
90 | Thanks to:
91 | * Yara compilation from source - http://securitasdato.blogspot.com/2018/04/installing-yara-from-source-code-on.html
92 | * Install yara and write yara rules - https://seanthegeek.net/257/install-yara-write-yara-rules/
93 |
--------------------------------------------------------------------------------
/dumpbin/dumpbin.conf:
--------------------------------------------------------------------------------
1 | dumpbin_path: D:\VisualC_Python\9.0\VC\bin
2 | dumpbin_exe: dumpbin.exe
3 | check_program: c:\windows\notepad.exe
--------------------------------------------------------------------------------
/dumpbin/dumpbin.md:
--------------------------------------------------------------------------------
1 | ### Analysis of compiler security flags using Dumpbin
2 |
3 | Dumpbin is a powerful utility that ships as part of the Visual C++ tools. The Python script in this directory uses the "dumpbin" program to get the status of compiler security flags such as ASLR, DEP and CFG.
4 |
5 | * Download Visual C++ tools for Python from the following site - https://wiki.python.org/moin/WindowsCompilers
6 |
7 | Typical dumpbin usage on the Windows command line looks like this:
8 |
9 | ```
10 | cd "C:\Users\Joshi\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\bin"
11 | C:\Users\Joshi\AppData\Local\Programs\Common\Microsoft\Visual C++ for Python\9.0\VC\bin> dumpbin.exe /headers xxx.exe
12 | ```
13 |
14 | Basically, you are looking for the following characteristics in EXE/DLL files (they appear in the OPTIONAL HEADER VALUES section of the output):
15 | * Dynamic base (ASLR)
16 | * NX compatible (DEP)
17 | * Guard (CFG)
19 |
20 | The Python script `dumpbin.py` in this directory can be used to automate this process.
21 |
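The check itself boils down to string matching on the dumpbin output. A minimal sketch of that parsing step in Python (the sample header text is illustrative, not verbatim dumpbin output):

```python
def parse_security_flags(headers_text):
    """Map 'dumpbin /headers' output lines to ASLR/DEP/CFG status bits."""
    flags = {"ASLR": 0, "DEP": 0, "CFG": 0}
    for line in headers_text.splitlines():
        if "Dynamic base" in line:
            flags["ASLR"] = 1   # image base can be relocated (ASLR)
        elif "NX compatible" in line:
            flags["DEP"] = 1    # Data Execution Prevention
        elif "Guard" in line:
            flags["CFG"] = 1    # Control Flow Guard
    return flags

# Illustrative (not verbatim) excerpt of the OPTIONAL HEADER VALUES section
sample = """OPTIONAL HEADER VALUES
        DLL characteristics
            Dynamic base
            NX compatible"""
print(parse_security_flags(sample))
```

Substring matching keeps the sketch independent of exact whitespace and of which dumpbin version produced the listing.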
--------------------------------------------------------------------------------
/dumpbin/dumpbin.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 |
3 | """
4 | This script uses dumpbin.exe to extract information about compiler security flags (ASLR, NX/DEP, CFG etc.) from executable/DLL files. The flag information is quite useful for static analysis of a binary file.
5 | Note: this program requires the 'Visual C++ tools for Python' package, which can be downloaded from https://wiki.python.org/moin/WindowsCompilers
6 | """
7 |
8 | import subprocess
9 | import yaml
10 | import logging
11 | import os
12 | import sys
13 | import argparse
14 |
15 | logging.basicConfig(stream = sys.stdout, level=logging.DEBUG)
16 | logger = logging.getLogger(__name__)
17 |
18 | def subprocess_response(dumpbin_cmd, check_program):
19 | stdout = stderr = None
20 | try:
21 | cmd = dumpbin_cmd + ' /headers ' + check_program
22 | #cmd = os.path.join(os.path.sep,dumpbin_cmd,' /headers ', check_program)
23 | logger.debug("command passed to subprocess module for execution is %s" %cmd)
24 | process_instance = subprocess.Popen(cmd, shell=True,stdout=subprocess.PIPE, stderr=subprocess.PIPE)
25 | stdout, stderr = process_instance.communicate()
26 | if stderr:
27 | logger.info("An error is encountered during execution of command %s - %s" %(dumpbin_cmd,stderr))
28 | except Exception as exc:
29 | logger.error("Error while executing %s - %s" %(dumpbin_cmd,exc.message),exc_info=True)
30 | return stdout, stderr
31 |
32 | def load_config(yaml_file):
33 | config = None
34 | try:
35 | with open(yaml_file,'r') as f:
36 |             config = yaml.safe_load(f)
37 | except Exception as exc:
38 | logger.error("Error while loading configuration file %s - %s" %(yaml_file,exc.message), exc_info = True)
39 | return config
40 |
41 | if __name__ == "__main__":
42 | try:
43 | parser = argparse.ArgumentParser(description = "This program uses dumpbin program to capture compiler security flags.")
44 | parser.add_argument("-c","--config", required=True,help="Configuration file",dest='config')
45 | args = parser.parse_args()
46 | if args.config:
47 | if not os.path.isfile(args.config):
48 | logger.error("Configuration file required for dumpbin.exe is not present! Quitting...")
49 | sys.exit(1)
50 | yaml_config = load_config(args.config)
51 | #logger.info("%s" %yaml_config)
52 | logger.debug(yaml_config['dumpbin_path'])
53 | logger.debug(yaml_config['dumpbin_exe'])
54 | logger.debug(yaml_config['check_program'])
55 |
56 | # check if dumpbin.exe path is valid
57 | dumpbin_cmd = os.path.join(yaml_config['dumpbin_path'],yaml_config['dumpbin_exe'])
58 | if not os.path.isfile(dumpbin_cmd):
59 | logger.info("%s could not be found! Please check the path" %dumpbin_cmd)
60 | sys.exit(1)
61 | # check if program path is valid
62 | if not os.path.isfile(yaml_config['check_program']):
63 | logger.info("%s could not be found! Please check the path" %yaml_config['check_program'])
64 | sys.exit(1)
65 | stdout,stderr = subprocess_response(dumpbin_cmd,yaml_config['check_program'])
66 |
67 | security_flags = dict()
68 |         # initialize compiler security flags
69 | security_flags['ASLR'] = 0
70 | security_flags['DEP'] = 0
71 | security_flags['CFG'] = 0
72 |
73 | if not stderr:
74 | logger.info("%s" %stdout)
75 | exe_headers = stdout.strip().split('\n')
76 | for line in exe_headers:
77 | if "Dynamic base" in line:
78 | logger.info("ASLR bit set")
79 | security_flags['ASLR'] = 1
80 | elif "NX compatible" in line:
81 | logger.info("DEP bit set")
82 | security_flags['DEP'] = 1
83 | elif "Guard" in line:
84 | logger.info("CFG bit set")
85 | security_flags['CFG'] = 1
86 | else:
87 | logger.info("Error while getting output from 'dumpbin.exe' program")
88 |
89 | logger.info("Security flags: %s" %security_flags)
90 |
91 | except Exception as exc:
92 | logger.error("Error while executing dumpbin script - %s" %exc.message,exc_info=True)
93 |
--------------------------------------------------------------------------------
/hash-reputation/hash-services.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | import sys
3 | import logging
4 | import requests
5 | import argparse
6 | import json
7 | import yaml
8 | import os
9 | import hashlib
10 | from bs4 import BeautifulSoup
11 |
12 | """
13 | This script uses file reputation services on the web to check whether a file is malicious.
14 | File reputation services:
15 |
16 | Virustotal service (public API key - limited to 4 requests per minute):
17 |
18 | Threatexpert: http://threatexpert.com/reports.aspx
19 | e.g.
20 | http://threatexpert.com/reports.aspx?find=7e010e90d1dbd292de3d2ae20e04b7ba
21 |
22 | Shadowserver: http://bin-test.shadowserver.org
23 | This server allows us to test an executable against a list of known software applications using md5/sha1 hash.
24 | e.g.
25 | Details of program associated with hash:
26 | http://bin-test.shadowserver.org/api?md5=0E53C14A3E48D94FF596A2824307B492
27 | http://bin-test.shadowserver.org/api?sha1=000000206738748EDD92C4E3D2E823896700F849
28 |
29 | Check if program is whitelisted or not:
30 | http://innocuous.shadowserver.org/api/?query=0E53C14A3E48D94FF596A2824307B492
31 | Team cymru report
32 |
33 | """
34 | # setup logging
35 | logging.basicConfig(stream=sys.stdout,level = logging.DEBUG)
36 | logger = logging.getLogger(__name__)
37 |
38 | def yaml_config(yaml_file):
39 |     configuration_data = None
40 |     try:
41 |         with open(yaml_file, 'r') as f:
42 |             configuration_data = yaml.safe_load(f)
43 |     except Exception as exc:
44 |         logger.error("Error while reading yaml configuration file - %s" %exc,exc_info=True)
45 |     return configuration_data
46 |
47 | def md5sum(file):
48 |     md5 = None
49 |     try:
50 |         with open(file, "rb") as f:
51 |             data = f.read()
52 |         md5 = hashlib.md5(data).hexdigest()
53 |     except Exception as exc:
54 |         logger.error("Error while computing md5 hash of %s - %s" %(file,exc),exc_info=True)
55 |     return md5
56 |
57 | def sha256sum(file):
58 |     sha256 = None
59 |     try:
60 |         with open(file, "rb") as f:
61 |             data = f.read()
62 |         sha256 = hashlib.sha256(data).hexdigest()
63 |     except Exception as exc:
64 |         logger.error("Error while computing sha256 hash of %s - %s" %(file,exc),exc_info=True)
65 |     return sha256
66 |
67 |
68 |
69 |
70 | def shadowserver_hash_report(url,file_hash,proxy_dict = None):
71 | try:
72 | response = None
73 | shadow_response = None
74 | headers = {'Accept-Encoding': "gzip, deflate", 'User-Agent': 'Python-based agent'}
75 | url = url + file_hash
76 | response = requests.get(url,proxies=proxy_dict,headers = headers)
77 | if response.status_code == 200:
78 | if (response.text):
79 | r = response.text
80 | split_response=r.strip().split(' ')[1:]
81 | hash_details = ''.join(split_response)
82 | if hash_details:
83 | shadow_response = json.loads(hash_details)
84 | return shadow_response
85 | return shadow_response
86 | except Exception as e:
87 | logger.error("Error while getting file hash information from Shadow Server - %s" %e.message,exc_info=True)
88 |
89 | def shadowserver_whitelist_report(url,file_hash,proxy_dict=None):
90 | try:
91 | response = None
92 | headers = {'Accept-Encoding': "gzip, deflate", 'User-Agent': 'Python-based agent'}
93 | url = url + file_hash
94 | response = requests.get(url,proxies=proxy_dict,headers = headers)
95 | if response.status_code == 200:
96 | if (response.text):
97 | r = response.text
98 | # check if the hash is marked as whitelist or not
99 | whitelist_response=r.strip().split(',')[0]
100 | if "whitelisted" in whitelist_response.lower():
101 | return True, whitelist_response
102 | # not whitelisted
103 | return False, ''
104 |
105 | except Exception as e:
106 | logger.error("Error while getting file whitelist information from Shadow Server - %s" %e.message,exc_info=True)
107 |
108 |
109 | def threatexpert_report(url, file_hash,proxy_dict=None):
110 | try:
111 | response = None
112 | headers = {'Accept-Encoding': "gzip, deflate", 'User-Agent': 'Python-based agent'}
113 | url = url + file_hash
114 | response = requests.get(url,proxies=proxy_dict,headers = headers)
115 | if response.status_code == 200:
116 | if response.text:
117 | soup = BeautifulSoup(response.text,'lxml')
118 | element_p = soup.findAll('p')
119 | for item in element_p:
120 | logger.info(item.text)
121 | if 'no threatexpert reports found'.lower() in item.text.lower():
122 | return False
123 | return True
124 | except Exception as e:
125 | logger.error("Error while getting file hash information from Threat expert - %s" %e.message,exc_info=True)
126 |
127 |
128 | def virustotal_report(url, api_key, file_hash,proxy_dict=None):
129 | try:
130 | vt_response = None
131 | params = {'apikey': api_key, 'resource': file_hash}
132 | headers = {'Accept-Encoding': "gzip, deflate", 'User-Agent': 'Python-based VirtualTotal agent'}
133 | response = requests.get(url, proxies=proxy_dict, params=params, headers=headers)
134 | if response.status_code == 200:
135 | if response.text:
136 |                 vt_response = json.loads(response.text)
137 | return vt_response
138 |
139 | except Exception as e:
140 | logger.error("Error while getting file hash information from VirusTotal - %s" %e.message,exc_info=True)
141 |
142 | return vt_response
143 |
144 | def cmd_arguments():
145 |
146 | try:
147 |         parser = argparse.ArgumentParser("This script uses file reputation services on the internet to check whether a file is malicious.")
148 | parser.add_argument('--config', required=True, help='Please specify configuration file',dest='config_file')
149 | parser.add_argument('--md5-hash', required=True, help='Please specify md5 hash',dest='hash')
150 | args = parser.parse_args()
151 | return args
152 | except Exception as exc:
153 | logger.error("Error while getting command line arguments - %s" %exc.message,exc_info=True)
154 |
155 | if __name__ == "__main__":
156 | try:
157 |
158 | cmd_args = cmd_arguments()
159 | if cmd_args:
160 |
161 | # read yaml configuration
162 | if os.path.isfile(cmd_args.config_file):
163 | config = yaml_config(cmd_args.config_file)
164 | #logger.info(pprint(config))
165 |
166 | # enable proxy or not. If yes, add proxy details
167 | if config['proxy']['enable']:
168 | proxy_dict = {
169 | 'http':'http://{}:{}@{}:{}'.format(config['proxy']['user'],config['proxy']['password'],
170 | config['proxy']['host'], config['proxy']['port']),
171 | 'https':'http://{}:{}@{}:{}'.format(config['proxy']['user'],config['proxy']['password'],
172 | config['proxy']['host'], config['proxy']['port'])
173 | }
174 | else:
175 | proxy_dict = {}
176 |
177 | # check hash using threatexpert service
178 | if config['online-hash-services']['threatexpert']['enable']:
179 | logger.debug("Threat expert url - %s" % config['online-hash-services']['threatexpert']['url'])
180 | url = config['online-hash-services']['threatexpert']['url']
181 | threatexpert_response = threatexpert_report(url,cmd_args.hash,proxy_dict)
182 | logger.info(threatexpert_response)
183 |
184 |
185 | # check hash using shadowserver whitelist service
186 | if config['online-hash-services']['shadowserver-whitelist']['enable']:
187 | logger.debug("Shadow server whitelist url - %s" % config['online-hash-services']['shadowserver-whitelist']['url'])
188 | url = config['online-hash-services']['shadowserver-whitelist']['url']
189 | iswhitelisted, response = shadowserver_whitelist_report(url,cmd_args.hash,proxy_dict)
190 | logger.debug("Shadow server whitelist service response - {}".format(response))
191 | if iswhitelisted:
192 |                     logger.info("Shadow server whitelist service response says that the hash {} is whitelisted".format(cmd_args.hash))
193 |                 else:
194 |                     logger.info("Shadow server whitelist service response says that the hash {} is not whitelisted".format(cmd_args.hash))
195 |
196 | # check hash using shadowserver hash service
197 | if config['online-hash-services']['shadowserver']['enable']:
198 | logger.debug("Shadow server url - %s" % config['online-hash-services']['shadowserver']['url'])
199 | url = config['online-hash-services']['shadowserver']['url']
200 | shadowserver_response = shadowserver_hash_report(url,cmd_args.hash,proxy_dict)
201 | logger.info(shadowserver_response)
202 |
203 | # check hash using virustotal service
204 | if config['online-hash-services']['virustotal']['enable']:
205 | logger.debug("Virustotal url - %s" % config['online-hash-services']['virustotal']['url'])
206 | logger.debug("Virustotal key - %s" % config['online-hash-services']['virustotal']['key'])
207 | url = config['online-hash-services']['virustotal']['url']
208 | api_key = config['online-hash-services']['virustotal']['key']
209 | virustotal_response = virustotal_report(url,api_key,cmd_args.hash,proxy_dict)
210 | logger.info(virustotal_response)
211 |
212 |
213 | else:
214 |             logger.info("YAML configuration file %s is not found." %cmd_args.config_file)
215 |
216 | except Exception as e:
217 | logger.error("Error while getting file hash information - %s" %e.message,exc_info=True)
218 |
219 |
--------------------------------------------------------------------------------
/hash-reputation/hash-services.yaml:
--------------------------------------------------------------------------------
1 | online-hash-services:
2 | virustotal:
3 | url: https://www.virustotal.com/vtapi/v2/file/report
4 | key:
5 | enable: yes
6 | threatexpert:
7 | url: http://threatexpert.com/reports.aspx?find=
8 | enable: yes
9 | shadowserver:
10 | url: http://bin-test.shadowserver.org/api?md5=
11 | enable: yes
12 | shadowserver-whitelist:
13 | url: http://innocuous.shadowserver.org/api/?query=
14 | enable: yes
15 | proxy:
16 | user:
17 | password: xxxx
18 | host:
19 | port: 8080
20 | enable: yes
21 |
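The shadowserver bin-test service configured above replies with the queried hash followed by a JSON blob of file details, or just the hash alone when it is unknown. A minimal parsing sketch - the reply text and its fields here are made up for illustration:

```python
import json

def parse_bintest_reply(text):
    """Split a 'HASH {json...}' bin-test reply into a details dict (None if unknown)."""
    parts = text.strip().split(' ', 1)   # split only on the first space so
    if len(parts) == 2 and parts[1]:     # spaces inside the JSON survive
        return json.loads(parts[1])
    return None

# Hypothetical reply for a known Windows binary
reply = '0E53C14A3E48D94FF596A2824307B492 {"filename": "notepad.exe", "os_version": "6.1"}'
details = parse_bintest_reply(reply)
print(details["filename"])
```

Splitting only on the first space keeps any spaces inside the JSON payload intact, which a plain token split-and-rejoin would mangle.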
--------------------------------------------------------------------------------
/misc/extract-domains.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | import re
3 | from string import ascii_lowercase, ascii_uppercase, digits
4 | from tldextract import extract
5 | from collections import defaultdict
6 | import socket
7 |
8 | # some regular expressions
9 | domain_regex = re.compile("([a-z0-9][a-z0-9\-]{0,61}[a-z0-9]\.)+[a-z0-9][a-z0-9\-]*[a-z0-9]", re.IGNORECASE)
10 | ipv4_regex = re.compile("[1-2]?[0-9]?[0-9]\.[1-2]?[0-9]?[0-9]\.[1-2]?[0-9]?[0-9]\.[1-2]?[0-9]?[0-9]")
11 |
12 | def extract_domain(url):
13 | try:
14 | domain = extract(url)
15 | except Exception:
16 |         raise Exception("Problem while extracting domain from {}".format(url))
17 | # concatenate domain and tld
18 | return '.'.join((domain[1],domain[2]))
19 |
20 | def extract_domain_details(url):
21 | try:
22 | domain_details = extract(url)
23 | except Exception:
24 |         raise Exception("Problem while extracting domain from {}".format(url))
25 | return domain_details
26 |
27 | def domain_frequencies(url_list):
28 | freq_dist = defaultdict(int)
29 | for item in url_list:
30 | # add domain
31 | freq_dist[extract_domain(item)] += 1
32 | freq_dist = [[k,v] for k,v in freq_dist.iteritems()]
33 | return freq_dist
34 |
35 | def valid_ipv4_old(check_str):
36 | resp = ipv4_regex.search(check_str)
37 | if resp:
38 | return resp.group(0)
39 |
40 | def valid_domain(check_str):
41 | resp = domain_regex.search(check_str)
42 | if resp:
43 | return resp.group(0)
44 |
45 | def valid_ipv4(address):
46 | try:
47 | socket.inet_pton(socket.AF_INET, address)
48 | except AttributeError: # no inet_pton here, sorry
49 | try:
50 | socket.inet_aton(address)
51 | except socket.error:
52 | return False
53 | return address.count('.') == 3
54 | except socket.error: # not a valid address
55 | return False
56 |
57 | return True
58 |
59 |
60 | if __name__ == '__main__':
61 | test_url = 'www.google.co.sg'
62 | print extract_domain(test_url)
63 | print extract_domain_details(test_url)
64 | urls = ['www.google.co.in','bing.com','bing.com','gov.in','www.google.co.in']
65 | print domain_frequencies(urls)
66 | print valid_ipv4('24.12.32.312')
67 | print valid_domain('www.google.co')
68 |
--------------------------------------------------------------------------------
/pe-studio/pestudio.md:
--------------------------------------------------------------------------------
1 | ## PEStudio (https://www.winitor.com/index.html)
2 | For malware analysis, PEStudio is a great tool for novices as well as experienced security people. This static analysis tool scans a file, creates a nicely organized view of the file header information and alerts the user if there are any anomalies in the file headers. Typical output contains:
3 | * Hashes - md5, SHA-1 and SHA-256 hashes of the file
4 | * Virustotal - PEStudio automatically submits the file hash to virustotal and lists the results.
5 | * DOS-STUB - Displays the DOS stub, the section between the MZ and PE headers responsible for the famous message shown when the program is run on an old DOS or non-DOS system.
6 | * File-header - General information from the file header - CPU architecture, 32-bit/64-bit, size of the optional header, compiler options etc.
7 | * Directories - Relative virtual address and size of each data directory.
8 | * Sections - Sections in the file. Malicious files may have strange section names, and PEStudio displays these in a different color.
9 | * Libraries - DLL files that the analyzed program uses/references.
10 | * Imports - List of all OS/Win32 API calls the program uses. This gives us an idea about the program's capabilities and possible use-cases, e.g. if there are many calls like socket, connect and send, it is highly likely that the program makes heavy use of network communication.
11 | * Exports - Functions that the PE file exports for other PE files to use. Often there is only one export, but a DLL used by many other programs will expose many exported functions.
12 | * Resources - Lists the resources, like bitmaps and icons, used by the program.
13 | * Strings - Parses each string present in the file into a nice, sortable list. It also checks the list against blacklisted strings and raises an alert when a suspicious string is found.
14 | * Debug, version, certificate, overlay etc. - The program also checks whether any debugging options are enabled and inspects the version information and certificate authority. Certificate checks are useful because an alert can be raised when a Microsoft Windows file (usually owned by Microsoft) is signed by a third/unknown party.
15 |
16 | The features that are available in PEstudio are described here:
17 | https://www.winitor.com/features.html
18 |
19 | Some other useful tools that are commonly used for static analysis are listed here - https://toddcullumresearch.com/2017/07/01/todds-giant-intro-of-windows-malware-analysis-tools/
20 |
21 |
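pestudio can also write the report as XML (see pestudio.py in this repository). A minimal sketch of pulling the indicators out of such a report with Python's ElementTree, using a made-up fragment shaped like the tags that script expects:

```python
from xml.etree import ElementTree as ET

# Hypothetical minimal report fragment with an <Indicators> element
sample = """<report>
  <Indicators count="2">
    <Indicator>The file modifies the registry</Indicator>
    <Indicator>The file starts child Processes</Indicator>
  </Indicators>
</report>"""

root = ET.fromstring(sample)
# collect the text of every <Indicator> child under <Indicators>
indicators = [item.text for item in root.find("Indicators").findall("Indicator")]
print(indicators)
```

A real report carries many more elements (strings, libraries, imports); the same find/findall pattern applies to each of them.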
--------------------------------------------------------------------------------
/pe-studio/pestudio.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 |
3 | """
4 | This script processes xml output of pestudio program - a very useful utility that analyzes EXE files for malicious
5 | contents.
6 |
7 | To get XML output:
8 | c:\pestudio> pestudioprompt.exe -file:c:\Users\Joshi\Desktop\putty.exe -xml:test.xml
9 |
10 | """
11 |
12 | from xml.etree import ElementTree as ET
13 | import re
14 | import argparse
15 | from urlparse import urlparse
16 | import logging
17 | import os
18 | import sys
19 |
20 | # setup logging
21 | logging.basicConfig(stream=sys.stdout,level = logging.DEBUG)
22 | logger = logging.getLogger(__name__)
23 |
24 | def getXML_file(input_file):
25 | root = None
26 | try:
27 | root = ET.parse(input_file).getroot()
28 | except Exception as exc:
29 | logger.error("Error while parsing XML file {} - {}"
30 | .format(input_file,exc.message),exc_info=True)
31 | return root
32 |
33 | def cmd_arguments():
34 | args = None
35 | try:
36 | parser = argparse.ArgumentParser("This script is used to parse XML reports of PEStudio for malware analysis.")
37 |
38 | parser.add_argument('--xml', required=True, help='Please specify PEStudio XML file.',dest='xml_file')
39 | args = parser.parse_args()
40 |
41 | except Exception as exc:
42 | logger.error("Error while getting command line arguments - %s" %exc.message,exc_info=True)
43 | return args
44 |
45 | if __name__ == "__main__":
46 | try:
47 | cmd_args = cmd_arguments()
48 |
49 | dirname, filename = os.path.split(os.path.abspath(__file__))
50 | logger.debug("Directory - {} File- {}".format(dirname,filename))
51 |         if not os.path.isfile(cmd_args.xml_file):
52 |             logger.info("The file {} could not be found. Quitting..".format(cmd_args.xml_file))
53 |             sys.exit(1)
54 |
55 | xml_root = getXML_file(cmd_args.xml_file)
56 | pestudio_report = list()
57 |
58 | # parse indicators
59 | pestudio_indicators = xml_root.findall('Indicators')
60 | pestudio_dict = dict()
61 | for item in pestudio_indicators:
62 | logger.info("Number of indicators:{}".format(item.attrib))
63 |
64 | for indicators in pestudio_indicators:
65 | indicator_list = indicators.findall('Indicator')
66 | ind_list = list()
67 | for item in indicator_list:
68 | ind_list.append(item.text)
69 | pestudio_dict['indicators'] = ind_list
70 |
71 | pestudio_report.append(pestudio_dict)
72 |
73 | # parse strings
74 | pestudio_strings = xml_root.findall('strings')
75 | for item in pestudio_strings:
76 | logger.info("Number of strings:{}".format(item.attrib))
77 |
78 | pestudio_dict = dict()
79 | for strings in pestudio_strings:
80 | string_list = strings.findall('string')
81 | list_string = list()
82 |
83 | for item in string_list:
84 | # check for presence of ip address or url
85 | present_flag = False
86 | if item.text:
87 | #ipv4 validation
88 | if re.match('^\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}$', item.text) != None:
89 | present_flag = True
90 |
91 | # url validation
92 | # if (re.findall('^http[s]?://\S+',item.text)):
93 | # present_flag = True
94 |
95 | # url validation
96 | url_result = urlparse(item.text)
97 | if url_result.scheme in ('http','https'):
98 | present_flag = True
99 |
100 | if present_flag:
101 | list_string.append(item.text)
102 | #list_string.append(item.text)
103 | if list_string:
104 | pestudio_dict['strings'] = list_string
105 |
106 | if pestudio_dict:
107 | pestudio_report.append(pestudio_dict)
108 |
109 | logger.info("{}".format(pestudio_report))
110 |
111 | except Exception as exc:
112 | logger.error("Error while running PEstudio parser script - {}".format(exc.message),exc_info=True)
113 |
--------------------------------------------------------------------------------
/pe-studio/test.xml:
--------------------------------------------------------------------------------
1 | <!-- test.xml: sample PEStudio XML report for putty.exe (PuTTY 0.63, "SSH, Telnet and Rlogin client"). -->
2 | <!-- The report mixes XML markup with raw binary string dumps and is unreadable as plain text, so only -->
3 | <!-- a placeholder is kept here. Readable indicators include: modifies the registry, references the -->
4 | <!-- Clipboard and Desktop window, starts child processes, ignores DEP/ASLR/GS mitigations, and has an -->
5 | <!-- invalid file checksum. -->
on.n:as.5ss.wea.3ssapne.p.08@HPX`hpxhs.xa:t<y0s.uicrr...b.n.x.cc.l..sm.o.c6.g.c.n.<t.:nhc.s.sti.e>i.cl..cr==dt.o..s<.eW.nh.ep.cC.r.b9.g.r8n..uamx:f.o.:s5swa3spe.Tnrkmd<<l.toP.ol.yw.oo.n.K4.e.h>td.o.aseo.e.c.n.>.n.ay.m...`...`...`...`...`...`...`p..0p..0p..0p..0p..0p..0p..0p..0p..0p..0p..0p.x..H...H...H...H...H...H...H...H...H..8x..8x..8x..8x..8x..8x..8x..8x..8x..8x..8x..8x..8x..8x..8x..8x..8x..8x..8x..8x..8x..8x..8x..8x..p8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..x.8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X.T1.n.E..8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.including.without.limitation.the.rights.to.use..copy..modify..merge.defghijklmnopqrstuvwxyz{|}~.hne.oefcsblwaefb..><sebyxls.r:cea.irsf.o:s.1.aieteso=10><sebydniy..eso=0000..poesrrhtcue.8...ae.ut...ye.i3....<ecito>.ewr.letadtria.mltr.ecito>..dpnec>..dpnetseby.......la.omncnrl..nta.f5t.e.ix.aie.......loigcnrl.ntecin.ra.......asmliett.ye.i3........ae.irsf.idw.omncnrl........eso=6000........ulcetkn.55614c1f.......lnug=.........rcsoacietr=x6.....eednasml>...eedny..<..elr.st.e.p.wr.....<sv:plcto.mn:sv=unshmsmcootcmamv....1.mPP0.stig.....xls.tp.shmsmcootcmsi20.idwstig.....<paaetu<diwr>....sv:idwstig>...sv:plcto><asml>..@....0.mvrin.......rcsoAcietr=x6..nm=PTY..tp=wn2.>..dsrpinAnto
kcin.n.emnleuao<dsrpin..<eedny..<eednAsml>....<..odCmo.otos6isedo..ogtWnPntv........okn.otosi.h.letae..>....<sebydniytp=wn2.......nm=McootWnosCmo.otos.......vrin............pbiKyoe=69b44cfd........agae.........poesrrhtcue.8.>..<dpnetseby..<dpnec>....Dcaeu.ob.DIaae..>..am3apiainxlsam3.r:cea.irsf.o:s.3>...am3wnosetns.....mn=ht:.cea.irsf.o.M.05Wnosetns>....diwr>re.pAae...<am3wnosetns..<am3apiain..seby..1mm.EP.gwtotlmtto.h.ihst.s.cp.mdf.mredfhjlnprtvxz|~heofslaf.>sbxsrcais.:..its=0<eyny.s=00.osrtu...eu..ei..<ct>erltdramt.ct>.pe>.pesb....aonnl.t.5..xae...licr.tcnr....slety.3....eis.d.mcr....eo60....uctn5641....ng.....cocerx...ensl..en.<.l.tepw...s:lt.ns=nhscocav...P0si...l.psmmotmi0iwtg..<aeudw>..viwtg..vpco<sl.@...vi....cocerx.n=T.t=n..driAtki..mluodri.<en.<ensl..<.dm.tsieo.gWPt....onooihlte...<eynypw2...n=coWoCooo....rn......bKo=94cd....ge....perhce8>.dntey.dnc..Dauo.Ia.>.maiixsm.:e.rfos3..mwoen...nh:cais...5nsts..dw>epa..a3nsts.a3pan.ey.m..hSw.teihoaworridhlptx|hosa.sxras:.t=<yys0.st..u.i.c>rtrm.t.e.eb..on...xe.lc.cr..lt...esdmr..o0..cn61..g..ccr..nl.n<ltp..:tn=hccv.Ps...smtiit.<ed>.it.vc<l@.v..ccr.=.=.diti.lor.e.es.<d.se.Wt..noht..enp2.ncWCo..r...bo9c..g..prc8.ne.n.Du.a>mixm:.fs.mon.n:as.5ss.wea.3ssapne.xuaBioh..hpxhs.xa:t<y0s.uicrr...b.n.x.cc.l..sm.o.c6.g.c.n.<t.:nhc.s.sti.e>i.cl..cr==dt.o..s<.eW.nh.ep.cC.r.b9.g.r8n..uamx:f.o.:s5swa3spep0@P`ph.atysucr.bnxc..s..6gcn<.ncssieic.c=d..s.Wn.pc..9grn.axfo:5w3p.nkd<.o.ly.onK..>doae......ym.`.`.`.`.`.`.`.0.0.0.0.0.0.0.0.0.0.0..H.H.H.H.H.H.H.H.H.x.x.x.x.x.x.x.x.x.x.x.x.x.x.x.x.x.x.x.x.x.x.x.x.8x..8x..8x..8x..8x..8x..8x..8x..8x..8x..8x..8x..8x..8x..8x..8x..x8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X....8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x.
.h..X....n.E..8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.publish..distribute..sublicense..and.or.sell.copies.of.the.Software.defghijklmnopqrstuvwxyz{|}~.hne.oefcsblwaefb..><sebyxls.r:cea.irsf.o:s.1.aieteso=10><sebydniy..eso=0000..poesrrhtcue.8...ae.ut...ye.i3....<ecito>.ewr.letadtria.mltr.ecito>..dpnec>..dpnetseby.......la.omncnrl..nta.f5t.e.ix.aie.......loigcnrl.ntecin.ra.......asmliett.ye.i3........ae.irsf.idw.omncnrl........eso=6000........ulcetkn.55614c1f.......lnug=.........rcsoacietr=x6.....eednasml>...eedny..<..elr.st.e.p.wr.....<sv:plcto.mn:sv=unshmsmcootcmamv....1.mPP0.stig.....xls.tp.shmsmcootcmsi20.idwstig.....<paaetu<diwr>....sv:idwstig>...sv:plcto><asml>..@....0.mvrin.......rcsoAcietr=x6..nm=PTY..tp=wn2.>..dsrpinAntokcin.n.emnleuao<dsrpin..<eedny..<eednAsml>....<..odCmo.otos6isedo..ogtWnPntv........okn.otosi.h.letae..>....<sebydniytp=wn2.......nm=McootWnosCmo.otos.......vrin............pbiKyoe=69b44cfd........agae.........poesrrhtcue.8.>..<dpnetseby..<dpnec>....Dcaeu.ob.DIaae..>..am3apiainxlsam3.r:cea.irsf.o:s.3>...am3wnosetns.....mn=ht:.cea.irsf.o.M.05Wnosetns>....diwr>re.pAae...<am3wnosetns..<am3apiain..seby..1mm.EP.gwtotlmtto.h.ihst.s.cp.mdf.mredfhjlnprtvxz|~heofslaf.>sbxsrcais.:..its=0<eyny.s=00.osrtu...eu..ei..<ct>erltdramt.ct>.pe>.pesb....aonnl.t.5..xae...licr.tcnr....slety.3....eis.d.mcr....eo60....uctn5641....ng.....cocerx...ensl..en.<.l.tepw...s:lt.ns=nhscocav...P0si...l.psmmotmi0iwtg..<aeudw>..viwtg..vpco<sl.@...vi....cocerx.n=T.t=n..driAtki..mluodri.<en.<ensl..<.dm.tsieo.gWPt....onooihlte...<
eynypw2...n=coWoCooo....rn......bKo=94cd....ge....perhce8>.dntey.dnc..Dauo.Ia.>.maiixsm.:e.rfos3..mwoen...nh:cais...5nsts..dw>epa..a3nsts.a3pan.ey.m..gttmt..htsc.d.rdhlptx|hosa.sxras:.t=<yys0.st..u.i.c>rtrm.t.e.eb..on...xe.lc.cr..lt...esdmr..o0..cn61..g..ccr..nl.n<ltp..:tn=hccv.Ps...smtiit.<ed>.it.vc<l@.v..ccr.=.=.diti.lor.e.es.<d.se.Wt..noht..enp2.ncWCo..r...bo9c..g..prc8.ne.n.Du.a>mixm:.fs.mon.n:as.5ss.wea.3ssapne..hwtiowrihpxhs.xa:t<y0s.uicrr...b.n.x.cc.l..sm.o.c6.g.c.n.<t.:nhc.s.sti.e>i.cl..cr==dt.o..s<.eW.nh.ep.cC.r.b9.g.r8n..uamx:f.o.:s5swa3spexaih.ph.atysucr.bnxc..s..6gcn<.ncssieic.c=d..s.Wn.pc..9grn.axfo:5w3pp@`hayurbx.s.gn.cseccd..np.9r.xo53.k<ol.n.>oe...m```````00000000000..................................x.x.x.x.x.x.x.x.x.x.x.x.x.x.x.x.8x..8x..8x..8x..8x..8x..8x..8x..8x..8x..8x..8x..8x..8x..8x..8x...8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X....8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X....nP...8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.and.to.permit.persons.to.whom.the.Software.is.furnished.to.do.so.re.defghijklmnopqrstuvwxyz{|}~.hne.oefcsblwaefb..><sebyxls.r:cea.irsf.o:s.1.aieteso=10><sebydniy..eso=0000..poesrrhtcue.8...ae.ut...ye.i3....<ecito>.ewr.letadtria.mltr.ecito>..dpnec>..dpnetseby.......la.omncnrl..nta.f5t.e
.ix.aie.......loigcnrl.ntecin.ra.......asmliett.ye.i3........ae.irsf.idw.omncnrl........eso=6000........ulcetkn.55614c1f.......lnug=.........rcsoacietr=x6.....eednasml>...eedny..<..elr.st.e.p.wr.....<sv:plcto.mn:sv=unshmsmcootcmamv....1.mPP0.stig.....xls.tp.shmsmcootcmsi20.idwstig.....<paaetu<diwr>....sv:idwstig>...sv:plcto><asml>..@....0.mvrin.......rcsoAcietr=x6..nm=PTY..tp=wn2.>..dsrpinAntokcin.n.emnleuao<dsrpin..<eedny..<eednAsml>....<..odCmo.otos6isedo..ogtWnPntv........okn.otosi.h.letae..>....<sebydniytp=wn2.......nm=McootWnosCmo.otos.......vrin............pbiKyoe=69b44cfd........agae.........poesrrhtcue.8.>..<dpnetseby..<dpnec>....Dcaeu.ob.DIaae..>..am3apiainxlsam3.r:cea.irsf.o:s.3>...am3wnosetns.....mn=ht:.cea.irsf.o.M.05Wnosetns>....diwr>re.pAae...<am3wnosetns..<am3apiain..seby..1mm..P..itiue.ulcne.n.rsl.oiso.h.otaedfhjlnprtvxz|~heofslaf.>sbxsrcais.:..its=0<eyny.s=00.osrtu...eu..ei..<ct>erltdramt.ct>.pe>.pesb....aonnl.t.5..xae...licr.tcnr....slety.3....eis.d.mcr....eo60....uctn5641....ng.....cocerx...ensl..en.<.l.tepw...s:lt.ns=nhscocav...P0si...l.psmmotmi0iwtg..<aeudw>..viwtg..vpco<sl.@...vi....cocerx.n=T.t=n..driAtki..mluodri.<en.<ensl..<.dm.tsieo.gWPt....onooihlte...<eynypw2...n=coWoCooo....rn......bKo=94cd....ge....perhce8>.dntey.dnc..Dauo.Ia.>.maiixsm.:e.rfos3..mwoen...nh:cais...5nsts..dw>epa..a3nsts.a3pan.ey.m.Pgttmt..htsc.d.rdhlptx|hosa.sxras:.t=<yys0.st..u.i.c>rtrm.t.e.eb..on...xe.lc.cr..lt...esdmr..o0..cn61..g..ccr..nl.n<ltp..:tn=hccv.Ps...smtiit.<ed>.it.vc<l@.v..ccr.=.=.diti.lor.e.es.<d.se.Wt..noht..enp2.ncWCo..r...bo9c..g..prc8.ne.n.Du.a>mixm:.fs.mon.n:as.5ss.wea.3ssapne..gtt.tcdrhpxhs.xa:t<y0s.uicrr...b.n.x.cc.l..sm.o.c6.g.c.n.<t.:nhc.s.sti.e>i.cl..cr==dt.o..s<.eW.nh.ep.cC.r.b9.g.r8n..uamx:f.o.:s5swa3spe.wiwiph.atysucr.bnxc..s..6gcn<.ncssieic.c=d..s.Wn.pc..9grn.axfo:5w3pxi.hayurbx.s.gn.cseccd..np.9r.xo53p`aub..ncec.n.rx5.<ln>e.m```000000.................................x.x.x.x.x.x.x.x.x.x.x.x.x.x.x.x.8x..8x..8x..8x..8x..8x..8x..8x..8x..8x..8x..8
x..8x..8x..8x..8x...8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..8.x..X..P.8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X..H..8...x..h..X....n....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.....H.....8.......x.....h.....X.subject.to.the.following.conditions:The.above.copyright.notice.and.this.permission.notice.shall.beincluded.in.all.copies.or.substantial.portions.of.the.Software.THE.SOFTWARE.IS.PROVIDED..AS.IS...WITHOUTWARRANTY.OF.ANY.KIND..EXPRESS.OR.IMPLIED.INCLUDING.BUT.NOT.LIMITED.TO.THE.WARRANTIES.OFMERCHANTABILITY..FITNESS.FOR.A.PARTICULARPURPOSE.AND.NONINFRINGEMENT...IN.NO.EVENT.SHALL.THECOPYRIGHT.HOLDERS.BE.LIABLE.FOR.ANY.CLAIM..DAMAGESOR.OTHER.LIABILITY..WHETHER.IN.AN.ACTION.OF.CONTRACT.TORT.OR.OTHERWISE..ARISING.FROM..OUT.OF.OR.INCONNECTION.WITH.THE.SOFTWARE.OR.THE.USE.OROTHER.DEALINGS.IN.THE.SOFTWARE.VS_VERSION_INFOStringFileInfo080904B0CompanyNameSimon.TathamProductNamePuTTY.suiteFileDescriptionSSH..Telnet.and.Rlogin.clientInternalNamePuTTYOriginalFilenamePuTTYFileVersionRelease.0.63ProductVersionRelease.0.63LegalCopyrightCopyright..1997.2013.Simon.Tatham.VarFileInfoTranslation.This.program.cannot.be.run.in.DOS.mode.Rich.~.text`.rdata@.data.rsrchTwEYYu;9]VhLwEYYu>9]hDwEYYuO9]h<wEYYuf9]h4wEh.wE.h.wEh.wEYYuH9EYYu;9EYYtjhhvEhgvE.GhdvEYYuE9E^h`vEYYus9E
4 |
--------------------------------------------------------------------------------
/reputation-services/hash_services.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | import sys
3 | import hashlib
4 | import logging
5 | import requests
6 | import argparse
7 | import json
8 | import yaml
9 | import os
10 | from bs4 import BeautifulSoup
11 | import pycurl
12 | import StringIO
13 |
14 | # setup logging
15 | logging.basicConfig(stream=sys.stdout,level = logging.DEBUG)
16 | logger = logging.getLogger(__name__)
17 |
18 | class hash_services():
19 |
20 | def __init__(self, config_file=None, check_file=None, hash_md5=None):
21 |
22 | self.config_file = config_file
23 | self.check_file = check_file
24 | self.hash_md5 = hash_md5
25 | self.hash_sha256 = None
26 | self.shadowserver_url = None
27 | self.shadowserver_whitelist_url = None
28 | self.threatexpert_url = None
29 | self.virustotal_url = None
30 | self.virustotal_api_key = None
31 | self.proxy_dict = dict()
32 | self.use_proxy = False
33 | self.proxy_auth_type = 'basic'
34 | self.proxy_host = ''
35 | self.proxy_port = 8080
36 | self.proxy_user = ''
37 | self.proxy_password = ''
38 |
39 | # check if config file path is OK
40 | if not (self.config_file and os.path.isfile(self.config_file)):
41 |             logger.error("The configuration file {} to be used for checking"
42 |                          " file reputation was not found. Quitting...".format(self.config_file))
43 | sys.exit(1)
44 |
45 | # check if file path is valid for file under test
46 | if not(self.check_file and os.path.isfile(self.check_file)):
47 |             logger.info("The file {} could not be found, so its reputation"
48 |                         " cannot be determined".format(self.check_file))
49 | #sys.exit(1)
50 |
51 | # process configuration file
52 | self._process_config()
53 |
54 | def _process_config(self):
55 | # process configuration file to find md5, sha256 hashes of check_file.
56 | # also build proxy_dict, shadowserver urls, threatexpert urls etc.
57 | try:
58 | config_data = self._yaml_config()
59 | if config_data:
60 | # enable proxy or not. If yes, add proxy details
61 | if config_data['proxy']['enable']:
62 | self.proxy_dict = {
63 | 'http':'http://{}:{}@{}:{}'.format(config_data['proxy']['user'],config_data['proxy']['password'],
64 | config_data['proxy']['host'], config_data['proxy']['port']),
65 |
66 | 'https':'http://{}:{}@{}:{}'.format(config_data['proxy']['user'],config_data['proxy']['password'],
67 | config_data['proxy']['host'], config_data['proxy']['port'])
68 | }
69 | self.proxy_user = config_data['proxy']['user']
70 | self.proxy_password = config_data['proxy']['password']
71 | self.proxy_host = config_data['proxy']['host']
72 | self.proxy_port = config_data['proxy']['port']
73 | self.use_proxy = True
74 | # proxy authentication - basic/digest
75 | proxy_auth_type = config_data['proxy']['proxy_auth_type']
76 | if proxy_auth_type.lower() == 'basic':
77 | self.proxy_auth_type = pycurl.HTTPAUTH_BASIC
78 | else:
79 | self.proxy_auth_type = pycurl.HTTPAUTH_DIGEST
80 | else:
81 | self.proxy_dict = dict()
82 | self.use_proxy = False
83 |
84 | # compute md5 and sha256 checksum
85 | if not self.hash_md5: # md5 hash is not specified
86 | self.hash_md5 = self._md5_hash(self.check_file)
87 | self.hash_sha256 = self._sha256_hash(self.check_file)
88 | # prepare urls for various site feeds
89 |
90 | # shadowserver url
91 | if config_data['online-hash-services']['shadowserver']['enable']:
92 | logger.debug("Shadow server url - %s" % config_data['online-hash-services']['shadowserver']['url'])
93 | self.shadowserver_url = config_data['online-hash-services']['shadowserver']['url']
94 | else:
95 | self.shadowserver_url = None
96 | logger.info("Shadowserver url: {}".format( self.shadowserver_url))
97 |
98 | # shadowserver whitelist url
99 | if config_data['online-hash-services']['shadowserver-whitelist']['enable']:
100 | logger.debug("Shadow server whitelist url - {}".format(config_data['online-hash-services']['shadowserver-whitelist']['url']))
101 | self.shadowserver_whitelist_url = config_data['online-hash-services']['shadowserver-whitelist']['url']
102 | else:
103 | self.shadowserver_whitelist_url = None
104 | logger.info("Shadowserver whitelist url: {}".format( self.shadowserver_whitelist_url))
105 |
106 | # check hash using threatexpert service
107 | if config_data['online-hash-services']['threatexpert']['enable']:
108 | logger.debug("Threat expert url - %s" % config_data['online-hash-services']['threatexpert']['url'])
109 | self.threatexpert_url = config_data['online-hash-services']['threatexpert']['url']
110 |
111 | # check hash using virustotal service
112 | if config_data['online-hash-services']['virustotal']['enable']:
113 | logger.debug("Virustotal url - %s" % config_data['online-hash-services']['virustotal']['url'])
114 | logger.debug("Virustotal key - %s" % config_data['online-hash-services']['virustotal']['key'])
115 | self.virustotal_url = config_data['online-hash-services']['virustotal']['url']
116 | self.virustotal_api_key = config_data['online-hash-services']['virustotal']['key']
117 | logger.info("Virustotal url: {} ".format(self.virustotal_url))
118 |
119 |
120 | except Exception,e:
121 | logger.error("Error while processing configuration of file {} - {}"
122 | .format(self.config_file,e.message),exc_info=True)
123 |
124 |
125 | def _yaml_config(self):
126 |
127 | configuration_data = None
128 | try:
129 | # yaml configuration
130 | if os.path.isfile(self.config_file):
131 | with open(self.config_file, 'r') as f:
132 |                     configuration_data = yaml.safe_load(f)
133 | except Exception,e:
134 | logger.error("Error while reading yaml configuration file {} - {}"
135 | .format(self.config_file,e.message), exc_info=True)
136 |
137 | return configuration_data
138 |
139 | def _md5_hash(self, filename):
140 | md5 = None
141 | try:
142 | f = open(filename, "rb")
143 | data = f.read()
144 | md5 = hashlib.md5(data).hexdigest()
145 | f.close()
146 | except Exception, e:
147 | logger.error("Error while computing md5 checksum for file {} - {}"
148 | .format(filename,e.message),exc_info=True)
149 | return md5
150 |
151 | def _sha256_hash(self, filename):
152 | sha256 = None
153 | try:
154 | f = open(filename, "rb")
155 | data = f.read()
156 | sha256 = hashlib.sha256(data).hexdigest()
157 | f.close()
158 | except Exception, e:
159 | logger.error("Error while computing sha256 checksum for file {} - {}"
160 | .format(filename,e.message),exc_info=True)
161 |
162 | return sha256
163 |
164 | def _curl_response(self,url):
165 |
166 | try:
167 | response = None
168 | output = StringIO.StringIO()
169 | curl_instance = pycurl.Curl()
170 | curl_instance.setopt(pycurl.FOLLOWLOCATION,1)
171 | curl_instance.setopt(pycurl.USERAGENT, 'Mozilla/57.0 (Windows NT 6.3; Win64; x64)'
172 | ' AppleWebKit/537.36 (KHTML, like Gecko) Chrome/37.0.2049.0 Safari/537.36')
173 |
174 | if self.use_proxy:
175 | curl_instance.setopt(pycurl.PROXY, self.proxy_host)
176 | curl_instance.setopt(pycurl.PROXYPORT, self.proxy_port)
177 | curl_instance.setopt(pycurl.PROXYAUTH, self.proxy_auth_type)
178 | curl_instance.setopt(pycurl.PROXYUSERPWD, "{}:{}".format(self.proxy_user, self.proxy_password))
179 | curl_instance.setopt(pycurl.VERBOSE, 0)
180 | curl_instance.setopt(pycurl.SSL_VERIFYPEER, 0)
181 | curl_instance.setopt(curl_instance.URL, url)
182 | curl_instance.setopt(curl_instance.WRITEDATA, output)
183 | curl_instance.perform()
184 | response = output.getvalue()
185 | curl_instance.close()
186 |
187 | except Exception,e:
188 | logger.error("Error while getting response from url {} - {}".format(url, e.message), exc_info=True)
189 |
190 | return response
191 |
192 | def shadowserver_hash_report(self):
193 | try:
194 | response = None
195 | shadow_response = None
196 |
197 | if not self.hash_md5:
198 |                 logger.info("MD5 hash of file {} is not valid. Quitting...".format(self.check_file))
199 | sys.exit(1)
200 |
201 | shadowserver_url = self.shadowserver_url + self.hash_md5
202 | response = self._curl_response(shadowserver_url)
203 |
204 | if response:
205 | logger.debug("Shadow server response - {}".format(response))
206 |                 split_response = response.strip().split(' ', 1)
207 |                 hash_details = split_response[1] if len(split_response) > 1 else ''
208 | logger.debug("Shadow server response - hash details - {}".format(hash_details))
209 | if hash_details:
210 | shadow_response = json.loads(hash_details)
211 | return shadow_response
212 |
213 | except Exception,e:
214 | logger.error("Error while getting file hash information from Shadow Server - %s" %e.message,exc_info=True)
215 |
216 | return shadow_response
217 |
218 |
219 | def shadowserver_whitelist_report(self):
220 |
221 | try:
222 | if not self.hash_md5:
223 |                 logger.info("MD5 hash of file {} is not valid. Quitting...".format(self.check_file))
224 | sys.exit(1)
225 | response = None
226 | shadowserver_whitelist_url = self.shadowserver_whitelist_url + self.hash_md5
227 | response = self._curl_response(shadowserver_whitelist_url)
228 | if response:
229 | # check if the hash is marked as whitelist or not
230 | whitelist_response=response.strip().split(',')[0]
231 | if "whitelisted" in whitelist_response.lower():
232 | return True, whitelist_response
233 | else:
234 | return False, whitelist_response
235 |
236 | else: return False, ''
237 |
238 | except Exception as e:
239 | logger.error("Error while getting file whitelist information from Shadow Server - %s" %e.message,exc_info=True)
240 | # not whitelisted
241 | return False, ''
242 |
243 | def threatexpert_hash_report(self):
244 |
245 | try:
246 | if not self.hash_md5:
247 |                 logger.info("MD5 hash of file {} is not valid. Quitting...".format(self.check_file))
248 | sys.exit(1)
249 |
250 | response = None
251 | threatexpert_url = self.threatexpert_url + self.hash_md5
252 | response = self._curl_response(threatexpert_url)
253 | if response:
254 | soup = BeautifulSoup(response,'lxml')
255 | element_p = soup.findAll('p')
256 | for item in element_p:
257 | logger.info(item.text)
258 | if 'no threatexpert reports found'.lower() in item.text.lower():
259 | return False
260 | return True
261 | except Exception as e:
262 | logger.error("Error while getting file hash information from Threat expert - %s" %e.message,exc_info=True)
263 |
264 | return False
265 |
266 | def virustotal_hash_report(self):
267 |
268 | try:
269 | if not self.hash_md5:
270 |                 logger.info("MD5 hash of file {} is not valid. Quitting...".format(self.check_file))
271 | sys.exit(1)
272 |
273 | vt_response = None
274 | virustotal_url = "{}?apikey={}&resource={}".format(self.virustotal_url,self.virustotal_api_key,self.hash_md5)
275 | response = self._curl_response(virustotal_url)
276 | if response:
277 | vt_response = json.loads(response)
278 |
279 | except Exception as e:
280 | logger.error("Error while getting file hash information from VirusTotal - %s" %e.message,exc_info=True)
281 |
282 | return vt_response
283 |
--------------------------------------------------------------------------------
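A note on `_md5_hash` / `_sha256_hash` above: both read the entire sample into memory before hashing, which is wasteful for large binaries. A chunked, single-pass digest avoids that; a minimal Python 3 sketch (the `file_digests` name is illustrative, not part of the project):

```python
import hashlib


def file_digests(path, chunk_size=65536):
    """Compute MD5 and SHA-256 in one pass, reading the file in fixed-size chunks."""
    md5 = hashlib.md5()
    sha256 = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            md5.update(chunk)
            sha256.update(chunk)
    return md5.hexdigest(), sha256.hexdigest()
```

Because both digests are updated from the same read loop, the file is traversed once regardless of its size.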
/reputation-services/main.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | import sys
3 | import logging
4 | import requests
5 | import argparse
6 | import yaml
7 | import os
8 | from pprint import pprint
9 | from hash_services import hash_services
10 | from virustotal_services import Virustotal
11 | from shodan_services import Shodan_checks
12 |
13 | from time import sleep
14 |
15 | # setup logging
16 | logging.basicConfig(stream=sys.stdout,level = logging.DEBUG)
17 | logger = logging.getLogger(__name__)
18 |
19 | def yaml_config(yaml_file):
20 | try:
21 | with open(yaml_file, 'r') as f:
22 |             configuration_data = yaml.safe_load(f)
23 | return configuration_data
24 |
25 | except Exception as exc:
26 |         logger.error("Error while reading yaml configuration file - %s" %exc.message,exc_info=True)
27 |
28 |
29 | def cmd_arguments():
30 |
31 | try:
32 |         parser = argparse.ArgumentParser(description='This script uses file reputation services on the internet to check whether a file is malicious.')
33 | parser.add_argument('--config', required=True, help='Please specify configuration file',dest='config_file')
34 | parser.add_argument('--hash-file', required=False, help='Please specify the file to be checked for its reputation',dest='hash_file')
35 | parser.add_argument('--md5-hash', required=False, help='Please specify md5 hash',dest='hash')
36 | args = parser.parse_args()
37 | return args
38 | except Exception as exc:
39 | logger.error("Error while getting command line arguments - %s" %exc.message,exc_info=True)
40 |
41 | if __name__ == "__main__":
42 | try:
43 |
44 | cmd_args = cmd_arguments()
45 |
46 | if cmd_args:
47 |
48 | # read yaml configuration
49 | if os.path.isfile(cmd_args.config_file):
50 | config_data = yaml_config(cmd_args.config_file)
51 |             logger.debug(config_data)
52 |
53 |
54 | # Shodan information
55 | logger.info("Getting information using Shodan servers APIs...")
56 | shodan_instance = Shodan_checks(config_data['shodan-services']['key'],
57 | config_data['shodan-services']['url'],
58 | config_data['proxy']['enable'], config_data['proxy']['proxy_auth_type'],
59 | config_data['proxy']['host'], config_data['proxy']['port'],
60 | config_data['proxy']['user'], config_data['proxy']['password'])
61 |
62 | # get API information
63 | logger.info("Getting Shodan API status information.")
64 | shodan_instance.api_information()
65 | logger.info("Shodan API status information is fetched successfully.")
66 |
67 | ### HTTP client headers information
68 | logger.info("Getting HTTP client headers using Shodan API.")
69 | shodan_instance.http_headers()
70 | logger.info("HTTP client headers are fetched successfully.")
71 |
72 | ### IP information
73 | logger.info("Getting IP information using Shodan API.")
74 | shodan_instance.ip_information('59.185.236.31')
75 | logger.info("IP information is fetched successfully.")
76 |
77 | ### Scanning ports usage
78 | logger.info("Checking scan ports using Shodan API.")
79 | shodan_instance.scanning_ports()
80 | logger.info("Scan Ports information is fetched successfully.")
81 |
82 | #### check if VPN ports are open
83 | logger.info("Checking if VPN ports are open or not using Shodan API")
84 | shodan_instance.is_vpn('59.185.236.31')
85 | logger.info("VPN ports information is fetched successfully.")
86 |
87 | hash_val = '7657fcb7d772448a6d8504e4b20168b8'
88 | # Use of different hash services available on web to find out malicious files
89 | hash_service_instance = hash_services(cmd_args.config_file,cmd_args.hash_file)
90 |
91 | #shadow server hash report
92 | logger.info("Getting Shadow server hash report...")
93 | response = hash_service_instance.shadowserver_hash_report()
94 | logger.info("Shadow server hash report:\n {}".format(response))
95 |
96 | # shadow server whitelist response
97 | logger.info("Getting Shadow server whitelist report...")
98 | whitelist_status, whitelist_response = hash_service_instance.shadowserver_whitelist_report()
99 |                 logger.info("Shadow server whitelist report:\n {} - {}".format(whitelist_status, whitelist_response))
100 |
101 | #threat expert report
102 | logger.info("Getting Threatexpert report...")
103 | response = hash_service_instance.threatexpert_hash_report()
104 | logger.info("Threatexpert hash report:\n {}".format(response))
105 |
106 | #virustotal report
107 | logger.info("Getting virustotal report...")
108 | response = hash_service_instance.virustotal_hash_report()
109 | logger.info("Virustotal hash report:\n {}".format(response))
110 |
111 | logger.info("Getting Virustotal response for hash {}".format(hash_val))
112 |
113 | # use virustotal service to find malicious hash,url,ip information
114 | vt = Virustotal(config_data['online-hash-services']['virustotal']['key'],
115 | config_data['proxy']['enable'], config_data['proxy']['proxy_auth_type'],
116 | config_data['proxy']['host'], config_data['proxy']['port'],
117 | config_data['proxy']['user'], config_data['proxy']['password'])
118 |
119 | # Virustotal file report
120 |             logger.info("Getting Virustotal file report...")
121 | if os.path.isfile(cmd_args.hash_file):
122 | response = vt.file_report(cmd_args.hash_file)
123 | logger.info("Virustotal report of file {}:\n {}".format(cmd_args.hash_file,response))
124 | sleep(10)
125 | else:
126 |                 logger.info("The file {} is not present on the system. "
127 |                             "Kindly re-check the path and then try again.".format(cmd_args.hash_file))
128 |
129 | # Virustotal hash report
130 | logger.info("Getting Virustotal hash report...")
131 | response = vt.hash_report(hash_val)
132 | logger.info("Virustotal report for hash {}:\n{}".format(hash_val,response))
133 | sleep(10)
134 |
135 | # Virustotal url report
136 | logger.info("Getting Virustotal url report...")
137 | url = 'http://www.barc.gov.in'
138 | response = vt.url_report(url)
139 | logger.info("Virustotal report for url {}:\n{}".format(url,response))
140 | sleep(10)
141 |
142 | # Virustotal ip report
143 | logger.info("Getting Virustotal ip report...")
144 | test_ip = '59.185.236.31'
145 | response = vt.ip_report(test_ip)
146 | logger.info("Virustotal report for ip {}:\n{}".format(test_ip,response))
147 | sleep(10)
148 |
149 | # Virustotal domain report
150 | logger.info("Getting Virustotal domain report...")
151 | test_domain = '027.ru'
152 | response = vt.domain_report(test_domain)
153 |             logger.info("Virustotal report for domain {}:\n{}".format(test_domain,response))
154 | sleep(10)
155 |
156 |
157 | except Exception,e:
158 | logger.error("Error while running the python script - {}".format(e.message),exc_info=True)
159 |
--------------------------------------------------------------------------------
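The VirusTotal v2 `file/report` endpoint used above returns a JSON payload whose `response_code`, `positives` and `total` fields carry the verdict. A small offline sketch of reducing such a payload to a summary (the `summarize_vt_report` helper and the sample payload are illustrative, not part of the project):

```python
import json


def summarize_vt_report(raw_json):
    """Reduce a VirusTotal v2 file-report payload to the fields the logs care about."""
    report = json.loads(raw_json)
    if report.get("response_code") != 1:
        return None  # hash unknown to VirusTotal
    positives = report.get("positives", 0)
    return {
        "md5": report.get("md5"),
        "positives": positives,
        "total": report.get("total", 0),
        "malicious": positives > 0,
    }


# sample payload shaped like a v2 file/report response (values are made up)
sample = ('{"response_code": 1, "md5": "7657fcb7d772448a6d8504e4b20168b8",'
          ' "positives": 42, "total": 57}')
```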
/reputation-services/services-settings.yaml:
--------------------------------------------------------------------------------
1 | general:
2 | # NOTSET=0, DEBUG=10, INFO=20, WARNING=30, ERROR=40
3 | log_level: 20
4 |
5 | online-hash-services:
6 | virustotal:
7 | url: https://www.virustotal.com/vtapi/v2/file/report
8 | key:
9 | enable: yes
10 | threatexpert:
11 | url: http://threatexpert.com/reports.aspx?find=
12 | enable: yes
13 | shadowserver:
14 | url: http://bin-test.shadowserver.org/api?md5=
15 | enable: yes
16 | shadowserver-whitelist:
17 | url: http://innocuous.shadowserver.org/api/?query=
18 | enable: yes
19 |
20 | shodan-services:
21 | key:
22 | url: https://api.shodan.io
23 |
24 | proxy:
25 | user:
26 | password:
27 | host: 10.0.0.1
28 | port: 8080
29 | enable: yes
30 | # timeout period in sec - default 60
31 | timeout: 60
32 | # proxy authentication type - basic, digest
33 | proxy_auth_type: digest
34 |
35 |
36 |
--------------------------------------------------------------------------------
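Once services-settings.yaml above is parsed (e.g. with PyYAML's `yaml.safe_load`), it becomes a nested dict; note that a bare `yes` parses to boolean `True`. A minimal sketch of defensive lookups over that structure (`get_setting` is a hypothetical helper; the dict below abridges the file above):

```python
# What the parsed YAML above looks like as a Python dict (abridged):
settings = {
    'general': {'log_level': 20},
    'online-hash-services': {
        'virustotal': {'url': 'https://www.virustotal.com/vtapi/v2/file/report',
                       'key': None, 'enable': True},
    },
    'proxy': {'host': '10.0.0.1', 'port': 8080, 'enable': True,
              'timeout': 60, 'proxy_auth_type': 'digest'},
}

def get_setting(cfg, path, default=None):
    """Walk a dotted path like 'proxy.port' through nested dicts."""
    node = cfg
    for key in path.split('.'):
        if not isinstance(node, dict) or key not in node:
            return default
        node = node[key]
    return node
```

Usage: `get_setting(settings, 'proxy.port')` yields `8080`, and a missing key falls back to the supplied default instead of raising `KeyError`.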
/reputation-services/shodan_services.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | import pycurl
3 | import StringIO
4 | import os
5 | import sys
6 | import time
7 | import ipaddress
8 | import validators
9 | import urllib
10 | import logging
11 | import json
12 |
13 | # setup logging
14 | logging.basicConfig(stream=sys.stdout,level = logging.ERROR)
15 | logger = logging.getLogger(__name__)
16 |
17 | class Shodan_checks():
18 |
19 | api_base_url = 'https://api.shodan.io'
20 |
21 | def __init__(self,api_key=None, base_url=None,use_proxy = False,
22 | proxy_auth_type = 'basic', proxy_host = None,
23 | proxy_port=8080, proxy_user=None,proxy_password = None):
24 |
25 | self.api_key = api_key
26 | self.use_proxy = use_proxy
27 | self.proxy_host = proxy_host
28 |         self.proxy_port = proxy_port
29 | self.proxy_user = proxy_user
30 | self.proxy_password = proxy_password
31 | if proxy_auth_type.lower() == 'basic':
32 | self.proxy_auth_type = pycurl.HTTPAUTH_BASIC
33 | else:
34 | self.proxy_auth_type = pycurl.HTTPAUTH_DIGEST
35 |
36 | # shodan base url
37 |         self.base_url = base_url or Shodan_checks.api_base_url
38 |
39 | def valid_ip(self,ip_addr):
40 | try:
41 | ip_addr = unicode(ip_addr)
42 | res = ipaddress.ip_address(ip_addr)
43 | return True
44 | except Exception:
45 | return False
46 |
47 | def _curl_response(self, url = None, request_type='GET', parameters = None):
48 |
49 | try:
50 | response = None
51 | output = StringIO.StringIO()
52 | curl_instance = pycurl.Curl()
53 | if request_type == 'POST':
54 | if parameters:
55 | curl_instance.setopt(curl_instance.HTTPPOST,parameters)
56 | elif request_type == 'GET':
57 | if parameters:
58 | url = url + '?' + urllib.urlencode(parameters)
59 |
60 | curl_instance.setopt(pycurl.FOLLOWLOCATION,1)
61 |             curl_instance.setopt(pycurl.USERAGENT, 'Mozilla/5.0 (Windows NT 6.3; Win64; x64)'
62 | ' AppleWebKit/537.36 (KHTML, like Gecko) Chrome/37.0.2049.0 Safari/537.36')
63 |
64 | if self.use_proxy:
65 | curl_instance.setopt(pycurl.PROXY, self.proxy_host)
66 | curl_instance.setopt(pycurl.PROXYPORT, self.proxy_port)
67 | curl_instance.setopt(pycurl.PROXYAUTH, self.proxy_auth_type)
68 | curl_instance.setopt(pycurl.PROXYUSERPWD, "{}:{}".format(self.proxy_user, self.proxy_password))
69 | curl_instance.setopt(pycurl.VERBOSE, 0)
70 | curl_instance.setopt(pycurl.SSL_VERIFYPEER, 0)
71 | curl_instance.setopt(curl_instance.URL, url)
72 | curl_instance.setopt(curl_instance.WRITEDATA, output)
73 | curl_instance.perform()
74 | response = output.getvalue()
75 | curl_instance.close()
76 |
77 | except Exception,e:
78 |             logger.error("Error while getting response from Shodan API service - {}".format(e.message), exc_info=True)
79 |
80 | return response
81 |
82 | def api_information(self):
83 |
84 | json_response = None
85 | try:
86 | shodan_url = self.base_url + '/api-info?key={}'.format(self.api_key)
87 | shodan_response= self._curl_response(shodan_url, 'GET', None)
88 | if shodan_response:
89 | logger.info("Shodan response:{}".format(shodan_response))
90 | json_response = json.loads(shodan_response)
91 | except Exception,exc:
92 | logger.error("Error while getting Shodan API status information - {}".format(exc.message),exc_info=True)
93 | return json_response
94 |
95 |
96 | def ip_information(self,ip=None):
97 |
98 | json_response = None
99 | try:
100 | shodan_url = self.base_url + '/shodan/host/{}?key={}'.format(ip,self.api_key)
101 | shodan_response= self._curl_response(shodan_url, 'GET', None)
102 | if shodan_response:
103 | logger.info("Shodan response:{}".format(shodan_response))
104 | json_response = json.loads(shodan_response)
105 | except Exception,exc:
106 | logger.error("Error while getting IP information from Shodan API - {}".format(exc.message),exc_info=True)
107 |
108 | return json_response
109 |
110 | def scanning_ports(self):
111 |
112 | json_response = None
113 | try:
114 | shodan_url = self.base_url + '/shodan/ports?key={}'.format(self.api_key)
115 | shodan_response= self._curl_response(shodan_url, 'GET', None)
116 | if shodan_response:
117 | logger.info("Shodan response:{}".format(shodan_response))
118 | json_response = json.loads(shodan_response)
119 | except Exception,exc:
120 | logger.error("Error while getting port information from Shodan API - %s" %exc.message,exc_info=True)
121 |
122 | return json_response
123 |
124 | def is_vpn(self,ip=None):
125 | json_response = None
126 | try:
127 | shodan_url = self.base_url + '/shodan/host/{}?key={}'.format(ip,self.api_key)
128 | shodan_response= self._curl_response(shodan_url, 'GET', None)
129 | if shodan_response:
130 | logger.info("Shodan response:{}".format(shodan_response))
131 | json_response = json.loads(shodan_response)
132 | for banner in json_response['data']:
133 | if banner['port'] in [500, 4500]:
134 | return True
135 | except Exception,exc:
136 | logger.error("Error while detecting VPN from Shodan API - %s" %exc.message,exc_info=True)
137 |
138 | return False
139 |
140 |
141 | def http_headers(self):
142 |
143 | json_response = None
144 | try:
145 | shodan_url = self.base_url + '/tools/httpheaders?key={}'.format(self.api_key)
146 | shodan_response= self._curl_response(shodan_url, 'GET', None)
147 | if shodan_response:
148 | logger.info("Shodan response:{}".format(shodan_response))
149 | json_response = json.loads(shodan_response)
150 |
151 | except Exception,exc:
152 | logger.error("Error while getting client HTTP headers information from Shodan API - %s" %exc.message,exc_info=True)
153 |
154 | return json_response
155 |
156 |
--------------------------------------------------------------------------------
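`is_vpn` above flags a host when any Shodan banner reports port 500 or 4500 (IKE and IPsec NAT-traversal). The same check as a standalone sketch against a canned response (`has_vpn_ports` and the sample JSON are fabricated for illustration):

```python
import json

IKE_PORTS = frozenset([500, 4500])  # IKE and IPsec NAT-traversal ports

def has_vpn_ports(host_json):
    """Return True if any banner in a Shodan-style host report uses an IKE port."""
    return any(banner.get('port') in IKE_PORTS
               for banner in host_json.get('data', []))

# Fabricated Shodan-style host report:
sample = json.loads('{"ip_str": "203.0.113.7", '
                    '"data": [{"port": 443}, {"port": 4500}]}')
```

Here `has_vpn_ports(sample)` returns `True` because of the port-4500 banner; unlike the method above, a missing `data` key yields `False` instead of raising `KeyError`.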
/reputation-services/virustotal_services.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | import pycurl
3 | import StringIO
4 | import os
5 | import sys
6 | import time
7 | import ipaddress
8 | import validators
9 | import urllib
10 | import logging
11 | import hashlib
12 |
13 | # setup logging
14 | logging.basicConfig(stream=sys.stdout,level = logging.ERROR)
15 | logger = logging.getLogger(__name__)
16 |
17 | class Virustotal():
18 |
19 | def __init__(self,api_key = None, use_proxy = False, proxy_auth_type = 'basic',
20 | proxy_host = None, proxy_port=8080, proxy_user=None,
21 | proxy_password = None):
22 | self.api_key = api_key
23 | self.use_proxy = use_proxy
24 | self.proxy_host = proxy_host
25 |         self.proxy_port = proxy_port
26 | self.proxy_user = proxy_user
27 | self.proxy_password = proxy_password
28 | if proxy_auth_type.lower() == 'basic':
29 | self.proxy_auth_type = pycurl.HTTPAUTH_BASIC
30 | else:
31 | self.proxy_auth_type = pycurl.HTTPAUTH_DIGEST
32 |
33 | def _md5_hash(self, filename):
34 | md5 = None
35 | try:
36 | f = open(filename, "rb")
37 | data = f.read()
38 | md5 = hashlib.md5(data).hexdigest()
39 | f.close()
40 | except Exception, e:
41 | logger.error("Error while computing md5 checksum for file {} - {}"
42 | .format(filename,e.message),exc_info=True)
43 | return md5
44 |
45 | def valid_ip(self,ip_addr):
46 | try:
47 | ip_addr = unicode(ip_addr)
48 | res = ipaddress.ip_address(ip_addr)
49 | return True
50 | except Exception:
51 | return False
52 |
53 | def file_report(self,filename):
54 |
55 | vt_response = None
56 | try:
57 | vt_url = 'https://www.virustotal.com/vtapi/v2/file/report'
58 | if not os.path.isfile(filename):
59 | logger.error("The file path {} is not valid!".format(filename))
60 |                 return vt_response
61 |
62 | # compute file hash
63 | file_hash = self._md5_hash(filename)
64 |
65 | if self.api_key:
66 | post_parameters = [('resource', file_hash),
67 | ('apikey',self.api_key)]
68 | vt_response = self._curl_response(vt_url, 'POST', post_parameters)
69 | if vt_response:
70 | logger.debug("Virustotal API results for file hash {} - {}"
71 | .format(file_hash,vt_response))
72 |
73 | except Exception,e:
74 |             logger.error("Error while retrieving virustotal file report for file {} - {}"
75 | .format(filename,e.message),exc_info = True)
76 |
77 | return vt_response
78 |
79 | def hash_report(self,hash_val):
80 |
81 | vt_response = None
82 | try:
83 | vt_url = 'https://www.virustotal.com/vtapi/v2/file/report'
84 |
85 | if self.api_key:
86 | post_parameters = [('resource', hash_val),
87 | ('apikey',self.api_key)]
88 | vt_response = self._curl_response(vt_url, 'POST', post_parameters)
89 | if vt_response:
90 | logger.debug("Virustotal API results for hash {} - {}"
91 | .format(hash_val,vt_response))
92 |
93 | except Exception,e:
94 |             logger.error("Error while retrieving virustotal report for hash {} - {}"
95 | .format(hash_val,e.message),exc_info = True)
96 |
97 | return vt_response
98 |
99 | def url_report(self,url):
100 |
101 | vt_response = None
102 | try:
103 |             if not url.startswith('http'):
104 | url = 'http://' + url
105 |
106 | vt_url = 'http://www.virustotal.com/vtapi/v2/url/report'
107 | if self.api_key:
108 | post_parameters = [('resource', url),
109 | ('apikey',self.api_key)]
110 | vt_response = self._curl_response(vt_url, 'POST', post_parameters)
111 | if vt_response:
112 | logger.debug("Virustotal API results for url {} - {}"
113 | .format(url,vt_response))
114 |
115 | except Exception,e:
116 |             logger.error("Error while retrieving virustotal url report for url {} - {}"
117 | .format(url,e.message),exc_info=True)
118 |
119 | return vt_response
120 |
121 | def ip_report(self,ip_addr):
122 |
123 | vt_response = None
124 | try:
125 | if not self.valid_ip(ip_addr):
126 |                 logger.error("IP address {} is not valid. Quitting...".format(ip_addr))
127 |                 return vt_response
128 |
129 | vt_url = 'http://www.virustotal.com/vtapi/v2/ip-address/report'
130 | if self.api_key:
131 | parameters = {'ip': ip_addr,
132 | 'apikey': self.api_key
133 | }
134 | vt_response = self._curl_response(vt_url, 'GET', parameters)
135 | if vt_response:
136 | logger.debug("Virustotal API results for ip address {} - {}"
137 | .format(ip_addr,vt_response))
138 |
139 | except Exception,e:
140 |             logger.error("Error while retrieving virustotal ip address report for ip {} - {}"
141 | .format(ip_addr,e.message),exc_info=True)
142 |
143 | return vt_response
144 |
145 | def domain_report(self,domain):
146 |
147 | vt_response = None
148 | try:
149 | if not validators.domain(domain):
150 |                 logger.error("Domain {} is not valid. Quitting...".format(domain))
151 |                 return vt_response
152 |
153 | vt_url = 'http://www.virustotal.com/vtapi/v2/domain/report'
154 | if self.api_key:
155 | parameters = {'domain': domain,
156 | 'apikey': self.api_key
157 | }
158 | vt_response = self._curl_response(vt_url, 'GET', parameters)
159 | if vt_response:
160 | logger.debug("Virustotal API results for domain {} - {}"
161 | .format(domain,vt_response))
162 |
163 | except Exception,e:
164 |             logger.error("Error while retrieving virustotal domain report for domain {} - {}"
165 | .format(domain,e.message),exc_info=True)
166 |
167 | return vt_response
168 |
169 |
170 | def _curl_response(self, url = None, request_type='GET', parameters = None):
171 |
172 | try:
173 | response = None
174 | output = StringIO.StringIO()
175 | curl_instance = pycurl.Curl()
176 | if request_type == 'POST' and parameters:
177 | curl_instance.setopt(curl_instance.HTTPPOST,parameters)
178 | elif request_type == 'GET' and parameters:
179 | url = url + '?' + urllib.urlencode(parameters)
180 |
181 | curl_instance.setopt(pycurl.FOLLOWLOCATION,1)
182 |             curl_instance.setopt(pycurl.USERAGENT, 'Mozilla/5.0 (Windows NT 6.3; Win64; x64)'
183 | ' AppleWebKit/537.36 (KHTML, like Gecko) Chrome/37.0.2049.0 Safari/537.36')
184 |
185 | if self.use_proxy:
186 | curl_instance.setopt(pycurl.PROXY, self.proxy_host)
187 | curl_instance.setopt(pycurl.PROXYPORT, self.proxy_port)
188 | curl_instance.setopt(pycurl.PROXYAUTH, self.proxy_auth_type)
189 | curl_instance.setopt(pycurl.PROXYUSERPWD, "{}:{}".format(self.proxy_user, self.proxy_password))
190 | curl_instance.setopt(pycurl.VERBOSE, 0)
191 | curl_instance.setopt(pycurl.SSL_VERIFYPEER, 0)
192 | curl_instance.setopt(curl_instance.URL, url)
193 | curl_instance.setopt(curl_instance.WRITEDATA, output)
194 | curl_instance.perform()
195 | response = output.getvalue()
196 | curl_instance.close()
197 |
198 | except Exception,e:
199 | logger.error("Error while getting response from Virustotal API service - {}".format(e.message), exc_info=True)
200 |
201 | return response
202 |
203 |
--------------------------------------------------------------------------------
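`_md5_hash` above reads the entire file into memory before hashing; for large samples a chunked read keeps memory usage flat. A minimal sketch (`md5_file` is a hypothetical helper, not part of the repo):

```python
import hashlib

def md5_file(path, chunk_size=65536):
    """Compute an MD5 hex digest without loading the whole file into memory."""
    md5 = hashlib.md5()
    with open(path, 'rb') as fh:
        # iter() with a sentinel keeps reading fixed-size chunks until EOF
        for chunk in iter(lambda: fh.read(chunk_size), b''):
            md5.update(chunk)
    return md5.hexdigest()
```

The `with` block also guarantees the handle is closed on error, which the original `open`/`read`/`close` sequence does not.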
/site-reputation/alienvault-otx-check.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | import sys
3 | import logging
4 | import requests
5 | import argparse
6 | import json
7 | from OTXv2 import OTXv2
8 | from pprint import pprint
9 |
10 | """
11 | This script uses the AlienVault API to get indicators of compromise (IOCs) - IPs, domains, hashes etc.
12 | These IOCs can be applied in an organization's environment to block or hunt for malicious activity.
13 | Links of interest:
14 | https://github.com/AlienVault-OTX/ApiV2
15 | https://github.com/AlienVault-OTX/OTX-Python-SDK/blob/master/howto_use_python_otx_api.ipynb
16 | https://github.com/Neo23x0/signature-base/blob/master/threatintel/get-otx-iocs.py
17 | https://github.com/Neo23x0/signature-base/tree/master/threatintel
18 |
19 | Sample usage cases as per https://github.com/AlienVault-OTX/ApiV2:
20 |
21 | https://www.threatcrowd.org/searchApi/v2/email/report/?email=william19770319@yahoo.com
22 | https://www.threatcrowd.org/searchApi/v2/domain/report/?domain=aoldaily.com
23 | https://www.threatcrowd.org/searchApi/v2/ip/report/?ip=188.40.75.132
24 | https://www.threatcrowd.org/searchApi/v2/antivirus/report/?antivirus=plugx
25 | https://www.threatcrowd.org/searchApi/v2/file/report/?resource=ec8c89aa5e521572c74e2dd02a4daf78
26 |
27 | """
28 | # setup logging
29 | logging.basicConfig(stream=sys.stdout,level = logging.DEBUG)
30 | logger = logging.getLogger(__name__)
31 |
32 | # OTX base urls
33 | otx_base_url_domain = 'https://www.threatcrowd.org/searchApi/v2/domain/report/?'
34 | otx_base_url_ip = 'https://www.threatcrowd.org/searchApi/v2/ip/report/?'
35 | otx_base_url_hash = 'https://www.threatcrowd.org/searchApi/v2/file/report/?'
36 | otx_base_url_email = 'https://www.threatcrowd.org/searchApi/v2/email/report/?'
37 |
38 | def cmd_arguments():
39 |
40 | try:
41 |         parser = argparse.ArgumentParser(description="This script checks domain reputation using the Alienvault threat exchange service.")
42 |
43 | parser.add_argument('--domain', required=False, help='Please specify domain name!',dest='domain')
44 | parser.add_argument('--ip', required=False, help='Please specify ip!',dest='ip')
45 | parser.add_argument('--email', required=False, help='Please specify email!',dest='email')
46 |         parser.add_argument('--hash', required=False, help='Please specify file hash!',dest='hash')
47 |
48 | parser.add_argument('--proxy-host', required=False, help='Please specify proxy host',dest='proxy_host')
49 | parser.add_argument('--proxy-port', required=False, help='Please specify proxy port',dest='proxy_port')
50 | parser.add_argument('--proxy-user', required=False, help='Please specify proxy user',dest='proxy_user')
51 | parser.add_argument('--proxy-password', required=False, help='Please specify proxy password',dest='proxy_password')
52 |
53 | args = parser.parse_args()
54 | if args.domain == None and args.ip == None and args.email == None and args.hash == None:
55 | logger.error("You have to specify at least one argument - domain/ip/email/file hash for finding out its threat reputation.")
56 | sys.exit(1)
57 | return args
58 | except Exception as exc:
59 | logger.error("Error while getting command line arguments - %s" %exc.message,exc_info=True)
60 |
61 |
62 | if __name__ == "__main__":
63 | try:
64 | cmd_args = cmd_arguments()
65 | proxy_dict = {
66 | 'http':'http://{}:{}@{}:{}'.format(cmd_args.proxy_user,cmd_args.proxy_password,cmd_args.proxy_host,cmd_args.proxy_port),
67 | 'https':'http://{}:{}@{}:{}'.format(cmd_args.proxy_user,cmd_args.proxy_password,cmd_args.proxy_host,cmd_args.proxy_port),
68 | }
69 | logger.info("Proxies - {}".format(proxy_dict))
70 |
71 | ### Domain information ###
72 | if cmd_args.domain:
73 |             cmd_args_domain = str(cmd_args.domain).replace("https://","").replace("http://","").replace("www.","")
74 | otx_url = otx_base_url_domain + 'domain={}'.format(cmd_args_domain)
75 | response = requests.get(otx_url, proxies=proxy_dict, verify=False)
76 | if response.text:
77 | json_response = json.loads(response.text)
78 | pprint(json_response)
79 |
80 | ### IP information ###
81 | if cmd_args.ip:
82 | otx_url = otx_base_url_ip + 'ip={}'.format(cmd_args.ip)
83 | response = requests.get(otx_url, proxies=proxy_dict, verify=False)
84 | if response.text:
85 | json_response = json.loads(response.text)
86 | pprint(json_response)
87 |
88 | ### file hash information ###
89 | if cmd_args.hash:
90 | otx_url = otx_base_url_hash + 'resource={}'.format(cmd_args.hash)
91 | response = requests.get(otx_url, proxies=proxy_dict, verify=False)
92 | if response.text:
93 | json_response = json.loads(response.text)
94 | pprint(json_response)
95 |
96 | ### email information ###
97 | if cmd_args.email:
98 | otx_url = otx_base_url_email + 'email={}'.format(cmd_args.email)
99 | response = requests.get(otx_url, proxies=proxy_dict, verify=False)
100 | if response.text:
101 | json_response = json.loads(response.text)
102 | pprint(json_response)
103 |
104 | except Exception as exc:
105 | logger.error("Error while getting site reputation information - %s" %exc.message,exc_info=True)
106 |
107 |
--------------------------------------------------------------------------------
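The scripts in this directory each build the same `requests`-style proxy mapping inline, which yields URLs like `http://None:None@None:None` when the proxy flags are omitted. A sketch of factoring that out (`make_proxies` is a hypothetical helper, not part of the repo):

```python
def make_proxies(host, port, user=None, password=None):
    """Build a requests-compatible proxies dict; omit credentials when absent."""
    if not host:
        return None  # requests treats proxies=None as "no proxy"
    if user:
        proxy_url = 'http://{}:{}@{}:{}'.format(user, password, host, port)
    else:
        proxy_url = 'http://{}:{}'.format(host, port)
    # requests expects one entry per URL scheme it should proxy
    return {'http': proxy_url, 'https': proxy_url}
```

Usage: `requests.get(otx_url, proxies=make_proxies(cmd_args.proxy_host, cmd_args.proxy_port, cmd_args.proxy_user, cmd_args.proxy_password))` then works whether or not proxy arguments were supplied.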
/site-reputation/bluecoat-site-reputation.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | import sys
3 | import logging
4 | import requests
5 | import argparse
6 | import json
7 | from bs4 import BeautifulSoup
8 |
9 | """
10 | This script uses the Bluecoat database to get website reputation.
11 | Site reputation categories - https://sitereview.bluecoat.com/rest/categoryDetails?id=$NUM$
12 | To-do:
13 | Possible to use IBM X-force check API
14 | https://exchange.xforce.ibmcloud.com/url/
15 | https://api.xforce.ibmcloud.com/url/
16 |
17 | """
18 |
19 | # setup logging
20 | logging.basicConfig(stream=sys.stdout,level = logging.DEBUG)
21 | logger = logging.getLogger(__name__)
22 |
23 | url = 'https://sitereview.bluecoat.com/rest/categorization'
24 |
25 | def cmd_arguments():
26 |
27 | try:
28 |         parser = argparse.ArgumentParser(description="This script checks domain reputation using the Bluecoat reputation service.")
29 |
30 | parser.add_argument('--domain', required=True, help='Please specify domain name!',dest='domain')
31 | parser.add_argument('--proxy-host', required=False, help='Please specify proxy host',dest='proxy_host')
32 | parser.add_argument('--proxy-port', required=False, help='Please specify proxy port',dest='proxy_port')
33 | parser.add_argument('--proxy-user', required=False, help='Please specify proxy user',dest='proxy_user')
34 | parser.add_argument('--proxy-password', required=False, help='Please specify proxy password',dest='proxy_password')
35 |
36 | args = parser.parse_args()
37 | return args
38 | except Exception as exc:
39 | logger.error("Error while getting command line arguments - %s" %exc.message,exc_info=True)
40 |
41 |
42 | if __name__ == "__main__":
43 | try:
44 | cmd_args = cmd_arguments()
45 | if cmd_args.domain:
46 | proxy_dict = {
47 | 'http':'http://{}:{}@{}:{}'.format(cmd_args.proxy_user,cmd_args.proxy_password,cmd_args.proxy_host,cmd_args.proxy_port),
48 | 'https':'http://{}:{}@{}:{}'.format(cmd_args.proxy_user,cmd_args.proxy_password,cmd_args.proxy_host,cmd_args.proxy_port),
49 | }
50 | logger.info("Proxies - {}".format(proxy_dict))
51 | post_data = {"url":"{}".format(cmd_args.domain)}
52 | response = requests.post(url,proxies=proxy_dict,data=post_data,verify=False)
53 | res_json=json.loads(response.text)
54 | if 'errorType' in res_json:
55 | site_category = res_json['errorType']
56 | else:
57 | soup_response = BeautifulSoup(res_json['categorization'], 'lxml')
58 | site_category = soup_response.find("a").text
59 |
60 | # Display warning if Bluecoat CAPTCHAs are activated
61 | if site_category == 'captcha':
62 | logger.warning('Blue Coat CAPTCHA is received. Kindly change your IP or manually solve a CAPTCHA at "https://sitereview.bluecoat.com/sitereview.jsp"')
63 |
64 | logger.info("Bluecoat reputation for site {} - {}".format(cmd_args.domain,site_category))
65 |
66 | except Exception as exc:
67 | logger.error("Error while getting site reputation information - %s" %exc.message,exc_info=True)
68 |
69 |
--------------------------------------------------------------------------------
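The script above extracts the category name from a small HTML fragment with BeautifulSoup. Where that dependency is unwanted, a regex suffices for this specific one-anchor fragment (not for general HTML). A minimal sketch (`first_anchor_text` and the sample fragment are illustrative assumptions):

```python
import re

def first_anchor_text(html_fragment):
    """Return the text of the first <a> element in a fragment, or None."""
    match = re.search(r'<a\b[^>]*>(.*?)</a>', html_fragment,
                      re.IGNORECASE | re.DOTALL)
    return match.group(1).strip() if match else None

# Fabricated example of a categorization fragment:
fragment = '<div><a href="/category/1">Search Engines/Portals</a></div>'
```

`first_anchor_text(fragment)` yields `'Search Engines/Portals'`, mirroring what `soup_response.find("a").text` does in the script, but with stdlib only.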
/site-reputation/shodan-check.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | import sys
3 | import logging
4 | import requests
5 | import argparse
6 | import json
7 | from bs4 import BeautifulSoup
8 | #import shodan
9 |
10 | """
11 | This script uses the Shodan API to determine site reputation and report vulnerabilities, if present.
12 |
13 | """
14 |
15 | # setup logging
16 | logging.basicConfig(stream=sys.stdout,level = logging.DEBUG)
17 | logger = logging.getLogger(__name__)
18 |
19 | base_url = 'https://api.shodan.io'
20 |
21 | def api_information(api_key,proxy_dict=None):
22 | json_response = None
23 | try:
24 | url = base_url + '/api-info?key={}'.format(api_key)
25 | response = requests.get(url,proxies=proxy_dict,verify=False)
26 | json_response = response.json()
27 | except Exception,exc:
28 | logger.error("Error while getting Shodan API status information - %s" %exc.message,exc_info=True)
29 | return json_response
30 |
31 | def ip_information(api_key=None,proxy_dict=None,test_ip=None):
32 | json_response = None
33 | try:
34 | url = base_url + '/shodan/host/{}?key={}'.format(test_ip,api_key)
35 | response = requests.get(url,proxies=proxy_dict,verify=False)
36 | json_response = response.json()
37 | except Exception,exc:
38 | logger.error("Error while getting IP information from Shodan API - %s" %exc.message,exc_info=True)
39 |
40 | return json_response
41 |
42 | def shodan_scanning_ports(api_key,proxy_dict=None):
43 | json_response = None
44 | try:
45 | url = base_url + '/shodan/ports?key={}'.format(api_key)
46 | response = requests.get(url,proxies=proxy_dict,verify=False)
47 | json_response = response.json()
48 | except Exception,exc:
49 |         logger.error("Error while getting port information from Shodan API - %s" %exc.message,exc_info=True)
50 |
51 | return json_response
52 |
53 | def is_vpn(api_key=None,proxy_dict=None, test_ip=None):
54 | json_response = None
55 | try:
56 | url = base_url + '/shodan/host/{}?key={}'.format(test_ip,api_key)
57 | response = requests.get(url,proxies=proxy_dict,verify=False)
58 | json_response = response.json()
59 | for banner in json_response['data']:
60 | if banner['port'] in [500, 4500]:
61 | return True
62 | except Exception,exc:
63 | logger.error("Error while detecting VPN from Shodan API - %s" %exc.message,exc_info=True)
64 | return False
65 |
66 | def http_headers(api_key,proxy_dict=None):
67 | json_response = None
68 | try:
69 | url = base_url + '/tools/httpheaders?key={}'.format(api_key)
70 | response = requests.get(url,proxies=proxy_dict,verify=False)
71 | json_response = response.json()
72 |
73 | except Exception,exc:
74 | logger.error("Error while getting client HTTP headers information from Shodan API - %s" %exc.message,exc_info=True)
75 |
76 | return json_response
77 |
78 |
79 |
80 | #def is_vpn_shodan_module(api_key,proxy_dict, test_ip):
81 | # # this function uses shodan python module.
82 | # try:
83 | # host = api.host(test_ip)
84 | # for banner in host['data']:
85 | # if banner['port'] in [500, 4500]:
86 | # return True
87 | # return False
88 | # except Exception,exc:
89 | # logger.error("Error while detecting VPN from Shodan API - %s" %exc.message,exc_info=True)
90 |
91 | def cmd_arguments():
92 |
93 | try:
94 |         parser = argparse.ArgumentParser(description="This script uses the Shodan API to check various site attributes like reputation, open ports etc.")
95 |
96 | parser.add_argument('--domain', required=False, help='Please specify domain name!',dest='domain')
97 |         parser.add_argument('--ip', required=False, help='Please specify ip!',dest='ip')
98 |         parser.add_argument('--api-key', required=True, help='Please specify API key!',dest='api_key')
99 | parser.add_argument('--proxy-host', required=False, help='Please specify proxy host',dest='proxy_host')
100 | parser.add_argument('--proxy-port', required=False, help='Please specify proxy port',dest='proxy_port')
101 | parser.add_argument('--proxy-user', required=False, help='Please specify proxy user',dest='proxy_user')
102 | parser.add_argument('--proxy-password', required=False, help='Please specify proxy password',dest='proxy_password')
103 | args = parser.parse_args()
104 | return args
105 | except Exception as exc:
106 | logger.error("Error while getting command line arguments - %s" %exc.message,exc_info=True)
107 |
108 |
109 | if __name__ == "__main__":
110 | try:
111 | cmd_args = cmd_arguments()
112 | if cmd_args.ip == None and cmd_args.domain == None:
113 | logger.info("Kindly enter either the IP or domain name to get relevant results!")
114 |
115 | if cmd_args.ip:
116 | proxy_dict = {
117 | 'http':'http://{}:{}@{}:{}'.format(cmd_args.proxy_user,cmd_args.proxy_password,cmd_args.proxy_host,cmd_args.proxy_port),
118 | 'https':'http://{}:{}@{}:{}'.format(cmd_args.proxy_user,cmd_args.proxy_password,cmd_args.proxy_host,cmd_args.proxy_port),
119 | }
120 | logger.info("Proxies - {}".format(proxy_dict))
121 |
122 | ### API information
123 | logger.info("Getting Shodan API status information.")
124 | api_info = api_information(cmd_args.api_key,proxy_dict)
125 | logger.info("Shodan API status information:\n{}".format(api_info))
126 | logger.info("The job of getting Shodan API status information is over")
127 |
128 | ### HTTP client headers information
129 | logger.info("Getting HTTP client headers using Shodan API.")
130 | http_client_headers = http_headers(cmd_args.api_key,proxy_dict)
131 | logger.info("HTTP client headers:\n{}".format(http_client_headers))
132 | logger.info("The job of getting HTTP client headers using Shodan API is over.")
133 |
134 | ### IP information
135 | logger.info("Getting IP information from Shodan engine..")
136 | ip_info = ip_information(cmd_args.api_key,proxy_dict,cmd_args.ip)
137 | if ip_info:
138 | logger.info("IP information as reported by Shodan engine:\n {}".format(ip_info))
139 | logger.info("The job of getting IP information from Shodan engine is over.")
140 |
141 | ### Scanning ports used by Shodan engine
142 | logger.info("Checking scan ports used by Shodan engine..")
143 | scanning_ports = shodan_scanning_ports(cmd_args.api_key,proxy_dict)
144 | if scanning_ports:
145 | logger.info("Scanning ports used by Shodan are:{}".format(','.join([str(x) for x in scanning_ports])))
146 | logger.info("Checking of scan ports used by Shodan engine is over.")
147 |
148 | #### check if VPN ports are open
149 | logger.info("Checking if VPN ports are open or not for site {} using Shodan engine..".format(cmd_args.ip))
150 | vpn_result = is_vpn(cmd_args.api_key,proxy_dict,cmd_args.ip)
151 | if vpn_result:
152 | logger.info("As reported by Shodan engine, VPN ports are open. Please check if it is OK.")
153 | else:
154 | logger.info("No VPN ports are open for ip {}".format(cmd_args.ip))
155 | logger.info("Checking of VPN ports using Shodan engine is over.")
156 |
157 |
158 | except Exception as exc:
159 | logger.error("Error while getting Shodan site reputation information - %s" %exc.message,exc_info=True)
160 |
--------------------------------------------------------------------------------
/static-analysis/malware-static-analysis.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 |
3 | import argparse
4 | import magic
5 | import hashlib
6 | import ssdeep
7 | import os
8 | import sys
9 | import datetime
10 | import logging
11 | import math
12 | import pefile
13 | import peutils
14 | import struct
15 | import re
16 | import base64
17 | import string
18 | from collections import Counter
19 | from yara_checks import yara_checks
20 |
21 | """
22 | Some useful links:
23 | http://breakinsecurity.com/pe-format-manipulation-with-pefile/
24 | https://github.com/ClickSecurity/data_hacking/blob/master/pefile_classification/pe_features.py#L317
25 | https://github.com/hiddenillusion/AnalyzePE/blob/master/AnalyzePE.py
26 | https://github.com/Ice3man543/MalScan/blob/master/malscan.py
27 | https://github.com/secrary/SSMA
28 | https://github.com/Rurik/FileInfo/blob/master/FileInfo.py
29 |
30 | PE file format:
31 | https://github.com/deptofdefense/SalSA/wiki/PE-File-Format
32 | https://msdn.microsoft.com/en-us/library/windows/desktop/ms680547(v=vs.85).aspx
33 | """
34 | # setup logging
35 | logging.basicConfig(stream=sys.stdout,level = logging.DEBUG)
36 | logger = logging.getLogger(__name__)
37 |
38 | IMAGE_FILE_MACHINE_I386=332
39 | IMAGE_FILE_MACHINE_IA64=512
40 | IMAGE_FILE_MACHINE_AMD64=34404
41 |
42 | def is_exe(filename):
43 | flg = False
44 | try:
45 | pe = pefile.PE(filename)
46 | flg = True
47 | except Exception,e:
48 |         logger.error("Error while loading file {} - {}".format(filename,e.message),exc_info=True)
49 | return flg
50 |
51 | def convertToUTF8(s):
52 | if (isinstance(s, unicode)):
53 | return s.encode( "utf-8" )
54 | try:
55 | u = unicode( s, "utf-8" )
56 | except:
57 | return str(s)
58 | utf8 = u.encode( "utf-8" )
59 |
60 | return utf8
61 |
62 | def RemoveNulls(s):
63 | s = s.split('\x00', 1)[0]
64 | return s.decode('ascii', 'ignore')
65 |
66 | def is_hex_old(s):
67 | return re.match(r'\b[0-9a-fA-F]+\b',s)
68 | #return re.match(r'^[0-9a-fA-F]+$', s) is not None
69 |
70 | def is_hex(s):
71 | return all(c in string.hexdigits for c in s)
72 |
73 | def isBase64(s):
74 | try:
75 |         if base64.b64encode(base64.b64decode(s)) == s:
76 |             return True
77 |     except Exception:
78 |         pass
79 |     return False
80 |
81 | def pe_sections(filename):
82 | sections = list()
83 | try:
84 | pe = pefile.PE(filename)
85 | for section in pe.sections:
86 | if section:
87 | logger.info("Section name - {}".format(section.Name.decode('utf-8')))
88 | logger.info("Virtual address - {}".format(hex(section.VirtualAddress)))
89 | logger.info("Virtual size - {}".format(hex(section.Misc_VirtualSize)))
90 | logger.info("Size of raw data - {}".format(section.SizeOfRawData))
91 | logger.info("Section entropy - {}".format(section.get_entropy()))
92 | sections.append([
93 | {'Section name':section.Name.decode('utf-8')},
94 | {'Virtual address':hex(section.VirtualAddress)},
95 | {'Virtual size':hex(section.Misc_VirtualSize)},
96 | {'Raw data size':section.SizeOfRawData},
97 | {'Section entropy':section.get_entropy()}
98 | ])
99 |
100 | except Exception,e:
101 | logger.error("Error while getting PE section(s) of file {} - {}".format(filename,e.message),exc_info=True)
102 |
103 | return sections
104 |
105 | def dos_headers(filename):
106 | dos_header_fields = list()
107 | try:
108 | pe = pefile.PE(filename)
109 | for field in pe.DOS_HEADER.dump():
110 | dos_header_fields.append(field)
111 | except Exception,e:
112 | logger.error("Error while getting DOS Headers information of file {} - {}".format(filename,e.message),exc_info=True)
113 |
114 | return dos_header_fields
115 |
116 | def get_import_dlls(filename):
117 | imported_dlls = list()
118 |
119 | try:
120 | pe = pefile.PE(filename)
121 | for entry in pe.DIRECTORY_ENTRY_IMPORT:
122 | imported_dlls.append(entry.dll.decode('utf-8'))
123 |
124 | except Exception,e:
125 | logger.error("Error while getting imported DLL information of file {} - {}".format(filename,e.message),exc_info=True)
126 |
127 | return imported_dlls
128 |
129 | def get_import_dll_functions(filename):
130 |
131 | import_dll_functions = list()
132 |
133 | try:
134 | pe = pefile.PE(filename)
135 | for entry in pe.DIRECTORY_ENTRY_IMPORT:
136 |             dll_functions = [{'name': func.name.decode('utf-8'),'address':func.address} for func in entry.imports if func.name]
137 | import_dll_functions.append({entry.dll.decode('utf-8'):dll_functions})
138 |
139 | except Exception,e:
140 | logger.error("Error while getting import DLL function names for file {} - {}".format(filename,e.message),exc_info=True)
141 |
142 | return import_dll_functions
143 |
144 | def check_kernel_mode(filename):
145 | # check if file supports kernel mode of operation
146 | has_kernel_mode = False
147 |
148 | try:
149 | pe = pefile.PE(filename)
150 | for entry in pe.DIRECTORY_ENTRY_IMPORT:
151 |             if entry.dll.lower() == 'ntoskrnl.exe':  # DLL names in import tables vary in case
152 | has_kernel_mode = True
153 |
154 | except Exception,e:
155 | logger.error("Error in checking if the file {} requires/supports kernel mode of operation - {}"
156 | .format(filename,e.message),exc_info=True)
157 |
158 | return has_kernel_mode
159 |
160 | def check_dynamic_loaders(filename):
161 | # check if dynamic loaders are supported
162 | dynamic_loader = False
163 | try:
164 | pe = pefile.PE(filename)
165 |         i = 0
166 |         for entry in pe.DIRECTORY_ENTRY_IMPORT:
167 |             if entry.dll.lower() == 'kernel32.dll':
168 |                 i += len(entry.imports)
169 |         # few kernel32 imports suggest APIs are resolved at run time (LoadLibrary/GetProcAddress)
170 |         if i <= 10:
171 |             dynamic_loader = True
172 |
173 | except Exception,e:
174 | logger.error("Error while checking dynamic loaders in file {} - {}"
175 | .format(filename,e.message),exc_info=True)
176 |
177 | return dynamic_loader
178 |
179 | def get_antidebug_functions(filename):
180 |
181 | anti_debug_functions = ['CheckRemoteDebuggerPresent', 'FindWindow',
182 | 'GetWindowThreadProcessId', 'IsDebuggerPresent',
183 | 'OutputDebugString', 'Process32First', 'Process32Next',
184 | 'TerminateProcess', 'UnhandledExceptionFilter',
185 | 'ZwQueryInformation','NtQueryInformationProcess',
186 | 'NtSetInformationThread']
187 | response = list()
188 | try:
189 | pe = pefile.PE(filename)
190 | if hasattr(pe, "DIRECTORY_ENTRY_IMPORT"):
191 | for entry in pe.DIRECTORY_ENTRY_IMPORT:
192 | for imp in entry.imports:
193 |                     if imp.name:  # imports by ordinal have no name
194 | for fun in anti_debug_functions:
195 | if imp.name.find(fun)>=0:
196 | response.append("%s %s" %(hex(imp.address),imp.name))
197 | else:
198 |             logger.info("No 'DIRECTORY_ENTRY_IMPORT' section in file {}".format(filename))
199 | if response:
200 | return '\n'.join(response)
201 | else:
202 | return None
203 | except Exception,e:
204 | logger.error("Error while checking presence of anti-debugging functions"
205 | " in the file {} - {}".format(filename,e.message),exc_info=True)
206 |
207 | def antiVM_checks(filename):
208 | vm_tricks = list()
209 | try:
210 | """
211 | source: https://code.google.com/p/pyew
212 | """
213 | vm_signatures = {
214 | "Red Pill":"\x0f\x01\x0d\x00\x00\x00\x00\xc3",
215 | "VirtualPc trick":"\x0f\x3f\x07\x0b",
216 | "VMware trick":"VMXh",
217 | "VMCheck.dll":"\x45\xC7\x00\x01",
218 | "VMCheck.dll for VirtualPC":"\x0f\x3f\x07\x0b\xc7\x45\xfc\xff\xff\xff\xff",
219 | "Xen":"XenVMM",
220 | "Bochs & QEmu CPUID Trick":"\x44\x4d\x41\x63",
221 | "Torpig VMM Trick": "\xE8\xED\xFF\xFF\xFF\x25\x00\x00\x00\xFF\x33\xC9\x3D\x00\x00\x00\x80\x0F\x95\xC1\x8B\xC1\xC3",
222 | "Torpig (UPX) VMM Trick": "\x51\x51\x0F\x01\x27\x00\xC1\xFB\xB5\xD5\x35\x02\xE2\xC3\xD1\x66\x25\x32\xBD\x83\x7F\xB7\x4E\x3D\x06\x80\x0F\x95\xC1\x8B\xC1\xC3"
223 | }
224 |
225 | vm_strings = {
226 | "Virtual Box":"VBox",
227 |             "VMware":"VMware"
228 | }
229 |
230 | with open(filename,'rb') as f:
231 | buf = f.read()
232 | # check if virtualbox or vmplayer is present
233 | for item in vm_strings:
234 | match = re.findall(vm_strings[item], buf, re.IGNORECASE | re.MULTILINE)
235 | if match:
236 | vm_tricks.append(item)
237 |
238 | for sig in vm_signatures:
239 |             if buf.find(vm_signatures[sig]) > -1:
240 | vm_tricks.append(sig)
241 | if vm_tricks:
242 | return '\n'.join(vm_tricks)
243 | else: return None
244 |
245 | except Exception,e:
246 | logger.error("Error while checking presence of anti-Virtual Machine functions"
247 | " in the file {} - {}".format(filename,e.message),exc_info=True)
248 |
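`antiVM_checks` works by scanning the raw file bytes for known trick strings and opcode sequences. A minimal, self-contained sketch of the same scan over an in-memory buffer (a small subset of the signatures from the table above; `scan_buffer` and the sample data are illustrative):

```python
import re

# byte signatures for a few VM-detection tricks (subset of the table above)
VM_SIGNATURES = {
    "VMware trick": b"VMXh",
    "Xen": b"XenVMM",
    "Red Pill": b"\x0f\x01\x0d\x00\x00\x00\x00\xc3",
}

# plain strings hinting at VM awareness, matched case-insensitively
VM_STRINGS = {
    "Virtual Box": b"VBox",
    "VMware": b"VMware",
}

def scan_buffer(buf):
    """Return names of anti-VM tricks whose signature occurs in buf (bytes)."""
    hits = []
    for name, needle in VM_STRINGS.items():
        if re.search(re.escape(needle), buf, re.IGNORECASE):
            hits.append(name)
    for name, sig in VM_SIGNATURES.items():
        if buf.find(sig) != -1:
            hits.append(name)
    return hits

sample = b"\x90\x90VMXh\x90 some string mentioning VBOX"
print(sorted(scan_buffer(sample)))   # ['VMware trick', 'Virtual Box']
```

Plain byte searches like this are cheap but noisy: a benign installer that merely mentions "VBox" in a resource string will also hit.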
249 | def find_ip_address(filename):
250 | ip_list = list()
251 |     ip_regex = r'\b(?:\d{1,3}\.){3}\d{1,3}\b'  # word boundaries, not a $ anchor, for findall over a buffer
252 | try:
253 | with open(filename, 'rb') as f:
254 | buf = f.read()
255 | regex = re.findall(ip_regex,buf)
256 | if regex is not None:
257 | for match in regex:
258 | if match not in ip_list:
259 | ip_list.append(match)
260 |
261 | except Exception,e:
262 |         logger.error("Error while finding IP addresses in the file {} - {}".format(filename,e.message),exc_info=True)
263 |
264 | if ip_list:
265 | return ','.join(ip_list)
266 | else: return None
267 |
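The dotted-quad pattern needs `\b` word boundaries rather than a `$` anchor so that `re.findall` can match anywhere in the buffer. A quick sketch of the extraction with deduplication (names here are illustrative):

```python
import re

# boundary-anchored dotted-quad pattern, as a bytes regex for binary buffers
IP_RE = re.compile(rb'\b(?:\d{1,3}\.){3}\d{1,3}\b')

def extract_ips(buf):
    """Return unique dotted-quad candidates found in a bytes buffer."""
    seen = []
    for m in IP_RE.findall(buf):
        if m not in seen:
            seen.append(m)
    return seen

data = b"connect to 10.0.0.1:8080 and 10.0.0.1 then 192.168.1.254"
print(extract_ips(data))   # [b'10.0.0.1', b'192.168.1.254']
```

The pattern still admits invalid octets such as `999.1.1.1`; candidates should be validated (e.g. with a range check per octet) before being sent to a reputation service.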
268 | def find_urls(filename):
269 | url_list = list()
270 | url_regex = r'https?://(?:[-\w.]|(?:%[\da-fA-F]{2}))+'
271 | try:
272 | with open(filename, 'rb') as f:
273 | buf = f.read()
274 | regex = re.findall(url_regex,buf)
275 | if regex is not None:
276 | for match in regex:
277 | if match not in url_list:
278 | url_list.append(match)
279 |
280 | except Exception,e:
281 |         logger.error("Error while finding URLs in the file {} - {}".format(filename,e.message),exc_info=True)
282 |
283 | if url_list:
284 | return ','.join(url_list)
285 | else: return None
286 |
287 | def find_emails(filename):
288 | email_list = list()
289 |     email_regex = r'\b[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}\b'  # word boundaries, not ^/$ anchors
292 | try:
293 | with open(filename, 'rb') as f:
294 | buf = f.read()
295 | regex = re.findall(email_regex,buf)
296 | if regex is not None:
297 | for match in regex:
298 | if match not in email_list:
299 | email_list.append(match)
300 |
301 | except Exception,e:
302 |         logger.error("Error while finding e-mail addresses in the file {} - {}".format(filename,e.message),exc_info=True)
303 |
304 | if email_list:
305 | return ','.join(email_list)
306 | else: return None
307 |
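As with the IP pattern, the e-mail regex should be boundary-anchored so `findall` can pick addresses out of the middle of a buffer. A short illustrative check:

```python
import re

# boundary-anchored (not ^/$-anchored) e-mail pattern for scanning text
EMAIL_RE = re.compile(r'\b[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}\b')

text = "contact admin@example.com or c2-operator@evil.example.org today"
print(EMAIL_RE.findall(text))   # ['admin@example.com', 'c2-operator@evil.example.org']
```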
308 |
309 |
310 | def suspicious_api_calls(filename):
311 |
312 | suspicious_apis = ['accept','AddCredentials','bind','CertDeleteCertificateFromStore',
313 | 'CheckRemoteDebuggerPresent','CloseHandle','closesocket','connect','ConnectNamedPipe',
314 | 'CopyFile','CreateFile','CreateProcess','CreateToolhelp32Snapshot','CreateFileMapping',
315 | 'CreateRemoteThread','CreateDirectory','CreateService','CreateThread','CryptEncrypt',
316 | 'DeleteFile','DeviceIoControl','DisconnectNamedPipe','DNSQuery','EnumProcesses',
317 | 'ExitProcess','ExitThread','FindWindow','FindResource','FindFirstFile','FindNextFile',
318 | 'FltRegisterFilter','FtpGetFile','FtpOpenFile','GetCommandLine','GetComputerName',
319 | 'GetCurrentProcess','GetThreadContext','GetDriveType','GetFileSize','GetFileAttributes',
320 | 'GetHostByAddr','GetHostByName','GetHostName','GetModuleHandle','GetModuleFileName',
321 | 'GetProcAddress','GetStartupInfo','GetSystemDirectory','GetTempFileName','GetTempPath',
322 | 'GetTickCount','GetUpdateRect','GetUpdateRgn','GetUserNameA','GetUrlCacheEntryInfo',
323 | 'GetVersionEx','GetWindowsDirectory','GetWindowThreadProcessId','HttpSendRequest',
324 | 'HttpQueryInfo','IcmpSendEcho','IsBadReadPtr','IsBadWritePtr','IsDebuggerPresent',
325 | 'InternetCloseHandle','InternetConnect','InternetCrackUrl','InternetQueryDataAvailable',
326 | 'InternetGetConnectedState','InternetOpen','InternetQueryDataAvailable','InternetQueryOption',
327 | 'InternetReadFile','InternetWriteFile','LdrLoadDll','LoadLibrary','LoadLibraryA','LockResource',
328 | 'listen','MapViewOfFile','OutputDebugString','OpenFileMapping','OpenProcess','Process32First',
329 | 'Process32Next','recv','ReadFile','RegCloseKey','RegCreateKey','RegDeleteKey','RegDeleteValue',
330 | 'RegEnumKey','RegOpenKey','ReadProcessMemory','send','sendto','SetFilePointer','SetKeyboardState',
331 | 'SetWindowsHook','ShellExecute','Sleep','socket','StartService','TerminateProcess','UnhandledExceptionFilter',
332 | 'URLDownload','VirtualAlloc','VirtualFree','VirtualProtect','VirtualAllocEx','WinExec','WriteProcessMemory',
333 | 'WriteFile','WSASend','WSASocket','WSAStartup','ZwQueryInformation'
334 | ]
335 | api_calls = list()
336 | try:
337 |
338 | pe = pefile.PE(filename)
339 | if hasattr(pe, "DIRECTORY_ENTRY_IMPORT"):
340 | for entry in pe.DIRECTORY_ENTRY_IMPORT:
341 |                 for imp in entry.imports:
342 |                     for api_call in suspicious_apis:
343 |                         if imp.name and imp.name.find(api_call)>=0:  # name is None for imports by ordinal
344 |                             api_calls.append(imp.name)
345 | except Exception,e:
346 | logger.error("Error while checking suspicious call usage for file {} - {}".format(filename,e.message),exc_info=True)
347 | if api_calls:
348 | return '\n'.join(api_calls)
349 | else: return None
350 |
351 | def get_export_symbols(filename):
352 |
353 | export_symbols = list()
354 |
355 | try:
356 | pe = pefile.PE(filename)
357 | if hasattr(pe, "DIRECTORY_ENTRY_EXPORT"):
358 | for entry in pe.DIRECTORY_ENTRY_EXPORT.symbols:
359 |                 if entry.name:  # exports by ordinal have no name
360 |                     export_symbols.append({entry.name.decode('utf-8'):entry.address})
360 |
361 | except Exception,e:
362 | logger.error("Error while getting exported symbols for file {} - {}".format(filename,e.message),exc_info=True)
363 |
364 | return export_symbols
365 |
366 | def get_extension(filename):
367 |     'returns ext of the file type using pefile'
368 |     try:
369 |         pe = pefile.PE(filename)
370 |         if pe.is_dll():
371 |             return 'dll'
372 |         if pe.is_driver():
373 |             return 'sys'
374 |         if pe.is_exe():
375 |             return 'exe'
376 |         return 'bin'
377 | except Exception,e:
378 | logger.error("Error while getting extension for file {} - {}"
379 | .format(filename,e.message),exc_info=True)
380 | return None
381 |
382 | def detect_overlay(filename):
383 | # good reference - http://struppigel.blogspot.in/2014/05/accurate-overlay-detection.html
384 | Isoverlay = False
385 | try:
386 | offset = 0
387 | pe = pefile.PE(filename)
388 |
389 | #if not pe.get_overlay():
390 | # logger.info("It is good to know that no file overlay detected.")
391 |
392 | for sec in pe.sections:
393 | pointer_to_raw = sec.PointerToRawData
394 | size_of_raw = sec.SizeOfRawData
395 | if offset < size_of_raw + pointer_to_raw:
396 | offset = size_of_raw + pointer_to_raw
397 | f_size = os.path.getsize(filename)
398 | if offset < f_size:
399 | Isoverlay = True
400 |
401 | except Exception,e:
402 | logger.error("Error while checking overlay for file {} - {}"
403 | .format(filename,e.message),exc_info=True)
404 | return Isoverlay
405 |
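`detect_overlay`'s rule is: an overlay exists when the file is larger than the end of the last section's raw data. The same arithmetic can be sketched without `pefile`, over plain `(PointerToRawData, SizeOfRawData)` pairs (function name and sample values are illustrative):

```python
def overlay_offset(sections, file_size):
    """Start offset of overlay data, or None if there is none.

    sections: iterable of (pointer_to_raw_data, size_of_raw_data) pairs.
    """
    end = 0
    for pointer, size in sections:
        end = max(end, pointer + size)   # furthest raw-data end seen so far
    return end if file_size > end else None

# three sections ending at 0x800; a 0x900-byte file has a 0x100-byte overlay
secs = [(0x400, 0x200), (0x600, 0x100), (0x700, 0x100)]
print(overlay_offset(secs, 0x900))   # 2048 (0x800)
print(overlay_offset(secs, 0x800))   # None
```

Returning the offset (not just a boolean) is handy: the overlay bytes from that offset onward can then be carved out and analyzed separately.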
406 | def compute_entropy(data):
407 | if not data:
408 | return 0
409 | entropy = 0
410 | for x in range(256):
411 | prob = float(data.count(chr(x)))/len(data)
412 | if prob > 0 :
413 | entropy += -prob*math.log(prob,2)
414 |
415 | return entropy
416 |
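`compute_entropy` above implements Shannon entropy over byte values, which ranges from 0 to 8 bits per byte; packed or encrypted regions sit near 8. A Python 3-compatible restatement (`bytes.count` takes an int there) with two sanity checks:

```python
import math

def shannon_entropy(data):
    """Shannon entropy of a bytes buffer, in bits per byte (0.0 .. 8.0)."""
    if not data:
        return 0.0
    entropy = 0.0
    for x in range(256):
        prob = data.count(x) / len(data)   # frequency of byte value x
        if prob > 0:
            entropy -= prob * math.log(prob, 2)
    return entropy

print(shannon_entropy(b"A" * 1024))                      # 0.0 -- one repeated byte
print(round(shannon_entropy(bytes(range(256)) * 4), 2))  # 8.0 -- uniform byte spread
```

A common heuristic is to flag sections whose entropy exceeds roughly 7 as likely packed or encrypted, which complements the `peutils.is_probably_packed` check used later in this file.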
417 | def compute_data_entropy(filename):
418 |     ent = 0
419 |     try:
420 |         counts = Counter()
421 |         with open(filename,'rb') as f:  # binary mode, so no newline translation
422 |             data = f.read()
423 |
424 | for d in data:
425 | counts[d] +=1
426 |
427 | probs = [float(c) / len(data) for c in counts.values()]
428 | probs = [p for p in probs if p > 0.]
429 |
430 | for p in probs:
431 | ent -= p * math.log(p, 2)
432 |
433 | except Exception,e:
434 | logger.error("Error while computing entropy for file {} - {} ".format(filename, e.message), exc_info=True)
435 |
436 | return ent
437 |
438 | def compute_md5(filename):
439 |     try:
440 |         # hash the file contents, not the filename string
441 |         with open(filename,'rb') as f:
442 |             return hashlib.md5(f.read()).hexdigest()
443 | except Exception,e:
444 | logger.error("Error while computing md5 hash of file {} - {} ".format(filename, e.message), exc_info=True)
445 |
446 | def compute_sha1(filename):
447 |     try:
448 |         # hash the file contents, not the filename string
449 |         with open(filename,'rb') as f:
450 |             return hashlib.sha1(f.read()).hexdigest()
451 |     except Exception,e:
452 |         logger.error("Error while computing sha1 hash of file {} - {} ".format(filename, e.message), exc_info=True)
453 |
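For large samples it is better to hash the file in chunks rather than reading it whole. A minimal sketch (the helper name is illustrative; the expected digest is the well-known RFC 1321 "abc" test vector):

```python
import hashlib
import os
import tempfile

def file_md5(path, chunk_size=65536):
    """MD5 of a file's *contents*, read in chunks to bound memory use."""
    h = hashlib.md5()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(chunk_size), b''):
            h.update(chunk)
    return h.hexdigest()

# demonstrate on a throwaway file holding b"abc"
fd, tmp_path = tempfile.mkstemp()
os.write(fd, b"abc")
os.close(fd)
digest = file_md5(tmp_path)
os.remove(tmp_path)
print(digest)   # 900150983cd24fb0d6963f7d28e17f72
```

The same pattern works for `hashlib.sha1` or `hashlib.sha256` by swapping the constructor.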
454 | def ssdeep_hash(filename):
455 | try:
456 |
457 | return ssdeep.hash_from_file(filename)
458 |
459 | except Exception, e:
460 | logger.error("Error while computing ssdeep hash of file {} - {}".format(filename, e.message), exc_info=True)
461 |
462 | def mime_type(filename):
463 | try:
464 |
465 | return magic.from_file(filename, mime=True)
466 |
467 | except Exception, e:
468 | logger.error("Error while determining file type for file {} - {}".format(filename, e.message), exc_info=True)
469 |
470 | def get_file_size(filename):
471 | try:
472 |
473 | return os.path.getsize(filename)
474 |
475 | except Exception, e:
476 | logger.error("Error while getting file size for file {} - {}".format(filename, e.message), exc_info=True)
477 |
478 | def is_exe(filename):
479 |     with open(filename,'rb') as f:
480 |         # read first two bytes
481 |         resp_bytes = f.read(2)
482 |
483 |     # a DOS/PE executable starts with the 'MZ' magic
484 |     return resp_bytes == 'MZ'
487 |
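The `is_exe` check is only a two-byte magic sniff. A Python 3-compatible sketch (the `b'MZ'` comparison matters there, since `read` returns bytes; helper names are illustrative):

```python
import os
import tempfile

def looks_like_mz(path):
    """True if the file begins with the two-byte DOS 'MZ' magic."""
    with open(path, 'rb') as f:
        return f.read(2) == b'MZ'

def _write_tmp(data):
    """Write data to a throwaway file and return its path."""
    fd, path = tempfile.mkstemp()
    os.write(fd, data)
    os.close(fd)
    return path

pe_path = _write_tmp(b"MZ\x90\x00")   # starts like a DOS/PE stub
elf_path = _write_tmp(b"\x7fELF")     # ELF magic instead
results = (looks_like_mz(pe_path), looks_like_mz(elf_path))
os.remove(pe_path)
os.remove(elf_path)
print(results)   # (True, False)
```

A magic-byte sniff is cheap but weak; the `mime_type` helper above (libmagic) gives a far more reliable classification.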
488 | def is_packed(filename):
489 | pack_status = False
490 | try:
491 | pe_instance = pefile.PE(filename)
492 | packed = peutils.is_probably_packed(pe_instance)
493 | if packed == 1:
494 | pack_status = True
495 | except Exception,e:
496 | logger.error("Error while checking if the file {} is packed using well known"
497 | " packer programs like UPX etc. - {}".format(filename, e.message), exc_info=True)
498 |
499 | return pack_status
500 |
501 | def check_sizeofrawdata(filename):
502 | size_rawdata_check = False
503 | try:
504 | pe_instance = pefile.PE(filename)
505 |         no_sections = pe_instance.FILE_HEADER.NumberOfSections
506 |         size_rawdata_check = True
507 |         # each section's raw data should end exactly where the next one begins
508 |         for i in range(no_sections-1):
509 |             current_end = pe_instance.sections[i].PointerToRawData + pe_instance.sections[i].SizeOfRawData
510 |             next_start = pe_instance.sections[i+1].PointerToRawData
511 |             if current_end != next_start:
512 |                 logger.info("The Size Of Raw data value is not valid and it may crash your disassembler/debugger")
513 |                 size_rawdata_check = False
514 |                 break
519 |
520 | except Exception, e:
521 | logger.error("Error while checking size of raw data value(s) in"
522 | " sections of PE file {} - {}".format(filename, e.message), exc_info=True)
523 |
524 | return size_rawdata_check
525 |
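The contiguity rule `check_sizeofrawdata` enforces (each section's raw data ends where the next begins) can be stated without `pefile` over plain offset/size pairs (names and sample values are illustrative):

```python
def sections_contiguous(sections):
    """True if each section's raw data ends where the next one starts.

    sections: list of (pointer_to_raw_data, size_of_raw_data) pairs,
    in file order.
    """
    for (ptr, size), (next_ptr, _) in zip(sections, sections[1:]):
        if ptr + size != next_ptr:
            return False
    return True

print(sections_contiguous([(0x400, 0x200), (0x600, 0x200)]))  # True
print(sections_contiguous([(0x400, 0x200), (0x700, 0x200)]))  # False -- 0x100 gap
```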
526 | def check_empty_section_name(filename):
527 | empty_section_name = False
528 | try:
529 | pe_instance = pefile.PE(filename)
530 | for sec in pe_instance.sections:
531 | if not re.match("^[.A-Za-z][a-zA-Z]+",sec.Name):
532 | logger.info("Empty or Non-ASCII section name - {}".format(sec.Name))
533 | empty_section_name = True
534 | except Exception,e:
535 | logger.info("Error while checking empty section name in file"
536 | " {} - {}".format(filename, e.message), exc_info=True)
537 |
538 | return empty_section_name
539 |
540 | def check_optional_header_size(filename):
541 | chk_option_header = True
542 | try:
543 | pe_instance = pefile.PE(filename)
544 |         if pe_instance.FILE_HEADER.SizeOfOptionalHeader != 224:  # 224 is for PE32; PE32+ files use 240
545 | logger.debug("Size of optional header is not valid.")
546 | chk_option_header = False
547 | except Exception,e:
548 | logger.info("Error while checking size of optional header for file"
549 |         " {} - {}".format(filename, e.message), exc_info=True)
550 |
551 | return chk_option_header
552 |
553 | def check_optional_header_checksum(filename):
554 | chk_checksum_header = True
555 | try:
556 | pe_instance = pefile.PE(filename)
557 | if pe_instance.OPTIONAL_HEADER.CheckSum == 0:
558 | logger.debug("Optional Header check sum {} is not valid."
559 | .format(pe_instance.OPTIONAL_HEADER.CheckSum))
560 | chk_checksum_header = False
561 | except Exception,e:
562 | logger.info("Error while checking checksum of optional header for file"
563 |         " {} - {}".format(filename, e.message), exc_info=True)
564 |
565 | return chk_checksum_header
566 |
567 | def check_service_dll(filename):
568 | is_service_dll = False
569 | try:
570 | pe_instance = pefile.PE(filename)
571 | if hasattr(pe_instance,"DIRECTORY_ENTRY_EXPORT"):
572 | for dll_entry in pe_instance.DIRECTORY_ENTRY_EXPORT.symbols:
573 |                 if dll_entry.name and re.match('ServiceMain', dll_entry.name):
574 | is_service_dll = True
575 |
576 | except Exception,e:
577 | logger.info("Error while checking presence of Service DLLs in file {} - {}"
578 | .format(filename, e.message), exc_info=True)
579 |
580 | return is_service_dll
581 |
582 | def get_imports(filename):
583 | # get number of imports and import symbols
584 | num_import_symbols = 0
585 | num_imports = 0
586 | try:
587 | pe_instance = pefile.PE(filename)
588 | if hasattr(pe_instance,"DIRECTORY_ENTRY_IMPORT"):
589 | num_imports = len(pe_instance.DIRECTORY_ENTRY_IMPORT)
590 | for entry in pe_instance.DIRECTORY_ENTRY_IMPORT:
591 | num_import_symbols += len(entry.imports)
592 | except Exception,e:
593 | logger.info("Error while getting number of import symbols in file {} - {}"
594 | .format(filename,e.message), exc_info=True)
595 |
596 | return num_imports, num_import_symbols
597 |
598 | def get_bound_imports(filename):
599 | # get number of bounded imports
600 | num_bound_imports = 0
601 | num_bound_symbols = 0
602 | try:
603 | pe_instance = pefile.PE(filename)
604 | if hasattr(pe_instance,"DIRECTORY_ENTRY_BOUND_IMPORT"):
605 | num_bound_imports = len(pe_instance.DIRECTORY_ENTRY_BOUND_IMPORT)
606 | bound_symbols = list()
607 | for entry in pe_instance.DIRECTORY_ENTRY_BOUND_IMPORT:
608 | num_bound_symbols += len(entry.entries)
609 | except Exception,e:
610 | logger.info("Error while getting number of bound imports in file {} - {}"
611 | .format(filename,e.message), exc_info=True)
612 |
613 | return num_bound_imports, num_bound_symbols
614 |
615 | def get_exports(filename):
616 | # get number of exports and export symbols
617 | num_exports = 0
618 | num_export_symbols = 0
619 | try:
620 | pe_instance = pefile.PE(filename)
621 | if hasattr(pe_instance,"DIRECTORY_ENTRY_EXPORT"):
622 |             num_exports = len(pe_instance.DIRECTORY_ENTRY_EXPORT.symbols)
623 | export_symbols = list()
624 | for entry in pe_instance.DIRECTORY_ENTRY_EXPORT.symbols:
625 | if entry.name:
626 | export_symbols.append(entry.name)
627 | logger.debug("Export symbols {}".format('\n'.join(export_symbols)))
628 | num_export_symbols = len(export_symbols)
629 | except Exception,e:
630 | logger.info("Error while getting number of export symbols in file {} - {}"
631 |                 .format(filename,e.message), exc_info=True)
632 |
633 | return num_exports,num_export_symbols
634 |
635 |
636 | def get_architecture(filename):
637 | arch = "Unknown"
638 | with open(filename,'rb') as f:
639 | f.seek(60)
640 | s=f.read(4)
641 |         header_offset=struct.unpack("<L",s)[0]
--------------------------------------------------------------------------------
/urlvoid-ipvoid-checks/ip_domain_check.py:
--------------------------------------------------------------------------------
117 |             if len(results)>=1 :
118 |                 return dict(zip(['blacklist_status', 'analysis_date', 'ip_address', 'rdns', 'asn', 'country'],results[1:]))
119 | except Exception,e:
120 | logger.error("Error while getting ip information from IPVoid portal - %s"
121 | %e.message,exc_info=True)
122 | return None
123 |
124 | def ipvoid_results(html_content,ip):
125 | try:
126 | soup_instance = BeautifulSoup(html_content, "html.parser")
127 | data = soup_instance.find("table")
128 | if data:
129 | blacklist_status = data.find("td", text="Blacklist Status")
130 | result_blacklist_status = blacklist_status.findNext("td").text
131 | analysis_date = data.find("td", text="Analysis Date")
132 | result_analysis_date = analysis_date.findNext("td").text
133 | ip_addr = data.find("td", text="IP Address")
134 | result_ip_address = ip_addr.findNext("td").strong.text
135 | rdns_data = data.find("td", text="Reverse DNS")
136 | result_rdns = rdns_data.findNext("td").text
137 | asn_data = data.find("td", text="ASN")
138 | result_asn = asn_data.findNext("td").text
139 | country_data = data.find("td", text="Country Code")
140 | result_country = country_data.findNext("td").text
141 | return ['True',result_blacklist_status, result_analysis_date,
142 | result_ip_address, result_rdns, result_asn, result_country]
143 | else:
144 | logger.info("No results found on ipvoid portal for ip - %s!" %ip)
145 |     except Exception as e:
146 |         logger.error("Error while parsing results from IPVoid portal - %s" % e, exc_info=True)
147 | return ['False']
148 |
149 |
150 | if __name__ == "__main__":
151 | try:
152 | cmd_args = cmd_arguments()
153 | if cmd_args:
154 |
155 | # read yaml configuration
156 | if os.path.isfile(cmd_args.config_file):
157 | config = yaml_config(cmd_args.config_file)
158 |                 logger.debug("%s" % config)
159 |
160 | # check if key and identity are valid
161 |             if not (config['settings']['identifier']
162 |                 and config['settings']['key']):
163 |                 logger.error("API keys and identity for the URLVoid API service"
164 |                     " could not be found in the configuration file."
165 |                     " Quitting..")
166 |                 raise SystemExit(1)
166 | # check if ip is valid
167 | if IsIP(cmd_args.ip):
168 | logger.info("IP %s will be checked using IPVoid blacklist service."% cmd_args.ip)
169 | ip_results = ipvoid_query(cmd_args.ip)
170 | logger.info("IP information as found on IPVoid blacklist service:%s" %ip_results)
171 |
172 | # check if url is valid
173 | if Isurl(cmd_args.url):
174 | _,domain = url_parameters(cmd_args.url)
175 | logger.info("Domain %s will be checked using URLvoid API domain service."%domain)
176 | remaining_queries = urlvoid_status(domain,config)
177 | if remaining_queries>0:
178 | blacklist_cnt = urlvoid_report(domain,config)
179 | if blacklist_cnt > 0:
180 | logger.info("As per URLVoid domain reputation service,"
181 | " the domain %s is blacklisted in %s lists."%(domain,blacklist_cnt))
182 | else:
183 | logger.info("As per URLVoid domain reputation service,"
184 | " the domain %s is not blacklisted."%(domain))
185 |
186 | except Exception as e:
187 | logger.error("Error while getting ip/domain information - %s" %e.message,exc_info=True)
188 |
--------------------------------------------------------------------------------
/urlvoid-ipvoid-checks/settings.yaml:
--------------------------------------------------------------------------------
1 | settings:
2 | identifier:
3 | key:
4 |
--------------------------------------------------------------------------------