├── LICENSE
├── README.md
├── docker
│   ├── Dockerfile
│   ├── cron_gitlogstashfir
│   ├── docker-compose.yml
│   ├── git-logstash-instance-x.sh
│   ├── mkmf.rb.patch
│   ├── rsyslog-docker.conf
│   └── validate-json.sh
├── logstash-filter-enrsig
│   ├── .travis.yml
│   ├── CHANGELOG.md
│   ├── CONTRIBUTORS
│   ├── DEVELOPER.md
│   ├── Gemfile
│   ├── LICENSE
│   ├── NOTICE.TXT
│   ├── README.md
│   ├── Rakefile
│   ├── conf-samples
│   │   ├── conf_enrsig.json
│   │   ├── template_nbtscan_2_json.erb
│   │   └── template_whois_2_json.erb
│   ├── extra
│   │   └── json-whois.py
│   ├── lib
│   │   └── logstash
│   │       └── filters
│   │           └── enrsig.rb
│   ├── logstash-filter-enrsig.gemspec
│   └── spec
│       ├── filters
│       │   └── enrsig_spec.rb
│       └── spec_helper.rb
├── logstash-filter-sig
│   ├── .travis.yml
│   ├── CHANGELOG.md
│   ├── CONTRIBUTORS
│   ├── DEVELOPER.md
│   ├── Gemfile
│   ├── LICENSE
│   ├── NOTICE.TXT
│   ├── README.md
│   ├── Rakefile
│   ├── conf-samples
│   │   ├── blacklist.json
│   │   ├── conf_bl.json
│   │   ├── conf_freq.json
│   │   ├── conf_ref.json
│   │   ├── drop-db.json
│   │   ├── drop-fp.json
│   │   ├── enr.json
│   │   ├── fingerprint_conf.json
│   │   ├── ioc.json
│   │   ├── ioc_conf.json
│   │   ├── ioc_local.json
│   │   ├── new-save.json
│   │   ├── new.json
│   │   ├── note.json
│   │   ├── note_ref_defaut.json
│   │   ├── pattern.db
│   │   ├── reference.json
│   │   ├── sig.json
│   │   └── whois.json
│   ├── lib
│   │   └── logstash
│   │       └── filters
│   │           └── sig.rb
│   ├── logstash-filter-sig.gemspec
│   ├── scripts-create-db
│   │   ├── create_ref.rb
│   │   ├── ioc_create.sh
│   │   └── misp2json4ioc.py
│   └── spec
│       ├── filters
│       │   └── sig_spec.rb
│       └── spec_helper.rb
├── logstash-output-fir
│   ├── .travis.yml
│   ├── CHANGELOG.md
│   ├── CONTRIBUTORS
│   ├── DEVELOPER.md
│   ├── Gemfile
│   ├── LICENSE
│   ├── NOTICE.TXT
│   ├── README.md
│   ├── Rakefile
│   ├── lib
│   │   └── logstash
│   │       └── outputs
│   │           └── fir.rb
│   ├── logstash-output-fir.gemspec
│   ├── sample_conf
│   │   └── conf_fir.json
│   ├── spec
│   │   └── outputs
│   │       └── fir_spec.rb
│   └── template_erb
│       ├── subject_template_new.erb
│       ├── subject_template_update.erb
│       ├── template_new.erb
│       └── template_update.erb
└── sample-architecture
    ├── Architecture-sample.png
    └── Diagramme-archi.png

--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
                                 Apache License
                           Version 2.0, January 2004
                        http://www.apache.org/licenses/

   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

   1. Definitions.

      "License" shall mean the terms and conditions for use, reproduction,
      and distribution as defined by Sections 1 through 9 of this document.

      "Licensor" shall mean the copyright owner or entity authorized by
      the copyright owner that is granting the License.

      "Legal Entity" shall mean the union of the acting entity and all
      other entities that control, are controlled by, or are under common
      control with that entity. For the purposes of this definition,
      "control" means (i) the power, direct or indirect, to cause the
      direction or management of such entity, whether by contract or
      otherwise, or (ii) ownership of fifty percent (50%) or more of the
      outstanding shares, or (iii) beneficial ownership of such entity.

      "You" (or "Your") shall mean an individual or Legal Entity
      exercising permissions granted by this License.

      "Source" form shall mean the preferred form for making modifications,
      including but not limited to software source code, documentation
      source, and configuration files.

      "Object" form shall mean any form resulting from mechanical
      transformation or translation of a Source form, including but
      not limited to compiled object code, generated documentation,
      and conversions to other media types.
      "Work" shall mean the work of authorship, whether in Source or
      Object form, made available under the License, as indicated by a
      copyright notice that is included in or attached to the work
      (an example is provided in the Appendix below).

      "Derivative Works" shall mean any work, whether in Source or Object
      form, that is based on (or derived from) the Work and for which the
      editorial revisions, annotations, elaborations, or other modifications
      represent, as a whole, an original work of authorship. For the purposes
      of this License, Derivative Works shall not include works that remain
      separable from, or merely link (or bind by name) to the interfaces of,
      the Work and Derivative Works thereof.

      "Contribution" shall mean any work of authorship, including
      the original version of the Work and any modifications or additions
      to that Work or Derivative Works thereof, that is intentionally
      submitted to Licensor for inclusion in the Work by the copyright owner
      or by an individual or Legal Entity authorized to submit on behalf of
      the copyright owner. For the purposes of this definition, "submitted"
      means any form of electronic, verbal, or written communication sent
      to the Licensor or its representatives, including but not limited to
      communication on electronic mailing lists, source code control systems,
      and issue tracking systems that are managed by, or on behalf of, the
      Licensor for the purpose of discussing and improving the Work, but
      excluding communication that is conspicuously marked or otherwise
      designated in writing by the copyright owner as "Not a Contribution."

      "Contributor" shall mean Licensor and any individual or Legal Entity
      on behalf of whom a Contribution has been received by Licensor and
      subsequently incorporated within the Work.

   2. Grant of Copyright License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      copyright license to reproduce, prepare Derivative Works of,
      publicly display, publicly perform, sublicense, and distribute the
      Work and such Derivative Works in Source or Object form.

   3. Grant of Patent License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      (except as stated in this section) patent license to make, have made,
      use, offer to sell, sell, import, and otherwise transfer the Work,
      where such license applies only to those patent claims licensable
      by such Contributor that are necessarily infringed by their
      Contribution(s) alone or by combination of their Contribution(s)
      with the Work to which such Contribution(s) was submitted. If You
      institute patent litigation against any entity (including a
      cross-claim or counterclaim in a lawsuit) alleging that the Work
      or a Contribution incorporated within the Work constitutes direct
      or contributory patent infringement, then any patent licenses
      granted to You under this License for that Work shall terminate
      as of the date such litigation is filed.

   4. Redistribution.
      You may reproduce and distribute copies of the
      Work or Derivative Works thereof in any medium, with or without
      modifications, and in Source or Object form, provided that You
      meet the following conditions:

      (a) You must give any other recipients of the Work or
          Derivative Works a copy of this License; and

      (b) You must cause any modified files to carry prominent notices
          stating that You changed the files; and

      (c) You must retain, in the Source form of any Derivative Works
          that You distribute, all copyright, patent, trademark, and
          attribution notices from the Source form of the Work,
          excluding those notices that do not pertain to any part of
          the Derivative Works; and

      (d) If the Work includes a "NOTICE" text file as part of its
          distribution, then any Derivative Works that You distribute must
          include a readable copy of the attribution notices contained
          within such NOTICE file, excluding those notices that do not
          pertain to any part of the Derivative Works, in at least one
          of the following places: within a NOTICE text file distributed
          as part of the Derivative Works; within the Source form or
          documentation, if provided along with the Derivative Works; or,
          within a display generated by the Derivative Works, if and
          wherever such third-party notices normally appear. The contents
          of the NOTICE file are for informational purposes only and
          do not modify the License. You may add Your own attribution
          notices within Derivative Works that You distribute, alongside
          or as an addendum to the NOTICE text from the Work, provided
          that such additional attribution notices cannot be construed
          as modifying the License.

      You may add Your own copyright statement to Your modifications and
      may provide additional or different license terms and conditions
      for use, reproduction, or distribution of Your modifications, or
      for any such Derivative Works as a whole, provided Your use,
      reproduction, and distribution of the Work otherwise complies with
      the conditions stated in this License.

   5. Submission of Contributions. Unless You explicitly state otherwise,
      any Contribution intentionally submitted for inclusion in the Work
      by You to the Licensor shall be under the terms and conditions of
      this License, without any additional terms or conditions.
      Notwithstanding the above, nothing herein shall supersede or modify
      the terms of any separate license agreement you may have executed
      with Licensor regarding such Contributions.

   6. Trademarks. This License does not grant permission to use the trade
      names, trademarks, service marks, or product names of the Licensor,
      except as required for reasonable and customary use in describing the
      origin of the Work and reproducing the content of the NOTICE file.

   7. Disclaimer of Warranty. Unless required by applicable law or
      agreed to in writing, Licensor provides the Work (and each
      Contributor provides its Contributions) on an "AS IS" BASIS,
      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
      implied, including, without limitation, any warranties or conditions
      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
      PARTICULAR PURPOSE.
      You are solely responsible for determining the
      appropriateness of using or redistributing the Work and assume any
      risks associated with Your exercise of permissions under this License.

   8. Limitation of Liability. In no event and under no legal theory,
      whether in tort (including negligence), contract, or otherwise,
      unless required by applicable law (such as deliberate and grossly
      negligent acts) or agreed to in writing, shall any Contributor be
      liable to You for damages, including any direct, indirect, special,
      incidental, or consequential damages of any character arising as a
      result of this License or out of the use or inability to use the
      Work (including but not limited to damages for loss of goodwill,
      work stoppage, computer failure or malfunction, or any and all
      other commercial damages or losses), even if such Contributor
      has been advised of the possibility of such damages.

   9. Accepting Warranty or Additional Liability. While redistributing
      the Work or Derivative Works thereof, You may choose to offer,
      and charge a fee for, acceptance of support, warranty, indemnity,
      or other liability obligations and/or rights consistent with this
      License. However, in accepting such obligations, You may act only
      on Your own behalf and on Your sole responsibility, not on behalf
      of any other Contributor, and only if You agree to indemnify,
      defend, and hold each Contributor harmless for any liability
      incurred by, or claims asserted against, such Contributor by reason
      of your accepting any such warranty or additional liability.

   END OF TERMS AND CONDITIONS

   APPENDIX: How to apply the Apache License to your work.

      To apply the Apache License to your work, attach the following
      boilerplate notice, with the fields enclosed by brackets "{}"
      replaced with your own identifying information. (Don't include
      the brackets!) The text should be enclosed in the appropriate
      comment syntax for the file format. We also recommend that a
      file or class name and description of purpose be included on the
      same "printed page" as the copyright notice for easier
      identification within third-party archives.

   Copyright {yyyy} {name of copyright owner}

   Licensed under the Apache License, Version 2.0 (the "License");
   you may not use this file except in compliance with the License.
   You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
# Logstash security plugins

*These plugins support security log analysis (close to a SIEM, but real-time only, without post-correlation) and compute a score used to raise alerts.*

- logstash-filter-sig (filter plugin): analyzes events and detects security threats in order to raise alerts
- logstash-filter-enrsig (filter plugin): enriches events from different sources (local databases, dynamic requests, ...)
- logstash-output-fir (output plugin): pushes alerts to the FIR platform (CERT Société Générale)

## logstash-filter-sig

*The Logstash filter plugin "sig" helps you detect security threats in logs in several ways.*

### Features

* Drops known false positives and noisy events (first pass)
* Enriches events from a database and/or sends them to the enrsig plugin for active enrichment (whois, ssl_check, nmap, ...)
* Drops false positives and noisy events a second time (based on the enrichment information)
* Detects new values in fields
* Checks blacklist reputation
* Checks events against IOCs (extracted from MISP)
* Checks signatures, with the following functionality:
  * Rule composition (a hedged sketch is given at the end of this section):
    * Name: rule name used in reports
    * ID: identifier used to correlate rules and adjust the score
    * Score (note): score given on a match (the score is used to trigger alerts)
    * Type: two possible types: 'primary signature' and 'second signature'. A second signature matches only if a primary signature matched before it.
    * ModeFP: boolean indicating that the rule matches a false positive
    * extract (optional): extracts data when you are sure it is a threat, and stores it in the local IOC database so it is detected in all subsequent events.
  * Multi-search techniques on a field and its value (one or more techniques can be combined in a single rule):
    * Check whether a field is present or absent
    * Check whether a regexp matches or does not match
    * Check a pattern (Array or String)
    * Compare a field value against another field value (string or numeric -- operators: ==, <, >, !=)
    * Check the size (length) of a string field, with the operators: ==, <, >, !=
    * Check an IP address (ex: 192.168.0.0/24) in a field, with the operators: ==, !=
    * Check a numeric field value, with the operators: ==, !=, <, >
    * Check a date value relative to now + X, with the operators: ==, <, >, !=
    * Check whether a date value falls in a given hour, with the operators: ==, <, >, !=
    * Check whether a date value falls on a given day (day number, ex: 0==Sunday, 1==Monday), with the operators: ==, <, >, !=
  * Check frequency across multiple events
    * Can be used for brute force detection (ex: if 3 authentication errors occur within 60 seconds, then do not re-check before 3600 seconds)
    * Correlate multiple sources sharing the same field value (ex: ip) across different events (ex: squid event IP DST == sysmon event IP DST)
  * Check frequency on an event
* Checks events against reference data (requires building a reference database with ES while it contains clean data) (this is the new version of my project AEE [https://github.com/lprat/AEE])
  * Checks size, checks the value format with regexps, checks whether the value is unique or belongs to a determined list (ex: not @timestamp, because it changes every time)
  * Checks links/relationships between fields whose values are not single/determined-list values (idea inspired by the tool PicViz [http://picviz.com/]). Example: in Apache logs, the page test.php always returns 200; the link/relationship value/field is then "uri_page(test.php)<->return_code(200)"
* Analyzes the matched rules to adapt the alert score
* Fingerprints events according to rules, to identify unique events & drop fingerprints (for false-positive handling)
* Checks frequency on specific events selected by filters. No alert is created on the specific event itself; a new event is created instead.

### Install by Docker

* You can use the image from Docker Hub: docker pull lprat/logstash-plugins:latest

*The Dockerfile builds a container with the latest Logstash and installs the plugins sig, enrsig and fir. If you need other plugins, edit the Dockerfile before running docker-compose.*

Enter the directory "docker" and edit the file "docker-compose.yml":
* volumes: change the volume source (on the host) to your Logstash configuration path

Before running docker-compose, verify that the Logstash configuration is valid, and that the plugin configuration is valid too (the sample configurations in the plugin directories can help you).
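### Example rule (sketch)

For illustration only, a minimal rule combining the elements above might look like the sketch below. Every key name here is an assumption chosen for readability, not the actual schema: the authoritative format is the sample file `logstash-filter-sig/conf-samples/sig.json`.

~~~
{
  "rule1": {
    "name": "auth error burst",
    "id": 1001,
    "note": 10,
    "type": 1,
    "modefp": false,
    "motif": {"event_type": ["auth_error"]},
    "freq": {"count": 3, "delay": 60, "wait": 3600}
  }
}
~~~

Read it as: a primary rule (type 1) named "auth error burst" that adds 10 to the score when an event with event_type auth_error is seen 3 times within 60 seconds, then backs off for 3600 seconds.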
## logstash-filter-enrsig

*The Logstash filter plugin "enrsig" enriches events from different information sources (databases, system commands, external requests, ...).*
Normally, enrsig is called by the "sig" plugin (at the beginning of its checks) according to the rules: sig sends the event to the Logstash enrsig instance and waits to receive the result on another input. When the result is received, the enriched event goes back into the sig filter.

### Features

* Checks that the requested enrichment (ask) exists in the configuration (WHOIS, SSL_CHECK, NBTSCAN, NMAP, ...)
* If it exists, checks that the target field value exists and that its format is valid (regexp)
* If a result already exists for the value, moves on to the next ask, or sends the result if this was the last one
* If the data does not exist yet, executes the command syntax with the value(s), parses the result according to the template, then moves on to the next ask or sends the result

### Install by Docker

*The Dockerfile builds a container with the latest Logstash and installs the plugins sig, enrsig and fir. If you need other plugins, edit the Dockerfile before running docker-compose.*

Enter the directory "docker" and edit the file "docker-compose.yml":
* volumes: change the volume source (on the host) to your Logstash configuration path

Before running docker-compose, verify that the Logstash configuration is valid, and that the plugin configuration is valid too (the sample configurations in the plugin directories can help you).
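### Example ask (sketch)

To make the sig/enrsig exchange concrete: sig fills the request field (default name "request_enrichiment") with an array of asks, in the format documented in the comments of `lib/logstash/filters/enrsig.rb`. With the WHOIS entry from the sample `conf_enrsig.json`, an ask might look like the sketch below; the field name `dst_domain` and the value are invented for illustration.

~~~
[
  {"WHOIS": {"id": "rule42", "field": ["dst_domain"], "name_in_db": "EXAMPLE.COM"}}
]
~~~

enrsig substitutes the value of the event field `dst_domain` for `$1$` in the configured `command_syntax`, runs `command_path`, parses the command output through the `result_parse` ERB template into a hash, caches that hash under `name_in_db`, and writes it back into the ask as "response" before the event returns to sig.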
## logstash-output-fir

*Logstash output plugin that sends alerts (created by the sig filter) to FIR (CERT Société Générale - https://github.com/certsocietegenerale/FIR)*

### Features

* Rules define which alerts are sent to FIR
* Create your own template, or use the default one, to customize the alerts sent to FIR
* Uses the fingerprint (sig plugin) to group all alerts for one IP SRC/MAC address into a single FIR thread

### Install by Docker

* Docker image: lprat/logstash-plugins:latest

*The Dockerfile builds a container with the latest Logstash and installs the plugins sig, enrsig and fir. If you need other plugins, edit the Dockerfile before running docker-compose.*

Enter the directory "docker" and edit the file "docker-compose.yml":
* volumes: change the volume source (on the host) to your Logstash configuration path

Before running docker-compose, verify that the Logstash configuration is valid, and that the plugin configuration is valid too (the sample configurations in the plugin directories can help you).

* Lumberjack certificate
~~~
reference: https://github.com/logstash-plugins/logstash-output-lumberjack/issues/11
#Generate a CA, Key + Signed Cert
/opt/elasticsearch-6.4.2/bin/elasticsearch-certutil cert --pem
unzip certificate-bundle.zip

# Convert the key to PKCS8
openssl pkcs8 -in ./instance/instance.key -topk8 -nocrypt -out ./instance/instance.pk8

###################################################################
#Upstream Logstash Server

# This input could be anything...
input {
  stdin {}
}

# Send the output to a downstream server
output {
  lumberjack {
    codec => json
    hosts => [ "127.0.0.1" ]
    port => 5044
    ssl_certificate => "/Users/Downloads/lumberjack/certs/ca.crt"
  }
}

###################################################################
#Downstream Logstash Server
#
# Using the BEATS input to receive data from the upstream Logstash server
# which is using the lumberjack output.
#
input {
  beats {
    id => "mylumberjack"
    codec => json
    port => 5044
    ssl_certificate => "/Users/Downloads/lumberjack/certs/instance.crt"
    ssl_key => "/Users/Downloads/lumberjack/certs/instance.pk8"
    ssl => true
  }
}

output {
  stdout { codec => rubydebug }
}
~~~

## Architecture sample (FR version)
![alt text](https://github.com/lprat/logstash-plugins/raw/master/sample-architecture/Architecture-sample.png "Architecture sample")
![alt text](https://github.com/lprat/logstash-plugins/raw/master/sample-architecture/Diagramme-archi.png "Diagramme architecture sample")

## Contact

lionel.prat9 (at) gmail.com or cronos56 (at) yahoo.com

--------------------------------------------------------------------------------
/docker/Dockerfile:
--------------------------------------------------------------------------------
#Docker logstash with plugin output FIR && filter sig + enrsig
# Pull base image.
FROM docker.elastic.co/logstash/logstash-oss:6.6.2
MAINTAINER Lionel PRAT

#install simhash
USER root
RUN yum update -y && yum groupinstall -y 'Development Tools'
USER logstash
RUN env GEM_HOME=/usr/share/logstash/vendor/bundle/jruby/2.3.0 /usr/share/logstash/vendor/jruby/bin/jruby /usr/share/logstash/vendor/jruby/bin/gem install activesupport -v '4.1.16'
RUN env GEM_HOME=/usr/share/logstash/vendor/bundle/jruby/2.3.0 /usr/share/logstash/vendor/jruby/bin/jruby /usr/share/logstash/vendor/jruby/bin/gem install simhash -v '0.2.5'

#install plugins
RUN logstash-plugin install logstash-output-fir
RUN logstash-plugin install logstash-filter-sig
RUN logstash-plugin install logstash-filter-enrsig

#install other plugins
RUN logstash-plugin install logstash-output-lumberjack
RUN logstash-plugin install logstash-input-lumberjack

#install extra tools used by enrsig, for example:
#RUN apt-get update && apt-get install -y --no-install-recommends nbtscan python-whois && rm -rf /var/lib/apt/lists/*

ENTRYPOINT ["/usr/local/bin/docker-entrypoint"]
CMD ["-e", ""]

--------------------------------------------------------------------------------
/docker/cron_gitlogstashfir:
--------------------------------------------------------------------------------
SHELL=/bin/bash
PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin

*/15 * * * * root /usr/local/bin/git-logstash-instance-x.sh

--------------------------------------------------------------------------------
/docker/docker-compose.yml:
--------------------------------------------------------------------------------
version: '2'

services:
  logstash-instancex:
    image: lprat/logstash-plugins:latest
    build:
      context: .
      args:
        - http_proxy=${http_proxy}
        - https_proxy=${https_proxy}
        #!!! set http_proxy_host & http_proxy_port before running docker-compose build
        - JRUBY_OPTS="-J-Dhttp.proxyHost=${http_proxy_host} -J-Dhttp.proxyPort=${http_proxy_port}"
        - no_proxy=${no_proxy}
    #mem_reservation: '512m'
    #memswap_limit: '1g'
    #mem_limit: '1g'
    #cpuset: '1'
    #!!! run git-logstash-instance-x.sh to load the right configuration for this instance BEFORE running docker-compose !!!
    volumes:
      - /opt/logstash-instance-x:/etc/logstash
    #open the ports Logstash listens on (inputs)
    ports:
      - "5000:5000"
    #depends_on:
    #external_links:
    #environment:
    #  - ES_JAVA_OPTS="-Xms256m -Xmx1g"
    #env_file:
    restart: always
    command: logstash -f /etc/logstash/conf.d/ -w 5 -r
    logging:
      driver: "syslog"
      options:
        syslog-address: "tcp://172.17.0.1:514"
        tag: "docker_{{.ImageName}}_{{.Name}}"

--------------------------------------------------------------------------------
/docker/git-logstash-instance-x.sh:
--------------------------------------------------------------------------------
#!/bin/bash
#contact: lionel.prat9@gmail.com
#if the env var $GIT_UPDATE_LOGSTASH_FIR is set to a PATH, the logstash configuration is updated from GIT (PATH == path of the git clone)
#the GIT remote comes from $GIT_UPDATE_LOGSTASH_FIR_URL
#check that the path is not empty
GIT_UPDATE_LOGSTASH_FIR="/opt/logstash-instance-x"
GIT_UPDATE_LOGSTASH_FIR_URL=https://mylocal.git/logstash-conf_instance-1.git
if [ -n "$GIT_UPDATE_LOGSTASH_FIR" ]; then
  #validate the path
  if [[ "$GIT_UPDATE_LOGSTASH_FIR" =~ ^(/[^/ ]*)+/?$ ]]; then
    #first run: clone
    if [ ! -d "$GIT_UPDATE_LOGSTASH_FIR/.git" ]; then
-f "$GIT_UPDATE_LOGSTASH_FIR/.git" ]; then 13 | if [ ! -d "$GIT_UPDATE_LOGSTASH_FIR" ]; then 14 | mkdir -p $GIT_UPDATE_LOGSTASH_FIR 15 | fi 16 | if [ -n "$GIT_UPDATE_LOGSTASH_FIR_URL" ]; then 17 | git clone $GIT_UPDATE_LOGSTASH_FIR_URL $GIT_UPDATE_LOGSTASH_FIR 18 | else 19 | exit -1 20 | fi 21 | fi 22 | #else time 23 | if [ -d "$GIT_UPDATE_LOGSTASH_FIR/.git" ]; then 24 | cd $GIT_UPDATE_LOGSTASH_FIR && git pull 25 | fi 26 | fi 27 | fi 28 | -------------------------------------------------------------------------------- /docker/mkmf.rb.patch: -------------------------------------------------------------------------------- 1 | --- /usr/share/logstash/vendor/jruby/lib/ruby/shared/mkmf.rb 2017-04-28 18:11:45.000000000 +0000 2 | +++ /usr/share/logstash/vendor/jruby/lib/ruby/shared/mkmf.rb.new 2017-06-02 14:31:42.325422414 +0000 3 | @@ -43,7 +43,8 @@ 4 | RbConfig::MAKEFILE_CONFIG["CFLAGS"] += " $(cflags)" 5 | RbConfig::MAKEFILE_CONFIG["CPPFLAGS"] += " $(DEFS) $(cppflags)" 6 | RbConfig::MAKEFILE_CONFIG["CXXFLAGS"] += " $(cflags) $(cxxflags)" 7 | - 8 | +RbConfig::MAKEFILE_CONFIG["CPPFLAGS"] += ' -I/usr/local/rvm/rubies/ruby-1.9.3-p551-dev/include/ruby-1.9.1/x86_64-linux/' 9 | +RbConfig::MAKEFILE_CONFIG['includedir'] = "/usr/local/rvm/rubies/ruby-1.9.3-p551-dev/include/ruby-1.9.1/" 10 | $topdir = RbConfig::MAKEFILE_CONFIG['includedir'] 11 | $hdrdir = File.join($topdir, "ruby") 12 | $top_srcdir = $topdir 13 | -------------------------------------------------------------------------------- /docker/rsyslog-docker.conf: -------------------------------------------------------------------------------- 1 | #/etc/rsyslog.d/docker.conf add 2 | #active TCP module on ip docker 3 | #module(load="imtcp") 4 | #input(type="imtcp" port="514" address="172.17.0.1") 5 | $template HostBasedLog,"/var/log/dockers/%PROGRAMNAME%.log" 6 | if $programname startswith 'docker_' then -?HostBasedLog 7 | & ~ 8 | -------------------------------------------------------------------------------- /docker/validate-json.sh: -------------------------------------------------------------------------------- 1 | #!/bin/bash 2 | cat $1 | python -m json.tool >> /dev/null && exit 0 || echo "NOT valid JSON"; exit 1 3 | -------------------------------------------------------------------------------- /logstash-filter-enrsig/.travis.yml: -------------------------------------------------------------------------------- 1 | sudo: false 2 | language: ruby 3 | cache: bundler 4 | rvm: 5 | - jruby-1.7.23 6 | script: 7 | - bundle exec rspec spec 8 | -------------------------------------------------------------------------------- /logstash-filter-enrsig/CHANGELOG.md: -------------------------------------------------------------------------------- 1 | ## 0.9.2 2 | - add conf sample nbtscan 3 | - Correct bug, code work. 4 | ## 0.9.1 5 | - Add conf sample for whois 6 | - Correct code bug 7 | ## 0.9.0 8 | - Plugins work on logstash 5.4 9 | 10 | -------------------------------------------------------------------------------- /logstash-filter-enrsig/CONTRIBUTORS: -------------------------------------------------------------------------------- 1 | The following is a list of people who have contributed ideas, code, bug 2 | reports, or in general have helped logstash along its way. 

Contributors:
* Aaron Mildenstein (untergeek)
* Pier-Hugues Pellerin (ph)

Note: If you've sent us patches, bug reports, or otherwise contributed to
Logstash, and you aren't on the list above and want to be, please let us know
and we'll make sure you're here. Contributions from folks like you are what make
open source awesome.

--------------------------------------------------------------------------------
/logstash-filter-enrsig/DEVELOPER.md:
--------------------------------------------------------------------------------
# logstash-filter-enrsig
Example filter plugin. This should help bootstrap your effort to write your own filter plugin!

--------------------------------------------------------------------------------
/logstash-filter-enrsig/Gemfile:
--------------------------------------------------------------------------------
source 'https://rubygems.org'
gemspec
gem "logstash", :github => "elastic/logstash", :branch => "5.4"

--------------------------------------------------------------------------------
/logstash-filter-enrsig/LICENSE:
--------------------------------------------------------------------------------
Copyright (c) 2012–2016 Elasticsearch

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

--------------------------------------------------------------------------------
/logstash-filter-enrsig/NOTICE.TXT:
--------------------------------------------------------------------------------
Elasticsearch
Copyright 2012-2015 Elasticsearch

This product includes software developed by The Apache Software
Foundation (http://www.apache.org/).

--------------------------------------------------------------------------------
/logstash-filter-enrsig/README.md:
--------------------------------------------------------------------------------
# Logstash Plugin

[![Travis Build Status](https://travis-ci.org/logstash-plugins/logstash-filter-example.svg)](https://travis-ci.org/logstash-plugins/logstash-filter-example)

This is a plugin for [Logstash](https://github.com/elastic/logstash).

It is fully free and fully open source. The license is Apache 2.0, meaning you are pretty much free to use it however you want in whatever way.

## Documentation

Logstash provides infrastructure to automatically generate documentation for this plugin. We use the asciidoc format to write documentation so any comments in the source code will be first converted into asciidoc and then into html. All plugin documentation is placed under one [central location](http://www.elastic.co/guide/en/logstash/current/).

- For formatting code or config examples, you can use the asciidoc `[source,ruby]` directive
- For more asciidoc formatting tips, see the excellent reference here https://github.com/elastic/docs#asciidoc-guide

## Need Help?
Need help? Try #logstash on freenode IRC or the https://discuss.elastic.co/c/logstash discussion forum.

## Developing

### 1. Plugin Development and Testing

#### Code
- To get started, you'll need JRuby with the Bundler gem installed.

- Create a new plugin or clone an existing one from the GitHub [logstash-plugins](https://github.com/logstash-plugins) organization. We also provide [example plugins](https://github.com/logstash-plugins?query=example).

- Install dependencies
```sh
bundle install
```

#### Test

- Update your dependencies

```sh
bundle install
```

- Run tests

```sh
bundle exec rspec
```

### 2. Running your unpublished Plugin in Logstash

#### 2.1 Run in a local Logstash clone

- Edit Logstash `Gemfile` and add the local plugin path, for example:
```ruby
gem "logstash-filter-awesome", :path => "/your/local/logstash-filter-awesome"
```
- Install plugin
```sh
# Logstash 2.3 and higher
bin/logstash-plugin install --no-verify

# Prior to Logstash 2.3
bin/plugin install --no-verify

```
- Run Logstash with your plugin
```sh
bin/logstash -e 'filter {awesome {}}'
```
At this point any modifications to the plugin code will be applied to this local Logstash setup. After modifying the plugin, simply rerun Logstash.

#### 2.2 Run in an installed Logstash

You can use the same **2.1** method to run your plugin in an installed Logstash by editing its `Gemfile` and pointing the `:path` to your local plugin development directory or you can build the gem and install it using:

- Build your plugin gem
```sh
gem build logstash-filter-awesome.gemspec
```
- Install the plugin from the Logstash home
```sh
# Logstash 2.3 and higher
bin/logstash-plugin install --no-verify

# Prior to Logstash 2.3
bin/plugin install --no-verify

```
- Start Logstash and proceed to test the plugin

## Contributing

All contributions are welcome: ideas, patches, documentation, bug reports, complaints, and even something you drew up on a napkin.

Programming is not a required skill. Whatever you've seen about open source and maintainers or community members saying "send patches or die" - you will not see that here.

It is more important to the community that you are able to contribute.

For more information about contributing, see the [CONTRIBUTING](https://github.com/elastic/logstash/blob/master/CONTRIBUTING.md) file.
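## Example configuration (sketch)

For reference, the filter exposes three options, declared in `lib/logstash/filters/enrsig.rb`; the sketch below simply restates their defaults in a pipeline configuration, so omitting any of them changes nothing.

```
filter {
  enrsig {
    # JSON configuration describing each active check (WHOIS, NBTSCAN, ...)
    conf_enrsig => "/etc/logstash/db/conf_enrsig.json"
    # reload the configuration every hour
    refresh_interval => 3600
    # field carrying the enrichment requests sent by the sig plugin
    field_enr => "request_enrichiment"
  }
}
```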
--------------------------------------------------------------------------------
/logstash-filter-enrsig/Rakefile:
--------------------------------------------------------------------------------
require "logstash/devutils/rake"

--------------------------------------------------------------------------------
/logstash-filter-enrsig/conf-samples/conf_enrsig.json:
--------------------------------------------------------------------------------
{
  "WHOIS": {"value_format": ["[\\.A-Z0-9_-]+"], "command_path": "/usr/local/bin/json-whois.py", "command_syntax": "$1$","result_parse": "/etc/logstash/db/template_whois_2_json.erb"},
  "NBTSCAN": {"value_format": ["[0-9\\.]+","[A-Z0-9\\-\\.\\:]+"], "command_path": "/usr/bin/nbtscan", "command_syntax": "$1$ -v -t 1000 -s \'|\'","result_parse": "/etc/logstash/db/template_nbtscan_2_json.erb"}
}

--------------------------------------------------------------------------------
/logstash-filter-enrsig/conf-samples/template_nbtscan_2_json.erb:
--------------------------------------------------------------------------------
<%=
hash_result={}
unless output_cmd.to_s.empty?
  output_cmd.each_line do |line|
    ar_line=line.split("|")
    if ar_line.length == 3
      ar_line[2]=ar_line[2].to_s.gsub(/\n/,"")
      ar_line[1]=ar_line[1].to_s.gsub(/\n/,"")
      if ar_line[2] =~ /^[a-f0-9]{1}[a-f0-9]{1}[A-Z]$/
        if ar_line[2] =~ /^[a-f0-9]{1}[a-f0-9]{1}[G]$/
          hash_result['group']=ar_line[1].strip
        elsif ar_line[2] =~ /^[a-f0-9]{1}[a-f0-9]{1}[U]$/
          hash_result['netbiosname']=ar_line[1].strip
        else
          hash_result[ar_line[2]]=ar_line[1]
        end
      else
        hash_result[ar_line[1]]=ar_line[2]
      end
    end
  end
end
hash_result
%>

--------------------------------------------------------------------------------
/logstash-filter-enrsig/conf-samples/template_whois_2_json.erb:
--------------------------------------------------------------------------------
<%=
hash_result = {}
output_cmd = output_cmd.gsub(': null,', ': "",')
output_cmd = output_cmd.gsub(': null', ': ""')
output_cmd = output_cmd.gsub('=>', ':')
hash_result = JSON.parse(output_cmd)
hash_result
%>

--------------------------------------------------------------------------------
/logstash-filter-enrsig/extra/json-whois.py:
--------------------------------------------------------------------------------
import whois
import sys

if len(sys.argv) > 1:
    w = whois.whois(sys.argv[1])
    print(w)

--------------------------------------------------------------------------------
/logstash-filter-enrsig/lib/logstash/filters/enrsig.rb:
--------------------------------------------------------------------------------
# encoding: utf-8
require "logstash/filters/base"
require "logstash/namespace"
require "json"
require "time"
require 'erb'
require 'digest'
require 'openssl'

# The enrsig filter enriches events by running external commands (whois,
# nbtscan, ...) described in a JSON configuration, and caches the results.
class LogStash::Filters::Enrsig < LogStash::Filters::Base
  config_name "enrsig"

  # File containing the configuration:
  #{'WHOIS': {'value_format': ['regexp_valid_value_for_$1$',...], 'command_path': '/usr/local/cmd', 'command_syntax': "-x ... $1$ $2$", 'result_parse': 'template_create_json.erb'}}
$1$ $2$"},'result_parse': 'template_create_json.erb'} 19 | #$1$ is first element in element content in query: [{WHOIS: {"id": id_rule, "field": [field_$1$], "name_in_db": "$1$"}},{SSL: {"id": id_rule, "field": [field_$1$,field_$2$], "name_in_db": "https://$1$:$2$"}}] 20 | config :conf_enrsig, :validate => :string, :default => "/etc/logstash/db/conf_enrsig.json" 21 | # delay to refresh configuration - default all hours 22 | config :refresh_interval, :validate => :number, :default => 3600 23 | #field name where you add request for server add information active 24 | config :field_enr, :validate => :string, :default => "request_enrichiment" 25 | 26 | public 27 | def register 28 | @logger.info("Configuration Loading...") 29 | @cmd_db = {} 30 | @conf_enr = {} 31 | @hash_conf = "" 32 | load_conf 33 | @logger.info("finish") 34 | @next_refresh = Time.now + @refresh_interval 35 | @load_statut = true 36 | end # def register 37 | 38 | public 39 | def filter(event) 40 | return unless filter?(event) 41 | tnow = Time.now 42 | if @next_refresh < tnow 43 | if @load_statut == true 44 | @load_statut = false 45 | @logger.info("Configuration refresh...") 46 | load_conf 47 | @next_refresh = tnow + @refresh_interval 48 | @load_statut = true 49 | end 50 | end 51 | sleep(1) until @load_statut 52 | #verify if conf is not empty, if message contains ask 53 | if not @conf_enr.nil? and event.get(@field_enr).is_a?(Array) 54 | response=event.get(@field_enr).dup 55 | #verify if command exist in conf 56 | cnt_ea=0 57 | for request_cmd in event.get(@field_enr) 58 | if request_cmd.is_a?(Hash) and not request_cmd.empty? 59 | #verify if command in request, exist in db 60 | if @conf_enr[request_cmd.keys[0]].is_a?(Hash) 61 | #verify if answer already present in db 62 | if not @cmd_db[request_cmd.keys[0]].is_a?(Hash) and @cmd_db[request_cmd.keys[0]][request_cmd[request_cmd.keys[0]]['name_in_db']].is_a?(Hash) 63 | #add info 64 | response[cnt_ea][request_cmd.keys[0]]['response']=@cmd_db[request_cmd.keys[0]][request_cmd[request_cmd.keys[0]]['name_in_db']] 65 | else 66 | #verify if field is present in event 67 | next if @conf_enr[request_cmd.keys[0]]['value_format'].length != request_cmd[request_cmd.keys[0]]['field'].length 68 | syntax_cmd=@conf_enr[request_cmd.keys[0]]['command_syntax'].dup 69 | #if field link not present, next! 70 | pnext=false 71 | cnt_e=1 72 | for flval in request_cmd[request_cmd.keys[0]]['field'] 73 | if event.get(flval.to_s).nil? 74 | pnext=true 75 | break 76 | else 77 | #create syntaxe 78 | value_e=event.get(flval.to_s) 79 | pvf=cnt_e-1 80 | #verify format (avoid vulnerability escape) || FILTER 81 | begin 82 | if value_e =~ /#{@conf_enr[request_cmd.keys[0]]['value_format'][pvf]}/i 83 | syntax_cmd.gsub! 
                      cnt_e+=1
                    else
                      @logger.warn("Value does not match the command syntax format filter #{Regexp.escape(@conf_enr[request_cmd.keys[0]]['value_format'][pvf])}", :cmd => value_e)
                    end
                  rescue
                    @logger.warn("Regexp error", :regexp => @conf_enr[request_cmd.keys[0]]['value_format'][pvf])
                  end
                end
              end
              next if pnext
              #verify that the format check passed on all fields
              next if cnt_e != request_cmd[request_cmd.keys[0]]['field'].length+1 or syntax_cmd =~ /\$\d+\$/
              #run the command
              output_cmd = `#{@conf_enr[request_cmd.keys[0]]['command_path']} #{syntax_cmd}`
              #transform the "output_cmd" value into a Hash with ERB
              begin
                result=ERB.new(@conf_enr[request_cmd.keys[0]]['template_erb']).result(binding)
                result=JSON.parse result.gsub('=>', ':')
                if result.is_a?(Hash)
                  #insert into the response
                  response[cnt_ea][request_cmd.keys[0]]['response']=result
                  #insert into the db (cache)
                  @cmd_db[request_cmd.keys[0]][request_cmd[request_cmd.keys[0]]['name_in_db']] = {} if @cmd_db[request_cmd.keys[0]][request_cmd[request_cmd.keys[0]]['name_in_db']].nil?
                  @cmd_db[request_cmd.keys[0]][request_cmd[request_cmd.keys[0]]['name_in_db']]=result
                else
                  @logger.warn("Command and ERB template don't produce a Hash result!!", :result => result)
                end
              rescue
                @logger.warn("ERB/JSON parse error", :result => output_cmd)
              end
            end
            #finish (send the response back to the origin)
            event.set(@field_enr,response)
          end
        end
        cnt_ea+=1
      end
    end
    # filter_matched should go in the last line of our successful code
    filter_matched(event)
  end # def filter

  private
  def load_conf
    if !File.exists?(@conf_enrsig)
      @logger.warn("DB file read failure, stop loading", :path => @conf_enrsig)
      exit -1
    end
    tmp_hash = Digest::SHA256.hexdigest File.read @conf_enrsig
    if not tmp_hash == @hash_conf
      @hash_conf = tmp_hash
      begin
        tmp_enr = JSON.parse( IO.read(@conf_enrsig, encoding:'utf-8') )
        #create the db structure
        @conf_enr = tmp_enr
        @conf_enr.each do |k,v|
          @cmd_db[k]={} if @cmd_db[k].nil?
          if File.file?(@conf_enr[k]['result_parse'].to_s)
            @conf_enr[k]['template_erb']=File.read(@conf_enr[k]['result_parse'].to_s)
          else
            @logger.warn("Parse template for rule #{k.to_s} not found...", :path => @conf_enr[k]['result_parse'])
            @conf_enr[k]['template_erb']=""
          end
        end
      rescue
        @logger.error("JSON CONF ENR_SIG -- PARSE ERROR")
      end
    end
  end
end # class LogStash::Filters::Enrsig

--------------------------------------------------------------------------------
/logstash-filter-enrsig/logstash-filter-enrsig.gemspec:
--------------------------------------------------------------------------------
Gem::Specification.new do |s|
  s.name = 'logstash-filter-enrsig'
  s.version = '0.9.2'
  s.licenses = ['Apache License (2.0)']
  s.summary = "The enrsig filter executes requests (commands) to enrich events."
  s.description = "This gem is a Logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/logstash-plugin install gemname. This gem is not a stand-alone program"
  s.authors = ["Lionel PRAT"]
  s.email = 'lionel.prat9@gmail.com'
  s.homepage = "http://www.elastic.co/guide/en/logstash/current/index.html"
  s.require_paths = ["lib"]

  # Files
  s.files = Dir['lib/**/*','spec/**/*','vendor/**/*','*.gemspec','*.md','CONTRIBUTORS','Gemfile','LICENSE','NOTICE.TXT']
  # Tests
  s.test_files = s.files.grep(%r{^(test|spec|features)/})

  # Special flag to let us know this is actually a logstash plugin
  s.metadata = { "logstash_plugin" => "true", "logstash_group" => "filter" }

  # Gem dependencies
  s.add_runtime_dependency "logstash-core-plugin-api", ">= 1.60", "<= 2.99"
  s.add_development_dependency 'logstash-devutils'
end

--------------------------------------------------------------------------------
/logstash-filter-enrsig/spec/filters/enrsig_spec.rb:
--------------------------------------------------------------------------------
# encoding: utf-8
require 'spec_helper'
require "logstash/filters/enrsig"

describe LogStash::Filters::Enrsig do
  describe "Set to Hello World" do
    let(:config) do <<-CONFIG
      filter {
        example {
          message => "Hello World"
        }
      }
    CONFIG
    end

    sample("message" => "some text") do
      expect(subject).to include("message")
      expect(subject['message']).to eq('Hello World')
    end
  end
end

--------------------------------------------------------------------------------
/logstash-filter-enrsig/spec/spec_helper.rb:
--------------------------------------------------------------------------------
# encoding: utf-8
require "logstash/devutils/rspec/spec_helper"

--------------------------------------------------------------------------------
/logstash-filter-sig/.travis.yml:
--------------------------------------------------------------------------------
sudo: false
language: ruby
cache: bundler
rvm:
- jruby-1.7.25
jdk: oraclejdk8
script:
- bundle exec rspec spec

--------------------------------------------------------------------------------
/logstash-filter-sig/CHANGELOG.md:
--------------------------------------------------------------------------------
## 0.9.0
- Plugin works on logstash 5.4

--------------------------------------------------------------------------------
/logstash-filter-sig/CONTRIBUTORS:
--------------------------------------------------------------------------------
The following is a list of people who have contributed ideas, code, bug
reports, or in general have helped logstash along its way.

Contributors:
* Aaron Mildenstein (untergeek)
* Pier-Hugues Pellerin (ph)

Note: If you've sent us patches, bug reports, or otherwise contributed to
Logstash, and you aren't on the list above and want to be, please let us know
and we'll make sure you're here. Contributions from folks like you are what make
open source awesome.

--------------------------------------------------------------------------------
/logstash-filter-sig/DEVELOPER.md:
--------------------------------------------------------------------------------
# logstash-filter-sig
* Lionel PRAT - lionel.prat9@gmail.com

--------------------------------------------------------------------------------
/logstash-filter-sig/Gemfile:
--------------------------------------------------------------------------------
source 'https://rubygems.org'
gemspec
gem "logstash", :github => "elastic/logstash", :branch => "5.4"

--------------------------------------------------------------------------------
/logstash-filter-sig/LICENSE:
--------------------------------------------------------------------------------
Copyright (c) 2012–2016 Elasticsearch

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

--------------------------------------------------------------------------------
/logstash-filter-sig/NOTICE.TXT:
--------------------------------------------------------------------------------
Elasticsearch
Copyright 2012-2015 Elasticsearch

This product includes software developed by The Apache Software
Foundation (http://www.apache.org/).

--------------------------------------------------------------------------------
/logstash-filter-sig/README.md:
--------------------------------------------------------------------------------
# Logstash Plugin Filter "SIG"

*The Logstash filter plugin "sig" helps you detect security threats in logs in several ways.*

## Features

* Drops known false positives and noisy events (first pass)
* Enriches events from a database and/or sends them to the enrsig plugin for active enrichment (whois, ssl_check, nmap, ...)
* Drops false positives and noisy events a second time (based on the enrichment information)
* Detects new values in fields
* Checks blacklist reputation
* Checks events against IOCs (extracted from MISP)
* Checks signatures, with the following functionality:
  * Rule composition:
    * Name: rule name used in reports
    * ID: identifier used to correlate rules and adjust the score
    * Score (note): score given on a match (the score is used to trigger alerts)
    * Type: two possible types: 'primary signature' and 'second signature'. A second signature matches only if a primary signature matched before it.
    * ModeFP: boolean indicating that the rule matches a false positive
    * extract (optional): extracts data when you are sure it is a threat, and stores it in the local IOC database so it is detected in all subsequent events.
  * Multi-search techniques on a field and its value (one or more techniques can be combined in a single rule):
    * Check whether a field is present or absent
    * Check whether a regexp matches or does not match
    * Check a pattern (Array or String)
    * Compare a field value against another field value (string or numeric -- operators: ==, <, >, !=)
    * Check the size (length) of a string field, with the operators: ==, <, >, !=
    * Check an IP address (ex: 192.168.0.0/24) in a field, with the operators: ==, !=
    * Check a numeric field value, with the operators: ==, !=, <, >
    * Check a date value relative to now + X, with the operators: ==, <, >, !=
    * Check whether a date value falls in a given hour, with the operators: ==, <, >, !=
    * Check whether a date value falls on a given day (day number, ex: 0==Sunday, 1==Monday), with the operators: ==, <, >, !=
  * Check frequency across multiple events
    * Can be used for brute force detection (ex: if 3 authentication errors occur within 60 seconds, then do not re-check before 3600 seconds)
    * Correlate multiple sources sharing the same field value (ex: ip) across different events (ex: squid event IP DST == sysmon event IP DST)
  * Check frequency on an event
* Checks events against reference data (requires building a reference database with ES while it contains clean data) (this is the new version of my project AEE [https://github.com/lprat/AEE])
  * Checks size, checks the value format with regexps, checks whether the value is unique or belongs to a determined list (ex: not @timestamp, because it changes every time)
  * Checks links/relationships between fields whose values are not single/determined-list values (idea inspired by the tool PicViz [http://picviz.com/]). Example: in Apache logs, the page test.php always returns 200; the link/relationship value/field is then "uri_page(test.php)<->return_code(200)"
* Analyzes the matched rules to adapt the alert score
* Fingerprints events according to rules, to identify unique events & drop fingerprints (for false-positive handling)
* Checks frequency on specific events selected by filters. No alert is created on the specific event itself; a new event is created instead.

## Requirements

This plugin uses simhash to find near-duplicate ("around") results, and for future check and correlation possibilities.

**!!!! You must install simhash under logstash, following these instructions:**

1. curl -sSL https://get.rvm.io | bash && /usr/local/rvm/bin/rvm install 1.9.3-dev
2. In vendor/jruby/lib/ruby/shared/mkmf.rb, add at line 45:
   * RbConfig::MAKEFILE_CONFIG["CPPFLAGS"] += ' -I/usr/local/rvm/rubies/ruby-1.9.3-p551-dev/include/ruby-1.9.1/x86_64-linux/'
   * RbConfig::MAKEFILE_CONFIG['includedir'] = "/usr/local/rvm/rubies/ruby-1.9.3-p551-dev/include/ruby-1.9.1/"
3. env GEM_HOME=/usr/share/logstash/vendor/bundle/jruby/1.9 JRUBY_OPTS='-Xcext.enabled=true' /usr/share/logstash/vendor/jruby/bin/jruby /usr/share/logstash/vendor/bundle/jruby/1.9/bin/bundle install
4. env GEM_HOME=/usr/share/logstash/vendor/bundle/jruby/1.9 JRUBY_OPTS='-Xcext.enabled=true' /usr/share/logstash/vendor/jruby/bin/jruby /usr/share/logstash/vendor/jruby/bin/gem build logstash-filter-sig.gemspec
5. /usr/share/logstash/bin/logstash-plugin install logstash-filter-sig-3.0.0.gem


## Let's start with docker
*The Dockerfile builds a container with the latest Logstash and installs the plugins sig, enrsig and fir. If you need other plugins, edit the Dockerfile before running docker-compose.*

Enter the directory "docker" and edit the file "docker-compose.yml":
* volumes: change the volume source (on the host) to your Logstash configuration path

Before running docker-compose, verify that the Logstash configuration is valid, and that the plugin configuration is valid too (the sample configurations in the plugin directories can help you).

Run docker-compose

## Main Configuration (logstash-filter.conf)
**Refresh DB: the plugin relies on several configuration/database files, which are reloaded every hour (default). You can therefore manage the config/db files with git updates...**

Configuration of each feature:
* Disabling checks: used to disable feature checks
  * no_check => "sig_no_apply_all": if an event contains a field named "sig_no_apply_all", all checks are disabled for it
  * disable_drop => false: set to true to disable the "drop" feature
  * disable_enr => false: set to true to disable the "enrichment" feature
  * disable_fp => false: set to true to disable the "fingerprint & drop fingerprint" feature
  * disable_nv => false: set to true to disable the "new value" feature
  * disable_bl => false: set to true to disable the "blacklist" feature
  * disable_ioc => false: set to true to disable the "ioc" feature
  * disable_sig => false: set to true to disable the "signature" feature
  * disable_ref => false: set to true to disable the "reference" feature
  * disable_freq => false: set to true to disable the "frequency" feature
  * disable_note => false: set to true to disable the "score" feature

* Drop feature: drops false positives and noisy events (runs before and after the enrichment feature)
  * noapply_sig_dropdb => "sig_no_apply_dropdb": if an event contains a field named "sig_no_apply_dropdb", this check is disabled for it
  * db_drop => "/etc/logstash/db/drop-db.json": path of the file drop-db.json (see below for more information)
  * refresh_interval_dropdb => 3600: interval (in seconds) between reloads of db_drop

* Enrichment feature: enriches events from a database and/or sends them to the enrsig plugin for active enrichment (whois, ssl_check, nmap, ...)
  * noapply_sig_enr => "sig_no_apply_enr": if an event contains a field named "sig_no_apply_enr", this check is disabled for it
  * conf_enr => "/etc/logstash/db/enr.json": path of the file enr.json (see below for more information)
  * refresh_interval_enr => 3600: interval (in seconds) between reloads of "enr"
  * field_enr => "request_enrichiment": name of the field where the asks for the logstash enrsig plugin (active check) are added
96 |   * enr_tag_response => "ENR_RETURN_TO_JOHN" : tag added to the event to identify the origin of the request, so the result can be sent back to the right server
97 |
98 | * New value feature: checks for new values in fields
99 |   * conf_nv => "/etc/logstash/db/new.json" : path of the file new.json (see below for more information)
100 |   * db_nv => "/etc/logstash/db/new-save.json" : path of the file new-save.json (see below for more information)
101 |   * noapply_sig_nv => "sig_no_apply_nv" : if an event contains a field named "sig_no_apply_nv", this check is disabled for it
102 |   * refresh_interval_confnv => 3600 : interval (in seconds) at which "conf_nv" is reloaded
103 |   * save_interval_dbnv => 3600 : interval (in seconds) at which "db_nv" is saved
104 |   * target_nv => "new_value_" : prefix of the field name created when a new value is detected
105 |
106 | * BL (blacklist) REPUTATION feature: checks IP reputation
107 |   * conf_bl => "/etc/logstash/db/bl_conf.json" : path of the file bl_conf.json (see below for more information)
108 |   * file_bl => [Array type] ["/etc/logstash/db/firehol_level1.netset","/etc/logstash/db/firehol_level2.netset","/etc/logstash/db/firehol_level3.netset","/etc/logstash/db/firehol_level4.netset","/etc/logstash/db/firehol_webserver.netset","/etc/logstash/db/firehol_webclient.netset","/etc/logstash/db/firehol_abusers_30d.netset","/etc/logstash/db/firehol_anonymous.netset","/etc/logstash/db/firehol_proxies.netset"] : paths of the files containing the IP reputation lists
109 |     * You can use the firehol blacklists: https://raw.githubusercontent.com/firehol/blocklist-ipsets/master/firehol_level1.netset,https://raw.githubusercontent.com/firehol/blocklist-ipsets/master/firehol_level2.netset,https://raw.githubusercontent.com/firehol/blocklist-ipsets/master/firehol_level3.netset,https://raw.githubusercontent.com/firehol/blocklist-ipsets/master/firehol_level4.netset,https://raw.githubusercontent.com/firehol/blocklist-ipsets/master/firehol_webserver.netset,https://raw.githubusercontent.com/firehol/blocklist-ipsets/master/firehol_webclient.netset,https://raw.githubusercontent.com/firehol/blocklist-ipsets/master/firehol_abusers_30d.netset,https://raw.githubusercontent.com/firehol/blocklist-ipsets/master/firehol_anonymous.netset,https://raw.githubusercontent.com/firehol/blocklist-ipsets/master/firehol_proxies.netset
110 |   * noapply_sig_bl => "sig_no_apply_bl" : if an event contains a field named "sig_no_apply_bl", this check is disabled for it
111 |   * refresh_interval_confbl => 3600 : interval (in seconds) at which conf_bl & db_bl (the BL files) are reloaded
112 |   * targetname_bl => "bl_detected_category" : name of the field where the category is saved when an IP reputation match is found
113 |
114 | * IOC feature: checks IOCs (IP/URL/HASH/EMAIL/...)
115 |   * db_ioc => ["/etc/logstash/db/ioc.json", "/etc/logstash/db/ioc_local.json"] : array containing the paths of the db files (ioc_local.json => created by the signature feature [file_save_localioc]; ioc.json) (see below for more information)
116 |   * conf_ioc => "/etc/logstash/db/ioc_conf.json" : path of the file ioc_conf.json (see below for more information)
117 |   * target_ioc => "ioc_detected" : name of the field where detected IOCs are saved
118 |   * targetnum_ioc => "ioc_detected_count" : name of the field where the count of detected IOCs is saved
119 |   * targetname_ioc => "ioc_detected_name" : name of the field where the names of detected IOCs are saved
120 |   * refresh_interval_dbioc => 3600 : interval (in seconds) at which conf_ioc & db_ioc are reloaded
121 |   * noapply_ioc => "sig_no_apply_ioc" : if an event contains a field named "sig_no_apply_ioc", this check is disabled for it
122 |
123 | * Signature feature: checks signatures
124 |   * conf_rules_sig => "/etc/logstash/db/sig.json" : path of the file sig.json (see below for more information)
125 |   * file_save_localioc => "/etc/logstash/db/ioc_local.json" : path of the file ioc_local.json (see below for more information)
126 |   * target_sig => "sig_detected" : name of the field where detected rules are saved
127 |   * targetnum_sig => "sig_detected_count" : name of the field where the count of detected rules is saved
128 |   * targetname_sig => "sig_detected_name" : name of the field where the names of detected rules are saved
129 |   * refresh_interval_confrules => 3600 : interval (in seconds) at which file_save_localioc & conf_rules_sig are reloaded
130 |   * noapply_sig_rules => "sig_no_apply_rules" : if an event contains a field named "sig_no_apply_rules", this check is disabled for it
131 |   * check_stop => false : set to true if you want checking to stop after the first match
132 |
133 | * REFERENCE (formerly ANOMALY) feature: checks events against reference data
134 |   * conf_ref => "/etc/logstash/db/conf_ref.json" : path of the file conf_ref.json (see below for more information)
135 |   * db_ref => "/etc/logstash/db/reference.json" : path of the file reference.json (see below for more information)
136 |   * db_pattern => "/etc/logstash/db/pattern.db" : path of the file pattern.db (see below for more information)
137 |   * refresh_interval_dbref => 3600 : interval (in seconds) at which db_ref & db_pattern & conf_ref are reloaded
138 |   * noapply_ref => "sig_no_apply_ref" : if an event contains a field named "sig_no_apply_ref", this check is disabled for it
139 |   * target_ref => "ref_detected" : name of the field where detected differences between the event and the reference are saved
140 |   * targetnum_ref => "ref_detected_count" : name of the field where the count of detected differences between the event and the reference is saved
141 |   * targetname_ref => "ref_detected_name" : name of the field where the names of detected differences between the event and the reference are saved
142 |   * ref_aroundfloat => 0.5 : rounding step applied to the score when it is not an integer (float result)
143 |   * ref_stop_after_firstffind => true : set to false if you want checking to continue after the first difference is found
144 |
145 | * Score feature: adjusts the score when an event is matched by several features
146 |   * targetnote => "sig_detected_note" : name of the field where the score provided by the features IOC/SIG/REF/BL... is saved
147 |   * targetid => "sig_detected_id" : name of the field where the rule IDs provided by the features IOC/SIG/REF/BL... are saved
148 |   * conf_rules_note => "/etc/logstash/db/note.json" : path of the file note.json (see below for more information)
149 |
150 | * Fingerprint feature: limits the number of alerts sent. On first detection, a fingerprint value is computed and the event is tagged 'first'.
All subsequent events with the same fingerprint are tagged 'info' (complementary information).
151 |   * noapply_sig_dropfp => "sig_no_apply_dropfp" : if an event contains a field named "sig_no_apply_dropfp", this check is disabled for it
152 |   * conf_fp => "/etc/logstash/db/fingerprint_conf.json" : path of the file fingerprint_conf.json (see below for more information)
153 |   * db_dropfp => "/etc/logstash/db/drop-fp.json" : path of the file drop-fp.json (see below for more information)
154 |   * select_fp => "tags" : name of the field used to select/filter the event type, in relation to fingerprint_conf.json. Example: event['tags']="squid" --> (fingerprint_conf.json ->) {"squid":{"fields":[....],...}}
155 |   * target_fp => "fingerprint" : name of the field where the fingerprint value is saved
156 |   * tag_name_first => "first_alert" : tag value for the first alert event
157 |   * tag_name_after => "info_comp" : tag value for everything after the first alert
158 |   * target_tag_fp => "tags" : name of the field where the tag value is saved
159 |   * refresh_interval_conffp => 3600 : interval (in seconds) at which db_dropfp and conf_fp are reloaded
160 |
161 | * FREQUENCY feature: detects abnormal frequency increases in the event flow
162 |   * conf_freq => "/etc/logstash/db/conf_freq.json" : path of the file conf_freq.json (see below for more information)
163 |   * refresh_interval_freqrules => 3600 : interval (in seconds) at which conf_freq is reloaded
164 |   * noapply_freq => "sig_no_apply_freq" : if an event contains a field named "sig_no_apply_freq", this check is disabled for it
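As announced at the top of this section, here is a minimal logstash-filter.conf sketch wiring up a few of the options above. The paths are the documented defaults; any option you omit keeps its default value, and this is only a starting point, not a complete configuration:

```
filter {
  sig {
    # drop feature: remove noise before any other check
    db_drop => "/etc/logstash/db/drop-db.json"
    # signature feature, with local IOC extraction
    conf_rules_sig => "/etc/logstash/db/sig.json"
    file_save_localioc => "/etc/logstash/db/ioc_local.json"
    # ioc feature
    db_ioc => ["/etc/logstash/db/ioc.json", "/etc/logstash/db/ioc_local.json"]
    conf_ioc => "/etc/logstash/db/ioc_conf.json"
    # turn off the features you do not use yet
    disable_ref => true
    disable_freq => true
  }
}
```
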
165 |
166 | ## Files Configuration
167 | **See the folders conf-samples and scripts-create-db**
168 |
169 | ### DROP feature
170 | #### drop-db.json
171 | The file drop-db.json contains the rules for dropping noise/false-positive events.
172 |
173 | ```json
174 | {"dst_domain": "^google.com$|^mydomain.ext$", "dst_ip": "10.0.0.\\d+"}
175 | ```
176 |
177 | This configuration drops every event whose 'dst_domain' is 'google.com' or 'mydomain.ext', as well as events whose 'dst_ip' is in '10.0.0.0/24'.
178 |
179 | Each JSON key is the name of a field to check in the event (event['field']), and its value is a regexp applied to that field. If the regexp matches, the event is dropped.
180 |
181 | ### Enrichment feature
182 | #### enr.json
183 |
184 | You have 2 choices:
185 | - use a local database (passive enrichment);
186 | - use active enrichment (command, whois, ssl check, ...) with enrsig. Run enrsig on another server to avoid slowing Logstash down. In the global configuration, if the field "enr_tag_response" or "field_enr" exists, the event goes directly to the output and is sent to the enrsig server, which sends the result back to your Logstash input.
187 | ```json
188 | {"1":
189 |  {
190 |   "file":"/etc/logstash/db/whois.json",
191 |   "db": {},
192 |   "prefix": "whois_info_", "filters": {"type": "squid", "src_ip": "^192\\.168\\.0\\.\\\\d+$"}, "link": ["domain_dst"], "if_empty": "WHOIS", "form_in_db": "$1$", "filter_insert": [], "filter_noinsert": []
193 |  }
194 | }
195 | ```
196 |
197 | This configuration has one rule, with ID "1". The rule loads a local database (whois.json) into the "db" key.
198 | If an event matches "filters" (type == squid and the src_ip field matches the regexp "^192\\.168\\.0\\.\\\\d+$"), the plugin checks "db": if the value of the field named in "link" exists (the value of event['domain_dst'] exists in "db"), the information is added to event fields prefixed with "prefix" ("whois_info_");
199 | otherwise the request is sent to the enrsig plugin (the global Logstash configuration redirects the request to the Logstash server running enrsig).
200 |
201 | ### New Value feature
202 | #### new-save.json
203 | This file is auto-generated by the plugin, but you must create it containing '{}' before running the plugin for the first time.
204 | The plugin saves the extracted information in this file so it can be reloaded after a restart.
205 |
206 | If you want to start over from scratch, just remove the file and recreate it with empty JSON '{}'.
207 |
208 | #### new.json
209 | The file contains a key "rules"; its value lists the fields to monitor for new values.
210 |
211 |
212 | ```json
213 | {"rules": ["dst_host","user_agent"]}
214 | ```
215 | This configuration checks for new values in the fields "dst_host" and "user_agent".
216 |
217 | ### IP REPUTATION
218 | #### bl_conf.json
219 | Each top-level key of the JSON names the field checked against the IP reputation lists (the paths of the db files are given in 'dbs').
220 |
221 | ```json
222 | {"fieldx": {"dbs": ["file_name","file_name2"], "id": "180XX", "note": 1, "category": "malware"}}
223 | ```
224 | * id: must be unique (ID of the rule).
225 | * note (score): between 1 and 4.
226 | * category: the category covered by the dbs files (malware, webserver attack, proxies, ...)
227 | * dbs: paths of the db files (each must also appear in the main option 'file_bl' in "logstash-filter.conf")
228 |
229 | ### IOC
230 | #### ioc_conf.json
231 | This file contains the rules for checking IOCs in events.
232 |
233 | The whole rule set is a hash; one IOC type (ex: URL) is configured by 4 keys:
234 | * First: the key is the IOC name in the DB, and the value lists the event field names to check for this IOC
235 |   * ex: "ioc_hostname":["_host"] => checks hostname IOCs on every field whose name contains *_host* (the wildcard means all matching field names are checked, for example event['dst_hostname'])
236 | * Second: the key is the first key with '_downcase' appended; the value is true or false. True compares IOCs case-insensitively (AbC == abc), false case-sensitively (AbC != abc)
237 | * Third: the key is the first key with '_iocnote' appended; the value is the score assigned when the IOC is detected.
238 | * Fourth: the key is the first key with '_iocid' appended; the value is the rule ID, used later by the SCORE feature for example.
239 |
240 | ```json
241 | {
242 | "ioc_hostname":["_host"], "ioc_hostname_downcase":true, "ioc_hostname_iocnote":2, "ioc_hostname_iocid":1001,
243 | "ioc_domain":["_domain"], "ioc_domain_downcase":true, "ioc_domain_iocnote":2, "ioc_domain_iocid":1002,
244 | "ioc_ip":["_ip"], "ioc_ip_downcase":false, "ioc_ip_iocnote":1, "ioc_ip_iocid":1003,
245 | "ioc_emailaddr":["_emailaddr"], "ioc_emailaddr_downcase":true, "ioc_emailaddr_iocnote":3, "ioc_emailaddr_iocid":1004,
246 | "ioc_user-agent":["user_agent"], "ioc_user-agent_downcase":false, "ioc_user-agent_iocnote":2, "ioc_user-agent_iocid":1005,
247 | "ioc_uri":["_url","_request","_uripath_global"], "ioc_uri_downcase":false, "ioc_uri_iocnote":2, "ioc_uri_iocid":1006,
248 | "ioc_attachment":["attachment","_uriparam","_uripage"], "ioc_attachment_downcase":false, "ioc_attachment_iocnote":1, "ioc_attachment_iocid":1007
249 | }
250 | ```
251 |
252 | #### ioc_local.json
253 | This file is generated by the plugin, via the signature feature's 'extract' parameter.
254 | **Before the first start, create the file empty (echo '{}' > ioc_local.json)**
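As a sketch of how a signature can feed this file through its 'extract' parameter (documented in the SIGNATURES section below): the hypothetical rule here stores the dst_ip of any event matching a bad-domain motif as an 'ioc_ip' entry. The field names and values are illustrative, not from the original samples:

```json
{"rules":[
{"dst_domain":{"motif":["bad-domain.ext"],"type":1,"note":3,"name":"Known bad domain, harvest contacted IP","id":100,"extract":{"dst_ip":"ioc_ip"}}}
]}
```
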
255 |
256 | #### Script to generate ioc.json
257 | Use the script ioc_create.sh to generate the ioc.json file (in the path "/etc/logstash/db/") from a MISP database.
258 | **Requires PyMISP (https://github.com/MISP/PyMISP), wget (to download the Alexa db), misp2json4ioc.py (included in the scripts folder), and blacklist.json (included in conf-samples)**
259 |
260 | ##### blacklist.json
261 | Drops IOCs that trigger false positives.
262 |
263 | ```json
264 | {
265 | "ioc_ip":["(127\\.[0-9]+\\.[0-9]+\\.[0-9]+|10\\.\\d+\\.\\d+\\.\\d+|192\\.168\\.\\d+\\.\\d+|172\\.([1-2][0-9]|0|30|31)\\.\\d+\\.\\d+|255\\.255\\.255\\.\\d+)"],
266 | "email-attachment":[],
267 | "ioc_attachment":["2"],
268 | "ioc_emailaddr":[],
269 | "ioc_uri":["\/"],
270 | "ioc_domain":[],
271 | "ioc_hostname":[],
272 | "ioc_user-agent":[],
273 | "ioc_email-subject":[],
274 | "ioc_as":[]
275 | }
276 | ```
277 |
278 | ### SIGNATURES
279 | #### sig.json
280 | The sig.json file contains the signature rules checked against events.
281 |
282 | The first key is 'rules'; its value is an array containing all signatures.
283 | Each signature is a hash composed of several key/value pairs:
284 | * Level 1: every first-level key is the name of a field to check ({fieldX:{},fieldY:{},...})
285 |   * Exactly one field's hash must also carry the signature metadata. The metadata key/value pairs are:
286 |     * "id": (Integer) unique ID number of the signature
287 |     * "name": (String) name of the signature
288 |     * "type": (Integer 1 or 2) 1 to check the signature on the event without prerequisites; 2 to check the event only if another signature matched before.
289 |     * "note": (Integer) score of the signature
290 |     * "modeFP": (Boolean) if true and the signature matches, the event is dropped (false-positive mode)
291 |     * "extract": (Hash -- optional) (ex: {"field": "ioc_x"}) extracts the value of the field given as the hash key and stores it in the local IOC database under the 'ioc_X' entry named in "extract".
292 |   * Frequency & correlation checks across different events (a sketch using these keys follows the example below):
293 |     * "freq_field": (Array) names of the event fields that relate one event to another
294 |     * "freq_delay": (Integer / seconds) time allowed between the first and the last event (if freq_count == 3, then between event 1 and event 3)
295 |     * "freq_count": (Integer) number of events that must be seen for a match
296 |     * "freq_resettime": (Integer / seconds) time to wait before searching for a new frequency match once one has been detected
297 |     * "correlate_change_fieldvalue": (Array) field names; checks that the value of each listed field differs between the matched events
298 |   * The different check methods:
299 |     * "motif": (Array) all the motifs to check in the selected field.
300 |     * "false": (empty Hash) add the key "false" with an empty hash value to verify that the field does not exist
301 |     * "regexp": (Array) (ex: ["^\\d+$","..."]) regexps; every regexp must match for the check to pass.
302 |     * "notregexp": (Array) regexps; no regexp may match for the check to pass.
303 | * "date": (Hash) (syntax: {'egal'|'inf'|'sup'|'diff': x in second}) value contains operator and time in second, check if date is (time.now)-value of time is validate by operator (<,>,!=,==) 304 | * "hour": (Hash) (syntax: {'egal'|'inf'|'sup'|'diff': 0 to 23}) value contains operator and hour range, check if current hour is valid operator (<,>,!=,==) compared to hour indicated 305 | * "day": (Hash) (syntax: {'egal'|'inf'|'sup'|'diff': 0 to 6}) value contains operator and day range, check if current day is day of value with operator (<,>,!=,==) compared to day indicated 306 | * "ipaddr": (Hash) (syntax: {'egal'|'diff']: ipaddr or subnet}) value contains operator and ipaddr range, check if ipaddr in field event is valid operator (equalf or different) compared to ipaddr range indicated 307 | * "sizeope" : (Hash) (syntax: {['egal'|'inf'|'sup'|'diff']: x}) value contains operator and length(x), check size of string contained in field selected, and compare according by operator selected with the value length. 308 | * "numop" : (Hash) (syntax: {['egal'|'inf'|'sup'|'diff']: x}) value contains operator and integer value (x), check interger contained in field selected, and compare according by operator selected with the integer value. 309 | * "compope": (Hash) (syntax: {"fieldz": {['egal'|'inf'|'sup'|'diff']: nil}}) value contains other name field compare, operator, compare field and fieldz and check operator if valid. 310 | 311 | ```json 312 | {"rules":[ 313 | {"type":{"motif":["squid"],"type":1,"note":1,"name":"Field User-AGENT not present","id":1},"user_agent":{"false":{}}}, 314 | {"new_value_dst_host":{"sizeope":{"sup":1},"type":1,"note":1,"name":"New value dst_host","id":2},"type":{"motif":["squid"]}}, 315 | {"elapsed_time":{"numope":{"sup":900000},"type":1,"note":2,"name":"Connection time too long > 15minutes","id":3}}, 316 | {"type":{"motif":["squid"],"type":2,"note":2,"name":"Referer FIELD not present","id":4},"uri_proto":{"notregexp":["tunnel"]},"referer_host":{"false":{}}} 317 | ]} 318 | ``` 319 | 320 | ### REFERENCE (OLD ANOMALIE) 321 | #### conf_ref.json 322 | The conf_ref.json file contains rules to compare event and reference database. 323 | 324 | The first key name is 'rules', the value is an array contains all rule. 325 | Run script for generate reference database before use this feature (**when you generate database reference use clean data or/and verify configuration generated!!**). 326 | A rule is composed of several pair of key/value: 327 | * Key "pivot_field" : filter for select event to check 328 | * value is a hash with key as field name and value is an array contains value present in event field selected. 329 | * Key "list_sig" : value is an array contains all fields name selected for compare with reference database. If some fields not present in some case, it's doesn't matter. 330 | * Key "relation_min" : value is integer, verify than relationship simhash exist and is supperior to "relation_min". 331 | * Key "simhash_size" : value is integer, make size of simhash... (change according by data size to simhash) 332 | * Key "simhash_use_size" (Not works, i will work on!) 
333 | * Key "id" : valud is ID of rule 334 | 335 | ```json 336 | {"rules":[ 337 | {"pivot_field":{"tags":["squid"]}, "list_sig": ["src_host","src_ip","dst_host","dst_ip","uri_proto","uri_global"], "relation_min": 10, "simhash_size": 32, "simhash_use_size": 32, "id": 2001} 338 | ]} 339 | ``` 340 | 341 | #### Create reference database (reference.json) 342 | Generate database reference (reference.json file) with script include in scripts folder. 343 | Run script with syntaxe: ./create.rb conf_ref.json pattern.db https://user:secret@localhost:9200 344 | For make good databases, use elasticsearch contains clean data log and verify database for change strange value. 345 | 346 | ##### note_ref_defaut.json 347 | This file contains score by default for each rule matched. 348 | The names keys contains suffix "NOTE" and name of verification method, the value fix note for method matched. 349 | Only a key is different, "NOTE_UNIQ_REDUC" can reduce score when event is "unique". By example if matched LEN method and if "uniq" matched then score value is not 0.25 but 0.25-0.1 => 0.15 (according by configuration below). 350 | 351 | ```json 352 | { 353 | 'NOTE_UNIQ_REDUC': 0.1, 354 | 'NOTE_DEFAULT': 2, 355 | 'NOTE_LISTV': 0.25, 356 | 'NOTE_ENCODING': 0.25, 357 | 'NOTE_LEN': 0.25, 358 | 'NOTE_LEN_AVG': 0.25, 359 | 'NOTE_LEN_EVEN': 0.25, 360 | 'NOTE_REGEXP': 0.25, 361 | 'NOTE_REGEXP_MIN': 0.25 362 | } 363 | ``` 364 | 365 | ##### pattern.db 366 | 367 | This file contains regexp for check format of field value. 368 | 369 | ```json 370 | ALPHA_MAJU=>>[A-Z] 371 | ALPHA_MINU=>>[a-z] 372 | NUM_1to9=>>[1-9] 373 | NUM_0to9=>>[0-9] 374 | ALPHA_MAJandMIN=>>[A-Za-z] 375 | HEXA=>>(0x|x|\\x)[0-9A-Fa-f][0-9A-Fa-f] 376 | CHAR_SPE_NUL=>>\x00 377 | CHAR_SPE_SOH=>>\x01 378 | CHAR_SPE_STX=>>\x02 379 | CHAR_SPE_ETX=>>\x03 380 | CHAR_SPE_EOT=>>\x04 381 | CHAR_SPE_ENQ=>>\x05 382 | CHAR_SPE_ACK=>>\x06 383 | CHAR_SPE_BEL=>>\x07 384 | CHAR_SPE_BS=>>\x08 385 | CHAR_SPE_HT=>>\x09 386 | CHAR_SPE_LF=>>\x0A 387 | CHAR_SPE_VT=>>\x0B 388 | CHAR_SPE_FF=>>\x0C 389 | CHAR_SPE_CR=>>\x0D 390 | CHAR_SPE_SO=>>\x0E 391 | CHAR_SPE_SI=>>\x0F 392 | CHAR_SPE_DLE=>>\x10 393 | CHAR_SPE_DC1=>>\x11 394 | CHAR_SPE_DC2=>>\x12 395 | CHAR_SPE_DC3=>>\x13 396 | CHAR_SPE_DC4=>>\x14 397 | CHAR_SPE_NAK=>>\x15 398 | CHAR_SPE_SYN=>>\x16 399 | CHAR_SPE_ETB=>>\x17 400 | CHAR_SPE_CAN=>>\x18 401 | CHAR_SPE_EM=>>\x19 402 | CHAR_SPE_SUB=>>\x1A 403 | CHAR_SPE_ESC=>>\x1B 404 | CHAR_SPE_FS=>>\x1C 405 | CHAR_SPE_GS=>>\x1D 406 | CHAR_SPE_RS=>>\x1E 407 | CHAR_SPE_US=>>\x1F 408 | CHAR_SPE_SP=>>\x20 409 | CHAR_SPE_EXCL=>>\x21 410 | CHAR_SPE_QUOTE=>>\x22 411 | CHAR_SPE_DIEZ=>>\x23 412 | CHAR_SPE_DOLLAR=>>\x24 413 | CHAR_SPE_POURC=>>\x25 414 | CHAR_SPE_AND=>>\x26 415 | CHAR_SPE_QUOTE2=>>\x27 416 | CHAR_SPE_DPARA=>>\x28 417 | CHAR_SPE_FPARA=>>\x29 418 | CHAR_SPE_ETOI=>>\x2A 419 | CHAR_SPE_PLUS=>>\x2B 420 | CHAR_SPE_VIRG=>>\x2C 421 | CHAR_SPE_MOINS=>>\x2D 422 | CHAR_SPE_POINT=>>\x2E 423 | CHAR_SPE_SLASH=>>\x2F 424 | CHAR_SPE_2POINT=>>\x3A 425 | CHAR_SPE_POINTVIRG=>>\x3B 426 | CHAR_SPE_DBALIZ=>>\x3C 427 | CHAR_SPE_EGAL=>>\x3D 428 | CHAR_SPE_FBALIZ=>>\x3E 429 | CHAR_SPE_INTER=>>\x3F 430 | CHAR_SPE_AROB=>>\x40 431 | CHAR_SPE_DCROCH=>>\x5B 432 | CHAR_SPE_ASLASH=>>\x5C 433 | CHAR_SPE_DCROCH=>>\x5D 434 | CHAR_SPE_CHAP=>>\x5E 435 | CHAR_SPE_UNDERS=>>\x5F 436 | CHAR_SPE_QUOTE3=>>\x60 437 | CHAR_SPE_DACCOL=>>\x7B 438 | CHAR_SPE_OR=>>\x7C 439 | CHAR_SPE_FACCOL=>>\x7D 440 | CHAR_SPE_TILD=>>\x7E 441 | CHAR_SPE_DEL=>>\x7F 442 | CHAR_ETEND_80=>>\x80 443 | CHAR_ETEND_81=>>\x81 444 | 
CHAR_ETEND_82=>>\x82 445 | CHAR_ETEND_83=>>\x83 446 | CHAR_ETEND_84=>>\x84 447 | CHAR_ETEND_85=>>\x85 448 | CHAR_ETEND_86=>>\x86 449 | CHAR_ETEND_87=>>\x87 450 | CHAR_ETEND_88=>>\x88 451 | CHAR_ETEND_89=>>\x89 452 | CHAR_ETEND_8A=>>\x8A 453 | CHAR_ETEND_8B=>>\x8B 454 | CHAR_ETEND_8C=>>\x8C 455 | CHAR_ETEND_8D=>>\x8D 456 | CHAR_ETEND_8E=>>\x8E 457 | CHAR_ETEND_8F=>>\x8F 458 | CHAR_ETEND_90=>>\x90 459 | CHAR_ETEND_91=>>\x91 460 | CHAR_ETEND_92=>>\x92 461 | CHAR_ETEND_93=>>\x93 462 | CHAR_ETEND_94=>>\x94 463 | CHAR_ETEND_95=>>\x95 464 | CHAR_ETEND_96=>>\x96 465 | CHAR_ETEND_97=>>\x97 466 | CHAR_ETEND_98=>>\x98 467 | CHAR_ETEND_99=>>\x99 468 | CHAR_ETEND_9A=>>\x9A 469 | CHAR_ETEND_9B=>>\x9B 470 | CHAR_ETEND_9C=>>\x9C 471 | CHAR_ETEND_9D=>>\x9D 472 | CHAR_ETEND_9E=>>\x9E 473 | CHAR_ETEND_9F=>>\x9F 474 | CHAR_ETEND_A0=>>\xA0 475 | CHAR_ETEND_A1=>>\xA1 476 | CHAR_ETEND_A2=>>\xA2 477 | CHAR_ETEND_A3=>>\xA3 478 | CHAR_ETEND_A4=>>\xA4 479 | CHAR_ETEND_A5=>>\xA5 480 | CHAR_ETEND_A6=>>\xA6 481 | CHAR_ETEND_A7=>>\xA7 482 | CHAR_ETEND_A8=>>\xA8 483 | CHAR_ETEND_A9=>>\xA9 484 | CHAR_ETEND_AA=>>\xAA 485 | CHAR_ETEND_AB=>>\xAB 486 | CHAR_ETEND_AC=>>\xAC 487 | CHAR_ETEND_AD=>>\xAD 488 | CHAR_ETEND_AE=>>\xAE 489 | CHAR_ETEND_PD=>>\xAF 490 | CHAR_ETEND_B0=>>\xB0 491 | CHAR_ETEND_B1=>>\xB1 492 | CHAR_ETEND_B2=>>\xB2 493 | CHAR_ETEND_B3=>>\xB3 494 | CHAR_ETEND_B4=>>\xB4 495 | CHAR_ETEND_B5=>>\xB5 496 | CHAR_ETEND_B6=>>\xB6 497 | CHAR_ETEND_B7=>>\xB7 498 | CHAR_ETEND_B8=>>\xB8 499 | CHAR_ETEND_B9=>>\xB9 500 | CHAR_ETEND_BA=>>\xBA 501 | CHAR_ETEND_BB=>>\xBB 502 | CHAR_ETEND_BC=>>\xBC 503 | CHAR_ETEND_BD=>>\xBD 504 | CHAR_ETEND_BE=>>\xBE 505 | CHAR_ETEND_BF=>>\xBF 506 | CHAR_ETEND_C0=>>\xC0 507 | CHAR_ETEND_C1=>>\xC1 508 | CHAR_ETEND_C2=>>\xC2 509 | CHAR_ETEND_C3=>>\xC3 510 | CHAR_ETEND_C4=>>\xC4 511 | CHAR_ETEND_C5=>>\xC5 512 | CHAR_ETEND_C6=>>\xC6 513 | CHAR_ETEND_C7=>>\xC7 514 | CHAR_ETEND_C8=>>\xC8 515 | CHAR_ETEND_C9=>>\xC9 516 | CHAR_ETEND_CA=>>\xCA 517 | CHAR_ETEND_CB=>>\xCB 518 | CHAR_ETEND_CC=>>\xCC 519 | CHAR_ETEND_CD=>>\xCD 520 | CHAR_ETEND_CE=>>\xCE 521 | CHAR_ETEND_CF=>>\xCF 522 | CHAR_ETEND_D0=>>\xD0 523 | CHAR_ETEND_D1=>>\xD1 524 | CHAR_ETEND_D2=>>\xD2 525 | CHAR_ETEND_D3=>>\xD3 526 | CHAR_ETEND_D4=>>\xD4 527 | CHAR_ETEND_D5=>>\xD5 528 | CHAR_ETEND_D6=>>\xD6 529 | CHAR_ETEND_D7=>>\xD7 530 | CHAR_ETEND_D8=>>\xD8 531 | CHAR_ETEND_D9=>>\xD9 532 | CHAR_ETEND_DA=>>\xDA 533 | CHAR_ETEND_DB=>>\xDB 534 | CHAR_ETEND_DC=>>\xDC 535 | CHAR_ETEND_JJ=>>\xDD 536 | CHAR_ETEND_DE=>>\xDE 537 | CHAR_ETEND_D=>>\xDF 538 | CHAR_ETEND_E0=>>\xE0 539 | CHAR_ETEND_E1=>>\xE1 540 | CHAR_ETEND_E2=>>\xE2 541 | CHAR_ETEND_E3=>>\xE3 542 | CHAR_ETEND_E4=>>\xE4 543 | CHAR_ETEND_E5=>>\xE5 544 | CHAR_ETEND_E6=>>\xE6 545 | CHAR_ETEND_E=>>\xE7 546 | CHAR_ETEND_E8=>>\xE8 547 | CHAR_ETEND_E9=>>\xE9 548 | CHAR_ETEND_EA=>>\xEA 549 | CHAR_ETEND_EB=>>\xEB 550 | CHAR_ETEND_EC=>>\xEC 551 | CHAR_ETEND_ED=>>\xED 552 | CHAR_ETEND_EE=>>\xEE 553 | CHAR_ETEND_EF=>>\xEF 554 | CHAR_ETEND_F0=>>\xF0 555 | CHAR_ETEND_F1=>>\xF1 556 | CHAR_ETEND_F2=>>\xF2 557 | CHAR_ETEND_F3=>>\xF3 558 | CHAR_ETEND_F4=>>\xF4 559 | CHAR_ETEND_F5=>>\xF5 560 | CHAR_ETEND_F6=>>\xF6 561 | CHAR_ETEND_F7=>>\xF7 562 | CHAR_ETEND_F8=>>\xF8 563 | CHAR_ETEND_F9=>>\xF9 564 | CHAR_ETEND_FA=>>\xFA 565 | CHAR_ETEND_FB=>>\xFB 566 | CHAR_ETEND_FC=>>\xFC 567 | CHAR_ETEND_FD=>>\xFD 568 | CHAR_ETEND_FE=>>\xFE 569 | CHAR_ETEND_FF=>>\xFF 570 | ``` 571 | 572 | ##### reference.json 573 | This file is generated by script on clean data Elasticsearch. 
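Putting the pieces of this section together, a full invocation with the optional 4th and 5th arguments (index pattern and note file, per the usage string of scripts-create-db/create_ref.rb; the values shown are the documented defaults) would look like:

```sh
./create_ref.rb conf_ref.json pattern.db https://user:secret@localhost:9200 "logstash-*" note_ref_defaut.json
```
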
574 |
575 | ### NOTE
576 | #### note.json
577 | This file (note.json) contains the rules for score correlation; you can reduce or increase the score when several features (IOC/REF/SIG) match.
578 | The JSON file contains the main key 'rules'; its value is an array containing each rule as a hash.
579 | The key/value pairs of a rule are:
580 | * Key 'id': the value is an array of all the IDs that must be matched in the event
581 | * Key 'optid': the value is an array of IDs that may be matched in the event
582 | * Key 'opt_num': the value is an integer giving how many of the 'optid' IDs must be present in the event. In the example below, at least one of the IDs 3 or 4 must be present.
583 | * Key 'noid': the value is an array of IDs that must not be present in the event
584 | * Key 'overwrite': the value is a boolean indicating whether the score may be overwritten in order to reduce it.
585 |
586 | ```json
587 | {"rules":[
588 | {"id":[2],"optid":[3,4],"opt_num":1,"noid":[],"note":3,"overwrite":true}
589 | ]
590 | }
591 | ```
592 |
593 | ### FINGERPRINT
594 | #### fingerprint_conf.json
595 | The file fingerprint_conf.json contains the rules for creating fingerprints and tagging events as "first" or "complementary information".
596 | The first key is the value that must be present in select_fp (main configuration). The value of that key is a hash composed of:
597 | * Key 'fields': an array of the field names used to create the simhash.
598 | * Key 'delay': resets all fingerprints for "fields" once this time is exceeded (useful for DHCP, for example). The value is a number of seconds.
599 | * Key 'hashbit': a number defining the size of the simhash.
600 |
601 | ```json
602 | {
603 | "squid":{"fields":["src_ip","dst_host","dst_ip","uri_proto","sig_detected_name","ioc_detected","tags"],"delay":36000, "hashbit": 32}
604 | }
605 | ```
606 |
607 | #### drop-fp.json
608 | Drops false-positive events. Each JSON key is a simhash and the value is the reason for dropping it.
609 |
610 | ```json
611 | {"821861840": "false positive: update of software XXX"}
612 | ```
613 |
614 | ### FREQUENCY
615 | #### conf_freq.json
616 | The file conf_freq.json contains the rules for building the internal frequency db (reset when you restart Logstash).
617 | The first key is 'rules'; its value is an array containing each rule as a hash.
618 | A rule is a hash composed of:
619 | * Key 'select_field': the value is a hash whose keys are event fields and whose values are arrays of values that must be present in those fields. This parameter filters the events to check.
620 | * Key 'note': score of the rule
621 | * Key 'refresh_time': interval at which the check runs (has the event count increased?)
622 | * Key 'reset_time': interval at which the database value is reset (for this rule only)
623 | * Key 'wait_after_reset': time to wait after a database reset or first start
624 | * Key 'id': the value is a number, the ID of the rule
625 |
626 | ```json
627 | {"rules":[
628 | {"select_field": {"tags":["squid"],"return_code":["404"]}, "note": 2, "refresh_time": 60, "reset_time": 86400, "wait_after_reset": 10, "id": 3001}
629 | ]}
630 | ```
631 |
632 | ## Contributing
633 |
634 |
635 | **You are welcome to contribute (bug reports, new functionality, ...)!**
636 |
637 | **You may run into bugs: I recently ported the plugin to Logstash 5.x!!**
638 |
639 | This is a plugin for [Logstash](https://github.com/elastic/logstash).
640 |
641 | It is fully free and fully open source. The license is Apache 2.0, meaning you are pretty much free to use it however you want in whatever way.
642 | 643 | ## Contact 644 | Lionel PRAT lionel.prat9 (at) gmail.com or cronos56 (at) yahoo.com 645 | -------------------------------------------------------------------------------- /logstash-filter-sig/Rakefile: -------------------------------------------------------------------------------- 1 | require "logstash/devutils/rake" 2 | -------------------------------------------------------------------------------- /logstash-filter-sig/conf-samples/blacklist.json: -------------------------------------------------------------------------------- 1 | { 2 | "ioc_ip":["(127\\.[0-9]+\\.[0-9]+\\.[0-9]+|10\\.\\d+\\.\\d+\\.\\d+|192\\.168\\.\\d+\\.\\d+|172\\.([1-2][0-9]|0|30|31)\\.\\d+\\.\\d+|255\\.255\\.255\\.\\d+)"], 3 | "email-attachment":[], 4 | "ioc_attachment":["2"], 5 | "ioc_emailaddr":[], 6 | "ioc_uri":["\/"], 7 | "ioc_domain":[], 8 | "ioc_hostname":[], 9 | "ioc_user-agent":[], 10 | "ioc_email-subject":[], 11 | "ioc_as":[] 12 | } 13 | 14 | -------------------------------------------------------------------------------- /logstash-filter-sig/conf-samples/conf_bl.json: -------------------------------------------------------------------------------- 1 | { 2 | "src_ip":{"dbs":["firehol_webserver.netset"],"id":"18001","note": 2,"category":"webserver attack"}, 3 | "dst_ip":{"dbs":["firehol_webclient.netset"],"id":"18002","note":2,"category":"malware"} 4 | } 5 | 6 | -------------------------------------------------------------------------------- /logstash-filter-sig/conf-samples/conf_freq.json: -------------------------------------------------------------------------------- 1 | {"rules":[ 2 | {"select_field": {"tags":["squid"],"return_code":["404"]}, "note": 2, "refresh_time": 60, "reset_time": 86400, "wait_after_reset": 10, "id": 3001} 3 | ]} 4 | 5 | -------------------------------------------------------------------------------- /logstash-filter-sig/conf-samples/conf_ref.json: -------------------------------------------------------------------------------- 1 | {"rules":[ 2 | {"pivot_field":{"tags":["squid"]}, "list_sig": ["src_host","src_ip","dst_host","dst_ip","uri_proto","uri_global"], "relation_min": 10, "simhash_size": 32, "simhash_use_size": 32, "id": 2001} 3 | ]} 4 | -------------------------------------------------------------------------------- /logstash-filter-sig/conf-samples/drop-db.json: -------------------------------------------------------------------------------- 1 | {"dst_domain": "^google.com$|^mydomain.ext$", "dst_ip": "10.0.0.\\d+"} 2 | 3 | -------------------------------------------------------------------------------- /logstash-filter-sig/conf-samples/drop-fp.json: -------------------------------------------------------------------------------- 1 | {"821861840": "false positive: update of software XXX"} 2 | -------------------------------------------------------------------------------- /logstash-filter-sig/conf-samples/enr.json: -------------------------------------------------------------------------------- 1 | {"1": 2 | { 3 | "file":"/etc/logstash/db/whois.json", 4 | "db": {}, 5 | "prefix": "whois_info_", "filters": {"type": "squid", "src_ip": "^192\\.168\\.0\\.\\\\d+$"}, "link": ["domain_dst"], "if_empty": "WHOIS", "form_in_db": "$1$", "filter_insert": [], "filter_noinsert": [] 6 | } 7 | } 8 | -------------------------------------------------------------------------------- /logstash-filter-sig/conf-samples/fingerprint_conf.json: -------------------------------------------------------------------------------- 1 | {"squid": {"fields": 
["src_ip","dst_host","dst_ip","uri_proto","tags"], "delay": 3600, "hashbit": 32}} 2 | -------------------------------------------------------------------------------- /logstash-filter-sig/conf-samples/ioc.json: -------------------------------------------------------------------------------- 1 | {} 2 | -------------------------------------------------------------------------------- /logstash-filter-sig/conf-samples/ioc_conf.json: -------------------------------------------------------------------------------- 1 | { 2 | "ioc_hostname":["_host"], "ioc_hostname_downcase":true, "ioc_hostname_iocnote":2, "ioc_hostname_iocid":1001, 3 | "ioc_domain":["_domain"], "ioc_domain_downcase":true, "ioc_domain_iocnote":2, "ioc_domain_iocid":1002, 4 | "ioc_ip":["_ip"], "ioc_ip_downcase":false, "ioc_ip_iocnote":1, "ioc_ip_iocid":1003, 5 | "ioc_emailaddr":["_emailaddr"], "ioc_emailaddr_downcase":true, "ioc_emailaddr_iocnote":3, "ioc_emailaddr_iocid":1004, 6 | "ioc_user-agent":["user_agent"], "ioc_user-agent_downcase":false, "ioc_user-agent_iocnote":2, "ioc_user-agent_iocid":1005, 7 | "ioc_uri":["_url","_request","_uripath_global"], "ioc_uri_downcase":false, "ioc_uri_iocnote":2, "ioc_uri_iocid":1006, 8 | "ioc_attachment":["attachment","_uriparam","_uripage"], "ioc_attachment_downcase":false, "ioc_attachment_iocnote":1, "ioc_attachment_iocid":1007 9 | } 10 | -------------------------------------------------------------------------------- /logstash-filter-sig/conf-samples/ioc_local.json: -------------------------------------------------------------------------------- 1 | {} 2 | -------------------------------------------------------------------------------- /logstash-filter-sig/conf-samples/new-save.json: -------------------------------------------------------------------------------- 1 | {} 2 | -------------------------------------------------------------------------------- /logstash-filter-sig/conf-samples/new.json: -------------------------------------------------------------------------------- 1 | {"rules": ["dst_host","user_agent"]} 2 | -------------------------------------------------------------------------------- /logstash-filter-sig/conf-samples/note.json: -------------------------------------------------------------------------------- 1 | {"rules":[ 2 | {"id":[2],"optid":[3,4],"opt_num":1,"noid":[],"note":3,"overwrite":true} 3 | ] 4 | } 5 | -------------------------------------------------------------------------------- /logstash-filter-sig/conf-samples/note_ref_defaut.json: -------------------------------------------------------------------------------- 1 | { 2 | "NOTE_UNIQ_REDUC": 0.1, 3 | "NOTE_DEFAULT": 2, 4 | "NOTE_LISTV": 0.25, 5 | "NOTE_ENCODING": 0.25, 6 | "NOTE_LEN": 0.25, 7 | "NOTE_LEN_AVG": 0.25, 8 | "NOTE_LEN_EVEN": 0.25, 9 | "NOTE_REGEXP": 0.25, 10 | "NOTE_REGEXP_MIN": 0.25 11 | } 12 | -------------------------------------------------------------------------------- /logstash-filter-sig/conf-samples/pattern.db: -------------------------------------------------------------------------------- 1 | ALPHA_MAJU=>>[A-Z] 2 | ALPHA_MINU=>>[a-z] 3 | NUM_1to9=>>[1-9] 4 | NUM_0to9=>>[0-9] 5 | ALPHA_MAJandMIN=>>[A-Za-z] 6 | HEXA=>>(0x|x|\\x)[0-9A-Fa-f][0-9A-Fa-f] 7 | CHAR_SPE_NUL=>>\x00 8 | CHAR_SPE_SOH=>>\x01 9 | CHAR_SPE_STX=>>\x02 10 | CHAR_SPE_ETX=>>\x03 11 | CHAR_SPE_EOT=>>\x04 12 | CHAR_SPE_ENQ=>>\x05 13 | CHAR_SPE_ACK=>>\x06 14 | CHAR_SPE_BEL=>>\x07 15 | CHAR_SPE_BS=>>\x08 16 | CHAR_SPE_HT=>>\x09 17 | CHAR_SPE_LF=>>\x0A 18 | CHAR_SPE_VT=>>\x0B 19 | CHAR_SPE_FF=>>\x0C 20 | CHAR_SPE_CR=>>\x0D 21 | 
CHAR_SPE_SO=>>\x0E 22 | CHAR_SPE_SI=>>\x0F 23 | CHAR_SPE_DLE=>>\x10 24 | CHAR_SPE_DC1=>>\x11 25 | CHAR_SPE_DC2=>>\x12 26 | CHAR_SPE_DC3=>>\x13 27 | CHAR_SPE_DC4=>>\x14 28 | CHAR_SPE_NAK=>>\x15 29 | CHAR_SPE_SYN=>>\x16 30 | CHAR_SPE_ETB=>>\x17 31 | CHAR_SPE_CAN=>>\x18 32 | CHAR_SPE_EM=>>\x19 33 | CHAR_SPE_SUB=>>\x1A 34 | CHAR_SPE_ESC=>>\x1B 35 | CHAR_SPE_FS=>>\x1C 36 | CHAR_SPE_GS=>>\x1D 37 | CHAR_SPE_RS=>>\x1E 38 | CHAR_SPE_US=>>\x1F 39 | CHAR_SPE_SP=>>\x20 40 | CHAR_SPE_EXCL=>>\x21 41 | CHAR_SPE_QUOTE=>>\x22 42 | CHAR_SPE_DIEZ=>>\x23 43 | CHAR_SPE_DOLLAR=>>\x24 44 | CHAR_SPE_POURC=>>\x25 45 | CHAR_SPE_AND=>>\x26 46 | CHAR_SPE_QUOTE2=>>\x27 47 | CHAR_SPE_DPARA=>>\x28 48 | CHAR_SPE_FPARA=>>\x29 49 | CHAR_SPE_ETOI=>>\x2A 50 | CHAR_SPE_PLUS=>>\x2B 51 | CHAR_SPE_VIRG=>>\x2C 52 | CHAR_SPE_MOINS=>>\x2D 53 | CHAR_SPE_POINT=>>\x2E 54 | CHAR_SPE_SLASH=>>\x2F 55 | CHAR_SPE_2POINT=>>\x3A 56 | CHAR_SPE_POINTVIRG=>>\x3B 57 | CHAR_SPE_DBALIZ=>>\x3C 58 | CHAR_SPE_EGAL=>>\x3D 59 | CHAR_SPE_FBALIZ=>>\x3E 60 | CHAR_SPE_INTER=>>\x3F 61 | CHAR_SPE_AROB=>>\x40 62 | CHAR_SPE_DCROCH=>>\x5B 63 | CHAR_SPE_ASLASH=>>\x5C 64 | CHAR_SPE_DCROCH=>>\x5D 65 | CHAR_SPE_CHAP=>>\x5E 66 | CHAR_SPE_UNDERS=>>\x5F 67 | CHAR_SPE_QUOTE3=>>\x60 68 | CHAR_SPE_DACCOL=>>\x7B 69 | CHAR_SPE_OR=>>\x7C 70 | CHAR_SPE_FACCOL=>>\x7D 71 | CHAR_SPE_TILD=>>\x7E 72 | CHAR_SPE_DEL=>>\x7F 73 | CHAR_ETEND_80=>>\x80 74 | CHAR_ETEND_81=>>\x81 75 | CHAR_ETEND_82=>>\x82 76 | CHAR_ETEND_83=>>\x83 77 | CHAR_ETEND_84=>>\x84 78 | CHAR_ETEND_85=>>\x85 79 | CHAR_ETEND_86=>>\x86 80 | CHAR_ETEND_87=>>\x87 81 | CHAR_ETEND_88=>>\x88 82 | CHAR_ETEND_89=>>\x89 83 | CHAR_ETEND_8A=>>\x8A 84 | CHAR_ETEND_8B=>>\x8B 85 | CHAR_ETEND_8C=>>\x8C 86 | CHAR_ETEND_8D=>>\x8D 87 | CHAR_ETEND_8E=>>\x8E 88 | CHAR_ETEND_8F=>>\x8F 89 | CHAR_ETEND_90=>>\x90 90 | CHAR_ETEND_91=>>\x91 91 | CHAR_ETEND_92=>>\x92 92 | CHAR_ETEND_93=>>\x93 93 | CHAR_ETEND_94=>>\x94 94 | CHAR_ETEND_95=>>\x95 95 | CHAR_ETEND_96=>>\x96 96 | CHAR_ETEND_97=>>\x97 97 | CHAR_ETEND_98=>>\x98 98 | CHAR_ETEND_99=>>\x99 99 | CHAR_ETEND_9A=>>\x9A 100 | CHAR_ETEND_9B=>>\x9B 101 | CHAR_ETEND_9C=>>\x9C 102 | CHAR_ETEND_9D=>>\x9D 103 | CHAR_ETEND_9E=>>\x9E 104 | CHAR_ETEND_9F=>>\x9F 105 | CHAR_ETEND_A0=>>\xA0 106 | CHAR_ETEND_A1=>>\xA1 107 | CHAR_ETEND_A2=>>\xA2 108 | CHAR_ETEND_A3=>>\xA3 109 | CHAR_ETEND_A4=>>\xA4 110 | CHAR_ETEND_A5=>>\xA5 111 | CHAR_ETEND_A6=>>\xA6 112 | CHAR_ETEND_A7=>>\xA7 113 | CHAR_ETEND_A8=>>\xA8 114 | CHAR_ETEND_A9=>>\xA9 115 | CHAR_ETEND_AA=>>\xAA 116 | CHAR_ETEND_AB=>>\xAB 117 | CHAR_ETEND_AC=>>\xAC 118 | CHAR_ETEND_AD=>>\xAD 119 | CHAR_ETEND_AE=>>\xAE 120 | CHAR_ETEND_PD=>>\xAF 121 | CHAR_ETEND_B0=>>\xB0 122 | CHAR_ETEND_B1=>>\xB1 123 | CHAR_ETEND_B2=>>\xB2 124 | CHAR_ETEND_B3=>>\xB3 125 | CHAR_ETEND_B4=>>\xB4 126 | CHAR_ETEND_B5=>>\xB5 127 | CHAR_ETEND_B6=>>\xB6 128 | CHAR_ETEND_B7=>>\xB7 129 | CHAR_ETEND_B8=>>\xB8 130 | CHAR_ETEND_B9=>>\xB9 131 | CHAR_ETEND_BA=>>\xBA 132 | CHAR_ETEND_BB=>>\xBB 133 | CHAR_ETEND_BC=>>\xBC 134 | CHAR_ETEND_BD=>>\xBD 135 | CHAR_ETEND_BE=>>\xBE 136 | CHAR_ETEND_BF=>>\xBF 137 | CHAR_ETEND_C0=>>\xC0 138 | CHAR_ETEND_C1=>>\xC1 139 | CHAR_ETEND_C2=>>\xC2 140 | CHAR_ETEND_C3=>>\xC3 141 | CHAR_ETEND_C4=>>\xC4 142 | CHAR_ETEND_C5=>>\xC5 143 | CHAR_ETEND_C6=>>\xC6 144 | CHAR_ETEND_C7=>>\xC7 145 | CHAR_ETEND_C8=>>\xC8 146 | CHAR_ETEND_C9=>>\xC9 147 | CHAR_ETEND_CA=>>\xCA 148 | CHAR_ETEND_CB=>>\xCB 149 | CHAR_ETEND_CC=>>\xCC 150 | CHAR_ETEND_CD=>>\xCD 151 | CHAR_ETEND_CE=>>\xCE 152 | CHAR_ETEND_CF=>>\xCF 153 | CHAR_ETEND_D0=>>\xD0 154 | CHAR_ETEND_D1=>>\xD1 155 | 
CHAR_ETEND_D2=>>\xD2 156 | CHAR_ETEND_D3=>>\xD3 157 | CHAR_ETEND_D4=>>\xD4 158 | CHAR_ETEND_D5=>>\xD5 159 | CHAR_ETEND_D6=>>\xD6 160 | CHAR_ETEND_D7=>>\xD7 161 | CHAR_ETEND_D8=>>\xD8 162 | CHAR_ETEND_D9=>>\xD9 163 | CHAR_ETEND_DA=>>\xDA 164 | CHAR_ETEND_DB=>>\xDB 165 | CHAR_ETEND_DC=>>\xDC 166 | CHAR_ETEND_JJ=>>\xDD 167 | CHAR_ETEND_DE=>>\xDE 168 | CHAR_ETEND_D=>>\xDF 169 | CHAR_ETEND_E0=>>\xE0 170 | CHAR_ETEND_E1=>>\xE1 171 | CHAR_ETEND_E2=>>\xE2 172 | CHAR_ETEND_E3=>>\xE3 173 | CHAR_ETEND_E4=>>\xE4 174 | CHAR_ETEND_E5=>>\xE5 175 | CHAR_ETEND_E6=>>\xE6 176 | CHAR_ETEND_E=>>\xE7 177 | CHAR_ETEND_E8=>>\xE8 178 | CHAR_ETEND_E9=>>\xE9 179 | CHAR_ETEND_EA=>>\xEA 180 | CHAR_ETEND_EB=>>\xEB 181 | CHAR_ETEND_EC=>>\xEC 182 | CHAR_ETEND_ED=>>\xED 183 | CHAR_ETEND_EE=>>\xEE 184 | CHAR_ETEND_EF=>>\xEF 185 | CHAR_ETEND_F0=>>\xF0 186 | CHAR_ETEND_F1=>>\xF1 187 | CHAR_ETEND_F2=>>\xF2 188 | CHAR_ETEND_F3=>>\xF3 189 | CHAR_ETEND_F4=>>\xF4 190 | CHAR_ETEND_F5=>>\xF5 191 | CHAR_ETEND_F6=>>\xF6 192 | CHAR_ETEND_F7=>>\xF7 193 | CHAR_ETEND_F8=>>\xF8 194 | CHAR_ETEND_F9=>>\xF9 195 | CHAR_ETEND_FA=>>\xFA 196 | CHAR_ETEND_FB=>>\xFB 197 | CHAR_ETEND_FC=>>\xFC 198 | CHAR_ETEND_FD=>>\xFD 199 | CHAR_ETEND_FE=>>\xFE 200 | CHAR_ETEND_FF=>>\xFF 201 | -------------------------------------------------------------------------------- /logstash-filter-sig/conf-samples/reference.json: -------------------------------------------------------------------------------- 1 | {} 2 | -------------------------------------------------------------------------------- /logstash-filter-sig/conf-samples/sig.json: -------------------------------------------------------------------------------- 1 | {"rules":[ 2 | {"type":{"motif":["squid"],"type":1,"note":1,"name":"Field User-AGENT not present","id":1},"user_agent":{"false":{}}}, 3 | {"new_value_dst_host":{"sizeope":{"sup":1},"type":1,"note":1,"name":"New value dst_host","id":2},"type":{"motif":["squid"]}}, 4 | {"elapsed_time":{"numope":{"sup":900000},"type":1,"note":2,"name":"Connection time too long > 15minutes","id":3}}, 5 | {"type":{"motif":["squid"],"type":2,"note":2,"name":"Referer FIELD not present","id":4},"uri_proto":{"notregexp":["tunnel"]},"referer_host":{"false":{}}} 6 | ]} 7 | -------------------------------------------------------------------------------- /logstash-filter-sig/conf-samples/whois.json: -------------------------------------------------------------------------------- 1 | { 2 | "google.com": 3 | {"creation_date": ["1997-09-15 00:00:00", "1997-09-15T00:00:00-0700"], "emails": ["abusecomplaints@markmonitor.com", "contact-admin@google.com", "dns-admin@google.com"], "org": "Google Inc.", "expiration_date": ["2020-09-14 00:00:00", "2020-09-13T21:00:00-0700"]} 4 | } 5 | -------------------------------------------------------------------------------- /logstash-filter-sig/logstash-filter-sig.gemspec: -------------------------------------------------------------------------------- 1 | Gem::Specification.new do |s| 2 | s.name = 'logstash-filter-sig' 3 | s.version = '0.9.0' 4 | s.licenses = ['Apache License (2.0)'] 5 | s.summary = "This filter can detect IOC, signature and comportemental change on flux." 6 | s.description = "This gem is a Logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/logstash-plugin install gemname. 
This gem is not a stand-alone program"
7 | s.authors = ["Lionel PRAT"]
8 | s.email = 'lionel.prat9@gmail.com'
9 | s.homepage = "http://www.elastic.co/guide/en/logstash/current/index.html"
10 | s.require_paths = ["lib"]
11 |
12 | # Files
13 | s.files = Dir['lib/**/*','spec/**/*','vendor/**/*','*.gemspec','*.md','CONTRIBUTORS','Gemfile','LICENSE','NOTICE.TXT']
14 | # Tests
15 | s.test_files = s.files.grep(%r{^(test|spec|features)/})
16 |
17 | # Special flag to let us know this is actually a logstash plugin
18 | s.metadata = { "logstash_plugin" => "true", "logstash_group" => "filter" }
19 |
20 | # Gem dependencies
21 | s.add_runtime_dependency "logstash-core-plugin-api", ">= 1.60", "<= 2.99"
22 | s.add_runtime_dependency "simhash"
23 | s.add_development_dependency 'logstash-devutils'
24 | end
25 |
--------------------------------------------------------------------------------
/logstash-filter-sig/scripts-create-db/create_ref.rb:
--------------------------------------------------------------------------------
1 | #creates the REF DB used by the ref feature of the logstash SIG plugin
2 | #author: lionel.prat9@gmail.com
3 |
4 | require 'elasticsearch'
5 | require 'json'
6 | require 'simhash'
7 |
8 | ##############################
9 | #nested_hash_value found at http://stackoverflow.com/questions/8301566/find-key-value-pairs-deep-inside-a-hash-containing-an-arbitrary-number-of-nested
10 | def nested_hash_value(obj,key)
11 | if obj.respond_to?(:key?) && obj.key?(key)
12 | obj[key]
13 | elsif obj.respond_to?(:each)
14 | r = nil
15 | obj.find{ |*a| r=nested_hash_value(a.last,key) }
16 | r
17 | end
18 | end
19 | ##############################
20 | stop = false
21 |
22 | #load rules
23 | rules={}
24 | if ARGV[0]
25 | if !File.exist?(ARGV[0])
26 | puts "RULES file does not exist: #{ARGV[0]}"
27 | exit(0)
28 | else
29 | tmp_db = JSON.parse( IO.read(ARGV[0], encoding:'utf-8') )
30 | unless tmp_db.nil?
31 | unless tmp_db['rules'].nil?
32 | if tmp_db['rules'].is_a?(Array)
33 | rules= tmp_db['rules']
34 | end
35 | end
36 | end
37 | end
38 | else
39 | stop = true
40 | end
41 |
42 |
43 | #load patterns
44 | pattern_db = {}
45 | if ARGV[1]
46 | #load the pattern file
47 | if !File.exist?(ARGV[1])
48 | puts "PATTERN DB file does not exist: #{ARGV[1]}"
49 | exit(0)
50 | else
51 | File.readlines(ARGV[1]).each do |line|
52 | elem1, elem2 = line.split(/=>>/)
53 | elem2.delete!("\n")
54 | pattern_db[elem1] = elem2
55 | end
56 | end
57 | else
58 | stop = true
59 | end
60 |
61 | #create the ES connection
62 | if ARGV[2]
63 | @client = Elasticsearch::Client.new url: ARGV[2], timeout: 600,
64 | transport_options: { ssl: { verify: false } }
65 | #verify the connection
66 | begin
67 | res=@client.info
68 | rescue
69 | puts 'error connecting to ES'
70 | exit(0)
71 | end
72 | else
73 | stop = true
74 | end
75 |
76 | #usage
77 | if ARGV.empty? or stop
78 | puts "Usage: create_ref.rb conf_ref.json pattern.db ES_URI(ex: https://user:secret@localhost:9200) [INDEX_NAME:logstash-*] [DEFAUT_NOTE_FILE:note_ref_defaut.json]"
79 | exit 1
80 | end
81 |
82 | #output file:
83 | output_file='reference.json'
84 | #optional index name
85 | if ARGV[3]
86 | index_name = ARGV[3].to_s
87 | else
88 | index_name = "logstash-*"
89 | end
90 |
91 | #optional file containing the notes
92 | if ARGV[4]
93 | note_file = ARGV[4].to_s
94 | else
95 | note_file = "note_ref_defaut.json"
96 | end
97 | #note_db => {"NOTE_ENCODING": 0.25, "NOTE_ASCII": 0.25, "DEFAULT_NOTE": 2, "NOTE_UNIQ_REDUC": 0.1...}
98 | note_db = {}
99 | if !File.exist?(note_file)
100 | puts "NOTE DB file does not exist: #{note_file}"
101 | exit(0)
102 | else
103 | note_db = JSON.parse( IO.read(note_file, encoding:'utf-8') )
104 | end
105 | #rules[ {"pivot_field":{field1:'value'},{field2:'value'}, "list_sig": [fieldx,fieldy,...], "relation_min": 10, "simhash_size": 16, "simhash_use_size": 14, "id": 200X} ]
106 |
107 | ref_db = {}
108 | #build the db rule by rule
109 | for rule in rules
110 | #create the reference entry keyed by the rule ID
111 | ref_db[rule['id']] = {}
112 | #save the non-unique fields, used later for relations/links
113 | tmp_field_notuniq = []
114 | #build the search from the pivot fields and values, with an aggregation on each field
115 | #create pivot
116 | query_pivot = []
117 | rule['pivot_field'].each do |fieldp,valp|
118 | # exist field query: { "exists":{ "field": "src_ip" } }
119 | tmp_query = {"query": { "match": { } } }
120 | tmp_query["query"]["match"][fieldp] = {"query": valp, "type": "phrase"}
121 | query_pivot.push(tmp_query)
122 | end
123 | for fieldx in rule['list_sig']
124 | ref_db[rule['id']][fieldx] = {}
125 | #pivot => {"query": {"match": {field_name: {"query": field_value,"type": "phrase"}}},
126 | #get the field mapping to decide whether to use ".raw" or not
127 | res=@client.indices.get_field_mapping index: index_name, field: fieldx
128 | type_field=nested_hash_value(res,'type')
129 | #https://www.elastic.co/guide/en/elasticsearch/reference/current/mapping-types.html
130 | field_name=fieldx
131 | #use raw for strings
132 | #problem with array/hash fields, for example sig_detected
133 | #if not ['ip', 'binary', 'date', 'boolean', 'long', 'integer', 'short', 'byte', 'double', 'float'].include?(type_field.to_s)
134 | ref_db[rule['id']][fieldx]['TYPE'] = type_field.to_s
135 | if ['boolean', 'long', 'integer', 'short', 'byte', 'double', 'float'].include?(type_field.to_s)
136 | #method for numeric fields
137 | res=@client.search index: index_name, size: 0, search_type: 'count', body: {"query": { "filtered": { "filter": { "bool": { "must": query_pivot } } } }, "aggregations": { "SP": { "terms": { "field": field_name, "size": 0, "order": {"_count": "desc"}}} } }
138 | #res['aggregations']['SP']['buckets'].length.to_s
139 | #res["hits"]["total"].to_s
140 | #a value is "unique" (random) when the number of distinct terms exceeds 1% of the total hits (res["hits"]["total"]*0.01)
141 | if (res['aggregations']['SP']['buckets'].length >= 1) and (res['aggregations']['SP']['buckets'].length > (res["hits"]["total"]*0.01))
142 | #not uniq
143 | ref_db[rule['id']][fieldx]['Uniq_value'] = false
144 | else
145 | #uniq
146 | ref_db[rule['id']][fieldx]['Uniq_value'] = true
147 | end
148 | if res['aggregations']['SP']['buckets'].length >= 1 and res['aggregations']['SP']['buckets'].length <= 20 #keep an explicit value list only for small sets
149 | ref_db[rule['id']][fieldx]['LIST_VALUE'] = []
150 | for hash_aggr in res['aggregations']['SP']['buckets']
151 | ref_db[rule['id']][fieldx]['LIST_VALUE'].push(*hash_aggr['key'])
152 | end
153 | end
154 | ref_db[rule['id']][fieldx]['ENCODING'] = []
155 | ref_db[rule['id']][fieldx]['LEN_MAX'] = 0
156 | ref_db[rule['id']][fieldx]['LEN_MIN']= 9999999999
157 | count_avg = 0
158 | len_avg = 0
159 | evennumber=false
160 | noevennumber=false
161 | ref_db[rule['id']][fieldx]['LEN_AVG_PRCT'] = 20 # +/- 20% of LEN_AVG is accepted
162 | ref_db[rule['id']][fieldx]['REGEXP'] = []
163 | ref_db[rule['id']][fieldx]['REGEXP_MIN'] = []
164 | for hash_aggr in res['aggregations']['SP']['buckets']
165 | #analyze entry by entry: {"key"=>"XXX", "doc_count"=>NNN}
166 | #collect encoding -> float,
167 | ref_db[rule['id']][fieldx]['ENCODING'].push(*hash_aggr['key'].class.to_s) unless ref_db[rule['id']][fieldx]['ENCODING'].include?(hash_aggr['key'].class.to_s)
168 | #collect length info
169 | ref_db[rule['id']][fieldx]['LEN_MAX'] = hash_aggr['key'].to_i if ref_db[rule['id']][fieldx]['LEN_MAX'] < hash_aggr['key'].to_i
170 | ref_db[rule['id']][fieldx]['LEN_MIN'] = hash_aggr['key'].to_i if ref_db[rule['id']][fieldx]['LEN_MIN'] > hash_aggr['key'].to_i
171 | count_avg = count_avg + hash_aggr['doc_count'].to_i
172 | len_avg = len_avg + (hash_aggr['doc_count'].to_i * hash_aggr['key'].to_i) #weight each value by its doc_count
173 | #check whether the value is an even number or not
174 | if hash_aggr['key'].to_i.even?
175 | evennumber = true
176 | else
177 | noevennumber = true
178 | end
179 | #check the patterns against the aggregated value
180 | rlist = []
181 | pattern_db.each do |key, value|
182 | match = Regexp.new(value, nil, 'n').match(hash_aggr['key'].to_s)
183 | if not match.nil?
184 | rlist << key
185 | end
186 | end
187 | #create the minimal regexp list (intersection across all values)
188 | if ref_db[rule['id']][fieldx]['REGEXP_MIN'].empty?
189 | ref_db[rule['id']][fieldx]['REGEXP_MIN'] = rlist
190 | else
191 | intersec = ref_db[rule['id']][fieldx]['REGEXP_MIN'] & rlist
192 | ref_db[rule['id']][fieldx]['REGEXP_MIN'] = intersec
193 | end
194 | #create the regexp list
195 | ref_db[rule['id']][fieldx]['REGEXP'].push(*rlist.join("::")) unless ref_db[rule['id']][fieldx]['REGEXP'].include?(rlist.join("::"))
196 | end
197 | ref_db[rule['id']][fieldx]['LEN_AVG'] = (len_avg / count_avg).to_i if count_avg != 0
198 | if evennumber && noevennumber
199 | ref_db[rule['id']][fieldx]['LEN_EVENorUNEVENnum'] = 0
200 | elsif evennumber
201 | ref_db[rule['id']][fieldx]['LEN_EVENorUNEVENnum'] = 1
202 | elsif noevennumber
203 | ref_db[rule['id']][fieldx]['LEN_EVENorUNEVENnum'] = 2
204 | else
205 | ref_db[rule['id']][fieldx]['LEN_EVENorUNEVENnum'] = 0
206 | end
207 | note_db.each do |nkey, nval|
208 | if nkey.to_s != 'NOTE_DEFAULT' and nkey.to_s != 'NOTE_UNIQ_REDUC'
209 | ref_db[rule['id']][fieldx][nkey] = nval
210 | #if the field is unique, the note can be reduced
211 | ref_db[rule['id']][fieldx][nkey] = ref_db[rule['id']][fieldx][nkey] - note_db['NOTE_UNIQ_REDUC'] if ref_db[rule['id']][fieldx]['Uniq_value']
212 | end
213 | end
214 | else
215 | if 'string' == type_field.to_s
216 | field_name=fieldx+'.raw'
217 | end
218 | #method for string fields
219 | res=@client.search index: index_name, size: 0, search_type: 'count', body: {"query": { "filtered": { "filter": { "bool": { "must": query_pivot } } } }, "aggregations": { "SP": { "terms": { "field": field_name, "size": 0, "order": {"_count": "desc"}}} } }
220 | #res['aggregations']['SP']['buckets'].length.to_s
221 | #res["hits"]["total"].to_s
222 | #a value is "unique" (random) when the number of distinct terms exceeds 1% of the total hits (res["hits"]["total"]*0.01)
223 | if (res['aggregations']['SP']['buckets'].length >= 1) and (res['aggregations']['SP']['buckets'].length > (res["hits"]["total"]*0.01))
224 | #not uniq
225 | ref_db[rule['id']][fieldx]['Uniq_value'] = false
226 | tmp_field_notuniq.push(*fieldx)
227 | else
228 | #uniq
229 | ref_db[rule['id']][fieldx]['Uniq_value'] = true
230 | end
231 | if res['aggregations']['SP']['buckets'].length >= 1 and res['aggregations']['SP']['buckets'].length <= 20 #keep an explicit value list only for small sets
232 | ref_db[rule['id']][fieldx]['LIST_VALUE'] = []
233 | for hash_aggr in res['aggregations']['SP']['buckets']
234 | ref_db[rule['id']][fieldx]['LIST_VALUE'].push(*hash_aggr['key'])
235 | end
236 | end
237 | ref_db[rule['id']][fieldx]['ENCODING'] = []
238 | ref_db[rule['id']][fieldx]['LEN_MAX'] = 0
239 | ref_db[rule['id']][fieldx]['LEN_MIN']= 9999999999
240 | count_avg = 0
241 | len_avg = 0
242 | evennumber=false
243 | noevennumber=false
244 | ref_db[rule['id']][fieldx]['LEN_AVG_PRCT'] = 20 # +/- 20% of LEN_AVG is accepted
245 | ref_db[rule['id']][fieldx]['REGEXP'] = []
246 | ref_db[rule['id']][fieldx]['REGEXP_MIN'] = []
247 | for hash_aggr in res['aggregations']['SP']['buckets']
248 | #analyze entry by entry: {"key"=>"XXX", "doc_count"=>NNN}
249 | #collect encoding
250 | ref_db[rule['id']][fieldx]['ENCODING'].push(*hash_aggr['key'].encoding.to_s) unless ref_db[rule['id']][fieldx]['ENCODING'].include?(hash_aggr['key'].encoding.to_s)
251 | #collect length info
252 | ref_db[rule['id']][fieldx]['LEN_MAX'] = hash_aggr['key'].length if ref_db[rule['id']][fieldx]['LEN_MAX'] < hash_aggr['key'].length
253 | ref_db[rule['id']][fieldx]['LEN_MIN'] = hash_aggr['key'].length if ref_db[rule['id']][fieldx]['LEN_MIN'] > hash_aggr['key'].length
254 | count_avg = count_avg + hash_aggr['doc_count'].to_i
255 | len_avg = len_avg + (hash_aggr['doc_count'].to_i * hash_aggr['key'].length) #weight each string length by its doc_count
256 | #check whether the string length is even or not
257 | if hash_aggr['key'].length.even?
258 | evennumber = true
259 | else
260 | noevennumber = true
261 | end
262 | #check the patterns against the aggregated value
263 | rlist = []
264 | pattern_db.each do |key, value|
265 | match = Regexp.new(value, nil, 'n').match(hash_aggr['key'].to_s)
266 | if not match.nil?
267 | rlist << key
268 | end
269 | end
270 | #create the minimal regexp list (intersection across all values)
271 | if ref_db[rule['id']][fieldx]['REGEXP_MIN'].empty?
272 | ref_db[rule['id']][fieldx]['REGEXP_MIN'] = rlist
273 | else
274 | intersec = ref_db[rule['id']][fieldx]['REGEXP_MIN'] & rlist
275 | ref_db[rule['id']][fieldx]['REGEXP_MIN'] = intersec
276 | end
277 | #create the regexp list
278 | ref_db[rule['id']][fieldx]['REGEXP'].push(*rlist.join("::")) unless ref_db[rule['id']][fieldx]['REGEXP'].include?(rlist.join("::"))
279 | end
280 | ref_db[rule['id']][fieldx]['LEN_AVG'] = (len_avg / count_avg).to_i if count_avg != 0
281 | if evennumber && noevennumber
282 | ref_db[rule['id']][fieldx]['LEN_EVENorUNEVENnum'] = 0
283 | elsif evennumber
284 | ref_db[rule['id']][fieldx]['LEN_EVENorUNEVENnum'] = 1
285 | elsif noevennumber
286 | ref_db[rule['id']][fieldx]['LEN_EVENorUNEVENnum'] = 2
287 | else
288 | ref_db[rule['id']][fieldx]['LEN_EVENorUNEVENnum'] = 0
289 | end
290 | #create the notes for the field
291 | note_db.each do |nkey, nval|
292 | if nkey.to_s != 'NOTE_DEFAULT' and nkey.to_s != 'NOTE_UNIQ_REDUC'
293 | ref_db[rule['id']][fieldx][nkey] = nval
294 | #if the field is unique, the note can be reduced
295 | ref_db[rule['id']][fieldx][nkey] = ref_db[rule['id']][fieldx][nkey] - note_db['NOTE_UNIQ_REDUC'] if ref_db[rule['id']][fieldx]['Uniq_value']
296 | end
297 | end
298 | #check the result & analyze & create the reference for the field
299 | end
300 | end
301 | #done aggregating fields
302 | #create the link information
303 | #1- find the fields that are not unique across all events, and whether they are optional (& determined)
304 | count1=@client.search index: index_name, size: 0, search_type: 'count', body: {"query": { "filtered": { "filter": { "bool": { "must": query_pivot } } } } }
305 | # two kinds of fields: optionally present and always present
306 | field_option = []
307 | field_always = []
308 | for fieldy in tmp_field_notuniq
309 | # exist field query: { "exists":{ "field": "src_ip" } }
310 | tmp_query = []
311 | query_pivot.each{|e| tmp_query << e.dup}
312 | tmp_filter = { "exists":{ "field": fieldy.to_s } }
313 | tmp_query.push(tmp_filter)
314 | count2=@client.search index: index_name, size: 0, search_type: 'count', body: {"query": { "filtered": { "filter": { "bool": { "must": tmp_query } } } } }
315 | if ((count1["hits"]["total"]+(count1["hits"]["total"]*0.01)) >= count2["hits"]["total"]) and ((count1["hits"]["total"]-(count1["hits"]["total"]*0.01)) <= count2["hits"]["total"])
316 | # always present
317 | field_always.push(*fieldy)
318 | else
319 | #optionally present
320 | field_option.push(*fieldy)
321 | end
322 | end
323 | #2- create links with every possible field combination
324 | combis = []
325 | for n in 1..field_option.length
326 | combis.push(field_option.combination(n).to_a)
327 | end
328 | combis=combis.flatten(1)
329 | if combis.empty?
329 | if combis.empty?
330 | # use field_always if option is empty
331 | combis.push(field_always)
332 | else
333 | for combi in combis
334 | for fa in field_always
335 | combi.push(fa)
336 | end
337 | end
338 | end
339 | #3- order values (sorted by field name) to create the simhash
340 | simhash_combi = {}
341 | for combi in combis
342 | tmp_filter = { "exists":{ "field": fieldy.to_s } }
343 | tmp_query = []
344 | query_pivot.each{|e| tmp_query << e.dup}
345 | #create filter with field combination
346 | for field_choice in combi
347 | tmp_filter = { "exists":{ "field": field_choice.to_s } }
348 | tmp_query.push(tmp_filter)
349 | end
350 | res_link=@client.search index: index_name, size: 0, body: {"query": { "filtered": { "filter": { "bool": { "must": tmp_query } } } } }
351 | #use scroll to view all events -- max duration 5 minutes => '5m'
352 | #verify you have the right to access the scroll API (shield/elasticguard)
353 | r=@client.search index: 'logstash-*', scroll: '5m', size: 10000, body: {"query": { "filtered": { "filter": { "bool": { "must": tmp_query } } } } }
354 | # Call the `scroll` API until empty results are returned
355 | while r = @client.scroll(scroll_id: r['_scroll_id'], scroll: '5m') and not r['hits']['hits'].empty? do
356 | for rfields in r['hits']['hits']
357 | simhash_tmp=""
358 | for field_choice in combi.sort
359 | #create simhash with order by field name
360 | simhash_tmp=simhash_tmp+rfields["_source"][field_choice.to_s].to_s.force_encoding('iso-8859-1').encode('utf-8')
361 | end
362 | simhash_complet = simhash_tmp.simhash(:hashbits => rule["simhash_size"]).to_s
363 | if simhash_combi[simhash_complet]
364 | simhash_combi[simhash_complet] = simhash_combi[simhash_complet] + 1
365 | else
366 | simhash_combi[simhash_complet] = 1
367 | end
368 | end
369 | end
370 | #delete scroll
371 | end
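# simhash_combi maps each simhash to the number of events that produced it: the
# values of the combination's fields (sorted by field name so the concatenation
# is stable) are joined and hashed with the rule's "simhash_size" bits. A simhash
# seen only a handful of times marks a rare field-value relation, which the sig
# filter can later flag as suspect via its relation_min threshold.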
372 | ref_db[rule['id']]['NOTE_DEFAULT'] = note_db['NOTE_DEFAULT']
373 | ref_db[rule['id']]['relation_value_fix'] = simhash_combi
374 | end
375 | @client.indices.clear_cache()
376 | #create json file
377 | File.open(@file_save_localioc,"w+") do |f|
378 | f.write(JSON.pretty_generate(ref_db))
379 | end
380 | 
381 | #{ 'ID20XXXX': {
382 | # 'field': {
383 | # 'TYPE': 'Array|Int|String|...', # don't use the mapping (bad way), a regexp can be used for this
384 | # 'Uniq_value': true or false, #define if value is random => true << OK AGG
385 | # 'NOTE_UNIQ_REDUC': 0.1 # to reduce the note on a match against a uniq field
386 | # 'LIST_VALUE': ['value_possible1','value_possible2','value_possibleX'], << OK AGG
387 | # 'NOTE_LISTV': 0.25 # note between 0.x and 4, default 0.25
388 | # 'ENCODING': true or false, # value contains other than ascii characters << OK AGG
389 | # 'NOTE_ENCODING': 0.25 # note between 0.x and 4, default 0.25
390 | # 'LEN_MAX': numeric_value, << OK AGG
391 | # 'NOTE_LEN': 0.25 # note between 0.x and 4, default 0.25
392 | # 'LEN_MIN': numeric_value, << OK AGG
393 | # 'LEN_AVG': numeric_value, << OK AGG
394 | # 'LEN_AVG_PRCT': percent tolerance around AVG, << OK AGG
395 | # 'NOTE_LEN_AVG': 0.1 # note between 0.x and 4, default 0.1
396 | # 'LEN_EVENorUNEVENnum': numeric_value, #even num = 1; uneven num = 2; unknown/undefined = 0 << OK AGG
397 | # 'NOTE_LEN_EVEN': 0.25 # note between 0.x and 4, default 0.25
398 | # 'REGEXP_MIN': [], << OK AGG
399 | # 'NOTE_REGEXP_MIN': 0.25 # note between 0.x and 4, default 0.25
400 | # 'REGEXP': [] << OK AGG
401 | # 'NOTE_REGEXP': 0.25 # note between 0.x and 4, default 0.25
402 | # } ,
403 | # #relation_value_fix contains the list of values of fields that are not unique (random)
404 | # # for example fld1: '1'; fld2: 'blabla'; fld3: '10.10.10.10'
405 | # # create a LIST of simhash values, and pay attention to field order
406 | # # you can optimize with simhash - end if it saves memory
407 | # # important: keep SIMHASH:COUNT, so a very small COUNT => suspect [use conf -> relation_min]
408 | # 'relation_value_fix': {'SIMHASH1':COUNTX,'SIMHASH2':COUNTY,'SIMHASH3':COUNTX},
409 | # 'NOTE_DEFAULT': 2 # note between 0.x and 4, default 2
410 | # }}}
411 | 
--------------------------------------------------------------------------------
/logstash-filter-sig/scripts-create-db/ioc_create.sh:
--------------------------------------------------------------------------------
1 | #!/bin/sh
2 | #contact: lionel.prat9@gmail.com
3 | #require: mkdir /opt/logstash-extra/ && cd /opt/logstash-extra/ && git clone https://github.com/CIRCL/PyMISP
4 | rm -f /tmp/misp.json
5 | python ./PyMISP/examples/last.py -l 90d -o /tmp/misp.json
6 | python ./misp2json4ioc.py /tmp/misp.json /etc/logstash/db/ioc.json /etc/logstash/db/list_field_ioc ./blacklist.json
7 | rm -f /tmp/misp.json
8 | chown -R logstash.logstash /etc/logstash/db
--------------------------------------------------------------------------------
/logstash-filter-sig/scripts-create-db/misp2json4ioc.py:
--------------------------------------------------------------------------------
1 | #Parse misp result to json format for the logstash filter ioc.rb
2 | #https://github.com/MISP/PyMISP
3 | #Contact: lionel.prat9@gmail.com
4 | #input: file result.json and types.list
5 | #removes the alexa top 100 from the IOC
6 | import ujson
7 | import sys
8 | import re
9 | import os
10 | from pprint import pprint
11 | 
12 | #ARGV
13 | if len(sys.argv) < 5:
14 |     print("Syntax: ioc_parse.py original_ioc_json output.json type.list blacklist.json")
15 |     sys.exit(0)
16 | 
17 | exclude_type = ["yara","snort","link","comment"]
18 | ndata = {}
19 | data = []
20 | i=0
21 | if os.path.exists(sys.argv[1]):
22 |     with open(sys.argv[1]) as data_file:
23 |         content = data_file.readlines()
24 |         for line in content:
25 |             data.append(ujson.loads(line))
26 |         data_file.close()
27 | else:
28 |     sys.exit(1)
29 | 
30 | bldata = {}
31 | if os.path.exists(sys.argv[4]):
32 |     with open(sys.argv[4]) as data_file:
33 |         bldata=ujson.load(data_file)
34 |         data_file.close()
35 | else:
36 |     sys.exit(1)
37 | 
38 | #get list of top 100 alexa urls
39 | os.system('wget -qO- http://s3.amazonaws.com/alexa-static/top-1m.csv.zip | bsdtar -xvf- -O | head -100 |awk -F \',\' \'{print $2}\' > /tmp/top100uri')
40 | #wget -qO- http://s3.amazonaws.com/alexa-static/top-1m.csv.zip | bsdtar -xvf- -O | head -100 |awk -F ',' '{print $2}' > /tmp/top100uri
41 | topdom = []
42 | if os.path.exists("/tmp/top100uri"):
43 |     #load file into array
44 |     with open("/tmp/top100uri", "r") as ins:
45 |         for line in ins:
46 |             line=line.rstrip('\n')
47 |             line=line.lower()
48 |             topdom.append(line)
49 | else:
50 |     sys.exit(1)
51 | 
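# The loop below flattens every MISP event into ndata, a dict keyed by attribute
# type: each Attribute / ShadowAttribute whose category is not "External analysis"
# and whose type is not in exclude_type is appended under its type, e.g.
#   ndata = {"ip-dst": ["203.0.113.7", ...], "domain": ["evil.example", ...]}
# (illustrative values only); the types are then remapped to ioc_* field names.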
52 | for datax in data :
53 |     i=i+1
54 |     if datax['Event']['Attribute']:
55 |         if type(datax['Event']['Attribute']) is not dict:
56 |             for elem in datax['Event']['Attribute']:
57 |                 if str("External analysis") != str(elem['category']) and str(elem['type']) not in exclude_type:
58 |                     if str(elem['type']) not in ndata.keys():
59 |                         ndata[str(elem['type'])] = []
60 |                     ndata[str(elem['type'])].append(elem['value'])
61 |         else:
62 |             for key,elem in datax['Event']['Attribute'].items():
63 |                 if str("External analysis") != str(elem['category']) and str(elem['type']) not in exclude_type:
64 |                     if str(elem['type']) not in ndata.keys():
65 |                         ndata[str(elem['type'])] = []
66 |                     ndata[str(elem['type'])].append(elem['value'])
67 |     if datax['Event']['ShadowAttribute']:
68 |         if type(datax['Event']['ShadowAttribute']) is not dict:
69 |             for elem in datax['Event']['ShadowAttribute']:
70 |                 if str("External analysis") != str(elem['category']) and str(elem['type']) not in exclude_type:
71 |                     if str(elem['type']) not in ndata.keys():
72 |                         ndata[str(elem['type'])] = []
73 |                     ndata[str(elem['type'])].append(elem['value'])
74 |         else:
75 |             for key,elem in datax['Event']['ShadowAttribute'].items():
76 |                 if str("External analysis") != str(elem['category']) and str(elem['type']) not in exclude_type:
77 |                     if str(elem['type']) not in ndata.keys():
78 |                         ndata[str(elem['type'])] = []
79 |                     ndata[str(elem['type'])].append(elem['value'])
80 | #change key names, add prefix 'ioc_'
81 | xdata = {}
82 | 
83 | for key,elem in ndata.items():
84 |     if str(key) == "ip-dst" or str(key) == "ip-src":
85 |         if 'ioc_ip' in xdata.keys():
86 |             #elem = [element.lower() for element in elem]
87 |             #10.x.x.x, 192.168.x.x, 172.16.0.0 - 172.31.255.255
88 |             if 'ioc_ip' in bldata.keys():
89 |                 for list_bl in bldata['ioc_ip']:
90 |                     regex = re.compile(list_bl)
91 |                     elem = [x for x in elem if not regex.match(x)]
92 |             regex = re.compile(r'(127\.[0-9]+\.[0-9]+\.[0-9]+|10\.\d+\.\d+\.\d+|192\.168\.\d+\.\d+|172\.(1[6-9]|2[0-9]|3[01])\.\d+\.\d+|255\.255\.255\.\d+)')
93 |             elem = [x for x in elem if not regex.match(x)]
94 |             xdata['ioc_ip'] = list(set(xdata['ioc_ip'] + elem))
95 |         else:
96 |             #elem = [element.lower() for element in elem]
97 |             if 'ioc_ip' in bldata.keys():
98 |                 for list_bl in bldata['ioc_ip']:
99 |                     regex = re.compile(list_bl)
100 |                     elem = [x for x in elem if not regex.match(x)]
101 |             regex = re.compile(r'(127\.[0-9]+\.[0-9]+\.[0-9]+|10\.\d+\.\d+\.\d+|192\.168\.\d+\.\d+|172\.(1[6-9]|2[0-9]|3[01])\.\d+\.\d+|255\.255\.255\.\d+)')
102 |             elem = [x for x in elem if not regex.match(x)]
103 |             xdata['ioc_ip'] = list(set(elem))
104 |     elif str(key) == "email-attachment" or str(key) == "attachment":
105 |         if 'ioc_attachment' in xdata.keys():
106 |             #elem = [element.lower() for element in elem]
107 |             if 'ioc_attachment' in bldata.keys():
108 |                 for list_bl in bldata['ioc_attachment']:
109 |                     regex = re.compile(list_bl)
110 |                     elem = [x for x in elem if not regex.match(x)]
111 |             xdata['ioc_attachment'] = list(set(xdata['ioc_attachment'] + elem))
112 |         else:
113 |             #elem = [element.lower() for element in elem]
114 |             if 'ioc_attachment' in bldata.keys():
115 |                 for list_bl in bldata['ioc_attachment']:
116 |                     regex = re.compile(list_bl)
117 |                     elem = [x for x in elem if not regex.match(x)]
118 |             xdata['ioc_attachment'] = list(set(elem))
119 |     elif str(key) == "email-dst" or str(key) == "email-src":
120 |         if 'ioc_emailaddr' in xdata.keys():
121 |             elem = [element.lower() for element in elem]
122 |             if 'ioc_emailaddr' in bldata.keys():
123 |                 for list_bl in bldata['ioc_emailaddr']:
124 |                     regex = re.compile(list_bl)
125 |                     elem = [x for x in elem if not regex.match(x)]
126 |             xdata['ioc_emailaddr'] = list(set(xdata['ioc_emailaddr'] + elem))
127 |         else:
128 |             elem = [element.lower() for element in elem]
129 |             if 'ioc_emailaddr' in bldata.keys():
130 |                 for list_bl in bldata['ioc_emailaddr']:
131 |                     regex = re.compile(list_bl)
132 |                     elem = [x for x in elem if not regex.match(x)]
133 |             xdata['ioc_emailaddr'] = list(set(elem))
134 |     elif str(key) == "url" or str(key) == "uri":
135 |         if 'ioc_uri' in xdata.keys():
136 |             #elem = [element.lower() for element in elem]
137 |             #elem = [w.replace('\\', '') for w in elem]
138 |             elem = [w.replace('hxxp://', 'http://') for w in elem]
139 |             elem = [w.replace('hxxps://', 'https://') for w in elem]
140 |             if 'ioc_uri' in bldata.keys():
141 |                 for list_bl in bldata['ioc_uri']:
142 |                     regex = re.compile(list_bl)
143 |                     elem = [x for x in elem if not regex.match(x)]
144 |             xdata['ioc_uri'] = list(set(xdata['ioc_uri'] + elem))
145 |         else:
146 |             #elem = [element.lower() for element in elem]
147 |             #elem = [w.replace('\\', '') for w in elem]
148 |             elem = [w.replace('hxxp://', 'http://') for w in elem]
149 |             elem = [w.replace('hxxps://', 'https://') for w in elem]
150 |             if 'ioc_uri' in bldata.keys():
151 |                 for list_bl in bldata['ioc_uri']:
152 |                     regex = re.compile(list_bl)
153 |                     elem = [x for x in elem if not regex.match(x)]
154 |             xdata['ioc_uri'] = list(set(elem))
155 |     elif str(key) == "domain":
156 |         elem = [element.lower() for element in elem]
157 |         elem = [x for x in elem if not x in topdom]
158 |         if 'ioc_domain' in bldata.keys():
159 |             for list_bl in bldata['ioc_domain']:
160 |                 regex = re.compile(list_bl)
161 |                 elem = [x for x in elem if not regex.match(x)]
162 |         xdata['ioc_domain'] = list(set(elem))
163 |     elif str(key) == "hostname":
164 |         elem = [element.lower() for element in elem]
165 |         if 'ioc_hostname' in bldata.keys():
166 |             for list_bl in bldata['ioc_hostname']:
167 |                 regex = re.compile(list_bl)
168 |                 elem = [x for x in elem if not regex.match(x)]
169 |         xdata['ioc_hostname'] = list(set(elem))
170 |     elif str(key) == "user-agent":
171 |         #elem = [element.lower() for element in elem]
172 |         if 'ioc_user-agent' in bldata.keys():
173 |             for list_bl in bldata['ioc_user-agent']:
174 |                 regex = re.compile(list_bl)
175 |                 elem = [x for x in elem if not regex.match(x)]
176 |         xdata['ioc_user-agent'] = list(set(elem))
177 |     elif str(key) == "email-subject":
178 |         #elem = [element.lower() for element in elem]
179 |         if 'ioc_email-subject' in bldata.keys():
180 |             for list_bl in bldata['ioc_email-subject']:
181 |                 regex = re.compile(list_bl)
182 |                 elem = [x for x in elem if not regex.match(x)]
183 |         xdata['ioc_email-subject'] = list(set(elem))
184 |     elif str(key) == "AS":
185 |         #elem = [element.lower() for element in elem]
186 |         if 'ioc_as' in bldata.keys():
187 |             for list_bl in bldata['ioc_as']:
188 |                 regex = re.compile(list_bl)
189 |                 elem = [x for x in elem if not regex.match(x)]
190 |         xdata['ioc_as'] = list(set(elem))
191 | with open(sys.argv[2]+'.all', 'w') as outfile:
192 |     ujson.dump(ndata, outfile)
193 |     outfile.close()
194 | with open(sys.argv[2], 'w') as outfile:
195 |     ujson.dump(xdata, outfile)
196 |     outfile.close()
197 | with open(sys.argv[3], 'w') as outfile:
198 |     for key in xdata.keys():
199 |         outfile.write(key+"\n")
200 |     outfile.close()
201 | #pprint(xdata)
202 | print "Event count:"+str(len(data))
203 | 
204 | 
205 | 
--------------------------------------------------------------------------------
/logstash-filter-sig/spec/filters/sig_spec.rb:
--------------------------------------------------------------------------------
1 | # encoding: utf-8
2 | require 'spec_helper'
3 | require "logstash/filters/sig"
4 | 
5 | describe LogStash::Filters::Sig do
6 | describe "Set to Hello World" do
7 | let(:config) do <<-CONFIG
8 | filter {
9 | sig {
10 | message => "Hello World"
11 | }
12 | }
13 | CONFIG
14 | end
15 | 
16 | sample("message" => "some text") do
17 | expect(subject.get("message")).to eq('Hello World')
18 | end
19 | end
20 | end
--------------------------------------------------------------------------------
/logstash-filter-sig/spec/spec_helper.rb:
--------------------------------------------------------------------------------
1 | # encoding: utf-8
2 | require "logstash/devutils/rspec/spec_helper"
--------------------------------------------------------------------------------
/logstash-output-fir/.travis.yml:
--------------------------------------------------------------------------------
1 | sudo: false
2 | language: ruby
3 | cache: bundler
4 | rvm:
5 | - jruby-1.7.23
6 | script:
7 | - bundle exec rspec spec
--------------------------------------------------------------------------------
/logstash-output-fir/CHANGELOG.md:
--------------------------------------------------------------------------------
1 | ## 0.9.0
2 | - Plugins work on logstash 5.4
3 | 
--------------------------------------------------------------------------------
/logstash-output-fir/CONTRIBUTORS:
--------------------------------------------------------------------------------
1 | The following is a list of people who have contributed ideas, code, bug
2 | reports, or in general have helped logstash along its way.
3 | 
4 | Contributors:
5 | * Aaron Mildenstein (untergeek)
6 | * Pier-Hugues Pellerin (ph)
7 | 
8 | Note: If you've sent us patches, bug reports, or otherwise contributed to
9 | Logstash, and you aren't on the list above and want to be, please let us know
10 | and we'll make sure you're here. Contributions from folks like you are what make
11 | open source awesome.
--------------------------------------------------------------------------------
/logstash-output-fir/DEVELOPER.md:
--------------------------------------------------------------------------------
1 | # logstash-output-fir
2 | * Lionel PRAT - lionel.prat9@gmail.com
--------------------------------------------------------------------------------
/logstash-output-fir/Gemfile:
--------------------------------------------------------------------------------
1 | source 'https://rubygems.org'
2 | gemspec
3 | gem "logstash", :github => "elastic/logstash", :branch => "5.4"
--------------------------------------------------------------------------------
/logstash-output-fir/LICENSE:
--------------------------------------------------------------------------------
1 | Copyright (c) 2012–2016 Elasticsearch
2 | 
3 | Licensed under the Apache License, Version 2.0 (the "License");
4 | you may not use this file except in compliance with the License.
5 | You may obtain a copy of the License at
6 | 
7 | http://www.apache.org/licenses/LICENSE-2.0
8 | 
9 | Unless required by applicable law or agreed to in writing, software
10 | distributed under the License is distributed on an "AS IS" BASIS,
11 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | See the License for the specific language governing permissions and
13 | limitations under the License.
--------------------------------------------------------------------------------
/logstash-output-fir/NOTICE.TXT:
--------------------------------------------------------------------------------
1 | Elasticsearch
2 | Copyright 2012-2015 Elasticsearch
3 | 
4 | This product includes software developed by The Apache Software
5 | Foundation (http://www.apache.org/).
--------------------------------------------------------------------------------
/logstash-output-fir/README.md:
--------------------------------------------------------------------------------
1 | # Logstash Plugin
2 | 
3 | Logstash output plugin that sends alerts to FIR (the incident platform of CERT Société Générale - https://github.com/certsocietegenerale/FIR)
4 | ** requires manticore **
5 | 
6 | Works on version 5.x and older.
7 | 
8 | This is a plugin for [Logstash](https://github.com/elastic/logstash).
9 | 
10 | It is fully free and fully open source. The license is Apache 2.0, meaning you are pretty much free to use it however you want in whatever way.
11 | 
12 | ## Install
13 | ```
14 | env GEM_HOME=/usr/share/logstash/vendor/bundle/jruby/1.9 /usr/share/logstash/vendor/jruby/bin/jruby /usr/share/logstash/vendor/jruby/bin/gem build logstash-output-fir.gemspec
15 | /usr/share/logstash/bin/logstash-plugin install logstash-output-fir-0.9.0.gem
16 | ```
17 | 
18 | ## Main Configuration (logstash-output.conf)
19 | ** Refresh DB: the plugin uses several configuration files, which you can change while it runs. The plugin picks up the changes and applies them at each refresh. You can manage the config/db files with a git system... **
20 | Once an alert/event is created in FIR, it can later be updated with new information (new related alerts, ...): the plugin can update an existing FIR event to avoid recreating a near-identical one...
21 | * url_api_fir [string]: The URI used to send alerts to the FIR API (REST)
22 | * Example: "https://127.0.0.1:8000/api/"
23 | * refresh_interval_remote [numeric]: Delay between refreshes of the database by downloading the incidents DB from FIR (for re-synchronisation when a new event is added to the logstash incident db -- useful when you close an event in FIR)
24 | * Example: 3600
25 | * If you have many incidents in the DB, the download can take a long time...
26 | * refresh_interval [numeric]: Delay between refreshes of the FIR configuration
27 | * Example: 3600
28 | * headers [hash]: Your FIR API token
29 | * Example: {"Authorization" => "Token 0000000000000000000000000000", "Content-Type" => "application/json"}
30 | * ssl_options [string]: To change the SSL configuration of manticore, for example to disable certificate verification.
31 | * Example to disable verification: "{ :verify => :disable }"
32 | * template_new [path]: the path of the erb file used to build the body/description of the FIR event for a new alert
33 | * Example: "/etc/logstash/db/template_new.erb" (template example in folder: template_erb)
34 | * template_update [path]: the path of the erb file used to build the body/description of the FIR event when updating an alert
35 | * Example: "/etc/logstash/db/template_update.erb" (template example in folder: template_erb)
36 | * subj_template_new [path]: the path of the erb file used to build the subject of the FIR event for a new alert
37 | * Example: "/etc/logstash/db/subject_template_new.erb" (template example in folder: template_erb)
38 | * subj_template_update [path]: the path of the erb file used to build the subject of the FIR event when updating an alert
39 | * Example: "/etc/logstash/db/subject_template_update.erb" (template example in folder: template_erb)
40 | * conffile [path]: configuration file holding the rules for creating events in FIR (filter and near-event filter)
41 | * Example: "/etc/logstash/db/conf_fir.json" (example in folder sample_conf)
42 | * subject_field [string]: The name of the "subject" field in the FIR event you create
43 | * Example field by default: "subject"
44 | * body_field [string]: The name of the "description" field in the FIR event you create
45 | * Example field by default: "description"
46 | * severity_field [string]: The name of the "severity" field in the FIR event you create
47 | * Example field by default: "severity"
48 | * status_field [string]: The name of the "status" field in the FIR event you create
49 | * Example field by default: "status"
50 | 
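A minimal `output` block tying these options together could look like the following sketch (illustrative values only: the token is a placeholder, and the erb/json paths must exist on the Logstash host):
```
output {
  fir {
    url_api_fir => "https://127.0.0.1:8000/api/"
    headers => { "Authorization" => "Token 0000000000000000000000000000" "Content-Type" => "application/json" }
    ssl_options => "{ :verify => :disable }"
    conffile => "/etc/logstash/db/conf_fir.json"
    template_new => "/etc/logstash/db/template_new.erb"
    template_update => "/etc/logstash/db/template_update.erb"
    subj_template_new => "/etc/logstash/db/subject_template_new.erb"
    subj_template_update => "/etc/logstash/db/subject_template_update.erb"
    refresh_interval => 3600
    refresh_interval_remote => 3600
  }
}
```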
51 | ### conf_fir.json
52 | The file contains the rules that define the filter and the near-event filter, and choose the template used to create the event in FIR
53 | ** This file is in JSON format **
54 | ```
55 | { "rules": [
56 | {
57 | "filters": {"sig_detected_note": "3|4"},
58 | "subject_filter": "src_ip",
59 | "subject_filter_prefix": "-",
60 | "subject_filter_sufix": "-",
61 | "body_filter": "fingerprint",
62 | "body_filter_prefix": "",
63 | "body_filter_sufix": " -> SCORE",
64 | "count_filter": " Count: ",
65 | "severity_add": "sig_detected_note",
66 | "fields_create": {"actor": 6, "category": 26,"confidentiality": 0,"detection": 36, "plan": 37,"is_starred": false,"is_major": false,"is_incident": false,"concerned_business_lines": []},
67 | "template_new_sujet": "",
68 | "template_new_body": "",
69 | "template_up_sujet": "",
70 | "template_up_body": ""
71 | }
72 | ] }
73 | ```
74 | The JSON contains the key "rules", which holds each rule as a hash.
75 | Elements of a rule:
76 | * filters [hash]: filters the rule by field value; you can filter on multiple fields => {"field1": "regexp", "field2": "regexp"}
77 | * field_name [string]: name of the event field in which to search for the regexp value
78 | * value_search [regexp]: regexp value to search for in the selected event field
79 | * subject_filter [string]: When the filter matches, search whether a near event already exists. To do so, the value of this event field is searched in the subject of every incident in the FIR DB. If it is found, the DB already knows this server or client, so verify whether the same event exists for this client. If the value is not found in the incident DB, create a new event in FIR. The search uses "subject == '*value_field_event*'"
80 | * subject_filter_prefix [string]: To avoid false matches on subject_filter you can add a prefix, changing the search to "subject == '*prefix+value_field_event*'"
81 | * subject_filter_sufix [string]: To avoid false matches on subject_filter you can add a suffix, changing the search to "subject == '*value_field_event+sufix*'"
82 | * body_filter [string]: When the subject matched, verify whether the event description already contains the same event. The fingerprint field (plugin logstash-filter-sig) is used for this. If it is found then stop, all is ok; otherwise update the FIR event with the new event for this client/server subject. The search uses "body == '*value_field_event*'" (body is the description field in FIR)
83 | * body_filter_prefix [string]: To avoid false matches on body_filter you can add a prefix, changing the search to "body == '*prefix+value_field_event*'"
84 | * body_filter_sufix [string]: To avoid false matches on body_filter you can add a suffix, changing the search to "body == '*value_field_event+sufix*'"
85 | * count_filter [string]: Marker used to increment the count of identical alerts received.
86 | * severity_add [string]: The name of the event field whose value is copied into the severity field of the FIR event. If empty, severity is set to 1
87 | * fields_create [hash]: contains the information required to create an event in FIR
88 | * actor [numeric]: 1
89 | * category [numeric]: 2
90 | * confidentiality [numeric]: 1
91 | * detection [numeric]: 3
92 | * plan [numeric]: 4
93 | * is_starred [boolean]: false
94 | * is_major [boolean]: false
95 | * is_incident [boolean]: false
96 | * concerned_business_lines [array]: []
97 | * template_new_sujet [path]: The path of the erb template file for a new subject
98 | * template_new_body [path]: The path of the erb template file for a new description/body
99 | * template_up_sujet [path]: The path of the erb template file for an updated subject
100 | * template_up_body [path]: The path of the erb template file for an updated description/body
101 | 
102 | ## Documentation
103 | 
104 | Logstash provides infrastructure to automatically generate documentation for this plugin. We use the asciidoc format to write documentation so any comments in the source code will be first converted into asciidoc and then into html. All plugin documentation are placed under one [central location](http://www.elastic.co/guide/en/logstash/current/).
105 | 
106 | - For formatting code or config example, you can use the asciidoc `[source,ruby]` directive
107 | - For more asciidoc formatting tips, see the excellent reference here https://github.com/elastic/docs#asciidoc-guide
108 | 
109 | ## Need Help?
110 | 
111 | Need help? Try #logstash on freenode IRC or the https://discuss.elastic.co/c/logstash discussion forum.
112 | 
113 | ## Developing
114 | 
115 | ### 1. Plugin Development and Testing
116 | 
117 | #### Code
118 | - To get started, you'll need JRuby with the Bundler gem installed.
119 | 
120 | - Create a new plugin or clone an existing one from the GitHub [logstash-plugins](https://github.com/logstash-plugins) organization. We also provide [example plugins](https://github.com/logstash-plugins?query=example).
121 | 
122 | - Install dependencies
123 | ```sh
124 | bundle install
125 | ```
126 | 
127 | #### Test
128 | 
129 | - Update your dependencies
130 | 
131 | ```sh
132 | bundle install
133 | ```
134 | 
135 | - Run tests
136 | 
137 | ```sh
138 | bundle exec rspec
139 | ```
140 | 
141 | ### 2. Running your unpublished Plugin in Logstash
142 | 
143 | #### 2.1 Run in a local Logstash clone
144 | 
145 | - Edit Logstash `Gemfile` and add the local plugin path, for example:
146 | ```ruby
147 | gem "logstash-filter-awesome", :path => "/your/local/logstash-filter-awesome"
148 | ```
149 | - Install plugin
150 | ```sh
151 | # Logstash 2.3 and higher
152 | bin/logstash-plugin install --no-verify
153 | 
154 | # Prior to Logstash 2.3
155 | bin/plugin install --no-verify
156 | 
157 | ```
158 | - Run Logstash with your plugin
159 | ```sh
160 | bin/logstash -e 'filter {awesome {}}'
161 | ```
162 | At this point any modifications to the plugin code will be applied to this local Logstash setup. After modifying the plugin, simply rerun Logstash.
163 | 
164 | #### 2.2 Run in an installed Logstash
165 | 
166 | You can use the same **2.1** method to run your plugin in an installed Logstash by editing its `Gemfile` and pointing the `:path` to your local plugin development directory or you can build the gem and install it using:
167 | 
168 | - Build your plugin gem
169 | ```sh
170 | gem build logstash-filter-awesome.gemspec
171 | ```
172 | - Install the plugin from the Logstash home
173 | ```sh
174 | # Logstash 2.3 and higher
175 | bin/logstash-plugin install --no-verify
176 | 
177 | # Prior to Logstash 2.3
178 | bin/plugin install --no-verify
179 | 
180 | ```
181 | - Start Logstash and proceed to test the plugin
182 | 
183 | ## Contributing
184 | 
185 | All contributions are welcome: ideas, patches, documentation, bug reports, complaints, and even something you drew up on a napkin.
186 | 
187 | Programming is not a required skill. Whatever you've seen about open source and maintainers or community members saying "send patches or die" - you will not see that here.
188 | 
189 | It is more important to the community that you are able to contribute.
190 | 
191 | For more information about contributing, see the [CONTRIBUTING](https://github.com/elastic/logstash/blob/master/CONTRIBUTING.md) file.
--------------------------------------------------------------------------------
/logstash-output-fir/Rakefile:
--------------------------------------------------------------------------------
1 | require "logstash/devutils/rake"
--------------------------------------------------------------------------------
/logstash-output-fir/lib/logstash/outputs/fir.rb:
--------------------------------------------------------------------------------
1 | # encoding: utf-8
2 | # Output "FIR" sends alerts to the FIR platform (CERT SG)
3 | # can be used with logstash-filter-sig
4 | # Contact: Lionel PRAT (lionel.prat9@gmail.com)
5 | require "logstash/outputs/base"
6 | require "logstash/namespace"
7 | require "json"
8 | require "uri"
9 | require "manticore"
10 | require "time"
11 | require "erb"
12 | 
13 | # Output plugin that creates and updates FIR incidents.
14 | class LogStash::Outputs::Fir < LogStash::Outputs::Base
15 | config_name "fir"
16 | 
17 | # URL to use
18 | config :url_api_fir, :validate => :string, :required => true, :default => "https://127.0.0.1:8000/api/"
19 | #refresh remote FIR information - get closed information & new manual tickets & artefacts db
20 | config :refresh_interval_remote, :validate => :number, :default => 3600
21 | 
22 | # Custom headers to use
23 | # format is `headers => ["X-My-Header", "%{host}"]`
24 | #REPLACE 0000000000000000000000000000 by your token
25 | config :headers, :validate => :hash, :required => true, :default => {"Authorization" => "Token 0000000000000000000000000000", "Content-Type" => "application/json"}
26 | #to use insecure mode, change to '{ :verify => :disable }'
27 | config :ssl_options, :validate => :string, :default => "{ :verify => :disable }"
28 | 
29 | #template body & subject ERB
30 | #template for new alert
31 | config :template_new, :validate => :path, :default => "/etc/logstash/db/template_new.erb"
32 | #template for update alert
33 | config :template_update, :validate => :path, :default => "/etc/logstash/db/template_update.erb"
34 | #template subject for new alert
35 | config :subj_template_new, :validate => :path, :default => "/etc/logstash/db/subject_template_new.erb"
36 | #template subject for update alert
37 | config :subj_template_update, :validate => :path, :default => "/etc/logstash/db/subject_template_update.erb"
38 | #MATCH CONFIGURATION:
39 | config :conffile, :validate => :string, :default => "/etc/logstash/db/conf_fir.json"
40 | #need: filter + first identification in subject + second identification for an existing alert in body + optional template change
41 | #Configuration -- format HASH
42 | #rules [
43 | # {
44 | # filters: {},
45 | # subject_filter: 'name_field_in_event', # for example field src_ip
46 | # subject_filter_prefix: 'string that must match... before the content of field subject filter' #optional
47 | # subject_filter_sufix: 'string that must match... after the content of field subject filter' #optional
48 | # body_filter: 'name_field_in_event', # for example field fingerprint
49 | # body_filter_prefix: 'string that must match... before the content of field body filter' #optional
50 | # body_filter_sufix: 'string that must match... after the content of field body filter' #optional
51 | # severity_add: 'name_field_in_event', # for example field sig_detected_note !!optional!!
52 | # count_filter: ' Count: '
53 | # fields_create: {'name' => value, '' => value} # !!!REQUIRED FOR CREATE: "actor" = 6 & "category" & "confidentiality" & "detection" & "plan" & "is_starred" & "is_major" & "is_incident" & "concerned_business_lines"
54 | # template_new_sujet: 'path', #optional
55 | # template_new_body: 'path', #optional
56 | # template_up_sujet: 'path', #optional
57 | # template_up_body: 'path', #optional
58 | # }
59 | #]
60 | 
61 | # this setting will indicate how frequently
62 | # (in seconds) logstash will check the db file for updates.
63 | config :refresh_interval, :validate => :number, :default => 3600
64 | 
65 | #choice in FIR for subject and body and severity -- API REST
66 | config :subject_field, :validate => :string, :default => "subject"
67 | config :body_field, :validate => :string, :default => "description"
68 | config :severity_field, :validate => :string, :default => "severity"
69 | config :status_field, :validate => :string, :default => "status"
70 | 
71 | concurrency :single
72 | 
73 | public
74 | def register
75 | @fir_conf = []
76 | @incidents_db = {}
77 | @client = Manticore::Client.new(ssl: eval(@ssl_options))
78 | @logger.info("FIR Configuration -- Loading...")
79 | @hash_file = ""
80 | @load_statut = false
81 | load_db
82 | @load_statut = true
83 | @logger.info("finish")
84 | #load template files
85 | @logger.info("FIR templates -- Loading...")
86 | @template_data_n = ""
87 | @template_data_u = ""
88 | @template_subj_n = ""
89 | @template_subj_u = ""
90 | if File.file?(@template_new) && File.file?(@template_update) && File.file?(@subj_template_new) && File.file?(@subj_template_update)
91 | @template_data_n = File.read(@template_new)
92 | @template_data_u = File.read(@template_update)
93 | @template_subj_n = File.read(@subj_template_new)
94 | @template_subj_u = File.read(@subj_template_update)
95 | else
96 | @logger.error("FIR templates not found!")
97 | exit -1
98 | end
99 | if @template_subj_u.empty? or @template_subj_n.empty? or @template_data_u.empty? or @template_data_n.empty?
100 | @logger.error("FIR templates are empty!!")
101 | exit -1
102 | end
103 | @logger.info("finish")
104 | @next_refresh = Time.now + @refresh_interval
105 | @load_statut_r = false
106 | @logger.info("FIR get incident DB -- Loading... could take some time...")
107 | load_incidents
108 | @logger.info("finish")
109 | @load_statut_r = true
110 | @next_refresh = Time.now + @refresh_interval
111 | @next_refresh_remote = Time.now + @refresh_interval_remote
112 | @token_create=true
113 | end # def register
114 | 
115 | public
116 | def multi_receive(events)
117 | events.each {|event| receive(event)}
118 | end
119 | 
120 | def receive(event)
121 | tnow = Time.now
122 | #refresh DB & conf
123 | if @next_refresh < tnow
124 | if @load_statut == true
125 | @load_statut = false
126 | load_db
127 | if File.file?(@template_new) && File.file?(@template_update)
128 | @template_data_n = File.read(@template_new)
129 | @template_data_u = File.read(@template_update)
130 | end
131 | @next_refresh = tnow + @refresh_interval
132 | @load_statut = true
133 | end
134 | end
135 | if @next_refresh_remote < tnow
136 | if @load_statut_r == true
137 | @load_statut_r = false
138 | @logger.info("FIR refresh incident DB -- could take some time...")
139 | load_incidents
140 | @next_refresh_remote = tnow + @refresh_interval_remote
141 | @load_statut_r = true
142 | end
143 | end
144 | sleep(1) until @load_statut_r
145 | sleep(1) until @load_statut
146 | #verify db & conf are OK!
147 | if @fir_conf.is_a?(Array) and not @incidents_db.nil?
148 | #check filter: {'field_name': [] or "" } -- if the event field is numeric, the code converts it to string
149 | #to match one regexp on an event field, use a string value "regexp"
150 | #to match multiple regexps (for array type) on an event field, use an array [regexp1,regexp2,...]; for the match to be ok, every regexp must match at least once
151 | # }
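# A rule matches only if every key of rule['filters'] exists in the event and
# every configured regexp hits. For example (hypothetical values), the filter
#   {"alert_source" => "honeypot", "tags" => ["^sig_", "drop"]}
# matches an event whose alert_source matches /honeypot/ and whose tags array
# contains at least one element matching /^sig_/ and one matching /drop/.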
152 | for rule in @fir_conf
153 | #get keys in rule: fields
154 | eventK = event.to_hash.keys
155 | inter = rule['filters'].keys & eventK
156 | #check if the rule fields are present in the event
157 | if inter.length == rule['filters'].keys.length
158 | #ok fields present - check filter
159 | #check field by field
160 | sig_add = {}
161 | check_sig=false
162 | for kfield in inter
163 | check_sig=false
164 | # field X -- check type
165 | if event.get(kfield).is_a?(Array)
166 | #array type
167 | # is the rule value regexp an Array?
168 | if rule['filters'][kfield].is_a?(Array)
169 | for regexp in rule['filters'][kfield]
170 | check_sig=false
171 | for elem in event.get(kfield)
172 | match = Regexp.new(regexp, nil, 'n').match(elem.to_s)
173 | if not match.nil?
174 | check_sig=true
175 | break
176 | end
177 | end
178 | break unless check_sig
179 | end
180 | else
181 | #rule not array
182 | for elem in event.get(kfield)
183 | match = Regexp.new(rule['filters'][kfield], nil, 'n').match(elem.to_s)
184 | if not match.nil?
185 | check_sig=true
186 | break
187 | end
188 | end
189 | end
190 | else
191 | #other type
192 | # is the rule value regexp an Array?
193 | if rule['filters'][kfield].is_a?(Array)
194 | #array
195 | for regexp in rule['filters'][kfield]
196 | match = Regexp.new(regexp, nil, 'n').match(event.get(kfield).to_s)
197 | if not match.nil?
198 | sig_add[kfield.to_s]="Regexp found #{match}"
199 | check_sig=true
200 | next
201 | end
202 | break unless check_sig
203 | end
204 | else
205 | #other
206 | match = Regexp.new(rule['filters'][kfield], nil, 'n').match(event.get(kfield).to_s)
207 | if not match.nil?
208 | check_sig=true
209 | next
210 | end
211 | end
212 | end
213 | break unless check_sig
214 | end
215 | if check_sig
216 | #filter matched
217 | #check whether the alert already exists
218 | check_if_create=true
219 | #verify that the fields for subject_filter && body_filter are present
220 | if not event.get(rule['subject_filter'].to_s).nil? and not event.get(rule['body_filter'].to_s).nil?
221 | #ok
222 | #don't write at the same time, to avoid db corruption
223 | #check in incident db whether the alert exists
224 | for incident in @incidents_db["results"]
225 | #verify if incident is Open or Closed -- only check open incidents
226 | next if not incident[@status_field].to_s == "O"
227 | #verify if the subject filter is present in the incident DB
228 | next if not incident[@subject_field].include?(rule['subject_filter_prefix'].to_s+event.get(rule['subject_filter'].to_s).to_s+rule['subject_filter_sufix'].to_s)
229 | #verify if the body filter is present in the incident DB
230 | check_if_create=false
231 | #if body matches, break, all is ok -> created and updated
232 | if incident[@body_field].include?(rule['body_filter_prefix'].to_s+event.get(rule['body_filter'].to_s).to_s+rule['body_filter_sufix'].to_s)
233 | break if rule['count_filter'].nil? or rule['count_filter'].empty?
234 | #count ++
235 | #replace rule['body_filter_prefix'].to_s+event.get(rule['body_filter'].to_s).to_s+rule['body_filter_sufix'].to_s+".*COUNT:"
236 | num_al=incident[@body_field].scan(/#{Regexp.escape(rule['body_filter_prefix'].to_s+event.get(rule['body_filter'].to_s).to_s+rule['body_filter_sufix'].to_s)}.*#{Regexp.escape(rule['count_filter'].to_s)}(\d+)/).last
237 | if num_al.nil?
238 | break
239 | else
240 | if num_al.is_a?(Array) and not num_al.empty?
241 | num_al = num_al.first.to_i + 1
242 | incident[@body_field] = incident[@body_field].gsub(/(#{Regexp.escape(rule['body_filter_prefix'].to_s+event.get(rule['body_filter'].to_s).to_s+rule['body_filter_sufix'].to_s)}.*#{Regexp.escape(rule['count_filter'].to_s)})(\d+)/, '\1'+num_al.to_s)
243 | else
244 | break
245 | end
246 | end
247 | url = @url_api_fir + "incidents/" + incident["id"].to_s
248 | begin
249 | response = @client.patch(url, :body => incident.to_json, :headers => @headers)
250 | if response.code < 200 or response.code > 299
251 | log_failure(
252 | "Encountered non-2xx HTTP code #{response.code}",
253 | :response_code => response.code,
254 | :url => url,
255 | :event => event)
256 | end
257 | rescue
258 | @logger.warn("ERROR SEND:", :string => incident.to_json)
259 | end
260 | break
261 | end
262 | #if the body doesn't match, then update
263 | #UPDATE
264 | sleep(1) until @token_create
265 | @token_create=false
266 | #update incident
267 | #change severity if option not empty
268 | if rule['severity_add'] and (event.get(rule['severity_add'].to_s).is_a?(String) or event.get(rule['severity_add'].to_s).is_a?(Numeric))
269 | if incident["severity"] < event.get(rule['severity_add'].to_s).to_i
270 | incident[@severity_field] = event.get(rule['severity_add'].to_s).to_i
271 | end
272 | end
273 | if not rule['template_up_sujet'].nil? and not rule['template_up_sujet'].empty?
274 | incident[@subject_field] = ERB.new(rule['template_up_sujet']).result(binding)
275 | else
276 | incident[@subject_field] = ERB.new(@template_subj_u).result(binding)
277 | end
278 | #keep old content
279 | if not rule['template_up_body'].nil? and not rule['template_up_body'].empty?
280 | incident[@body_field] = ERB.new(rule['template_up_body']).result(binding) + incident[@body_field]
281 | else
282 | incident[@body_field] = ERB.new(@template_data_u).result(binding) + incident[@body_field]
283 | end
284 | url = @url_api_fir + "incidents/" + incident["id"].to_s
285 | begin
286 | response = @client.patch(url, :body => incident.to_json, :headers => @headers)
287 | if response.code < 200 or response.code > 299
288 | log_failure(
289 | "Encountered non-2xx HTTP code #{response.code}",
290 | :response_code => response.code,
291 | :url => url,
292 | :event => event)
293 | else
294 | begin
295 | url = @url_api_fir+"files/"+incident["id"].to_s+"/upload"
296 | bodyfile={"files" => [{"content" => JSON.pretty_generate(event.to_hash),"description" => "Incident details","filename" => "incident_detail-"+tnow.strftime("%Y-%m-%dT%H:%M:%S:%L").to_s+".json"}]}
297 | response = @client.post(url, :body => bodyfile.to_json, :headers => @headers)
298 | @logger.info("Upload file content incident: ", :string => response.body)
299 | rescue
300 | @logger.warn("Upload file content incident: ERROR.", :string => response.body)
301 | end
302 | end
303 | rescue
304 | @logger.warn("ERROR SEND:", :string => incident.to_json)
305 | end
306 | #end - release token
307 | @token_create=true
308 | #break loop: check_if_create is false, stop process => created and updated OK
309 | break
310 | end
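# At this point check_if_create is still true only if no open incident whose
# subject contains prefix+subject_filter_value+sufix was found: the event is
# then a brand-new alert and a FIR incident is created below. Otherwise the
# matching incident was either left alone (same fingerprint already in the
# body, with Count incremented) or updated in place with the update templates.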
311 | if check_if_create and rule['fields_create'].is_a?(Hash)
312 | #create
313 | sleep(1) until @token_create
314 | @token_create=false
315 | body = {}
316 | #create base JSON of the fir incident
317 | rule['fields_create'].each do |jkey,jval|
318 | body[jkey.to_s] = jval
319 | end
320 | body["date"] = tnow.strftime("%Y-%m-%dT%H:%M").to_s # format "2016-01-01T00:00" TODO!!! CHOICE GMT
321 | if rule['severity_add'] and (event.get(rule['severity_add'].to_s).is_a?(String) or event.get(rule['severity_add'].to_s).is_a?(Numeric))
322 | if event.get(rule['severity_add'].to_s).to_i < 5
323 | body[@severity_field] = event.get(rule['severity_add'].to_s).to_i
324 | else
325 | body[@severity_field] = 1
326 | end
327 | else
328 | body[@severity_field] = 1
329 | end
330 | body[@status_field] = "O"
331 | if not rule['template_new_sujet'].nil? and not rule['template_new_sujet'].empty?
332 | body[@subject_field] = ERB.new(rule['template_new_sujet']).result(binding)
333 | else
334 | body[@subject_field] = ERB.new(@template_subj_n).result(binding)
335 | end
336 | #keep old content
337 | if not rule['template_new_body'].nil? and not rule['template_new_body'].empty?
338 | body[@body_field] = ERB.new(rule['template_new_body']).result(binding)
339 | else
340 | body[@body_field] = ERB.new(@template_data_n).result(binding)
341 | end
342 | url = @url_api_fir+"incidents"
343 | begin
344 | response = @client.post(url, :body => body.to_json, :headers => @headers)
345 | if response.code >= 200 and response.code <= 299
346 | #body
347 | begin
348 | add_inc = JSON.parse(response.body)
349 | (@incidents_db["results"] ||= []) << add_inc
350 | begin
351 | url = @url_api_fir+"files/"+add_inc["id"].to_s+"/upload"
352 | bodyfile={"files" => [{"content" => JSON.pretty_generate(event.to_hash),"description" => "Incident details","filename" => "incident_detail-"+tnow.strftime("%Y-%m-%dT%H:%M:%S:%L").to_s+".json"}]}
353 | response = @client.post(url, :body => bodyfile.to_json, :headers => @headers)
354 | @logger.info("Upload file content incident: ", :string => response.body)
355 | rescue
356 | @logger.warn("Upload file content incident: ERROR.", :string => response.body)
357 | end
358 | rescue
359 | @logger.warn("JSON CMD ERROR PARSE:", :string => response.body)
360 | end
361 | else
362 | log_failure(
363 | "Encountered non-2xx HTTP code #{response.code}",
364 | :response_code => response.code,
365 | :url => url,
366 | :response => response,
367 | :body => body)
368 | end
369 | rescue
370 | @logger.warn("ERROR SEND:", :string => body.to_json)
371 | end
372 | #end - release token
373 | @token_create=true
374 | else
375 | break # exit of rules check & ok => stop
376 | end
377 | end
378 | end
379 | end
380 | end
381 | end
382 | end
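# Configuration reloads are cheap: load_db (below) recomputes the SHA256 of
# conffile on every refresh tick and re-parses the JSON rules only when the
# digest differs from the previous one, so touching the file without changing
# its content does not trigger a reload. Note: Digest::SHA256 comes from the
# JRuby stdlib; add require "digest" if your environment does not preload it.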
383 | 
384 | def close
385 | @client.close
386 | end
387 | 
388 | private
389 | def load_db
390 | if !File.exists?(@conffile)
391 | @logger.warn("Configuration file read failure, stop loading", :path => @conffile)
392 | return
393 | end
394 | tmp_hash = Digest::SHA256.hexdigest File.read @conffile
395 | if not tmp_hash == @hash_file
396 | @hash_file = tmp_hash
397 | begin
398 | tmp_conf = JSON.parse( IO.read(@conffile, encoding:'utf-8') )
399 | unless tmp_conf.nil?
400 | if tmp_conf['rules'].is_a?(Array)
401 | for rule in tmp_conf['rules']
402 | if not rule['template_new_sujet'].nil? and not rule['template_new_sujet'].empty? and !File.exists?(rule['template_new_sujet'].to_s)
403 | @logger.error("Template in configuration file does not exist", :path => rule['template_new_sujet'].to_s)
404 | return
405 | elsif not rule['template_new_sujet'].nil? and not rule['template_new_sujet'].empty?
406 | rule['template_new_sujet'] = File.read(rule['template_new_sujet'].to_s)
407 | end
408 | if not rule['template_new_body'].nil? and not rule['template_new_body'].empty? and !File.exists?(rule['template_new_body'].to_s)
409 | @logger.error("Template in configuration file does not exist", :path => rule['template_new_body'].to_s)
410 | return
411 | elsif not rule['template_new_body'].nil? and not rule['template_new_body'].empty?
412 | rule['template_new_body'] = File.read(rule['template_new_body'].to_s)
413 | end
414 | if not rule['template_up_sujet'].nil? and not rule['template_up_sujet'].empty? and !File.exists?(rule['template_up_sujet'].to_s)
415 | @logger.error("Template in configuration file does not exist", :path => rule['template_up_sujet'].to_s)
416 | return
417 | elsif not rule['template_up_sujet'].nil? and not rule['template_up_sujet'].empty?
418 | rule['template_up_sujet'] = File.read(rule['template_up_sujet'].to_s)
419 | end
420 | if not rule['template_up_body'].nil? and not rule['template_up_body'].empty? and !File.exists?(rule['template_up_body'].to_s)
421 | @logger.error("Template in configuration file does not exist", :path => rule['template_up_body'].to_s)
422 | return
423 | elsif not rule['template_up_body'].nil? and not rule['template_up_body'].empty?
424 | rule['template_up_body'] = File.read(rule['template_up_body'].to_s)
425 | end
426 | if rule['subject_filter_prefix'].nil?
427 | rule['subject_filter_prefix'] = ""
428 | end
429 | if rule['subject_filter_sufix'].nil?
430 | rule['subject_filter_sufix'] = ""
431 | end
432 | if rule['body_filter_prefix'].nil?
433 | rule['body_filter_prefix'] = ""
434 | end
435 | if rule['body_filter_sufix'].nil?
436 | rule['body_filter_sufix'] = ""
437 | end
438 | end
439 | @fir_conf = tmp_conf['rules']
440 | end
441 | end
442 | @logger.info("refreshing DB FIR condition file")
443 | rescue
444 | @logger.error("JSON CONF FIR -- PARSE ERROR")
445 | end
446 | end
447 | end
448 | 
449 | # This is split into a separate method mostly to help testing
450 | def log_failure(message, opts)
451 | @logger.error("[HTTP Output Failure] #{message}", opts)
452 | end
453 | 
454 | def load_incidents
455 | # Send the request
456 | stop_load = true
457 | incidents_db_tmp = {}
458 | first = true
459 | error_load = false
460 | url = @url_api_fir+"incidents?format=json"
461 | while stop_load
462 | response = @client.get(url, :headers => @headers)
463 | #body
464 | @logger.info("BODY: #{response.body}")
465 | begin
466 | field_next = ""
467 | if first
468 | incidents_db_tmp = JSON.parse(response.body)
469 | field_next = incidents_db_tmp["next"]
470 | first = false
471 | else
472 | tmp_db = JSON.parse(response.body)
473 | incidents_db_tmp["results"] = incidents_db_tmp["results"] + tmp_db["results"]
474 | field_next = tmp_db["next"]
475 | end
476 | if field_next != nil
477 | url = field_next
478 | else
479 | stop_load = false
480 | end
481 | rescue
482 | @logger.warn("JSON CMD ERROR PARSE:", :string => response.body)
483 | stop_load = false
484 | error_load = true
485 | end
486 | end
487 | @incidents_db = incidents_db_tmp unless error_load
488 | @logger.info("INCIDENT DB LOADED") unless error_load
489 | end
490 | end # class LogStash::Outputs::Fir
--------------------------------------------------------------------------------
/logstash-output-fir/logstash-output-fir.gemspec:
--------------------------------------------------------------------------------
1 | Gem::Specification.new do |s|
2 | s.name = 'logstash-output-fir'
3 | s.version = "0.9.0"
4 | s.licenses = ["Apache License (2.0)"]
5 | s.summary = "This fir output sends alerts from the sig filter to FIR (https://github.com/certsocietegenerale/FIR)."
6 | s.description = "This gem is a Logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/logstash-plugin install fir. This gem is not a stand-alone program"
7 | s.authors = ["Lionel PRAT"]
8 | s.email = 'lionel.prat9@gmail.com'
9 | s.homepage = "http://www.elastic.co/guide/en/logstash/current/index.html"
10 | s.require_paths = ["lib"]
11 | 
12 | # Files
13 | s.files = Dir['lib/**/*','spec/**/*','vendor/**/*','*.gemspec','*.md','CONTRIBUTORS','Gemfile','LICENSE','NOTICE.TXT']
14 | # Tests
15 | s.test_files = s.files.grep(%r{^(test|spec|features)/})
16 | 
17 | # Special flag to let us know this is actually a logstash plugin
18 | s.metadata = { "logstash_plugin" => "true", "logstash_group" => "output" }
19 | 
20 | # Gem dependencies
21 | 
22 | s.add_runtime_dependency "logstash-core-plugin-api", ">= 1.60", "<= 2.99"
23 | s.add_development_dependency 'logstash-devutils'
24 | end
25 | 
--------------------------------------------------------------------------------
/logstash-output-fir/sample_conf/conf_fir.json:
--------------------------------------------------------------------------------
1 | { "rules": [
2 | {
3 | "filters": {"alert_source": "honeypot"},
4 | "subject_filter": "src_ip",
5 | "subject_filter_prefix": "-",
6 | "subject_filter_sufix": "-",
7 | "body_filter": "fingerprint",
8 | "body_filter_prefix": "",
9 | "body_filter_sufix": " -> SCORE",
10 | "count_filter": " Count: ",
11 | "severity_add": "sig_detected_note",
12 | "fields_create": {"actor": 6, "category": 26,"confidentiality": 0,"detection": 36, "plan": 37,"is_starred": false,"is_major": false,"is_incident": false,"concerned_business_lines": []},
13 | "template_new_sujet": "",
14 | "template_new_body": "",
15 | "template_up_sujet": "",
16 | "template_up_body": ""
17 | }
18 | ] }
--------------------------------------------------------------------------------
/logstash-output-fir/spec/outputs/fir_spec.rb:
--------------------------------------------------------------------------------
1 | # encoding: utf-8
2 | require "logstash/devutils/rspec/spec_helper"
3 | require "logstash/outputs/fir"
4 | require "logstash/codecs/plain"
5 | require "logstash/event"
6 | 
7 | describe LogStash::Outputs::Fir do
8 | let(:sample_event) { LogStash::Event.new }
9 | let(:output) { LogStash::Outputs::Fir.new }
10 | 
11 | before do
12 | output.register
13 | end
14 | 
15 | describe "receive message" do
16 | subject { output.receive(sample_event) }
17 | 
18 | it "returns a string" do
19 | expect(subject).to eq("Event received")
20 | end
21 | end
22 | end
--------------------------------------------------------------------------------
/logstash-output-fir/template_erb/subject_template_new.erb:
--------------------------------------------------------------------------------
1 | Alert automatic on -<%= event.get("src_ip").to_s %>- New at <%= tnow.strftime("%Y-%m-%dT%H:%M").to_s %>
--------------------------------------------------------------------------------
/logstash-output-fir/template_erb/subject_template_update.erb:
--------------------------------------------------------------------------------
1 | <%=
2 | if incident[@subject_field] =~ /New at/
3 | tmp = incident[@subject_field].gsub(/New at \S+/, "Update at " + tnow.strftime("%Y-%m-%dT%H:%M").to_s + " - 2 alerts")
4 | else
5 | tmp = incident[@subject_field].gsub(/Update at \S+/, "Update at " + tnow.strftime("%Y-%m-%dT%H:%M").to_s)
6 | num_al=incident[@subject_field].scan(/- \d+ alerts/).last
7 | if num_al
8 | num_al = num_al.scan(/\d+/).first.to_i + 1
9 | tmp = incident[@subject_field].gsub(/- \d+ alerts/, "- " + num_al.to_s + " alerts")
10 | end
11 | end
12 | tmp
13 | %>
14 | 
15 | 
--------------------------------------------------------------------------------
/logstash-output-fir/template_erb/template_new.erb:
--------------------------------------------------------------------------------
1 | # Alert automatic SIEM on <%= event.get("src_ip").to_s %>
2 | 
3 | ## Begin alert [source: <%= event.get("alert_source").to_s %>]: <%= tnow.strftime("%Y-%m-%dT%H:%M:%S:%L%z").to_s %>
4 | 
5 | ### Tags: <%= event.get("tags").to_s %>
6 | 
7 | ### <%= event.get("fingerprint").to_s + " -> SCORE:" + event.get("sig_detected_note").to_s + " Count: 1 with ID(s): " + event.get("sig_detected_id").to_s %>
8 | 
9 | Event in file: <%= "incident_detail-"+tnow.strftime("%Y-%m-%dT%H:%M:%S:%L").to_s+".json" %>
10 | 
11 | #### Links to get information on IP & Domain name
12 | 
13 | [Info domain VT](https://www.virustotal.com/fr/domain/<%= event.get("dst_domain") %>/information/)
14 | [Info IP VT](https://www.virustotal.com/fr/ip-address/<%= event.get("dst_ip") %>/information/)
15 | [Info IP OTX](https://otx.alienvault.com/indicator/ip/<%= event.get("dst_ip") %>/)
16 | 
17 | #### False positive
18 | 
19 | [Add false positive](https://myip_for_false:8002/add_fp.php?fingerprint=<%= event.get("fingerprint") %>)
--------------------------------------------------------------------------------
/logstash-output-fir/template_erb/template_update.erb:
--------------------------------------------------------------------------------
1 | * * *
2 | 
3 | ## Update automatic [source: <%= event.get("alert_source").to_s %>]: <%= tnow.strftime("%Y-%m-%dT%H:%M:%S:%L%z").to_s %>
4 | 
5 | ### Tags: <%= event.get("tags").to_s %>
6 | 
7 | ### <%= event.get("fingerprint").to_s + " -> SCORE:" + event.get("sig_detected_note").to_s + " Count: 1 with ID(s): " + event.get("sig_detected_id").to_s %>
8 | 
9 | Event in file: <%= "incident_detail-"+tnow.strftime("%Y-%m-%dT%H:%M:%S:%L").to_s+".json" %>
10 | 
11 | #### Links to get information on IP & Domain name
12 | 
13 | [Info domain VT](https://www.virustotal.com/fr/domain/<%= event.get("dst_domain") %>/information/)
14 | [Info IP VT](https://www.virustotal.com/fr/ip-address/<%= event.get("dst_ip") %>/information/)
15 | [Info IP OTX](https://otx.alienvault.com/indicator/ip/<%= event.get("dst_ip") %>/)
16 | 
17 | #### False positive
18 | 
19 | [Add false positive](https://myip_for_false:8002/add_fp.php?fingerprint=<%= event.get("fingerprint") %>)
--------------------------------------------------------------------------------
/sample-architecture/Architecture-sample.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/lprat/logstash-plugins/c95e58efc7babcb7a018125f513736450a06b5e1/sample-architecture/Architecture-sample.png
--------------------------------------------------------------------------------
/sample-architecture/Diagramme-archi.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/lprat/logstash-plugins/c95e58efc7babcb7a018125f513736450a06b5e1/sample-architecture/Diagramme-archi.png
--------------------------------------------------------------------------------