├── .github └── PULL_REQUEST_TEMPLATE.md ├── .gitignore ├── CHANGES ├── CMakeLists.txt ├── COPYING ├── MAINTAINER ├── Makefile ├── NOTICE ├── README.md ├── VERSION ├── cmake ├── FindLibRDKafka.cmake └── FindOpenSSL.cmake ├── configure ├── configure.plugin ├── dev_utilities └── release-utils │ └── metron-bro-kafka-rc-check ├── docker ├── README.md ├── containers │ ├── kafka │ │ └── Dockerfile │ ├── zeek │ │ ├── .screenrc │ │ ├── Dockerfile │ │ ├── Makefile │ │ ├── requirements-to-freeze.txt │ │ └── requirements.txt │ └── zookeeper │ │ └── Dockerfile ├── data │ └── .gitignore ├── docker-compose.yml ├── finish_end_to_end.sh ├── in_docker_scripts │ ├── build_plugin.sh │ ├── configure_plugin.sh │ └── process_data_file.sh ├── run_end_to_end.sh ├── scripts │ ├── analyze_results.sh │ ├── docker_execute_build_plugin.sh │ ├── docker_execute_configure_plugin.sh │ ├── docker_execute_create_topic_in_kafka.sh │ ├── docker_execute_process_data_file.sh │ ├── docker_execute_shell.sh │ ├── docker_run_consume_kafka.sh │ ├── docker_run_get_offset_kafka.sh │ ├── download_sample_pcaps.sh │ ├── print_results.sh │ └── split_kafka_output_by_log.sh └── test_output │ └── .gitignore ├── scripts ├── Apache │ └── Kafka │ │ ├── __load__.zeek │ │ └── logs-to-kafka.zeek ├── __load__.zeek └── init.zeek ├── src ├── KafkaWriter.cc ├── KafkaWriter.h ├── Plugin.cc ├── Plugin.h ├── TaggedJSON.cc ├── TaggedJSON.h ├── events.bif ├── kafka.bif └── kafka_const.bif ├── tests ├── .gitignore ├── Baseline │ ├── kafka.l2s-l2e-no-overlap │ │ └── output │ ├── kafka.l2s-set-l2e-set │ │ └── output │ ├── kafka.l2s-set-l2e-unset │ │ └── output │ ├── kafka.l2s-unset-l2e-set │ │ └── output │ ├── kafka.l2s-unset-l2e-unset │ │ └── output │ ├── kafka.resolved-topic-config │ │ └── output │ ├── kafka.resolved-topic-default │ │ └── output │ ├── kafka.resolved-topic-override-and-config │ │ └── output │ ├── kafka.resolved-topic-override-only │ │ └── output │ ├── kafka.send-all-active-logs-l2e-set │ │ └── output │ ├── kafka.send-all-active-logs-l2e-unset │ │ └── output │ ├── kafka.send-all-active-logs-l2s-set-l2e-set │ │ └── output │ ├── kafka.send-all-active-logs-l2s-set-l2e-unset │ │ └── output │ └── kafka.show-plugin │ │ └── output ├── Makefile ├── Scripts │ ├── diff-remove-timestamps │ └── get-zeek-env ├── btest.cfg ├── kafka │ ├── l2s-l2e-no-overlap.zeek │ ├── l2s-set-l2e-set.zeek │ ├── l2s-set-l2e-unset.zeek │ ├── l2s-unset-l2e-set.zeek │ ├── l2s-unset-l2e-unset.zeek │ ├── resolved-topic-config.zeek │ ├── resolved-topic-default.zeek │ ├── resolved-topic-override-and-config.zeek │ ├── resolved-topic-override-only.zeek │ ├── send-all-active-logs-l2e-set.zeek │ ├── send-all-active-logs-l2e-unset.zeek │ ├── send-all-active-logs-l2s-set-l2e-set.zeek │ ├── send-all-active-logs-l2s-set-l2e-unset.zeek │ └── show-plugin.zeek ├── pcaps │ └── exercise-traffic.pcap └── random.seed └── zkg.meta /.github/PULL_REQUEST_TEMPLATE.md: -------------------------------------------------------------------------------- 1 | ## Contributor Comments 2 | [Please place any comments here. A description of the problem/enhancement, how to reproduce the issue, your testing methodology, etc.] 3 | 4 | 5 | ## Pull Request Checklist 6 | 7 | Thank you for submitting a contribution to Apache Metron's Bro kafka writer plugin. 8 | 9 | In order to streamline the review of the contribution we ask you follow these guidelines and ask you to double check the following: 10 | 11 | ### For all changes: 12 | - [ ] Is there a JIRA ticket associated with this PR? 
If not one needs to be created at [Metron Jira](https://issues.apache.org/jira/browse/METRON/?selectedTab=com.atlassian.jira.jira-projects-plugin:summary-panel). 13 | - [ ] Does your PR title start with METRON-XXXX where XXXX is the JIRA number you are trying to resolve? Pay particular attention to the hyphen "-" character. 14 | - [ ] Has your PR been rebased against the latest commit within the target branch (typically master)? 15 | 16 | ### For code changes: 17 | - [ ] Have you included steps to reproduce the behavior or problem that is being changed or addressed? 18 | - [ ] Have you included steps or a guide to how the change may be verified and tested manually? 19 | - [ ] Have you ensured that the full suite of tests and checks have been executed via: 20 | ``` 21 | bro-pkg test $GITHUB_USERNAME/metron-bro-plugin-kafka --version $BRANCH 22 | ``` 23 | - [ ] Have you written or updated unit tests and or integration tests to verify your changes? 24 | - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)? 25 | - [ ] Have you verified the basic functionality of the build by building and running locally with Apache Metron's [Vagrant full-dev environment](https://github.com/apache/metron/tree/master/metron-deployment/development/centos6) or the equivalent? 26 | 27 | -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | # ide stuff 2 | .idea 3 | *.iml 4 | *.iws 5 | /cmake-build-* 6 | .state 7 | build 8 | 9 | # Log files 10 | *.log 11 | 12 | 13 | # Created by https://www.gitignore.io/api/vim,c++,emacs,git,macos 14 | # Edit at https://www.gitignore.io/?templates=vim,c++,emacs,git,macos 15 | 16 | ### C++ ### 17 | # Prerequisites 18 | *.d 19 | 20 | # Compiled Object files 21 | *.slo 22 | *.lo 23 | *.o 24 | *.obj 25 | 26 | # Precompiled Headers 27 | *.gch 28 | *.pch 29 | 30 | # Compiled Dynamic libraries 31 | *.so 32 | *.dylib 33 | *.dll 34 | 35 | # Fortran module files 36 | *.mod 37 | *.smod 38 | 39 | # Compiled Static libraries 40 | *.lai 41 | *.la 42 | *.a 43 | *.lib 44 | 45 | # Executables 46 | *.exe 47 | *.out 48 | *.app 49 | 50 | ### Emacs ### 51 | # -*- mode: gitignore; -*- 52 | *~ 53 | \#*\# 54 | /.emacs.desktop 55 | /.emacs.desktop.lock 56 | *.elc 57 | auto-save-list 58 | tramp 59 | .\#* 60 | 61 | # Org-mode 62 | .org-id-locations 63 | *_archive 64 | 65 | # flymake-mode 66 | *_flymake.* 67 | 68 | # eshell files 69 | /eshell/history 70 | /eshell/lastdir 71 | 72 | # elpa packages 73 | /elpa/ 74 | 75 | # reftex files 76 | *.rel 77 | 78 | # AUCTeX auto folder 79 | /auto/ 80 | 81 | # cask packages 82 | .cask/ 83 | dist/ 84 | 85 | # Flycheck 86 | flycheck_*.el 87 | 88 | # server auth directory 89 | /server/ 90 | 91 | # projectiles files 92 | .projectile 93 | 94 | # directory configuration 95 | .dir-locals.el 96 | 97 | # network security 98 | /network-security.data 99 | 100 | 101 | ### Git ### 102 | # Created by git for backups. 
To disable backups in Git: 103 | # $ git config --global mergetool.keepBackup false 104 | *.orig 105 | 106 | # Created by git when using merge tools for conflicts 107 | *.BACKUP.* 108 | *.BASE.* 109 | *.LOCAL.* 110 | *.REMOTE.* 111 | *_BACKUP_*.txt 112 | *_BASE_*.txt 113 | *_LOCAL_*.txt 114 | *_REMOTE_*.txt 115 | 116 | ### macOS ### 117 | # General 118 | .DS_Store 119 | .AppleDouble 120 | .LSOverride 121 | 122 | # Icon must end with two \r 123 | Icon 124 | 125 | # Thumbnails 126 | ._* 127 | 128 | # Files that might appear in the root of a volume 129 | .DocumentRevisions-V100 130 | .fseventsd 131 | .Spotlight-V100 132 | .TemporaryItems 133 | .Trashes 134 | .VolumeIcon.icns 135 | .com.apple.timemachine.donotpresent 136 | 137 | # Directories potentially created on remote AFP share 138 | .AppleDB 139 | .AppleDesktop 140 | Network Trash Folder 141 | Temporary Items 142 | .apdisk 143 | 144 | ### Vim ### 145 | # Swap 146 | [._]*.s[a-v][a-z] 147 | [._]*.sw[a-p] 148 | [._]s[a-rt-v][a-z] 149 | [._]ss[a-gi-z] 150 | [._]sw[a-p] 151 | 152 | # Session 153 | Session.vim 154 | Sessionx.vim 155 | 156 | # Temporary 157 | .netrwhist 158 | 159 | # Auto-generated tag files 160 | tags 161 | 162 | # Persistent undo 163 | [._]*.un~ 164 | 165 | # Coc configuration directory 166 | .vim 167 | 168 | # End of https://www.gitignore.io/api/vim,c++,emacs,git,macos 169 | -------------------------------------------------------------------------------- /CHANGES: -------------------------------------------------------------------------------- 1 | # 2 | # Licensed to the Apache Software Foundation (ASF) under one or more 3 | # contributor license agreements. See the NOTICE file distributed with 4 | # this work for additional information regarding copyright ownership. 5 | # The ASF licenses this file to You under the Apache License, Version 2.0 6 | # (the "License"); you may not use this file except in compliance with 7 | # the License. You may obtain a copy of the License at 8 | # 9 | # http://www.apache.org/licenses/LICENSE-2.0 10 | # 11 | # Unless required by applicable law or agreed to in writing, software 12 | # distributed under the License is distributed on an "AS IS" BASIS, 13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | # See the License for the specific language governing permissions and 15 | # limitations under the License. 16 | # 17 | -------------------------------------------------------------------------------- /CMakeLists.txt: -------------------------------------------------------------------------------- 1 | # 2 | # Licensed to the Apache Software Foundation (ASF) under one or more 3 | # contributor license agreements. See the NOTICE file distributed with 4 | # this work for additional information regarding copyright ownership. 5 | # The ASF licenses this file to You under the Apache License, Version 2.0 6 | # (the "License"); you may not use this file except in compliance with 7 | # the License. You may obtain a copy of the License at 8 | # 9 | # http://www.apache.org/licenses/LICENSE-2.0 10 | # 11 | # Unless required by applicable law or agreed to in writing, software 12 | # distributed under the License is distributed on an "AS IS" BASIS, 13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | # See the License for the specific language governing permissions and 15 | # limitations under the License. 
16 | # 17 | 18 | cmake_minimum_required(VERSION 3.0 FATAL_ERROR) 19 | project(ZeekPlugin_Kafka) 20 | include(ZeekPlugin) 21 | find_package(LibRDKafka) 22 | find_package(OpenSSL) 23 | 24 | if (LIBRDKAFKA_FOUND AND OPENSSL_FOUND) 25 | include_directories(BEFORE ${LibRDKafka_INCLUDE_DIR} ${OpenSSL_INCLUDE_DIR}) 26 | zeek_plugin_begin(APACHE KAFKA) 27 | zeek_plugin_cc(src/KafkaWriter.cc) 28 | zeek_plugin_cc(src/Plugin.cc) 29 | zeek_plugin_cc(src/TaggedJSON.cc) 30 | zeek_plugin_bif(src/kafka.bif) 31 | zeek_plugin_bif(src/events.bif) 32 | zeek_plugin_dist_files(README CHANGES COPYING VERSION) 33 | zeek_plugin_link_library(${LibRDKafka_LIBRARIES}) 34 | zeek_plugin_link_library(${LibRDKafka_C_LIBRARIES}) 35 | zeek_plugin_link_library(${OpenSSL_LIBRARIES}) 36 | zeek_plugin_end() 37 | 38 | elseif (NOT LIBRDKAFKA_FOUND) 39 | message(FATAL_ERROR "LibRDKafka not found.") 40 | 41 | elseif (NOT OPENSSL_FOUND) 42 | message(FATAL_ERROR "OpenSSL not found.") 43 | 44 | endif () 45 | -------------------------------------------------------------------------------- /COPYING: -------------------------------------------------------------------------------- 1 | Apache License 2 | Version 2.0, January 2004 3 | http://www.apache.org/licenses/ 4 | 5 | TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 6 | 7 | 1. Definitions. 8 | 9 | "License" shall mean the terms and conditions for use, reproduction, 10 | and distribution as defined by Sections 1 through 9 of this document. 11 | 12 | "Licensor" shall mean the copyright owner or entity authorized by 13 | the copyright owner that is granting the License. 14 | 15 | "Legal Entity" shall mean the union of the acting entity and all 16 | other entities that control, are controlled by, or are under common 17 | control with that entity. For the purposes of this definition, 18 | "control" means (i) the power, direct or indirect, to cause the 19 | direction or management of such entity, whether by contract or 20 | otherwise, or (ii) ownership of fifty percent (50%) or more of the 21 | outstanding shares, or (iii) beneficial ownership of such entity. 22 | 23 | "You" (or "Your") shall mean an individual or Legal Entity 24 | exercising permissions granted by this License. 25 | 26 | "Source" form shall mean the preferred form for making modifications, 27 | including but not limited to software source code, documentation 28 | source, and configuration files. 29 | 30 | "Object" form shall mean any form resulting from mechanical 31 | transformation or translation of a Source form, including but 32 | not limited to compiled object code, generated documentation, 33 | and conversions to other media types. 34 | 35 | "Work" shall mean the work of authorship, whether in Source or 36 | Object form, made available under the License, as indicated by a 37 | copyright notice that is included in or attached to the work 38 | (an example is provided in the Appendix below). 39 | 40 | "Derivative Works" shall mean any work, whether in Source or Object 41 | form, that is based on (or derived from) the Work and for which the 42 | editorial revisions, annotations, elaborations, or other modifications 43 | represent, as a whole, an original work of authorship. For the purposes 44 | of this License, Derivative Works shall not include works that remain 45 | separable from, or merely link (or bind by name) to the interfaces of, 46 | the Work and Derivative Works thereof. 
47 | 48 | "Contribution" shall mean any work of authorship, including 49 | the original version of the Work and any modifications or additions 50 | to that Work or Derivative Works thereof, that is intentionally 51 | submitted to Licensor for inclusion in the Work by the copyright owner 52 | or by an individual or Legal Entity authorized to submit on behalf of 53 | the copyright owner. For the purposes of this definition, "submitted" 54 | means any form of electronic, verbal, or written communication sent 55 | to the Licensor or its representatives, including but not limited to 56 | communication on electronic mailing lists, source code control systems, 57 | and issue tracking systems that are managed by, or on behalf of, the 58 | Licensor for the purpose of discussing and improving the Work, but 59 | excluding communication that is conspicuously marked or otherwise 60 | designated in writing by the copyright owner as "Not a Contribution." 61 | 62 | "Contributor" shall mean Licensor and any individual or Legal Entity 63 | on behalf of whom a Contribution has been received by Licensor and 64 | subsequently incorporated within the Work. 65 | 66 | 2. Grant of Copyright License. Subject to the terms and conditions of 67 | this License, each Contributor hereby grants to You a perpetual, 68 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 69 | copyright license to reproduce, prepare Derivative Works of, 70 | publicly display, publicly perform, sublicense, and distribute the 71 | Work and such Derivative Works in Source or Object form. 72 | 73 | 3. Grant of Patent License. Subject to the terms and conditions of 74 | this License, each Contributor hereby grants to You a perpetual, 75 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 76 | (except as stated in this section) patent license to make, have made, 77 | use, offer to sell, sell, import, and otherwise transfer the Work, 78 | where such license applies only to those patent claims licensable 79 | by such Contributor that are necessarily infringed by their 80 | Contribution(s) alone or by combination of their Contribution(s) 81 | with the Work to which such Contribution(s) was submitted. If You 82 | institute patent litigation against any entity (including a 83 | cross-claim or counterclaim in a lawsuit) alleging that the Work 84 | or a Contribution incorporated within the Work constitutes direct 85 | or contributory patent infringement, then any patent licenses 86 | granted to You under this License for that Work shall terminate 87 | as of the date such litigation is filed. 88 | 89 | 4. Redistribution. 
You may reproduce and distribute copies of the 90 | Work or Derivative Works thereof in any medium, with or without 91 | modifications, and in Source or Object form, provided that You 92 | meet the following conditions: 93 | 94 | (a) You must give any other recipients of the Work or 95 | Derivative Works a copy of this License; and 96 | 97 | (b) You must cause any modified files to carry prominent notices 98 | stating that You changed the files; and 99 | 100 | (c) You must retain, in the Source form of any Derivative Works 101 | that You distribute, all copyright, patent, trademark, and 102 | attribution notices from the Source form of the Work, 103 | excluding those notices that do not pertain to any part of 104 | the Derivative Works; and 105 | 106 | (d) If the Work includes a "NOTICE" text file as part of its 107 | distribution, then any Derivative Works that You distribute must 108 | include a readable copy of the attribution notices contained 109 | within such NOTICE file, excluding those notices that do not 110 | pertain to any part of the Derivative Works, in at least one 111 | of the following places: within a NOTICE text file distributed 112 | as part of the Derivative Works; within the Source form or 113 | documentation, if provided along with the Derivative Works; or, 114 | within a display generated by the Derivative Works, if and 115 | wherever such third-party notices normally appear. The contents 116 | of the NOTICE file are for informational purposes only and 117 | do not modify the License. You may add Your own attribution 118 | notices within Derivative Works that You distribute, alongside 119 | or as an addendum to the NOTICE text from the Work, provided 120 | that such additional attribution notices cannot be construed 121 | as modifying the License. 122 | 123 | You may add Your own copyright statement to Your modifications and 124 | may provide additional or different license terms and conditions 125 | for use, reproduction, or distribution of Your modifications, or 126 | for any such Derivative Works as a whole, provided Your use, 127 | reproduction, and distribution of the Work otherwise complies with 128 | the conditions stated in this License. 129 | 130 | 5. Submission of Contributions. Unless You explicitly state otherwise, 131 | any Contribution intentionally submitted for inclusion in the Work 132 | by You to the Licensor shall be under the terms and conditions of 133 | this License, without any additional terms or conditions. 134 | Notwithstanding the above, nothing herein shall supersede or modify 135 | the terms of any separate license agreement you may have executed 136 | with Licensor regarding such Contributions. 137 | 138 | 6. Trademarks. This License does not grant permission to use the trade 139 | names, trademarks, service marks, or product names of the Licensor, 140 | except as required for reasonable and customary use in describing the 141 | origin of the Work and reproducing the content of the NOTICE file. 142 | 143 | 7. Disclaimer of Warranty. Unless required by applicable law or 144 | agreed to in writing, Licensor provides the Work (and each 145 | Contributor provides its Contributions) on an "AS IS" BASIS, 146 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or 147 | implied, including, without limitation, any warranties or conditions 148 | of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A 149 | PARTICULAR PURPOSE. 
You are solely responsible for determining the 150 | appropriateness of using or redistributing the Work and assume any 151 | risks associated with Your exercise of permissions under this License. 152 | 153 | 8. Limitation of Liability. In no event and under no legal theory, 154 | whether in tort (including negligence), contract, or otherwise, 155 | unless required by applicable law (such as deliberate and grossly 156 | negligent acts) or agreed to in writing, shall any Contributor be 157 | liable to You for damages, including any direct, indirect, special, 158 | incidental, or consequential damages of any character arising as a 159 | result of this License or out of the use or inability to use the 160 | Work (including but not limited to damages for loss of goodwill, 161 | work stoppage, computer failure or malfunction, or any and all 162 | other commercial damages or losses), even if such Contributor 163 | has been advised of the possibility of such damages. 164 | 165 | 9. Accepting Warranty or Additional Liability. While redistributing 166 | the Work or Derivative Works thereof, You may choose to offer, 167 | and charge a fee for, acceptance of support, warranty, indemnity, 168 | or other liability obligations and/or rights consistent with this 169 | License. However, in accepting such obligations, You may act only 170 | on Your own behalf and on Your sole responsibility, not on behalf 171 | of any other Contributor, and only if You agree to indemnify, 172 | defend, and hold each Contributor harmless for any liability 173 | incurred by, or claims asserted against, such Contributor by reason 174 | of your accepting any such warranty or additional liability. 175 | 176 | END OF TERMS AND CONDITIONS 177 | 178 | APPENDIX: How to apply the Apache License to your work. 179 | 180 | To apply the Apache License to your work, attach the following 181 | boilerplate notice, with the fields enclosed by brackets "{}" 182 | replaced with your own identifying information. (Don't include 183 | the brackets!) The text should be enclosed in the appropriate 184 | comment syntax for the file format. We also recommend that a 185 | file or class name and description of purpose be included on the 186 | same "printed page" as the copyright notice for easier 187 | identification within third-party archives. 188 | 189 | Copyright {yyyy} {name of copyright owner} 190 | 191 | Licensed under the Apache License, Version 2.0 (the "License"); 192 | you may not use this file except in compliance with the License. 193 | You may obtain a copy of the License at 194 | 195 | http://www.apache.org/licenses/LICENSE-2.0 196 | 197 | Unless required by applicable law or agreed to in writing, software 198 | distributed under the License is distributed on an "AS IS" BASIS, 199 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 200 | See the License for the specific language governing permissions and 201 | limitations under the License. 202 | -------------------------------------------------------------------------------- /MAINTAINER: -------------------------------------------------------------------------------- 1 | # 2 | # Licensed to the Apache Software Foundation (ASF) under one or more 3 | # contributor license agreements. See the NOTICE file distributed with 4 | # this work for additional information regarding copyright ownership. 5 | # The ASF licenses this file to You under the Apache License, Version 2.0 6 | # (the "License"); you may not use this file except in compliance with 7 | # the License. 
You may obtain a copy of the License at 8 | # 9 | # http://www.apache.org/licenses/LICENSE-2.0 10 | # 11 | # Unless required by applicable law or agreed to in writing, software 12 | # distributed under the License is distributed on an "AS IS" BASIS, 13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | # See the License for the specific language governing permissions and 15 | # limitations under the License. 16 | # 17 | 18 | Apache Metron 19 | -------------------------------------------------------------------------------- /Makefile: -------------------------------------------------------------------------------- 1 | # 2 | # Licensed to the Apache Software Foundation (ASF) under one or more 3 | # contributor license agreements. See the NOTICE file distributed with 4 | # this work for additional information regarding copyright ownership. 5 | # The ASF licenses this file to You under the Apache License, Version 2.0 6 | # (the "License"); you may not use this file except in compliance with 7 | # the License. You may obtain a copy of the License at 8 | # 9 | # http://www.apache.org/licenses/LICENSE-2.0 10 | # 11 | # Unless required by applicable law or agreed to in writing, software 12 | # distributed under the License is distributed on an "AS IS" BASIS, 13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | # See the License for the specific language governing permissions and 15 | # limitations under the License. 16 | # 17 | # Convenience Makefile providing a few common top-level targets. 18 | # 19 | 20 | cmake_build_dir=build 21 | arch=`uname -s | tr A-Z a-z`-`uname -m` 22 | 23 | all: build-it 24 | 25 | build-it: 26 | @test -e $(cmake_build_dir)/config.status || ./configure 27 | -@test -e $(cmake_build_dir)/CMakeCache.txt && \ 28 | test $(cmake_build_dir)/CMakeCache.txt -ot `cat $(cmake_build_dir)/CMakeCache.txt | grep ZEEK_DIST | cut -d '=' -f 2`/build/CMakeCache.txt && \ 29 | echo Updating stale CMake cache && \ 30 | touch $(cmake_build_dir)/CMakeCache.txt 31 | 32 | ( cd $(cmake_build_dir) && make ) 33 | 34 | install: 35 | ( cd $(cmake_build_dir) && make install ) 36 | 37 | clean: 38 | ( cd $(cmake_build_dir) && make clean ) 39 | 40 | distclean: 41 | rm -rf $(cmake_build_dir) 42 | 43 | test: 44 | make -C tests 45 | -------------------------------------------------------------------------------- /NOTICE: -------------------------------------------------------------------------------- 1 | Apache Metron 2 | Copyright 2015-2020 The Apache Software Foundation 3 | 4 | This product includes software developed at 5 | The Apache Software Foundation (http://www.apache.org/). 6 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Logging Zeek Output to Kafka 2 | 3 | A Zeek log writer that sends logging output to Kafka. This provides a convenient means for tools in the Hadoop ecosystem, such as Storm, Spark, and others, to process the data generated by Zeek. 4 | 5 | This software is a part of the [Apache Metron](https://metron.apache.org/) project which integrates a variety of open source, big data technologies to offer a platform to detect and respond to cyber threats at-scale. 
6 | 7 | * [Installation](#installation) 8 | * [Activation](#activation) 9 | * [Settings](#settings) 10 | * [Kerberos](#kerberos) 11 | * [Contributing](#contributing) 12 | 13 | ## Installation 14 | 15 | ### `zkg` Installation 16 | 17 | `zkg` is the preferred mechanism for installing this plugin, as it will dynamically retrieve, build, test, and load the plugin. Note, that you will still need to [activate](#activation) and configure the plugin after your installation. 18 | 19 | 1. Install [librdkafka](https://github.com/edenhill/librdkafka), a native client library for Kafka. This plugin has been tested against librdkafka v1.4.2. 20 | 21 | In order to use this plugin within a kerberized Kafka environment, you will also need `libsasl2` installed and will need to pass `--enable-sasl` to the `configure` script. 22 | 23 | ``` 24 | $ curl -L https://github.com/edenhill/librdkafka/archive/v1.4.2.tar.gz | tar xvz 25 | $ cd librdkafka-1.4.2/ 26 | $ ./configure --enable-sasl 27 | $ make 28 | $ sudo make install 29 | ``` 30 | 31 | 1. Configure `zkg` by following the quickstart guide [here](https://docs.zeek.org/projects/package-manager/en/stable/quickstart.html). 32 | 33 | 1. Install the plugin using `zkg install`. 34 | 35 | ``` 36 | $ zkg install apache/metron-bro-plugin-kafka --version master 37 | The following packages will be INSTALLED: 38 | zeek/apache/metron-bro-plugin-kafka (master) 39 | 40 | Verify the following REQUIRED external dependencies: 41 | (Ensure their installation on all relevant systems before proceeding): 42 | from zeek/apache/metron-bro-plugin-kafka (master): 43 | librdkafka ~1.4.2 44 | 45 | Proceed? [Y/n] 46 | zeek/apache/metron-bro-plugin-kafka asks for LIBRDKAFKA_ROOT (Path to librdkafka installation tree) ? [/usr/local/lib] 47 | Saved answers to config file: /home/jonzeolla/.zkg/config 48 | Running unit tests for "zeek/apache/metron-bro-plugin-kafka" 49 | all 10 tests successful 50 | 51 | 52 | Installing "zeek/apache/metron-bro-plugin-kafka"........ 53 | Installed "zeek/apache/metron-bro-plugin-kafka" (master) 54 | Loaded "zeek/apache/metron-bro-plugin-kafka" 55 | ``` 56 | 57 | 1. Run the following command to ensure that the plugin was installed successfully. 58 | 59 | ``` 60 | $ zeek -N Apache::Kafka 61 | Apache::Kafka - Writes logs to Kafka (dynamic, version 0.3.0) 62 | ``` 63 | 64 | ### Manual Installation 65 | 66 | Manually installing the plugin should *only* occur in situations where installing and configuring `zkg` is not reasonable. If you are running zeek in an environment where you do not have Internet connectivity, investigate [bundles](https://docs.zeek.org/projects/package-manager/en/stable/zkg.html#bundle) or creating an internal [package source](https://docs.zeek.org/projects/package-manager/en/stable/source.html). 67 | 68 | These instructions could also be helpful if you were interested in distributing this as a package (such as a deb or rpm). 69 | 70 | 1. Install [librdkafka](https://github.com/edenhill/librdkafka), a native client library for Kafka. This plugin has been tested against librdkafka v1.4.2. 71 | 72 | In order to use this plugin within a kerberized Kafka environment, you will also need `libsasl2` installed and will need to pass `--enable-sasl` to the `configure` script. 73 | 74 | ``` 75 | $ curl -L https://github.com/edenhill/librdkafka/archive/v1.4.2.tar.gz | tar xvz 76 | $ cd librdkafka-1.4.2/ 77 | $ ./configure --enable-sasl 78 | $ make 79 | $ sudo make install 80 | ``` 81 | 82 | 1. Build the plugin using the following commands. 
83 | 84 | ``` 85 | $ ./configure --with-librdkafka=$librdkafka_root 86 | $ make 87 | $ sudo make install 88 | ``` 89 | 90 | 1. Run the following command to ensure that the plugin was installed successfully. 91 | 92 | ``` 93 | $ zeek -N Apache::Kafka 94 | Apache::Kafka - Writes logs to Kafka (dynamic, version 0.3.0) 95 | ``` 96 | 97 | ## Activation 98 | 99 | The following examples highlight different ways that the plugin can be used. Simply add the Zeek script language to your `local.zeek` file (for example, `/usr/share/zeek/site/local.zeek`) as shown to demonstrate the example. 100 | 101 | In addition to activating the plugin, when running Zeek in a cluster it is highly recommended to leverage one or more Zeek [loggers](https://docs.zeek.org/en/v3.1.2/cluster/index.html#logger) as shown [here](https://docs.zeek.org/en/v3.1.2/configuration/index.html#basic-cluster-configuration) to separate logging activities from the manager thread. 102 | 103 | ### Example 1 - Send a list of logs to kafka 104 | 105 | The goal in this example is to send all HTTP and DNS records to a Kafka topic named `zeek`. 106 | * Any configuration value accepted by librdkafka can be added to the `kafka_conf` configuration table. 107 | * The `topic_name` will default to send all records to a single Kafka topic called 'zeek'. 108 | * Defining `logs_to_send` will send the HTTP and DNS records to the brokers specified in your `Kafka::kafka_conf`. 109 | ``` 110 | @load packages/metron-bro-plugin-kafka/Apache/Kafka 111 | redef Kafka::logs_to_send = set(HTTP::LOG, DNS::LOG); 112 | redef Kafka::kafka_conf = table( 113 | ["metadata.broker.list"] = "server1.example.com:9092,server2.example.com:9092" 114 | ); 115 | ``` 116 | 117 | ### Example 2 - Send all active logs 118 | 119 | This plugin has the ability send all active logs to the "zeek" kafka topic with the following configuration. 120 | 121 | ``` 122 | @load packages/metron-bro-plugin-kafka/Apache/Kafka 123 | redef Kafka::send_all_active_logs = T; 124 | redef Kafka::kafka_conf = table( 125 | ["metadata.broker.list"] = "localhost:9092" 126 | ); 127 | ``` 128 | 129 | ### Example 3 - Send all active logs with exclusions 130 | 131 | You can also specify a blacklist of zeek logs to ensure they aren't being sent to kafka regardless of the `Kafka::send_all_active_logs` and `Kafka::logs_to_send` configurations. In this example, we will send all of the enabled logs except for the Conn log. 132 | 133 | ``` 134 | @load packages/metron-bro-plugin-kafka/Apache/Kafka 135 | redef Kafka::send_all_active_logs = T; 136 | redef Kafka::logs_to_exclude = set(Conn::LOG); 137 | redef Kafka::topic_name = "zeek"; 138 | redef Kafka::kafka_conf = table( 139 | ["metadata.broker.list"] = "localhost:9092" 140 | ); 141 | ``` 142 | 143 | ### Example 4 - Send each zeek log to a unique topic 144 | 145 | It is also possible to send each log stream to a uniquely named topic. The goal in this example is to send all HTTP records to a Kafka topic named `http` and all DNS records to a separate Kafka topic named `dns`. 146 | * The `topic_name` value must be set to an empty string. 147 | * The `$path` value of Zeek's Log Writer mechanism is used to define the topic name. 148 | * Any configuration value accepted by librdkafka can be added to the `$config` configuration table. 149 | * Each log writer accepts a separate configuration table. 
150 | 151 | ``` 152 | @load packages/metron-bro-plugin-kafka/Apache/Kafka 153 | redef Kafka::topic_name = ""; 154 | redef Kafka::tag_json = T; 155 | 156 | event zeek_init() &priority=-10 157 | { 158 | # handles HTTP 159 | local http_filter: Log::Filter = [ 160 | $name = "kafka-http", 161 | $writer = Log::WRITER_KAFKAWRITER, 162 | $config = table( 163 | ["metadata.broker.list"] = "localhost:9092" 164 | ), 165 | $path = "http" 166 | ]; 167 | Log::add_filter(HTTP::LOG, http_filter); 168 | 169 | # handles DNS 170 | local dns_filter: Log::Filter = [ 171 | $name = "kafka-dns", 172 | $writer = Log::WRITER_KAFKAWRITER, 173 | $config = table( 174 | ["metadata.broker.list"] = "localhost:9092" 175 | ), 176 | $path = "dns" 177 | ]; 178 | Log::add_filter(DNS::LOG, dns_filter); 179 | } 180 | ``` 181 | 182 | ### Example 5 - Zeek log filtering 183 | 184 | You may want to configure zeek to filter log messages with certain characteristics from being sent to your kafka topics. For instance, Apache Metron currently doesn't support IPv6 source or destination IPs in the default enrichments, so it may be helpful to filter those log messages from being sent to kafka (although there are [multiple ways](#notes) to approach this). In this example we will do that that, and are assuming a somewhat standard zeek kafka plugin configuration, such that: 185 | * All zeek logs are sent to the default `zeek` topic. 186 | * Each JSON message is tagged with the appropriate log type (such as `http`, `dns`, or `conn`), by setting `Kafka::tag_json` to true. 187 | * If the log message contains a 128 byte long source or destination IP address, the log is not sent to kafka. 188 | 189 | ``` 190 | @load packages/metron-bro-plugin-kafka/Apache/Kafka 191 | redef Kafka::tag_json = T; 192 | 193 | event zeek_init() &priority=-10 194 | { 195 | # handles HTTP 196 | Log::add_filter(HTTP::LOG, [ 197 | $name = "kafka-http", 198 | $writer = Log::WRITER_KAFKAWRITER, 199 | $pred(rec: HTTP::Info) = { return ! (( |rec$id$orig_h| == 128 || |rec$id$resp_h| == 128 )); }, 200 | $config = table( 201 | ["metadata.broker.list"] = "localhost:9092" 202 | ) 203 | ]); 204 | 205 | # handles DNS 206 | Log::add_filter(DNS::LOG, [ 207 | $name = "kafka-dns", 208 | $writer = Log::WRITER_KAFKAWRITER, 209 | $pred(rec: DNS::Info) = { return ! (( |rec$id$orig_h| == 128 || |rec$id$resp_h| == 128 )); }, 210 | $config = table( 211 | ["metadata.broker.list"] = "localhost:9092" 212 | ) 213 | ]); 214 | 215 | # handles Conn 216 | Log::add_filter(Conn::LOG, [ 217 | $name = "kafka-conn", 218 | $writer = Log::WRITER_KAFKAWRITER, 219 | $pred(rec: Conn::Info) = { return ! (( |rec$id$orig_h| == 128 || |rec$id$resp_h| == 128 )); }, 220 | $config = table( 221 | ["metadata.broker.list"] = "localhost:9092" 222 | ) 223 | ]); 224 | } 225 | ``` 226 | 227 | #### Notes 228 | * `logs_to_send` is mutually exclusive with `$pred`, thus for each log you want to set `$pred` on, you must individually setup a `Log::add_filter` and refrain from including that log in `logs_to_send`. 229 | * The [`is_v6_addr()`](https://docs.zeek.org/en/v3.1.2/scripts/base/bif/zeek.bif.zeek.html#id-is_v6_addr) function can also be used in your `$pred` to identify if an IP address is IPv6. 230 | * Alternatively, if you are using Apache Metron to pull from the specified kafka topic, you could filter the IPv6 logs [using Stellar](https://metron.apache.org/current-book/metron-stellar/stellar-common/index.html#IS_IP). In that case Stellar would filter the logs out and a `$pred` would not be necessary. 
The benefit to this approach is that kafka would receive an unfiltered set of logs. 231 | 232 | ### Example 6 - Sending a log to multiple topics 233 | 234 | You are able to send a single zeek log to multiple different kafka topics in the same kafka cluster by overriding the default topic (configured with `Kafka::topic_name`) by creating a custom zeek `Log::Filter`. In this example, the DHCP, RADIUS, and DNS logs are sent to the "zeek" topic; the RADIUS log is duplicated to the "shew_zeek_radius" topic; and the DHCP log is duplicated to the "shew_zeek_dhcp" topic. 235 | 236 | ``` 237 | @load packages/metron-bro-plugin-kafka/Apache/Kafka 238 | redef Kafka::logs_to_send = set(DHCP::LOG, RADIUS::LOG, DNS::LOG); 239 | redef Kafka::topic_name = "zeek"; 240 | redef Kafka::kafka_conf = table( 241 | ["metadata.broker.list"] = "server1.example.com:9092,server2.example.com:9092" 242 | ); 243 | redef Kafka::tag_json = T; 244 | 245 | event zeek_init() &priority=-10 246 | { 247 | # Send RADIUS to the shew_zeek_radius topic 248 | local shew_radius_filter: Log::Filter = [ 249 | $name = "kafka-radius-shew", 250 | $writer = Log::WRITER_KAFKAWRITER, 251 | $path = "shew_zeek_radius" 252 | $config = table(["topic_name"] = "shew_zeek_radius") 253 | ]; 254 | Log::add_filter(RADIUS::LOG, shew_radius_filter); 255 | 256 | # Send DHCP to the shew_zeek_dhcp topic 257 | local shew_dhcp_filter: Log::Filter = [ 258 | $name = "kafka-dhcp-shew", 259 | $writer = Log::WRITER_KAFKAWRITER, 260 | $path = "shew_zeek_dhcp" 261 | $config = table(["topic_name"] = "shew_zeek_dhcp") 262 | ]; 263 | Log::add_filter(DHCP::LOG, shew_dhcp_filter); 264 | } 265 | ``` 266 | 267 | _Note_: Because `Kafka::tag_json` is set to True in this example, the value of `$path` is used as the tag for each `Log::Filter`. If you were to add a log filter with the same `$path` as an existing filter, Zeek will append "-N", where N is an integer starting at 2, to the end of the log path so that each filter has its own unique log path. For instance, the second instance of `conn` would become `conn-2`. 268 | 269 | ### Example 7 - Add static values to each outgoing Kafka message 270 | It is possible to define name value pairs and have them added to each outgoing Kafka json message when tagged_json is set to true. Each will be added to the root json object. 271 | * the Kafka::additional_message_values table can be configured with each name and value 272 | * based on the following configuration, each outgoing message will have "FIRST_STATIC_NAME": "FIRST_STATIC_VALUE", "SECOND_STATIC_NAME": "SECOND_STATIC_VALUE" added. 273 | ``` 274 | @load packages 275 | redef Kafka::logs_to_send = set(HTTP::LOG, DNS::LOG, Conn::LOG, DPD::LOG, FTP::LOG, Files::LOG, Known::CERTS_LOG, SMTP::LOG, SSL::LOG, Weird::LOG, Notice::LOG, DHCP::LOG, SSH::LOG, Software::LOG, RADIUS::LOG, X509::LOG, RFB::LOG, Stats::LOG, CaptureLoss::LOG, SIP::LOG); 276 | redef Kafka::topic_name = "zeek"; 277 | redef Kafka::tag_json = T; 278 | redef Kafka::kafka_conf = table(["metadata.broker.list"] = "kafka-1:9092,kafka-2:9092"); 279 | redef Kafka::additional_message_values = table(["FIRST_STATIC_NAME"] = "FIRST_STATIC_VALUE", ["SECOND_STATIC_NAME"] = "SECOND_STATIC_VALUE"); 280 | redef Kafka::logs_to_exclude = set(Conn::LOG, DHCP::LOG); 281 | redef Known::cert_tracking = ALL_HOSTS; 282 | redef Software::asset_tracking = ALL_HOSTS; 283 | ``` 284 | 285 | ## Settings 286 | 287 | ### `logs_to_send` 288 | 289 | A set of logs to send to kafka. 
290 | 291 | ``` 292 | redef Kafka::logs_to_send = set(Conn::LOG, DHCP::LOG); 293 | ``` 294 | 295 | ### `send_all_active_logs` 296 | 297 | If true, all active logs will be sent to kafka other than those specified in 298 | `logs_to_exclude`. 299 | 300 | ``` 301 | redef Kafka::send_all_active_logs = T; 302 | ``` 303 | 304 | ### `logs_to_exclude` 305 | 306 | A set of logs to exclude from being sent to kafka. 307 | 308 | ``` 309 | redef Kafka::logs_to_exclude = set(Conn::LOG, DNS::LOG); 310 | ``` 311 | 312 | ### `topic_name` 313 | 314 | The name of the topic in Kafka where all Zeek logs will be sent to. 315 | 316 | ``` 317 | redef Kafka::topic_name = "zeek"; 318 | ``` 319 | 320 | ### `kafka_conf` 321 | 322 | The global configuration settings for Kafka. These values are passed through 323 | directly to librdkafka. Any valid librdkafka settings can be defined in this 324 | table. The full set of valid librdkafka settings are available 325 | [here](https://github.com/edenhill/librdkafka/blob/v1.4.2/CONFIGURATION.md). 326 | 327 | ``` 328 | redef Kafka::kafka_conf = table( 329 | ["metadata.broker.list"] = "localhost:9092", 330 | ["client.id"] = "zeek" 331 | ); 332 | ``` 333 | 334 | ### `additonal_message_values` 335 | 336 | A table of of name value pairs. Each item in this table will be added to each outgoing message 337 | at the root level if tag_json is set to T. 338 | 339 | ``` 340 | redef Kafka::additional_message_values = table( 341 | ["FIRST_STATIC_NAME"] = "FIRST_STATIC_VALUE", 342 | ["SECOND_STATIC_NAME"] = "SECOND_STATIC_VALUE" 343 | ); 344 | ``` 345 | 346 | ### `tag_json` 347 | 348 | If true, a log stream identifier is appended to each JSON-formatted message. For 349 | example, a Conn::LOG message will look like `{ 'conn' : { ... }}`. 350 | 351 | ``` 352 | redef Kafka::tag_json = T; 353 | ``` 354 | 355 | ### `json_timestamps` 356 | 357 | Uses Ascii log writer for timestamp format. Default is `JSON::TS_EPOCH`. Other 358 | options are `JSON::TS_MILLIS` and `JSON::TS_ISO8601`. 359 | 360 | ``` 361 | redef Kafka::json_timestamps = JSON::TS_ISO8601; 362 | ``` 363 | 364 | ### `max_wait_on_shutdown` 365 | 366 | The maximum number of milliseconds that the plugin will wait for any backlog of 367 | queued messages to be sent to Kafka before forced shutdown. 368 | 369 | ``` 370 | redef Kafka::max_wait_on_shutdown = 3000; 371 | ``` 372 | 373 | ### `debug` 374 | 375 | A comma separated list of debug contexts in librdkafka which you want to 376 | enable. The available contexts are: 377 | * all 378 | * generic 379 | * broker 380 | * topic 381 | * metadata 382 | * feature 383 | * queue 384 | * msg 385 | * protocol 386 | * cgrp 387 | * security 388 | * fetch 389 | * feature 390 | * interceptor 391 | * plugin 392 | * consumer 393 | * admin 394 | 395 | ## Kerberos 396 | 397 | This plugin supports producing messages from a kerberized kafka. There 398 | are a couple of prerequisites and a couple of settings to set. 399 | 400 | ### SASL 401 | If you are using SASL as a security protocol for kafka, then you must have 402 | libsasl or libsasl2 installed. You can tell if sasl is enabled by 403 | running the following from the directory in which you have build 404 | librdkafka: 405 | ``` 406 | examples/rdkafka_example -X builtin.features 407 | builtin.features = gzip,snappy,ssl,sasl,regex 408 | ``` 409 | 410 | ### Producer Config 411 | 412 | As stated above, you can configure the producer kafka configs in 413 | `${ZEEK_HOME}/share/zeek/site/local.zeek`. 
There are a few configs 414 | necessary to set, which are described 415 | [here](https://github.com/edenhill/librdkafka/wiki/Using-SASL-with-librdkafka). 416 | For an environment where the following is true: 417 | * The broker is `node1:6667` 418 | * This kafka is using `SASL_PLAINTEXT` as the security protocol 419 | * The keytab used is the `metron` keytab 420 | * The service principal for `metron` is `metron@EXAMPLE.COM` 421 | 422 | The kafka topic `zeek` has been given permission for the `metron` user to 423 | write: 424 | ``` 425 | # login using the metron user 426 | kinit -kt /etc/security/keytabs/metron.headless.keytab metron@EXAMPLE.COM 427 | ${KAFKA_HOME}/kafka-broker/bin/kafka-acls.sh --authorizer kafka.security.auth.SimpleAclAuthorizer --authorizer-properties zookeeper.connect=node1:2181 --add --allow-principal User:metron --topic zeek 428 | ``` 429 | 430 | The following is how the `${ZEEK_HOME}/share/zeek/site/local.zeek` looks: 431 | ``` 432 | @load packages/metron-bro-plugin-kafka/Apache/Kafka 433 | redef Kafka::logs_to_send = set(HTTP::LOG, DNS::LOG); 434 | redef Kafka::topic_name = "zeek"; 435 | redef Kafka::tag_json = T; 436 | redef Kafka::kafka_conf = table( ["metadata.broker.list"] = "node1:6667" 437 | , ["security.protocol"] = "SASL_PLAINTEXT" 438 | , ["sasl.kerberos.keytab"] = "/etc/security/keytabs/metron.headless.keytab" 439 | , ["sasl.kerberos.principal"] = "metron@EXAMPLE.COM" 440 | ); 441 | ``` 442 | 443 | ## Contributing 444 | 445 | If you are interested in contributing to this plugin, please see the Apache Metron [CONTRIBUTING.md](https://github.com/apache/metron/blob/master/CONTRIBUTING.md). 446 | 447 | -------------------------------------------------------------------------------- /VERSION: -------------------------------------------------------------------------------- 1 | # 2 | # Licensed to the Apache Software Foundation (ASF) under one or more 3 | # contributor license agreements. See the NOTICE file distributed with 4 | # this work for additional information regarding copyright ownership. 5 | # The ASF licenses this file to You under the Apache License, Version 2.0 6 | # (the "License"); you may not use this file except in compliance with 7 | # the License. You may obtain a copy of the License at 8 | # 9 | # http://www.apache.org/licenses/LICENSE-2.0 10 | # 11 | # Unless required by applicable law or agreed to in writing, software 12 | # distributed under the License is distributed on an "AS IS" BASIS, 13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | # See the License for the specific language governing permissions and 15 | # limitations under the License. 16 | # 17 | 18 | 0.3.0 19 | -------------------------------------------------------------------------------- /cmake/FindLibRDKafka.cmake: -------------------------------------------------------------------------------- 1 | # 2 | # Licensed to the Apache Software Foundation (ASF) under one or more 3 | # contributor license agreements. See the NOTICE file distributed with 4 | # this work for additional information regarding copyright ownership. 5 | # The ASF licenses this file to You under the Apache License, Version 2.0 6 | # (the "License"); you may not use this file except in compliance with 7 | # the License. 
You may obtain a copy of the License at 8 | # 9 | # http://www.apache.org/licenses/LICENSE-2.0 10 | # 11 | # Unless required by applicable law or agreed to in writing, software 12 | # distributed under the License is distributed on an "AS IS" BASIS, 13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | # See the License for the specific language governing permissions and 15 | # limitations under the License. 16 | # 17 | 18 | find_path(LibRDKafka_ROOT_DIR 19 | NAMES include/librdkafka/rdkafkacpp.h 20 | ) 21 | 22 | find_library(LibRDKafka_LIBRARIES 23 | NAMES rdkafka++ 24 | HINTS ${LibRDKafka_ROOT_DIR}/lib 25 | PATH_SUFFIXES ${CMAKE_LIBRARY_ARCHITECTURE} 26 | ) 27 | 28 | find_library(LibRDKafka_C_LIBRARIES 29 | NAMES rdkafka 30 | HINTS ${LibRDKafka_ROOT_DIR}/lib 31 | PATH_SUFFIXES ${CMAKE_LIBRARY_ARCHITECTURE} 32 | ) 33 | 34 | find_path(LibRDKafka_INCLUDE_DIR 35 | NAMES librdkafka/rdkafkacpp.h 36 | HINTS ${LibRDKafka_ROOT_DIR}/include 37 | ) 38 | 39 | include(FindPackageHandleStandardArgs) 40 | find_package_handle_standard_args(LibRDKafka DEFAULT_MSG 41 | LibRDKafka_LIBRARIES 42 | LibRDKafka_C_LIBRARIES 43 | LibRDKafka_INCLUDE_DIR 44 | ) 45 | 46 | mark_as_advanced( 47 | LibRDKafka_ROOT_DIR 48 | LibRDKafka_LIBRARIES 49 | LibRDKafka_C_LIBRARIES 50 | LibRDKafka_INCLUDE_DIR 51 | ) 52 | -------------------------------------------------------------------------------- /cmake/FindOpenSSL.cmake: -------------------------------------------------------------------------------- 1 | # 2 | # Licensed to the Apache Software Foundation (ASF) under one or more 3 | # contributor license agreements. See the NOTICE file distributed with 4 | # this work for additional information regarding copyright ownership. 5 | # The ASF licenses this file to You under the Apache License, Version 2.0 6 | # (the "License"); you may not use this file except in compliance with 7 | # the License. You may obtain a copy of the License at 8 | # 9 | # http://www.apache.org/licenses/LICENSE-2.0 10 | # 11 | # Unless required by applicable law or agreed to in writing, software 12 | # distributed under the License is distributed on an "AS IS" BASIS, 13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | # See the License for the specific language governing permissions and 15 | # limitations under the License. 16 | # 17 | # - Try to find openssl include dirs and libraries 18 | # 19 | # Usage of this module as follows: 20 | # 21 | # find_package(OpenSSL) 22 | # 23 | # Variables used by this module, they can change the default behaviour and need 24 | # to be set before calling find_package: 25 | # 26 | # OpenSSL_ROOT_DIR Set this variable to the root installation of 27 | # openssl if the module has problems finding the 28 | # proper installation path. 29 | # 30 | # Variables defined by this module: 31 | # 32 | # OPENSSL_FOUND System has openssl, include and library dirs found 33 | # OpenSSL_INCLUDE_DIR The openssl include directories. 34 | # OpenSSL_LIBRARIES The openssl libraries. 35 | # OpenSSL_CYRPTO_LIBRARY The openssl crypto library. 36 | # OpenSSL_SSL_LIBRARY The openssl ssl library. 
37 | 38 | find_path(OpenSSL_ROOT_DIR 39 | NAMES include/openssl/ssl.h 40 | ) 41 | 42 | find_path(OpenSSL_INCLUDE_DIR 43 | NAMES openssl/ssl.h 44 | HINTS ${OpenSSL_ROOT_DIR}/include 45 | ) 46 | 47 | find_library(OpenSSL_SSL_LIBRARY 48 | NAMES ssl ssleay32 ssleay32MD 49 | HINTS ${OpenSSL_ROOT_DIR}/lib 50 | PATH_SUFFIXES ${CMAKE_LIBRARY_ARCHITECTURE} 51 | ) 52 | 53 | find_library(OpenSSL_CRYPTO_LIBRARY 54 | NAMES crypto 55 | HINTS ${OpenSSL_ROOT_DIR}/lib 56 | PATH_SUFFIXES ${CMAKE_LIBRARY_ARCHITECTURE} 57 | ) 58 | 59 | set(OpenSSL_LIBRARIES ${OpenSSL_SSL_LIBRARY} ${OpenSSL_CRYPTO_LIBRARY} 60 | CACHE STRING "OpenSSL SSL and crypto libraries" FORCE) 61 | 62 | include(FindPackageHandleStandardArgs) 63 | find_package_handle_standard_args(OpenSSL DEFAULT_MSG 64 | OpenSSL_LIBRARIES 65 | OpenSSL_INCLUDE_DIR 66 | ) 67 | 68 | mark_as_advanced( 69 | OpenSSL_ROOT_DIR 70 | OpenSSL_INCLUDE_DIR 71 | OpenSSL_LIBRARIES 72 | OpenSSL_CRYPTO_LIBRARY 73 | OpenSSL_SSL_LIBRARY 74 | ) 75 | -------------------------------------------------------------------------------- /configure: -------------------------------------------------------------------------------- 1 | #!/bin/sh 2 | # 3 | # 4 | # Licensed to the Apache Software Foundation (ASF) under one or more 5 | # contributor license agreements. See the NOTICE file distributed with 6 | # this work for additional information regarding copyright ownership. 7 | # The ASF licenses this file to You under the Apache License, Version 2.0 8 | # (the "License"); you may not use this file except in compliance with 9 | # the License. You may obtain a copy of the License at 10 | # 11 | # http://www.apache.org/licenses/LICENSE-2.0 12 | # 13 | # Unless required by applicable law or agreed to in writing, software 14 | # distributed under the License is distributed on an "AS IS" BASIS, 15 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 16 | # See the License for the specific language governing permissions and 17 | # limitations under the License. 18 | # 19 | # 20 | # The upstream version of this is at 21 | # https://github.com/zeek/zeek-aux/blob/master/plugin-support/skeleton/configure 22 | # 23 | # Wrapper for viewing/setting options that the plugin's CMake 24 | # scripts will recognize. 25 | # 26 | # Don't edit this. Edit configure.plugin to add plugin-specific options. 27 | # 28 | 29 | set -e 30 | command="$0 $*" 31 | 32 | if [ -e `dirname $0`/configure.plugin ]; then 33 | # Include custom additions. 34 | . 
`dirname $0`/configure.plugin 35 | fi 36 | 37 | usage() { 38 | 39 | cat 1>&2 </dev/null 2>&1; then 54 | plugin_usage 1>&2 55 | fi 56 | 57 | echo 58 | 59 | exit 1 60 | } 61 | 62 | # Function to append a CMake cache entry definition to the 63 | # CMakeCacheEntries variable 64 | # $1 is the cache entry variable name 65 | # $2 is the cache entry variable type 66 | # $3 is the cache entry variable value 67 | append_cache_entry () { 68 | CMakeCacheEntries="$CMakeCacheEntries -D $1:$2=$3" 69 | } 70 | 71 | # set defaults 72 | builddir=build 73 | zeekdist="" 74 | installroot="default" 75 | CMakeCacheEntries="" 76 | 77 | while [ $# -ne 0 ]; do 78 | case "$1" in 79 | -*=*) optarg=`echo "$1" | sed 's/[-_a-zA-Z0-9]*=//'` ;; 80 | *) optarg= ;; 81 | esac 82 | 83 | case "$1" in 84 | --help|-h) 85 | usage 86 | ;; 87 | 88 | --cmake=*) 89 | CMakeCommand=$optarg 90 | ;; 91 | 92 | --zeek-dist=*) 93 | zeekdist=`cd $optarg && pwd` 94 | ;; 95 | 96 | --install-root=*) 97 | installroot=$optarg 98 | ;; 99 | 100 | --with-binpac=*) 101 | append_cache_entry BinPAC_ROOT_DIR PATH $optarg 102 | binpac_root=$optarg 103 | ;; 104 | 105 | --with-broker=*) 106 | append_cache_entry BROKER_ROOT_DIR PATH $optarg 107 | broker_root=$optarg 108 | ;; 109 | 110 | --with-caf=*) 111 | append_cache_entry CAF_ROOT_DIR PATH $optarg 112 | caf_root=$optarg 113 | ;; 114 | 115 | --with-bifcl=*) 116 | append_cache_entry BifCl_EXE PATH $optarg 117 | ;; 118 | 119 | --enable-debug) 120 | append_cache_entry BRO_PLUGIN_ENABLE_DEBUG BOOL true 121 | ;; 122 | 123 | *) 124 | if type plugin_option >/dev/null 2>&1; then 125 | plugin_option $1 && shift && continue; 126 | fi 127 | 128 | echo "Invalid option '$1'. Try $0 --help to see available options." 129 | exit 1 130 | ;; 131 | esac 132 | shift 133 | done 134 | 135 | if [ -z "$CMakeCommand" ]; then 136 | # prefer cmake3 over "regular" cmake (cmake == cmake2 on RHEL) 137 | if command -v cmake3 >/dev/null 2>&1 ; then 138 | CMakeCommand="cmake3" 139 | elif command -v cmake >/dev/null 2>&1 ; then 140 | CMakeCommand="cmake" 141 | else 142 | echo "This package requires CMake, please install it first." 143 | echo "Then you may use this script to configure the CMake build." 144 | echo "Note: pass --cmake=PATH to use cmake in non-standard locations." 145 | exit 1; 146 | fi 147 | fi 148 | 149 | if [ -z "$zeekdist" ]; then 150 | if type zeek-config >/dev/null 2>&1; then 151 | zeek_config="zeek-config" 152 | else 153 | echo "Either 'zeek-config' must be in PATH or '--zeek-dist=' used" 154 | exit 1 155 | fi 156 | 157 | append_cache_entry BRO_CONFIG_PREFIX PATH `${zeek_config} --prefix` 158 | append_cache_entry BRO_CONFIG_INCLUDE_DIR PATH `${zeek_config} --include_dir` 159 | append_cache_entry BRO_CONFIG_PLUGIN_DIR PATH `${zeek_config} --plugin_dir` 160 | append_cache_entry BRO_CONFIG_CMAKE_DIR PATH `${zeek_config} --cmake_dir` 161 | append_cache_entry CMAKE_MODULE_PATH PATH `${zeek_config} --cmake_dir` 162 | 163 | build_type=`${zeek_config} --build_type` 164 | 165 | if [ "$build_type" = "debug" ]; then 166 | append_cache_entry BRO_PLUGIN_ENABLE_DEBUG BOOL true 167 | fi 168 | 169 | if [ -z "$binpac_root" ]; then 170 | append_cache_entry BinPAC_ROOT_DIR PATH `${zeek_config} --binpac_root` 171 | fi 172 | 173 | if [ -z "$broker_root" ]; then 174 | append_cache_entry BROKER_ROOT_DIR PATH `${zeek_config} --broker_root` 175 | fi 176 | 177 | if [ -z "$caf_root" ]; then 178 | append_cache_entry CAF_ROOT_DIR PATH `${zeek_config} --caf_root` 179 | fi 180 | else 181 | if [ ! 
-e "$zeekdist/zeek-path-dev.in" ]; then 182 | echo "$zeekdist does not appear to be a valid Zeek source tree." 183 | exit 1 184 | fi 185 | 186 | # BRO_DIST is the canonical/historical name used by plugin CMake scripts 187 | # ZEEK_DIST doesn't serve a function at the moment, but set/provided anyway 188 | append_cache_entry BRO_DIST PATH $zeekdist 189 | append_cache_entry ZEEK_DIST PATH $zeekdist 190 | append_cache_entry CMAKE_MODULE_PATH PATH $zeekdist/cmake 191 | fi 192 | 193 | if [ "$installroot" != "default" ]; then 194 | mkdir -p $installroot 195 | append_cache_entry BRO_PLUGIN_INSTALL_ROOT PATH $installroot 196 | fi 197 | 198 | echo "Build Directory : $builddir" 199 | echo "Zeek Source Directory : $zeekdist" 200 | 201 | mkdir -p $builddir 202 | cd $builddir 203 | 204 | "$CMakeCommand" $CMakeCacheEntries .. 205 | 206 | echo "# This is the command used to configure this build" > config.status 207 | echo $command >> config.status 208 | chmod u+x config.status 209 | -------------------------------------------------------------------------------- /configure.plugin: -------------------------------------------------------------------------------- 1 | #!/bin/sh 2 | # 3 | # 4 | # Licensed to the Apache Software Foundation (ASF) under one or more 5 | # contributor license agreements. See the NOTICE file distributed with 6 | # this work for additional information regarding copyright ownership. 7 | # The ASF licenses this file to You under the Apache License, Version 2.0 8 | # (the "License"); you may not use this file except in compliance with 9 | # the License. You may obtain a copy of the License at 10 | # 11 | # http://www.apache.org/licenses/LICENSE-2.0 12 | # 13 | # Unless required by applicable law or agreed to in writing, software 14 | # distributed under the License is distributed on an "AS IS" BASIS, 15 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 16 | # See the License for the specific language governing permissions and 17 | # limitations under the License. 18 | # 19 | # 20 | # Hooks to add custom options to the configure script. 21 | # 22 | 23 | plugin_usage() 24 | { 25 | cat < The version of the metron-bro-plugin-kafka release. [Required]" 24 | echo " -c/--candidate= Defines the Release Candidate. [Required]" 25 | echo " -h/--help Usage information." 26 | echo " " 27 | echo "example: " 28 | echo " metron-bro-kafka-rc-check --version=0.3.0 --candidate=RC2" 29 | echo " " 30 | } 31 | 32 | APACHE_REPO="https://dist.apache.org/repos/dist/" 33 | METRON_DIST=${APACHE_REPO}"dev/metron/metron-bro-plugin-kafka/" 34 | METRON_KEYS=${APACHE_REPO}"release/metron/KEYS" 35 | 36 | # 37 | # runs the package kafka plugin's docker based tests 38 | # 39 | function run_package_docker { 40 | cd docker &> /dev/null || { echo "failed to change directory to docker" ; exit 1; } 41 | ./run_end_to_end.sh 42 | 43 | rc=$?; if [[ ${rc} != 0 ]]; then 44 | echo "ERROR> FAILED run_end_to_end" 45 | # do NOT exit here 46 | fi 47 | cd .. &> /dev/null || { echo "failed to change directory to plugin root"; exit 1; } 48 | } 49 | 50 | # 51 | # runs the finish package docker script to cleanup 52 | # 53 | function finish_package_docker { 54 | cd docker &> /dev/null || { echo "failed to change directory to docker"; exit 1; } 55 | ./finish_end_to_end.sh 56 | 57 | rc=$?; if [[ ${rc} != 0 ]]; then 58 | echo "ERROR> FAILED finish_end_to_end" 59 | exit ${rc} 60 | fi 61 | cd .. 
&> /dev/null || { echo "failed to change directory to plugin root"; 62 | exit 1; } 63 | } 64 | 65 | # print help, if the user just runs this without any args 66 | if [ "$#" -eq 0 ]; then 67 | help 68 | exit 1 69 | fi 70 | 71 | # handle command line options 72 | for i in "$@"; do 73 | case $i in 74 | # 75 | # VERSION: The release version of Metron to validate. 76 | # 77 | # 78 | -v=*|--version=*) 79 | VERSION="${i#*=}" 80 | shift # past argument=value 81 | ;; 82 | 83 | # 84 | # RC: Defines the RC# to use 85 | # 86 | # -c=RC2 87 | # --candidate=RC2 88 | # 89 | -c=*|--candidate=*) 90 | CANDIDATE="${i#*=}" 91 | shift # past argument=value 92 | ;; 93 | 94 | # 95 | # -h/--help 96 | # 97 | -h|--help) 98 | help 99 | exit 0 100 | shift # past argument with no value 101 | ;; 102 | 103 | # 104 | # Unknown option 105 | # 106 | *) 107 | UNKNOWN_OPTION="${i#*=}" 108 | echo "Error: unknown option: $UNKNOWN_OPTION" 109 | help 110 | ;; 111 | esac 112 | done 113 | 114 | # validation 115 | if [ -z "$VERSION" ]; then 116 | echo "Missing -v/--version is is required" 117 | exit 1 118 | fi 119 | if [[ "$VERSION" =~ ^[0-9]{1,2}\.[0-9]{1,2}\.[0-9]{1,2} ]]; then 120 | PLUGIN_VERSION="$VERSION" 121 | else 122 | echo "[ERROR] \"$VERSION\" may not be a valid version number" 123 | exit 1 124 | fi 125 | 126 | if [ -z "$CANDIDATE" ]; then 127 | echo "Missing -c/--candidate which is required" 128 | exit 1 129 | fi 130 | 131 | if [[ "$CANDIDATE" =~ ^RC[0-9]+ ]]; then 132 | RC=$(echo "$CANDIDATE" | tr '[:upper:]' '[:lower:]') 133 | UPPER_RC=$(echo "$CANDIDATE" | tr '[:lower:]' '[:upper:]') 134 | elif [[ "$CANDIDATE" =~ ^[0-9]+ ]]; then 135 | RC=rc"$CANDIDATE" 136 | UPPER_RC=RC"$CANDIDATE" 137 | else 138 | echo "[ERROR] invalid RC, valid is RC# or just #" 139 | exit 1 140 | fi 141 | 142 | echo "metron-bro-plugin-kafka Version $PLUGIN_VERSION" 143 | echo "Release Candidate $RC" 144 | 145 | PLUGIN_RC_DIST="$METRON_DIST$PLUGIN_VERSION-$UPPER_RC" 146 | echo "metron-bro-plugin-kafka RC Distribution Root is $PLUGIN_RC_DIST" 147 | 148 | # working directory 149 | WORK="$HOME/tmp/metron-bro-plugin-kafka_$PLUGIN_VERSION-$RC" 150 | 151 | # handle tilde expansion 152 | WORK="${WORK/#\~/$HOME}" 153 | 154 | # warn the user if the working directory exists 155 | if [ -d "$WORK" ]; then 156 | echo "[ERROR] Directory $WORK exists, please rename it and start over" 157 | exit 1 158 | fi 159 | 160 | if [ ! -d "$WORK" ]; then 161 | mkdir -p "$WORK" 162 | fi 163 | echo "Working directory $WORK" 164 | 165 | PLUGIN_ASSEMBLY="$PLUGIN_RC_DIST/apache-metron-bro-plugin-kafka_$PLUGIN_VERSION-$RC.tar.gz" 166 | PLUGIN_ASSEMBLY_SIG="$PLUGIN_ASSEMBLY.asc" 167 | 168 | 169 | echo "Downloading $METRON_KEYS" 170 | if ! wget -P "$WORK" "$METRON_KEYS" ; then 171 | echo "[ERROR] Failed to download $METRON_KEYS" 172 | exit 1 173 | fi 174 | 175 | echo "Downloading $PLUGIN_ASSEMBLY" 176 | if ! wget -P "$WORK" "$PLUGIN_ASSEMBLY" ; then 177 | echo "[ERROR] Failed to download $PLUGIN_ASSEMBLY" 178 | exit 1 179 | fi 180 | 181 | echo "Downloading $PLUGIN_ASSEMBLY_SIG" 182 | if ! wget -P "$WORK" "$PLUGIN_ASSEMBLY_SIG" ; then 183 | echo "[ERROR] Failed to download $PLUGIN_ASSEMBLY_SIG" 184 | exit 1 185 | fi 186 | 187 | cd "$WORK" || exit 1 188 | echo "importing metron keys" 189 | 190 | if ! gpg --import KEYS ; then 191 | echo "[ERROR] failed to import KEYS" 192 | exit 1 193 | fi 194 | 195 | echo "Verifying metron-bro-plugin-kafka Assembly" 196 | if ! 
gpg --verify ./"apache-metron-bro-plugin-kafka_$PLUGIN_VERSION-$RC.tar.gz.asc" "apache-metron-bro-plugin-kafka_$PLUGIN_VERSION-$RC.tar.gz" ; then 197 | echo "[ERROR] failed to verify metron-bro-plugin-kafka Assembly" 198 | exit 1 199 | fi 200 | 201 | echo "Unpacking Assemblies" 202 | if ! tar -xzf "apache-metron-bro-plugin-kafka_$PLUGIN_VERSION-$RC.tar.gz" ; then 203 | echo "[ERROR] failed to unpack metron-bro-plugin-kafka Assembly" 204 | exit 1 205 | fi 206 | 207 | echo "" 208 | echo "" 209 | read -p " run test suite [yN] " -n 1 -r 210 | echo 211 | if [[ $REPLY =~ ^[Yy]$ ]]; then 212 | echo " please verify that no metron-bro-plugin-kafka docker containers are running before continuing," 213 | read -p " no metron-bro-plugin-kafka docker containers are running, ready to proceed [yN] " -n 1 -r 214 | if [[ $REPLY =~ ^[Yy]$ ]]; then 215 | cd "apache-metron-bro-plugin-kafka_$PLUGIN_VERSION-$RC" || exit 1 216 | run_package_docker 217 | finish_package_docker 218 | else 219 | echo " when you are ready and the containers are stopped, please cd into the docker" 220 | echo " directory and execute the run_end_to_end.sh script" 221 | fi 222 | cd .. || exit 1 223 | fi 224 | 225 | -------------------------------------------------------------------------------- /docker/README.md: -------------------------------------------------------------------------------- 1 | 15 | 16 | ## Docker support for testing metron-bro-plugin-kafka 17 | 18 | These scripts and containers provide support for building and testing Zeek and the metron-bro-plugin-kafka using a number of Docker containers. 19 | The use of these scripts and containers allow an easier, automated workflow for testing new features, fixes, or regressions than before. 20 | One of the goals is for this to be extensible, such that new scripts can be introduced and run as well. This will allow, for example, one or more 21 | testing scripts to be added to a pull request, and subsequently to a test suite. 22 | 23 | 24 | #### Directories 25 | 26 | ```bash 27 | ├── containers 28 | │   └── zeek 29 | │   └── kafka 30 | │   └── zookeeper 31 | ├── data 32 | ├── in_docker_scripts 33 | ├── scripts 34 | └── test_output 35 | ``` 36 | - `containers`: The parent of all of the containers that this project defines. We use several containers, not all of them ours. 37 | - `zeek`: The directory for our zeek container, used for building zeek, the librdkafka, and our plugin, as well as running zeek. 38 | - `kafka`: The directory for our kafka container. 39 | - `zookeeper`: The directory for our zookeeper container. 40 | - `data`: The default path for pcap data to be used in tests. 41 | - `in_docker_scripts`: This directory is mapped to the zeek docker container as /root/built_in_scripts. These represent the library of scripts we provide to be run in the docker container. 42 | - `scripts`: These are the scripts that are run on the host for creating the docker bits, running containers, running or executing commands against containers ( such as executing one of the built_in_scripts ), and cleaning up resources. 43 | - `test_output`: Directory where the zeek logs and kafka logs per test/pcap are stored. 44 | 45 | 46 | #### Scripts that execute _in_ the docker container 47 | 48 | ```bash 49 | ├── build_plugin.sh 50 | ├── configure_plugin.sh 51 | ├── process_data_file.sh 52 | ``` 53 | 54 | - `build_plugin.sh`: Runs `zkg` to build and install the provided version of the plugin. 55 | - `configure_plugin.sh`: Configures the plugin for the kafka container, and routes all traffic types. 
56 | ###### Parameters 57 | ```bash 58 | --kafka-topic [OPTIONAL] The kafka topic to configure. Default: zeek 59 | ``` 60 | - `process_data_file.sh`: Runs `zeek -r` on the passed file. 61 | 62 | 63 | #### Scripts executed on the host to set up and interact with the docker containers 64 | 65 | ```bash 66 | ├── analyze_results.sh 67 | ├── docker_execute_build_plugin.sh 68 | ├── docker_execute_configure_plugin.sh 69 | ├── docker_execute_create_topic_in_kafka.sh 70 | ├── docker_execute_process_data_file.sh 71 | ├── docker_execute_shell.sh 72 | ├── docker_run_consume_kafka.sh 73 | ├── docker_run_get_offset_kafka.sh 74 | ├── download_sample_pcaps.sh 75 | ├── print_results.sh 76 | ├── split_kafka_output_by_log.sh 77 | ``` 78 | 79 | - `analyze_results.sh`: Analyzes the `results.csv` files for any issues 80 | ###### Parameters 81 | ```bash 82 | --test-directory [REQUIRED] The directory for the tests 83 | ``` 84 | - `docker_execute_build_plugin.sh`: Executes `build_plugin.sh` in the zeek container 85 | ###### Parameters 86 | ```bash 87 | --container-name [OPTIONAL] The Docker container name. Default: metron-bro-plugin-kafka_zeek_1 88 | ``` 89 | - `docker_execute_configure_plugin.sh`: Executes `configure_plugin.sh` in the zeek container 90 | ###### Parameters 91 | ```bash 92 | --container-name [OPTIONAL] The Docker container name. Default: metron-bro-plugin-kafka_zeek_1 93 | ``` 94 | - `docker_execute_create_topic_in_kafka.sh`: Creates the specified kafka topic in the kafka container 95 | ###### Parameters 96 | ```bash 97 | --container-name [OPTIONAL] The Docker container name. Default: metron-bro-plugin-kafka_kafka-1_1 98 | --kafka-topic [OPTIONAL] The kafka topic to create. Default: zeek 99 | --partitions [OPTIONAL] The number of kafka partitions to create. Default: 2 100 | ``` 101 | - `docker_execute_process_data_file.sh`: Executes `process_data_file.sh` in the zeek container 102 | ###### Parameters 103 | ```bash 104 | --container-name [OPTIONAL] The Docker container name. Default: metron-bro-plugin-kafka_zeek_1 105 | ``` 106 | - `docker_execute_shell.sh`: Runs `docker exec -i -t bash` to get a shell in a given container 107 | ###### Parameters 108 | ```bash 109 | --container-name [OPTIONAL] The Docker container name. Default: metron-bro-plugin-kafka_zeek_1 110 | ``` 111 | - `docker_run_consume_kafka.sh`: Runs an instance of the kafka container, with the console consumer `kafka-console-consumer.sh --topic $KAFKA_TOPIC --offset $OFFSET --partition $PARTITION --bootstrap-server kafka-1:9092` 112 | ###### Parameters 113 | ```bash 114 | --network-name [OPTIONAL] The Docker network name. Default: metron-bro-plugin-kafka_default 115 | --offset [OPTIONAL] The kafka offset to read from. Default: 0 116 | --partition [OPTIONAL] The kafka partition to read from. Default: 0 117 | --kafka-topic [OPTIONAL] The kafka topic to consume from. Default: zeek 118 | ``` 119 | - `docker_run_get_offset_kafka.sh`: Runs an instance of the kafka container and gets the current offset for the specified topic 120 | ###### Parameters 121 | ```bash 122 | --network-name [OPTIONAL] The Docker network name. Default: metron-bro-plugin-kafka_default 123 | --kafka-topic [OPTIONAL] The kafka topic to get the offset from. Default: zeek 124 | ``` 125 | - `download_sample_pcaps.sh`: Downloads the sample pcaps to a specified directory.
If they already exist, it is a no-op. 126 | 127 | > The sample pcaps are: 128 | > - https://github.com/zeek/try-zeek/blob/master/manager/static/pcaps/exercise_traffic.pcap 129 | > - http://downloads.digitalcorpora.org/corpora/network-packet-dumps/2008-nitroba/nitroba.pcap 130 | > - https://github.com/zeek/try-zeek/raw/master/manager/static/pcaps/ssh.pcap 131 | > - https://github.com/markofu/pcaps/blob/master/PracticalPacketAnalysis/ppa-capture-files/ftp.pcap?raw=true 132 | > - https://github.com/EmpowerSecurityAcademy/wireshark/blob/master/radius_localhost.pcapng?raw=true 133 | > - https://github.com/kholia/my-pcaps/blob/master/VNC/07-vnc 134 | 135 | ###### Parameters 136 | ```bash 137 | --data-path [REQUIRED] The pcap data path 138 | ``` 139 | - `print_results.sh`: Prints the `results.csv` for all the pcaps processed in the given directory to console 140 | ###### Parameters 141 | ```bash 142 | --test-directory [REQUIRED] The directory for the tests 143 | ``` 144 | - `split_kafka_output_by_log.sh`: For a pcap result directory, creates a LOG.kafka.log for each LOG.log entry found in the kafka-output.log 145 | ###### Parameters 146 | ```bash 147 | --log-directory [REQUIRED] The directory with the logs 148 | ``` 149 | 150 | #### The example end to end test script 151 | 152 | `run_end_to_end.sh` is provided as an example of a testing script. Specific or extended scripts, similar to this one, can be created to use the containers. 153 | This script does the following: 154 | 155 | 1. Runs docker compose 156 | 1. Creates the specified topic with the specified number of partitions 157 | 1. Downloads sample PCAP data 158 | 1. Runs the zeek container in the background 159 | 1. Builds the zeek plugin 160 | 1. Configures the zeek plugin 161 | 1. Runs zeek against all the pcap data, one at a time 162 | 1. Executes a kafka client to read the data from zeek for each pcap file 163 | 1. Stores the output kafka messages and the zeek logs into the test_output directory 164 | ```bash 165 | >tree Tue_Jan__8_21_54_10_EST_2019 166 | Tue_Jan__8_21_54_10_EST_2019 167 | ├── exercise-traffic_pcap 168 | │   ├── capture_loss.log 169 | │   ├── conn.log 170 | │   ├── dhcp.log 171 | │   ├── dns.log 172 | │   ├── files.log 173 | │   ├── http.log 174 | │   ├── kafka-output.log 175 | │   ├── known_certs.log 176 | │   ├── loaded_scripts.log 177 | │   ├── notice.log 178 | │   ├── packet_filter.log 179 | │   ├── reporter.log 180 | │   ├── smtp.log 181 | │   ├── software.log 182 | │   ├── ssl.log 183 | │   ├── stats.log 184 | │   ├── weird.log 185 | │   └── x509.log 186 | ├── ftp_pcap 187 | │   ├── capture_loss.log 188 | │   ├── conn.log 189 | │   ├── files.log 190 | │   ├── ftp.log 191 | │   ├── kafka-output.log 192 | │   ├── loaded_scripts.log 193 | │   ├── packet_filter.log 194 | │   ├── reporter.log 195 | │   ├── software.log 196 | │   └── stats.log 197 | ``` 198 | 1. Creates a results.csv for each pcap containing the line counts of the kafka and the zeek output for each log (see the sketch below) 199 | 1. Prints all the results.csv to the screen 200 | 201 | As we can see, the output is a folder named for the test run time, with a sub folder per pcap, containing all the zeek logs and the `kafka-output.log`. 202 | 203 | 204 | At this point the containers are up and running in the background.
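For reference, each `results.csv` mentioned above is a small comma separated table with one row per log type. The sketch below is hypothetical: the header text, the column order (zeek count before kafka count), the counts themselves, and the file's location inside the pcap folder are assumptions, but `analyze_results.sh` does read column 1 as the log name and flags any row where columns 2 and 3 differ.

```bash
# Hypothetical results.csv for a single pcap (header, column order, and path assumed)
cat Tue_Jan__8_21_54_10_EST_2019/ftp_pcap/results.csv
#   LOG,ZEEK,KAFKA
#   conn,44,44
#   ftp,14,14
#   files,12,11

# Essentially the comparison analyze_results.sh performs per log:
# print any log whose two counts are not equal (here it would flag "files")
awk -F, 'NR > 1 && $2 != $3 {print $1}' Tue_Jan__8_21_54_10_EST_2019/ftp_pcap/results.csv
```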
205 | 206 | Other scripts may then be used to do your testing, for example running: 207 | ```bash 208 | ./scripts/docker_execute_shell.sh 209 | ``` 210 | 211 | > NOTE: If the scripts are run repeatedly and there is no change in zeek or librdkafka, the line `./run_end_to_end.sh` can be replaced by `./run_end_to_end.sh --skip-docker-build`, which uses the `--skip-docker-build` flag to avoid rebuilding the containers, saving the significant time of rebuilding zeek and librdkafka. 212 | 213 | > NOTE: After you are done, you must call the `finish_end_to_end.sh` script to clean up. 214 | 215 | 216 | ##### `run_end_to_end.sh` 217 | ###### Parameters 218 | ```bash 219 | --skip-docker-build [OPTIONAL] Skip build of zeek docker machine. 220 | --no-pcap [OPTIONAL] Do not run pcaps. 221 | --data-path [OPTIONAL] The pcap data path. Default: ./data 222 | --kafka-topic [OPTIONAL] The kafka topic name to use. Default: zeek 223 | --partitions [OPTIONAL] The number of kafka partitions to create. Default: 2 224 | --plugin-version [OPTIONAL] The plugin version. Default: the current branch name 225 | ``` 226 | 227 | > NOTE: The provided `--plugin-version` is passed to the [`zkg install`](https://docs.zeek.org/projects/package-manager/en/stable/zeek-pkg.html#install-command) command within the container, which allows you to specify a version tag, branch name, or commit hash. However, that tag, branch, or commit *must* be available in the currently checked out plugin repository. 228 | 229 | -------------------------------------------------------------------------------- /docker/containers/kafka/Dockerfile: -------------------------------------------------------------------------------- 1 | # 2 | # Licensed to the Apache Software Foundation (ASF) under one or more 3 | # contributor license agreements. See the NOTICE file distributed with 4 | # this work for additional information regarding copyright ownership. 5 | # The ASF licenses this file to You under the Apache License, Version 2.0 6 | # (the "License"); you may not use this file except in compliance with 7 | # the License. You may obtain a copy of the License at 8 | # 9 | # http://www.apache.org/licenses/LICENSE-2.0 10 | # 11 | # Unless required by applicable law or agreed to in writing, software 12 | # distributed under the License is distributed on an "AS IS" BASIS, 13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | # See the License for the specific language governing permissions and 15 | # limitations under the License. 16 | # 17 | ARG FROM_IMAGE="wurstmeister/kafka" 18 | ARG FROM_IMAGE_TAG="2.12-2.5.0" 19 | 20 | FROM "${FROM_IMAGE}":"${FROM_IMAGE_TAG}" 21 | 22 | HEALTHCHECK --interval=5s --timeout=10s --start-period=2s --retries=3 \ 23 | CMD JMX_PORT= /opt/kafka/bin/kafka-configs.sh --describe --zookeeper zookeeper:2181 --entity-type brokers || exit 1 24 | 25 | -------------------------------------------------------------------------------- /docker/containers/zeek/.screenrc: -------------------------------------------------------------------------------- 1 | # terminfo and termcap for nice 256 color terminal 2 | # allow bold colors - necessary for some reason 3 | attrcolor b ".I" 4 | 5 | # tell screen how to set colors.
AB = background, AF=foreground 6 | termcapinfo xterm 'Co#256:AB=\E[48;5;%dm:AF=\E[38;5;%dm' 7 | 8 | # erase background with current bg color 9 | defbce "on" 10 | 11 | # the status at the bottom of the window 12 | hardstatus alwayslastline 13 | hardstatus string '%{gk}[ %{G}%H %{g}][%S][%= %{wk}%?%-Lw%?%{=b kR}(%{W}%n*%f %t%?(%u)%?%{=b kR})%{= kw}%?%+Lw%?%?%= %{g}][%{Y}%l%{g}]%{=b C}[ %m/%d %c ]%{W}' 14 | 15 | #turn off the startup banner 16 | startup_message off 17 | 18 | #i want to see all screen messages for a longer time 19 | msgwait 86400 20 | 21 | # Set scrollback to 20k 22 | defscrollback 20000 23 | -------------------------------------------------------------------------------- /docker/containers/zeek/Dockerfile: -------------------------------------------------------------------------------- 1 | # 2 | # Licensed to the Apache Software Foundation (ASF) under one or more 3 | # contributor license agreements. See the NOTICE file distributed with 4 | # this work for additional information regarding copyright ownership. 5 | # The ASF licenses this file to You under the Apache License, Version 2.0 6 | # (the "License"); you may not use this file except in compliance with 7 | # the License. You may obtain a copy of the License at 8 | # 9 | # http://www.apache.org/licenses/LICENSE-2.0 10 | # 11 | # Unless required by applicable law or agreed to in writing, software 12 | # distributed under the License is distributed on an "AS IS" BASIS, 13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | # See the License for the specific language governing permissions and 15 | # limitations under the License. 16 | # 17 | ARG FROM_IMAGE="centos" 18 | ARG FROM_IMAGE_TAG="8" 19 | 20 | FROM "${FROM_IMAGE}":"${FROM_IMAGE_TAG}" 21 | 22 | ARG ZEEK_VERSION 23 | ARG LIBRDKAFKA_VERSION 24 | 25 | # install epel for screen 26 | RUN dnf install -y epel-release 27 | 28 | # copy in the .screenrc 29 | COPY .screenrc /root 30 | 31 | # install powertools for libpcap-devel 32 | RUN dnf install -y 'dnf-command(config-manager)' && \ 33 | dnf config-manager --set-enabled PowerTools 34 | 35 | # install prereqs then clean dnf cache 36 | RUN dnf -y update && \ 37 | dnf -y install cmake make gcc gcc-c++ \ 38 | flex bison libpcap libpcap-devel \ 39 | openssl-devel python3 platform-python-devel \ 40 | swig zlib-devel perl \ 41 | cyrus-sasl cyrus-sasl-devel cyrus-sasl-gssapi \ 42 | git jq screen tree vim && \ 43 | dnf -y clean all 44 | 45 | # install zeek 46 | WORKDIR /root 47 | RUN git clone https://github.com/zeek/zeek 48 | WORKDIR zeek/ 49 | RUN git checkout "v${ZEEK_VERSION}" && \ 50 | git submodule update --init --recursive && \ 51 | ./configure && \ 52 | make && \ 53 | make install 54 | ENV PATH="${PATH}:/usr/local/zeek/bin" 55 | ENV PATH="${PATH}:/usr/bin" 56 | 57 | # install pip3 and zkg 58 | WORKDIR /root 59 | COPY requirements.txt requirements.txt 60 | RUN dnf -y install python3-pip diffutils && \ 61 | dnf clean all && \ 62 | python3 -m pip install --upgrade pip && \ 63 | python3 -m pip install -r requirements.txt && \ 64 | zkg autoconfig 65 | 66 | # install librdkafka 67 | WORKDIR /root 68 | RUN curl -L "https://github.com/edenhill/librdkafka/archive/v${LIBRDKAFKA_VERSION}.tar.gz" | tar xvz 69 | WORKDIR "librdkafka-${LIBRDKAFKA_VERSION}/" 70 | RUN ./configure --enable-sasl && \ 71 | make && \ 72 | make install 73 | 74 | -------------------------------------------------------------------------------- /docker/containers/zeek/Makefile: 
-------------------------------------------------------------------------------- 1 | requirements: requirements-to-freeze.txt 2 | @docker run --rm -v $$(pwd):/usr/src/app/ python:3 /bin/bash -c "python3 -m pip install --upgrade pip && python3 -m pip install -r /usr/src/app/requirements-to-freeze.txt && python3 -m pip freeze > /usr/src/app/requirements.txt" 3 | -------------------------------------------------------------------------------- /docker/containers/zeek/requirements-to-freeze.txt: -------------------------------------------------------------------------------- 1 | zkg 2 | -------------------------------------------------------------------------------- /docker/containers/zeek/requirements.txt: -------------------------------------------------------------------------------- 1 | btest==0.61 2 | configparser==5.0.0 3 | gitdb==4.0.5 4 | GitPython==3.1.2 5 | semantic-version==2.8.5 6 | smmap==3.0.4 7 | zkg==2.1.2 8 | -------------------------------------------------------------------------------- /docker/containers/zookeeper/Dockerfile: -------------------------------------------------------------------------------- 1 | # 2 | # Licensed to the Apache Software Foundation (ASF) under one or more 3 | # contributor license agreements. See the NOTICE file distributed with 4 | # this work for additional information regarding copyright ownership. 5 | # The ASF licenses this file to You under the Apache License, Version 2.0 6 | # (the "License"); you may not use this file except in compliance with 7 | # the License. You may obtain a copy of the License at 8 | # 9 | # http://www.apache.org/licenses/LICENSE-2.0 10 | # 11 | # Unless required by applicable law or agreed to in writing, software 12 | # distributed under the License is distributed on an "AS IS" BASIS, 13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | # See the License for the specific language governing permissions and 15 | # limitations under the License. 
16 | # 17 | ARG FROM_IMAGE="zookeeper" 18 | ARG FROM_IMAGE_TAG="3.4" 19 | 20 | FROM "${FROM_IMAGE}":"${FROM_IMAGE_TAG}" 21 | 22 | HEALTHCHECK --interval=2s --timeout=1s --start-period=.5s --retries=4 \ 23 | CMD echo ruok | nc localhost 2181 && echo stat | nc localhost 2181 | grep Mode || exit 1 24 | 25 | -------------------------------------------------------------------------------- /docker/data/.gitignore: -------------------------------------------------------------------------------- 1 | # Ignore everything in this directory 2 | * 3 | # Except this file 4 | !.gitignore -------------------------------------------------------------------------------- /docker/docker-compose.yml: -------------------------------------------------------------------------------- 1 | version: '2.4' 2 | services: 3 | zookeeper: 4 | build: containers/zookeeper 5 | image: metron-bro-plugin-kafka_zookeeper:latest 6 | kafka-1: 7 | build: containers/kafka 8 | image: metron-bro-plugin-kafka_kafka:latest 9 | depends_on: 10 | zookeeper: 11 | condition: service_healthy 12 | environment: 13 | - KAFKA_BROKER_ID=1 14 | - KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181 15 | - KAFKA_LISTENERS=PLAINTEXT://kafka-1:9092 16 | - KAFKA_ADVERTISED_LISTENERS=PLAINTEXT://kafka-1:9092 17 | kafka-2: 18 | build: containers/kafka 19 | image: metron-bro-plugin-kafka_kafka:latest 20 | depends_on: 21 | zookeeper: 22 | condition: service_healthy 23 | environment: 24 | - KAFKA_BROKER_ID=2 25 | - KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181 26 | - KAFKA_LISTENERS=PLAINTEXT://kafka-2:9092 27 | - KAFKA_ADVERTISED_LISTENERS=PLAINTEXT://kafka-2:9092 28 | zeek: 29 | build: 30 | context: containers/zeek 31 | args: 32 | ZEEK_VERSION: "3.2.1" 33 | LIBRDKAFKA_VERSION: "1.4.2" 34 | image: metron-bro-plugin-kafka_zeek:latest 35 | depends_on: 36 | zookeeper: 37 | condition: service_healthy 38 | kafka-1: 39 | condition: service_healthy 40 | kafka-2: 41 | condition: service_healthy 42 | volumes: 43 | - "${DATA_PATH}:/root/data" 44 | - "${TEST_OUTPUT_PATH}:/root/test_output" 45 | - "${PLUGIN_ROOT_DIR}:/root/code" 46 | - "${OUR_SCRIPTS_PATH}:/root/built_in_scripts" 47 | tty: true 48 | -------------------------------------------------------------------------------- /docker/finish_end_to_end.sh: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env bash 2 | 3 | # 4 | # Licensed to the Apache Software Foundation (ASF) under one or more 5 | # contributor license agreements. See the NOTICE file distributed with 6 | # this work for additional information regarding copyright ownership. 7 | # The ASF licenses this file to You under the Apache License, Version 2.0 8 | # (the "License"); you may not use this file except in compliance with 9 | # the License. You may obtain a copy of the License at 10 | # 11 | # http://www.apache.org/licenses/LICENSE-2.0 12 | # 13 | # Unless required by applicable law or agreed to in writing, software 14 | # distributed under the License is distributed on an "AS IS" BASIS, 15 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 16 | # See the License for the specific language governing permissions and 17 | # limitations under the License. 18 | # 19 | 20 | # 21 | # This script should be run _after_ run_end_to_end.sh when you are finished with your testing and the containers. 
22 | # Do not run this if you plan on running docker_execute_shell.sh for example 23 | # 24 | 25 | shopt -s nocasematch 26 | set -u # nounset 27 | set -e # errexit 28 | set -E # errtrap 29 | set -o pipefail 30 | 31 | PROJECT_NAME="metron-bro-plugin-kafka" 32 | 33 | # Stop docker compose 34 | COMPOSE_PROJECT_NAME="${PROJECT_NAME}" docker-compose down 35 | 36 | -------------------------------------------------------------------------------- /docker/in_docker_scripts/build_plugin.sh: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env bash 2 | 3 | # 4 | # Licensed to the Apache Software Foundation (ASF) under one or more 5 | # contributor license agreements. See the NOTICE file distributed with 6 | # this work for additional information regarding copyright ownership. 7 | # The ASF licenses this file to You under the Apache License, Version 2.0 8 | # (the "License"); you may not use this file except in compliance with 9 | # the License. You may obtain a copy of the License at 10 | # 11 | # http://www.apache.org/licenses/LICENSE-2.0 12 | # 13 | # Unless required by applicable law or agreed to in writing, software 14 | # distributed under the License is distributed on an "AS IS" BASIS, 15 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 16 | # See the License for the specific language governing permissions and 17 | # limitations under the License. 18 | # 19 | 20 | shopt -s nocasematch 21 | shopt -s globstar nullglob 22 | shopt -s nocasematch 23 | set -u # nounset 24 | # set -e (errexit) omitted to enable printfiles function call 25 | set -E # errtrap 26 | set -o pipefail 27 | 28 | # 29 | # Runs zkg to build and install the plugin 30 | # 31 | 32 | function help { 33 | echo " " 34 | echo "usage: ${0}" 35 | echo " --plugin-version [REQUIRED] The plugin version." 36 | echo " -h/--help Usage information." 
37 | echo " " 38 | echo " " 39 | } 40 | 41 | function printfiles { 42 | echo "===================================================" 43 | echo "ERR" 44 | cat /root/.zkg/testing/code/clones/code/zkg.test_command.stderr 45 | echo "===================================================" 46 | echo "OUT" 47 | cat /root/.zkg/testing/code/clones/code/zkg.test_command.stdout 48 | echo "===================================================" 49 | echo "" 50 | echo "===================================================" 51 | echo "" 52 | } 53 | 54 | PLUGIN_VERSION= 55 | 56 | # Handle command line options 57 | for i in "$@"; do 58 | case $i in 59 | # 60 | # PLUGIN_VERSION 61 | # 62 | # --plugin-version 63 | # 64 | --plugin-version=*) 65 | PLUGIN_VERSION="${i#*=}" 66 | shift # past argument=value 67 | ;; 68 | 69 | # 70 | # -h/--help 71 | # 72 | -h | --help) 73 | help 74 | exit 0 75 | shift # past argument with no value 76 | ;; 77 | 78 | # 79 | # Unknown option 80 | # 81 | *) 82 | UNKNOWN_OPTION="${i#*=}" 83 | echo "Error: unknown option: $UNKNOWN_OPTION" 84 | help 85 | ;; 86 | esac 87 | done 88 | 89 | if [[ -z "${PLUGIN_VERSION}" ]]; then 90 | echo "PLUGIN_VERSION must be passed" 91 | exit 1 92 | fi 93 | 94 | echo "PLUGIN_VERSION = ${PLUGIN_VERSION}" 95 | 96 | cd /root || exit 1 97 | 98 | echo "===================================================" 99 | 100 | zkg -vvv test code --version "${PLUGIN_VERSION}" 101 | rc=$?; if [[ ${rc} != 0 ]]; then 102 | echo "ERROR running zkg test ${rc}" 103 | printfiles 104 | exit ${rc} 105 | fi 106 | 107 | zkg -vvv install code --skiptests --version "${PLUGIN_VERSION}" --force 108 | rc=$?; if [[ ${rc} != 0 ]]; then 109 | echo "ERROR running zkg install ${rc}" 110 | printfiles 111 | exit ${rc} 112 | fi 113 | 114 | zeek -NN Apache::Kafka 115 | 116 | echo "===================================================" 117 | echo "" 118 | 119 | -------------------------------------------------------------------------------- /docker/in_docker_scripts/configure_plugin.sh: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env bash 2 | 3 | # 4 | # Licensed to the Apache Software Foundation (ASF) under one or more 5 | # contributor license agreements. See the NOTICE file distributed with 6 | # this work for additional information regarding copyright ownership. 7 | # The ASF licenses this file to You under the Apache License, Version 2.0 8 | # (the "License"); you may not use this file except in compliance with 9 | # the License. You may obtain a copy of the License at 10 | # 11 | # http://www.apache.org/licenses/LICENSE-2.0 12 | # 13 | # Unless required by applicable law or agreed to in writing, software 14 | # distributed under the License is distributed on an "AS IS" BASIS, 15 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 16 | # See the License for the specific language governing permissions and 17 | # limitations under the License. 18 | # 19 | 20 | shopt -s nocasematch 21 | 22 | # 23 | # Configures the zeek kafka plugin 24 | # Configures the kafka broker 25 | # Configures the plugin for all the traffic types 26 | # Configures the plugin to add some additional json values 27 | # 28 | 29 | function help { 30 | echo " " 31 | echo "usage: ${0}" 32 | echo " --kafka-topic [OPTIONAL] The kafka topic to configure. Default: zeek" 33 | echo " -h/--help Usage information." 
34 | echo " " 35 | echo " " 36 | } 37 | 38 | KAFKA_TOPIC=zeek 39 | 40 | # Handle command line options 41 | for i in "$@"; do 42 | case $i in 43 | # 44 | # KAFKA_TOPIC 45 | # 46 | # --kafka-topic 47 | # 48 | --kafka-topic=*) 49 | KAFKA_TOPIC="${i#*=}" 50 | shift # past argument=value 51 | ;; 52 | # 53 | # -h/--help 54 | # 55 | -h | --help) 56 | help 57 | exit 0 58 | shift # past argument with no value 59 | ;; 60 | # 61 | # Unknown option 62 | # 63 | *) 64 | UNKNOWN_OPTION="${i#*=}" 65 | echo "Error: unknown option: $UNKNOWN_OPTION" 66 | help 67 | ;; 68 | esac 69 | done 70 | 71 | echo "Configuring kafka plugin" 72 | { 73 | echo "@load packages" 74 | echo "redef Kafka::logs_to_send = set(HTTP::LOG, DNS::LOG, Conn::LOG, DPD::LOG, FTP::LOG, Files::LOG, Known::CERTS_LOG, SMTP::LOG, SSL::LOG, Weird::LOG, Notice::LOG, DHCP::LOG, SSH::LOG, Software::LOG, RADIUS::LOG, X509::LOG, RFB::LOG, Stats::LOG, CaptureLoss::LOG, SIP::LOG);" 75 | echo "redef Kafka::topic_name = \"${KAFKA_TOPIC}\";" 76 | echo "redef Kafka::tag_json = T;" 77 | echo "redef Kafka::kafka_conf = table([\"metadata.broker.list\"] = \"kafka-1:9092,kafka-2:9092\");" 78 | echo "redef Kafka::additional_message_values = table([\"FIRST_STATIC_NAME\"] = \"FIRST_STATIC_VALUE\", [\"SECOND_STATIC_NAME\"] = \"SECOND_STATIC_VALUE\");" 79 | echo "redef Kafka::logs_to_exclude = set(Conn::LOG, DHCP::LOG);" 80 | echo "redef Known::cert_tracking = ALL_HOSTS;" 81 | echo "redef Software::asset_tracking = ALL_HOSTS;" 82 | } >> /usr/local/zeek/share/zeek/site/local.zeek 83 | 84 | # Comment out the load statement for "log-hostcerts-only.zeek" in zeek's 85 | # default local.zeek as of 3.1.2 in order to log all certificates to x509.log 86 | sed -i 's%^@load protocols/ssl/log-hostcerts-only%#&%' /usr/local/zeek/share/zeek/site/local.zeek 87 | 88 | -------------------------------------------------------------------------------- /docker/in_docker_scripts/process_data_file.sh: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env bash 2 | # shellcheck disable=SC2010 3 | 4 | # 5 | # Licensed to the Apache Software Foundation (ASF) under one or more 6 | # contributor license agreements. See the NOTICE file distributed with 7 | # this work for additional information regarding copyright ownership. 8 | # The ASF licenses this file to You under the Apache License, Version 2.0 9 | # (the "License"); you may not use this file except in compliance with 10 | # the License. You may obtain a copy of the License at 11 | # 12 | # http://www.apache.org/licenses/LICENSE-2.0 13 | # 14 | # Unless required by applicable law or agreed to in writing, software 15 | # distributed under the License is distributed on an "AS IS" BASIS, 16 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 17 | # See the License for the specific language governing permissions and 18 | # limitations under the License. 
19 | # 20 | 21 | 22 | shopt -s nocasematch 23 | shopt -s globstar nullglob 24 | shopt -s nocasematch 25 | set -u # nounset 26 | set -e # errexit 27 | set -E # errtrap 28 | set -o pipefail 29 | 30 | PCAP_FILE_NAME= 31 | OUTPUT_DIRECTORY_NAME= 32 | 33 | # Handle command line options 34 | for i in "$@"; do 35 | case $i in 36 | # 37 | # PCAP_FILE_NAME 38 | # 39 | # --pcap-file-name 40 | # 41 | --pcap-file-name=*) 42 | PCAP_FILE_NAME="${i#*=}" 43 | shift # past argument=value 44 | ;; 45 | 46 | # 47 | # OUTPUT_DIRECTORY_NAME 48 | # 49 | # --output-directory-name 50 | # 51 | --output-directory-name=*) 52 | OUTPUT_DIRECTORY_NAME="${i#*=}" 53 | shift # past argument=value 54 | ;; 55 | 56 | # 57 | # Unknown option 58 | # 59 | *) 60 | UNKNOWN_OPTION="${i#*=}" 61 | echo "Error: unknown option: $UNKNOWN_OPTION" 62 | help 63 | ;; 64 | esac 65 | done 66 | 67 | echo "PCAP_FILE_NAME = ${PCAP_FILE_NAME}" 68 | echo "OUTPUT_DIRECTORY_NAME = ${OUTPUT_DIRECTORY_NAME}" 69 | 70 | echo "================================" 71 | if [ ! -d /root/data ]; then 72 | echo "DATA_PATH is not available" 73 | exit 1 74 | fi 75 | 76 | cd /root/test_output/"${OUTPUT_DIRECTORY_NAME}" || exit 1 77 | find /root/data -type f -name "${PCAP_FILE_NAME}" -print0 | xargs -0 zeek /usr/local/zeek/share/zeek/site/local.zeek -C -r 78 | echo "done with ${PCAP_FILE_NAME}" 79 | 80 | -------------------------------------------------------------------------------- /docker/run_end_to_end.sh: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env bash 2 | 3 | # 4 | # Licensed to the Apache Software Foundation (ASF) under one or more 5 | # contributor license agreements. See the NOTICE file distributed with 6 | # this work for additional information regarding copyright ownership. 7 | # The ASF licenses this file to You under the Apache License, Version 2.0 8 | # (the "License"); you may not use this file except in compliance with 9 | # the License. You may obtain a copy of the License at 10 | # 11 | # http://www.apache.org/licenses/LICENSE-2.0 12 | # 13 | # Unless required by applicable law or agreed to in writing, software 14 | # distributed under the License is distributed on an "AS IS" BASIS, 15 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 16 | # See the License for the specific language governing permissions and 17 | # limitations under the License. 18 | # 19 | 20 | shopt -s nocasematch 21 | set -u # nounset 22 | set -E # errtrap 23 | set -o pipefail 24 | 25 | function help { 26 | echo " " 27 | echo "USAGE" 28 | echo " --skip-docker-build [OPTIONAL] Skip build of zeek docker machine." 29 | echo " --data-path [OPTIONAL] The pcap data path. Default: ./data" 30 | echo " --kafka-topic [OPTIONAL] The kafka topic to consume from. Default: zeek" 31 | echo " --partitions [OPTIONAL] The number of kafka partitions to create. Default: 2" 32 | echo " --plugin-version [OPTIONAL] The plugin version. Default: the current branch name" 33 | echo " --no-pcap [OPTIONAL] Do not run pcaps." 34 | echo " -h/--help Usage information." 35 | echo " " 36 | echo "COMPATABILITY" 37 | echo " bash >= 4.0 is required." 38 | echo " " 39 | } 40 | 41 | # Require bash >= 4 42 | if (( BASH_VERSINFO[0] < 4 )); then 43 | >&2 echo "ERROR> bash >= 4.0 is required" >&2 44 | help 45 | exit 1 46 | fi 47 | 48 | SKIP_REBUILD_ZEEK=false 49 | NO_PCAP=false 50 | ROOT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" > /dev/null && pwd)" 51 | PLUGIN_ROOT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && cd .. 
> /dev/null && pwd)" 52 | SCRIPT_DIR="${ROOT_DIR}"/scripts 53 | DATA_PATH="${ROOT_DIR}"/data 54 | DATE=$(date) 55 | LOG_DATE=${DATE// /_} 56 | TEST_OUTPUT_PATH="${ROOT_DIR}/test_output/"${LOG_DATE//:/_} 57 | KAFKA_TOPIC="zeek" 58 | PARTITIONS=2 59 | PROJECT_NAME="metron-bro-plugin-kafka" 60 | OUR_SCRIPTS_PATH="${PLUGIN_ROOT_DIR}/docker/in_docker_scripts" 61 | 62 | cd "${PLUGIN_ROOT_DIR}" || { echo "NO PLUGIN ROOT" ; exit 1; } 63 | # we may not be checked out from git, check and make it so that we are since 64 | # zkg requires it 65 | 66 | git status &>/dev/null 67 | rc=$?; if [[ ${rc} != 0 ]]; then 68 | echo "zkg requires the plugin to be a git repo, creating..." 69 | git init . 70 | rc=$?; if [[ ${rc} != 0 ]]; then 71 | echo "ERROR> FAILED TO INITIALIZE GIT IN PLUGIN DIRECTORY. ${rc}" 72 | exit ${rc} 73 | fi 74 | git add . 75 | rc=$?; if [[ ${rc} != 0 ]]; then 76 | echo "ERROR> FAILED TO ADD ALL TO GIT PLUGIN DIRECTORY. ${rc}" 77 | exit ${rc} 78 | fi 79 | git commit -m 'docker run' 80 | rc=$?; if [[ ${rc} != 0 ]]; then 81 | echo "ERROR> FAILED TO COMMIT TO GIT MASTER IN PLUGIN DIRECTORY. ${rc}" 82 | exit ${rc} 83 | fi 84 | echo "git repo created" 85 | fi 86 | 87 | # set errexit for the rest of the run 88 | set -e 89 | 90 | # use the local hash as refs will use remotes by default 91 | PLUGIN_VERSION=$(git rev-parse --verify HEAD) 92 | 93 | # Handle command line options 94 | for i in "$@"; do 95 | case $i in 96 | # 97 | # SKIP_REBUILD_ZEEK 98 | # 99 | # --skip-docker-build 100 | # 101 | --skip-docker-build) 102 | SKIP_REBUILD_ZEEK=true 103 | shift # past argument 104 | ;; 105 | # 106 | # NO_PCAP 107 | # 108 | # --no-pcap 109 | # 110 | --no-pcap) 111 | NO_PCAP=true 112 | shift # past argument 113 | ;; 114 | # 115 | # DATA_PATH 116 | # 117 | --data-path=*) 118 | DATA_PATH="${i#*=}" 119 | shift # past argument=value 120 | ;; 121 | # 122 | # KAFKA_TOPIC 123 | # 124 | # --kafka-topic 125 | # 126 | --kafka-topic=*) 127 | KAFKA_TOPIC="${i#*=}" 128 | shift # past argument=value 129 | ;; 130 | # 131 | # PARTITIONS 132 | # 133 | # --partitions 134 | # 135 | --partitions=*) 136 | PARTITIONS="${i#*=}" 137 | shift # past argument=value 138 | ;; 139 | # 140 | # PLUGIN_VERSION 141 | # 142 | # --plugin-version 143 | # 144 | --plugin-version=*) 145 | PLUGIN_VERSION="${i#*=}" 146 | shift # past argument=value 147 | ;; 148 | # 149 | # -h/--help 150 | # 151 | -h | --help) 152 | help 153 | exit 0 154 | shift # past argument with no value 155 | ;; 156 | esac 157 | done 158 | 159 | cd "${ROOT_DIR}" || { echo "ROOT_DIR unavailable" ; exit 1; } 160 | echo "Running the end to end tests with" 161 | echo "COMPOSE_PROJECT_NAME = ${PROJECT_NAME}" 162 | echo "SKIP_REBUILD_ZEEK = ${SKIP_REBUILD_ZEEK}" 163 | echo "KAFKA_TOPIC = ${KAFKA_TOPIC}" 164 | echo "PARTITIONS = ${PARTITIONS}" 165 | echo "PLUGIN_VERSION = ${PLUGIN_VERSION}" 166 | echo "DATA_PATH = ${DATA_PATH}" 167 | echo "TEST_OUTPUT_PATH = ${TEST_OUTPUT_PATH}" 168 | echo "PLUGIN_ROOT_DIR = ${PLUGIN_ROOT_DIR}" 169 | echo "OUR_SCRIPTS_PATH = ${OUR_SCRIPTS_PATH}" 170 | echo "===================================================" 171 | 172 | # Run docker compose, rebuilding as specified 173 | if [[ "$SKIP_REBUILD_ZEEK" = false ]]; then 174 | COMPOSE_PROJECT_NAME="${PROJECT_NAME}" \ 175 | DATA_PATH=${DATA_PATH} \ 176 | TEST_OUTPUT_PATH=${TEST_OUTPUT_PATH} \ 177 | PLUGIN_ROOT_DIR=${PLUGIN_ROOT_DIR} \ 178 | OUR_SCRIPTS_PATH=${OUR_SCRIPTS_PATH} \ 179 | docker-compose up -d --build 180 | else 181 | COMPOSE_PROJECT_NAME="${PROJECT_NAME}" \ 182 | DATA_PATH=${DATA_PATH} \ 183 | 
TEST_OUTPUT_PATH=${TEST_OUTPUT_PATH} \ 184 | PLUGIN_ROOT_DIR=${PLUGIN_ROOT_DIR} \ 185 | OUR_SCRIPTS_PATH=${OUR_SCRIPTS_PATH} \ 186 | docker-compose up -d 187 | fi 188 | 189 | # Create the kafka topic 190 | "${SCRIPT_DIR}"/docker_execute_create_topic_in_kafka.sh --kafka-topic="${KAFKA_TOPIC}" --partitions="${PARTITIONS}" 191 | 192 | # Download the pcaps 193 | "${SCRIPT_DIR}"/download_sample_pcaps.sh --data-path="${DATA_PATH}" 194 | 195 | # Build the zeek plugin 196 | "${SCRIPT_DIR}"/docker_execute_build_plugin.sh --plugin-version="${PLUGIN_VERSION}" 197 | 198 | # Configure the plugin 199 | "${SCRIPT_DIR}"/docker_execute_configure_plugin.sh --kafka-topic="${KAFKA_TOPIC}" 200 | 201 | if [[ "$NO_PCAP" == false ]]; then 202 | # for each pcap in the data directory, we want to 203 | # run zeek then read the output from kafka 204 | # and output both of them to the same directory named 205 | # for the date/pcap 206 | for file in "${DATA_PATH}"/**/*.pcap* 207 | do 208 | # get the file name 209 | BASE_FILE_NAME=$(basename "${file}") 210 | DOCKER_DIRECTORY_NAME=${BASE_FILE_NAME//\./_} 211 | 212 | mkdir "${TEST_OUTPUT_PATH}/${DOCKER_DIRECTORY_NAME}" || exit 1 213 | echo "MADE ${TEST_OUTPUT_PATH}/${DOCKER_DIRECTORY_NAME}" 214 | 215 | # get the offsets in kafka for the provided topic 216 | # this is where we are going to _start_, and must happen 217 | # before processing the pcap 218 | OFFSETS=$("${SCRIPT_DIR}"/docker_run_get_offset_kafka.sh --kafka-topic="${KAFKA_TOPIC}") 219 | 220 | "${SCRIPT_DIR}"/docker_execute_process_data_file.sh --pcap-file-name="${BASE_FILE_NAME}" --output-directory-name="${DOCKER_DIRECTORY_NAME}" 221 | 222 | # loop through each partition 223 | while IFS= read -r line; do 224 | # shellcheck disable=SC2001 225 | OFFSET=$(echo "${line}" | sed "s/^${KAFKA_TOPIC}:.*:\(.*\)$/\1/") 226 | # shellcheck disable=SC2001 227 | PARTITION=$(echo "${line}" | sed "s/^${KAFKA_TOPIC}:\(.*\):.*$/\1/") 228 | 229 | echo "PARTITION---------------> ${PARTITION}" 230 | echo "OFFSET------------------> ${OFFSET}" 231 | 232 | KAFKA_OUTPUT_FILE="${TEST_OUTPUT_PATH}/${DOCKER_DIRECTORY_NAME}/kafka-output.log" 233 | "${SCRIPT_DIR}"/docker_run_consume_kafka.sh --offset="${OFFSET}" --partition="${PARTITION}" --kafka-topic="${KAFKA_TOPIC}" 1>>"${KAFKA_OUTPUT_FILE}" 2>/dev/null 234 | done <<< "${OFFSETS}" 235 | 236 | "${SCRIPT_DIR}"/split_kafka_output_by_log.sh --log-directory="${TEST_OUTPUT_PATH}/${DOCKER_DIRECTORY_NAME}" 237 | done 238 | 239 | "${SCRIPT_DIR}"/print_results.sh --test-directory="${TEST_OUTPUT_PATH}" 240 | 241 | "${SCRIPT_DIR}"/analyze_results.sh --test-directory="${TEST_OUTPUT_PATH}" 242 | fi 243 | 244 | echo "" 245 | echo "Run complete" 246 | echo "The kafka and zeek output can be found at ${TEST_OUTPUT_PATH}" 247 | echo "You may now work with the containers if you will. You need to call finish_end_to_end.sh when you are done" 248 | 249 | -------------------------------------------------------------------------------- /docker/scripts/analyze_results.sh: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env bash 2 | 3 | # 4 | # Licensed to the Apache Software Foundation (ASF) under one or more 5 | # contributor license agreements. See the NOTICE file distributed with 6 | # this work for additional information regarding copyright ownership. 7 | # The ASF licenses this file to You under the Apache License, Version 2.0 8 | # (the "License"); you may not use this file except in compliance with 9 | # the License. 
You may obtain a copy of the License at 10 | # 11 | # http://www.apache.org/licenses/LICENSE-2.0 12 | # 13 | # Unless required by applicable law or agreed to in writing, software 14 | # distributed under the License is distributed on an "AS IS" BASIS, 15 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 16 | # See the License for the specific language governing permissions and 17 | # limitations under the License. 18 | # 19 | 20 | shopt -s nocasematch 21 | #set -u # nounset disabled 22 | set -e # errexit 23 | set -E # errtrap 24 | set -o pipefail 25 | 26 | declare -r txtDEFAULT='\033[0m' 27 | # shellcheck disable=SC2034 28 | declare -r txtERROR='\033[0;31m' 29 | # shellcheck disable=SC2034 30 | declare -r txtWARN='\033[0;33m' 31 | 32 | # 33 | # Analyzes the results.csv files to identify issues 34 | # 35 | 36 | function help { 37 | echo " " 38 | echo "usage: ${0}" 39 | echo " --test-directory [REQUIRED] The directory for the tests" 40 | echo " -h/--help Usage information." 41 | echo " " 42 | echo " " 43 | } 44 | 45 | function _echo() { 46 | color="txt${1:-DEFAULT}" 47 | case "${1}" in 48 | ERROR) 49 | >&2 echo -e "${!color}${1}> ${2}${txtDEFAULT}" 50 | ;; 51 | WARN) 52 | echo -e "${!color}${1}> ${2}${txtDEFAULT}" 53 | ;; 54 | *) 55 | echo -e "${!color}${1}> ${2}${txtDEFAULT}" 56 | ;; 57 | esac 58 | } 59 | 60 | # Require bash >= 4 61 | if (( BASH_VERSINFO[0] < 4 )); then 62 | _echo ERROR "bash >= 4.0 is required" 63 | exit 1 64 | fi 65 | 66 | SCRIPT_NAME=$(basename -- "$0") 67 | TEST_DIRECTORY= 68 | declare -A LOGS_WITH_UNEQUAL_RESULTS 69 | declare -a LOG_NAMES 70 | declare -A OVERALL_LOG_CARDINALITY 71 | declare -A LOG_ISSUE_COUNT 72 | 73 | # Handle command line options 74 | for i in "$@"; do 75 | case $i in 76 | # 77 | # TEST_DIRECTORY 78 | # 79 | # --test-directory 80 | # 81 | --test-directory=*) 82 | TEST_DIRECTORY="${i#*=}" 83 | shift # past argument=value 84 | ;; 85 | 86 | # 87 | # -h/--help 88 | # 89 | -h | --help) 90 | help 91 | exit 0 92 | shift # past argument with no value 93 | ;; 94 | 95 | # 96 | # Unknown option 97 | # 98 | *) 99 | UNKNOWN_OPTION="${i#*=}" 100 | _echo ERROR "unknown option: $UNKNOWN_OPTION" 101 | help 102 | ;; 103 | esac 104 | done 105 | 106 | if [[ -z "$TEST_DIRECTORY" ]]; then 107 | echo "$TEST_DIRECTORY must be passed" 108 | exit 1 109 | fi 110 | 111 | echo "Running ${SCRIPT_NAME} with" 112 | echo "TEST_DIRECTORY = $TEST_DIRECTORY" 113 | echo "===================================================" 114 | 115 | ## Main functions 116 | function count_occurrences_of_each_log_file 117 | { 118 | # Count the number of occurences of each log name 119 | for LOG_NAME in "${LOG_NAMES[@]}"; do 120 | (( ++OVERALL_LOG_CARDINALITY["${LOG_NAME}"] )) 121 | done 122 | } 123 | 124 | function check_for_unequal_log_counts 125 | { 126 | RESULTS_FILE="${1}" 127 | 128 | # Get the pcap folder name from the provided file 129 | # shellcheck disable=SC2001 130 | PCAP_FOLDER="$( cd "$( dirname "${RESULTS_FILE}" )" >/dev/null 2>&1 && echo "${PWD##*/}")" 131 | 132 | # Check each log line in the provided log file for unequal results 133 | for LOG_NAME in "${LOG_NAMES[@]}"; do 134 | # For each log in the provided results, identify any unequal log counts 135 | UNEQUAL_LOG=$(awk -F\, -v log_name="${LOG_NAME}" '$1 == log_name && $2 != $3 {print $1}' "${RESULTS_FILE}") 136 | 137 | # Create a space separated list of unequal logs to simulate a 138 | # multidimensional array 139 | if [[ -n "${UNEQUAL_LOG}" ]]; then 140 | if [[ "${#LOGS_WITH_UNEQUAL_RESULTS[${PCAP_FOLDER}]}" 
-eq 0 ]]; then 141 | LOGS_WITH_UNEQUAL_RESULTS["${PCAP_FOLDER}"]="${UNEQUAL_LOG}" 142 | else 143 | LOGS_WITH_UNEQUAL_RESULTS["${PCAP_FOLDER}"]+=" ${UNEQUAL_LOG}" 144 | fi 145 | fi 146 | done 147 | } 148 | 149 | function print_unequal_results 150 | { 151 | # Output a table with the pcap file and log name details where the imbalance 152 | # was detected 153 | { 154 | echo "PCAP FOLDER,LOG NAME" 155 | 156 | for KEY in "${!LOGS_WITH_UNEQUAL_RESULTS[@]}"; do 157 | # This must be done because we are simulating multidimensional arrays due to 158 | # the lack of native bash support 159 | for VALUE in ${LOGS_WITH_UNEQUAL_RESULTS[${KEY}]}; do 160 | echo "${KEY},${VALUE}" 161 | done 162 | done 163 | } | column -t -s ',' 164 | } 165 | 166 | function print_log_comparison_insights 167 | { 168 | # Load the log to instance count mapping from LOGS_WITH_UNEQUAL_RESULTS into a new 169 | # associative array 170 | # shellcheck disable=SC2046 171 | declare -A $(echo "${LOGS_WITH_UNEQUAL_RESULTS[@]}" | tr ' ' '\n' | sort | uniq -c | awk '{print "LOG_ISSUE_COUNT["$2"]="$1}') 172 | 173 | # Compare each log type's instances of inequality to the total number of 174 | # instances of each log. If they are equal, this indicates that there may be 175 | # a log-type related issue. 176 | # 177 | # For example, if count_occurrences_of_each_log_file identified that there 178 | # were 10 instances of http logs across all of the `results.csv` files, 179 | # ${OVERALL_LOG_CARDINALITY[http]} should equal 10. If check_for_unequal_log_counts 180 | # independently found 10 instances where the http zeek and kafka log counts 181 | # from the `results.csv` files were not equal, ${LOG_ISSUE_COUNT[http]} 182 | # would also have 10 entries, causing us to warn the user of that insight. 183 | for KEY in "${!LOG_ISSUE_COUNT[@]}"; do 184 | if [[ "${LOG_ISSUE_COUNT[${KEY}]}" == "${OVERALL_LOG_CARDINALITY[${KEY}]}" ]]; then 185 | _echo WARN "None of the ${KEY} log counts were the same between zeek and kafka. This may indicate an issue specific to that log." 186 | fi 187 | done 188 | } 189 | 190 | ## Main 191 | # Move over to the docker area 192 | cd "${TEST_DIRECTORY}" || exit 1 193 | # Get a list of results files 194 | RESULTS_FILES=$(find "${TEST_DIRECTORY}" -name "results.csv") 195 | # Analyze each results file for issues 196 | for file in $RESULTS_FILES; do 197 | # Capture the first column (the log names) of the provided file's contents in 198 | # the array LOG_NAMES, excluding the header 199 | mapfile -s 1 -t LOG_NAMES < <(awk -F\, '{print $1}' "${file}") 200 | 201 | count_occurrences_of_each_log_file 202 | check_for_unequal_log_counts "${file}" 203 | done 204 | 205 | if [[ "${#LOGS_WITH_UNEQUAL_RESULTS[@]}" -gt 0 ]]; then 206 | _echo ERROR "UNEQUALITY FOUND IN ZEEK AND KAFKA LOG COUNTS" 207 | echo "" 208 | 209 | print_unequal_results 210 | print_log_comparison_insights 211 | 212 | exit 1 213 | fi 214 | 215 | -------------------------------------------------------------------------------- /docker/scripts/docker_execute_build_plugin.sh: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env bash 2 | 3 | # 4 | # Licensed to the Apache Software Foundation (ASF) under one or more 5 | # contributor license agreements. See the NOTICE file distributed with 6 | # this work for additional information regarding copyright ownership. 7 | # The ASF licenses this file to You under the Apache License, Version 2.0 8 | # (the "License"); you may not use this file except in compliance with 9 | # the License. 
You may obtain a copy of the License at 10 | # 11 | # http://www.apache.org/licenses/LICENSE-2.0 12 | # 13 | # Unless required by applicable law or agreed to in writing, software 14 | # distributed under the License is distributed on an "AS IS" BASIS, 15 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 16 | # See the License for the specific language governing permissions and 17 | # limitations under the License. 18 | # 19 | 20 | shopt -s nocasematch 21 | set -u # nounset 22 | set -e # errexit 23 | set -E # errtrap 24 | set -o pipefail 25 | 26 | # 27 | # Executes the build_plugin.sh script in the container 28 | # 29 | 30 | function help { 31 | echo " " 32 | echo "usage: ${0}" 33 | echo " --container-name [OPTIONAL] The Docker container name. Default: metron-bro-plugin-kafka_zeek_1" 34 | echo " --plugin-version [REQUIRED] The plugin version." 35 | echo " -h/--help Usage information." 36 | echo " " 37 | echo " " 38 | } 39 | 40 | CONTAINER_NAME="metron-bro-plugin-kafka_zeek_1" 41 | PLUGIN_VERSION= 42 | 43 | # handle command line options 44 | for i in "$@"; do 45 | case $i in 46 | # 47 | # CONTAINER_NAME 48 | # 49 | # --container-name 50 | # 51 | --container-name=*) 52 | CONTAINER_NAME="${i#*=}" 53 | shift # past argument=value 54 | ;; 55 | 56 | # 57 | # PLUGIN_VERSION 58 | # 59 | # --plugin-version 60 | # 61 | --plugin-version=*) 62 | PLUGIN_VERSION="${i#*=}" 63 | shift # past argument=value 64 | ;; 65 | 66 | # 67 | # -h/--help 68 | # 69 | -h | --help) 70 | help 71 | exit 0 72 | shift # past argument with no value 73 | ;; 74 | 75 | # 76 | # Unknown option 77 | # 78 | *) 79 | UNKNOWN_OPTION="${i#*=}" 80 | echo "Error: unknown option: $UNKNOWN_OPTION" 81 | help 82 | ;; 83 | esac 84 | done 85 | 86 | if [[ -z "${PLUGIN_VERSION}" ]]; then 87 | echo "PLUGIN_VERSION must be passed" 88 | exit 1 89 | fi 90 | 91 | echo "Running build_plugin with " 92 | echo "CONTAINER_NAME = $CONTAINER_NAME" 93 | echo "===================================================" 94 | 95 | docker exec -w /root "${CONTAINER_NAME}" bash -c "/root/built_in_scripts/build_plugin.sh --plugin-version=${PLUGIN_VERSION}" 96 | 97 | echo "Built the plugin" 98 | 99 | -------------------------------------------------------------------------------- /docker/scripts/docker_execute_configure_plugin.sh: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env bash 2 | 3 | # 4 | # Licensed to the Apache Software Foundation (ASF) under one or more 5 | # contributor license agreements. See the NOTICE file distributed with 6 | # this work for additional information regarding copyright ownership. 7 | # The ASF licenses this file to You under the Apache License, Version 2.0 8 | # (the "License"); you may not use this file except in compliance with 9 | # the License. You may obtain a copy of the License at 10 | # 11 | # http://www.apache.org/licenses/LICENSE-2.0 12 | # 13 | # Unless required by applicable law or agreed to in writing, software 14 | # distributed under the License is distributed on an "AS IS" BASIS, 15 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 16 | # See the License for the specific language governing permissions and 17 | # limitations under the License. 
18 | # 19 | 20 | shopt -s nocasematch 21 | set -u # nounset 22 | set -e # errexit 23 | set -E # errtrap 24 | set -o pipefail 25 | 26 | # 27 | # Executes the configure_plugin.sh in the docker container 28 | # 29 | 30 | function help { 31 | echo " " 32 | echo "usage: ${0}" 33 | echo " --container-name [OPTIONAL] The Docker container name. Default: metron-bro-plugin-kafka_zeek_1" 34 | echo " --kafka-topic [OPTIONAL] The kafka topic to create. Default: zeek" 35 | echo " -h/--help Usage information." 36 | echo " " 37 | echo " " 38 | } 39 | 40 | CONTAINER_NAME=metron-bro-plugin-kafka_zeek_1 41 | KAFKA_TOPIC=zeek 42 | 43 | # Handle command line options 44 | for i in "$@"; do 45 | case $i in 46 | # 47 | # CONTAINER_NAME 48 | # 49 | # --container-name 50 | # 51 | --container-name=*) 52 | CONTAINER_NAME="${i#*=}" 53 | shift # past argument=value 54 | ;; 55 | # 56 | # KAFKA_TOPIC 57 | # 58 | # --kafka-topic 59 | # 60 | --kafka-topic=*) 61 | KAFKA_TOPIC="${i#*=}" 62 | shift # past argument=value 63 | ;; 64 | # 65 | # -h/--help 66 | # 67 | -h | --help) 68 | help 69 | exit 0 70 | shift # past argument with no value 71 | ;; 72 | # 73 | # Unknown option 74 | # 75 | *) 76 | UNKNOWN_OPTION="${i#*=}" 77 | echo "Error: unknown option: $UNKNOWN_OPTION" 78 | help 79 | ;; 80 | esac 81 | done 82 | 83 | echo "Running docker_execute_configure_plugin.sh with " 84 | echo "CONTAINER_NAME = ${CONTAINER_NAME}" 85 | echo "KAFKA_TOPIC = ${KAFKA_TOPIC}" 86 | echo "===================================================" 87 | 88 | docker exec -w /root "${CONTAINER_NAME}" bash -c "/root/built_in_scripts/configure_plugin.sh --kafka-topic=\"${KAFKA_TOPIC}\"" 89 | 90 | echo "configured the kafka plugin" 91 | 92 | -------------------------------------------------------------------------------- /docker/scripts/docker_execute_create_topic_in_kafka.sh: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env bash 2 | 3 | # 4 | # Licensed to the Apache Software Foundation (ASF) under one or more 5 | # contributor license agreements. See the NOTICE file distributed with 6 | # this work for additional information regarding copyright ownership. 7 | # The ASF licenses this file to You under the Apache License, Version 2.0 8 | # (the "License"); you may not use this file except in compliance with 9 | # the License. You may obtain a copy of the License at 10 | # 11 | # http://www.apache.org/licenses/LICENSE-2.0 12 | # 13 | # Unless required by applicable law or agreed to in writing, software 14 | # distributed under the License is distributed on an "AS IS" BASIS, 15 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 16 | # See the License for the specific language governing permissions and 17 | # limitations under the License. 18 | # 19 | 20 | shopt -s nocasematch 21 | set -u # nounset 22 | set -e # errexit 23 | set -E # errtrap 24 | set -o pipefail 25 | 26 | function help { 27 | echo " " 28 | echo "usage: ${0}" 29 | echo " --container-name [OPTIONAL] The Docker container name. Default: metron-bro-plugin-kafka_kafka-1_1" 30 | echo " --kafka-topic [OPTIONAL] The kafka topic to create. Default: zeek" 31 | echo " --partitions [OPTIONAL] The number of kafka partitions to create. Default: 2" 32 | echo " -h/--help Usage information." 
33 | echo " " 34 | } 35 | 36 | CONTAINER_NAME="metron-bro-plugin-kafka_kafka-1_1" 37 | KAFKA_TOPIC=zeek 38 | PARTITIONS=2 39 | 40 | # handle command line options 41 | for i in "$@"; do 42 | case $i in 43 | # 44 | # CONTAINER_NAME 45 | # 46 | # --container-name 47 | # 48 | --container-name=*) 49 | CONTAINER_NAME="${i#*=}" 50 | shift # past argument=value 51 | ;; 52 | # 53 | # KAFKA_TOPIC 54 | # 55 | # --kafka-topic 56 | # 57 | --kafka-topic=*) 58 | KAFKA_TOPIC="${i#*=}" 59 | shift # past argument=value 60 | ;; 61 | # 62 | # PARTITIONS 63 | # 64 | # --partitions 65 | # 66 | --partitions=*) 67 | PARTITIONS="${i#*=}" 68 | shift # past argument=value 69 | ;; 70 | # 71 | # -h/--help 72 | # 73 | -h | --help) 74 | help 75 | exit 0 76 | shift # past argument with no value 77 | ;; 78 | 79 | # 80 | # Unknown option 81 | # 82 | *) 83 | UNKNOWN_OPTION="${i#*=}" 84 | echo "Error: unknown option: $UNKNOWN_OPTION" 85 | help 86 | ;; 87 | esac 88 | done 89 | 90 | echo "Running docker_execute_create_topic_in_kafka.sh with " 91 | echo "CONTAINER_NAME = ${CONTAINER_NAME}" 92 | echo "KAFKA_TOPIC = ${KAFKA_TOPIC}" 93 | echo "PARTITIONS = ${PARTITIONS}" 94 | echo "===================================================" 95 | 96 | docker exec -w /opt/kafka/bin/ "${CONTAINER_NAME}" \ 97 | bash -c "JMX_PORT= ./kafka-topics.sh --create --topic ${KAFKA_TOPIC} --replication-factor 1 --partitions ${PARTITIONS} --zookeeper zookeeper:2181" 98 | 99 | -------------------------------------------------------------------------------- /docker/scripts/docker_execute_process_data_file.sh: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env bash 2 | 3 | # 4 | # Licensed to the Apache Software Foundation (ASF) under one or more 5 | # contributor license agreements. See the NOTICE file distributed with 6 | # this work for additional information regarding copyright ownership. 7 | # The ASF licenses this file to You under the Apache License, Version 2.0 8 | # (the "License"); you may not use this file except in compliance with 9 | # the License. You may obtain a copy of the License at 10 | # 11 | # http://www.apache.org/licenses/LICENSE-2.0 12 | # 13 | # Unless required by applicable law or agreed to in writing, software 14 | # distributed under the License is distributed on an "AS IS" BASIS, 15 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 16 | # See the License for the specific language governing permissions and 17 | # limitations under the License. 18 | # 19 | 20 | shopt -s nocasematch 21 | set -u # nounset 22 | set -e # errexit 23 | set -E # errtrap 24 | set -o pipefail 25 | 26 | # 27 | # Executes the process_data_dir.sh script in the container 28 | # 29 | 30 | function help { 31 | echo " " 32 | echo "usage: ${0}" 33 | echo " --container-name [OPTIONAL] The Docker container name. Default: metron-bro-plugin-kafka_zeek_1" 34 | echo " --pcap-file-name [REQUIRED] The name of the pcap file" 35 | echo " --output-directory-name [REQUIRED] The name of the output directory" 36 | echo " -h/--help Usage information." 
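A minimal example of the topic-creation wrapper above, assuming the kafka broker container is up under its default compose name; note that the underlying kafka-topics.sh call uses the older --zookeeper form, so the bundled Kafka image's tooling is expected to accept that flag:

    # create the default "zeek" topic with 2 partitions
    ./docker_execute_create_topic_in_kafka.sh

    # or an explicitly named topic with more partitions (values illustrative)
    ./docker_execute_create_topic_in_kafka.sh --kafka-topic=zeek_test --partitions=4
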
37 | echo " " 38 | echo " " 39 | } 40 | 41 | CONTAINER_NAME=metron-bro-plugin-kafka_zeek_1 42 | PCAP_FILE_NAME= 43 | OUTPUT_DIRECTORY_NAME= 44 | 45 | # Handle command line options 46 | for i in "$@"; do 47 | case $i in 48 | # 49 | # CONTAINER_NAME 50 | # 51 | # --container-name 52 | # 53 | --container-name=*) 54 | CONTAINER_NAME="${i#*=}" 55 | shift # past argument=value 56 | ;; 57 | 58 | # 59 | # PCAP_FILE_NAME 60 | # 61 | # --pcap-file-name 62 | # 63 | --pcap-file-name=*) 64 | PCAP_FILE_NAME="${i#*=}" 65 | shift # past argument=value 66 | ;; 67 | 68 | # 69 | # OUTPUT_DIRECTORY_NAME 70 | # 71 | # --output-directory-name 72 | # 73 | --output-directory-name=*) 74 | OUTPUT_DIRECTORY_NAME="${i#*=}" 75 | shift # past argument=value 76 | ;; 77 | 78 | # 79 | # -h/--help 80 | # 81 | -h | --help) 82 | help 83 | exit 0 84 | shift # past argument with no value 85 | ;; 86 | 87 | # 88 | # Unknown option 89 | # 90 | *) 91 | UNKNOWN_OPTION="${i#*=}" 92 | echo "Error: unknown option: $UNKNOWN_OPTION" 93 | help 94 | ;; 95 | esac 96 | done 97 | 98 | echo "Running docker_execute_process_data_dir with " 99 | echo "CONTAINER_NAME = $CONTAINER_NAME" 100 | echo "PCAP_FILE_NAME = ${PCAP_FILE_NAME}" 101 | echo "OUTPUT_DIRECTORY_NAME = ${OUTPUT_DIRECTORY_NAME}" 102 | echo "===================================================" 103 | 104 | echo "executing process_data_file.sh in the zeek docker container" 105 | echo " " 106 | 107 | docker exec -w /root "${CONTAINER_NAME}" bash -c "built_in_scripts/process_data_file.sh --pcap-file-name=${PCAP_FILE_NAME} --output-directory-name=${OUTPUT_DIRECTORY_NAME}" 108 | 109 | echo "done processing ${PCAP_FILE_NAME}" 110 | 111 | -------------------------------------------------------------------------------- /docker/scripts/docker_execute_shell.sh: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env bash 2 | 3 | # 4 | # Licensed to the Apache Software Foundation (ASF) under one or more 5 | # contributor license agreements. See the NOTICE file distributed with 6 | # this work for additional information regarding copyright ownership. 7 | # The ASF licenses this file to You under the Apache License, Version 2.0 8 | # (the "License"); you may not use this file except in compliance with 9 | # the License. You may obtain a copy of the License at 10 | # 11 | # http://www.apache.org/licenses/LICENSE-2.0 12 | # 13 | # Unless required by applicable law or agreed to in writing, software 14 | # distributed under the License is distributed on an "AS IS" BASIS, 15 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 16 | # See the License for the specific language governing permissions and 17 | # limitations under the License. 18 | # 19 | 20 | shopt -s nocasematch 21 | set -u # nounset 22 | set -e # errexit 23 | set -E # errtrap 24 | set -o pipefail 25 | 26 | # 27 | # Gets a bash shell for a container 28 | # 29 | 30 | function help { 31 | echo " " 32 | echo "usage: ${0}" 33 | echo " --container-name [OPTIONAL] The Docker container name. Default: metron-bro-plugin-kafka_zeek_1" 34 | echo " -h/--help Usage information." 
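For illustration, replaying one of the sample captures that download_sample_pcaps.sh fetches might look like the following; both arguments are required, and the output directory name is just an example:

    ./docker_execute_process_data_file.sh \
      --pcap-file-name=nitroba.pcap \
      --output-directory-name=nitroba
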
35 | echo " " 36 | echo " " 37 | } 38 | 39 | CONTAINER_NAME=metron-bro-plugin-kafka_zeek_1 40 | 41 | # handle command line options 42 | for i in "$@"; do 43 | case $i in 44 | # 45 | # CONTAINER_NAME 46 | # 47 | # --container-name 48 | # 49 | --container-name=*) 50 | CONTAINER_NAME="${i#*=}" 51 | shift # past argument=value 52 | ;; 53 | 54 | # 55 | # -h/--help 56 | # 57 | -h | --help) 58 | help 59 | exit 0 60 | shift # past argument with no value 61 | ;; 62 | 63 | # 64 | # Unknown option 65 | # 66 | *) 67 | UNKNOWN_OPTION="${i#*=}" 68 | echo "Error: unknown option: $UNKNOWN_OPTION" 69 | help 70 | ;; 71 | esac 72 | done 73 | 74 | echo "Running bash on " 75 | echo "CONTAINER_NAME = $CONTAINER_NAME" 76 | echo "===================================================" 77 | 78 | docker exec -i -t "${CONTAINER_NAME}" bash 79 | 80 | -------------------------------------------------------------------------------- /docker/scripts/docker_run_consume_kafka.sh: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env bash 2 | 3 | # 4 | # Licensed to the Apache Software Foundation (ASF) under one or more 5 | # contributor license agreements. See the NOTICE file distributed with 6 | # this work for additional information regarding copyright ownership. 7 | # The ASF licenses this file to You under the Apache License, Version 2.0 8 | # (the "License"); you may not use this file except in compliance with 9 | # the License. You may obtain a copy of the License at 10 | # 11 | # http://www.apache.org/licenses/LICENSE-2.0 12 | # 13 | # Unless required by applicable law or agreed to in writing, software 14 | # distributed under the License is distributed on an "AS IS" BASIS, 15 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 16 | # See the License for the specific language governing permissions and 17 | # limitations under the License. 18 | # 19 | 20 | shopt -s nocasematch 21 | set -u # nounset 22 | set -e # errexit 23 | set -E # errtrap 24 | set -o pipefail 25 | 26 | # 27 | # Runs a kafka container with the console consumer for the appropriate topic. 28 | # The consumer should quit when it has read all of the messages available on 29 | # the given partition. 30 | # 31 | 32 | function help { 33 | echo " " 34 | echo "usage: ${0}" 35 | echo " --network-name [OPTIONAL] The Docker network name. Default: metron-bro-plugin-kafka_default" 36 | echo " --offset [OPTIONAL] The kafka offset to read from. Default: 0" 37 | echo " --partition [OPTIONAL] The kafka partition to read from. Default: 0" 38 | echo " --kafka-topic [OPTIONAL] The kafka topic to consume from. Default: zeek" 39 | echo " -h/--help Usage information." 
40 | echo " " 41 | } 42 | 43 | NETWORK_NAME=metron-bro-plugin-kafka_default 44 | OFFSET=0 45 | PARTITION=0 46 | KAFKA_TOPIC=zeek 47 | 48 | # handle command line options 49 | for i in "$@"; do 50 | case $i in 51 | # 52 | # NETWORK_NAME 53 | # 54 | # --network-name 55 | # 56 | --network-name=*) 57 | NETWORK_NAME="${i#*=}" 58 | shift # past argument=value 59 | ;; 60 | # 61 | # OFFSET 62 | # 63 | # --offset 64 | # 65 | --offset=*) 66 | OFFSET="${i#*=}" 67 | shift # past argument=value 68 | ;; 69 | # 70 | # PARTITION 71 | # 72 | # --partition 73 | # 74 | --partition=*) 75 | PARTITION="${i#*=}" 76 | shift # past argument=value 77 | ;; 78 | # 79 | # KAFKA_TOPIC 80 | # 81 | # --kafka-topic 82 | # 83 | --kafka-topic=*) 84 | KAFKA_TOPIC="${i#*=}" 85 | shift # past argument=value 86 | ;; 87 | # 88 | # -h/--help 89 | # 90 | -h | --help) 91 | help 92 | exit 0 93 | shift # past argument with no value 94 | ;; 95 | # 96 | # Unknown option 97 | # 98 | *) 99 | UNKNOWN_OPTION="${i#*=}" 100 | echo "Error: unknown option: $UNKNOWN_OPTION" 101 | help 102 | ;; 103 | esac 104 | done 105 | 106 | docker run --rm --network "${NETWORK_NAME}" metron-bro-plugin-kafka_kafka \ 107 | kafka-console-consumer.sh --topic "${KAFKA_TOPIC}" --offset "${OFFSET}" --partition "${PARTITION}" --bootstrap-server kafka-1:9092 --timeout-ms 5000 108 | 109 | -------------------------------------------------------------------------------- /docker/scripts/docker_run_get_offset_kafka.sh: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env bash 2 | 3 | # 4 | # Licensed to the Apache Software Foundation (ASF) under one or more 5 | # contributor license agreements. See the NOTICE file distributed with 6 | # this work for additional information regarding copyright ownership. 7 | # The ASF licenses this file to You under the Apache License, Version 2.0 8 | # (the "License"); you may not use this file except in compliance with 9 | # the License. You may obtain a copy of the License at 10 | # 11 | # http://www.apache.org/licenses/LICENSE-2.0 12 | # 13 | # Unless required by applicable law or agreed to in writing, software 14 | # distributed under the License is distributed on an "AS IS" BASIS, 15 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 16 | # See the License for the specific language governing permissions and 17 | # limitations under the License. 18 | # 19 | 20 | shopt -s nocasematch 21 | set -u # nounset 22 | set -e # errexit 23 | set -E # errtrap 24 | set -o pipefail 25 | 26 | # 27 | # Runs a kafka container to retrieve the offset for the provided topic 28 | # 29 | 30 | function help { 31 | echo " " 32 | echo "usage: ${0}" 33 | echo " --network-name [OPTIONAL] The Docker network name. Default: metron-bro-plugin-kafka_default" 34 | echo " --kafka-topic [OPTIONAL] The kafka topic to pull the offset from. Default: zeek" 35 | echo " -h/--help Usage information." 
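A hedged example of draining partition 0 of the default topic with the consumer wrapper above; because the underlying console consumer is started with --timeout-ms 5000, it exits on its own once no new messages arrive within that window, and redirecting its output produces the kafka-output.log file that split_kafka_output_by_log.sh expects:

    ./docker_run_consume_kafka.sh --offset=0 --partition=0 --kafka-topic=zeek > kafka-output.log
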
36 | echo " " 37 | } 38 | 39 | NETWORK_NAME=metron-bro-plugin-kafka_default 40 | KAFKA_TOPIC=zeek 41 | 42 | # handle command line options 43 | for i in "$@"; do 44 | case $i in 45 | # 46 | # NETWORK_NAME 47 | # 48 | # --network-name 49 | # 50 | --network-name=*) 51 | NETWORK_NAME="${i#*=}" 52 | shift # past argument=value 53 | ;; 54 | # 55 | # KAFKA_TOPIC 56 | # 57 | # --kafka-topic 58 | # 59 | --kafka-topic=*) 60 | KAFKA_TOPIC="${i#*=}" 61 | shift # past argument=value 62 | ;; 63 | # 64 | # -h/--help 65 | # 66 | -h | --help) 67 | help 68 | exit 0 69 | shift # past argument with no value 70 | ;; 71 | 72 | # 73 | # Unknown option 74 | # 75 | *) 76 | UNKNOWN_OPTION="${i#*=}" 77 | echo "Error: unknown option: $UNKNOWN_OPTION" 78 | help 79 | ;; 80 | esac 81 | done 82 | 83 | docker run --rm --network "${NETWORK_NAME}" metron-bro-plugin-kafka_kafka \ 84 | kafka-run-class.sh kafka.tools.GetOffsetShell --topic "${KAFKA_TOPIC}" --broker-list "kafka-1:9092,kafka-2:9092" 85 | 86 | -------------------------------------------------------------------------------- /docker/scripts/download_sample_pcaps.sh: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env bash 2 | 3 | # 4 | # Licensed to the Apache Software Foundation (ASF) under one or more 5 | # contributor license agreements. See the NOTICE file distributed with 6 | # this work for additional information regarding copyright ownership. 7 | # The ASF licenses this file to You under the Apache License, Version 2.0 8 | # (the "License"); you may not use this file except in compliance with 9 | # the License. You may obtain a copy of the License at 10 | # 11 | # http://www.apache.org/licenses/LICENSE-2.0 12 | # 13 | # Unless required by applicable law or agreed to in writing, software 14 | # distributed under the License is distributed on an "AS IS" BASIS, 15 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 16 | # See the License for the specific language governing permissions and 17 | # limitations under the License. 18 | # 19 | 20 | shopt -s nocasematch 21 | set -u # nounset 22 | set -e # errexit 23 | set -E # errtrap 24 | set -o pipefail 25 | 26 | # 27 | # Downloads sample pcap files to the data directory 28 | # 29 | 30 | function help { 31 | echo " " 32 | echo "usage: ${0}" 33 | echo " --data-path [REQUIRED] The pcap data path" 34 | echo " -h/--help Usage information." 35 | echo " " 36 | echo " " 37 | } 38 | 39 | DATA_PATH= 40 | 41 | # Handle command line options 42 | for i in "$@"; do 43 | case $i in 44 | # 45 | # DATA_PATH 46 | # 47 | # --data-path 48 | # 49 | --data-path=*) 50 | DATA_PATH="${i#*=}" 51 | shift # past argument=value 52 | ;; 53 | 54 | # 55 | # -h/--help 56 | # 57 | -h | --help) 58 | help 59 | exit 0 60 | shift # past argument with no value 61 | ;; 62 | 63 | # 64 | # Unknown option 65 | # 66 | *) 67 | UNKNOWN_OPTION="${i#*=}" 68 | echo "Error: unknown option: $UNKNOWN_OPTION" 69 | help 70 | ;; 71 | esac 72 | done 73 | 74 | if [[ -z "$DATA_PATH" ]]; then 75 | echo "DATA_PATH must be passed" 76 | exit 1 77 | fi 78 | 79 | echo "Running download_sample_pcaps with " 80 | echo "DATA_PATH = $DATA_PATH" 81 | echo "===================================================" 82 | 83 | for folder in nitroba example-traffic ssh ftp radius rfb; do 84 | if [[ ! -d "${DATA_PATH}"/${folder} ]]; then 85 | mkdir -p "${DATA_PATH}"/${folder} 86 | fi 87 | done 88 | 89 | if [[ ! 
-f "${DATA_PATH}"/example-traffic/exercise-traffic.pcap ]]; then 90 | wget https://github.com/zeek/try-zeek/raw/master/manager/static/pcaps/exercise_traffic.pcap -O "${DATA_PATH}"/example-traffic/exercise-traffic.pcap 91 | fi 92 | 93 | if [[ ! -f "${DATA_PATH}"/nitroba/nitroba.pcap ]]; then 94 | wget http://downloads.digitalcorpora.org/corpora/network-packet-dumps/2008-nitroba/nitroba.pcap -O "${DATA_PATH}"/nitroba/nitroba.pcap 95 | fi 96 | 97 | if [[ ! -f "${DATA_PATH}"/ssh/ssh.pcap ]]; then 98 | wget https://github.com/zeek/try-zeek/raw/master/manager/static/pcaps/ssh.pcap -O "${DATA_PATH}"/ssh/ssh.pcap 99 | fi 100 | 101 | if [[ ! -f "${DATA_PATH}"/ftp/ftp.pcap ]]; then 102 | wget https://github.com/markofu/pcaps/blob/master/PracticalPacketAnalysis/ppa-capture-files/ftp.pcap?raw=true -O "${DATA_PATH}"/ftp/ftp.pcap 103 | fi 104 | 105 | if [[ ! -f "${DATA_PATH}"/radius/radius_localhost.pcapng ]]; then 106 | wget https://github.com/EmpowerSecurityAcademy/wireshark/blob/master/radius_localhost.pcapng?raw=true -O "${DATA_PATH}"/radius/radius_localhost.pcapng 107 | fi 108 | 109 | if [[ ! -f "${DATA_PATH}"/rfb/rfb.pcap ]]; then 110 | wget https://github.com/kholia/my-pcaps/blob/master/VNC/07-vnc-openwall-3.7.pcap?raw=true -O "${DATA_PATH}"/rfb/rfb.pcap 111 | fi 112 | 113 | -------------------------------------------------------------------------------- /docker/scripts/print_results.sh: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env bash 2 | 3 | # 4 | # Licensed to the Apache Software Foundation (ASF) under one or more 5 | # contributor license agreements. See the NOTICE file distributed with 6 | # this work for additional information regarding copyright ownership. 7 | # The ASF licenses this file to You under the Apache License, Version 2.0 8 | # (the "License"); you may not use this file except in compliance with 9 | # the License. You may obtain a copy of the License at 10 | # 11 | # http://www.apache.org/licenses/LICENSE-2.0 12 | # 13 | # Unless required by applicable law or agreed to in writing, software 14 | # distributed under the License is distributed on an "AS IS" BASIS, 15 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 16 | # See the License for the specific language governing permissions and 17 | # limitations under the License. 18 | # 19 | 20 | shopt -s nocasematch 21 | set -u # nounset 22 | set -e # errexit 23 | set -E # errtrap 24 | set -o pipefail 25 | 26 | # 27 | # Prints all the results.csv files 28 | # 29 | 30 | function help { 31 | echo " " 32 | echo "usage: ${0}" 33 | echo " --test-directory [REQUIRED] The directory for the tests" 34 | echo " -h/--help Usage information." 
35 | echo " " 36 | echo " " 37 | } 38 | 39 | SCRIPT_NAME=$(basename -- "$0") 40 | TEST_DIRECTORY= 41 | 42 | # Handle command line options 43 | for i in "$@"; do 44 | case $i in 45 | # 46 | # TEST_DIRECTORY 47 | # 48 | # --test-directory 49 | # 50 | --test-directory=*) 51 | TEST_DIRECTORY="${i#*=}" 52 | shift # past argument=value 53 | ;; 54 | 55 | # 56 | # -h/--help 57 | # 58 | -h | --help) 59 | help 60 | exit 0 61 | shift # past argument with no value 62 | ;; 63 | 64 | # 65 | # Unknown option 66 | # 67 | *) 68 | UNKNOWN_OPTION="${i#*=}" 69 | echo "Error: unknown option: $UNKNOWN_OPTION" 70 | help 71 | ;; 72 | esac 73 | done 74 | 75 | if [[ -z "$TEST_DIRECTORY" ]]; then 76 | echo "$TEST_DIRECTORY must be passed" 77 | exit 1 78 | fi 79 | 80 | 81 | echo "Running ${SCRIPT_NAME} with" 82 | echo "TEST_DIRECTORY = $TEST_DIRECTORY" 83 | echo "===================================================" 84 | 85 | # Move over to the docker area 86 | cd "${TEST_DIRECTORY}" || exit 1 87 | find "${TEST_DIRECTORY}" -name "results.csv" \ 88 | -exec echo "-->" '{}' \; \ 89 | -exec column -t -s ',' '{}' \; \ 90 | -exec echo "========================================================" \; \ 91 | -exec echo "" \; 92 | 93 | -------------------------------------------------------------------------------- /docker/scripts/split_kafka_output_by_log.sh: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env bash 2 | # shellcheck disable=SC2143,SC1083,SC2002,SC2126 3 | 4 | # 5 | # Licensed to the Apache Software Foundation (ASF) under one or more 6 | # contributor license agreements. See the NOTICE file distributed with 7 | # this work for additional information regarding copyright ownership. 8 | # The ASF licenses this file to You under the Apache License, Version 2.0 9 | # (the "License"); you may not use this file except in compliance with 10 | # the License. You may obtain a copy of the License at 11 | # 12 | # http://www.apache.org/licenses/LICENSE-2.0 13 | # 14 | # Unless required by applicable law or agreed to in writing, software 15 | # distributed under the License is distributed on an "AS IS" BASIS, 16 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 17 | # See the License for the specific language governing permissions and 18 | # limitations under the License. 19 | # 20 | 21 | shopt -s nocasematch 22 | set -e # errexit 23 | set -E # errtrap 24 | set -o pipefail 25 | 26 | # 27 | # For a given directory, finds all the zeek log output, and splits the kafka 28 | # output file by zeek log, such that there is a zeek log -> zeek log kafka log 29 | # 30 | 31 | function help { 32 | echo " " 33 | echo "usage: ${0}" 34 | echo " --log-directory [REQUIRED] The directory with the logs" 35 | echo " -h/--help Usage information." 
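For example, pointing the wrapper above at the directory holding one or more test runs pretty-prints every results.csv found beneath it (the path is a placeholder):

    ./print_results.sh --test-directory=/path/to/docker/test_output/<run-directory>
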
36 | echo " " 37 | echo " " 38 | } 39 | 40 | SCRIPT_NAME=$(basename -- "$0") 41 | LOG_DIRECTORY= 42 | 43 | # Handle command line options 44 | for i in "$@"; do 45 | case $i in 46 | # 47 | # LOG_DIRECTORY 48 | # 49 | # --log-directory 50 | # 51 | --log-directory=*) 52 | LOG_DIRECTORY="${i#*=}" 53 | shift # past argument=value 54 | ;; 55 | 56 | # 57 | # -h/--help 58 | # 59 | -h | --help) 60 | help 61 | exit 0 62 | shift # past argument with no value 63 | ;; 64 | 65 | # 66 | # Unknown option 67 | # 68 | *) 69 | UNKNOWN_OPTION="${i#*=}" 70 | echo "Error: unknown option: $UNKNOWN_OPTION" 71 | help 72 | ;; 73 | esac 74 | done 75 | 76 | if [[ -z "$LOG_DIRECTORY" ]]; then 77 | echo "$LOG_DIRECTORY must be passed" 78 | exit 1 79 | fi 80 | 81 | echo "Running ${SCRIPT_NAME} with" 82 | echo "LOG_DIRECTORY = $LOG_DIRECTORY" 83 | echo "===================================================" 84 | 85 | # Move over to the docker area 86 | cd "${LOG_DIRECTORY}" || exit 1 87 | 88 | # for each log file, that is NOT KAFKA_OUTPUT_FILE we want to get the name 89 | # and extract the start 90 | # then we want to grep that name > name.kafka.log from the KAFKA_OUTPUT_FILE 91 | RESULTS_FILE="${LOG_DIRECTORY}/results.csv" 92 | echo "LOG,ZEEK_COUNT,KAFKA_COUNT" >> "${RESULTS_FILE}" 93 | for log in "${LOG_DIRECTORY}"/*.log 94 | do 95 | BASE_LOG_FILE_NAME=$(basename "$log" .log) 96 | 97 | # skip kafka-output.log 98 | if [[ "$BASE_LOG_FILE_NAME" == "kafka-output" ]]; then 99 | continue 100 | fi 101 | 102 | # search the kafka output for each log and count them 103 | if grep -q \{\""${BASE_LOG_FILE_NAME}"\": "${LOG_DIRECTORY}"/kafka-output.log ; then 104 | grep \{\""${BASE_LOG_FILE_NAME}"\": "${LOG_DIRECTORY}"/kafka-output.log > "${LOG_DIRECTORY}"/"${BASE_LOG_FILE_NAME}".kafka.log 105 | 106 | KAKFA_COUNT=$(cat "${LOG_DIRECTORY}/${BASE_LOG_FILE_NAME}.kafka.log" | wc -l) 107 | ZEEK_COUNT=$(grep -v "^#" "${log}" | wc -l) 108 | 109 | echo "${BASE_LOG_FILE_NAME},${ZEEK_COUNT},${KAKFA_COUNT}" >> "${RESULTS_FILE}" 110 | fi 111 | done 112 | 113 | -------------------------------------------------------------------------------- /docker/test_output/.gitignore: -------------------------------------------------------------------------------- 1 | # Ignore everything in this directory 2 | * 3 | # Except this file 4 | !.gitignore -------------------------------------------------------------------------------- /scripts/Apache/Kafka/__load__.zeek: -------------------------------------------------------------------------------- 1 | # 2 | # Licensed to the Apache Software Foundation (ASF) under one or more 3 | # contributor license agreements. See the NOTICE file distributed with 4 | # this work for additional information regarding copyright ownership. 5 | # The ASF licenses this file to You under the Apache License, Version 2.0 6 | # (the "License"); you may not use this file except in compliance with 7 | # the License. You may obtain a copy of the License at 8 | # 9 | # http://www.apache.org/licenses/LICENSE-2.0 10 | # 11 | # Unless required by applicable law or agreed to in writing, software 12 | # distributed under the License is distributed on an "AS IS" BASIS, 13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | # See the License for the specific language governing permissions and 15 | # limitations under the License. 16 | # 17 | 18 | # 19 | # This is loaded when a user activates the plugin. Include scripts here that should be 20 | # loaded automatically at that point. 
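Assuming a run directory that contains the zeek *.log files alongside the kafka-output.log captured by the consumer wrapper, the split script above is invoked as follows; it appends one LOG,ZEEK_COUNT,KAFKA_COUNT row to results.csv for every log that also appears in the kafka output (the path is a placeholder):

    ./split_kafka_output_by_log.sh --log-directory=/path/to/docker/test_output/<run-directory>
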
21 | # 22 | 23 | @load ./logs-to-kafka.zeek 24 | -------------------------------------------------------------------------------- /scripts/Apache/Kafka/logs-to-kafka.zeek: -------------------------------------------------------------------------------- 1 | # 2 | # Licensed to the Apache Software Foundation (ASF) under one or more 3 | # contributor license agreements. See the NOTICE file distributed with 4 | # this work for additional information regarding copyright ownership. 5 | # The ASF licenses this file to You under the Apache License, Version 2.0 6 | # (the "License"); you may not use this file except in compliance with 7 | # the License. You may obtain a copy of the License at 8 | # 9 | # http://www.apache.org/licenses/LICENSE-2.0 10 | # 11 | # Unless required by applicable law or agreed to in writing, software 12 | # distributed under the License is distributed on an "AS IS" BASIS, 13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | # See the License for the specific language governing permissions and 15 | # limitations under the License. 16 | # 17 | 18 | ##! Load this script to enable log output to kafka 19 | 20 | module Kafka; 21 | 22 | 23 | function send_to_kafka(id: Log::ID): bool 24 | { 25 | if (|logs_to_send| == 0 && send_all_active_logs == F) 26 | # Send nothing unless it's explicitly set to send 27 | return F; 28 | else if (id in logs_to_exclude || 29 | (|logs_to_send| > 0 && id !in logs_to_send && send_all_active_logs == F)) 30 | # Don't send logs in the exclusion set 31 | return F; 32 | else 33 | # If send_all_active_logs is True, send all logs except those 34 | # in the exclusion set. Otherwise, send only the logs that are 35 | # in the inclusion set, but not the exclusions set 36 | return T; 37 | } 38 | 39 | event zeek_init() &priority=-10 40 | { 41 | for (stream_id in Log::active_streams) 42 | { 43 | if (send_to_kafka(stream_id)) 44 | { 45 | local filter: Log::Filter = [ 46 | $name = fmt("kafka-%s", stream_id), 47 | $writer = Log::WRITER_KAFKAWRITER, 48 | $config = table(["stream_id"] = fmt("%s", stream_id)) 49 | ]; 50 | 51 | Log::add_filter(stream_id, filter); 52 | } 53 | } 54 | } 55 | 56 | event kafka_topic_resolved_event(topic: string) { 57 | print(fmt("Kafka topic set to %s",topic)); 58 | } 59 | -------------------------------------------------------------------------------- /scripts/__load__.zeek: -------------------------------------------------------------------------------- 1 | # 2 | # Licensed to the Apache Software Foundation (ASF) under one or more 3 | # contributor license agreements. See the NOTICE file distributed with 4 | # this work for additional information regarding copyright ownership. 5 | # The ASF licenses this file to You under the Apache License, Version 2.0 6 | # (the "License"); you may not use this file except in compliance with 7 | # the License. You may obtain a copy of the License at 8 | # 9 | # http://www.apache.org/licenses/LICENSE-2.0 10 | # 11 | # Unless required by applicable law or agreed to in writing, software 12 | # distributed under the License is distributed on an "AS IS" BASIS, 13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | # See the License for the specific language governing permissions and 15 | # limitations under the License. 16 | # 17 | 18 | # 19 | # This is loaded automatically at Zeek startup once the plugin gets activated 20 | # and its BiF elements have become available. 
Include code here that should 21 | # always execute unconditionally at that time. 22 | # 23 | # Note that often you may want your plugin's accompanying scripts not here, but 24 | # in scripts///__load__.zeek. That's processed 25 | # only on explicit `@load /`. 26 | # 27 | 28 | @load ./init.zeek 29 | -------------------------------------------------------------------------------- /scripts/init.zeek: -------------------------------------------------------------------------------- 1 | # 2 | # Licensed to the Apache Software Foundation (ASF) under one or more 3 | # contributor license agreements. See the NOTICE file distributed with 4 | # this work for additional information regarding copyright ownership. 5 | # The ASF licenses this file to You under the Apache License, Version 2.0 6 | # (the "License"); you may not use this file except in compliance with 7 | # the License. You may obtain a copy of the License at 8 | # 9 | # http://www.apache.org/licenses/LICENSE-2.0 10 | # 11 | # Unless required by applicable law or agreed to in writing, software 12 | # distributed under the License is distributed on an "AS IS" BASIS, 13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | # See the License for the specific language governing permissions and 15 | # limitations under the License. 16 | # 17 | 18 | module Kafka; 19 | 20 | export { 21 | ## Send all active logs to kafka except for those that are explicitly 22 | ## excluded via logs_to_exclude. 23 | ## 24 | ## Example: redef Kafka::send_all_active_logs = T; 25 | const send_all_active_logs: bool = F &redef; 26 | 27 | ## Specify which :zeek:type:`Log::ID` to send to kafka. 28 | ## 29 | ## Example: redef Kafka::logs_to_send = set(Conn::Log, DNS::LOG); 30 | const logs_to_send: set[Log::ID] &redef; 31 | 32 | ## Specify which :zeek:type:`Log::ID` to exclude from being sent to kafka. 33 | ## 34 | ## Example: redef Kafka::logs_to_exclude = set(SSH::LOG); 35 | const logs_to_exclude: set[Log::ID] &redef; 36 | 37 | ## Specify a different timestamp format. 38 | ## 39 | ## Example: redef Kafka::json_timestamps = JSON::TS_ISO8601; 40 | const json_timestamps: JSON::TimestampFormat = JSON::TS_EPOCH &redef; 41 | 42 | ## Destination kafka topic name 43 | const topic_name: string = "zeek" &redef; 44 | 45 | ## Maximum wait on shutdown in milliseconds 46 | const max_wait_on_shutdown: count = 3000 &redef; 47 | 48 | ## Whether or not to tag JSON with a log stream identifier 49 | const tag_json: bool = F &redef; 50 | 51 | ## Any additional configs to pass to librdkafka 52 | const kafka_conf: table[string] of string = table( 53 | ["metadata.broker.list"] = "localhost:9092" 54 | ) &redef; 55 | 56 | ## Key value pairs that will be added to outgoing messages at the root level 57 | ## for example: ["zeek_server"] = "this_server_name" 58 | ## will results in a "zeek_server":"this_server_name" field added to the outgoing 59 | ## json 60 | ## note this depends on tag_json being T 61 | const additional_message_values: table[string] of string = table() &redef; 62 | 63 | ## A comma separated list of librdkafka debug contexts 64 | const debug: string = "" &redef; 65 | 66 | const mock: bool = F &redef; 67 | } 68 | 69 | -------------------------------------------------------------------------------- /src/KafkaWriter.cc: -------------------------------------------------------------------------------- 1 | /* 2 | * Licensed to the Apache Software Foundation (ASF) under one or more 3 | * contributor license agreements. 
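To tie the redef-able options exported above together, here is a minimal sketch of a site configuration written from the shell; the destination path, broker address, and log selection are all assumptions for illustration, and SITE_LOCAL is just a placeholder for wherever the deployment keeps its local.zeek:

    SITE_LOCAL=/usr/local/zeek/share/zeek/site/local.zeek   # placeholder path
    cat >> "${SITE_LOCAL}" <<'EOF'
    @load Apache/Kafka
    redef Kafka::logs_to_send = set(Conn::LOG, DNS::LOG);
    redef Kafka::topic_name = "zeek";
    redef Kafka::tag_json = T;
    redef Kafka::kafka_conf = table(["metadata.broker.list"] = "localhost:9092");
    EOF
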
See the NOTICE file distributed with 4 | * this work for additional information regarding copyright ownership. 5 | * The ASF licenses this file to You under the Apache License, Version 2.0 6 | * (the "License"); you may not use this file except in compliance with 7 | * the License. You may obtain a copy of the License at 8 | * 9 | * http://www.apache.org/licenses/LICENSE-2.0 10 | * 11 | * Unless required by applicable law or agreed to in writing, software 12 | * distributed under the License is distributed on an "AS IS" BASIS, 13 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | * See the License for the specific language governing permissions and 15 | * limitations under the License. 16 | */ 17 | 18 | #include "KafkaWriter.h" 19 | #include "events.bif.h" 20 | 21 | using namespace logging; 22 | using namespace writer; 23 | 24 | // The Constructor is called once for each log filter that uses this log writer. 25 | KafkaWriter::KafkaWriter(WriterFrontend *frontend) 26 | : WriterBackend(frontend), formatter(NULL), producer(NULL), topic(NULL) { 27 | /** 28 | * We need thread-local copies of all user-defined settings coming from zeek 29 | * scripting land. accessing these is not thread-safe and 'DoInit' is 30 | * potentially accessed from multiple threads. 31 | */ 32 | 33 | // tag_json - thread local copy 34 | tag_json = BifConst::Kafka::tag_json; 35 | mocking = BifConst::Kafka::mock; 36 | 37 | // json_timestamps 38 | ODesc tsfmt; 39 | BifConst::Kafka::json_timestamps->Describe(&tsfmt); 40 | json_timestamps.assign((const char *)tsfmt.Bytes(), tsfmt.Len()); 41 | 42 | // topic name - thread local copy 43 | topic_name.assign((const char *)BifConst::Kafka::topic_name->Bytes(), 44 | BifConst::Kafka::topic_name->Len()); 45 | 46 | // kafka_conf - thread local copy 47 | Val *val = BifConst::Kafka::kafka_conf->AsTableVal(); 48 | IterCookie *c = val->AsTable()->InitForIteration(); 49 | HashKey *k; 50 | TableEntryVal *v; 51 | while ((v = val->AsTable()->NextEntry(k, c))) { 52 | // fetch the key and value 53 | ListVal *index = val->AsTableVal()->RecoverIndex(k); 54 | std::string key = index->Index(0)->AsString()->CheckString(); 55 | std::string val = v->Value()->AsString()->CheckString(); 56 | kafka_conf.insert(kafka_conf.begin(), 57 | std::pair(key, val)); 58 | 59 | // cleanup 60 | Unref(index); 61 | delete k; 62 | } 63 | 64 | Val *mvals = BifConst::Kafka::additional_message_values->AsTableVal(); 65 | c = val->AsTable()->InitForIteration(); 66 | while ((v = mvals->AsTable()->NextEntry(k, c))) { 67 | ListVal *index = mvals->AsTableVal()->RecoverIndex(k); 68 | std::string key = index->Index(0)->AsString()->CheckString(); 69 | std::string val = v->Value()->AsString()->CheckString(); 70 | additional_message_values.insert(additional_message_values.begin(), 71 | std::pair(key, val)); 72 | Unref(index); 73 | delete k; 74 | } 75 | } 76 | 77 | KafkaWriter::~KafkaWriter() { 78 | // Cleanup must happen in DoFinish, not in the destructor 79 | } 80 | 81 | std::string KafkaWriter::GetConfigValue(const WriterInfo &info, 82 | const std::string name) const { 83 | std::map::const_iterator it = 84 | info.config.find(name.c_str()); 85 | if (it == info.config.end()) 86 | return std::string(); 87 | else 88 | return it->second; 89 | } 90 | 91 | /** 92 | * DoInit is called once for each call to the constructor, but in a separate 93 | * thread 94 | */ 95 | bool KafkaWriter::DoInit(const WriterInfo &info, int num_fields, 96 | const threading::Field *const *fields) { 97 | // TimeFormat object, default 
to TS_EPOCH 98 | threading::formatter::JSON::TimeFormat tf = 99 | threading::formatter::JSON::TS_EPOCH; 100 | 101 | // Allow overriding of the kafka topic via the Zeek script constant 102 | // 'topic_name' which can be applied when adding a new Zeek log filter. 103 | topic_name_override = GetConfigValue(info, "topic_name"); 104 | 105 | if (!topic_name_override.empty()) { 106 | // Override the topic name if 'topic_name' is specified in the log 107 | // filter's $conf 108 | topic_name = topic_name_override; 109 | } else if (topic_name.empty()) { 110 | // If no global 'topic_name' is defined, use the log stream's 'path' 111 | topic_name = info.path; 112 | } 113 | 114 | if (mocking) { 115 | raise_topic_resolved_event(topic_name); 116 | } 117 | 118 | /** 119 | * Format the timestamps 120 | * NOTE: This string comparision implementation is currently the necessary 121 | * way to do it, as there isn't a way to pass the Zeek enum into C++ enum. 122 | * This makes the user interface consistent with the existing Zeek Logging 123 | * configuration for the ASCII log output. 124 | */ 125 | if (strcmp(json_timestamps.c_str(), "JSON::TS_EPOCH") == 0) { 126 | tf = threading::formatter::JSON::TS_EPOCH; 127 | } else if (strcmp(json_timestamps.c_str(), "JSON::TS_MILLIS") == 0) { 128 | tf = threading::formatter::JSON::TS_MILLIS; 129 | } else if (strcmp(json_timestamps.c_str(), "JSON::TS_ISO8601") == 0) { 130 | tf = threading::formatter::JSON::TS_ISO8601; 131 | } else { 132 | Error(Fmt("KafkaWriter::DoInit: Invalid JSON timestamp format %s", 133 | json_timestamps.c_str())); 134 | return false; 135 | } 136 | 137 | // initialize the formatter 138 | if (BifConst::Kafka::tag_json) { 139 | formatter = new threading::formatter::TaggedJSON(info.path, this, tf); 140 | } else { 141 | formatter = new threading::formatter::JSON(this, tf); 142 | } 143 | 144 | // is debug enabled 145 | std::string debug; 146 | debug.assign((const char *)BifConst::Kafka::debug->Bytes(), 147 | BifConst::Kafka::debug->Len()); 148 | bool is_debug(!debug.empty()); 149 | if (is_debug) { 150 | MsgThread::Info( 151 | Fmt("Debug is turned on and set to: %s. 
Available debug context: %s.", 152 | debug.c_str(), RdKafka::get_debug_contexts().c_str())); 153 | } 154 | 155 | // kafka global configuration 156 | std::string err; 157 | conf = RdKafka::Conf::create(RdKafka::Conf::CONF_GLOBAL); 158 | 159 | // apply the user-defined settings to kafka 160 | std::map::iterator i; 161 | for (i = kafka_conf.begin(); i != kafka_conf.end(); ++i) { 162 | std::string key = i->first; 163 | std::string val = i->second; 164 | 165 | // apply setting to kafka 166 | if (RdKafka::Conf::CONF_OK != conf->set(key, val, err)) { 167 | Error(Fmt("Failed to set '%s'='%s': %s", key.c_str(), val.c_str(), 168 | err.c_str())); 169 | return false; 170 | } 171 | } 172 | 173 | if (is_debug) { 174 | std::string key("debug"); 175 | std::string val(debug); 176 | if (RdKafka::Conf::CONF_OK != conf->set(key, val, err)) { 177 | Error(Fmt("Failed to set '%s'='%s': %s", key.c_str(), val.c_str(), 178 | err.c_str())); 179 | return false; 180 | } 181 | } 182 | 183 | if (!mocking) { 184 | // create kafka producer 185 | producer = RdKafka::Producer::create(conf, err); 186 | if (!producer) { 187 | Error(Fmt("Failed to create producer: %s", err.c_str())); 188 | return false; 189 | } 190 | 191 | // create handle to topic 192 | topic_conf = RdKafka::Conf::create(RdKafka::Conf::CONF_TOPIC); 193 | topic = RdKafka::Topic::create(producer, topic_name, topic_conf, err); 194 | if (!topic) { 195 | Error(Fmt("Failed to create topic handle: %s", err.c_str())); 196 | return false; 197 | } 198 | 199 | if (is_debug) { 200 | MsgThread::Info(Fmt("Successfully created producer.")); 201 | } 202 | } 203 | return true; 204 | } 205 | 206 | /** 207 | * Writer-specific method called just before the threading system is 208 | * going to shutdown. It is assumed that once this messages returns, 209 | * the thread can be safely terminated. As such, all resources created must be 210 | * removed here. 211 | */ 212 | bool KafkaWriter::DoFinish(double network_time) { 213 | bool success = false; 214 | int poll_interval = 1000; 215 | int waited = 0; 216 | int max_wait = BifConst::Kafka::max_wait_on_shutdown; 217 | 218 | if (!mocking) { 219 | // wait a bit for queued messages to be delivered 220 | while (producer->outq_len() > 0 && waited <= max_wait) { 221 | producer->poll(poll_interval); 222 | waited += poll_interval; 223 | } 224 | 225 | // successful only if all messages delivered 226 | if (producer->outq_len() == 0) { 227 | success = true; 228 | } else { 229 | Error(Fmt("Unable to deliver %0d message(s)", producer->outq_len())); 230 | } 231 | 232 | delete topic; 233 | delete producer; 234 | delete topic_conf; 235 | } 236 | delete formatter; 237 | delete conf; 238 | 239 | return success; 240 | } 241 | 242 | /** 243 | * Writer-specific output method implementing recording of one log 244 | * entry. 
245 | */ 246 | bool KafkaWriter::DoWrite(int num_fields, const threading::Field *const *fields, 247 | threading::Value **vals) { 248 | if (!mocking) { 249 | ODesc buff; 250 | buff.Clear(); 251 | 252 | // format the log entry 253 | if (BifConst::Kafka::tag_json) { 254 | dynamic_cast(formatter)->Describe( 255 | &buff, num_fields, fields, vals, additional_message_values); 256 | } else { 257 | formatter->Describe(&buff, num_fields, fields, vals); 258 | } 259 | 260 | // send the formatted log entry to kafka 261 | const char *raw = (const char *)buff.Bytes(); 262 | RdKafka::ErrorCode resp = producer->produce( 263 | topic, RdKafka::Topic::PARTITION_UA, RdKafka::Producer::RK_MSG_COPY, 264 | const_cast(raw), strlen(raw), NULL, NULL); 265 | 266 | if (RdKafka::ERR_NO_ERROR == resp) { 267 | producer->poll(0); 268 | } else { 269 | std::string err = RdKafka::err2str(resp); 270 | Error(Fmt("Kafka send failed: %s", err.c_str())); 271 | } 272 | } 273 | return true; 274 | } 275 | 276 | /** 277 | * Writer-specific method implementing a change of the buffering 278 | * state. If buffering is disabled, the writer should attempt to 279 | * write out information as quickly as possible even if doing so may 280 | * have a performance impact. If enabled (which is the default), it 281 | * may buffer data as helpful and write it out later in a way 282 | * optimized for performance. The current buffering state can be 283 | * queried via IsBuf(). 284 | */ 285 | bool KafkaWriter::DoSetBuf(bool enabled) { 286 | // no change in behavior 287 | return true; 288 | } 289 | 290 | /** 291 | * Writer-specific method implementing flushing of its output. A writer 292 | * implementation must override this method but it can just 293 | * ignore calls if flushing doesn't align with its semantics. 294 | */ 295 | bool KafkaWriter::DoFlush(double network_time) { 296 | if (!mocking) { 297 | producer->flush(0); 298 | } 299 | return true; 300 | } 301 | 302 | /** 303 | * Writer-specific method implementing log rotation. Most directly 304 | * this only applies to writers writing into files, which should then 305 | * close the current file and open a new one. However, a writer may 306 | * also trigger other apppropiate actions if semantics are similar. 307 | * Once rotation has finished, the implementation *must* call 308 | * FinishedRotation() to signal the log manager that potential 309 | * postprocessors can now run. 310 | */ 311 | bool KafkaWriter::DoRotate(const char *rotated_path, double open, double close, 312 | bool terminating) { 313 | // no need to perform log rotation 314 | return FinishedRotation(); 315 | } 316 | 317 | /** 318 | * Triggered by regular heartbeat messages from the main thread. 
319 | */ 320 | bool KafkaWriter::DoHeartbeat(double network_time, double current_time) { 321 | if (!mocking) { 322 | producer->poll(0); 323 | } 324 | return true; 325 | } 326 | 327 | /** 328 | * Triggered when the topic is resolved from the configuration, when 329 | * mocking/testing 330 | * @param topic 331 | */ 332 | void KafkaWriter::raise_topic_resolved_event(const std::string topic) { 333 | if (kafka_topic_resolved_event) { 334 | val_list *vl = new val_list; 335 | vl->append(new StringVal(topic.c_str())); 336 | mgr.QueueEvent(kafka_topic_resolved_event, vl); 337 | } 338 | } 339 | -------------------------------------------------------------------------------- /src/KafkaWriter.h: -------------------------------------------------------------------------------- 1 | /* 2 | * Licensed to the Apache Software Foundation (ASF) under one or more 3 | * contributor license agreements. See the NOTICE file distributed with 4 | * this work for additional information regarding copyright ownership. 5 | * The ASF licenses this file to You under the Apache License, Version 2.0 6 | * (the "License"); you may not use this file except in compliance with 7 | * the License. You may obtain a copy of the License at 8 | * 9 | * http://www.apache.org/licenses/LICENSE-2.0 10 | * 11 | * Unless required by applicable law or agreed to in writing, software 12 | * distributed under the License is distributed on an "AS IS" BASIS, 13 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | * See the License for the specific language governing permissions and 15 | * limitations under the License. 16 | */ 17 | 18 | #ifndef ZEEK_PLUGIN_BRO_KAFKA_KAFKAWRITER_H 19 | #define ZEEK_PLUGIN_BRO_KAFKA_KAFKAWRITER_H 20 | 21 | #include 22 | #include 23 | #include 24 | #include 25 | #include 26 | #include 27 | #include 28 | 29 | #include "kafka.bif.h" 30 | #include "TaggedJSON.h" 31 | 32 | namespace RdKafka { 33 | class Conf; 34 | class Producer; 35 | class Topic; 36 | } 37 | 38 | namespace threading { 39 | namespace formatter { 40 | class Formatter; 41 | }} 42 | 43 | namespace logging { namespace writer { 44 | 45 | /** 46 | * A logging writer that sends data to a Kafka broker. 
47 | */ 48 | class KafkaWriter : public WriterBackend { 49 | 50 | public: 51 | explicit KafkaWriter(WriterFrontend* frontend); 52 | ~KafkaWriter(); 53 | 54 | static WriterBackend* Instantiate(WriterFrontend* frontend) 55 | { 56 | return new KafkaWriter(frontend); 57 | } 58 | 59 | protected: 60 | virtual bool DoInit(const WriterBackend::WriterInfo& info, int num_fields, const threading::Field* const* fields); 61 | virtual bool DoWrite(int num_fields, const threading::Field* const* fields, threading::Value** vals); 62 | virtual bool DoSetBuf(bool enabled); 63 | virtual bool DoRotate(const char* rotated_path, double open, double close, bool terminating); 64 | virtual bool DoFlush(double network_time); 65 | virtual bool DoFinish(double network_time); 66 | virtual bool DoHeartbeat(double network_time, double current_time); 67 | 68 | private: 69 | std::string GetConfigValue(const WriterInfo& info, const std::string name) const; 70 | void raise_topic_resolved_event(const std::string topic); 71 | static const std::string default_topic_key; 72 | std::string stream_id; 73 | bool tag_json; 74 | bool mocking; 75 | std::string json_timestamps; 76 | std::map kafka_conf; 77 | std::map additional_message_values; 78 | std::string topic_name; 79 | std::string topic_name_override; 80 | threading::formatter::Formatter *formatter; 81 | RdKafka::Producer* producer; 82 | RdKafka::Topic* topic; 83 | RdKafka::Conf* conf; 84 | RdKafka::Conf* topic_conf; 85 | }; 86 | 87 | }} 88 | 89 | #endif 90 | -------------------------------------------------------------------------------- /src/Plugin.cc: -------------------------------------------------------------------------------- 1 | /* 2 | * Licensed to the Apache Software Foundation (ASF) under one or more 3 | * contributor license agreements. See the NOTICE file distributed with 4 | * this work for additional information regarding copyright ownership. 5 | * The ASF licenses this file to You under the Apache License, Version 2.0 6 | * (the "License"); you may not use this file except in compliance with 7 | * the License. You may obtain a copy of the License at 8 | * 9 | * http://www.apache.org/licenses/LICENSE-2.0 10 | * 11 | * Unless required by applicable law or agreed to in writing, software 12 | * distributed under the License is distributed on an "AS IS" BASIS, 13 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | * See the License for the specific language governing permissions and 15 | * limitations under the License. 16 | */ 17 | 18 | #include "Plugin.h" 19 | 20 | namespace plugin { namespace Apache_Kafka { 21 | Plugin plugin; 22 | }} 23 | 24 | using namespace plugin::Apache_Kafka; 25 | 26 | plugin::Configuration Plugin::Configure() 27 | { 28 | AddComponent(new ::logging::Component("KafkaWriter", ::logging::writer::KafkaWriter::Instantiate)); 29 | 30 | plugin::Configuration config; 31 | config.name = "Apache::Kafka"; 32 | config.description = "Writes logs to Kafka"; 33 | config.version.major = 0; 34 | config.version.minor = 3; 35 | return config; 36 | } 37 | -------------------------------------------------------------------------------- /src/Plugin.h: -------------------------------------------------------------------------------- 1 | /* 2 | * Licensed to the Apache Software Foundation (ASF) under one or more 3 | * contributor license agreements. See the NOTICE file distributed with 4 | * this work for additional information regarding copyright ownership. 
5 | * The ASF licenses this file to You under the Apache License, Version 2.0 6 | * (the "License"); you may not use this file except in compliance with 7 | * the License. You may obtain a copy of the License at 8 | * 9 | * http://www.apache.org/licenses/LICENSE-2.0 10 | * 11 | * Unless required by applicable law or agreed to in writing, software 12 | * distributed under the License is distributed on an "AS IS" BASIS, 13 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | * See the License for the specific language governing permissions and 15 | * limitations under the License. 16 | */ 17 | 18 | #ifndef ZEEK_PLUGIN_BRO_KAFKA 19 | #define ZEEK_PLUGIN_BRO_KAFKA 20 | 21 | #include "KafkaWriter.h" 22 | #include 23 | 24 | namespace plugin { namespace Apache_Kafka { 25 | 26 | class Plugin : public ::plugin::Plugin { 27 | protected: 28 | // Overridden from plugin::Plugin. 29 | virtual plugin::Configuration Configure(); 30 | }; 31 | 32 | extern Plugin plugin; 33 | }} 34 | 35 | #endif 36 | -------------------------------------------------------------------------------- /src/TaggedJSON.cc: -------------------------------------------------------------------------------- 1 | /* 2 | * Licensed to the Apache Software Foundation (ASF) under one or more 3 | * contributor license agreements. See the NOTICE file distributed with 4 | * this work for additional information regarding copyright ownership. 5 | * The ASF licenses this file to You under the Apache License, Version 2.0 6 | * (the "License"); you may not use this file except in compliance with 7 | * the License. You may obtain a copy of the License at 8 | * 9 | * http://www.apache.org/licenses/LICENSE-2.0 10 | * 11 | * Unless required by applicable law or agreed to in writing, software 12 | * distributed under the License is distributed on an "AS IS" BASIS, 13 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | * See the License for the specific language governing permissions and 15 | * limitations under the License. 16 | */ 17 | 18 | #include "TaggedJSON.h" 19 | 20 | namespace threading { namespace formatter { 21 | 22 | TaggedJSON::TaggedJSON(std::string sn, MsgThread* t, JSON::TimeFormat tf): JSON(t, tf), stream_name(sn) 23 | {} 24 | 25 | TaggedJSON::~TaggedJSON() 26 | {} 27 | 28 | bool TaggedJSON::Describe(ODesc* desc, int num_fields, const Field* const* fields, Value** vals, std::map &const_vals) const 29 | { 30 | desc->AddRaw("{"); 31 | 32 | // 'tag' the json; aka prepend the stream name to the json-formatted log content 33 | desc->AddRaw("\""); 34 | desc->AddRaw(stream_name); 35 | desc->AddRaw("\": "); 36 | 37 | 38 | 39 | // append the JSON formatted log record itself 40 | JSON::Describe(desc, num_fields, fields, vals); 41 | if (const_vals.size() > 0) { 42 | 43 | std::map::iterator it = const_vals.begin(); 44 | while (it != const_vals.end()) { 45 | desc->AddRaw(","); 46 | desc->AddRaw("\""); 47 | desc->AddRaw(it->first); 48 | desc->AddRaw("\": "); 49 | desc->AddRaw("\""); 50 | desc->AddRaw(it->second); 51 | desc->AddRaw("\""); 52 | it++; 53 | } 54 | } 55 | 56 | desc->AddRaw("}"); 57 | return true; 58 | } 59 | } // namespace formatter 60 | } // namespace threading 61 | -------------------------------------------------------------------------------- /src/TaggedJSON.h: -------------------------------------------------------------------------------- 1 | /* 2 | * Licensed to the Apache Software Foundation (ASF) under one or more 3 | * contributor license agreements. 
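Given the formatter above, each Kafka message is a single JSON object whose first key is the originating stream name, e.g. {"conn": { ... }, "zeek_server": "this_server_name"} when additional_message_values carries that pair; this is exactly the prefix that split_kafka_output_by_log.sh matches. A quick way to count tagged conn records in a consumer dump captured under the file name the docker scripts use:

    # kafka-output.log is the file name produced by the docker consumer wrapper
    grep -c '{"conn":' kafka-output.log
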
See the NOTICE file distributed with 4 | * this work for additional information regarding copyright ownership. 5 | * The ASF licenses this file to You under the Apache License, Version 2.0 6 | * (the "License"); you may not use this file except in compliance with 7 | * the License. You may obtain a copy of the License at 8 | * 9 | * http://www.apache.org/licenses/LICENSE-2.0 10 | * 11 | * Unless required by applicable law or agreed to in writing, software 12 | * distributed under the License is distributed on an "AS IS" BASIS, 13 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | * See the License for the specific language governing permissions and 15 | * limitations under the License. 16 | */ 17 | 18 | #ifndef ZEEK_PLUGIN_BRO_KAFKA_TAGGEDJSON_H 19 | #define ZEEK_PLUGIN_BRO_KAFKA_TAGGEDJSON_H 20 | 21 | #include 22 | #include 23 | #include 24 | #include 25 | #include 26 | 27 | using threading::Field; 28 | using threading::MsgThread; 29 | using threading::Value; 30 | using threading::formatter::JSON; 31 | 32 | namespace threading { 33 | namespace formatter { 34 | 35 | /* 36 | * A JSON formatter that prepends or 'tags' the content with a log stream 37 | * identifier. For example, 38 | * { 'conn' : { ... }} 39 | * { 'http' : { ... }} 40 | */ 41 | class TaggedJSON : public JSON { 42 | public: 43 | TaggedJSON(std::string stream_name, MsgThread *t, JSON::TimeFormat tf); 44 | virtual ~TaggedJSON(); 45 | virtual bool Describe(ODesc *desc, int num_fields, const Field *const *fields, 46 | Value **vals, std::map &const_vals) const; 47 | 48 | private: 49 | std::string stream_name; 50 | }; 51 | 52 | } // namespace formatter 53 | } // namespace threading 54 | #endif 55 | -------------------------------------------------------------------------------- /src/events.bif: -------------------------------------------------------------------------------- 1 | event kafka_topic_resolved_event%(topic: string%); -------------------------------------------------------------------------------- /src/kafka.bif: -------------------------------------------------------------------------------- 1 | # 2 | # Licensed to the Apache Software Foundation (ASF) under one or more 3 | # contributor license agreements. See the NOTICE file distributed with 4 | # this work for additional information regarding copyright ownership. 5 | # The ASF licenses this file to You under the Apache License, Version 2.0 6 | # (the "License"); you may not use this file except in compliance with 7 | # the License. You may obtain a copy of the License at 8 | # 9 | # http://www.apache.org/licenses/LICENSE-2.0 10 | # 11 | # Unless required by applicable law or agreed to in writing, software 12 | # distributed under the License is distributed on an "AS IS" BASIS, 13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | # See the License for the specific language governing permissions and 15 | # limitations under the License. 
16 | # 17 | 18 | module Kafka; 19 | 20 | const kafka_conf: config; 21 | const additional_message_values : config; 22 | const topic_name: string; 23 | const max_wait_on_shutdown: count; 24 | const tag_json: bool; 25 | const json_timestamps: JSON::TimestampFormat; 26 | const debug: string; 27 | const mock: bool; 28 | -------------------------------------------------------------------------------- /src/kafka_const.bif: -------------------------------------------------------------------------------- 1 | # 2 | # Licensed to the Apache Software Foundation (ASF) under one or more 3 | # contributor license agreements. See the NOTICE file distributed with 4 | # this work for additional information regarding copyright ownership. 5 | # The ASF licenses this file to You under the Apache License, Version 2.0 6 | # (the "License"); you may not use this file except in compliance with 7 | # the License. You may obtain a copy of the License at 8 | # 9 | # http://www.apache.org/licenses/LICENSE-2.0 10 | # 11 | # Unless required by applicable law or agreed to in writing, software 12 | # distributed under the License is distributed on an "AS IS" BASIS, 13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | # See the License for the specific language governing permissions and 15 | # limitations under the License. 16 | # 17 | 18 | module Kafka; 19 | 20 | type config : table[string] of string; 21 | -------------------------------------------------------------------------------- /tests/.gitignore: -------------------------------------------------------------------------------- 1 | .btest.failed.dat 2 | .tmp 3 | -------------------------------------------------------------------------------- /tests/Baseline/kafka.l2s-l2e-no-overlap/output: -------------------------------------------------------------------------------- 1 | T 2 | T 3 | F 4 | F 5 | -------------------------------------------------------------------------------- /tests/Baseline/kafka.l2s-set-l2e-set/output: -------------------------------------------------------------------------------- 1 | T 2 | F 3 | F 4 | -------------------------------------------------------------------------------- /tests/Baseline/kafka.l2s-set-l2e-unset/output: -------------------------------------------------------------------------------- 1 | T 2 | T 3 | F 4 | -------------------------------------------------------------------------------- /tests/Baseline/kafka.l2s-unset-l2e-set/output: -------------------------------------------------------------------------------- 1 | F 2 | F 3 | F 4 | -------------------------------------------------------------------------------- /tests/Baseline/kafka.l2s-unset-l2e-unset/output: -------------------------------------------------------------------------------- 1 | F 2 | F 3 | F 4 | -------------------------------------------------------------------------------- /tests/Baseline/kafka.resolved-topic-config/output: -------------------------------------------------------------------------------- 1 | Kafka topic set to const-variable-topic 2 | -------------------------------------------------------------------------------- /tests/Baseline/kafka.resolved-topic-default/output: -------------------------------------------------------------------------------- 1 | Kafka topic set to zeek 2 | -------------------------------------------------------------------------------- /tests/Baseline/kafka.resolved-topic-override-and-config/output: -------------------------------------------------------------------------------- 1 | 
Kafka topic set to configuration-table-topic 2 | Kafka topic set to const-variable-topic 3 | -------------------------------------------------------------------------------- /tests/Baseline/kafka.resolved-topic-override-only/output: -------------------------------------------------------------------------------- 1 | Kafka topic set to configuration-table-topic 2 | -------------------------------------------------------------------------------- /tests/Baseline/kafka.send-all-active-logs-l2e-set/output: -------------------------------------------------------------------------------- 1 | T 2 | T 3 | F 4 | F 5 | T 6 | F 7 | T 8 | -------------------------------------------------------------------------------- /tests/Baseline/kafka.send-all-active-logs-l2e-unset/output: -------------------------------------------------------------------------------- 1 | T 2 | T 3 | T 4 | T 5 | T 6 | T 7 | T 8 | -------------------------------------------------------------------------------- /tests/Baseline/kafka.send-all-active-logs-l2s-set-l2e-set/output: -------------------------------------------------------------------------------- 1 | F 2 | T 3 | T 4 | T 5 | T 6 | F 7 | T 8 | -------------------------------------------------------------------------------- /tests/Baseline/kafka.send-all-active-logs-l2s-set-l2e-unset/output: -------------------------------------------------------------------------------- 1 | T 2 | T 3 | T 4 | T 5 | T 6 | T 7 | T 8 | -------------------------------------------------------------------------------- /tests/Baseline/kafka.show-plugin/output: -------------------------------------------------------------------------------- 1 | Apache::Kafka - Writes logs to Kafka (dynamic) 2 | [Writer] KafkaWriter (Log::WRITER_KAFKAWRITER) 3 | [Constant] Kafka::kafka_conf 4 | [Constant] Kafka::additional_message_values 5 | [Constant] Kafka::topic_name 6 | [Constant] Kafka::max_wait_on_shutdown 7 | [Constant] Kafka::tag_json 8 | [Constant] Kafka::json_timestamps 9 | [Constant] Kafka::debug 10 | [Constant] Kafka::mock 11 | [Event] kafka_topic_resolved_event 12 | 13 | -------------------------------------------------------------------------------- /tests/Makefile: -------------------------------------------------------------------------------- 1 | # 2 | # Licensed to the Apache Software Foundation (ASF) under one or more 3 | # contributor license agreements. See the NOTICE file distributed with 4 | # this work for additional information regarding copyright ownership. 5 | # The ASF licenses this file to You under the Apache License, Version 2.0 6 | # (the "License"); you may not use this file except in compliance with 7 | # the License. You may obtain a copy of the License at 8 | # 9 | # http://www.apache.org/licenses/LICENSE-2.0 10 | # 11 | # Unless required by applicable law or agreed to in writing, software 12 | # distributed under the License is distributed on an "AS IS" BASIS, 13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | # See the License for the specific language governing permissions and 15 | # limitations under the License. 16 | # 17 | 18 | test: 19 | @btest 20 | -------------------------------------------------------------------------------- /tests/Scripts/diff-remove-timestamps: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env bash 2 | # 3 | # Licensed to the Apache Software Foundation (ASF) under one or more 4 | # contributor license agreements. 
See the NOTICE file distributed with 5 | # this work for additional information regarding copyright ownership. 6 | # The ASF licenses this file to You under the Apache License, Version 2.0 7 | # (the "License"); you may not use this file except in compliance with 8 | # the License. You may obtain a copy of the License at 9 | # 10 | # http://www.apache.org/licenses/LICENSE-2.0 11 | # 12 | # Unless required by applicable law or agreed to in writing, software 13 | # distributed under the License is distributed on an "AS IS" BASIS, 14 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 15 | # See the License for the specific language governing permissions and 16 | # limitations under the License. 17 | # 18 | # The upstream of this file is at 19 | # https://github.com/zeek/zeek-aux/blob/master/plugin-support/skeleton/tests/Scripts/diff-remove-timestamps 20 | # 21 | # Replace anything which looks like timestamps with XXXs (including the #start/end markers in logs). 22 | 23 | # Get us "modern" regexps with sed. 24 | if [ `uname` == "Linux" ]; then 25 | sed="sed -r" 26 | else 27 | sed="sed -E" 28 | fi 29 | 30 | $sed 's/(0\.000000)|([0-9]{9,10}\.[0-9]{2,8})/XXXXXXXXXX.XXXXXX/g' | \ 31 | $sed 's/^ *#(open|close).(19|20)..-..-..-..-..-..$/#\1 XXXX-XX-XX-XX-XX-XX/g' 32 | -------------------------------------------------------------------------------- /tests/Scripts/get-zeek-env: -------------------------------------------------------------------------------- 1 | #! /bin/sh 2 | # 3 | # Licensed to the Apache Software Foundation (ASF) under one or more 4 | # contributor license agreements. See the NOTICE file distributed with 5 | # this work for additional information regarding copyright ownership. 6 | # The ASF licenses this file to You under the Apache License, Version 2.0 7 | # (the "License"); you may not use this file except in compliance with 8 | # the License. You may obtain a copy of the License at 9 | # 10 | # http://www.apache.org/licenses/LICENSE-2.0 11 | # 12 | # Unless required by applicable law or agreed to in writing, software 13 | # distributed under the License is distributed on an "AS IS" BASIS, 14 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 15 | # See the License for the specific language governing permissions and 16 | # limitations under the License. 17 | # 18 | # BTest helper for getting values for Zeek-related environment variables. 19 | 20 | base=`dirname $0` 21 | zeek_dist=`cat ${base}/../../build/CMakeCache.txt | grep ZEEK_DIST | cut -d = -f 2` 22 | 23 | if [ -n "${zeek_dist}" ]; then 24 | if [ "$1" = "zeekpath" ]; then 25 | ${zeek_dist}/build/zeek-path-dev 26 | elif [ "$1" = "zeek_plugin_path" ]; then 27 | ( cd ${base}/../.. && pwd ) 28 | elif [ "$1" = "path" ]; then 29 | echo ${zeek_dist}/build/src:${zeek_dist}/aux/btest:${base}/:${zeek_dist}/aux/zeek-cut:$PATH 30 | else 31 | echo "usage: `basename $0` " >&2 32 | exit 1 33 | fi 34 | else 35 | # Use Zeek installation for testing. In this case zeek-config must be in PATH. 36 | if ! which zeek-config >/dev/null; then 37 | echo "zeek-config not found" >&2 38 | exit 1 39 | fi 40 | 41 | if [ "$1" = "zeekpath" ]; then 42 | zeek-config --zeekpath 43 | elif [ "$1" = "zeek_plugin_path" ]; then 44 | ( cd ${base}/../.. 
&& pwd ) 45 | elif [ "$1" = "path" ]; then 46 | echo ${PATH} 47 | else 48 | echo "usage: `basename $0` " >&2 49 | exit 1 50 | fi 51 | fi 52 | -------------------------------------------------------------------------------- /tests/btest.cfg: -------------------------------------------------------------------------------- 1 | # 2 | # Licensed to the Apache Software Foundation (ASF) under one or more 3 | # contributor license agreements. See the NOTICE file distributed with 4 | # this work for additional information regarding copyright ownership. 5 | # The ASF licenses this file to You under the Apache License, Version 2.0 6 | # (the "License"); you may not use this file except in compliance with 7 | # the License. You may obtain a copy of the License at 8 | # 9 | # http://www.apache.org/licenses/LICENSE-2.0 10 | # 11 | # Unless required by applicable law or agreed to in writing, software 12 | # distributed under the License is distributed on an "AS IS" BASIS, 13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | # See the License for the specific language governing permissions and 15 | # limitations under the License. 16 | # 17 | # The upstream of this file is at 18 | # https://github.com/zeek/zeek-aux/blob/master/plugin-support/skeleton/tests/btest.cfg 19 | 20 | [btest] 21 | TestDirs = kafka 22 | TmpDir = %(testbase)s/.tmp 23 | BaselineDir = %(testbase)s/Baseline 24 | IgnoreDirs = .svn CVS .tmp 25 | IgnoreFiles = *.tmp *.swp #* *.trace .DS_Store 26 | 27 | [environment] 28 | ZEEKPATH=`%(testbase)s/Scripts/get-zeek-env zeekpath` 29 | ZEEK_PLUGIN_PATH=`%(testbase)s/Scripts/get-zeek-env zeek_plugin_path` 30 | ZEEK_SEED_FILE=%(testbase)s/random.seed 31 | PATH=`%(testbase)s/Scripts/get-zeek-env path` 32 | TZ=UTC 33 | LC_ALL=C 34 | TRACES=%(testbase)s/Traces 35 | TMPDIR=%(testbase)s/.tmp 36 | TEST_DIFF_CANONIFIER=%(testbase)s/Scripts/diff-remove-timestamps 37 | -------------------------------------------------------------------------------- /tests/kafka/l2s-l2e-no-overlap.zeek: -------------------------------------------------------------------------------- 1 | # 2 | # Licensed to the Apache Software Foundation (ASF) under one or more 3 | # contributor license agreements. See the NOTICE file distributed with 4 | # this work for additional information regarding copyright ownership. 5 | # The ASF licenses this file to You under the Apache License, Version 2.0 6 | # (the "License"); you may not use this file except in compliance with 7 | # the License. You may obtain a copy of the License at 8 | # 9 | # http://www.apache.org/licenses/LICENSE-2.0 10 | # 11 | # Unless required by applicable law or agreed to in writing, software 12 | # distributed under the License is distributed on an "AS IS" BASIS, 13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | # See the License for the specific language governing permissions and 15 | # limitations under the License. 
16 | # 17 | 18 | # @TEST-EXEC: zeek ../../../scripts/Apache/Kafka/ %INPUT > output 19 | # @TEST-EXEC: btest-diff output 20 | 21 | module Kafka; 22 | 23 | redef logs_to_send = set(HTTP::LOG, DHCP::LOG); 24 | redef logs_to_exclude = set(Conn::LOG, DNS::LOG); 25 | 26 | print send_to_kafka(HTTP::LOG); 27 | print send_to_kafka(DHCP::LOG); 28 | print send_to_kafka(Conn::LOG); 29 | print send_to_kafka(DNS::LOG); 30 | -------------------------------------------------------------------------------- /tests/kafka/l2s-set-l2e-set.zeek: -------------------------------------------------------------------------------- 1 | # 2 | # Licensed to the Apache Software Foundation (ASF) under one or more 3 | # contributor license agreements. See the NOTICE file distributed with 4 | # this work for additional information regarding copyright ownership. 5 | # The ASF licenses this file to You under the Apache License, Version 2.0 6 | # (the "License"); you may not use this file except in compliance with 7 | # the License. You may obtain a copy of the License at 8 | # 9 | # http://www.apache.org/licenses/LICENSE-2.0 10 | # 11 | # Unless required by applicable law or agreed to in writing, software 12 | # distributed under the License is distributed on an "AS IS" BASIS, 13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | # See the License for the specific language governing permissions and 15 | # limitations under the License. 16 | # 17 | 18 | # @TEST-EXEC: zeek ../../../scripts/Apache/Kafka/ %INPUT > output 19 | # @TEST-EXEC: btest-diff output 20 | 21 | module Kafka; 22 | 23 | redef logs_to_send = set(HTTP::LOG, Conn::LOG); 24 | redef logs_to_exclude = set(Conn::LOG, DNS::LOG); 25 | 26 | print send_to_kafka(HTTP::LOG); 27 | print send_to_kafka(Conn::LOG); 28 | print send_to_kafka(DNS::LOG); 29 | -------------------------------------------------------------------------------- /tests/kafka/l2s-set-l2e-unset.zeek: -------------------------------------------------------------------------------- 1 | # 2 | # Licensed to the Apache Software Foundation (ASF) under one or more 3 | # contributor license agreements. See the NOTICE file distributed with 4 | # this work for additional information regarding copyright ownership. 5 | # The ASF licenses this file to You under the Apache License, Version 2.0 6 | # (the "License"); you may not use this file except in compliance with 7 | # the License. You may obtain a copy of the License at 8 | # 9 | # http://www.apache.org/licenses/LICENSE-2.0 10 | # 11 | # Unless required by applicable law or agreed to in writing, software 12 | # distributed under the License is distributed on an "AS IS" BASIS, 13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | # See the License for the specific language governing permissions and 15 | # limitations under the License. 16 | # 17 | 18 | # @TEST-EXEC: zeek ../../../scripts/Apache/Kafka/ %INPUT > output 19 | # @TEST-EXEC: btest-diff output 20 | 21 | module Kafka; 22 | 23 | redef logs_to_send = set(HTTP::LOG, Conn::LOG); 24 | 25 | print send_to_kafka(HTTP::LOG); 26 | print send_to_kafka(Conn::LOG); 27 | print send_to_kafka(DNS::LOG); 28 | -------------------------------------------------------------------------------- /tests/kafka/l2s-unset-l2e-set.zeek: -------------------------------------------------------------------------------- 1 | # 2 | # Licensed to the Apache Software Foundation (ASF) under one or more 3 | # contributor license agreements. 
See the NOTICE file distributed with 4 | # this work for additional information regarding copyright ownership. 5 | # The ASF licenses this file to You under the Apache License, Version 2.0 6 | # (the "License"); you may not use this file except in compliance with 7 | # the License. You may obtain a copy of the License at 8 | # 9 | # http://www.apache.org/licenses/LICENSE-2.0 10 | # 11 | # Unless required by applicable law or agreed to in writing, software 12 | # distributed under the License is distributed on an "AS IS" BASIS, 13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | # See the License for the specific language governing permissions and 15 | # limitations under the License. 16 | # 17 | 18 | # @TEST-EXEC: zeek ../../../scripts/Apache/Kafka/ %INPUT > output 19 | # @TEST-EXEC: btest-diff output 20 | 21 | module Kafka; 22 | 23 | redef logs_to_exclude = set(Conn::LOG, DNS::LOG); 24 | 25 | print send_to_kafka(HTTP::LOG); 26 | print send_to_kafka(Conn::LOG); 27 | print send_to_kafka(DNS::LOG); 28 | -------------------------------------------------------------------------------- /tests/kafka/l2s-unset-l2e-unset.zeek: -------------------------------------------------------------------------------- 1 | # 2 | # Licensed to the Apache Software Foundation (ASF) under one or more 3 | # contributor license agreements. See the NOTICE file distributed with 4 | # this work for additional information regarding copyright ownership. 5 | # The ASF licenses this file to You under the Apache License, Version 2.0 6 | # (the "License"); you may not use this file except in compliance with 7 | # the License. You may obtain a copy of the License at 8 | # 9 | # http://www.apache.org/licenses/LICENSE-2.0 10 | # 11 | # Unless required by applicable law or agreed to in writing, software 12 | # distributed under the License is distributed on an "AS IS" BASIS, 13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | # See the License for the specific language governing permissions and 15 | # limitations under the License. 16 | # 17 | 18 | # @TEST-EXEC: zeek ../../../scripts/Apache/Kafka/ %INPUT > output 19 | # @TEST-EXEC: btest-diff output 20 | 21 | module Kafka; 22 | 23 | print send_to_kafka(HTTP::LOG); 24 | print send_to_kafka(Conn::LOG); 25 | print send_to_kafka(DNS::LOG); 26 | -------------------------------------------------------------------------------- /tests/kafka/resolved-topic-config.zeek: -------------------------------------------------------------------------------- 1 | # 2 | # Licensed to the Apache Software Foundation (ASF) under one or more 3 | # contributor license agreements. See the NOTICE file distributed with 4 | # this work for additional information regarding copyright ownership. 5 | # The ASF licenses this file to You under the Apache License, Version 2.0 6 | # (the "License"); you may not use this file except in compliance with 7 | # the License. You may obtain a copy of the License at 8 | # 9 | # http://www.apache.org/licenses/LICENSE-2.0 10 | # 11 | # Unless required by applicable law or agreed to in writing, software 12 | # distributed under the License is distributed on an "AS IS" BASIS, 13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | # See the License for the specific language governing permissions and 15 | # limitations under the License. 
16 | # 17 | 18 | # @TEST-EXEC: zeek -r ../../../tests/pcaps/exercise-traffic.pcap ../../../scripts/Apache/Kafka/ %INPUT > output 19 | # @TEST-EXEC: btest-diff output 20 | 21 | module Kafka; 22 | 23 | 24 | redef Kafka::logs_to_send = set(Conn::LOG); 25 | redef Kafka::topic_name = "const-variable-topic"; 26 | redef Kafka::mock = T; 27 | -------------------------------------------------------------------------------- /tests/kafka/resolved-topic-default.zeek: -------------------------------------------------------------------------------- 1 | # 2 | # Licensed to the Apache Software Foundation (ASF) under one or more 3 | # contributor license agreements. See the NOTICE file distributed with 4 | # this work for additional information regarding copyright ownership. 5 | # The ASF licenses this file to You under the Apache License, Version 2.0 6 | # (the "License"); you may not use this file except in compliance with 7 | # the License. You may obtain a copy of the License at 8 | # 9 | # http://www.apache.org/licenses/LICENSE-2.0 10 | # 11 | # Unless required by applicable law or agreed to in writing, software 12 | # distributed under the License is distributed on an "AS IS" BASIS, 13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | # See the License for the specific language governing permissions and 15 | # limitations under the License. 16 | # 17 | 18 | # @TEST-EXEC: zeek -r ../../../tests/pcaps/exercise-traffic.pcap ../../../scripts/Apache/Kafka/ %INPUT > output 19 | # @TEST-EXEC: btest-diff output 20 | 21 | module Kafka; 22 | 23 | redef Kafka::logs_to_send = set(Conn::LOG); 24 | redef Kafka::mock = T; 25 | -------------------------------------------------------------------------------- /tests/kafka/resolved-topic-override-and-config.zeek: -------------------------------------------------------------------------------- 1 | # 2 | # Licensed to the Apache Software Foundation (ASF) under one or more 3 | # contributor license agreements. See the NOTICE file distributed with 4 | # this work for additional information regarding copyright ownership. 5 | # The ASF licenses this file to You under the Apache License, Version 2.0 6 | # (the "License"); you may not use this file except in compliance with 7 | # the License. You may obtain a copy of the License at 8 | # 9 | # http://www.apache.org/licenses/LICENSE-2.0 10 | # 11 | # Unless required by applicable law or agreed to in writing, software 12 | # distributed under the License is distributed on an "AS IS" BASIS, 13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | # See the License for the specific language governing permissions and 15 | # limitations under the License. 
16 | # 17 | 18 | # @TEST-EXEC: zeek -r ../../../tests/pcaps/exercise-traffic.pcap ../../../scripts/Apache/Kafka/ %INPUT | sort > output 19 | # @TEST-EXEC: btest-diff output 20 | 21 | module Kafka; 22 | 23 | redef Kafka::logs_to_send = set(Conn::LOG); 24 | redef Kafka::topic_name = "const-variable-topic"; 25 | redef Kafka::mock = T; 26 | 27 | event zeek_init() &priority=-10 28 | { 29 | local xxx_filter: Log::Filter = [ 30 | $name = "kafka-xxx", 31 | $writer = Log::WRITER_KAFKAWRITER, 32 | $path = "kafka_xxx", 33 | $config = table(["topic_name"] = "configuration-table-topic") 34 | ]; 35 | Log::add_filter(Conn::LOG, xxx_filter); 36 | } 37 | 38 | -------------------------------------------------------------------------------- /tests/kafka/resolved-topic-override-only.zeek: -------------------------------------------------------------------------------- 1 | # 2 | # Licensed to the Apache Software Foundation (ASF) under one or more 3 | # contributor license agreements. See the NOTICE file distributed with 4 | # this work for additional information regarding copyright ownership. 5 | # The ASF licenses this file to You under the Apache License, Version 2.0 6 | # (the "License"); you may not use this file except in compliance with 7 | # the License. You may obtain a copy of the License at 8 | # 9 | # http://www.apache.org/licenses/LICENSE-2.0 10 | # 11 | # Unless required by applicable law or agreed to in writing, software 12 | # distributed under the License is distributed on an "AS IS" BASIS, 13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | # See the License for the specific language governing permissions and 15 | # limitations under the License. 16 | # 17 | 18 | # @TEST-EXEC: zeek -r ../../../tests/pcaps/exercise-traffic.pcap ../../../scripts/Apache/Kafka/ %INPUT > output 19 | # @TEST-EXEC: btest-diff output 20 | 21 | module Kafka; 22 | 23 | 24 | redef Kafka::mock = T; 25 | event zeek_init() &priority=-10 26 | { 27 | local xxx_filter: Log::Filter = [ 28 | $name = "kafka-xxx", 29 | $writer = Log::WRITER_KAFKAWRITER, 30 | $path = "kafka_xxx", 31 | $config = table(["topic_name"] = "configuration-table-topic") 32 | ]; 33 | Log::add_filter(Conn::LOG, xxx_filter); 34 | } 35 | -------------------------------------------------------------------------------- /tests/kafka/send-all-active-logs-l2e-set.zeek: -------------------------------------------------------------------------------- 1 | # 2 | # Licensed to the Apache Software Foundation (ASF) under one or more 3 | # contributor license agreements. See the NOTICE file distributed with 4 | # this work for additional information regarding copyright ownership. 5 | # The ASF licenses this file to You under the Apache License, Version 2.0 6 | # (the "License"); you may not use this file except in compliance with 7 | # the License. You may obtain a copy of the License at 8 | # 9 | # http://www.apache.org/licenses/LICENSE-2.0 10 | # 11 | # Unless required by applicable law or agreed to in writing, software 12 | # distributed under the License is distributed on an "AS IS" BASIS, 13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | # See the License for the specific language governing permissions and 15 | # limitations under the License. 
16 | # 17 | 18 | # @TEST-EXEC: zeek ../../../scripts/Apache/Kafka/ %INPUT > output 19 | # @TEST-EXEC: btest-diff output 20 | 21 | module Kafka; 22 | 23 | redef send_all_active_logs = T; 24 | redef logs_to_exclude = set(Conn::LOG, DNS::LOG, SSL::LOG); 25 | 26 | print send_to_kafka(HTTP::LOG); 27 | print send_to_kafka(DHCP::LOG); 28 | print send_to_kafka(Conn::LOG); 29 | print send_to_kafka(DNS::LOG); 30 | print send_to_kafka(SMTP::LOG); 31 | print send_to_kafka(SSL::LOG); 32 | print send_to_kafka(Files::LOG); 33 | -------------------------------------------------------------------------------- /tests/kafka/send-all-active-logs-l2e-unset.zeek: -------------------------------------------------------------------------------- 1 | # 2 | # Licensed to the Apache Software Foundation (ASF) under one or more 3 | # contributor license agreements. See the NOTICE file distributed with 4 | # this work for additional information regarding copyright ownership. 5 | # The ASF licenses this file to You under the Apache License, Version 2.0 6 | # (the "License"); you may not use this file except in compliance with 7 | # the License. You may obtain a copy of the License at 8 | # 9 | # http://www.apache.org/licenses/LICENSE-2.0 10 | # 11 | # Unless required by applicable law or agreed to in writing, software 12 | # distributed under the License is distributed on an "AS IS" BASIS, 13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | # See the License for the specific language governing permissions and 15 | # limitations under the License. 16 | # 17 | 18 | # @TEST-EXEC: zeek ../../../scripts/Apache/Kafka/ %INPUT > output 19 | # @TEST-EXEC: btest-diff output 20 | 21 | module Kafka; 22 | 23 | redef send_all_active_logs = T; 24 | 25 | print send_to_kafka(HTTP::LOG); 26 | print send_to_kafka(DHCP::LOG); 27 | print send_to_kafka(Conn::LOG); 28 | print send_to_kafka(DNS::LOG); 29 | print send_to_kafka(SMTP::LOG); 30 | print send_to_kafka(SSL::LOG); 31 | print send_to_kafka(Files::LOG); 32 | -------------------------------------------------------------------------------- /tests/kafka/send-all-active-logs-l2s-set-l2e-set.zeek: -------------------------------------------------------------------------------- 1 | # 2 | # Licensed to the Apache Software Foundation (ASF) under one or more 3 | # contributor license agreements. See the NOTICE file distributed with 4 | # this work for additional information regarding copyright ownership. 5 | # The ASF licenses this file to You under the Apache License, Version 2.0 6 | # (the "License"); you may not use this file except in compliance with 7 | # the License. You may obtain a copy of the License at 8 | # 9 | # http://www.apache.org/licenses/LICENSE-2.0 10 | # 11 | # Unless required by applicable law or agreed to in writing, software 12 | # distributed under the License is distributed on an "AS IS" BASIS, 13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | # See the License for the specific language governing permissions and 15 | # limitations under the License. 
16 | # 17 | 18 | # @TEST-EXEC: zeek ../../../scripts/Apache/Kafka/ %INPUT > output 19 | # @TEST-EXEC: btest-diff output 20 | 21 | module Kafka; 22 | 23 | redef send_all_active_logs = T; 24 | redef logs_to_send = set(HTTP::LOG, Conn::LOG); 25 | redef logs_to_exclude = set(HTTP::LOG, SSL::LOG); 26 | 27 | print send_to_kafka(HTTP::LOG); 28 | print send_to_kafka(DHCP::LOG); 29 | print send_to_kafka(Conn::LOG); 30 | print send_to_kafka(DNS::LOG); 31 | print send_to_kafka(SMTP::LOG); 32 | print send_to_kafka(SSL::LOG); 33 | print send_to_kafka(Files::LOG); 34 | -------------------------------------------------------------------------------- /tests/kafka/send-all-active-logs-l2s-set-l2e-unset.zeek: -------------------------------------------------------------------------------- 1 | # 2 | # Licensed to the Apache Software Foundation (ASF) under one or more 3 | # contributor license agreements. See the NOTICE file distributed with 4 | # this work for additional information regarding copyright ownership. 5 | # The ASF licenses this file to You under the Apache License, Version 2.0 6 | # (the "License"); you may not use this file except in compliance with 7 | # the License. You may obtain a copy of the License at 8 | # 9 | # http://www.apache.org/licenses/LICENSE-2.0 10 | # 11 | # Unless required by applicable law or agreed to in writing, software 12 | # distributed under the License is distributed on an "AS IS" BASIS, 13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | # See the License for the specific language governing permissions and 15 | # limitations under the License. 16 | # 17 | 18 | # @TEST-EXEC: zeek ../../../scripts/Apache/Kafka/ %INPUT > output 19 | # @TEST-EXEC: btest-diff output 20 | 21 | module Kafka; 22 | 23 | redef send_all_active_logs = T; 24 | redef logs_to_send = set(HTTP::LOG, Conn::LOG, SSL::LOG); 25 | 26 | print send_to_kafka(HTTP::LOG); 27 | print send_to_kafka(DHCP::LOG); 28 | print send_to_kafka(Conn::LOG); 29 | print send_to_kafka(DNS::LOG); 30 | print send_to_kafka(SMTP::LOG); 31 | print send_to_kafka(SSL::LOG); 32 | print send_to_kafka(Files::LOG); 33 | -------------------------------------------------------------------------------- /tests/kafka/show-plugin.zeek: -------------------------------------------------------------------------------- 1 | # 2 | # Licensed to the Apache Software Foundation (ASF) under one or more 3 | # contributor license agreements. See the NOTICE file distributed with 4 | # this work for additional information regarding copyright ownership. 5 | # The ASF licenses this file to You under the Apache License, Version 2.0 6 | # (the "License"); you may not use this file except in compliance with 7 | # the License. You may obtain a copy of the License at 8 | # 9 | # http://www.apache.org/licenses/LICENSE-2.0 10 | # 11 | # Unless required by applicable law or agreed to in writing, software 12 | # distributed under the License is distributed on an "AS IS" BASIS, 13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 14 | # See the License for the specific language governing permissions and 15 | # limitations under the License. 
16 | # 17 | 18 | # @TEST-EXEC: zeek -NN Apache::Kafka | sed 's/, version.*)/)/' > output 19 | # @TEST-EXEC: btest-diff output 20 | -------------------------------------------------------------------------------- /tests/pcaps/exercise-traffic.pcap: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/apache/metron-bro-plugin-kafka/92b85e5e00cc2fd9023ea7f53466db6592eb6634/tests/pcaps/exercise-traffic.pcap -------------------------------------------------------------------------------- /tests/random.seed: -------------------------------------------------------------------------------- 1 | 2983378351 2 | 1299727368 3 | 0 4 | 310447 5 | 0 6 | 1409073626 7 | 3975311262 8 | 34130240 9 | 1450515018 10 | 1466150520 11 | 1342286698 12 | 1193956778 13 | 2188527278 14 | 3361989254 15 | 3912865238 16 | 3596260151 17 | 517973768 18 | 1462428821 19 | 0 20 | 2278350848 21 | 32767 22 | -------------------------------------------------------------------------------- /zkg.meta: -------------------------------------------------------------------------------- 1 | [package] 2 | description = A Zeek log writer plugin that sends logging output to Kafka. 3 | tags = log writer, zeek plugin, kafka 4 | script_dir = build/scripts/Apache/Kafka 5 | build_command = ./configure --with-librdkafka=%(LIBRDKAFKA_ROOT)s && make 6 | test_command = ( cd tests && btest -d ) 7 | plugin_dir = build 8 | version = 0.3.0 9 | depends = 10 | zeek >=3.0.0 11 | zkg >=2.0 12 | external_depends = 13 | librdkafka ~1.4.2-RC1 14 | user_vars = 15 | LIBRDKAFKA_ROOT [/usr/local/lib] "Path to librdkafka installation tree" 16 | --------------------------------------------------------------------------------
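Usage note: the constants declared in src/kafka.bif and the config type in src/kafka_const.bif form the plugin's user-facing configuration surface, and the btest scripts in tests/kafka exercise them through redef statements. The following is a rough sketch of how a site policy script might wire the writer up; it assumes the package's scripts are loadable as Apache/Kafka, and the broker address, topic name, and chosen log streams are illustrative placeholders rather than settings taken from this repository.

    # Hypothetical local.zeek sketch; all values below are placeholders.
    @load Apache/Kafka

    # Select which log streams the writer should publish.
    redef Kafka::logs_to_send = set(Conn::LOG, HTTP::LOG, DNS::LOG);

    # Topic and JSON tagging, matching the constants declared in kafka.bif.
    redef Kafka::topic_name = "zeek";
    redef Kafka::tag_json = T;

    # kafka_conf is a table[string] of string (see kafka_const.bif), so
    # librdkafka producer properties pass through by name.
    redef Kafka::kafka_conf = table(
        ["metadata.broker.list"] = "localhost:9092"
    );

Because Kafka::kafka_conf is an ordinary string-to-string table, any additional librdkafka producer property can be supplied the same way; the tests in this repository instead set Kafka::mock = T so that topic resolution can be checked without a running broker.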